Is teacher performance data ready for primetime in Ohio?
The Los Angeles Times brought the concept of value-added education data front and center before the public when it conducted an analysis of teachers’ value-added scores in that city, and then published the findings – complete with teachers’ names attached – on its website.
The Los Angeles story is the most prominent example of a national movement to link student data to individual teacher performance. Federal Race to the Top dollars will fund more than a dozen states’ – including Ohio’s – efforts to incorporate value-added data into teacher evaluations. The Bill and Melinda Gates Foundation and other philanthropies are investing millions in teacher effectiveness initiatives. In Ohio, public value-added results are available by district, school, grade level, and subject – teachers and administrators have access to more robust student-level data. And districts participating in the federally funded Teacher Incentive Fund program will use value-added data, along with other measures, to award performance bonuses to teachers.
Three state education experts interviewed by The Gadfly believe it is time to use value-added here not only to inform teachers about their students’ progress but also to evaluate the teachers themselves – to help them improve or, if necessary, to dismiss ineffective ones.
“Clearly other districts use value-added as part of evaluation of performance, but the difference between [what happened in] Los Angeles and [what takes place in a district like] Winston Salem, N.C., is that they use value-added to assess but the data are not made public,” said Thomas J. Lasley, retired dean of the School of Education and Allied Professions at the University of Dayton. “Principals have access to it but parents do not. It’s more logical to have building leaders who understand the nuances...so they can know how to work with teachers.”
In Winston Salem, Lasley said, if low value-added student scores indicate a teacher is having problems, the teacher is placed on a professional development plan; if there’s no improvement, the district eventually moves toward dismissal.
While Lasley thinks it could be 20 years before value-added teacher data are accurate enough for public release, Indiana Superintendent of Public Instruction Tony Bennett plans to make the teacher data public – names and all – on the Indiana Department of Education’s website this year or next. Bennett thinks putting heat on teachers will lead to better teaching. “(If I’m a parent) I can go to the principal and say I want the best teacher for my child,” he told education policymakers in Cleveland recently.
Jim Mahoney, executive director of Battelle for Kids, the organization that pioneered the use of value-added data in Ohio, said the state’s value-added formula could be used right now for classroom-level data and evaluation. In fact, 45 districts, including Columbus, do so, he said.
But Mahoney cautions that using value-added data for evaluation purposes won’t work unless curricula and assessments are aligned, and that remains a major problem. Also, as powerful as value-added analysis can be, he said it should never be the sole measure of teacher effectiveness. Multiple measures such as attendance, student surveys of teacher performance, and knowledge of content are also important. “A lot of this is in the power of the relationship between the teacher and the student,” he said.
Beyond the classroom, Mahoney said value-added data could be used to track graduates of teacher colleges (Ohio promised in its winning Race to the Top application to connect student achievement data back to the colleges that prepared teachers) and to help clarify the ongoing debate about the effectiveness of alternatively trained teachers (like Teach For America corps members) versus those who are trained in colleges of education.
Identifying good teachers and understanding why they’re good is vital. “Some people are generating huge gains and it is not random luck,” he said. Even so, Mahoney issued a caution: “This is a very powerful measure, but there’s no one measure that should be used or can capture the essence of great teaching.”
One problem with Ohio’s value-added measure, especially if it were used to evaluate and publicly identify teachers, is that it is not transparent. The model in use is proprietary, and it’s not clear how the calculations are actually done.
But there are other models. Indiana uses one developed in Colorado. And Ohio’s participation in the Common Core State Standards Initiative could bode well for the future of value-added here. “As you get common standards, those common standards are going to drive us toward more common assessments, and that will drive us toward more common value-added procedures,” Lasley said.
Another problem is that value-added measures are not applied to all school subjects. Right now, value-added analysis is done only in the fourth through eighth grades in reading and math, though Governor Strickland’s 2009 education reform legislation and the state’s Race to the Top grant promise to expand value-added across more grades and subjects and tie the results to the classroom level.
That’s a good thing, according to Deb Tully, director of professional issues for the Ohio Federation of Teachers. “It’s important to weed out bad teachers,” she said. “[As a teacher,] I don’t want the kids the next year after they’ve had somebody terrible.”
Tully said there needs to be agreement in a school district concerning how value-added data will be used. “When you locally develop a (teacher evaluation plan) everyone can agree on, we don’t have a problem using value-added as one of the measures,” she said.
However, the measure must be fine-tuned first, she said, since Ohio’s current value-added model was never designed for teacher evaluation or (determining) compensation. “[What] we are concerned about is using it incorrectly or using data that doesn’t give a clear picture of what a teacher is doing in the classroom,” she said.