Rating Teachers: The Trouble with Value-Added Data
Most people assume teachers are held accountable for student learning, which seems obvious, considering why we have teachers in the first place. In fact, until last year, five states had laws explicitly banning the use of student achievement data in teacher evaluations, and only four states required that a teacher’s evaluation be based primarily on students’ test scores. But since the launch last summer of the federal “Race to the Top” program, in which states compete for grant money by implementing education reform, 12 states have passed legislation to improve their teacher evaluations, and all the data “firewalls” are gone.
But in a sign that the push to improve teacher evaluations is moving at once too fast and too slowly, the Los Angeles Times last month published a searchable database of the “value-added” scores of 6,000 Los Angeles teachers, boiling down each educator’s classroom effectiveness into a single statistic. Researchers, teachers’ unions, nonprofit advocates and even Secretary of Education Arne Duncan rushed to praise or condemn the landmark data dump. The numbers had been obtained via California’s Public Records Act and then crunched by a seasoned education analyst. But most parents were left baffled: what does “value-added” mean, anyway?