Testimony before the State Board of Education

TEACHNJ
by NJEA Associate Director of Government Relations Francine Pfeffer
June 5, 2013

Good afternoon. I am Francine Pfeffer, an associate director of government relations for NJEA.

From the very beginning of the pilot of the new evaluation system in 2011, NJEA has been monitoring its implementation in the pilot districts. We’ve gathered information through interviews and focus groups with our members, all in an effort to ensure that the new system was fair and focused on what really matters: teaching and learning. Last year we also reached out to teachers who work in the Pilot II districts.

When it comes to putting the new system into place, we discovered one over-arching theme: the need for time.

Training teachers and principals in a new instrument to measure teacher practice takes time. Ensuring that everyone from the superintendent to the kindergarten teacher understands the metrics of a new formula takes time. Creating the technological infrastructure needed to gather testing data and allow for the use of online observation rubrics takes time. Reading and comprehending the mountain of memos and materials the Department has generated on the new evaluation system takes time.

There is plenty of research that addresses the need for a slow and measured approach to implementing a new teacher evaluation system. I’m only going to quote one report, and it comes from the independent researcher that the Department hired to analyze our own pilot program in New Jersey.

In a research brief released by the Rutgers Graduate School of Education just last month, school districts were provided with strategies for training on a teacher practice instrument. The advisory served another essential purpose: it reinforced the notion that teachers and observers need time to develop concrete experience with the procedures and materials associated with the new system.

The Rutgers researchers said: “The learning curve for schools and districts implementing the rigorous teacher practice evaluation instruments required by law is steep in the first two years, although it eases somewhat in the second year.”

While it’s comforting to know that the learning curve eases in the second year, it’s important to note that the overwhelming majority of New Jersey’s school districts have had less than one year to train their teachers and administrators in their teacher practice model.

The Department has said that many districts have adopted models that aren’t that different from what was used in the past. But according to the brief prepared by Dr. William Firestone and his fellow researchers, “Even districts with some past experience with systematic forms of teacher observation have a great deal to learn.”

That’s because during the first year under a new system, teachers and observers focus on the procedural changes because, as the Rutgers researchers explain, “learning to operate the mechanics of a system is extremely time consuming.”

The brief also explains that it is during this initial phase that educators assess the fairness of the new system. A belief that teachers and principals will be treated equitably is critical to the success of this endeavor.

As Firestone put it, “A teacher must be assured that the observation process is fair and open to learning from the feedback provided through observations.”

Implementing a system that pilot districts acknowledge takes up to two years to get right, when other districts have had less than a year to prepare, hardly creates a sense of fairness.

When asked why we need to rush into the new system, Department officials point to the TEACHNJ Act, saying that they are bound by law to execute a new system this fall.

The truth is, TEACHNJ does allow for flexibility here. The law requires that, starting in September 2013, school districts must be using an approved evaluation rubric for all teachers in the district and that the rubric must be “partially based on multiple objective measures of student learning.”

Under the proposed system, student learning will be measured through Student Growth Objectives, or SGOs, and Student Growth Percentiles, or SGPs.

Although we believe that all educators should have the benefit of a practice year in creating SGOs, they represent, in fact, what great teachers have always done: setting goals for groups of students and working with those students to achieve them.

The use of standardized test scores to generate a student growth percentile, however, represents uncharted waters. Despite the Department’s insistence that research proves this is the best way to assess teacher effectiveness, the SGP is a complex calculation. Even Colorado is taking a more measured approach to implementation, calling the first year a “hold-harmless” year: ratings of ineffective or partially effective in the first year of its new evaluation system will have no consequences for tenure status. The use of SGPs is NOT required by the TEACHNJ Act.

New Jersey is proud to be first in the nation in graduation rates and writing scores and the number of students taking Advanced Placement classes. And it’s because of this outstanding record of accomplishment in our public schools that I urge you to move with extreme caution in basing teacher evaluation on standardized test scores. We DO NOT want to be the first in the nation in implementing bad policy.

Since TEACHNJ was passed unanimously, it’s safe to assume that every legislator wants it to be carried out successfully, and I’m sure the members of the State Board of Education want that too.

It’s important to remember that this is not the only new initiative facing our schools. Our state’s teacher evaluation system is being put into operation at the same time that districts are transitioning to the Common Core Standards and preparing for the technological demands of the upcoming PARCC assessments.

Department officials have stated their belief that all of these changes must be implemented at the same time. They argue that a new evaluation system is intertwined with these other reforms.

They are wrong. A process that strengthens teaching and learning is timeless, and it should be created and carried out in a way that ensures its success.

NJEA has long stood for high standards for teachers that emphasize student learning, and the Association’s support for meaningful teacher evaluations is well known.

But something that is worth doing is worth doing right. The Department must recognize that while it has been steeped in the preparation of this new evaluation system, teachers across the State of New Jersey have been busy doing what they do every day: developing young minds and helping each and every child grow into a productive and confident adult. These professionals deserve to be given the time to understand a new system that will impact the great work that they do and could affect their livelihoods as well.

We urge the State Board of Education to step on the brakes when it comes to the new system of teacher evaluation. Give New Jersey’s educators another year to interact with the teacher practice instruments chosen by their districts. Give principals and other administrators another year to hone their skills as observers and reorganize their management teams in the face of the increased demands of these regulations. Most important, give the Department time to analyze and present to the public what the pilot districts have learned about the use of student growth data to measure teacher effectiveness. We all deserve to see the final research report from the independent evaluator before we implement a system that affects so many.