Ten school districts and 19 schools that are receiving School Improvement Grants (SIG) continue to implement the state’s proposed teacher evaluation program, and NJEA continues to monitor the pilot.
To date, the emphasis in these schools and districts has been on training administrators and teachers in the new observation protocols that measure teacher practice. Districts were required to choose one of four teacher evaluation models to pilot in their districts (see the October and November 2011 NJEA Reviews for more information). Administrators were to receive three days of training, with teachers receiving two days of training in the various models. Initially all training was to be completed by Nov. 30; now the N.J. Department of Education (NJDOE) is reporting that all training should be completed by the end of January 2012.
The Evaluation Pilot Advisory Committee (EPAC) is monitoring the impact of the pilot program and is to provide feedback to the NJDOE. NJEA Secretary-Treasurer Marie Blistan is a member of EPAC.
“We are less concerned about the date the training is completed than about the quality and thoroughness of the training,” notes Blistan. “We want to make sure evaluators are properly trained to evaluate using the new models and that teachers understand how their performance will be measured before anyone is observed or evaluated using the new domains proposed by the models.”
In addition, NJEA is meeting regularly with representatives of the Department of Education to resolve concerns about the pilot that emerge from conversations with local association representatives and members.
“We anticipate that we will have differences of opinion as we move forward with the pilot, but believe that wherever possible, we should work together to resolve problems in the best interest of students and teachers in New Jersey,” states Blistan.
The results of the pilot evaluation system will ultimately be used by the NJDOE to propose new teacher evaluation regulations for all teachers in New Jersey. A $1.1 million grant was offered to encourage school districts to participate in the program, which advances the recommendations released by Gov. Chris Christie’s Task Force on Educator Effectiveness in March 2011. The stated goals of the new evaluation system are to make teacher evaluation criteria more uniform and to accurately assess teacher effectiveness so educators can get meaningful feedback in an open and collaborative setting.
The centerpiece of the program is increasing student achievement as measured by standardized test scores. The base evaluation formula proposed by the Educator Effectiveness Task Force directed that at least 50 percent of the evaluation framework be based on “identified measures of student achievement,” with the remainder based on “demonstrated practices of effective teachers and leaders.”
The state’s original plan was for the new evaluation system to be implemented statewide next school year. But many, including NJEA, are questioning this ambitious timeline.
“This is too important to rush,” believes Blistan. “Our members’ careers, the culture of our schools, and the learning environment we provide for our students will be significantly impacted by these new regulations. We need to take the time to design new evaluation procedures that are not only fair and transparent, but that also promote best practices, teacher collaboration, and optimum student outcomes.”
In addition to the Evaluation Pilot Advisory Committee’s monitoring, the program will be assessed by an independent evaluator. The state is in the final stages of contract negotiations with the researcher chosen to evaluate the pilot.
NJEA is working with the local associations in pilot districts to provide the support necessary to ensure that the local association and educators in these districts have input into the pilot study. The Association is also monitoring the impact of the program on proposals to change teacher evaluation regulations.