Local takes the lead on evaluation

Published on Friday, February 7, 2014

Educators throughout the state seem to have one thing in common: frustration and confusion about the state's new teacher evaluation requirements.

However, the members of the Moorestown Education Association (MEA) are feeling a little better than most. That’s because they took a proactive approach to ensuring that their members were prepared.

It started in 2007, when then-Superintendent John Bach initiated a conversation with MEA President Lisa Trapani about implementing a new evaluation system in the district. At the time, the district was using a model based loosely on the Madeline Hunter model. After reviewing several models, the two parties jointly decided that the Danielson model was the best fit for the district. The local and the administration developed a new format and template for evaluations and classroom observations. Staff members were properly trained, and the new model was a success.

“When the state determined that districts must choose a specific model, we had already been using the Danielson model for a few years,” explained Trapani.  “We still had to implement some kind of an on-line system though for tracking purposes.  We formed an evaluation committee in the spring of 2011.  The committee included our head building reps, some administrators and the superintendent. This committee is now our DEAC.”

Site visit affects decision-making

Teachscape seemed to be the tool of choice because it followed the Danielson model. Before implementing it, MEA members and administrators made a site visit to Pemberton, a cohort 1 pilot district, in May of 2012.

“Our committee was not impressed at all with Teachscape,” noted Trapani. “There were many technical and logistical issues with the software, not to mention all of the time wasted responding to and attaching documents.”

As a result, the committee created sub-committees to work on the new observation tool, common language, timelines, the professional development plan, the state requirements, and a plan for training staff.

“As president of the MEA, I requested from our then-new superintendent a meeting with the Teachscape representatives before we paid for a program that we did not like or want to use. All of our research indicated that it was not what we wanted to use in Moorestown,” added Trapani. “After meeting with Teachscape, our committee told the superintendent that we did not want to use Teachscape, as it did not meet our needs and would be a waste of $60,000.”

Committee develops own system

The committee decided to develop its own evaluation system.  The members researched the approved models on the NJDOE website and chose the Kim Marshall model.  The committee identified T-Eval as the on-line database that would feed into NJSMART. 

The sub-committees continued to work on the evaluation tool. Each job category was asked to form a committee to write the criteria for its section of the evaluation tool, so each group had input into its own evaluation. Speech and OT therapists, CST members, nurses, guidance counselors, media specialists, and teachers wrote the criteria and the rating scale. This jointly developed, mutually agreed-upon tool was then sent to T-Eval to be placed into a software program specifically designed to meet the district's needs. In the early spring of 2013, the members of the DEAC committee piloted the new system. The committee worked with the administration to iron out glitches and make corrections as necessary.

In May and June of 2013, all staff members were trained via a partnership between administration and the Moorestown Education Association on the new evaluation tool and the district and state requirements.  “We fully implemented the new evaluation tool in September of 2013 with very few problems,” said Trapani. “We received positive feedback on the process.  Our DEAC will meet in January to discuss the annual review document.”

“NJEA locals who are having difficulty navigating the new evaluation system should work closely with their NJEA field representative to ensure that their members are being properly trained and informed,” concluded NJEA UniServ Field Representative Patrick Manahan.
