Brevard Public Schools
Updated: Monday, June 23, 2014

Evaluation/Implementation Procedures

PROCEDURES FOR IMPLEMENTATION AND INSERVICE EVALUATION WITH STUDENT ACHIEVEMENT EVIDENCE

 

Implementation Methods and Procedures

Two Kinds of Inservice  

  • Information meetings – designed to help the target group do a better job, but not designed to be implemented with students in a classroom.

  • Classroom implementation training – content, methods, or specialized training designed to be implemented in the classroom by teachers.


Professional Development Follow-Up Definitions

The method of follow-up may be any of the following, depending on the type of inservice activity and instructor preference.

“M” – Structured Coaching/Mentoring (may include direct observation, conferencing, or lesson demonstration – the coach or mentor uses a printed document, such as an Observation Checklist or lesson plan, and this instrument is turned in to the inservice provider as evidence of follow-up and as evaluation for the training).

“N” – Action Research related to training (may be the Action Research Project tied to the teacher’s Professional Development Plan – the teacher writes a SMART objective, teaches to it, and measures student achievement, providing the report to the inservice provider as evidence of follow-up and as evaluation for the training).

“O” – Collaborative Planning related to training (a document is provided by the inservice instructor, and the participants complete the document “back home” in collaboration with someone at the school on how they can use the training – the completed document is returned to the inservice provider as evidence of follow-up and as evaluation for the training).

“P” – Participant Product related to training (may include lesson plans, written reflection, audio/videotape, case study, or samples of student work – the teacher teaches a lesson plan and collects and records data on students in the classroom, completes a tape or case study, or analyzes student work and completes a report – the product is returned to the inservice provider as evidence of follow-up and as evaluation for the training).

“Q” – Study Group Participation (“back home,” teachers participate in professional learning communities, using information from the training as a starting point and adding to the school’s knowledge base on the topic – a report is given to the inservice provider as evidence of follow-up and as evaluation for the training).

“R” – Electronic – Interactive (“back home,” teachers participate in online follow-up via discussion boards or website postings and share with one another – the inservice provider uses the postings as evidence of follow-up and as evaluation for the training).

“S” – Electronic – Non-interactive (“back home,” teachers implement the training in their classrooms and post data as requested in an email or discussion board to the inservice provider, who uses the postings as evidence of follow-up and as evaluation for the training).


Special Procedures for Professional Development Day

Follow-Up

In general, follow-up for workshops on PDD is the responsibility of the school. Administrators at the school will use an already-scheduled faculty meeting, or schedule a special one, to have teachers meet in departments or grade levels to share and discuss how the workshops they attended on PDD can be implemented or used at the school level. PDD workshops will have their individual inservice follow-up plan completed on the day of the workshop and turned in to the provider at that time. Follow-up for most workshops on PDD will not entail additional inservice points unless the Site Inservice Representative plans formalized follow-up sessions with inservice procedures at the school.


Professional Development Evaluation

Methods and Procedures

The current usage of “professional development evaluation” refers to STUDENT DATA showing success on the “training objectives” for which the training program was ultimately designed. This terminology and these procedures are to be used with training programs involving classroom application, and they involve measuring student responses to new teaching methods or strategies. If student responses are deemed positive, then the workshop or training program is determined to be “successful” and a worthwhile use of the district’s resources. If student responses are less than desired, then the workshop or training program is suspect and may be discontinued.

For classroom implementation training, student data MUST be collected and reported AFTER the date of the workshop and during or after implementation in the classroom by teachers.  The method of evaluation may be any of the following: 

“A” – Results of district-developed/standardized student test. IT IS NOT RECOMMENDED THAT YOU USE FCAT DATA. If you are implementing subject training attached to standardized student testing (such as reading tests given throughout the school year), use “A” on your inservice record for “evaluation method” and have the teacher report to the inservice provider the number of students expected to meet the criteria and the number who did meet the criteria after the teacher implemented the training.

“B” – Results of school-constructed student test.  Unless your school has school-constructed student tests, this is not an option. 

“C” – Portfolios of student work. (An example of this might be a two-day extended writing workshop where teachers learn to implement a writing program, go back to school, and implement the program in class.) On the inservice record, the instructor uses a “P” for follow-up and has the teachers write lesson plans for writing; the instructor uses a “C” for evaluation and has the teachers evaluate student portfolios of writing work for one month after the inservice. The instructor sends a report containing the number of lessons designed, the number of student portfolios expected to be “excellent,” and the number judged to be “excellent” to the inservice provider one month after the inservice; this report earns the teacher a “completion code 1,” or credit for the inservice hours (and perhaps “follow-up additional points” for the learning that occurred by doing the data collection, grading, and reporting to the inservice provider – this is at the option of the provider).

“D” – Checklist of student performance. (An example of this might be a two-day classroom management training where teachers learn Harry Wong techniques, go back to school, and try out the techniques in class.) On the inservice record, the instructor uses a “P” for follow-up and has the teachers write a case study on a non-achieving or misbehaving student, including strategies they will use to get positive student engagement in the lesson; the instructor uses a “D” for evaluation and distributes a checklist of student performance indicating engagement in the lesson, positive comments or questions given in class by the student, and student achievement on a teacher-made instrument or test. The teacher takes the checklist and collects data on the student after trying strategies learned in the inservice. The teacher sends the completed case study and checklist to the instructor; these serve to earn the teacher a “completion code 1,” or credit for the inservice hours (and perhaps “follow-up additional points” for the learning that occurred by doing the case study and checklist implementation – this is at the option of the provider).

“E” – Charts and graphs of student progress.

“F” – Other performance assessment.


 

