Rising Above a Gathering Storm of Data: Triangulating Findings Through Formative and Summative Evaluation Methodologies
Author: Sonya Martin

2. Claims Examined

In evaluating the success of teacher development programs, measures of teaching practice that are valid, reliable, and scalable are needed. Previous investigations of changes in teacher practice resulting from professional development have used several techniques to measure practice. Self-report questionnaires have the benefit of ease of administration and the possibility of a large sample size, which aids statistical analysis. One limitation of this strategy, however, is that the researcher cannot distinguish between high- and low-quality implementation of strategies. Research clearly indicates that how a teacher implements a reform, not merely its presence, influences the effectiveness of that reform in the classroom (Brown & Campione, 1996). Direct observation by a trained evaluator using an instrument such as the Reformed Teaching Observation Protocol (Sawada et al., 2002) or the Approaches to Teaching Inventory (Trigwell & Prosser, 2004) addresses this issue. One drawback of this approach is that the number of teachers and classes that can be observed is limited by available resources, so it is not highly scalable. Regardless of the measurement method used, alignment of the measure with the reform agenda or teacher education curriculum being studied is vital.

The evaluation for this MSP involves a variety of qualitative and quantitative measures, including RTOP observations, interviews, surveys, video/audio analysis, and other instruments, to characterize the teaching and learning in this program. To provide another means of triangulating data gathered from participants and to make sense of how participation in this MSP is informing teaching practices, we have developed, validated, and piloted the Science Lesson Plan Analysis Instrument (SLPAI) for quantitative evaluation of teacher-submitted multi-day lesson plans. This method complements traditional tools such as teacher surveys and direct observational protocols. A pilot study was designed to track and describe changes in teaching practice and pedagogical knowledge at the individual and cohort levels over time, and thereby provide evidence of program effectiveness. Our goal was to capture the extent to which our teacher development program is meeting its goals of increasing teacher content and pedagogical knowledge and impacting teaching practice in grades 5-12 science classrooms. The purpose of this presentation is to share the development and validation of the SLPAI, to demonstrate its use in a pilot study examining teacher change as a result of program instruction, and to discuss how these findings are informing program changes. Through this presentation, we seek to demonstrate that the SLPAI is a unique and powerful tool for measuring teaching practices over time, especially when used in concert with other measures. We offer the SLPAI as a complementary but not redundant instrument for measuring teaching practice, and one that is more easily scalable than direct observation.