Building and Maintaining Strong K-16 Partnerships Across State, Regional and Local Levels
Authors: Sheila Jones, Nancy Vandergrift

3. Design, Data & Analysis

From its inception, PRISM has taken a data-driven approach to monitoring and tracking change and to documenting successes and challenges. Partners used a combination of quantitative and qualitative data to evaluate their partnerships: the state leadership team developed its own assessment and management tools, and evaluators monitored the PRISM partnerships. The following briefly describes the quantitative and qualitative data analyses.

The state leadership team developed a partnership rubric that the regions use to judge progress and to gauge the quality and strength of each facet of their partnerships. The rubric measures progress in five areas: shared vision and goals; communication; decision making; responsibility and accountability; and change and sustainability. Partners use the rubric to identify strengths and weaknesses in their partnership(s) and to make changes and improvements as needed.

PRISM's ten strategies were mapped onto the five key features of MSP: partnership-driven; teacher quality, quantity, and diversity; challenging courses and curriculum; evidence-based design and outcomes; and institutional change and sustainability. These five key features were used to develop the PRISM Management Tools, which monitor progress. Each year, every region completes the regional sections and the entire state leadership team completes the rest of the document. Ratings are entered annually for the 166 items in the Tools document; 67 of these measure progress on the partnership-driven key feature. Each item is rated on a scale from "no progress" to "sustained," tracking a practice from emerging to fully institutionalized. The state and the regions use the results to determine where progress is being made and sustained, and where the partnership needs to place additional emphasis in order to achieve PRISM's goals.

Monitoring the multi-faceted partnerships at every level entailed ongoing evaluation by the internal and external evaluators who make up the PRISM Evaluation Team. Each region was assigned an internal and an external evaluator to assess progress toward PRISM goals, collect data for PRISM benchmarks, provide feedback to both the state leadership team and the region for continual improvement, and disseminate findings. At the request of the PI and the state leadership team, the evaluation team completed a case study of the partnerships in Spring 2007. Since the beginning of PRISM, evaluators have attended state leadership team meetings and regional coordinating committee meetings to observe the partnership(s) at work. Interviews are conducted annually with the regional co-PIs and K-12 coordinators, with higher education faculty and their administrators, and with teachers and their administrators. Additional data sources include meeting minutes and reports, handouts and other meeting materials, conversations, and telephone interviews.