Evaluation Planning

Indicator: Measurable performance goals for the dispute resolution system have been set and evaluation activities are outlined.

Stakeholders provide input on what success looks like and assist with the development of measurable performance goals (beyond those required by the SPP/APR) for the dispute resolution system. Evaluation activities and responsible persons are identified, along with timelines. Evaluation activities include the collection, analysis, summarization, and review of both process and outcome data.
Examples: logic model, performance measures, table of evaluation activities

Process & Practitioner Evaluation

Indicator: Process and practitioner effectiveness are evaluated and monitored (e.g., participant satisfaction surveys, interviews, expert review of decisions, practitioner self-assessments, observations).

Data are collected to examine ease of access, participant preparation, implementation fidelity, and service delivery satisfaction. Data inform improvement activities such as practitioner professional development and personnel assignments.
Examples: surveys completed by participants and practitioners, observations, interviews with practitioners, timelines

System Use/Outcomes

Indicator: Data on system use and outcomes are compiled, analyzed, and summarized to improve system design and implementation.

Data are collected to examine who (including underserved groups) is using the system and how processes are accessed. Data are used to identify those who are not accessing the system, and improvements are made to reduce barriers. Data are also collected to examine immediate, intermediate, and long-term outcomes.
Examples: demographics of people making requests, number of requests, number of withdrawals, number of successful resolutions, trends in system use over time, cost and cost-effectiveness of certain processes, impact on relationships, durability of resolutions 

Analysis and Reporting

Indicator: Data are analyzed and findings are reported to various stakeholder groups and the public.

Data are analyzed to determine gaps in access and service delivery. Information about system improvement activities and progress is regularly provided to leadership, staff, and stakeholders. Required reporting activities are completed, and requests for information are addressed in a timely manner.

Examples: responses to individual requests, regular updates to advisory groups, written reports, requests for necessary funding

System Improvement

Indicator: Data are used to monitor how the system and processes are performing to guide improvement activities.

Program leadership, staff, and stakeholders use data to regularly make improvements. Issues that are raised in DR processes are aggregated and used to inform monitoring and other activities aimed at improving the quality of educational programming.
Examples: data reviewed quarterly by a team, website revised to improve access, professional development activity developed and implemented to meet new needs of practitioners

See Related Resources