Indicator: Measurable performance goals for the dispute resolution system have been set and evaluation activities are outlined.
Process & Practitioner Evaluation
Indicator: Process and practitioner effectiveness are evaluated and monitored (e.g., participant satisfaction surveys, interviews, expert review of decisions, practitioner self-assessments, observations).
Data are collected to examine ease of access, participant preparation, implementation fidelity, and service delivery satisfaction. Data inform improvement activities such as practitioner professional development and personnel assignments.
Examples: surveys completed by participants and practitioners, observations, interviews with practitioners, timelines
Indicator: Data on system use and outcomes are compiled, analyzed, and summarized to improve system design and implementation.
Data are collected to examine who (including underserved groups) is using the system and how processes are accessed. Data are used to identify those who are not accessing the system, and improvements are made to reduce barriers. Data are also collected to examine immediate, intermediate, and long-term outcomes.
Examples: demographics of people making requests, number of requests, number of withdrawals, number of successful resolutions, trends in system use over time, cost and cost-effectiveness of certain processes, impact on relationships, durability of resolutions
Analysis and Reporting
Indicator: Data are analyzed and findings are reported to various stakeholder groups and the public.
Data are analyzed to determine gaps in access and service delivery. Information about system improvement activities and progress is regularly provided to leadership, staff, and stakeholders. Required reporting activities are completed, and requests for information are addressed in a timely manner.
Examples: responses to individual requests, regular updates to advisory groups, written reports, requests for necessary funding
Program leadership, staff, and stakeholders use data to make improvements on a regular basis. Issues raised in dispute resolution (DR) processes are aggregated and used to inform monitoring and other activities aimed at improving the quality of educational programming.
Examples: data reviewed quarterly by team, website revised to improve access, professional development activity designed and implemented to meet emerging practitioner needs