6 steps of program evaluation

As a health educator, you want to be sure that the evaluation is useful not only to your health department but also to the partner organizations that help implement the program. To ensure buy-in and later use of the evaluation, ask stakeholders to develop questions that they would like to have answered.

For example, the YMCA staff may want to know whether their membership increases. Increased membership then becomes an outcome in the logic model. Note that the logic model can be built from right to left, that is, by asking, "How will we increase membership?" Or it can be built from left to right, by asking, "Why are we doing that?"
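
To make the two directions concrete, the logic model can be pictured as an ordered chain running from inputs to outcomes. The short sketch below is a hypothetical illustration only; the component labels are assumptions, not the actual program's model.

```python
# A minimal sketch of a logic model as an ordered chain.
# Component labels are hypothetical placeholders, not the program's actual model.
logic_model = [
    "Inputs: staff, funding, YMCA partnership",
    "Activities: enhance access to physical activity, run informational outreach",
    "Outputs: new or improved facilities, outreach events held",
    "Outcomes: increased YMCA membership, more physical activity",
]

# Left to right: each component prompts "Why are we doing that?"
for earlier, later in zip(logic_model, logic_model[1:]):
    print(f'Why "{earlier}"? -> to achieve "{later}"')

# Right to left: each component prompts "How will we achieve that?"
for later, earlier in zip(reversed(logic_model), reversed(logic_model[:-1])):
    print(f'How will we achieve "{later}"? -> through "{earlier}"')
```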

Either way, a focused evaluation will be one that poses questions based on the program and one that results in answers that serve the purpose of the evaluation. The purpose often will be to improve the program; other purposes may include gaining insight and assessing program effects.

Defining your purpose is an important component of this step. In our example, the stakeholders have already agreed on a logic model in step 2 (see the figure below), and so they can use it to focus their questions. They might decide to ask both process and outcome evaluation questions. Process questions relate to the inputs and activities, and outcome questions relate to the expected outcomes.

It is possible to generate a long list of possible questions from the logic model, but then the list needs to be prioritized. Evaluating all questions may not be essential or even feasible.

The stakeholders should keep the purpose of the evaluation in mind and decide what would be useful to decision makers when prioritizing the list of questions. For example, a process question might ask whether the planned access and outreach activities were carried out as intended, and an outcome question might ask whether YMCA membership increased.

[Figure: Example of a logic model for an intervention to create or enhance access to physical activity (PA) combined with informational outreach activities.]

To answer the questions posed in step 3, evidence needs to be collected. How much evidence (quantity) and what kind of evidence (quality) are central to feasibility and accuracy. There must be a balance between collecting enough data and ensuring that it is of high quality. Sometimes a mix of quantitative and qualitative data will help achieve that balance: quantitative data can provide the numbers you need to answer some questions, while qualitative data, such as interviews, can help explain how and why the results occurred. Data are available from people, documents, observations, and existing information.

A practical way to plan data collection for each process and outcome question is to specify its indicators, data sources, and performance measures. Indicators are what answer the question, data sources are the methods by which you collect data about the indicators, and performance measures are the results you would like to achieve.
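
As a rough illustration of that structure, the sketch below pairs a single hypothetical evaluation question with indicators, data sources, and a performance measure; the specific entries and targets are assumptions for illustration, not taken from any actual plan.

```python
# One hypothetical entry in a data-collection plan; all values are illustrative.
data_collection_plan = [
    {
        "question": "Did participation in the physical activity program increase?",
        "indicators": ["number of registered participants", "weekly attendance counts"],
        "data_sources": ["program sign-in sheets", "YMCA membership records"],
        "performance_measure": "10% increase in participation within one year",
    },
]

for entry in data_collection_plan:
    print(entry["question"])
    for indicator, source in zip(entry["indicators"], entry["data_sources"]):
        print(f"  indicator: {indicator} (from: {source})")
    print(f"  target: {entry['performance_measure']}")
```

Listing more than one indicator and more than one data source per question, as in the sketch, also supports the triangulation described next.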

It is helpful to have more than one indicator and more than one data source to answer each evaluation question. Using multiple indicators and data sources is often called triangulation and is recommended to increase accuracy. There are many tools available for collecting physical activity data.

There are three parts to this step: (1) analyze the data, (2) interpret the results, and (3) make judgments about the program. Having the performance measures helps to justify your conclusions. Perhaps a community college student needs an internship.

You can hire him or her, often without financial compensation, to help with the evaluation. With guidance from you and the supervising professor, the student can analyze the data. Analysis for some questions will be easier than for others. For example, the difference between participation rates preintervention and postintervention is simple math, whereas analyzing focus group and interview data takes more time because all of the text must be read and common themes identified to answer the appropriate evaluation question.
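
The sketch below illustrates the two kinds of analysis mentioned above with made-up numbers and responses: the pre/post difference is simple arithmetic, while the qualitative step is reduced to a naive keyword tally purely for illustration (real thematic coding is far more involved).

```python
# Hypothetical pre/post participation rates (proportions of the target population).
pre_participation = 0.18   # before the intervention
post_participation = 0.27  # after the intervention
change = post_participation - pre_participation
print(f"Change in participation rate: {change:.0%}")  # simple math: 9%

# Naive keyword tally for focus-group responses, grouped into two assumed themes.
responses = [
    "The new trail feels safe and close to home.",
    "I walk more now that the trail is lit at night.",
    "Safe sidewalks made the biggest difference for me.",
]
themes = {"safety": ["safe", "lit"], "access": ["close", "sidewalk", "trail"]}
counts = {
    theme: sum(any(word in r.lower() for word in keywords) for r in responses)
    for theme, keywords in themes.items()
}
print(counts)  # {'safety': 3, 'access': 3}
```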

After the analyses, you should convene a meeting of stakeholders to go over the results. Talk about possible alternative explanations for the findings of the evaluation. Additional tip: program evaluations can often be politically charged; to mitigate the risk of political influence, we recommend obtaining an objective third-party analysis.

Not only is it important for the analysis to be completed by an organization with a reputable research background and a neutral standpoint; it is also imperative that the analysis address all the questions that key stakeholders reviewing it may have. After identifying audiences and stakeholders, an important step is to work collaboratively to shape the questions being asked, determine how the results will be communicated, and decide what the tone, applicability, and actionable advice of the results should look like.

Please note: sometimes this collaboration requires primary research that can be time-consuming and expensive if done on an ad hoc basis. A common challenge Hanover Research hears from partners is how to create a comprehensive report that is actionable and digestible for decision makers and stakeholders who may not be intimately involved with the program. Creating a list of questions can give stakeholders a focused scope and the ability to prioritize analysis and recommendations in a summary report.

Partners typically task Hanover with assisting in creating such a list of questions and the necessary data points to ensure the validity and reliability of the report. It is important to consult with the administrators and staff tasked with conducting the evaluation and to assess realistic timelines and available bandwidth.

Does your district possess the dedicated internal resources to prioritize, schedule, and proceed with the evaluation process in a timely fashion? For several partners, Hanover has served as the dedicated source for program evaluations, resulting in a significant increase in the number of discrete program evaluations completed within each academic year.

By creating a map of the necessary stages of an analysis, and by assigning responsibilities and timelines to those stages, everyone can easily be held accountable as the evaluation progresses. It will also be easier to pinpoint breakdowns within the process. This is the stage at which many organizations become overwhelmed: with several different people wearing multiple hats, it can become tricky to course-correct and complete the evaluation on time. By performing the above planning steps, your organization should be well positioned to execute an evaluation successfully.

However, if you have additional questions or would like expert assistance with program evaluation planning or analysis, e-mail us to speak with a K-12 research director.

Quantify the economic impact of your project, place or organisation. Our flexible methodology suits government, tourism and decision-makers.

What is the big picture? What change (outcome) has been achieved relative to your previous results, your peers, and other programs? The benefit of applying an evaluation framework is realised here. With access to consistent evidence, your team is empowered to continually learn and grow your impact.

Our professionally designed published reports are a striking and effective way to showcase your outcomes, performance and achievements. A dashboard for teams and large projects brings your outcomes, outputs and third-party data together in one place. With consistent, outcomes-based data, you can easily demonstrate your value and realise your impact. Through deliberate evaluation planning, the process has delivered results. Your data is aligned to your goals. Your goals underpin your strategy.

Focus the evaluation design to assess the issues of greatest concern to stakeholders while using time and resources as efficiently as possible. Consider the purpose, users, uses, questions, methods and agreements. Gather credible evidence to strengthen evaluation judgments and the recommendations that follow. These aspects of evidence gathering typically affect perceptions of credibility: indicators, sources, quality, quantity and logistics.

Justify conclusions by linking them to the evidence gathered and judging them against agreed-upon values or standards set by the stakeholders. Ensure use and share lessons learned with these steps: design, preparation, feedback, follow-up and dissemination. For additional details, see Ensuring Use and Sharing Lessons Learned (PDF) as well as a checklist (PDF) of items to consider when developing evaluation reports.
