Part VI
C. Three Evaluation Scenarios

Outlined below are three example scenarios in which it makes sense to carry out an evaluation. Each scenario attempts to identify the following:

• The audience(s) for the evaluation;
• What information the audience needs for making decisions;
• What tools and methods are to be used in obtaining this information;
• Who carries out the evaluation;
• Key evaluation findings;
• Use of evaluation information in the decision making process.



Scenario One: An After-School Human Rights Education Program

Parts of this analysis have already been described above. A local non-profit organization active in human rights initiated an after-school human rights education program for junior high school students, run in a church facility near the school. The program started very successfully but, over the years, enrollments began to dwindle; the decline was especially marked during the last school year. The board of the non-profit decided they needed an evaluation to help them determine: (a) whether to continue the program; and (b) if they were to continue it, what needed to be done to reinvigorate it.

The audience(s) for the evaluation:

• The person in charge of hiring the teachers and running the program;

• Members of the board of directors of the non-profit organization.

What information the audience needs for making decisions:

• Why were fewer parents enrolling their children in the program?

• What was the quality of the training (e.g., curriculum, materials, delivery by the instructors)?

• Were there any internal problems with logistics or communications that they should be aware of?

• What did the youth who attended like or not like about the program?

• Were other activities available that might be more appealing to parents and/or youth?

Tools and methods for obtaining this information:

• A questionnaire sent home to parents asking why they enrolled their children in the program, what they were hearing about it from their children, and what they thought their children were getting out of it;

• Semi-structured interviews with a sample of students (e.g., why they were enrolled in this program; what they liked/didn't like about it; what they thought of the teachers and the materials; what would make it more interesting);

• An open-ended interview with the teachers to determine what they liked about the program, what most frustrated them, their assessment of their own strengths and weaknesses, and what they would like to do to revitalize the program;

• Observation of several classes to assess the nature of interactions among students and teachers and the extent to which students were engaged;

• Assessment of the internal management of the program (e.g., review of records, interviews with teachers and board members to identify how effectively the program is being carried out);

• Review of materials used in the program;

• Review of literature in the field to identify other options for carrying out similar after-school programs.

Who carries out the evaluation:

• An outside evaluator selected by the board, who happens to be a member of the community, working in collaboration with a board member who has evaluation experience. As a member of the community, the outside evaluator offers to do the evaluation pro bono, as does the board member.

Key evaluation findings:

• Classroom observation showed that teachers seemed bored and students were not challenged;

• Review of materials showed they weren't attractively presented;

• Interviews with students identified an after-school sports program at a nearby community center, meeting at the same time, that many of them would prefer to attend;

• The review of records and interviews with the teachers and the board identified some serious logistical problems delaying the arrival of necessary materials. This delay was, in turn, having an adverse effect on the teachers' ability to deliver the program effectively;

• The interview with the teachers surfaced another promising after-school program in which, instead of staying in the classroom, students and teachers go out into the community, identify human rights issues, and take action to address them.

Use of evaluation information in the decision making process:

• Based on the evaluation findings, the board decided to replace the existing program with the alternative, more activist program the following school year and to conduct a similar assessment at the end of that year.



Scenario Two: Expanding a School-Based Human Rights Education Program

Three elementary school teachers attended a two-day human rights education course offered by Amnesty International. The teachers came back from the course and enthusiastically started applying what they had learned in their classrooms. Seeing the enthusiasm of the three teachers, other teachers at the school also expressed an interest in participating. The teachers persuaded the principal to look for funds that would enable them to expand the program within their school with the idea that, once implemented on a school-wide basis, their school could serve as a model for the school district. The teachers and the principal decided to apply to the county board of education and the school's Parent Teacher Association (PTA) for funding. However, they were told that they must first show that the program is worth supporting. With limited funds, they realize they have to find a creative way to carry out an evaluation.

The audiences for the evaluation:

• The teachers who took the course and are implementing what they learned, who want feedback on their own performance;

• Other teachers, who would like to be further convinced of the value of participating in the program;

• The school principal, who also wants to support the program;

• The county board of education and the PTA, who are possible funders.

What information the audience needs for making decisions:

• Exactly what is required of the teachers who participate in this program?

• What kind of support will the principal and others require to administer the program?

• Do the students seem to like it? What do the parents think of it?

• Are the students learning about human rights?

• Are there any changes in classroom behavior on the part of the students and the teachers?

• How much is this program going to cost to start and maintain?

Tools and methods for obtaining this information:

• Review of the course training module and materials;

• Classroom observation of the teachers applying what they learned with their students;

• Semi-structured interviews with all three teachers and a sample of students to find out what they liked and didn't like about the program and what changes they have perceived in themselves or the classroom as a result of participating in the program;

• A written examination to assess teacher and student knowledge of human rights;

• Visits to two other schools in nearby cities that are implementing the curriculum to find out about their experiences, management issues, and costs.

Who carries out the evaluation:

• Through Amnesty International, the school was able to identify an outside evaluator with experience in human rights who was willing to carry out the evaluation at a reduced fee.

Key evaluation findings:

• Data (scores on written examination, self-reports, classroom observations) suggest that teachers, students, and parents like the program and that it is having a positive impact on the students and the classroom atmosphere;

• A review of management implications suggests that the program's administration and management requirements are reasonable for the school;

• Cost data show minimal costs per student.

Use of evaluation information in the decision making process:

• The principal submitted the evaluation report along with his proposal for expanding the program school-wide to both the PTA and the school board;

• The school board decided, based on both the proposal and the evaluation data, to fund the program on a trial basis for two years.



Scenario Three: A Human Rights Approach to Domestic Violence

The Governor's task force on domestic violence would like to introduce human rights into its program, which helps communities and victims understand the dimensions of domestic violence and take action to address it. The task force has identified a program being used overseas that has the elements they are looking for and would like to adapt it to their needs. To do this, they must get help from the overseas organization. They would then like to pilot the program in several communities in their state and, if it is successful, expand it for use throughout the state. The task force has a small amount of funding for the program; they also have access to funding for both the pilot and its evaluation through several grants. As part of the pilot, and before expanding it, they need to know: (a) whether the program is effective in achieving its objectives; (b) how it is received in the pilot communities; and (c) what the cost implications are of adopting the program.

The audiences for the evaluation:

• Members of the Governor's task force;

• Program participants;

• Potential funders of an expanded program.

What information the audiences need for making decisions:

• How does the program complement what is already being done to address domestic violence?

• Is the program worth the investment of time and human resources?

• What do people (e.g., community activists, victims of abuse) think of the program?

• What are the costs?

• What are the preliminary data on program impacts in terms of addressing and reducing issues of domestic violence?

Tools and methods for obtaining this information:

• Observation of the training course and review of materials for quality and relevance;

• Semi-structured interviews with course participants (e.g., community activists, victims of domestic violence) to assess what they thought of the training methodology and materials;

• Follow up through semi-structured interviews with people trained (e.g., after one month, six months, one year) in order to document what they are doing with their training and what impacts they believe the training had on them;

• Open-ended interviews with key individuals in the communities (e.g., workers in domestic violence shelters, hospitals) to identify whether they have seen any changes as a result of the program;

• A detailed review of costs for training and follow-up.

Who carries out the evaluation:

• An outside evaluator with experience using qualitative methodologies.

Key evaluation findings:

• Both community activists and victims of abuse like the training methodology, although a number of the materials need to be adapted for use in their community;

• The evaluation follow-up shows that both the community activists and victims of abuse, based on what they learned, are taking action in their homes and in their communities to curb abuse;

• There are initial positive indications (e.g., reductions in physical and psychological abuse) where abuse victims have acquired tools they can use to defend themselves;

• A detailed cost study shows that, while costs are appreciable given the program's significant follow-up component, the benefits in reduced domestic violence outweigh the costs.