Part 4: EVALUATION GUIDELINES

Smart Toolkit for Evaluating Information Projects, Products and Services

This Part features case studies of evaluations of selected information projects, products and services. The information activities covered are:

- Training course
- Newsletter
- Website
- Question-and-answer service
- Small library / resource centre
- Online community
- Rural radio
- Database
- Selective dissemination of information service

In each case we describe the activity and its concept and objectives, and look at how to prepare, implement and report an evaluation of the activity. We include guidelines on compiling logical frameworks and logic models, determining stakeholder participation in the evaluation, defining the evaluation focus and indicators, collecting and analysing data, and communicating the evaluation results.

TRAINING COURSE

These guidelines relate to evaluating a training course, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of 'critical reflection' is discussed towards the end of the guidelines, but we advise you and your evaluation team to 'reflect' on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6).
The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here.

Before evaluating a training course, you need to be clear about:

- what the training course is about
- its concept(s) and objectives
- who the primary and secondary stakeholders are
- how to go about identifying the needs of the primary and secondary stakeholders
- what the focus of the evaluation is, the questions to ask and the indicators to use
- how to collect the data and analyse them
- how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is a training course?

A training course is a single event or process that is designed to improve the participants' skills, knowledge and/or attitudes. It can be a workshop, a course lasting a few hours, or one ranging from several days to several weeks. It can also consist of a series of training events spread out over time. It could be a face-to-face or online activity. Whether the course is a one-off event or a series of events on the same subject, the main focus of any evaluation will be on learning lessons for the organisation of future courses and how they can be improved.

Determining the course concept and objectives

You can't evaluate the training course unless you are clear what it is about. The concept (idea) behind the course and the objectives it seeks to achieve need to be clearly stated, otherwise you can't compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process.
In determining the concept of the training course, you should ask these questions:

- Why was the training course developed?
- What were the main objectives (expected results) and problems to be addressed at the time?
- What is the goal of the training course?
- What is the purpose of the training course?
- What are the core values of the training course?
- Who are the targeted trainees?

In evaluating the course, you need to ask yourself key questions related to the concept, such as:

- Do the expected results achieved reflect its core values and approach?
- Do they reflect the project purpose and contribute to the goal?
- Does the course address the main objectives and problems it was supposed to?
- Are the correct messages being conveyed?

Box 4.1 Sample of the concept behind a training course for agricultural extension workers

Main problem: Agricultural extension officers are not effective enough because they use a top-down approach in transferring knowledge to farmers. Experiences in other countries show that a more participatory approach leads to better results, in which farmers are better able to apply the lessons learnt.

Target group: The targeted participants are agricultural extension officers from both government and non-governmental organisations.

Goal: To contribute to improved agricultural extension services.

Purpose: Effectiveness and efficiency of extension officers increased.

Expected results: Extension officers trained in participatory techniques for assisting farmers. (A specific challenge is to overcome the attitude of extension officers seeing themselves as 'the experts' telling farmers what to do, rather than seeing farmers as partners solving the problem together.)

Main approach: A participatory training approach is used to develop the training skills of the extension officers. The course uses the practical situations in which the participants find themselves.
Core values: Relevance, effectiveness, usability, impact.

The logical framework is a useful tool that can help you clarify the training course objectives (see Part 3, pages 68-83). Commonly known as the 'logframe', it helps to summarise a project in a logical sequence. If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a training course is given in Table 4.1.

If the training course is not seen as a project, but as an ongoing activity, the logic model (see Part 3, pages 103-105) might be the more appropriate tool to use. It gives an overview of activities associated with the course, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a training course is given in Table 4.2.

Table 4.1 Logical framework for a training course for extension officers

GOAL
- Intervention logic: To contribute to improved agricultural extension services

COURSE PURPOSE
- Intervention logic: The effectiveness and efficiency of extension officers improved
- Objectively verifiable indicators: After 2 years, 70 extension officers apply participatory techniques
- Means of verification: Follow-up reports; survey
- Assumptions: Applying participatory techniques contributes to improved farm practices

EXPECTED RESULTS
- Intervention logic: 1. Awareness created among participants; 2. Training programme and materials developed; 3. Extension officers trained
- Objectively verifiable indicators: Feedback from participants saying they are aware of the need to change their approach; programme and materials used for the training; in 2 years' time 100 extension officers trained
- Means of verification: Participants are able to give examples of improved techniques; archives (documentation on training materials); no. of extension officers trained
- Assumptions: Extension officers willing to apply new techniques

ACTIVITIES (sometimes a summary of resources/means and of costs/budget is provided in these boxes; the assumptions state what must hold true, if the activities are completed, to deliver the expected results)
- 1.1 Identify examples of similar courses
- 1.2 Organise meeting with extension officers to identify needs and interest
- 1.3 Organise an exchange visit to extension staff elsewhere
- 2.1 Identify priority subjects
- 2.2 Identify resource persons
- 2.3 Develop training material and exercises for each subject
- 3.1 Organise a pilot training course
- 3.2 Improve the training course
- 3.3 Run the course on a regular basis
- 3.4 Organise follow-up

INPUTS
- Project funding from donor agency; core funding from Ministry (list resources available)

Table 4.2 Logic model for a training course for extension officers
(for each focus area: resources; activities; outputs, i.e. indicators of results; outcomes, i.e. indicators of effects; and, where given, impact)

Planning and development

1. Training needs identification
- Resources: Open-minded staff; transport, postage, photocopies; willingness of respondents to participate
- Activities: Designing and administering questionnaires and interviews; conducting meetings; analysing data and drawing conclusions
- Outputs: No. of questionnaires administered; no. of interviews conducted; no. of meetings conducted
- Outcomes: No. of training sessions

2. Involving stakeholders and partners
- Resources: External-oriented staff; transport; willingness of partners
- Activities: Calls, visits and meetings with external partners and trainers; developing agreements
- Outputs: No. of calls/visits made and meetings organised; no. of new external experts involved
- Outcomes: No. of sessions dealt with by external experts; no. of participants referred by partners

3. Training programme development
- Resources: Dedicated staff
- Activities: Programme formulation; session design
- Outputs: Training programme; text for training brochure; session outlines
- Outcomes: No. of relevant sessions increased; participant satisfaction with content increased

4. Developing training materials
- Resources: Own resources (books, reports, databases); Internet databases; expertise/documentation of partners
- Activities: Identifying and selecting information resources; developing background material; developing exercises
- Outputs: Background materials and exercises
- Outcomes: Participants satisfied with training materials

5. Logistical arrangements
- Resources: Dedicated staff; financial resources; communication and copying equipment
- Activities: Arrangements with venue; support to transport arrangements for participants; arranging for training equipment
- Outputs: Reservation of venue; transport arrangements; training material ready; course venue and equipment
- Outcomes: Level of satisfaction with venue, transport and equipment increased

6. Budgeting and financing
- Resources: Capable staff; support of management and donor agencies; organisational and donor financial guidelines
- Activities: Making a financial plan; specifying activities; calculating the budget
- Outputs: Clear financial proposal; clear financial report
- Outcomes: Funds available; expenditure within budget limits; staff and management agree on financial priorities

Implementation/operations

7. Promotion and invitation of participants
- Resources: Motivated, skilled staff; promotional materials (website, brochures, invitation letters); relationship with media
- Activities: Posters/brochures/invitation letters produced and distributed; meetings organised; websites maintained; media informed
- Outputs: No. of brochures/letters distributed; no. of meetings and participants; no. of website visitors; no. of press releases
- Outcomes: % of potential participants aware of the service increases; no. of partners (experts involved) increases; reduced % of questions asked outside the scope of the course
- Impact: Increase in number of inquiries/registrations on the programme

8. Delivery of the training programme
- Resources: Motivated, skilled staff; training venue and equipment; training materials; contracts with experts
- Activities: Conducting the training programme; monitoring and evaluation
- Outputs: No. of participants; no. of training courses conducted; no. of training sessions
- Outcomes: % of participants following the training; % of participants satisfied
- Impact: % of users able to name participatory techniques to help apply lessons learned; % of farmers able to improve agricultural production as a result of the course

Monitoring and evaluation

9. Monitoring and evaluation
- Resources: Capable staff; clear procedures; adequate registration; indicators of quality, effectiveness, efficiency
- Activities: Defining indicators; designing report formats; organising data collection; generating statistics and reports
- Outputs: Clear evaluation reports produced per training course, per 3 months and per year
- Outcomes: Improved effectiveness and efficiency of the training course

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.
The range of stakeholders in a training course could include:

- trainees/course participants – the primary stakeholders, e.g., researchers, extension officers, community workers, staff from NGOs, local government and companies
- resource persons, including trainers
- the institution organising the course and the individuals managing it
- collaborating institutions (e.g., those providing the trainees, resource persons, research findings, case studies)
- funding agencies (financiers, donor organisations, banks, student award schemes, research institutions)
- government (providing support and, where appropriate, counterpart funding)

One way of identifying your primary and secondary stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the training course and how to include them in the evaluation. Table 4.3 provides an example of a stakeholder analysis.

Table 4.3 Stakeholder analysis for a training course for agricultural extension officers

Trainees (extension officers)
- Benefits from the project: Improved skills
- Contributions/sacrifices: Time, change of approach
- Influence on the project: Through needs assessment / evaluation
- Potential involvement in the evaluation: Represented on the evaluation team; involved in analysis and decisions on actions for change; an information source; informed about main findings

Managers of extension agencies
- Benefits from the project: New approach to show
- Contributions/sacrifices: Time to convince staff of training needs
- Influence on the project: Sending staff for training
- Potential involvement in the evaluation: Being informed about and reacting to findings; represented on the team

Resource persons (including trainers)
- Benefits from the project: Extra business
- Contributions/sacrifices: Time, expertise
- Influence on the project: On developing and implementing the programme
- Potential involvement in the evaluation: Represented on the team; involved in analysis and decisions on actions for change; valuable source of information

Collaborating agencies
- Benefits from the project: Contact with and feedback from trainees
- Contributions/sacrifices: Time and resources
- Influence on the project: Through resource contribution
- Potential involvement in the evaluation: Being informed of and reacting to findings

Managers of the training institution
- Benefits from the project: New product to show to others
- Contributions/sacrifices: Time to develop concept / convince funding agency
- Influence on the project: On project concept and planning
- Potential involvement in the evaluation: Represented on the team; drawing up terms of reference; decisions on findings

Funding agency
- Benefits from the project: Results to show to their clientele
- Contributions/sacrifices: Finance and time
- Influence on the project: On project concept and planning
- Potential involvement in the evaluation: Terms of reference; decision on continuing project financing

When considering the stakeholders' interests in the course, be aware of the kind of questions they would like to have answered. Table 4.4 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. However, the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops, and questionnaires with individuals or groups of stakeholders.

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations.
It is not possible to cover all elements of a training course extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation based on stakeholders' key questions and the time and cost of accessing data. Choosing the focus will determine the key questions (areas) of the evaluation, taking into account its objectives (stated in the terms of reference) and the concerns of the key stakeholders.

Table 4.4 Key questions stakeholders could be interested in
(for each question, the potential use of the answer is given in brackets)

Trainees (extension officers)
- How effective is the new approach? (apply to attend the course)
- How good is the training institution? (recommend the course to others)

Managers of extension agencies
- How effective is the new approach? (send more extension officers)
- How effective is the training? (change training strategy)
- How good is the training agency? (change training institution)

Resource persons (including trainers)
- How can we improve the training course? (adjust training course)

Collaborating agencies
- Do we benefit enough, considering our contribution? (reduce/increase contribution)

Managers of the training institution
- How effective is the training course? (change training strategy)
- How effective are the trainers? (change trainers)

Funding agency
- How effective is the training course? (improve training course)
- How effective is the training institution? (improve training institution)
- How sustainable is the training course? (decide on continuing to provide funds)

To define the focus you can use the model developed by Donald Kirkpatrick (1994), which has four levels of measurement of effectiveness and impact.

Table 4.5 Applying Kirkpatrick's model to the evaluation of a training course

4. Impact – whether the course improved trainees' performance. Measure six months after the course, using impact stories and focus group discussions.
3. Transfer/adoption – whether the training resulted in a change in trainees' behaviour. Measure a few months after the course, using questionnaires and observation.
2. Learning – how much 'learning' took place. Measure pre- and post-course, using tests, interviews, questionnaires and focus groups.
1. Satisfaction – how satisfied the trainees were with the course. Measure at the end of the course, using a short questionnaire.

To use Kirkpatrick's model, start from the bottom (1. Satisfaction) and work upwards (4. Impact).

The levels of 'satisfaction', 'learning' and 'transfer/adoption' can also be considered as part of the evaluation criterion of effectiveness, each level representing an increasingly precise measure of effectiveness, and requiring more time-consuming data collection and more rigorous analysis. Apart from these four levels, it is also useful to consider other evaluation criteria, such as relevance, efficiency and sustainability.

Table 4.6 provides an example of focus, key questions and indicators for a training course. You can use the information here to develop a more specific set of questions relevant to the focus of the course. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. With reference to Table 4.6, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the training course, or on the criteria of usability and accessibility because of their importance to the users. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the training course.
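Most of the indicators discussed here are simple percentages computed over trainee records. If you keep your evaluation data in electronic form, a short script can produce such figures directly. The sketch below is purely illustrative: the field names and data are invented, and would need to match however your own evaluation forms and test results are recorded.

```python
# Hypothetical sketch: computing percentage indicators (e.g., % of
# trainees satisfied with course content, % passing a final test,
# % applying skills learned) from trainee records. Field names and
# data are invented for illustration only.

def percentage(records, field):
    """% of records in which the given yes/no field is True."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if r[field]) / len(records)

# One record per trainee, combining evaluation form and test result.
trainees = [
    {"satisfied_with_content": True,  "passed_test": True,  "applied_skills": True},
    {"satisfied_with_content": True,  "passed_test": False, "applied_skills": True},
    {"satisfied_with_content": False, "passed_test": True,  "applied_skills": False},
    {"satisfied_with_content": True,  "passed_test": True,  "applied_skills": True},
]

print(f"Satisfied with course content: {percentage(trainees, 'satisfied_with_content'):.0f}%")
print(f"Passed final test:             {percentage(trainees, 'passed_test'):.0f}%")
print(f"Applying skills learned:       {percentage(trainees, 'applied_skills'):.0f}%")
```

With real data, each yes/no field would be derived from a question on the evaluation form or follow-up questionnaire, so the same small routine can serve several of the indicators at once.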
Drawing up a matrix like Table 4.6 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your training course.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:

- that all relevant data will become available during the evaluation
- that no more data are collected than what is needed and can be analysed

Data collection methods for evaluating a training course could include the following:

- Training needs survey: This method would provide data on trainees and their training needs (see Box 4.2). The data would be useful for assessing the applicability of the course and defining areas of improvement, as well as for creating baseline data against which to measure the effectiveness and impact of the course.
- Course registration: This is an important instrument for collecting data from the trainees. These data could include name and address details, age, education and training, profession, organisation, position and course expectations.
- Tests: You can design an instrument to test the trainees' knowledge and skills at the end of the course. It can be either formal or informal. It can be used not only to measure their level of ability, but also to provide you with useful data, particularly if you also tested them at the start of the course.
- Evaluation forms and group discussions: Evaluation forms for the trainees to complete at the end of the course provide interesting data on how the course was implemented. However, they tend to measure trainee satisfaction with different aspects of the course, rather than giving precise information on learning and application. Group discussions at the end of the course can provide a better understanding of why trainees were satisfied/dissatisfied with certain issues.
Table 4.6 Determining the focus, key questions and indicators for the evaluation of a training course, using some evaluation criteria as examples
(for each key question, the indicator is given, followed in brackets by the level of stakeholder interest and of data accessibility)

IMPACT
- Has application of skills, knowledge and attitudes improved trainees' performance? Indicator: % of trainees showing improved performance. (interest: high; accessibility: low)

RELEVANCE
- Did the training address a major need among trainees? Indicator: % of trainees and their managers indicating that the course addressed a major need. (high; medium)

ACCESSIBILITY
- Were trainees able to get onto the course easily? Indicator: % of trainees who experienced problems in applying, being accepted and getting time off to attend. (high; high)
- Were the venue and logistical arrangements satisfactory? Indicator: % of trainees satisfied with the venue and logistics. (medium; high)
- Were course fees set at a reasonable level for these trainees? Indicator: % of trainees satisfied with the costs. (high; high)
- Was the length of the course adequate? (this links to ease of access; if the course is too long, people might not get time off) Indicator: % of trainees satisfied with the duration of the course. (medium; high)
- Was the language of the course one that all could readily understand and communicate in? Indicator: % of trainees happy with the language used and style of language. (high; high)

USABILITY
- Were training materials easy to follow? Indicator: % of trainees who were able to follow materials when on the course, and after the course. (high; high)

EFFECTIVENESS
Outputs
- How many and what type of people participated? Indicator: no. of trainees trained, by category (age, education, etc.). (high; high)
- How many courses were run? Indicator: no. of courses. (high; high)
Satisfaction
- How satisfied were the trainees with the course content? Indicator: % of trainees satisfied with the course content. (high; high)
- Did the course meet trainees' expectations and learning needs? Indicator: % of trainees whose needs were met. (high; high)
- Were trainees satisfied with the training methodology? Indicator: % of trainees satisfied with the training approach. (medium; high)
- How satisfied were trainees with the resource persons, including trainers? Indicator: % of trainees satisfied with the skills and knowledge of resource persons. (medium; high)
Learning
- Did the course improve trainees' knowledge and skills? Indicators: % of trainees passing a final exam or test (high; medium); % of trainees who could identify a specific skill/knowledge item that they obtained (high; medium)
- Has the course changed trainees' attitudes in any way? Indicator: % of trainees demonstrating a change in attitude. (high; low)
Transfer
- Are trainees applying the knowledge/skills they obtained? Indicator: % of trainees saying they had applied the knowledge and skills learned. (high; high)
- Have trainees shared their knowledge? Indicator: % of trainees indicating knowledge sharing. (high; high)

EFFICIENCY
- Was the training worth the time and costs (e.g., in terms of absence from work)? Indicator: % of trainees who felt the training was worth the time and cost involved. (high; low)

SUSTAINABILITY
- Are the training institution and its financiers committed to continuing to run the course? Indicators: willingness of managers of the training institution and financing agency to continue support for the course; indications in their strategic plans; budget allocations. (high; high)

Box 4.2 Example of conducting a training needs survey

A training institution ran a wide range of training courses. They found it useful to carry out needs surveys to determine whether their courses filled a gap and met demand. They surveyed the trainees at various stages:

- before the course took place, to see what the trainees expected from the course
- on completion of the course, to see if the trainees' expectations had been met
- after 3–6 months, to see the extent to which the trainees had been able to apply their new knowledge in practice; after several months, trainees are usually able to give a more realistic picture of the usefulness and effectiveness of the course (bear in mind that conducting a survey at this stage does require more resources)

- Follow up: This can take the form of questionnaires, interviews and observations. Questionnaires can generate data on the extent to which the trainees were able to apply the knowledge and skills learned during the course in their daily work practice. Interviews (individual or group) are more time-intensive than questionnaires, and make it possible to get more detailed answers, providing insight into the application of new knowledge and skills and what is needed to improve the level of application. Observation can sometimes be done during the course, providing a very direct way of assessing the effectiveness of the course and the opportunity to correct mistakes on the spot.
- Impact study: To determine the impact of training, it is not only necessary to know that trainees applied the lessons learnt, but also to assess whether there has been a positive change in their work performance. This requires defining performance indicators related to the training course (e.g., if the training includes communication skills aimed at improving the way extension officers offer advice to farmers, it would be necessary to assess whether or not farmers are more satisfied with the advice given).

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.8 will help you to start thinking in this way.
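One practical way to think ahead about the analysis is to prototype the tables you expect to produce. The sketch below builds a satisfaction-by-category cross-tabulation of the general kind an analysis design might call for; the trainee categories, field names and scores are all invented for illustration.

```python
# Hypothetical sketch: average satisfaction score per trainee category,
# the kind of cross-tabulation an analysis design might specify.
# Categories, field names and scores are invented for illustration only.

from collections import defaultdict

# Scores on a 1-5 scale (1 = very dissatisfied, 5 = very satisfied).
responses = [
    {"category": "government extension", "satisfaction": 4},
    {"category": "government extension", "satisfaction": 5},
    {"category": "NGO extension",        "satisfaction": 3},
    {"category": "NGO extension",        "satisfaction": 2},
]

def mean_satisfaction_by_category(records):
    """Group satisfaction scores by trainee category and average them."""
    scores = defaultdict(list)
    for r in records:
        scores[r["category"]].append(r["satisfaction"])
    return {cat: sum(vals) / len(vals) for cat, vals in scores.items()}

for cat, avg in sorted(mean_satisfaction_by_category(responses).items()):
    print(f"{cat}: average satisfaction {avg:.1f} / 5")
```

A markedly lower average for one category of trainees would suggest that the course suits some groups better than others, pointing directly to the 'categories to give more attention to' that the analysis design aims to identify.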
For example, if you want to know what trainees think about the course, you should already be thinking about ways to interview them, individually or in groups. In designing the data analysis for a training course evaluation, you should ask the following questions:

- Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of trainees, a graph showing development of trainees over time, or specific analytical tools such as a problem tree or a SWOT analysis)
- Which collection methods will provide the necessary data for each of the tools?
- What type of observations do you expect to make for each tool?
- What type of conclusions do you expect to reach using these tools?

Table 4.7 Example of data collection methods for evaluating a training course
(for each method, the information source is given, then the key issues/questions, with the answers expected and concerns in brackets)

Desk study
- Source: Registration forms / course reports
  - No. and background of trainees (no., age, education, experience)
  - No. of training sessions
  - Trainee satisfaction with the course, materials, methodology, resource persons, timing, venue and logistics (very satisfied – very dissatisfied)
  - Did the training course meet trainees' expectations and learning needs? (fully, partly, not at all)
  - Were there any topics that should have been included but were not? (topics and how often mentioned)
  - Were the course fees set at a reasonable level? (very reasonable – unreasonable)
  - Did the course improve trainees' knowledge? (very much – not at all)
  - Did the course improve their skills? (very much – not at all)
  - Has the course changed their behaviour and/or opinions? (very much – not at all)
- Source: Accounts department
  - What is the cost per trainee? (average per participant)
- Source: Similar case studies (internet)
  - What alternative approaches to the training course are there? (compare with on-the-job training)
- Source: Project documents
  - What were the original objectives and indicators? (potential deviations and different insights)

Questionnaire
- Source: Follow-up questionnaire to all participants
  - Are trainees using the skills obtained? (ways trainees use the skills)
  - Has the course changed trainees' opinions?
  - Do they see improvement of their performance?
  - Have they applied their new skills to change their project or organisation? (hindrances encountered)
  - Have they trained other people in the new skills? (suggestions)

Interviews
- Source: Randomly selected trainees
  - Are the trainees using the skills obtained? (ways trainees use the skills)
  - Has the course changed their behaviour and/or opinions?
  - Do they see an improvement in their performance? (hindrances to applying skills)
  - Have they applied their new skills to change their project or organisation? (suggestions)
  - Have they trained other people in the new skills?

Observations
- Source: Randomly selected trainees
  - Are the trainees using the skills obtained? (type of skills used/not used and reasons)

Workshop
- Source: Selected trainees
  - Strengths, weaknesses, opportunities and threats of extension work (avoid wish list)
  - Contribution of present training course
  - New training opportunities and needs (verify that needs are real needs)

Once you have designed the data analysis, it is important to see if your data collection includes all elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data. Explain why you were unable to gather certain data.

Table 4.8 Example of a data analysis design for a training course evaluation

Table 1. Information on the participants and the course (quantitative results)
- Results from: desk study
- Observations to make: categories of participants that are over/under-represented
- Conclusions to reach: categories to give more attention to

Table 2. Participant satisfaction
- Results from: desk study; questionnaires; interviews
- Observations to make: elements of high and low satisfaction
- Conclusions to reach: priorities for improvement

Table 3. Participant learning and application: areas of application
- Results from: questionnaires; interviews; observations
- Observations to make: areas of good/weak application
- Conclusions to reach: areas for improvement

Table 4. Usability of the course
- Results from: questionnaires; interviews; observations
- Observations to make: range of subjects covered adequately/inadequately
- Conclusions to reach: areas for improvement; areas that are irrelevant

Figure 1. Participants' application: influencing factors (e.g., problem tree)
- Results from: questionnaires; interviews; observations; workshop
- Observations to make: hindrances to application of techniques learned
- Conclusions to reach: areas for improvement

Table 5. Cost-effectiveness
- Results from: data from accounts
- Observations to make: costs per element of the training programme
- Conclusions to reach: potential for reducing costs

Table 6. Impact stories
- Results from: interviews; observations; data from extension agency
- Observations to make: impact examples; indication of increase/decrease in consultation
- Conclusions to reach: improvement and continuation of the programme

Table 7. New training opportunities
- Results from: workshop
- Observations to make: strengths and weaknesses of the present training programme; new training needs
- Conclusions to reach: improvement of the programme; new potential training programmes to develop and offer

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process, to ensure that conclusions and recommendations from the evaluation are well understood and supported. In designing this strategy, you need to consider these issues:

- Which communication methods will suit which stakeholders, especially the primary stakeholders?
- What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

An effective communication plan for critical reflection during an evaluation, and for reporting findings during and after the evaluation, helps to create common understanding among stakeholders, providing a good basis for implementing recommendations.
If your stakeholders don’t accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
- Do the stakeholders share the same views on the problems with the course?
- Do they have the same views on the solution(s)?
- Are they prepared to support the same solution(s)?
- What are the obstacles that will prevent them from implementing the solution(s) proposed?
- What can be done to address the obstacles they face?
There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, a memorandum). If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement it because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Table 4.9 Example of a communication plan for reporting the findings from a training course evaluation
COMMUNICATION METHOD | STAKEHOLDERS TO REPORT TO | MAIN ISSUES TO DISCUSS / REPORT ON
Critical reflection meeting on draft report | Resource persons, extension agency, management | Adequacy of training course
Full report, including executive summary | Funding agency; training agency; extension agency | Quantitative outputs; trainee satisfaction; learning and application; impact indication; cost-effectiveness; new or potential courses
Personal meeting | Management of training agency | Report recommendations; feedback on individual trainers
Article | Professionals active in similar field of training and extension | Evaluation approach and results
Extension brochure | Farmers | Examples of new extension approach
Workshops / field days | Farmers | Feedback and discussion of findings
Training brochure | Extension officers | Training contents; effects of the training; examples of participants

Box 4.3 Evaluating a training course: guidelines checklist
These guidelines on evaluating a training course are covered above:
- Course concept and objectives
- Data needs and stakeholder participation
- Evaluation focus, questions and indicators
- Data collection
- Data analysis
- The communication plan
For more on:
- data analysis, see Part 2, page 60 and Part 3, pages 157-162
- data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
- evaluation communication and follow-up, see Part 3, pages 163-173
- evaluation criteria (scope), see Part 2, pages 33-34
- indicators, see Part 3, pages 91-98
- logframe, see Part 3, pages 68-83
- logic model, see Part 3, pages 103-105
- stakeholder participation, see Part 1, pages 3-5
- terms of reference, see Part 2, pages 32-35

NEWSLETTER

These guidelines relate to evaluating a newsletter, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of ‘critical reflection’ is discussed towards the end of the guidelines, but we advise you and your evaluation team to ‘reflect’ on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation. (See Part 1, pages 5-6).

The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you’re not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here. Before evaluating a newsletter, you need to be clear about:
- what the main elements of the newsletter are
- its concept(s) and objectives
- who the primary and secondary stakeholders are
- how to go about identifying the needs of primary and secondary stakeholders
- what the focus of the evaluation is, the questions to ask and the indicators to use
- how to collect the data and analyse them
- how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is a newsletter?

A newsletter is a periodical publication with at least two issues per year. It usually contains news and announcements and is focused on a particular subject. Generally, it will have a limited circulation. Newsletters are often used to disseminate information and knowledge. They may be distributed by e-mail or by post.
Material for newsletters can be drawn from various sources. A newsletter usually has a defined target group. For example, the readers of a newsletter may be a professional audience of policy-makers, managers, intermediaries or a mixture of these. Other newsletters are targeted at institutions or at a more grassroots audience, such as farmers.

Determining the newsletter concept and objectives

You can’t evaluate your newsletter unless you are clear what it is about. The concept (idea) behind the newsletter and the objectives it seeks to achieve need to be clearly stated, otherwise you can’t compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process.

In determining a newsletter concept, you should ask these questions:
- Why was the newsletter launched?
- What were the main objectives (expected results) and problems at the time?
- What is the goal of the newsletter?
- What is the purpose of the newsletter?
- What are its core values?
- Who are the targeted readers?
- How have the newsletter format, language, style and content been determined, bearing in mind the target group?
In evaluating the newsletter, you need to ask yourself key questions related to the concept, such as:
- Do the expected results achieved reflect its core values and approach?
- Do they reflect the project purpose and contribute to the goal?
- Does the newsletter address the main objectives and problems it was supposed to?
- Are the correct messages being conveyed?

Box 4.4 Example of a concept behind a newsletter for farmers
Main problem: Extension officers from the Ministry of Agriculture have been using the farmers’ newsletter to promote improved agricultural production techniques.
The chief extension officer is interested in finding out if the newsletter is meeting the needs of the farmers, as well as determining what more needs to be done to increase its distribution.
Target group: Farmers in the region targeted.
Goal: To contribute to agricultural production and development.
Project purpose: Farmers provided with a newsletter containing appropriate, relevant and timely information.
Expected results: Improved newsletter content and format, and increased distribution and impact among farmers.
Main approach: Meet with farmers’ groups and on a one-to-one basis during field visits to get feedback and promote the newsletter.
Core values: Relevance, effectiveness, impact, sustainability.

The logical framework can help you clarify the newsletter objectives (see pages 68-83). Commonly known as the ‘logframe’, it helps to summarise a project in a logical sequence. If a project plan lacks a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a newsletter is given in Table 4.10. If the newsletter is not seen as a project, but as an ongoing activity, the logic model (see pages 103-108) might be the more appropriate tool to use. It provides an overview of activities associated with the newsletter, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a newsletter is given in Table 4.11. In using this framework, the key questions for checking whether the newsletter is meeting its stated objectives over time become evident.
For example:
- Are readers satisfied with the newsletter?
- Does the newsletter content reflect its objectives?
- Has its readership increased?

Table 4.10 Logical framework for a farmers’ newsletter
INTERVENTION LOGIC | OBJECTIVELY VERIFIABLE INDICATORS | MEANS OF VERIFICATION | ASSUMPTIONS
GOAL: To contribute to improved agricultural production | | |
NEWSLETTER PURPOSE: Newsletter on agricultural techniques for farmers developed, promoted and disseminated | After 2 years the number of readers doubles | Routine and existing records | Newsletter contributes directly to agricultural production
EXPECTED RESULTS: 1. Newsletter structure and design developed; 2. Improved newsletter content; 3. Improved promotion and distribution of newsletter | Feedback from readers (farmers); demand for newsletter increases by 100%; distribution of newsletter increased | Survey; letters to the editor; subscription rates; routine records showing distribution rates | The newsletter meets an important need
ACTIVITIES: 1.1 Meet staff and selected stakeholders to discuss newsletter objectives, design and content; brainstorm to formulate questionnaire for feedback and strategy to obtain feedback from target group. 1.2 Identify priority topics through questionnaires and meetings with key stakeholders and test newsletter design and content with target group(s). 1.3 Develop plan with stakeholders for the various issues for the year. 2.1 Research to produce the articles for the newsletter in line with identified topics. 2.2 Identify and liaise with contributors to the newsletter. 2.3 Publish newsletter. 3.1 Meet with staff and key stakeholders (publishers and extension staff) to develop promotion and distribution strategy for the newsletter. 3.2 Implement strategy | (sometimes a summary of resources/means is provided in this box) | (sometimes a summary of costs/budget is provided in this box) | (if the activities are completed, what assumptions must hold true to deliver the expected results)
INPUTS: Project funding from donor agency and core funding from the Ministry of Agriculture | List resources available | |

Table 4.11 Logic model for a farmers’ newsletter
FOCUS | RESOURCES | ACTIVITIES | OUTPUTS (indicators of results) | OUTCOMES (indicators of effects) | IMPACT INDICATORS

Planning and development
Laying the groundwork for the newsletter | Staff, room facilities | Meeting to discuss the objectives/ideas for the newsletter and identify key stakeholders | List of objectives/ideas, list of initial key stakeholders | Draft newsletter concept and selection of which stakeholders to involve in the process |
Involving stakeholders | Time needed for representatives from the Ministry of Agriculture, key stakeholder institutions and farmers’ organisations to meet; stationery, transport, postage, fax, telephone, computer, printer | Brainstorm meetings to design newsletter and questionnaires for key stakeholder feedback, and identify other stakeholders | Newsletter structure and design developed; questionnaires distributed to stakeholders | Changes made to structure and design, and topics identified |
Process development | Time and willingness of staff and stakeholders to meet | Develop plan for the year for the various issues | List of issues and topics to be addressed by the newsletter over the year | Operational plan for the year for the newsletter |
Funds | Funds from donor agency, Ministry of Agriculture and research institutes | Budget prepared for the preparation, publication, promotion and distribution of the newsletter | Budget; financial report | Agreement on funds available for newsletter |

Implementation/operations
Compiling the newsletter | Capable staff to research and produce materials for the newsletter; key contributors from partner institutions or independent actors | Research to produce materials for the newsletter in line with feedback from questionnaires | Articles submitted for possible inclusion in the newsletter | No. of articles addressing the needs of farmers |
Publishing the newsletter | Staff to prepare newsletter for publication; publishing skills and software | Edit, layout, typeset and publish newsletter | No. of newsletters published | Knowledge gained consequent on the newsletter | % farmers changing their practices
Promotional activities | Time of staff and key stakeholders | Meetings to develop promotion and distribution strategy | Strategy developed | % change in farmers receiving the newsletter | Increased requests for newsletters

Monitoring and evaluation
Monitoring and evaluation | Time for staff to implement clear procedures to record routine data; time for staff to input routine data | Determine indicators; identify data collection methods; design reporting formats to collect and analyse data | Development of a database to support M&E | Improved newsletter (in terms of effectiveness and efficiency) meeting farmers’ needs |

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.
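If you keep the stakeholder mapping as structured data from the start, it becomes easy to check later who should sit on the evaluation team, who is mainly a data source, and who only needs to be informed. The sketch below is a minimal illustration in Python; the categories, fields and involvement labels are hypothetical examples modelled on the farmers’ newsletter case, not prescribed by the toolkit.

```python
# Illustrative sketch: a stakeholder analysis recorded as structured data
# so it can be filtered when planning the evaluation.
# All categories and involvement labels here are hypothetical examples.

stakeholders = [
    {"category": "Readers", "side": "demand",
     "benefit": "improved knowledge on agricultural production",
     "involvement": ["team member", "data source", "informed of findings"]},
    {"category": "Contributors", "side": "supply",
     "benefit": "improved newsletter content",
     "involvement": ["team member", "informed of findings"]},
    {"category": "Editors/publishers", "side": "production",
     "benefit": "improved outreach",
     "involvement": ["team member", "planning"]},
    {"category": "Funding agencies", "side": "resources",
     "benefit": "results to show their own stakeholders",
     "involvement": ["informed of findings"]},
]

# Who should be represented on the evaluation team?
team = [s["category"] for s in stakeholders if "team member" in s["involvement"]]
print(team)
```

The same list can be filtered again when drawing up the communication plan, so that every group marked ‘informed of findings’ actually receives the results.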
The range of stakeholders in a newsletter could include:
- demand side/primary stakeholders: individuals/institutions (paying and non-paying subscribers)
- supply side/contributors: writers and graphic designers
- production side: editors, printers, publishers, distributors
- resources: funding agencies and advertisers (financial), and staff (human)
The stakeholders will have a vested interest in the evaluation itself, and should be consulted and involved in some way. The input of the different groups will vary, depending on the particular situation. You should therefore take into account:
- their interests when designing the evaluation
- how they can actively contribute to the evaluation process
- how to include them as part of the evaluation team
- their role as a source of data (e.g., via questionnaires, interviews and workshops)
- their feedback on evaluation results
The users of the newsletter – the primary stakeholders – will be the people who subscribe to, read and use the newsletter, and (if applicable) the institutions they belong to. The newsletter producers need to have a clear idea who they are and who the secondary stakeholders are. One way of identifying these stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the newsletter, and how to include them in the evaluation. Table 4.12 provides an example of a stakeholder analysis. When considering the stakeholders’ interests in the newsletter, be aware of the kind of questions that they would like to have answered. Table 4.13 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them.
However, the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individuals or groups of stakeholders.

Table 4.12 Stakeholder analysis for a newsletter for farmers
STAKEHOLDER CATEGORY | BENEFITS FROM THE PROJECT | CONTRIBUTIONS / SACRIFICES | INFLUENCE ON THE PROJECT | POTENTIAL INVOLVEMENT IN THE EVALUATION
Readers | Improved knowledge on agricultural production | Time to read newsletter; cost of newsletter | Determine newsletter content through feedback to the editor | Represented on the team; involved in analysis and decisions on actions for change; an information source; informed about main findings
Contributors/partners | Improved newsletter content; better targeted and improved distribution | Time and technical input | Newsletter content improved, stronger network of contributors/partnership | Representation on the team; informed about main findings
Editors/staff/management/publishers | Improved newsletter; improved outreach; promotional effect | Time to develop newsletter and convince funding agency | Determine objectives, planning, management, implementation and distribution of newsletter | Representation on the team; to be involved in planning the evaluation; to be informed about the results
Funding agencies | Results to show to their stakeholders | Finance and time | Have input into objectives and plan, and through monitoring and evaluation | Informed about the evaluation and the results

Table 4.13 Key questions stakeholders could be interested in
STAKEHOLDER CATEGORY | QUESTIONS THEY ARE INTERESTED IN | POTENTIAL USE OF THE ANSWERS
Readers | Does the newsletter meet the needs of the readers? Is the target group aware of the newsletter? Is the newsletter timely? Is it satisfactory? | Decide whether to continue the newsletter; decide how to improve it
Contributors | Do contributors provide relevant and accurate information? Is there an incentive for the partners to promote the newsletter? | Look at ways of strengthening collaboration
Editors/staff/management/publishers | How effective and efficient is the newsletter? (e.g., Is it meeting the needs of the readers? Are the articles well written? Is it cost-effective? Can it be produced on a sustainable basis?) | Decide whether to continue the newsletter; decide how to improve it; look at ways of reducing costs
Funding agency | Does the newsletter make a difference to the quality of lives of the readers? How sustainable is it, given the available resources? Do staff have the capacity to produce a good-quality newsletter? | Determine whether funding will be made available to produce the newsletter

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of a newsletter extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders’ key questions and the time and cost of accessing data. The focus of the evaluation might stem from your desire to improve it, from low or declining circulation figures, from feedback from others or from a need to cut costs. There might be concerns about the performance of the newsletter in comparison with a competitor. You might want to assess current achievements so as to determine strategies for future publications. Other examples of areas of focus could include:
- a specific objective (e.g., providing information to extension officers)
- specific assessment criteria such as the newsletter’s effectiveness and impact
- a specific problem area (e.g., distribution)
- the primary stakeholders, or stakeholders voicing problems about the newsletter
- a combination of all of these
In evaluating a newsletter, it is useful to draw on the model developed by Donald Kirkpatrick (1994).
You can use it to measure the effectiveness and impact of the newsletter (see Table 4.14).

Table 4.14 Applying Kirkpatrick’s model to the evaluation of a newsletter
LEVEL | MEASURES | WHEN / HOW TO MEASURE
4. Impact | Has the newsletter helped to improve farming practices, etc.? | A year after dissemination of the newsletter, ask whether adoptions have led to any benefits (through impact stories and focus group discussions)
3. Transfer/adoption | What practices have farmers adopted; how has the newsletter changed their behaviour? | A few months later, go into the field to determine the level of adoption – use questionnaires and personal observations
2. Learning | To what extent do the farmers understand the content of the newsletter and the practices it promotes? | Before and after the dissemination of the newsletter, assess the situation in the field, through questionnaires, interviews and focus groups
1. Satisfaction | How satisfied are the farmers with the newsletter? | After dissemination of the newsletter, interview farmers receiving the newsletter
See also Table 4.5.

The levels of ‘satisfaction’, ‘learning’ and ‘transfer/adoption’ can also be considered as part of the evaluation criterion of effectiveness, each level representing an increasingly precise measure of effectiveness, and requiring more time-consuming data collection and more rigorous analysis. Apart from these four levels, it is also useful to consider other evaluation criteria, such as relevance, efficiency and sustainability. From Table 4.14 (under Measures), you can see how asking specific questions can help you to improve key areas of your newsletter. With each key question, you need to think about the indicators that will help you answer that question. Indicators also determine the type of data you will need to collect. It is also important to consider the interests of the stakeholders and the ease of access you have to the data you require in terms of time and cost.
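Each of the four Kirkpatrick levels ultimately reduces to a proportion of respondents once the questionnaire data are in. The sketch below is illustrative only: the response records and field names are invented for the example and are not part of the toolkit itself.

```python
# Hypothetical questionnaire responses, one dict per farmer.
# Keys loosely mirror Kirkpatrick's levels:
# satisfaction, learning, transfer/adoption, impact.
responses = [
    {"satisfied": True,  "understood": True,  "adopted": True,  "benefit": True},
    {"satisfied": True,  "understood": True,  "adopted": False, "benefit": False},
    {"satisfied": False, "understood": True,  "adopted": True,  "benefit": False},
    {"satisfied": True,  "understood": False, "adopted": False, "benefit": False},
]

def pct(key):
    """Percentage of respondents answering 'yes' for one indicator."""
    return round(100 * sum(r[key] for r in responses) / len(responses))

for level, key in [("1. Satisfaction", "satisfied"),
                   ("2. Learning", "understood"),
                   ("3. Transfer/adoption", "adopted"),
                   ("4. Impact", "benefit")]:
    print(level, pct(key), "%")
```

Note how the proportions typically fall as you move up the levels: fewer respondents report impact than report satisfaction, which is why the higher levels need more rigorous data collection.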
Table 4.15 provides an example of focus, key questions and indicators for a newsletter. You can use the information here to develop a more specific set of questions, relevant to the focus of the newsletter. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation.

Table 4.15 Determining the focus, key questions and indicators for the evaluation of a newsletter, using some evaluation criteria as examples
EVALUATION CRITERIA | FOCUS (SCOPE) | KEY QUESTIONS | INDICATORS | STAKEHOLDER INTEREST | DATA ACCESSIBILITY
IMPACT | Changed practices, policies, products at individual level | Have your attitudes or practices changed in any way? | % of readers who say that their own personal practices / policies / products have been changed by the knowledge gained from the newsletter | High | Medium
RELEVANCE | Design | Do the readers find the newsletter useful? | % of readers who find the newsletter useful; % of readers who wish more topics to be included | High | Medium
USABILITY | Readability | Is the language used by the newsletter at the right level? | % of readers who feel that the use of language and style are appropriate | High | Medium
EFFECTIVENESS | User characteristics / defining a profile of the user | Who and where are the readers of the newsletter? | % of readers in different professional groups; % of readers in different age groups; % of readers located in different geographical areas; % of readers in terms of gender | Medium | Low
EFFECTIVENESS | Reach | What is the range of readership over time? | No. of persons reading each copy of the newsletter (be aware of multiple readers per copy); no. of newsletters distributed to target group; no. of target group who say that they have read the newsletter | Medium | Medium
EFFECTIVENESS | Circulation/distribution | How many newsletters are being distributed? Is the newsletter easily accessible in electronic and/or print form? | No. of copies circulated; no. of copies printed; no. of subscribers | Medium | High
EFFECTIVENESS | Appearance of the newsletter | Are readers satisfied with the quality of the paper/printing of the newsletter? Are readers able to share the newsletters with others? | % of readers satisfied with the paper and printing quality; no. of readers who have been able to share their copy because the paper has been strong enough to hand around without disintegrating | High | Medium
EFFECTIVENESS | Reader satisfaction | Are the readers satisfied with the information provided in the newsletter? | % of readers satisfied with the newsletter overall; % of readers who think that the newsletter is too long / too short / just right; % of subscriptions cancelled | High | Medium
EFFECTIVENESS | Learning | Did the newsletter help improve the reader’s knowledge and skills? | % of readers who think that the newsletter helped improve knowledge and skills | Low | Low
EFFECTIVENESS | Adoption | Has the reader adopted the techniques promoted? Have they shared their knowledge? | % of readers who say that they have applied knowledge and skills; % of readers who say they’ve shared their knowledge | Low | Low
EFFICIENCY | Timeliness | Is the newsletter produced on a timely basis? | % of readers who say they receive the newsletter on a timely basis | High | Medium
EFFICIENCY | How much does it cost to produce the newsletter? | How many copies are needed and what are the costs and the time involved to produce them? | Unit cost per newsletter; average cost/time involved in producing the newsletter | Medium | High
SUSTAINABILITY | To what extent can you access diverse resources to ensure sustainability of the newsletter? | Can the newsletter continue to be produced after funding from various sources is reduced or ends? Should the newsletter be continued? Is there sufficient capacity in-house to produce the newsletter? What is the level of institutional commitment to support the newsletter? Are there competing newsletters on the market? | Availability of funds / human resources to produce the newsletter | Medium | High

With reference to Table 4.15, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the newsletter, or on the criterion of usability because of its importance to the readers. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the newsletter. Drawing up a matrix like Table 4.15 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your newsletter.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures that:
- all relevant data will become available during the evaluation
- no more data are collected than what is needed and can be analysed
Table 4.16 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use and you might have to find more creative ways to get the information you need, or perhaps decide against using that particular indicator. Many newsletter evaluations use questionnaires as their primary data collection tool. However, other tools can also be used.
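Several of the indicators above (copies circulated, copies printed, subscriptions cancelled) come straight from routine records, which need not be elaborate: even a short script or a spreadsheet tallying figures per quarter will flag trends worth evaluating. An illustrative sketch with invented figures:

```python
# Hypothetical routine records per quarter: copies distributed and
# subscriptions cancelled. All figures are invented for illustration.
records = {
    "2023-Q1": {"distributed": 800, "cancelled": 10},
    "2023-Q2": {"distributed": 850, "cancelled": 25},
    "2023-Q3": {"distributed": 820, "cancelled": 40},
}

# Total copies distributed over the period covered by the records.
total = sum(r["distributed"] for r in records.values())

# A steadily rising cancellation count is an early warning that the
# newsletter may no longer be meeting readers' needs.
trend = [r["cancelled"] for r in records.values()]
rising = all(a < b for a, b in zip(trend, trend[1:]))

print(total, rising)
```

If `rising` comes out true, that is exactly the kind of signal that should trigger the short cancellation questionnaire recommended below.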
Below are some tips specific to a newsletter that you should take into consideration when conducting interviews, compiling routine records and analysing recent issues of the newsletter:
- Desk study: Analyse recent issues of the newsletter, reading each article that has been published to see if it adheres to the stated focus of the newsletter. The frequency of publication should also be reviewed, as should the timely delivery of recent issues. Recent letters to the editor could also be studied, including those letters that were not published. Many of them will give feedback that could be useful to the evaluation. But it is important to remember that letter writers do not necessarily form a typical selection of readers’ opinions, so any findings from the letters to the publication should be verified, or at least treated with caution.
- Interviews and questionnaires: Background data such as the main objectives, core values, approach and working methods can be collected by interviewing publishers and editors. The most critical data will be provided by the target group, who will usually be interviewed or asked to complete a questionnaire.
- Routine and existing records: Regardless of the scope of the evaluation, you need to be systematically collecting routine information that can be used for the general assessment of the newsletter. Routine records are an important source of data as they provide information on how, where and when people use the newsletter. If you haven’t been compiling the data on a regular basis, you should start right away. Examples of the type of data you should be collecting include how many copies of newsletters are being produced and distributed monthly/quarterly/half-yearly/annually, and the number of subscriptions cancelled. Important sources of this type of data include subscription forms (which give information on the readers) and evaluation forms (which give information on the readers’ views on the relevance of the newsletter, etc.). People cancelling their subscriptions comprise an interesting group to include in an evaluation. They will be able to share with you why they cancelled the subscription, which will provide you with interesting data. You can obtain these data by sending them a short questionnaire immediately on cancellation. Keep the questionnaire short. In general, these people have lost interest and loyalty, and will not want to invest a lot of time in providing you with information.

Table 4.16 Example of data collection methods for evaluating a newsletter
DATA COLLECTION METHOD | INFORMATION SOURCE | KEY ISSUES / QUESTIONS | ANSWERS EXPECTED AND CONCERNS
Desk study | Letters to the editor; evaluation forms; subscription forms; recent issues of the newsletter | How satisfied are you with the newsletter? Is the newsletter making a difference to the users (primary stakeholders)? Is the newsletter addressing the areas it set out to? | Level of satisfaction; level of subscriptions and cancellations; newsletter not making a difference in the field; articles produced are not within the objectives of the newsletter
Desk study | Accounts department | What is the cost of producing the newsletter? | Unit cost of the newsletter is too high
Interviews | Questionnaires, focus groups (to follow up and get a better picture of target group experiences) | Can you get the newsletter easily? Is it timely? Is the newsletter useful? Is it easy to read? Are there other topics you want to see in the newsletter? Have you changed your attitude as a result of the newsletter? | Accessibility, timeliness and usefulness of the newsletter; suggestions
Meetings | Partners/stakeholders | How can collaboration be increased/strengthened? | Research institutes not willing to collaborate by supporting / contributing technical content

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection.
This will help to ensure that the right types of data are being collected. You will often find that the data you’ve collected from your routine records, feedback and interviews are quite rich, and the challenge will be to interpret the information accurately. Table 4.17 shows you some of the ways you could analyse and interpret your data. In designing the data analysis for a newsletter evaluation, you should ask the following questions:
- Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of readers, a graph showing the development of readership over time, or specific analytical tools such as a problem tree or a SWOT analysis)
- Which collection methods will provide the necessary data for each of the tools?
- What type of observations do you expect to make for each tool?
- What type of conclusions do you expect to reach using these tools?
Once you have designed the data analysis, it is important to see if your data collection includes all elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data. Explain why you were unable to gather certain data (e.g., in some communities the interviewers were not allowed to interview the women).

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process to ensure that conclusions and recommendations from the evaluation are well understood and supported. In designing this strategy, you need to consider these issues:
- Which communication methods will suit which stakeholders, especially the primary stakeholders?
- What are the main issues to be discussed and reported, taking into consideration the communication method and target group?
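A ‘table featuring the type of questions versus the type of readers’ is simply a cross-tabulation. A minimal sketch using only Python’s standard library; the survey rows are invented for illustration:

```python
from collections import Counter

# Hypothetical survey rows: (reader type, satisfied with newsletter?).
rows = [
    ("farmer", "yes"), ("farmer", "yes"), ("farmer", "no"),
    ("extension officer", "yes"), ("extension officer", "no"),
    ("farmer", "yes"),
]

# Cross-tabulate satisfaction by reader type.
crosstab = Counter(rows)                     # counts each (type, answer) pair
by_type = Counter(t for t, _ in rows)        # respondents per reader type

satisfaction = {}
for reader_type in by_type:
    yes = crosstab[(reader_type, "yes")]
    satisfaction[reader_type] = round(100 * yes / by_type[reader_type])

print(satisfaction)
```

The same pattern scales to any question-versus-reader-type table; for larger datasets a spreadsheet pivot table or a dataframe library would do the equivalent job.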
Table 4.17 Example of a data analysis design for a newsletter evaluation (analytical tool; collection methods used; observations to make; conclusions to reach)

– Table 1: Who are the readers? Methods: desk study, routine records. Observations: categories of readers over-/under-represented. Conclusions: categories of readers to give more attention to when collecting data.
– Table 2: Relevance (target group). Methods: desk study, questionnaires, interviews. Observations: how useful the newsletter is, given the needs of the target group. Conclusions: whether the newsletter is relevant or not.
– Table 3: Usability, readability. Methods: desk study, questionnaires, interviews. Observations: how appropriate the language used is. Conclusions: whether the language used should change.
– Table 4: Effectiveness (target group): reader satisfaction, awareness, reach, appearance, learning. Methods: desk study, questionnaires, interviews, letters to the editor. Observations: high and low satisfaction with the newsletter as a whole and with its content; level of awareness of the newsletter; change in readership over time; appropriateness of the format; does it promote learning? Conclusions: priorities for improvement; obstacles to improvement.
– Table 5: Efficiency. Methods: data from accounts. Observations: cost to produce the newsletter; time involved in its production. Conclusions: potential areas for reducing costs/time.
– Graph: Process flow chart. Methods: procedures manual, interviews, farmers' time. Observations: bottlenecks in the process. Conclusions: steps to improve the process.
– Table 6: Impact stories. Methods: in-depth interviews, case studies, feedback from correspondence. Observations: personal stories of how lives have been affected by the newsletter; statistics showing change in agricultural production, etc. Conclusions: whether there has been any change in the quality of life in farming communities.

An effective communication plan, for critical reflection during an evaluation and for reporting findings during and after the evaluation, helps to create common understanding among stakeholders, providing a good basis for implementing recommendations.
If your stakeholders don't accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask: z Do the stakeholders have the same views on the problems with the newsletter? z Do they have the same views on the solutions? z Are they prepared to support the same solutions? z What obstacles prevent them from implementing the solutions proposed? z What can be done to address the obstacles they face? There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, sending a memorandum). You might also want to consider the type of information you make available to the various target groups. The communication plan you use will depend on your circumstances and the facilities available to you. Consider sharing the results of your evaluation with your readers in the newsletter itself, so that any changes in how the newsletter is managed or presented will not come as a surprise to the reader. Whatever you decide, your findings and recommendations should be presented in a way that is easily understood and accepted by all, so that the desired changes can be implemented (see Table 4.18).
Table 4.18 Example of a communication plan for reporting the findings from a newsletter evaluation (communication method; stakeholders to report to; main issues to discuss / report on)

– SWOT analysis. Stakeholders: representatives of newsletter and key farmers. Issues: critically reflecting on the results from the data collection.
– Brainstorming. Stakeholders: staff and management of the newsletter and selected stakeholders. Issues: creative solutions to problems identified.
– Full report, including executive summary. Stakeholders: funding agency, partners, staff and management. Issues: readers' development; readers' satisfaction; topics; costs and income; conclusions and recommendations.
– Personal meeting. Stakeholders: management. Issues: conclusions and recommendations; feedback on individual staff members.
– Article in newsletter. Stakeholders: readers. Issues: major findings.

If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement the solution fully because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Box 4.5 Evaluating a newsletter: guidelines checklist

These guidelines on evaluating a newsletter are covered above: z Newsletter concept and objectives z Data needs and stakeholder participation z Evaluation focus, questions and indicators z Data collection z Data analysis z The communication plan For more on: z data analysis, see Part 2, page 60 and Part 3, pages 157-162 z data collection, see Part 2, pages 58-59 and Part 3, pages 106-115 z evaluation communication and follow-up, see Part 3, pages 163-173 z evaluation criteria (scope), see Part 2, pages 33-34 z indicators, see Part 3, pages 91-98 z logframe, see Part 3, pages 68-83 z logic model, see Part 3, pages 103-105 z stakeholder participation, see Part 1, pages 3-5 z terms of reference, see Part 2, pages 32-35

WEBSITE

These guidelines relate to evaluating a website, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of 'critical reflection' is discussed towards the end of the guidelines, but we advise you and your evaluation team to 'reflect' on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6). The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2.
Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here. Before evaluating a website, you need to be clear about: z what the main elements of the website are z its concept(s) and objectives z who the primary and secondary stakeholders are z how to go about identifying the needs of the primary and secondary stakeholders z what the focus of the evaluation is, the questions to ask and the indicators to use z how to collect the data and analyse them z how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is a website?

A website is a collection of online files (pages) on a particular subject that includes an opening file, the 'home page'. All websites have an address, known as a 'uniform resource locator' (URL), which is their home page address. A website can be viewed throughout the world on computers with very different specifications, so long as they have a working browser (e.g., Internet Explorer, Firefox or Safari). It can have multiple goals and target groups (e.g., an information site providing details of a service, organisation or company to the general public, a retail site selling goods to particular groups, or a site that encourages the exchange of knowledge and experiences among a particular group of professionals). The goal(s) and audience(s) will determine the website content and design.
Determining the website concept and objectives

You can't evaluate your website unless you are clear what it is about. The concept (idea) behind the website and the objectives it seeks to achieve need to be clearly stated, otherwise you can't compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process. In determining a website concept, you should ask these questions: z Why was the website developed? z What were the main objectives (expected results) and problems to be addressed at the time? z What is the goal of the website? z What is the purpose of the website? z What are its core values? z What target group(s) are you aiming for? z How do you want to develop the website in terms of content, organisation, presentation, accessibility and technical competence, bearing in mind its target group(s)? In evaluating the website, you need to ask yourself key questions related to the concept, such as: z Do the expected results achieved reflect its core values and approach? z Do they reflect the project purpose and contribute to the goal? z Does the website address the main objectives and problems it was supposed to? z Are the correct messages being conveyed?

Box 4.6 Example of a concept behind a soils data website

Main problem: A research institute specialising in the collection of primary world soils data wants its clients to have easy access to its database and its publications.
Target group: Researchers, scientists, planners and students.
Goal: To contribute to improved agricultural and environmental planning through the provision of relevant and appropriate information on soils of the world.
Purpose: A good source of scientific information and knowledge on world soils, allowing visitors to access and share information developed.
Expected results: Increased awareness and understanding of key issues in soils.
Main approach: To provide an interactive website where users can access information and ask soils-related questions in an easy and user-friendly way.
Core values: Relevance, accessibility, usability and effectiveness.

The logical framework can help you clarify the website objectives (see pages 68-83). Commonly known as the 'logframe', it helps to summarise a project in a logical sequence. If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a website is given in Table 4.19.

Table 4.19 Logical framework for a soils data website (intervention logic; objectively verifiable indicators; means of verification; assumptions)

– GOAL: To contribute to improved agricultural and environmental planning through the provision of information on soils of the world.
– WEBSITE PURPOSE: A good source of scientific information and knowledge on world soils, which allows visitors to access and share information developed. Indicator: after 2 years the number of visitors increases fourfold. Verification: routine and existing records, including web statistics. Assumption: contributes to improved agriculture and environmental planning.
– EXPECTED RESULTS: Increased awareness and understanding of key issues in soils, and use of data. Indicator: feedback from visitors on key soil issues. Verification: interviews; letters/emails to the organisation; web statistics; contributions to the website. Assumptions: is relevant; increased awareness leads to increased visits to the website.
– ACTIVITIES (the indicators box sometimes gives a summary of resources/means, the verification box a summary of costs/budget, and the assumptions box states what must hold true, if the activities are completed, to deliver the expected results): 1.1 Meet staff and selected stakeholders to discuss website objectives, design and content, brainstorm to formulate strategy to
obtain feedback from target group(s); 1.2 Identify priority areas for target group(s); 1.3 Put information on the website; 1.4 Develop and carry out strategy for promoting and sharing soil information with stakeholders.
– INPUTS: Project funding from donor agencies (list resources available).

If the website is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of the activities associated with the website, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a website is given in Table 4.20. In using this framework, the key questions to ask to see if the website is meeting the stated objectives become evident. For example: z Are users satisfied with the website? z Does the website content reflect its objectives? z Is the website easy to navigate? z Have visits to the website increased?

Table 4.20 Logic model for a soils data website (focus; resources; activities; outputs / indicators of results; outcomes / indicators of effects; impact indicators)

Planning and development
– Resources: in-house staff, facilities. Activities: meeting to discuss the idea behind the website and how to go about its development. Outputs: list of ideas; list of key stakeholders. Outcomes: concept for website developed; who to involve.
– Involving key stakeholders. Resources: time needed for staff from key stakeholder institutions and national soils institutes to meet/discuss; stationery, transportation, postage, fax, telephone, computer, printer. Activities: brainstorm meetings to design site and questionnaires to obtain feedback and identify other stakeholders. Outputs: structure and design of website developed; questionnaires sent to stakeholders. Outcomes: feedback on website.
– Process development. Resources: time and willingness of staff and stakeholders to meet. Activities: develop plan to place information on the website and update it regularly. Outputs: plan to identify information for the website. Outcomes: operational plan for the website.
– Funds. Resources: funds from donor governments. Activities: budget prepared for the development of the website. Outputs: budget; financial report. Outcomes: agreement on funds available for the website.

Implementation/operations
– Designing website. Resources: staff to modify design of website. Activities: modification of website based on feedback. Outputs: website modified. Outcomes: new 'look' for website.
– Information. Resources: capable staff to research and contribute content for the website; key contributors from partner institutions or independent actors. Activities: materials identified for the website in line with feedback from questionnaires. Outputs: no. of items submitted for possible inclusion on the website. Outcomes: key items selected for the website. Impact: information addressing the needs of the target group.
– Launching the website. Resources: staff to put information on the website. Activities: set up the website and input data. Outputs: website online. Outcomes: knowledge and information made accessible. Impact: % target users changing their practices consequent on the website.
– Promotional activities. Resources: time of staff and key stakeholders. Activities: meetings to develop promotion and distribution strategy. Outputs: strategy developed. Outcomes: percentage change in the number of website visitors. Impact: increased visits to the website.

Monitoring and evaluation
– Resources: time staff need to implement clear procedures to record routine data; time staff need to input routine data. Activities: determine indicators; identify appropriate data-collection methods; design reporting formats to collect and analyse data. Outputs: development of a database to support monitoring and evaluation. Outcomes: improved website (in terms of its effectiveness and efficiency) to meet the needs of the users.

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the
team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders. The range of stakeholders in a website could include: z target users (be aware that the visitors to your website are not necessarily those visitors you have defined as the website target group) z webmaster z test group z content manager z partners The users of the website – the primary stakeholders – will be individuals looking for information on your organisation and/or information produced by your organisation. Who they are should be identified by whoever is responsible for developing the website content and design. One way of identifying these stakeholders is to conduct a stakeholder analysis. Table 4.21 provides an example of a stakeholder analysis. When considering the stakeholders' interests in the website, be aware of the kind of questions that they would like to have answered. Table 4.22 gives an overview of the kind of key questions that might be asked.

Table 4.21 Stakeholder analysis for a soils data website (stakeholder category; benefits from the project; contributions/sacrifices; influence on the project; potential involvement in the evaluation)

– Website users / test group. Benefits: improved knowledge on soil issues and access to soil data. Contributions: time to navigate the website; cost to access the service; time to contribute. Influence: format (navigation) of website; content of website. Involvement: represented on the team; involved in analysis and decisions on actions for change; an information source; told about main findings.
– Webmaster, content manager. Benefits: better information for and feedback from target groups; networking. Contributions: time and technical input. Influence: concept, structure, design and contents of website. Involvement: represented on the team; involved in the planning, evaluation and findings.
– Management of soils research institute. Benefits: better information for and from target group; improved image. Contributions: time and costs. Influence: website concept. Involvement: represented on the team; involved in the planning, evaluation and findings.
– Partners. Benefits: improved information; networking. Contributions: time and effort to support the process. Influence: quality of information made available. Involvement: represented on the team; involved in the planning, evaluation and findings.
– Funding agencies. Benefits: results to show to their clientele. Contributions: finance and time. Influence: project concept and plan; monitoring and evaluation. Involvement: informed about the evaluation and its findings.

Table 4.22 Key questions stakeholders could be interested in (stakeholder category; questions they are interested in; potential use of the answers)

– Website users / test group. Questions: Is the information provided relevant, timely and applicable? Does it meet their needs? Is the website easy to navigate? Use: Should the website be modified? Should it be recommended to others?
– Webmaster, content manager. Questions: Does it serve its target group well? Does it have enough statistical information to allow for visitor analysis? What other information might be useful on the website? How relevant and effective is the website? What are its strengths, weaknesses, opportunities and threats? Is it informative, popular and well presented? Use: How should we continue with the website?
– Management of soils research institute. Questions: Is the target group better informed? Is there improved feedback from the target group? Does the website contribute to our image? Use: Should we continue with the website? And if so, how? What resources are needed?
– Partners. Questions: What information should be on the website? How relevant and effective is the website? Is it informative, popular and well presented? Is there an incentive for partners to promote it? Use: Should we continue to support the website?
– Funding agencies. Questions: Does the website have an impact on agricultural planning? How sustainable is it if we stop funding it? Use: Should we continue, reduce or increase our funding? In what areas can/should we assist?
If you know your stakeholders really well, you can start the process by working with your colleagues to formulate questions based on your knowledge of them. But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders.

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of a website extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders' key questions and the time and cost of accessing data. Among the main reasons for wanting to conduct an evaluation might be to improve your website and serve your users better. Evaluating a website is not an easy task. In general, your users are unknown and could be scattered around the world. On the other hand, you can easily obtain a lot of data at very little cost, and you only need to analyse these data to be able to transform them into valuable information. Asking specific questions will also help you to improve key areas of your website (e.g., you might want to know how relevant your website is versus its accessibility and usability).

Table 4.23 Determining the focus, key questions and indicators for the evaluation of a website, using some evaluation criteria as examples (evaluation criteria; focus/scope; key questions; indicators; stakeholder interest; data accessibility)

– IMPACT. Key question: How has the information helped to change soil management practices? Indicators: improvements achieved in agriculture. Stakeholder interest: high. Data accessibility: low.
– RELEVANCE (information needs of visitors). Key question: Do visitors find the website useful? Indicators: % of visitors who find the website useful; % of visitors who want more information on the website; most visited parts of the website; least visited parts of the website. Stakeholder interest: high. Data accessibility: medium.
– ACCESSIBILITY (technology). Key questions: How well constructed is the website? How long does it take to download the website? Indicators: browser compatibility; average loading time in partner countries; technology used to develop the website (Flash, Java, JavaScript, etc.; also look at HTML/XHTML compatibilities, CSS, RSS feeds, printer-friendly formats, etc.). Stakeholder interest: low. Data accessibility: medium.
– USABILITY (learnability). Key question: How easy is it for visitors to accomplish basic tasks the first time they encounter the design? Indicators: % of visitors who find it easy to do the basic tasks the first time they go to the website (basic tasks include, for example, finding the latest issue of the Soils newsletter, or a soil map of Africa). Stakeholder interest: high. Data accessibility: medium.
– USABILITY (efficiency). Key question: Once visitors to the website have learned the design, how quickly can they perform the tasks? Indicators: perceived time taken to perform the tasks (does it take too long, is it adequate?). Stakeholder interest: medium. Data accessibility: medium.
– USABILITY (memorability). Key question: When visitors return to the website after a period of not using it, how easily can they re-establish proficiency? Indicators: was it easy or difficult to use the website again? Stakeholder interest: medium. Data accessibility: medium.
– USABILITY (satisfaction). Key question: How pleasant is it to use the design? Indicators: % of visitors indicating their satisfaction with the design of the website. Stakeholder interest: high. Data accessibility: medium.
– USABILITY (readability). Key question: Can visitors understand the language on the website (i.e., is it pitched at the right level, or too difficult, or too easy)? Indicators: % of visitors who say that they understand the language used on the website; % of visitors who report that they are satisfied with the language used. Stakeholder interest: high. Data accessibility: medium.
– USABILITY (overall user satisfaction). Key questions: Are visitors satisfied with the website? Which areas of the website are they satisfied with (itemise them)? Which areas of the website are they not satisfied with? How would you improve the website? Indicators: % of users satisfied with the website overall; % of users satisfied with specific sections (bits) of the website; areas suggested for improvement. Stakeholder interest: high. Data accessibility: medium.
– USABILITY (navigability). Key questions: How easy is it for visitors to find what they want to know? Why is navigation structured the way it is? Indicators: % of visitors who say it was easy to find the information; average loading time in partner countries; browser compatibility; drop-down menus used; search engines used; number of links within the website that are not working ('link rot'). Stakeholder interest: high. Data accessibility: low.
– EFFECTIVENESS (usage). Key questions: How many visitors go to the website? How many resources are accessed? Indicators: no. of visits to the website; no. of resources accessed. Stakeholder interest: high. Data accessibility: medium.
– EFFECTIVENESS (content and accuracy). Key questions: Is the website regularly updated? Is the information provided accurate? Indicators: frequency of updating of content, depending on the type of information provided (e.g., news sites will need to be updated more often, but an old report of a meeting can't be updated); feedback from the users. Stakeholder interest: high. Data accessibility: medium.
– EFFECTIVENESS (reach). Key question: Does the website reach a wide range of people? Indicators: no. of visitors and return visitors; average time spent on the website; percentage of users who say that they have passed the website URL on to others. Stakeholder interest: medium. Data accessibility: low.
– EFFECTIVENESS (networking). Key questions: Why do people go to the website? How easy is it to find the website? Indicators: ranking of the website pages in search engines (e.g., Yahoo, AltaVista and Google); no. of referrals to your website; users of interactive parts of your website (see Online community). Stakeholder interest: medium. Data accessibility: low.

The type of technology used is also important to consider in your evaluation. For example: z Does it use Flash, Shockwave, Java, JavaScript or ASP? z Is there a good content management system? z Does it use HTML/XHTML compatibilities, and/or offer interesting newsfeeds and printer-friendly formats?
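One of the indicators above, the number of links that are not working ('link rot'), can be measured automatically rather than by eye. A minimal sketch using only the Python standard library; the URLs listed are hypothetical placeholders, not pages of any real soils website:

```python
# Count broken links ('link rot') by requesting each URL and recording
# those that fail or return an HTTP error status.
# The URLs below are illustrative placeholders.
from urllib.request import urlopen

def broken_links(urls, timeout=10):
    broken = []
    for url in urls:
        try:
            with urlopen(url, timeout=timeout) as response:
                if response.status >= 400:
                    broken.append(url)
        except (OSError, ValueError):  # network errors, HTTP errors, malformed URLs
            broken.append(url)
    return broken

links = ["https://example.org/", "https://example.org/missing-page"]
print(broken_links(links))
```

In practice you would feed this the full list of internal links crawled from your site; dedicated web-based link checkers do the same job at scale.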
Table 4.23 provides an example of focus, key questions and indicators for a website evaluation. You can use the information here to develop a more specific set of questions, relevant to the focus of the website. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. Whatever your focus is, you should link it to primary stakeholder (user) considerations, your ease of access to data and your resources in terms of time and costs. Table 4.23 shows that the evaluation of the website could focus on factors such as relevance, usability and impact, primarily because of their importance to the stakeholders and those responsible for the website. The scope of the evaluation can always be broadened to areas such as the sustainability of the website if, for example, a strategic decision on its continuity has to be made. It is evident that access to good, relevant data to support your evaluation is important. By extension it is also important to maintain a good monitoring system on information relating to your website. Some general data about your website that you might want to collect routinely include the number of users and their user profiles (i.e., data about the users, such as country, organisation, gender, area of employment, language, age (category), specialisation, equipment used, reasons for visiting the site, and expectations). These data allow you to draw meaningful conclusions from the evaluation results in relation to your user groups and are useful when looking at various evaluation criteria. Not all data can be routinely collected, however. It might be necessary to conduct extensive surveys among your users. Some key questions you might want to ask are: z Do you know about the website? z Do you find it relevant? z Are you satisfied with it? z Has it helped you in any way?
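Answers to survey questions like these can be tallied into the percentage indicators used in Table 4.23. A minimal sketch; the question keys and sample responses below are hypothetical:

```python
# Tally yes/no survey answers into percentage indicators.
# Question keys and responses are hypothetical examples.
def percentage(responses, question, answer="yes"):
    """Share of respondents (in %) giving `answer` to `question`."""
    answered = [r[question] for r in responses if question in r]
    if not answered:
        return 0.0
    return round(100 * answered.count(answer) / len(answered), 1)

responses = [
    {"knows_site": "yes", "relevant": "yes", "satisfied": "yes"},
    {"knows_site": "yes", "relevant": "no",  "satisfied": "no"},
    {"knows_site": "yes", "relevant": "yes", "satisfied": "yes"},
    {"knows_site": "no"},  # respondent who skipped the follow-up questions
]

print(percentage(responses, "relevant"))   # → 66.7 (% of respondents finding it relevant)
print(percentage(responses, "knows_site")) # → 75.0
```

Skipped questions are excluded from the denominator, which matters when, as here, people who don't know the website cannot answer the follow-up questions.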
Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures: z that all relevant data will become available during the evaluation z that no more data are collected than what is needed and can be analysed Table 4.24 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use, and you might have to find more creative ways to get the information you need or decide against using that particular indicator.

Table 4.24 Example of data collection methods for evaluating a website (data collection method; information source; key issues/questions; answers and concerns expected)

– Desk study. Source: feedback from emails and web statistics. Questions: Are visitors satisfied with the website? Is the website addressing the areas it set out to address? Who are the visitors? Which sections of the website are they most interested in? Expected: level of satisfaction; website not being visited by your target groups.
– Desk study. Source: automated tests. Question: Are the website links up-to-date?
– Desk study. Source: accounts department. Question: How much time and resources are needed for developing and maintaining the website? Expected: time and money to develop and maintain a website can vary widely.
– Interviews / online questionnaire. Source: visitors to the website and test group. Questions: How did you find the website? How easy was it to find the information you wanted? Do you find the website useful? Have you learned anything on your visits to the website? Expected: online questionnaires are immediate and less expensive, but the drawback is that you're getting feedback from visitors who already know your website (and even these may not be a representative sample of visitors).
– Direct observation. Source: identify 5 users to actually test the website by assigning set tasks to do. Questions: How easy is it to find the information asked for? Were there problems in downloading documents? Is the website pleasant to look at? Suggestions? Expected: difficulty in getting the 'right' testers for your website.
– Meetings. Source: production personnel, NGOs, government agencies. Questions: What do you think about the message, format and presentation of the website? How can collaboration be improved? Expected: difficulty in getting partners to work with your organisation.

Do remember to: z use individual page views as a measurement, rather than website hits z check which pages are visited the most z use a reliable web statistics programme z look at the number of other websites linking to yours z seek the opinions of other people on what changes are needed, especially if you were involved in developing the website You might also want to consider: z whether your web pages are recognised by search engines such as Yahoo, AltaVista and Google z whether your website uses graphics that allow for quick loading z taking advantage of new technology where appropriate (even if that means leaving some users behind) Some other data collection tools specific to websites can also be used. These include: z Log-file data and web statistics: Websites have a great advantage over most other information products in that they can provide you with statistics on your website's performance. If you have not worked with web statistics before, you will need some help in their interpretation. Most internet service providers (ISPs) can provide you with log-files and web statistics, which will feed your evaluation. Before analysing your log-files, you need a 'good' sample to work with. Log-files register all requests made to the web server, so you need to clean them up by removing some 'statistical noise'. This is done by excluding robots and crawlers from the analysis, because they don't involve any human interaction, and by identifying and excluding staff from the analysis.
Google Analytics offers a service that filters data in this way, so you might want to try using it. z Automated tests: For some mechanical aspects of website performance, web-based analysis tools can be used. Automated website tests are useful for evaluating purely mechanical features (e.g., whether or not a link works) that would be tedious or impossible to measure by eye. z Testing and browsing the website: Many evaluation questions can be answered simply by browsing the website. Testing with actual users, who are undertaking real-world tasks, is one of the best methods for evaluating websites. For example, to find out how long a person takes to find specific information, you could ask a user to find the document on the website that you know has that information; you can see how the user starts searching for the document, whether he or she finds it, and how long it takes. z Surveys, online and face-to-face: A survey is the most feasible way to put questions to users when the target group is scattered all over the world, and it is useful for finding out people's general opinions. An online questionnaire is a cost-effective way to obtain information from people who might be difficult to reach via other means. The disadvantage is that you reach only those people who already know your website. It is a good idea for a user survey to be a permanent feature of your website, but keep it posted for at least 3 months before attempting to analyse the results. Surveys receive the largest response if they are prominent and/or offer an incentive. Examples of incentives include allowing the user access to a protected area of the website, a subscription to a newsletter, an RSS feed, etc. You can also increase the response by creating a pop-up hyperlink to the survey on your website, and/or notifying all registered members of your website by e-mail. It is worth using SurveyMonkey to publish your questionnaire online; for more on this, go to www.surveymonkey.com.
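The log-file cleaning step described above, excluding robots, crawlers and staff, can be sketched in a few lines. This is an illustrative filter only: the user-agent substrings and the staff IP address are hypothetical assumptions, and real log lines need a proper parser first:

```python
# Filter bot/crawler requests and staff visits out of web server log data
# before computing visitor statistics. Bot signatures and the staff IP
# address below are illustrative assumptions, not a definitive list.
BOT_SIGNATURES = ("bot", "crawler", "spider", "slurp")
STAFF_IPS = {"192.0.2.10"}  # example staff machine (hypothetical)

def is_human_request(ip, user_agent):
    """True if the request looks like a genuine (non-staff) visitor."""
    if ip in STAFF_IPS:
        return False
    agent = user_agent.lower()
    return not any(sig in agent for sig in BOT_SIGNATURES)

requests = [
    ("203.0.113.5", "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Firefox/115.0"),
    ("198.51.100.7", "Googlebot/2.1 (+http://www.google.com/bot.html)"),
    ("192.0.2.10", "Mozilla/5.0"),  # staff machine
]

cleaned = [r for r in requests if is_human_request(*r)]
print(len(cleaned))  # → 1: only the genuine visitor request remains
```

Well-behaved crawlers identify themselves in the user-agent string, which is why substring matching catches most of them; badly behaved ones need IP-based blocklists, which is part of what services such as Google Analytics handle for you.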
Table 4.25 Example of web statistics obtained for a website

LOCATION        | PAGE VIEWS | VISITORS | TOTAL TIME SPENT BY VISITORS (dd:hh:mm:ss) | AVERAGE TIME SPENT PER VISITOR (hh:mm:ss) | PAGE VIEWS PER VISITOR
Jamaica         | 2,500      | 150      | 01:50:00                                   | 00:00:43                                  | 16.67
Ghana           | 10,580     | 600      | 05:40:00                                   | 00:00:34                                  | 17.63
The Netherlands | 70,567     | 7,000    | 1d 2:50:00                                 | 00:00:14                                  | 10.08

The last column is the most telling. It gives an indication of the amount of interest the website generates for the different geographic groups of users. It is important to monitor this, because if the number of page views increases over time, it could mean that these countries are an important target group for the website.
If you have access to information centres, you can use face-to-face interviews with users. This will take more time, but you will get more questionnaires completed. It would also be a good opportunity to observe some users carrying out specific tasks on the website.
Analysing the data
It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.26 will help you to start thinking in this way. For example, if you want to know what staff think about the website, you should already be thinking that you might have to interview staff individually or in focus groups. In designing the data analysis for a website evaluation, you should ask the following questions:
z Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of users, a graph showing visitor use over time, or specific analytical tools such as a problem tree or a SWOT analysis)
z Which collection methods will provide the necessary data for each of the tools?
Table 4.26 Example of a data analysis design for a website evaluation

Table 1: No. and type of visitors
Results from collection method: Desk study; web statistics
Type of observations to make: Categories of visitors over/under-represented
Type of conclusions to reach: Categories of visitors to give more attention to

Table 2: Relevance
Results from collection method: Desk study; questionnaires; interviews
Type of observations to make: How useful the website is, given the needs of the target group
Type of conclusions to reach: Whether the website needs to be overhauled

Table 3: User access
Results from collection method: Interviews
Type of observations to make: Level of access of the target group to the internet; access barriers
Type of conclusions to reach: Can determine whether the website should be changed

Table 4: Usability; satisfaction of the users
Results from collection method: Questionnaires; interviews; direct observation of how users use the website
Type of observations to make: Is the website attractive? Can visitors easily navigate the website and find the information they want? Can they understand the language used? High and low satisfaction with the website; quality (accuracy) of information provided
Type of conclusions to reach: Whether the website needs to be modified, and areas for improvement; whether the users are happy with the website

Table 5: Efficiency
Results from collection method: Data from accounts
Type of observations to make: Cost to produce and maintain the website
Type of conclusions to reach: Potential areas for reducing costs/time

Graph: Process flow chart
Results from collection method: Website procedures manual; interviews
Type of observations to make: Bottlenecks in the process
Type of conclusions to reach: Process steps to improve

Table 6: Impact stories
Results from collection method: In-depth interviews; case studies; feedback from e-mails; online questionnaires
Type of observations to make: Information being provided by the website is making a difference to people’s lives
Type of conclusions to reach: Determine whether there has been some change in the way communities manage their resources

z What type of observations do you expect to make for each tool?
z What type of conclusions do you expect to get from the tools?
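Behind tables like these sit simple desk-study calculations. For instance, the ‘page views per visitor’ column of Table 4.25 is just page views divided by visitors, which can be computed as follows (the figures are those shown in the table):

```python
# Page views per visitor, reproducing the last column of Table 4.25.
web_stats = {
    "Jamaica":         (2500, 150),    # (page views, visitors)
    "Ghana":           (10580, 600),
    "The Netherlands": (70567, 7000),
}

for location, (page_views, visitors) in web_stats.items():
    # Round to two decimals, as in the table.
    print(location, round(page_views / visitors, 2))
# Jamaica 16.67, Ghana 17.63, The Netherlands 10.08
```

Keeping the calculation in a small script makes it easy to re-run as new monthly log-file samples come in.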
Once you have designed the data analysis, it is important to check that your data collection covers all the elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data. Explain why you were unable to gather certain data (e.g., in some communities internet access was very erratic).
Preparing a communication plan
Designing the communication plan is an important activity in the evaluation process, ensuring that conclusions and recommendations from the evaluation are well understood and supported. In designing this plan, you need to consider these issues:
z Which communication methods will suit which stakeholders, especially the primary stakeholders?
z What are the main issues to be discussed and reported, taking into consideration the communication method and target group?
An effective communication plan, covering critical reflection during an evaluation and the reporting of findings during and after it, helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don’t accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
z Do the stakeholders share the same views on the problems with the website?
z Do they have the same views on the solution(s)?
z Are they prepared to support the same solution(s)?
z What are the obstacles that will prevent them from implementing the solution(s) proposed?
z What can be done to address the obstacles they face?
An important component of a communication plan is to find out what difficulties stakeholders have in implementing changes.
In the case of websites, the stakeholders to focus on here would be the designers, donors and content providers, rather than the primary stakeholders (the users). There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, internal communication via e-mail, a memorandum). You might also consider sharing the results on the website, so that if there are changes in how the website is presented, they will not come as a surprise to the user. Table 4.27 provides an example of a communication plan for a website evaluation. If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement it because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Table 4.27 Example of a communication plan for reporting the findings from a website evaluation

COMMUNICATION METHOD TO REPORT ON        | STAKEHOLDERS TO REPORT TO         | MAIN ISSUES TO DISCUSS / REPORT
Critical reflection                      | Staff, key partners               | Critically reflecting on the results from the data collection and any other issues which may arise
Summary report                           | Partners                          | Results from data collection
Full report, including executive summary | Funding agency; staff and management | Main messages of the evaluation: relevance, satisfaction, awareness, access, reach
Personal meeting                         | Staff and management              | Conclusions and recommendations; feedback on individual staff members
Short report of the evaluation on the website | Users                        | Main findings in broad terms and how they will affect staff and management

Box 4.7 Evaluating a website: guidelines checklist
These guidelines on evaluating a website are covered above:
z Website concept and objectives
z Data needs and stakeholder participation
z Evaluation focus, questions and indicators
z Data collection
z Data analysis
z The communication plan
For more on:
z data analysis, see Part 2, page 60 and Part 3, pages 157-162
z data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
z evaluation communication and follow-up, see Part 3, pages 163-173
z evaluation criteria (scope), see Part 2, pages 33-34
z indicators, see Part 3, pages 91-98
z logframe, see Part 3, pages 68-83
z logic model, see Part 3, pages 103-105
z stakeholder participation, see Part 1, pages 3-5
z terms of reference, see Part 2, pages 32-35

QUESTION-AND-ANSWER SERVICE
These guidelines relate to evaluating a question-and-answer service (QAS), using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate.
It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of ‘critical reflection’ is discussed towards the end of the guidelines, but we advise you and your evaluation team to ‘reflect’ on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6). The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here. Before evaluating a question-and-answer service (QAS), you need to be clear about:
z what the QAS consists of
z its concept(s) and objectives
z who the primary and secondary stakeholders are
z how to go about identifying the needs of the primary and secondary stakeholders
z what the focus of the evaluation is, the questions to ask and the indicators to use
z how to collect the data and analyse them
z how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon
What is a QAS?
When an information centre or library systematically responds to questions from users, it is offering a question-and-answer service (QAS). A QAS provides tailor-made answers, often complemented with photocopies of documents or information on useful contacts.
It can be a stand-alone service or part of a package of information services offered by an organisation (e.g., a newsletter, a library and a radio programme). A QAS could be offered for a particular reason or combination of reasons, such as:
z it could be part of an institution’s knowledge transfer strategy
z it could be a service complementing another service
z it could be part of a promotional strategy
z it could be an income-generating activity
A QAS can focus on any subject field. An agriculture-oriented QAS might have been set up to:
z offer practical answers to everyday problems in order to support the improvement of agricultural practices and ultimately help to eradicate hunger and poverty
z help people to become self-sufficient in agricultural production
z promote the use of existing information and research findings
z complement and strengthen extension, training and research efforts
z gain insight into the information needs of users, in order to select the focus of extension, training and research initiatives
z document farmer experiences
z strengthen the link between farmers, institutions and government
Determining the QAS concept and objectives
You can’t evaluate your QAS unless you are clear what it is about. The concept (idea) behind the service and the objectives it seeks to achieve need to be clearly stated, otherwise you can’t compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process. In determining a QAS concept, you should ask these questions:
z What is the background to the QAS?
z What were the main objectives (expected results) and problems to be addressed at the time?
z What is the goal of the QAS?
z What is the purpose of the QAS?
z What are its core values?
z What is the main approach used to obtain questions and deliver answers?
In conducting the evaluation, you will need to assess whether the QAS concept is valid by asking, for example:
z Do the expected results achieved reflect the QAS core values and approach?
z Do they reflect the project purpose and contribute to the goal of the QAS?
z Does the QAS address the main objectives and problems it was supposed to?
Box 4.8 Example of the concept behind a Ministry of Agriculture QAS
Main problem: The QAS in the Ministry of Agriculture was set up to address the lack of available agricultural information for people involved in farming, agro-processing and produce marketing. It appeared that existing information was not adequately used.
Goal: To enhance the agricultural development process and increase agricultural production, processing and marketing.
Purpose: Relevant and appropriate information provided on a timely basis to assist farmers, extension workers, researchers, students and development agencies.
Expected results: Awareness created among the primary stakeholders, and an adequate resource base and network with partner organisations developed.
Main approach: To invite primary stakeholders to use the service and submit their questions, and to provide answers through appropriate channels of communication, directly from the main QAS or, where appropriate, using partner organisations to answer questions.
Core values: Relevance, effectiveness and efficiency.
The logical framework is a useful tool that can help you clarify the project objectives (see Part 3, pages 68-83). Commonly known as the ‘logframe’, it helps to summarise a project in a logical sequence. If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a QAS is given in Table 4.28.
Where the QAS is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of the activities associated with the QAS, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a QAS is given in Table 4.29. In using this framework, the key questions to ask to see if the QAS is meeting its stated objectives become evident. For example:
z Is adequate agricultural information more accessible?
z Is the QAS directly or indirectly reaching the farmers, researchers, etc.?
z Does it contribute to increased agricultural production?
z Has awareness of the QAS increased?
z Is there more information flow from farmers, extensionists, researchers, etc.?
z Are the answers the QAS provides timely, relevant and cost-effective?
Clarifying data needs and stakeholder participation
When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation: in its planning and design, as part of the team, as sources of information, and in providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.
The range of stakeholders in a QAS could include:
z users (primary stakeholders, e.g., farmers, extension officers, researchers, students, policy-makers)
z QAS staff and management
z partners in delivering the answers (e.g., research institutes, government agencies, experts, extension officers)
z information suppliers (e.g., magazines, agricultural databases)
z funding agencies/donors
Your stakeholders will play different roles depending on the focus of the evaluation. You should therefore take into account:
z their interests (stake) when designing the evaluation
z how they can actively contribute to the design of the evaluation
z how to include them in the evaluation team
z their role as a source of information (e.g., via questionnaires, interviews, workshops)
z their feedback on evaluation results
One way of identifying your primary and secondary stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the QAS and how to include them in the evaluation. Table 4.30 provides an example of a stakeholder analysis.

Table 4.28 Logical framework for a QAS

GOAL
Intervention logic: To contribute to improved national agricultural production

QAS PURPOSE
Intervention logic: Availability of agricultural information increased
Objectively verifiable indicators: After 3 years, no. of users increased to 200 per month
Means of verification: Management information system (MIS) (routine records)
Assumptions: QAS contributes to increased production

EXPECTED RESULTS
Intervention logic: Improved awareness among stakeholders; improved availability of appropriate information materials; improved feedback on farmers’ needs to researchers
Objectively verifiable indicators: After 3 years: 50% of primary stakeholders aware of QAS; 60% of questions can be addressed with available information materials; 75% of agri-researchers aware of priority information needs of farmers
Means of verification: Survey; MIS (routine records); survey
Assumptions: Increased awareness leads to increased use; researchers take farmers’ needs seriously

ACTIVITIES
1.1 Organise meetings with stakeholders to discuss QAS and its promotion
1.2 Develop promotional campaign with stakeholders
1.3 Implement campaign
2.1 Conduct user needs survey of stakeholders to identify priority areas that the QAS should be addressing and the type of materials needed
2.3 Develop implementation plan to increase resources to information access based on needs identified
2.4 Access resources to build QAS
3.1 Identify information gaps that researchers need to be addressing for farmers
3.2 Feed results of farmer needs to researchers through meetings
(Sometimes a summary of resources/means is provided in the indicators box, and a summary of costs/budget in the means of verification box. The assumptions box states what must hold true, if the activities are completed, to deliver the expected results.)

INPUTS
Project funding from donor agencies (list resources available)

Table 4.29 Logic model for a QAS
(Each focus area is described in terms of resources, activities, outputs (indicators of results), outcomes (indicators of effects) and, where relevant, impact.)

Planning and development

User needs identification
Resources: Open-minded staff; transport, postage, copies; willingness of users to participate
Activities: Design and administer questionnaires and interviews; conduct meetings; analyse data and draw conclusions
Outputs: No. of questionnaires administered; no. of interviews conducted; no. of meetings conducted
Outcomes: Information needs of target users; no. of changes made based on user needs analysis

Involving stakeholders
Resources: External-oriented staff; transport; willingness of stakeholders to be involved
Activities: Calls, visits, meetings with external partners; develop agreements
Outputs: No. of calls and visits made and meetings organised; no. of new external experts involved
Outcomes: Agreements made and network developed to support QAS

Process development
Resources: Dedicated staff, view from client perspective; willingness to address bottlenecks
Activities: Process analysis; address bottlenecks; develop procedures and forms
Outputs: No. of bottlenecks reduced; procedures manual compiled; request form designed; evaluation form designed; follow-up form designed
Outcomes: Time and costs per question reduced; user satisfaction increased

Developing information sources
Resources: Own resources (books, reports, databases); partners; internet
Activities: Identify and select information resources; develop access to resources
Outputs: No. of new relevant books, full-text articles and brochures made accessible; databases accessible
Outcomes: % of questions that can be fully answered

Budgeting and financing
Resources: Capable staff; support of management and donor agencies; organisational guidelines
Activities: Make a financial plan; specify activities; calculate the budget; staff and management agree on financial priorities
Outputs: Clear financial proposal; clear financial report
Outcomes: Funds available; expenditure within budget limits

Implementation/operations

Promotional activities
Resources: Motivated, skilled staff; promotional materials (website, cards, brochures); relationship with media
Activities: Posters/cards/brochures distributed; meetings organised; websites maintained; media informed
Outputs: No. of designed cards, brochures distributed; no. of meetings and participants; no. of press releases
Outcomes: % of potential target groups aware of the service; number of partners (experts involved); no. of QAS users increases
Impact: Increase in no. of questions asked; decrease in % of questions asked outside scope

Delivery of services
Resources: Motivated, skilled staff; office space, equipment, database, information sources; contracts with experts; relations with other information centres; adequate procedures
Activities: Questions received and registered; search for answers; relate with experts; refer to other QASs; questions answered; answers followed up
Outputs: No. of questions answered; no. of unique users increased; no. of answers followed up
Outcomes: % of users being satisfied with the answers; % of users with new requests
Impact: % of users able to implement the answer; % of farmers able to improve agricultural production as a result of the QAS

Monitoring and evaluation

Monitoring and evaluation
Resources: Capable staff; clear procedures; adequate registration; indicators of quality, effectiveness, efficiency
Activities: Define indicators; design report formats; organise data collection; generate statistics and reports
Outputs: Clear management information system (MIS) and reports produced every month, 3 months and 12 months
Outcomes: Improved effectiveness and efficiency of the service

Table 4.30 Stakeholder analysis for a QAS

Users
Benefits from the project: Improved knowledge about agricultural production and marketing
Contributions / sacrifices: Time; cost to access the service
Influence on the project: Through needs assessment
Potential involvement in the evaluation: Represented on the team; involved in analysis and decisions on actions for change; an information source; informed about main findings

Partners (research, extension, other libraries)
Benefits from the project: Improved services; improved outreach; feedback
Contributions / sacrifices: Expertise, time and costs involved; change of approach
Influence on the project: Through contracts/agreements; through evaluation
Potential involvement in the evaluation: Representation in team; to be informed about the main findings

QAS management and staff
Benefits from the project: Improved services; improved outreach; promotional effect
Contributions / sacrifices: Time to develop concept/convince funding agency; extra efforts
Influence on the project: Through project plan, project management and implementation
Potential involvement in the evaluation: To be involved in planning; to be in the team; to be informed about results

Funding agencies
Benefits from the project: Results to show to their clientele
Contributions / sacrifices: Finance and time
Influence on the project: Project concept and plan; through monitoring and evaluation
Potential involvement in the evaluation: To be informed about evaluation planning; to be informed about results

Table 4.31 Key questions stakeholders could be interested in

Users
Questions they are interested in: Is the information provided relevant, timely and applicable? Is it provided in an appropriate format? Is the overall service satisfactory?
Potential use of the answers: Continue/discontinue using the QAS; recommend the QAS to others

Partners (extension, research, other libraries)
Questions they are interested in: How effective and efficient is the QAS? Does it enhance their own service? Does it generate adequate feedback on information needs?
Potential use of the answers: Continue collaboration with the QAS

QAS management and staff
Questions they are interested in: How effective and efficient is the QAS? Which target groups and themes provide possibilities to expand the service or to improve its quality? What are the strengths, weaknesses, opportunities and threats? How can the QAS be financed in future?
Potential use of the answers: Continue or discontinue the QAS; identify areas of expansion or improvement of the QAS; identify ways to increase income and reduce costs

Funding agency
Questions they are interested in: Does the QAS have an impact on agricultural production and marketing? How sustainable is the QAS after we stop funding?
Potential use of the answers: Decide whether to continue, reduce or increase funding for the QAS; identify areas of assistance

When considering the stakeholders’ interests in the QAS, it is important to be aware of the kind of questions that they would like to have answered. To ensure that you have their views on what questions they are interested in, you need to work with them to make an inventory of these questions. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. Table 4.31 gives an overview of the kind of key questions this initial work might produce.
Defining the evaluation focus, questions and indicators
Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of a QAS extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders’ key questions and the time and cost of accessing data. The focus of the evaluation may be on:
z one or more of the key evaluation criteria (i.e., accessibility, impact, relevance, sustainability, usability, utility, effectiveness, efficiency)
z one or more of the primary stakeholder groups (e.g., livestock farmers, extension officers, policy-makers)
z one or more themes (e.g., poultry keeping, soil erosion)
z one or more steps in the delivery of the QAS (e.g., timeliness, follow-up)
Besides the general questions you need to ask during your QAS evaluation (e.g., who are the actual users, what type of information is in most demand), asking specific questions will help you to improve key areas of the QAS (e.g., are you providing appropriate information to livestock farmers, one of your key stakeholder groups?). With each key question, you need to think about the indicators that will help answer that question and determine the type of data you need to collect. Remember to consider the interests of the stakeholders and the ease of access you have to the data you need in terms of time and cost. Table 4.32 provides an example of the focus, key questions and indicators for a QAS evaluation. You can use the information here to develop a more specific set of questions, relevant to the focus of the QAS. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation.
With reference to Table 4.32, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the QAS, or on the criteria of usability and accessibility because of their importance to the users. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the QAS. It is clear that access to good, relevant data depends, to a large extent, on how well your QAS registration and monitoring system functions. However, certain aspects, such as impact, relevance and awareness, require an extensive survey amongst users and/or potential users. If the QAS carries out regular user needs surveys and follow-up activities, it is possible that some of these data already exist and can be used to determine impact, relevance, awareness, and so on.

Table 4.32 Determining the focus, key questions and indicators for the evaluation of a QAS, using some evaluation criteria as examples

EVALUATION CRITERIA | FOCUS (SCOPE) | KEY QUESTIONS | INDICATORS | STAKEHOLDER INTEREST | DATA ACCESSIBILITY
IMPACT | General | Have QAS users been able to use the service for improving productivity? | % of users who have been able to use answers for improving productivity | High | Low
RELEVANCE | Information needs of users | To what extent does the QAS address the main information needs of its users? | % of users indicating that the QAS addresses one of their main information needs | High | Low
EFFECTIVENESS | Volume | What is the volume of the QAS? | No. and type of questions received; no. and type of questions answered | High | High
EFFECTIVENESS | Awareness | Are target groups aware of the service? | % of target group indicating awareness of the service | High | Low
ACCESSIBILITY | Ease of access | Is the service accessible to target groups? | % of target group that does not have easy access to one of the QAS communication channels | High | Medium
USABILITY | | Were the answers provided easily understood? | % of users saying that the answers were prepared in a way that was easily understood | High | Medium
USABILITY | | Were answers provided to users in an appropriate format? | % of users being satisfied with the format of the answer | High | Medium
USABILITY | User satisfaction | Were they satisfied with the content of the answer? | % of users who were satisfied with the content of the answer | High | Medium
USABILITY | User satisfaction | Are users satisfied with the QAS? | % of users being satisfied with the QAS | High | Medium
EFFICIENCY | Time and costs | Is the service delivered in a timely and cost-efficient way? | Average cost and time spent per question by the QAS staff | Medium | Medium
EFFICIENCY | Methodology used | Is the methodology adequate? | Weaknesses and complaints indicated by stakeholders on methods used to promote and to deliver | Medium | Medium
EFFICIENCY | Information sources | Are the information sources used adequate? | % of relevant questions that can’t be adequately answered | Medium | High
SUSTAINABILITY | Potential for continuation | What is the potential for continuation of the QAS? | No. of years of guaranteed funding; % and growth rate of self-financing | Medium | Medium

Collecting the data
When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:
z that all relevant data will become available during the evaluation
z that no more data are collected than what is needed and can be analysed
Table 4.33 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use; you might have to find more creative ways to get the information you need, or decide against using that particular indicator. You need to check that all the questions relevant to your focus are included in your data collection design.
Table 4.33 Example of data collection methods for evaluating a QAS

Desk study
Information source: QAS records
Key issues / questions: What is the volume of the service? No. and type of users; no. and type of questions. Are users satisfied with: the service as a whole; the relevance of the answer; the information format; the timeliness? % of answers being late; % of users implementing answers
Answers expected and concerns: Check reliability of registration

Desk study
Information source: Accounts department
Key issues / questions: What is the cost per question? What is the time spent per step in the QAS process?
Answers expected and concerns: Total cost of the QAS divided by no. of questions answered gives you the cost per question; time spent per step comes from the registration system

Desk study
Information source: Bureau of Statistics
Key issues / questions: No. of farmers and extension workers; access to communication channels (post, radio, telephone, internet, etc.)
Answers expected and concerns: Check reliability of data

Focus group discussion
Information source: Selected users
Key issues / questions: What helps or hinders using the QAS? What helps or hinders the implementation of QAS answers?
Answers expected and concerns: Time consuming to include a large enough random sample

Interviews
Information source: Staff and partners
Key issues / questions: Strengths, weaknesses, opportunities and threats of the QAS, in particular in information sources, delivery process, promotion and partnerships
Answers expected and concerns: Interviews can also be used to help forge linkages

Workshop
Information source: Partners
Key issues / questions: Strengths, weaknesses, opportunities and threats of the QAS; possibilities to improve collaboration
Answers expected and concerns: Aim to finalise the workshop with a clear statement on what to improve and who will do what

For acceptance and learning purposes, you should include the stakeholders, as far as possible, in your data collection activities. This increases their involvement in the evaluation process; they will learn directly from the feedback they get, and will be less likely to challenge the results. Workshops are useful not only for data collection purposes, but also for data analysis and reporting purposes.
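The cost calculation noted in Table 4.33, total cost of the QAS divided by the number of questions answered, is simple enough to keep in a small script and re-run each reporting period. The figures below are hypothetical, for illustration only:

```python
# Cost per question, following the note in Table 4.33:
# total cost of the QAS divided by no. of questions answered.
total_annual_cost = 12000.0   # assumed annual cost of running the QAS
questions_answered = 800      # assumed questions answered in the year

cost_per_question = total_annual_cost / questions_answered
print(round(cost_per_question, 2))  # prints 15.0
```

Tracking this figure over successive periods, alongside time spent per step from the registration system, shows whether efficiency is improving.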
Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.34 will help you to start thinking in this way. For example, if you want to know what users think about the QAS, you should already be thinking about ways to select interviewees and to interview them. In designing the data analysis for a QAS evaluation, you need to ask the following questions:
z Which analytical tools should I use? (e.g., a table featuring the type of questions versus the type of users, a graph showing the increase in users over time, or specific analytical tools such as a problem tree, process flow chart or SWOT analysis)
z Which collection methods will provide the necessary information for each of the tools?
z What type of observations do you expect to make from each tool?
z What type of recommendations do you expect to get from each tool?

Table 4.34 Example of a data analysis design for a QAS evaluation

Table 1: Effectiveness: type of questions and type of users
z Results from collection method used: desk study
z Observations to make: high or low total output; categories of participants who are over/under-represented
z Conclusions to reach: overall conclusion on results; categories to give more attention to

Table 2: Effectiveness: awareness
z Results from: desk study, questionnaires, interviews
z Observations to make: awareness of the service
z Conclusions to reach: whether the service needs to be more aggressively promoted

Table 3: Usability: user satisfaction
z Results from: desk study, interviews
z Observations to make: high and low satisfaction in relation to the service as a whole, the language used in the answer, the format and the completeness of the answer
z Conclusions to reach: priorities for improvement

Table 4: Efficiency
z Results from: data from accounts
z Observations to make: costs per question; time per process step
z Conclusions to reach: potential areas for reducing costs/time

Figure 1: Process flow chart
z Results from: QAS procedures manual, interviews
z Observations to make: bottlenecks in the process
z Conclusions to reach: process steps to improve

Figure 2: Problem tree
z Results from: questionnaires, interviews, workshop
z Observations to make: hindrances in implementing the QAS effectively
z Conclusions to reach: areas for improvement

Figure 3: SWOT analysis
z Results from: workshop with staff and stakeholder representatives
z Observations to make: strengths, weaknesses, opportunities and threats of the QAS
z Conclusions to reach: new QAS strategy

Once you have designed the data analysis, it is important to see if your data collection includes all the elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data.

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process, to ensure that the conclusions and recommendations from the evaluation are well understood and supported. In designing this strategy, you need to consider these issues:
z Which communication methods will suit which stakeholders, especially the primary stakeholders?
z What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

An effective communication plan for critical reflection during an evaluation and reporting findings during and after the evaluation helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don't accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
z Do the stakeholders share the same views on the problems with the QAS?
z Do they have the same views on the solution(s)?
z Are they prepared to support the same solution(s)?
z What are the obstacles that will prevent them from implementing the solution(s) proposed?
z What can be done to address the obstacles they face?
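Returning to the data analysis design in Table 4.34: the first table there, type of questions versus type of users, is a straightforward cross-tabulation of the registration records. A minimal sketch, with invented user and question categories:

```python
from collections import Counter

# Invented records: in practice each pair comes from the QAS registration system.
records = [
    ("farmer", "crop pests"), ("farmer", "crop pests"),
    ("farmer", "marketing"), ("extension worker", "crop pests"),
    ("student", "marketing"),
]

crosstab = Counter(records)  # (user type, question type) -> count
users = sorted({u for u, _ in records})
topics = sorted({t for _, t in records})

# Print one row per user type; missing combinations count as zero,
# which is exactly what shows up over- and under-represented categories.
for u in users:
    row = {t: crosstab[(u, t)] for t in topics}
    print(u, row)
```

The zero cells and the largest cells in the resulting table point directly to the categories to give more attention to.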
There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, a memorandum, a web page). Table 4.35 gives an example of a communication strategy for a QAS evaluation. If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement the solution because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Table 4.35 Example of a communication plan for reporting the findings from a QAS evaluation

SWOT analysis
z Stakeholders to report to: representatives of the QAS and partners
z Main issues to discuss / report on: critically reflecting on the results from the data collection

Brainstorming
z Stakeholders to report to: QAS staff and management and selected partners
z Main issues: creative solutions to problems identified

Full report, including executive summary
z Stakeholders to report to: funding agency; QAS management
z Main issues: quantitative and qualitative outputs; user satisfaction; impact indication; cost-effectiveness

Personal meeting
z Stakeholders to report to: QAS management
z Main issues: conclusions and recommendations; feedback on individual staff members

Follow-up workshop
z Stakeholders to report to: representatives of the QAS and partners
z Main issues: decisions by management; addressing obstacles to implementation

Article on experiences of the QAS
z Stakeholders to report to: professionals active in similar fields
z Main issues: evaluation approach and results

QAS brochure
z Stakeholders to report to: farmers
z Main issues: examples of QAS impact stories

Box 4.9 Evaluating a QAS: guidelines checklist

These guidelines on evaluating a QAS are covered above:
z QAS concept and objectives
z Data needs and stakeholder participation
z Evaluation focus, questions and indicators
z Data collection
z Data analysis
z The communication plan

For more on:
z data analysis, see Part 2, page 60 and Part 3, pages 157-162
z data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
z evaluation communication and follow-up, see Part 3, pages 163-173
z evaluation criteria (scope), see Part 2, pages 33-34
z indicators, see Part 3, pages 91-98
z logframe, see Part 3, pages 68-83
z logic model, see Part 3, pages 103-105
z stakeholder participation, see Part 1, pages 3-5
z terms of reference, see Part 2, pages 32-35

SMALL LIBRARY / RESOURCE CENTRE

These guidelines relate to evaluating a small library or resource centre, using the terms of reference (see Part 3, pages 99-103) as the
basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of ‘critical reflection’ is discussed towards the end of the guidelines, but we advise you and your evaluation team to ‘reflect’ on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation. (See Part 1, pages 5-6). The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here. Before evaluating a small library or resource centre, you need to be clear about: z what the main elements of the library or resource centre are z its concept(s) and objectives z who the primary and secondary stakeholders are z how to go about identifying the needs of primary and secondary stakeholders z what the focus of the evaluation is, the questions to ask and the indicators to use z how to collect the data and analyse them z how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon What is a small library or resource centre? A library is more than a large organised collection of books for reading or reference – it is an entity that provides access to knowledge and learning. 
It is also an information service that responds to the needs of a particular group of people and is central to knowledge management. It can be an organisation in itself, or part of an organisation. With the increased availability of new technologies such as the internet, library services have evolved to take advantage of new opportunities. Collections are no longer restricted to books and journals; there is now online access to databases and a wide range of publications from all over the world. Thus, the difference between a small library and a resource centre has narrowed, and the two can now be regarded as more or less the same. Here, we regard them as the same, and use the term 'library' to cover both small libraries and resource centres.

Determining the library concept and objectives

You can't evaluate your library unless you are clear what it is about. The concept (idea) behind the library and the objectives it seeks to achieve need to be clearly stated, otherwise you can't compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process. In determining a library concept, you should ask these questions:
z Why was the library established?
z What were the main objectives (expected results) and main problems to be addressed at the time?
z What is the goal of the library?
z What are its core values?
z Who are the target users?

In evaluating the library, you need to ask yourself key questions related to the concept, such as:
z Do the results achieved reflect its core values and approach?
z Do they reflect the project purpose and contribute to the overall objective?
z Does the library address the main objectives and problems it was supposed to?
z Are the correct messages being conveyed?
Box 4.10 Example of a concept behind a library focusing on economic planning

Main problem: The DocCentre was established to support the National Planning Agency in carrying out macro-economic planning. A new chief librarian was hired to oversee operations in the DocCentre, and was faced with the challenge of building its resources and developing links with other key information providers to support the government in its efforts to meet the demands of an increasingly globalised world.

Goal: To support macro-economic planning through the provision of information to government officers.

Purpose: Economists, policy-makers and planners provided with appropriate information to support their planning, research and modelling activities, to help the government formulate appropriate policies and to help decision-making processes.

Expected results: Services improved to meet the information needs of the policy-makers, planners and economists within the government service.

Main approach: Questionnaires have been sent out and meetings planned for the stakeholders to identify their data needs and ways in which they can help strengthen the information flow process. Also, librarians are being trained to use appropriate information technologies to access global information resources.

Core values: Relevance, effectiveness, impact.

The logical framework can help you clarify the project objectives (see pages 68-83). Commonly known as the 'logframe', it helps to summarise a project in a logical sequence. If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a library is given in Table 4.36.
Table 4.36 Logical framework for a library focusing on economic planning

GOAL
z Intervention logic: to promote macro-economic planning of the economy through the provision of information
z Objectively verifiable indicators: reports produced to support the decision-making process
z Means of verification: quarterly and yearly reports on the performance of the economy

LIBRARY PURPOSE
z Intervention logic: access to a wide range of information to support economic planning and policy analysis provided
z Objectively verifiable indicators: after 2 years, key resources acquired to support economic planning and policy-making
z Means of verification: routine and existing records; interviews, questionnaires; feedback from the users
z Assumptions: library contributes to better economic planning and policy analysis

EXPECTED RESULTS
z Intervention logic: 1. improved resources; 2. improved awareness among stakeholders
z Objectively verifiable indicators: level of satisfaction of users; feedback from library users; increased level of visits
z Means of verification: interviews/questionnaires; routine and existing records
z Assumptions: library is important to stakeholder needs; increased awareness leads to increased visits

ACTIVITIES
z 1.1 Develop and administer a questionnaire to identify priority areas important to the stakeholders
z 1.2 Meet with librarians, staff, partners and primary stakeholders to discuss objectives and needs in line with feedback, so as to develop the library resources
z 1.3 Identify and acquire resources
z 2.1 Develop and implement a plan with stakeholders on how to promote the library
z (At this level, the indicators box sometimes holds a summary of resources/means, the means of verification box a summary of costs/budget, and the assumptions box states what must hold true, if the activities are completed, to deliver the expected results)

INPUTS
z Project funding from the donor agency and core funding from the Ministry of Finance and Planning
z List resources available

Where the library is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use.
It provides an overview of the activities associated with the library, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a library is given in Table 4.37. Using this framework, you can see the key questions to ask to check whether the library is meeting its objectives over time. For example:
z Are the expected results that have been achieved in line with its core values and approach?
z Are the expected results in line with the project purpose and do they contribute to the goal?
z Does the library address the main problem it was supposed to?
z Have visits to the library increased?
z Are users satisfied with the library?

Regular evaluation will help you to ensure your library's continued relevance to its primary stakeholders, and suggest ways in which you might improve it. Be aware, however, that librarians have differing views on the approach an evaluation should take. Approaches to consider include:
z an objective-oriented approach, which looks at the library's goals and objectives so as to determine the extent to which they have been achieved
z a management-oriented approach, which tries to identify and meet the information needs of management
z an expertise-oriented approach, which relies on professional expertise to determine the quality of the library service
z a participant-oriented approach, which emphasises the role of stakeholders in the whole evaluation process

Here, we describe a combination of the objective- and participant-oriented approaches.

Table 4.37 Logic model for a library

Planning and development

Identify stakeholders' needs
z Resources: staff; facilities to hold meetings; transportation
z Activities: meeting to discuss the development of the library and identify key stakeholders
z Outputs (indicators of results): list of ideas; list of initial key stakeholders and their needs
z Outcomes (indicators of effects): draft library concept and who to involve

Involve key stakeholders
z Resources: time needed by staff from key stakeholder institutions to meet/discuss via telephone or email; stationery, transportation, postage, fax, telephone, computer, printer
z Activities: brainstorming meetings to develop questionnaires for feedback and to identify other stakeholders
z Outputs: library objectives; number of questionnaires sent to stakeholders
z Outcomes: agreed objectives; feedback on the library

Process development
z Resources: time and willingness of stakeholders to meet
z Activities: develop a plan to improve the library's facilities
z Outputs: plan to expand the resources and range of the library's services
z Outcomes: operational plan for library services

Funds
z Resources: funds from organisations to support the library
z Activities: budget prepared for the development of the library
z Outputs: budget; financial report
z Outcomes: agreement on funds available for the library

Implementation / operations

Implementation of the library services
z Resources: time and assistance of additional expertise to help librarians provide a range of services; library resources and facilities
z Activities: accessing resources and placing additional services in the library to help users access information in an easier way
z Outputs: additional resources and facilities made available to users
z Outcomes: knowledge gained
z Impact: % of primary stakeholders changing their practices as a result of using the library

Promotional activities
z Resources: time of staff and key stakeholders
z Activities: meetings to develop promotion of the library
z Outputs: strategy developed
z Outcomes: % change in visits to the library
z Impact: increased visits and enquiries

Monitoring and evaluation
z Resources: time for staff to implement clear procedures to record routine data; time for staff to input routine data
z Activities: determine indicators; identify appropriate data collection methods; design reporting formats to collect and analyse data
z Outputs: development of a database to support monitoring and evaluation
z Outcomes: improved library services (in terms of effectiveness and efficiency) to meet stakeholder needs
Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders. The range of stakeholders in a library could include:
z individuals using the library (e.g., researchers, journalists, students, teachers, farmers) – the primary stakeholder group
z the organisation that houses the library (e.g., university, research institute)
z librarian(s)
z local authorities
z funding agencies

One way of identifying your primary and secondary stakeholders is to conduct a stakeholder analysis. It helps you identify key stakeholders, how they benefit from and contribute to the library, and how to include them in the evaluation. Table 4.38 gives an example of a stakeholder analysis. When considering the stakeholders' interests in the library, be aware of the kind of questions that they would like to have answered. Table 4.39 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders.

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations.
It is not possible to cover all the elements of a library extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on the stakeholders' key questions and the time and cost of accessing data. Evaluations of libraries can vary greatly in scale and style. They might cover the whole range of library activities, or focus on an individual activity, such as the enquiry service. Thus, the evaluation might take more than a week or it might take only a day.

Table 4.38 Stakeholder analysis for a library focusing on economic planning

Users
z Benefits from the project: improved knowledge and access to economic data
z Contributions / sacrifices: time to access the service; cost to access the service
z Influence on the library: how the service is delivered; range of services offered
z Potential involvement in the evaluation: represented on the team; involved in analysis and decisions on actions for change; an information source; informed about main findings

Librarians, management, staff
z Benefits from the project: better information for their target groups; networking
z Contributions / sacrifices: time and technical input
z Influence on the library: library service improved
z Potential involvement in the evaluation: representation on the evaluation team; need to be informed about the evaluation results

Partners
z Benefits from the project: improved information; networking
z Contributions / sacrifices: time and effort to provide information and promote the library
z Influence on the library: quality of information made available
z Potential involvement in the evaluation: could be included on the evaluation team; need to be consulted during the evaluation and informed of key results

Funding agencies
z Benefits from the project: results to show to their clientele
z Contributions / sacrifices: finance and time
z Influence on the library: on the project concept and plan, and on monitoring and evaluation
z Potential involvement in the evaluation: should be informed about the evaluation results

Table 4.39 Key questions stakeholders could be interested in

Users
z Questions they are interested in: Is the information provided relevant, timely and applicable?
z Potential use of the answers: Will they use the library? Are they likely to recommend it to others?

Staff (librarians) and management of the library
z Questions they are interested in: How relevant and effective is the library? Is it informative, popular, well presented?
z Potential use of the answers: How should we maintain/develop the library?

Partners
z Questions they are interested in: How relevant and effective is the library?
z Potential use of the answers: Should we continue to support the library?

Funding agencies
z Questions they are interested in: Does the library have an impact on policy-making and planning?
z Potential use of the answers: Should we continue, reduce or increase our funding? In what areas can/should we assist? How sustainable is the library if we stop funding?

Examples of key questions include: How can the library have more impact? Who is using the library? Which of the target groups are not accessing it, and why? Obtaining such information could help you to introduce innovations to attract more users. By documenting the books used, you can find out what to do to increase usage by buying more relevant resources. With each key question, you need to think about the indicators you will need to help you answer that question. Indicators also determine the type of data you will need to collect. Table 4.40 provides an example of focus, key questions and indicators for a library evaluation.

Table 4.40 Determining the focus, key questions and indicators for the evaluation of a library, using some evaluation criteria as examples

RELEVANCE
z Usefulness: Do the users find the library useful? Which of the target groups use it most? Indicators: % of users who find the library useful; % of users who want more resource material in the library; the most used resources in the library (stakeholder interest: High; data accessibility: High)

ACCESSIBILITY
z How easy is it to access the library resources? Are the opening hours appropriate? Indicators: % of users who have no difficulty accessing the library's resources; % of users who say that the opening hours are appropriate (High; High)

USABILITY
z Overall satisfaction with the library service: Are you satisfied with the library's services? Indicators: % of users satisfied with the overall library service; % of users satisfied with specific services of the library (High; Medium)
z Facilities: Are the library's collections appropriate for the target groups' needs (right level, up to date, etc.)? Are the library's resources organised in a way that you can easily find them? Is the library's building sufficient for the proposed services (space, storage, security, etc.)? Indicators: satisfaction level of feedback from library staff and users; % who say they are satisfied with the library's facilities (High; High)
z Services: Are the staff competent enough to give satisfactory service to all library users? Indicator: % of users who say that they are satisfied with the library staff's level of service (High; Medium)

EFFECTIVENESS
z Awareness: Do you know about the library? How did you find out about it? Indicators: % of visitors who say that they know about the library; ways in which users find out about the library (e.g., via colleagues, friends, newsletters, publications, website, etc.) (High; Medium)

IMPACT
z Improved knowledge: Have the attitudes or practices of the users changed in any way? Indicators: % of users who say that they gained knowledge from the materials consulted and changed their attitude as a result; % who say they have passed on their knowledge to others (High; Low)

You can use the information here to develop a more specific set of questions, relevant to the focus of the library. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. With reference to Table 4.40, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the library, or on the criteria of usability and accessibility because of their importance to the users.
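Most of the indicators in Table 4.40 are percentages of users giving a particular answer. Once the questionnaire responses are coded, each indicator reduces to a one-line computation; a minimal sketch, with invented responses and field names:

```python
# Invented questionnaire responses; the coded fields would come from your own survey form.
responses = [
    {"finds_useful": True, "satisfied_overall": True},
    {"finds_useful": True, "satisfied_overall": False},
    {"finds_useful": False, "satisfied_overall": True},
    {"finds_useful": True, "satisfied_overall": True},
]

def pct(items, key):
    """Percentage of respondents whose coded answer to `key` is 'yes'."""
    return 100 * sum(r[key] for r in items) / len(items)

print(f"% of users who find the library useful: {pct(responses, 'finds_useful'):.0f}%")
print(f"% of users satisfied overall: {pct(responses, 'satisfied_overall'):.0f}%")
```

The same helper works for any yes/no indicator in the table, which is one reason to code questionnaire answers consistently when you design the survey.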
If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the library. Drawing up a matrix like Table 4.40 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your library.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:
z that all relevant data will become available during the evaluation
z that no more data are collected than what is needed and can be analysed

Table 4.41 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be expensive to use; you might have to find more creative ways to get the information you need, or perhaps decide against using that particular indicator. It is helpful if you have been collecting data on the library on a regular basis. Routine monitoring records can provide a lot of data at no cost, and most of the data can be easily transformed into valuable information.

Table 4.41 Example of data collection methods for evaluating a library focusing on economic planning

Desk study (source: library records; internal memoranda on meetings with colleagues, research, letters/emails)
z Key issues / questions: no. and type of users; What are their main subjects of interest? Is the service making a difference to planning and policy-making? Is the service doing what it set out to do? Are the users satisfied?
z Answers expected and concerns: some economists and planners do not have access to the service; satisfaction ranging from very satisfied to very dissatisfied
Desk study (source: accounts department)
z Key issues / questions: How much time and money are involved in accessing and updating resources?
z Answers expected and concerns: the time and money needed to develop a library might vary widely

Interviews and questionnaires (source: economists, planners, etc.)
z Key issues / questions: Are the users using the information? Has the quality of papers, policy briefs and analyses of the economy improved?
z Answers expected and concerns: time-consuming and expensive; self-administered questionnaires are less expensive, but the respondents might not complete them correctly

Workshop (source: partners and international agencies, e.g., World Bank, commodity agencies)
z Key issues / questions: strengths, weaknesses, opportunities and threats relating to the library; possibilities for improving collaboration
z Answers expected and concerns: how to improve collaboration

Table 4.42 Example of routine data to collect for a library, to support evaluations

Physical facilities and services
z No. of (new) books, periodicals, reports, slide sets, posters, audio-cassettes, videos, CD-ROMs, etc. for the public
z No. of desks and seats
z No. of rooms
z No. of photocopy machines
z No. of computers, with or without CD-ROM
z Access to the internet
z Access to information sources (also a virtual library)

Assistance
z No. of enquiries per month (as against the expected number of enquiries)
z Materials used to answer enquiries

Information sources
z No. of materials for which new editions have been obtained
z No. of materials thrown away over time
z No. of new acquisitions as a result of efforts made by staff
z No. of materials donated or exchanged for publications
z No. of materials obtained that are published regularly in the country
z No. of materials added to each subject area of the collection on a regular basis

Availability
z No. of hours intended to be open per month (or quarter or year)
z No. of hours actually open per month (or quarter or year)

Visitors
z No. of visitors each month
z No. of male/female visitors
z Average no. of visits made by each user each month
z Average no. of visits made each day that the library is open
z No. of visits made each month by different categories of users

Materials used
z Subject areas most often requested or used in the past month
z Types of materials most often requested or used in the past month

Types of services used
z No. of times the various services offered to visitors are used per month (e.g., lending, photocopying, use of the database, document supply, or literature searches)

Services used
z No. of resources consulted/lent on a regular basis (e.g., monthly, quarterly)
z No. of enquiries made on a regular basis
z No. of photocopies made on a regular basis

Feedback
z Information on how visitors became aware of the service
z Opinions on the service provided
z Areas for improvement

Your routine records could contain, for example, name, age, gender, profession, time entering and leaving the library, average time spent in the library, materials consulted, subjects, and remarks (see Table 4.42). You could also position cards strategically in the library, asking users for feedback. The data from all these records can be tabulated and mapped over time. Other methods can be used to explore further the patterns or trends emerging from these data.

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.43 will help you to start thinking in this way. For example, if you want to know what staff think about the library, you should already be thinking that you might have to interview staff individually or in focus groups. In designing the data analysis for a library evaluation, you should ask the following questions:
z Which analytical tools should I use?
(e.g., a table featuring the type of questions versus the type of users, a graph showing the increase in users over time, or specific analytical tools such as a problem tree, process flow chart or SWOT analysis)
z Which collection methods will provide the necessary data for each tool?
z What type of observations do you expect to obtain from each tool?
z What type of conclusions do you expect to reach using each tool?

Once you have designed the data analysis, it is important to see if your data collection includes all the elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data.

Table 4.43 Example of a data analysis design for a library evaluation

Table 1: Who are the users of the library?
z Results from collection method used: desk study, routine records
z Observations to make: range of the users of the library; type of resources that would be of interest to the users
z Conclusions to reach: category of users to give more attention to

Table 2: Relevance
z Results from: desk study, questionnaires, interviews
z Observations to make: how useful the library is, given the needs of the group
z Conclusions to reach: whether the library provides information that is relevant or not

Table 3: Usability of the service
z Results from: desk study, questionnaires, interviews
z Observations to make: whether the users are satisfied with the quality of the service in terms of the materials offered, the building and the services of staff
z Conclusions to reach: whether the resources, building and staff assistance provided are adequate; priorities for improvement and obstacles; can determine the direction the library should take

Table 4: Effectiveness
z Results from: desk study, questionnaires, interviews
z Observations to make: level of awareness of the library; high and low satisfaction with the library
z Conclusions to reach: promotion strategy; awareness of the library

Table 5: Efficiency
z Results from: data from accounts
z Observations to make: cost to equip and maintain the library
z Conclusions to reach: potential areas for reducing costs/time

Figure 1: Process flow chart
z Results from: library procedures manual, interviews
z Observations to make: bottlenecks in the process
z Conclusions to reach: process steps to improve

Table 6: Impact stories
z Results from: in-depth interviews, case studies, feedback cards, letters to the organisation in which the library is housed
z Observations to make: whether the provision of resource materials is making a difference to the quality of analysis of the economy's performance and the way decision-making, policy-making and planning are carried out
z Conclusions to reach: whether there has been some change in the quality of work done by the users

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process, to ensure that the conclusions and recommendations from the evaluation are well understood and supported. In designing this strategy, you need to consider these issues:
z Which communication methods will suit which stakeholders, especially the primary stakeholders?
z What are the main issues to be discussed and reported, taking into consideration the communication method and target group?
An effective communication plan for critical reflection during an evaluation and for reporting findings during and after the evaluation helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don't accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
z Do the stakeholders share the same views on the problems with the library?
z Do they have the same views on the solution(s)?
z Are they prepared to support the same solution(s)?
z What are the obstacles that will prevent them from implementing the solution(s) proposed?
z What can be done to address the obstacles they face?

There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, a brochure, a memorandum, a web page). You might also consider sharing the results with members of your staff and with library users, so that if there are changes or new additions to the library, they will not come as a surprise to them.
Table 4.44 Example of a communication plan for reporting the findings from a library evaluation

Communication method: SWOT analysis
Stakeholders to report to: Library staff
Main issues to discuss/report on: Critically reflecting on the results from the data collection

Communication method: Brainstorming
Stakeholders to report to: Library staff and key stakeholders
Main issues to discuss/report on: Solutions to problems identified

Communication method: Summary report
Stakeholders to report to: Partners, users
Main issues to discuss/report on: Results from the data collection

Communication method: Full report, including executive summary
Stakeholders to report to: Management, funding agency
Main issues to discuss/report on: Main messages of the evaluation: relevance, satisfaction, awareness, sustainability

Communication method: Short report on the evaluation of the library
Stakeholders to report to: Users
Main issues to discuss/report on: Main findings in broad terms and how they will affect the library

Communication method: A few paragraphs about the evaluation in the organisation's annual report
Stakeholders to report to: Anyone interested in the organisation
Main issues to discuss/report on: Main findings of the evaluation

If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement the solution because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Box 4.11 Evaluating a library: guidelines checklist

These guidelines on evaluating a library are covered above:
z Library concept and objectives
z Data needs and stakeholder participation
z Evaluation focus, questions and indicators
z Data collection
z Data analysis
z The communication plan

For more on:
z data analysis, see Part 2, page 60 and Part 3, pages 157-162
z data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
z evaluation communication and follow-up, see Part 3, pages 163-173
z evaluation criteria (scope), see Part 2, pages 33-34
z indicators, see Part 3, pages 91-98
z logframe, see Part 3, pages 68-83
z logic model, see Part 3, pages 103-105
z stakeholder participation, see Part 1, pages 3-5
z terms of reference, see Part 2, pages 32-35

Part 4: EVALUATION GUIDELINES: Online community

ONLINE COMMUNITY

These guidelines relate to evaluating an online community, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of 'critical reflection' is discussed towards the end of the guidelines, but we advise you and your evaluation team to 'reflect' on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6). The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines.
And in Part 3 you will find more on the evaluation tools described here. Before evaluating an online community, you need to be clear about:
z what the main elements of the online community are
z its concept(s) and objectives
z who the primary and secondary stakeholders are
z how to go about identifying the needs of the primary and secondary stakeholders
z what the focus of the evaluation is, the questions to ask and the indicators to use
z how to collect the data and analyse them
z how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is an online community?

In recent years, as a result of the increased adoption of information and communication technologies (ICTs), particularly e-mail and groupware, existing and new networks have taken to online interaction. In the development field, these online communities are flourishing. They include 'communities of ideas', 'communities of practice' and 'communities of purpose', and are being used to:
z upgrade the quality of the activities, outputs and impact of development organisations
z facilitate a collective learning process
z share information on development activities with national and international audiences

These communities generally have two main elements: a discussion list, and a platform that carries posted messages and provides access to useful resources such as documents and websites.
Online communities can:
z serve as a place where people with similar goals, interests, problems and approaches can learn from each other
z provide a forum where people can respond rapidly to individual enquiries from fellow community members with specific answers
z develop and transfer best practices on specific topics, through the sharing of knowledge
z influence development outcomes by promoting greater and better-informed dialogue
z link diverse groups of people from different disciplines
z promote innovative approaches to address specific development challenges

Determining the online community concept and objectives

You can't evaluate your online community unless you are clear what it is about. The concept (idea) behind the online community and the objectives it seeks to achieve need to be clearly stated, otherwise you can't compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process. In determining an online community concept, you should ask these questions:
z Why was the online community developed?
z What were the main objectives (expected results) and problems to be addressed at the time?
z What is the goal of the online community?
z What are its core values?
z Who are the target users?

In evaluating the online community, you need to ask yourself key questions related to the concept, such as:
z Do the expected results achieved reflect its core values and approach?
z Do they reflect the project purpose and contribute to the goal?
z Does the online community address the main objectives and problems as it was supposed to?
z Are the correct messages being conveyed?

The logical framework can help you clarify the online community objectives (see pages 68-83). Commonly known as the 'logframe', it helps to summarise a project in a logical sequence.
If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for an online community is given in Table 4.45. Where the online community is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of activities associated with the online community, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to an online community is given in Table 4.46. In using this framework you can see the key questions to ask to see if the online community is meeting the stated objectives over time. For example:
z Do the expected results achieved reflect its core values and approach?
z Do they reflect the project purpose and contribute to the goal?
z Does the online community address the main problem it was supposed to?
z Are the members satisfied with the online community?
z Has the community developed in line with its objectives?
z Has community membership increased?

Box 4.12 Example of a concept behind an online community of practice

Main problem: EVAL is an online community of practice of information practitioners and evaluation experts. The members of the community have been working together to promote monitoring and evaluation practice for information products and services. EVAL is a joint initiative of five key development agencies. Since 2005, the members have worked on various initiatives, exchanging experiences and approaches. Initially, there was intense collaboration among members, but recently EVAL has not been very active.

Goal: To have an online community on evaluation practice which supports development practitioners in the field of information products and services.

Project purpose: Information specialists, evaluators and practitioners provided with a platform to communicate, exchange experiences and collaborate on the evaluation of information products and services.

Expected results: EVAL, an online community of practice established to support the exchange of information and knowledge sharing among practitioners on the evaluation of information products and services.

Main approach: A Dgroup (a place on the internet where individuals and development organisations can come together and interact with one another) has been developed, where practitioners can exchange information and communicate with other members of the group. Face-to-face meetings have also been planned to help strengthen links and work on collaborative activities.

Core values: Relevance, effectiveness, sustainability

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders. The range of stakeholders in an online community could include:
z Platform providers
z Funding agencies (e.g., banks, student award schemes, research institutions)
z Community moderators and facilitators
z Community members and their organisations (e.g., practitioners, researchers, policy-makers, managers) – the primary stakeholders (and other groups, such as people known as 'lurkers', who don't contribute to the postings but benefit from reading them)

One way of identifying your primary and secondary stakeholders is to conduct a stakeholder analysis.
It helps you to identify the key stakeholders, how they benefit from and contribute to the online community, and how to include them in the evaluation. Table 4.47 provides an example of a stakeholder analysis.

Table 4.45 Logical framework for an online community

GOAL
Intervention logic: To contribute to improved evaluation practices in development efforts

ONLINE COMMUNITY PURPOSE
Intervention logic: Exchange of knowledge and experiences on evaluation to better manage information projects promoted
Objectively verifiable indicators: After 3 years, a considerable amount of resource material and sharing of experiences documented
Means of verification: Routine and existing records; contributions to the online community
Assumptions: Online community contributes to better information project management

EXPECTED RESULTS
Intervention logic: 1. Online community to support information practitioners in the field of evaluation strengthened
Objectively verifiable indicators: Threefold increase in online community membership; facilitators and providers aware of the needs of the online community; 100% increase in the level of interaction
Means of verification: Web statistics and routine records show level of interaction within the community, and resources available; survey; feedback of members
Assumptions: Online community is relevant; increased awareness leads to increased visits to the online community

ACTIVITIES
Intervention logic:
1.1 Meet moderator/facilitator and selected stakeholders (providers, development agencies) to discuss Dgroup objectives and structure
1.2 Brainstorm to formulate a questionnaire for feedback and develop a strategy to obtain feedback from the target group
1.3 Identify priority areas for target group(s)
1.4 Develop plan with stakeholders on how to develop the Dgroup
1.5 Place resources on the Dgroup platform and invite members to add their own resources
Objectively verifiable indicators: (sometimes a summary of resources/means is provided in this box)
Means of verification: (sometimes a summary of costs/budget is provided in this box)
Assumptions: (if the activities are completed, what assumptions must hold true to deliver the expected results)

INPUTS
Intervention logic: Project funding from donor agency and core funding from the Ministry of Agriculture
Objectively verifiable indicators: List resources available

When considering the stakeholders' interests in the online community, be aware of the kind of questions that they would like to have answered. Table 4.48 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders.

Table 4.46 Logic model for an online community

Planning and development

Focus: Identify users' needs
Resources: Staff; facilities; time
Activities: Meeting to discuss the online community and how to go about it, and to determine who should be involved
Outputs (indicators of results): List of ideas; list of initial key stakeholders
Outcomes (indicators of effects): Draft concept of online community, who to involve

Focus: Involve key stakeholders
Resources: Time staff need from key stakeholder institutions to meet/discuss via telephone, email; stationery, transportation, postage, fax, telephone, computer, printer
Activities: Brainstorm meetings to develop the online community and questionnaires for feedback and for identifying other stakeholders
Outputs (indicators of results): Objectives and structure of the online community developed; no. of questionnaires sent to stakeholders
Outcomes (indicators of effects): Agreed objectives and structure; key materials also identified; access rights to the online community and membership profile developed

Process development

Focus: Process development
Resources: Time and willingness of stakeholders to meet regularly
Activities: Develop plan to facilitate the community, place information and update it
Outputs (indicators of results): Plan to identify information sources and obtain information from the members
Outcomes (indicators of effects): Operational plan for the community

Focus: Funds
Resources: Funds from organisations to support the online community
Activities: Budget prepared for the development of the online community
Outputs (indicators of results): Budget; financial report
Outcomes (indicators of effects): Agreement on funds available for the online community

Implementation/operations

Focus: Implementation of the online community
Resources: Facilitator to put information online
Activities: Accessing site to input data and to facilitate
Outputs (indicators of results): Community online
Outcomes (indicators of effects): Knowledge gained
Impact: % of users changing their practices as a result of the online community

Focus: Promotional activities
Resources: Time required from staff and key stakeholders
Activities: Meetings to develop promotion of the community
Outputs (indicators of results): Strategy developed and implemented
Outcomes (indicators of effects): % change in the no. of members and enquiries
Impact: Increased membership

Monitoring and evaluation

Focus: Monitoring and evaluation
Resources: Time required by staff to implement clear procedures to record routine data; time required by staff to input routine data
Activities: Determine indicators; identify appropriate data collection methods; design reporting formats to collect and analyse data
Outputs (indicators of results): Development of a database to support monitoring and evaluation
Outcomes (indicators of effects): Improved online community (in terms of its effectiveness and efficiency) to meet the needs of the members

Table 4.47 Stakeholder analysis for an online community

Stakeholder category: Members/organisations (users)
Benefits from the project: Improved knowledge on evaluation issues as they relate to information services and products; an information source
Contributions/sacrifices: Time, knowledge and expertise involved in participating in the community
Influence on the project: Format of the community; topics for discussion
Potential involvement in the evaluation: Represented on the team; involved in analysis and decisions on actions for change; informed about main findings

Stakeholder category: Platform provider (Dgroup)
Benefits from the project: Better information for their target groups; networking
Contributions/sacrifices: Time and technical input
Influence on the project: Dgroup platform improved
Potential involvement in the evaluation: Represented on the team; informed about main findings

Stakeholder category: Moderators/facilitators
Benefits from the project: Improved information; networking
Contributions/sacrifices: Time and effort to support the process
Influence on the project: Quality of information made available
Potential involvement in the evaluation: Represented on the team; involved in the evaluation planning, implementation and findings

Stakeholder category: Funding agencies
Benefits from the project: Results to show to their clientele
Contributions/sacrifices: Finance and time
Influence on the project: On project concept and plan, and on monitoring and evaluation
Potential involvement in the evaluation: Informed about the evaluation results

Table 4.48 Key questions stakeholders could be interested in

Stakeholder category: Members/organisations (users)
Questions they are interested in: Is the information provided relevant? Is the target group aware of the online community? Does the community meet target group needs? Is it easy for users to ask questions and/or contribute? Is the target group satisfied with the community?
Potential use of the answers: Will they continue to be part of the online community? Are they likely to recommend it to others?

Stakeholder category: Platform provider (Dgroup)
Questions they are interested in: How relevant and effective is the platform? Is it user-friendly, informative, popular, well presented? Is it doing what it was set up for? Do providers of information contribute relevant and accurate information? Is there an incentive for members to promote the online community?
Potential use of the answers: How should we continue with the platform?

Stakeholder category: Moderators/facilitators
Questions they are interested in: Is the platform adequate? How relevant and effective is the community? Is the platform pleasing to look at? Is the information accurate, relevant and up-to-date? Is it easy to reach the target group?
Potential use of the answers: Should we continue with this platform? Should we continue to facilitate the community? What information should it carry?

Stakeholder category: Funding agencies
Questions they are interested in: Does the community have an impact on evaluation practice? In what areas can/should we assist? How sustainable is it if we stop funding? Does it contribute in some way to the promotion of the evaluation of information products and services? Should we continue to fund it?
Potential use of the answers: Should we continue, reduce or increase our funding?

Defining the evaluation focus, questions and indicators

Many development organisations are investing in online communities, but the benefits from these investments are by no means clear. As these communities are relatively new, there are no generally accepted standards with regard to evaluation, except that existing principles apply. Anecdotal evidence and lessons indicate that the success of online communities is related to the human relationships within these communities. This means that, when conducting an evaluation, you will need to find indicators relating to people's experience of interaction.

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of an online community extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders' key questions and the time and cost of accessing data. When evaluating, you must be clear about why you are doing it. How will the information affect your future actions? What difference will the evaluation make? Some issues are more straightforward to interpret and translate into action than others. For example, it is easier to count the number of community members than it is to measure 'trust and reciprocity'; these are complex matters and are likely to mean different things to different people. The kinds of questions you could ask include:
z Is the content relevant?
z Who are the main members of the online community in terms of profession and organisation?
z How many 'lurkers' vis-à-vis active members are there?
z Are best practices being shared and disseminated?
z Which are the lead organisations and who are the key individuals?
z How easy is it for you to contribute ideas and follow through on them?
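Several of these questions reduce to simple counts once routine data are available. For instance, the lurkers-versus-active-members question needs only the membership list and the message archive. The sketch below is purely illustrative; the names, record fields and figures are hypothetical, not taken from any real community.

```python
# Sketch: classify members as 'active' (posted during the review period)
# or 'lurkers' (registered but never posted), from hypothetical routine data.

members = ["amina", "bert", "chen", "dipo", "eva", "farid"]

# One record per message posted to the discussion list (hypothetical log).
messages = [
    {"author": "amina", "month": "2007-01"},
    {"author": "amina", "month": "2007-02"},
    {"author": "chen",  "month": "2007-02"},
]

posters = {m["author"] for m in messages}          # who posted at least once
active = [m for m in members if m in posters]
lurkers = [m for m in members if m not in posters]

print(f"Members: {len(members)}")
print(f"Active:  {len(active)} ({100 * len(active) / len(members):.0f}%)")
print(f"Lurkers: {len(lurkers)} ({100 * len(lurkers) / len(members):.0f}%)")
# Here: 6 members, 2 active (33%), 4 lurkers (67%)
```

The same tally pattern extends to other count-based questions, such as messages posted per member or the spread of contributions across organisations.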
Table 4.49 provides an example of focus, key questions and indicators for an online community evaluation. You can use the information here to develop a more specific set of questions, relevant to the focus of the online community. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. Whatever your focus is, you should link it to primary stakeholder (user) considerations, your ease of access to data and your resources in terms of time and costs. With reference to Table 4.49, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the online community, or on the criteria of usability and accessibility because of their importance to the users. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the online community. Drawing up a matrix like Table 4.49 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your online community.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:
z that all relevant data will become available during the evaluation
z that no more data are collected than what is needed and can be analysed

Table 4.50 gives some guidance on which data collection tools are most appropriate to use for the indicators identified.
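A good monitoring system makes many of these indicators cheap to compute: for example, membership growth against the logframe's threefold-increase target, or the percentage of members satisfied from a questionnaire. A minimal sketch, with entirely hypothetical figures and field layouts:

```python
# Sketch: two indicators computed from hypothetical monitoring records.

# Membership count at the end of each year (hypothetical figures).
membership = {2005: 40, 2006: 85, 2007: 130}

baseline, latest = membership[2005], membership[2007]
growth = latest / baseline
print(f"Membership growth since baseline: {growth:.2f}x (logframe target: 3x)")
# Here: 130 / 40 = 3.25x, so the threefold target would be met

# Satisfaction question from an online questionnaire (1 = satisfied, 0 = not).
responses = [1, 1, 0, 1, 1, 1, 0, 1]
pct_satisfied = 100 * sum(responses) / len(responses)
print(f"Members satisfied: {pct_satisfied:.0f}%")
# Here: 6 of 8 responses = 75%
```

Indicators that cannot be computed this way from routine records (trust, changes in professional practice) are the ones that need the costlier interview and case-study methods discussed below.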
Table 4.49 Determining the focus, key questions and indicators for the evaluation of an online community, using some evaluation criteria as examples

Evaluation criteria (scope): IMPACT
Focus: Change in the knowledge and practice in the field of the evaluation of information projects
Key questions: Have you improved in terms of knowledge and professional practice?
Indicators: % of members who conduct evaluations of their projects; % who publish on evaluation of information projects, or who have joint publications, shared models, shared language, new collaborative approaches; change in dialogue and issues discussed
Stakeholder interest: High. Data accessibility: Low

Evaluation criteria (scope): RELEVANCE
Focus: Content and contact
Key questions: Are the contacts made useful? Are the resources provided useful?
Indicators: % of members who see the contacts as relevant; % of members who see the content as relevant
Stakeholder interest: High. Data accessibility: Low

Evaluation criteria (scope): ACCESSIBILITY
Focus: Access to content and contacts
Key questions: How much contact is there between the members? How willing are members to share their resources?
Indicators: Frequency of contact between members; level of access to peers and experts; level of access to insider tacit and explicit knowledge (e.g., non-public/confidential reports and documents)
Stakeholder interest: High. Data accessibility: Low

Evaluation criteria (scope): EFFECTIVENESS
Focus: Facilitation
Key questions: How well is the facilitation done?
Indicators: No. of messages sent by facilitator; response time by the facilitator; % of members satisfied with the facilitator
Stakeholder interest: High. Data accessibility: High

Focus: Participation
Key questions: Are members leaving? If so, why? How active are the members? Where is leadership located? How easy is it to contribute? What are the obstacles to participation?
Indicators: % of members who cancelled their membership; balance of contributions (e.g., in terms of gender, location, profession); no. of messages posted by no. of people; no. of resources posted by no. of people
Stakeholder interest: High. Data accessibility: High

Focus: Trust
Key questions: Is there a high level of trust, reciprocity and willingness among the members of the community?
Indicators: % of members who perceive trust within the community
Stakeholder interest: Low. Data accessibility: Medium

Focus: Reach
Key questions: Where do members come from, and which organisations are represented?
Indicators: No. of members (both individual and institutional); no. of organisations; no. of national cultures; ratio of experts to practitioners
Stakeholder interest: Medium. Data accessibility: High

Evaluation criteria (scope): USABILITY
Focus: Technology/design – extent to which you can access the resources to ensure sustainability of the community
Key questions: Are the users satisfied with the design of the Dgroup? Is the technology used appropriate for the community? Is it easy to upload and download documents? Was it easy for the members to learn to use the platform?
Indicators: % of members satisfied with the design; % of members satisfied with navigability
Stakeholder interest: High. Data accessibility: Medium

Evaluation criteria (scope): SUSTAINABILITY
Key questions: What resources are required to maintain the community?
Indicators: Amount and type of resources required to maintain the community
Stakeholder interest: Low. Data accessibility: Low

Note that some of the tools needed for some indicators might be too expensive to use, and you might have to find more creative ways to get the information you need, or perhaps decide against using that particular indicator.

Online communities have a great advantage over many other information products in that almost all web hosts can provide you with the statistics from your platform. These are routinely produced reports. If you have never worked with web statistics before, you will need some help in their interpretation.

Table 4.50 Example of data collection methods for evaluating an online community

Data collection method: Desk study
Information source: Software information (technology and design); online comments; web statistics/routine data; accounts department
Key issues/questions: Is the community easy to access? Is it easy to navigate? Who are the members? How much time and resources are involved in facilitating and maintaining the community?
Answers expected and concerns: Some target groups don't have access to the internet; time and money to facilitate and maintain an online community can vary widely

Data collection method: Interviews/online questionnaire
Information source: Members of the community; partners
Key issues/questions: Is the community useful? Are you satisfied with it? Have you learned anything since joining it? Who are your members? Are you reaching those you want to?
Answers expected and concerns: Level of satisfaction

Data collection method: Face-to-face meetings
Key issues/questions: What do you think about the way the community is designed and run? How can collaboration be improved?
Answers expected and concerns: Organisations not willing to support the community

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.51 will help you to start thinking in this way. For example, if you want to know what members think about the online community, you should already be thinking about ways to interview them. In designing the data analysis for the online community, you should ask the following questions:
z Which analysis tools should be used? (e.g., a table featuring type of questions versus the type of members, a graph showing increase in members over time, or specific analytical tools such as a problem tree, process chart or SWOT analysis)
z Which collection methods will provide the necessary data for each tool?
z What type of observations do you expect to obtain from each tool?
z What type of conclusions do you expect to reach using each tool?

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process to ensure that conclusions and recommendations from the evaluation are well understood and supported.
In designing this strategy, you need to consider these issues:
z Which communication methods will suit which stakeholders, especially the primary stakeholders?
z What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

Table 4.51 Example of a data analysis design for an online community evaluation

Table 1 – Who are the members joining the community?
Collection methods used: Desk study; routine records
Observations to make: Range of the members; type of resources that would be of interest to the members
Conclusions to reach: Category of members to give more attention to

Table 2 – Relevance
Collection methods used: Desk study; questionnaires; interviews
Observations to make: How useful is the community given the needs of the group?
Conclusions to reach: Whether the online community is relevant or not

Table 3 – Effectiveness; awareness of community; facilitation; participation
Collection methods used: Desk study; questionnaires; interviews
Observations to make: Level of awareness of the community; quality (accuracy) of information provided; is there a lot of interaction within the community?
Conclusions to reach: Promotion strategy; priorities for improvement and obstacles; can determine how the community should be changed

Table 4 – Efficiency
Collection methods used: Data from accounts
Observations to make: Cost to facilitate and maintain the community
Conclusions to reach: Potential areas for reducing costs/time

Figure 1 – Process flow chart
Collection methods used: Community procedures manual; interviews
Observations to make: Bottlenecks in the process
Conclusions to reach: Process steps to improve

Table 5 – Impact stories
Collection methods used: In-depth interviews; case studies; feedback from emails; online questionnaires
Observations to make: Interaction in the community is making a difference to the way people practise their profession
Conclusions to reach: Determine whether there has been some change in the way people manage their information projects
An effective communication plan for critical reflection during an evaluation and reporting findings during and after the evaluation helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don’t accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions.

During the process of critical reflection and reporting, you should ask:
- Do the stakeholders share the same views on the problems with the online community?
- Do they have the same views on the solution(s)?
- Are they prepared to support the same solution(s)?
- What are the obstacles that will prevent them from implementing the solution(s) proposed?
- What can be done to address the obstacles they face?

There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, a memorandum). You might also consider sharing the results on the community platform, so that if there are changes in terms of how it is presented, they will not come as a surprise to the members. Table 4.52 gives an example of a communication strategy for an online community evaluation.

If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement the solution because of a lack of resources, time and/or capacity.
Table 4.52 Example of a communication plan for reporting the findings from an online community evaluation

Critical reflection
- Stakeholders to report to: Staff, selected partners
- Main issues to discuss / report on: Critically reflect on results from the data collection

Summary report / internet page
- Stakeholders to report to: Partners, members
- Main issues to discuss / report on: Results from the data collection

Full report, including executive summary
- Stakeholders to report to: Funding agency
- Main issues to discuss / report on: Main messages of the evaluation: relevance, satisfaction, awareness, access, reach, sustainability

Short report of the evaluation of the online community
- Stakeholders to report to: Members of the online community
- Main issues to discuss / report on: Main findings in broad terms and how they will affect the community

Articles
- Stakeholders to report to: Potential partners and members
- Main issues to discuss / report on: Main messages of the evaluation, lessons learned

Face-to-face meetings
- Stakeholders to report to: Members of the online community
- Main issues to discuss / report on: Main findings and the implications for the community

An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
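Where routine records of membership and posting activity are kept, several of the quantitative indicators mentioned above (growth in members over time, level of participation) can be computed directly from them. A minimal sketch in Python, using invented records purely for illustration (the record layout is an assumption, not something the toolkit prescribes):

```python
# Computing simple online-community indicators from routine records.
# The (member_id, month) record layout is hypothetical -- adapt it
# to whatever your community platform actually logs.
from collections import Counter

# (member_id, month joined) -- hypothetical routine records
members = [(1, "2023-01"), (2, "2023-01"), (3, "2023-02"),
           (4, "2023-03"), (5, "2023-03"), (6, "2023-03")]
# (member_id, month of post) -- one row per post
posts = [(1, "2023-01"), (1, "2023-02"), (2, "2023-02"),
         (4, "2023-03"), (4, "2023-03"), (5, "2023-03")]

def growth_by_month(members):
    """Cumulative membership per month (growth over time)."""
    joined = Counter(month for _, month in members)
    total, growth = 0, {}
    for month in sorted(joined):
        total += joined[month]
        growth[month] = total
    return growth

def participation_rate(members, posts):
    """Share of members who have posted at least once."""
    active = {member_id for member_id, _ in posts}
    return len(active) / len(members)

print(growth_by_month(members))            # {'2023-01': 2, '2023-02': 3, '2023-03': 6}
print(participation_rate(members, posts))  # 4 active members out of 6
```

The same records could feed a graph of members over time or a cost-per-member figure once accounts data are added.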
Box 4.13 Evaluating an online community: guidelines checklist

These guidelines on evaluating an online community are covered above:
- Online community concept and objectives
- Data needs and stakeholder participation
- Evaluation focus, questions and indicators
- Data collection
- Data analysis
- The communication plan

For more on:
- data analysis, see Part 2, page 60 and Part 3, pages 157-162
- data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
- evaluation communication and follow-up, see Part 3, pages 163-173
- evaluation criteria (scope), see Part 2, pages 33-34
- indicators, see Part 3, pages 91-98
- logframe, see Part 3, pages 68-83
- logic model, see Part 3, pages 103-105
- stakeholder participation, see Part 1, pages 3-5
- terms of reference, see Part 2, pages 32-35

RURAL RADIO

These guidelines relate to evaluating a programme for rural radio, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change.

The notion of ‘critical reflection’ is discussed towards the end of the guidelines, but we advise you and your evaluation team to ‘reflect’ on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6).

The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2. Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines.
And in Part 3 you will find more on the evaluation tools described here.

Before evaluating your rural radio programme, you need to be clear about:
- what the main elements of the radio programme are
- its concept(s) and objectives
- who the primary and secondary stakeholders are
- how to go about identifying the needs of the primary and secondary stakeholders
- what the focus of the evaluation is, the questions to ask and the indicators to use
- how to collect the data and analyse them
- how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is rural radio?

Rural radio is a broadcasting system in which programmes are tailored for rural areas. It aims to facilitate development by helping to identify, debate and discuss issues of concern and provide information on ways of addressing those issues, relying on local knowledge systems and information communication tools. In Africa, rural radio has taken on many forms, the most recent being the ‘community’ type of local rural radio.

Whether it is called local or community radio, or free or participative radio, it is the process used to produce the programmes, together with their content, that determines whether it can be called ‘rural radio’. Rural radio should have the following features:
- the programmes are constructed around the needs of the communities they serve
- various people and organisations contribute to their production in a highly participatory process
- exchanges are interactive
- local culture and knowledge are valued and developed

Rural radio is therefore a community’s means of expression. It has most influence where it is inclusive, accessible and trusted by the community. A sense of community ownership of the service is also important for its sustainability.
How do you know if rural radio is meeting the needs of the community? How do you know if the programmes have been developed properly or not? Evaluating rural radio programmes will help you find answers to these questions. It will also help you to find out who is listening and when, and what difference the programmes make to the community. It can also provide feedback on the production process, improve your relationship with the listeners, demonstrate value for money and provide ideas for future programmes.

Determining a rural radio programme concept and objectives

You can’t evaluate your radio programme unless you are clear what it is about. The concept (idea) behind the radio programme and the objectives it seeks to achieve need to be clearly stated, otherwise you can’t compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process.

In determining the concept of a rural radio programme, you should ask these questions:
- Why was the rural radio programme developed?
- What were the main objectives (expected results) and problems to be addressed at the time?
- What is the goal of the rural radio programme?
- What is the project purpose of the radio programme?
- What are the core values of the radio programme?
- Who is the target group?

In evaluating your rural radio programme, you need to ask yourself key questions related to the concept, such as:
- Do the expected results achieved reflect its core values and approach?
- Do they reflect the project purpose and contribute to the goals?
- Does the rural radio programme address the main objectives and problems it was supposed to?
- Are the correct messages being conveyed?

The logical framework can help you clarify the radio programme objectives (see pages 68-83). Commonly known as the ‘logframe’, it helps to summarise a project in a logical sequence.
If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a radio programme is given in Table 4.53.

Where the rural radio programme is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of activities associated with the programme, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a rural radio programme is given in Table 4.54.

Box 4.14 Example of a concept behind a rural radio programme on health issues

Main problem: The public relations department in the Ministry of Health developed the HealthTime radio programme to address the limited access that some population groups have to basic health information.

Goal: To improve the quality of life of the population in rural communities.

Project purpose: Rural communities (particularly women, children, and health and social workers) provided with relevant and appropriate health information on a timely basis.

Expected results: Awareness created among target groups; information provided in such a way that the target groups will be able to understand and act upon it.

Main approach: To transmit the radio programme in an attractive way in the main local language, providing simple health messages and aired at a time slot favoured by women and children.
Core values: Relevance, effectiveness, efficiency and impact

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.

The range of stakeholders in a rural radio programme could include:
- listeners (including specific targets such as women, children, and health and social workers) – the primary stakeholder group
- volunteers, contributors
- advisors
- NGOs
- production personnel
- broadcasters
- government agencies
- funding sources (e.g., donor agency)

One way of identifying your primary and secondary stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the programme and how to include them in the evaluation. Table 4.55 provides an example of a stakeholder analysis.

When considering the stakeholders’ interests in the rural radio programme, be aware of the kind of questions that they would like to have answered. Table 4.56 gives an overview of the kind of key questions that might be asked.
If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops, and questionnaires with individual or groups of stakeholders.

Table 4.53 Logical framework for a rural radio programme on health issues

GOAL
- Intervention logic: To contribute to improved living conditions in rural communities
- Objectively verifiable indicators: After 5 years: migration rate in communities reduced by 20%; incidence of water-borne diseases and other basic illnesses reduced by 40%
- Means of verification: National statistics on population, health and living standards

RADIO PROGRAMME PURPOSE
- Intervention logic: Availability of health information to women and children in rural communities increased
- Objectively verifiable indicators: After 3 years, no. of listeners increased threefold; % of households following recommended practices: 80% boil drinking water, 90% immunise children against polio, 70% use pit latrine to dispose of human waste; no. of listeners and range of listenership
- Means of verification: Routine records at the radio station; national statistics; field visits; interviews, case studies, diaries
- Assumptions: Rural radio programme contributes to improvement in living conditions

EXPECTED RESULTS
- Intervention logic: 1. Improved availability of health information; 2. Improved awareness among the listeners
- Objectively verifiable indicators: Level of listenership aware of programme content
- Means of verification: Routine records; questionnaires; interviews
- Assumptions: Increase in awareness leads to increased listenership

ACTIVITIES
- 1.1 Meet staff and selected stakeholders to discuss programme objectives and presentation and increase programme exposure
- 1.2 Draft questionnaire for feedback and develop strategy to obtain feedback from target group
- 1.3 Identify priority health issues and key information on how to improve the programme through questionnaires and meetings with key stakeholders
- 1.4 Develop plan with stakeholders for a series on health issues for the year
- 1.5 Produce series of programmes with stakeholders reflecting identified health issues
- 2.1 Meet staff and key stakeholders to develop promotion and distribution strategy for programme
- 2.2 Implement strategy to promote it, such as preparing brochures, advertising on billboards and on the website, organising competitions
- (In the indicators box a summary of resources/means is sometimes provided, and in the means of verification box a summary of costs/budget. Assumptions: if the activities are completed, what assumptions must hold true to deliver the expected results?)

INPUTS (list resources available)
- Funding from donor agency; core funding from Ministry

Table 4.54 Logic model for a rural radio programme on health issues (outputs are indicators of results; outcomes are indicators of effects)

Planning and development

1. Laying the groundwork for the rural radio programme
- Resources: Capable staff
- Activities: Conduct meetings to discuss the rural radio programme and decide who to involve
- Outputs: List of ideas and key stakeholders; draft rural radio programme concept and who to involve

2. Involving stakeholders
- Resources: Representatives from the Ministry of Health, key stakeholders (inc. women) from representative communities targeted; external-oriented staff; transport, photocopies; willingness of partners
- Activities: Brainstorm with partners to determine rural radio format; develop questionnaire for feedback and ideas
- Outputs: Rural radio format developed; questionnaire developed and administered
- Outcomes: Changes made to rural radio programme format in response to the feedback obtained

3. Process development
- Resources: Committed staff and partners; time and willingness to address bottlenecks
- Activities: Process analysis; identify priority health messages; address bottlenecks; develop procedures and forms
- Outputs: No. of bottlenecks reduced; procedures manual compiled

4. Producing the radio programme
- Resources: Resources (books, reports, databases) from Ministry of Health and partners; Internet databases; storyboard writer to develop sketches
- Activities: Identify information resources; develop access to resources; write the story to support the message
- Outputs: No. of short stories written around special themes suited for radio drama, along with information packet; information resources identified and consulted

5. Budgeting and financing
- Resources: Capable staff; support of management, donors and NGOs
- Activities: Budget for all activities to support the production and airing of the programme
- Outputs: Budget; financial report
- Outcomes: Funds available; costs within budget limits; staff and management agree on financial priorities

Implementation/operations

6. Promotional activities
- Resources: Motivated, skilled staff; promotional materials (website, brochures); relationship with media
- Activities: Posters/cards/brochures developed; meetings organised; media informed
- Outputs: No. of brochures sent; no. of meetings and participants; no. of press releases
- Outcomes: % of potential target groups aware of service; no. of partners (experts involved) increases
- Impact: Increase in no. of questions asked; decrease in % of questions asked outside scope

7. Delivery of radio programme
- Resources: Motivated, skilled staff; office space, equipment; database, information sources; writer and actor contracts; relations with other radio stations
- Activities: Broadcasting rural radio programme via various radio stations; liaise with experts; follow up feedback
- Outputs: No. of radio programmes aired; no. of responses to feedback followed up
- Outcomes: % of listeners listening to the programme; % of listeners satisfied with the programme
- Impact: % of listeners implementing what they’ve learned

Monitoring and evaluation

8. Monitoring and evaluation
- Resources: Capable staff; clear and adequate procedures; adequate registration; indicators of quality, effectiveness, efficiency
- Activities: Define indicators; design report formats; organise data collection; generate statistics and report
- Outputs: Clear routine for recording reports (updated regularly: daily, weekly, monthly, quarterly, yearly)
- Outcomes: Improved effectiveness and efficiency of the radio programme
Table 4.55 Stakeholder analysis for a rural radio programme on health issues

Listeners
- Benefits from the project: Improved knowledge on relevant health issues
- Contributions / sacrifices: Time listening to the radio; cost to access the service (radio); an information source
- Influence on the programme: Format and content of the programme
- Potential involvement in the evaluation: Represented on the team; involved in analysis and decisions on actions for change; informed about main findings

Volunteers, NGOs, contributors, advisors, health and community groups
- Benefits from the project: Better information for their target groups
- Contributions / sacrifices: Time and technical input
- Influence on the programme: Rural radio programme improved, sharper
- Potential involvement in the evaluation: Should be represented on the evaluation team; informed about main findings

Broadcasters, government agencies, production personnel
- Benefits from the project: Improved programme; improved outreach; promotional effect
- Contributions / sacrifices: Time to develop programme concept and convince funding agency; extra efforts
- Influence on the programme: Through project plan, project management and programme implementation, distribution of programme
- Potential involvement in the evaluation: Should be involved in the planning and evaluation; should be informed about evaluation findings

Funding agencies
- Benefits from the project: Results to show to their clientele
- Contributions / sacrifices: Finance and time
- Influence on the programme: On project concept and plan, and monitoring and evaluation
- Potential involvement in the evaluation: Should be informed about the evaluation findings

Table 4.56 Key questions stakeholders could be interested in

Listeners
- Questions they are interested in: Is the rural radio programme relevant and timely? Is the information provided in an appropriate format? Is the programme satisfactory?
- Potential use of the answers: Will they continue to listen to the programme? Are they likely to recommend it to others?

Volunteers, NGOs, contributors, advisors, health and community groups
- Questions they are interested in: How effective and efficient is the programme?
- Potential use of the answers: Should we continue to collaborate with its producers?

Broadcasters, government agencies, production personnel
- Questions they are interested in: How effective and efficient is the programme? What are its strengths, weaknesses, opportunities and threats? Is it informative, popular and well presented?
- Potential use of the answers: Should we continue with the programme? What should we do to improve it? What should we do to reduce costs and increase income? How can it be financed in the future?

Funding agencies
- Questions they are interested in: Does the programme have an impact on the health of the target population? How sustainable is the programme after funding is stopped?
- Potential use of the answers: Should we continue, reduce or increase the funding? In what areas can/should we assist?

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of a rural radio programme extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders’ key questions and the time and cost of accessing data.

The focus of the evaluation might stem from your desire to improve the programme, from low or declining listenership, from feedback from others, or from a need to cut costs. There might be concerns about the programme in comparison with a competitor. You might want to assess current achievements so as to determine strategies for future programmes. Other examples of areas of focus include:
- a specific objective (e.g., providing information to certain vulnerable groups)
- specific assessment criteria (e.g., effectiveness and impact of the programme, or a focus on those criteria that are most problematic)
- the primary stakeholders or those experiencing the most problems
- a combination of all of these

Asking specific questions will help you to improve key areas of the radio programme. With each key question, you need to think about the indicators you will need to help you answer that question. Indicators also determine the type of data to collect.
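Most of these indicators are simple percentages computed over survey or questionnaire responses. As a minimal sketch of how such indicator values might be derived, in Python (the response fields and the mock data are illustrative assumptions, not prescribed by the toolkit):

```python
# Turning questionnaire responses into percentage indicators such as
# "% of target group aware of the programme". Field names are hypothetical.
responses = [
    {"aware": True,  "satisfied": True,  "has_radio": True},
    {"aware": True,  "satisfied": False, "has_radio": True},
    {"aware": False, "satisfied": False, "has_radio": True},
    {"aware": True,  "satisfied": True,  "has_radio": False},
]

def pct(responses, field):
    """Percentage of respondents answering 'yes' to a given field."""
    return 100.0 * sum(r[field] for r in responses) / len(responses)

print(f"Aware of programme: {pct(responses, 'aware'):.0f}%")      # 75%
print(f"Satisfied:          {pct(responses, 'satisfied'):.0f}%")  # 50%
print(f"Access to a radio:  {pct(responses, 'has_radio'):.0f}%")  # 75%
```

Tracking the same percentages over successive evaluations is what turns them into indicators of change rather than one-off snapshots.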
Remember to consider the interests of the stakeholders and the ease of access you have to the data you need in terms of time and cost. All this information can give you insight on how to schedule the programme better, or target it to specific groups. It may seem simply common sense, but often mistakes are made in scheduling or targeting. By checking that your assumptions are right, and that you are reaching the people you intend to reach, you can ensure that the programme has the maximum impact.

Table 4.57 provides an example of the focus, key questions and indicators for a rural radio programme. You can use the information here to develop a more specific set of questions, relevant to the focus of the programme. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation.

With reference to Table 4.57, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the radio programme, or on the criteria of usability and accessibility because of their importance to the listeners. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the radio programme.

Drawing up a matrix like Table 4.57 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your radio programme.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data.
It is important to develop a data collection strategy that ensures:
- that all relevant data will become available during the evaluation
- that no more data are collected than what is needed and can be analysed

Table 4.57 Determining the focus, key questions and indicators for the evaluation of a rural radio programme, using some evaluation criteria as examples

IMPACT
- Focus: Has the health of the target population improved?
- Indicators: Percentage of target group who have changed their attitudes/behaviour (e.g., hygiene, eating habits) as a result of the programme; changes in the programming: the extent to which the programme gives opportunities to the target group to contribute to programming
- Stakeholder interest: High; Data accessibility: Low

RELEVANCE
- Focus: Needs of the target group
- Indicators: Percentage of target group indicating that the programme addresses their information needs; percentage who want more issues to be addressed
- Stakeholder interest: High; Data accessibility: Medium

ACCESSIBILITY
- Focus: Access
- Indicators: Percentage of target group who have access to a working radio
- Stakeholder interest: High; Data accessibility: Medium

USABILITY
- Focus: Listener satisfaction
- Indicators: Percentage of target group satisfied with the programme; percentage of the target group satisfied with specific sections of the programme; feedback from listeners on what they thought the main message of the programme was, whether the programme is the right length, and whether it is in the right language
- Stakeholder interest: High; Data accessibility: Medium

EFFECTIVENESS
- Focus: Awareness
- Indicators: Percentage of target group aware of the programme; percentage of target group who say they learned from the programme
- Stakeholder interest: High; Data accessibility: Medium
- Focus: Message, format, duration, presentation
- Indicators: Feedback from staff on what they thought the main message of the programme was and what it should have been, and whether the programme is appropriate for the target audience, the right length, in the right language and well presented
- Stakeholder interest: High; Data accessibility: Low
- Focus: Reach
- Indicators: Percentage and range of the target group who listen to the programme over time
- Stakeholder interest: High; Data accessibility: Medium

EFFICIENCY
- Focus: Expenditure
- Indicators: Average cost to produce and air the programme
- Stakeholder interest: High; Data accessibility: Low

Data collection therefore requires careful preparation. Defining the scope of the evaluation to use as the basis for designing the evaluation questions and indicators allows you to see clearly what data are required and the most appropriate data collection methods to use. The data collection methods for a radio programme include:
- Internal interviews: Interview both volunteers and staff, asking them their opinion about the programme. This inclusion will assist buy-in and ownership, and provide valuable information. You need to reassure them that their input is confidential. This method can be applied informally, or it could be structured around a simple questionnaire.
- Audience market research: Many stations will not be able to do this because it is a large-scale and costly activity.
But you could conduct the research via a partner (e.g., if you’re working with a large NGO, it might have access to market research analysts; or in your local university there might be students on marketing or related courses who could conduct a study as part of their course work).
- Listener feedback analysis: Document the number and time of all the phone calls taken during phone-in programmes, or in response to competitions. Get some details about the caller, such as where they live and their age (preferably their age category). Record all the feedback the station receives.
- Vox pops: Conduct on-the-street interviews systematically, asking the same questions of a range of people. Vox pops are a quick and easy way to gauge public opinion on an issue or programme.
- In-depth interviews/case studies: Conducting in-depth interviews with a small, carefully selected group of people might provide information that is just as useful as a large statistical survey. Use a checklist of topics to guide the interview, but follow up any unexpected or negative responses. For an in-depth case study, you could identify a group of listeners representative of your target audience, and then visit them regularly (weekly, monthly) to obtain their feedback on the programmes after they are broadcast.
- Listener club feedback: Many successful radio projects use listener clubs to listen in regularly and report back on programmes that are broadcast. This works particularly well in rural areas and with women’s groups. The members of a club gather regularly to listen to a programme, discuss it and provide feedback.
- Listener diaries: These can be useful for collecting information on women and people living in remote areas.
Encourage representative members of the target group to write listener diaries, recording their reactions to programmes, what information they thought most or least useful, and whether they put into practice any of the advice they heard. This can be time-consuming, so it is often appropriate to offer modest incentives, such as running a competition.

Do remember to include the ‘silent audience’ – those people who never contact a radio station but listen to its programmes – as well as those people in your target audience who don’t listen to any of the programmes, to find out why.

Table 4.58 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use, and you might have to find more creative ways to get the information you need, or perhaps decide against using that particular indicator.

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.59 will help you to start thinking in this way. For example, if you want to know what listeners think about the programme, you should already be thinking about ways to interview them.

In designing the data analysis for a radio programme evaluation, you should ask the following questions:
- Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of listeners, a graph showing increase in listeners over time, or specific analytical tools such as a problem tree, process chart or SWOT analysis)
- Which collection methods will provide the necessary data for each of the tools?
Table 4.58 Example of data collection methods for evaluating a rural radio programme

Desk study
- Information source: Programme records; internal memoranda on meetings with colleagues, letters, emails, recordings of the programme
- Key issues / questions: Are colleagues satisfied with the radio programme? Are the targeted listeners satisfied with it? Is it making a difference in communities? Is it addressing the areas it set out to do?
- Answers expected and concerns: Some communities don’t have access to a working radio; listeners who give feedback might not be in the target group; need to test the findings by looking at the listener profiles

Desk study (accounts)
- Information source: Accounts department
- Key issues / questions: How much time is involved in producing each programme? What is the cost of production?
- Answers expected and concerns: Time and money to produce a programme can vary widely

Interviews / case studies / vox pops / questionnaires
- Information source: Selected listeners
- Key issues / questions: Are you aware of the programme? Is the programme useful? Is it broadcast at an appropriate time? Have you changed your attitude as a result of the programme?
- Answers expected and concerns: Time consuming, expensive. Self-administered questionnaires are less expensive, but the respondent may fill in the questionnaire incorrectly or not at all. Use the radio station to encourage listeners to contact you for an interview by broadcasting your address and telephone number, and organising competitions. A short intercept interview by the receptionist who, for example, takes music requests can also help (e.g., name, sex, address, opinion of programme)

Informal interviews
- Information source: Advisors, staff and volunteers
- Key issues / questions: What is their opinion on how the programme was produced and on its content?
- Answers expected and concerns: Need to ensure that feedback is confidential to encourage trust and openness

Listener club feedback
- Information source: Target listeners
- Key issues / questions: Ask a group of listeners to regularly give feedback on the programme
- Answers expected and concerns: Listeners are close to the material and therefore not objective in their feedback

Listener diaries
- Information source: Target listeners
- Key issues / questions: Ask listeners to record their reactions to the programme in terms of what was useful and what was not useful
- Answers expected and concerns: Time consuming

Meetings
- Information source: Production personnel, NGOs, government agencies
- Key issues / questions: What do you think about the message, format, duration, and presentation of the programme? How can collaboration be improved?
- Answers expected and concerns: Difficulty in getting partners to work with your organisation

- What type of observations do you expect to make for each tool?
- What type of conclusions do you expect to reach from using the tools?

Once you have designed the data analysis, it is important to see if your data collection includes all elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data. Explain why you were unable to gather certain data.
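Cross-tabulations of the kind used in these analysis designs (e.g., listener satisfaction broken down by listener category) can be produced with a few lines of code once interview data are recorded consistently. A minimal sketch in Python; the age bands, answer categories and interview data are invented for illustration:

```python
# Cross-tabulating listener satisfaction by age group from interview
# records -- the kind of table used in a data analysis design.
from collections import defaultdict

# (age_group, satisfied) pairs from listener interviews -- mock data
interviews = [("under 18", True), ("under 18", True), ("18-35", True),
              ("18-35", False), ("36-60", False), ("36-60", True),
              ("over 60", False)]

# Tally responses into a two-way table: age group x satisfaction
table = defaultdict(lambda: {"satisfied": 0, "not satisfied": 0})
for age_group, satisfied in interviews:
    key = "satisfied" if satisfied else "not satisfied"
    table[age_group][key] += 1

for age_group in ("under 18", "18-35", "36-60", "over 60"):
    row = table[age_group]
    print(f"{age_group:>8}: {row['satisfied']} satisfied, "
          f"{row['not satisfied']} not satisfied")
```

A table like this makes over- and under-represented listener categories visible at a glance, which is exactly the type of observation the analysis design asks for.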
Table 4.59 Example of a data analysis design for a rural radio programme evaluation

Analytical tools and tables: Table 1: Listenership by age, location, programme penetration by group
Results from collection method used: Desk study (target group)
Type of observations to make: Categories of listeners who are over- or under-represented
Type of conclusions to reach: Categories of listeners to give more attention to

Analytical tools and tables: Table 2: Relevance
Results from collection method used: Desk study (target group); questionnaires; interviews
Type of observations to make: How useful the programme is given the needs of the target group
Type of conclusions to reach: Whether the programme has outlived its purpose

Analytical tools and tables: Table 3: Usability (listener satisfaction)
Results from collection method used: Desk study; questionnaires; interviews; listener club feedback; listener diaries
Type of observations to make: High and low satisfaction with the programme as a whole
Type of conclusions to reach: Areas for attention

Analytical tools and tables: Table 4: Effectiveness (awareness, access, reach)
Results from collection method used: Desk study (target group); questionnaires; interviews
Type of observations to make: Level of awareness of the programme; level of access of the target group to a working radio; change in number of target listeners who listen to the programme over time
Type of conclusions to reach: Priorities for improvement; obstacles to improvement

Analytical tools and tables: Table 5: Effectiveness (message, format, duration, presentation)
Results from collection method used: Interviews (staff, partners)
Type of observations to make: Whether staff and partners are happy with the programme
Type of conclusions to reach: Can determine whether the radio has to change its way of programming

Analytical tools and tables: Table 6: Efficiency
Results from collection method used: Data from accounts
Type of observations to make: Cost to produce the programme; time involved in the production process
Type of conclusions to reach: Potential areas for reducing costs/time

Analytical tools and tables: Figure 1: Process flow chart
Results from collection method used: Programme procedures manual; interviews
Type of observations to make: Bottlenecks in the process
Type of conclusions to reach: Process steps to improve

Analytical tools and tables: Table 7: Impact stories
Results from collection method used: In-depth interviews, case studies, feedback from letters, emails, call-ins, health statistics
Type of observations to make: Personal stories of how lives have been affected by the programme; statistics showing, e.g., fewer children getting water-borne diseases
Type of conclusions to reach: Determine whether there has been some change in the quality of life in communities

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process to ensure that conclusions and recommendations from the evaluation are well understood and supported. In designing this strategy, you need to consider these issues:
• Which communication methods will suit which stakeholders, especially the primary stakeholders?
• What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

An effective communication plan for critical reflection during an evaluation and for reporting findings during and after the evaluation helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don't accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
• Do the stakeholders share the same views on the problems with the rural radio programme?
• Do they have the same views on the solution(s)?
• Are they prepared to support the same solution(s)?
• What are the obstacles that will prevent them from implementing the solution(s) proposed?
• What can be done to address the obstacles they face?

There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, sending a memorandum).
Table 4.60 Example of a communication plan for reporting the findings from a rural radio programme evaluation

Communication method: Critical reflection
Stakeholders to report to: Staff, selected partners
Main issues to discuss / report on: Critically reflect on results from the data collection

Communication method: Summary report
Stakeholders to report to: Representatives from health and rural communities, and partners
Main issues to discuss / report on: Results from data collection

Communication method: Full report, including executive summary
Stakeholders to report to: Funding agency; programme management
Main issues to discuss / report on: Message, content, duration, presentation, listenership, relevance, satisfaction, awareness, access, reach, impact, cost-effectiveness

Communication method: Personal meeting
Stakeholders to report to: Programme management
Main issues to discuss / report on: Conclusions and recommendations; feedback on individual staff members

Communication method: Short report of the evaluation aired on the programme
Stakeholders to report to: Listeners
Main issues to discuss / report on: Main findings in broad terms and how they will affect the programme

Communication method: Article on experiences of the programme
Stakeholders to report to: Professionals active in similar fields
Main issues to discuss / report on: Evaluation approach and results

Communication method: Programme promotional brochure
Stakeholders to report to: Target listeners
Main issues to discuss / report on: Examples of programme impact stories

If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement the solution because of a lack of resources, time and/or capacity. An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Box 4.15 Evaluating a rural radio programme: guidelines checklist

These guidelines on evaluating a rural radio programme are covered above:
• Rural radio programme concept and objectives
• Data needs and stakeholder participation
• Evaluation focus, questions and indicators
• Data collection
• Data analysis
• The communication plan

For more on:
• data analysis, see Part 2, page 60 and Part 3, pages 157-162
• data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
• evaluation communication and follow-up, see Part 3, pages 163-173
• evaluation criteria (scope), see Part 2, pages 33-34
• indicators, see Part 3, pages 91-98
• logframe, see Part 3, pages 68-83
• logic model, see Part 3, pages 103-105
• stakeholder participation, see Part 1, pages 3-5
• terms of reference, see Part 2, pages 32-35

DATABASE

These guidelines relate to evaluating a database, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of 'critical reflection' is discussed towards the end of the guidelines, but we advise you and your evaluation team to 'reflect' on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6).

The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2.
Do read that section if you are not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here.

Before evaluating a database, you need to be clear about:
• what the main elements of the database are
• its concept(s) and objectives
• who the primary and secondary stakeholders are
• how to go about identifying the needs of the primary and secondary stakeholders
• what the focus of the evaluation is, the questions to ask and the indicators to use
• how to collect the data and analyse them
• how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is a database?

A database is a large, regularly updated file of digitised information (e.g., bibliographic records, abstracts, full-text documents, directory entries, images, statistics) related to a specific subject or field. It consists of records of uniform format organised for ease and speed of search and retrieval, and is managed with the aid of database management system software. Every database has been created with a specific goal. Databases are usually stored in one location and made available to several users at the same time for various applications involving rapid search and retrieval. Some databases can only be accessed online; others are stored in machine-readable form (on magnetic tape, disk or optical disk) or printed on paper (as in a book).
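To make the definition concrete, here is a minimal sketch of such a database using Python's built-in sqlite3 module. The table, fields and records are hypothetical; the point is simply that records of uniform format can be stored once and retrieved quickly by many users.

```python
import sqlite3

# In-memory database for illustration; a real database would live on disk
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        author TEXT,
        year INTEGER,
        subject TEXT
    )
""")
# Hypothetical bibliographic records of uniform format
conn.executemany(
    "INSERT INTO records (title, author, year, subject) VALUES (?, ?, ?, ?)",
    [
        ("Soil fertility in the tropics", "A. Mensah", 2004, "agriculture"),
        ("Water-borne disease control", "B. Okoro", 2005, "health"),
        ("Irrigation on smallholder farms", "C. Diallo", 2006, "agriculture"),
    ],
)

def search_by_subject(conn, subject):
    """Retrieve bibliographic records matching a subject term."""
    rows = conn.execute(
        "SELECT title, author, year FROM records WHERE subject = ? ORDER BY year",
        (subject,),
    )
    return rows.fetchall()

print(search_by_subject(conn, "agriculture"))
```

The search function illustrates why efficient search and retrieval matter: a user who knows only the subject term can still find every relevant record in one query.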
Common databases include:
• project databases (these include details of projects for one or more institutions)
• bibliographic databases (these include bibliographic records, with or without abstracts)
• contact databases (these include details of individuals and institutions relevant to an organisation's work; sometimes these data are shared among a group of organisations)

If it takes too long to find the required information in a database, or people need expertise to conduct an adequate search, then the intended users will not be able to access it. To encourage greater use of databases, search and retrieval processes need to be efficient.

Determining the database concept and objectives

You can't evaluate a database unless you are clear what it is about. The concept (idea) behind the database and the objectives it seeks to achieve need to be clearly stated, otherwise you can't compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process.

In determining the database concept, you should ask these questions:
• Why was the database developed?
• What were the main objectives (expected results) and main problems to be addressed at the time?
• What is the goal of the database?
• What are its core values?
• Who are the targeted users?
• Is it a database that your institution has developed, or one that your institution subscribes to?

In evaluating the database, you need to ask yourself key questions related to the concept, such as:
• Do the expected results achieved reflect its core values and approach?
• Do they reflect the project purpose and contribute to the goal?
• Does the database address the main objectives and problems it was supposed to?
• Are the correct messages being conveyed?
Box 4.16 Example of a concept behind a database to support scientific research

Main problem: A Scientific Research Institute has been documenting its work since its inception in 1990 and trying to build up its resources to support scientific research in the country. A few years ago, the institute received funds to boost its library resources on the condition that it made its database accessible to the wider scientific community. Since then, it has acquired scientific literature and access to key scientific bibliographic databases. The director is uncertain, however, whether the current database is meeting the needs of his institute and other stakeholders. If it is not, he wants to know what needs to be done so that the database can better meet the needs of staff and other stakeholders.

Goal: To contribute to raising the level of scientific research through the establishment of a reliable database with extensive resources for scientists, researchers and other stakeholders, to support national development and production.

Project purpose: Access to current and reliable information to support the research and scientific investigations of scientists, researchers and students provided.

Expected results: Comprehensive database which is available and accessible to the scientific community developed.

Main approach: To meet staff and key stakeholders to identify their needs and document their experiences in using the current database.

Core values: Relevance, effectiveness, efficiency and impact

The logical framework can help you clarify the database objectives (see pages 68-83). Commonly known as the 'logframe', it helps to summarise a project in a logical sequence. If a project plan does not have a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for a database is given in Table 4.61.
Where the database is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of activities associated with the database, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to a database is given in Table 4.62.

Table 4.61 Logical framework for a scientific database

Goal
Intervention logic: To raise the level of scientific research for improved national production

Database purpose
Intervention logic: Access of scientists, researchers and students to a wide range of resources increased
Objectively verifiable indicators: After 3 years, number of users increased by 50%
Means of verification: Routine records
Assumptions: Access to the database contributes to increased national production

Expected results
Intervention logic: 1. Improved awareness among stakeholders about the database; 2. Improved availability of adequate scientific information
Objectively verifiable indicators: After 3 years, 90% of target group aware of the database; 75% can access the database; and there is a 50% increase in the number of publications written
Means of verification: Survey; routine records
Assumptions: Increased awareness leads to increased use of the database

Activities
Intervention logic: 1.1 Organise meeting with partners to discuss the database and its promotion; 1.2 Develop promotional campaign with partners; 1.3 Promote the database using researchers/scientists who use it; 1.4 Implement campaign; 2.1 Conduct user needs survey of selected researchers, scientists and students; 2.2 Identify gaps in the database; 2.3 Acquire appropriate resources
Objectively verifiable indicators: (sometimes a summary of resources/means is provided in this box)
Means of verification: (sometimes a summary of costs/budget is provided in this box)
Assumptions: (if the activities are completed, what assumptions must hold true to deliver the expected results)

Inputs
Intervention logic: Funding from donor agency; core funding from Ministry
Objectively verifiable indicators: List resources available

Table 4.62 Logic model for a scientific database

Planning and development

1. User needs identification
Resources: Open-minded staff; transport, postage, photocopies; users willing to participate
Activities: Design and administer questionnaires/interviews; conduct meetings; analyse data and draw conclusions
Outputs (indicators of results): No. of questionnaires sent; no. of interviews conducted; no. of meetings conducted
Outcomes (indicators of effects): No. of changes made based on user needs analysis

2. Involving stakeholders
Resources: External-oriented staff; transport, photocopies; willingness of partners
Activities: Calls, visits, meetings with external partners; develop agreements
Outputs: No. of calls/visits made and meetings organised; no. of external agencies involved
Outcomes: Stronger network; awareness and number of information sources increased

3. Development of the database
Resources: Dedicated staff; database expert; view from client perspective; willingness to address bottlenecks
Activities: Process analysis; address bottlenecks; develop database structure and content; develop updating and user registration procedures
Outputs: No. of bottlenecks reduced; procedures manual compiled; database developed; evaluation form made; follow-up form made
Outcomes: Time and costs of providing the information reduced; user satisfaction increased

4. Developing information sources
Resources: Own resources (books, reports, databases); partners; Internet databases
Activities: Identify information sources (e.g., bibliographic databases); select information resources; develop access to resources
Outputs: Broader range of literature offered
Outcomes: % increase in the volume of information sources available

5. Budgeting and financing
Resources: Capable staff; support of management and donor agencies; organisational guidelines; donors' financial priorities
Activities: Make a financial plan; specify activities; calculate the budget
Outputs: Clear financial proposal; clear financial report
Outcomes: Funds available; expenditure within budget limits; staff and management agree on financial guidelines

Implementation/operations

6. Promotional activities
Resources: Motivated, skilled staff; promotional materials (website, posters, brochures)
Activities: Posters/brochures developed; meetings organised; media informed
Outputs: No. of brochures sent; no. of meetings and participants; no. of press releases
Outcomes: % of potential target groups aware of the service; no. of partners (experts involved) increases; % of users who share information received from the database
Impact indicators: % change in no. of user profiles developed to meet the needs of users

7. Delivery of services
Resources: Motivated, skilled staff; office space, equipment; database, information sources; registration and updating procedures
Activities: Registration of users; regularly sending out information when new material is available; updating and maintaining the database
Outputs: No. of persons using the database; more, better targeted literature/bibliographic references offered
Outcomes: % of users satisfied with the information available
Impact indicators: % change in the volume of published and unpublished papers

Monitoring and evaluation

8. Monitoring and evaluation
Resources: Capable staff; clear procedures; adequate registration; indicators of quality, effectiveness, efficiency
Activities: Designing report formats; organising data collection; generating statistics and reports
Outputs: Routine records regularly updated monthly, quarterly, yearly
Outcomes: Improved effectiveness and efficiency of the database

In using this framework, some key questions to see if the database is meeting the stated objectives over time become evident. For example:
• Is the research and scientific community aware of the database?
• Do researchers and scientists have access to the database?
• Is the database useful?
• Does the database contribute to an increased number of local scientific publications/research materials?
• Is the information provided in a timely and cost-effective way?

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team as well as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.
The range of stakeholders in a database could include:
• researchers/scientists
• extension workers
• farmers
• community workers
• policy-makers
• database manager
• content manager
• funding agencies
• implementing agency
• partners

The stakeholders will have a vested interest in the evaluation itself, and should be consulted and involved in some way. The input of the different groups will vary, depending on the particular situation. You should therefore take into account:
• their interests when designing the evaluation
• how they can actively contribute to the evaluation process
• how to include them as part of the evaluation team
• their role as a source of data (e.g., via questionnaires, interviews and workshops)
• their feedback on evaluation results

One way of identifying these stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the database, and how to include them in the evaluation. Table 4.63 provides an example of a stakeholder analysis.

When considering the stakeholders' interests in the database, be aware of the kind of questions that they would like to have answered. Table 4.64 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them.
But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders.

Table 4.63 Stakeholder analysis for a scientific database

Stakeholder category: Implementing agency
Benefits from the project: Improved database; improved outreach; promotional effect
Contributions / sacrifices: Time and expertise to develop the database concept and convince the funding agency; extra efforts
Influence on the project: Through the project plan, project management and implementation of the database
Potential involvement in the evaluation: Involve staff in the evaluation and inform staff of the findings

Stakeholder category: Users
Benefits from the project: Improved knowledge on issues they are researching
Contributions / sacrifices: Time to search the database; cost to access the database
Influence on the project: Contents and accessibility of the database
Potential involvement in the evaluation: Represented on the team; involved in analysis and decisions on actions for change; an information source; informed about main findings

Stakeholder category: Database manager / content manager
Benefits from the project: Feedback of users
Contributions / sacrifices: Time and expertise to develop the database and make it accessible
Influence on the project: Concept, structure and contents of the database
Potential involvement in the evaluation: Part of the evaluation team; should be involved in all stages of the evaluation; informed about main findings

Stakeholder category: Partners, NGOs
Benefits from the project: Better information for their target groups
Contributions / sacrifices: Time and technical input
Influence on the project: Improved database
Potential involvement in the evaluation: Involve key partners in the evaluation and inform them of the main findings

Stakeholder category: Funding agencies
Benefits from the project: Results to show to their clientele
Contributions / sacrifices: Finance and time
Influence on the project: On project concept and plan, as well as monitoring and evaluation
Potential involvement in the evaluation: Inform them of the main findings of the evaluation

Table 4.64 Key questions stakeholders could be interested in

Stakeholder category: Users
Questions they are interested in: Is the database relevant and timely? Is the information provided in an appropriate format? Is the overall database satisfactory?
Potential use of the answers: Should the database be used/recommended?

Stakeholder category: Partners, NGOs
Questions they are interested in: How effective and efficient is the database?
Potential use of the answers: Decide whether to continue collaboration with staff of the database

Stakeholder category: Implementing agency (agency management and database management)
Questions they are interested in: How effective and efficient is the database? What are its strengths, weaknesses, opportunities and threats? Is it popular and well presented? How can it be financed in the future?
Potential use of the answers: Should the database be continued? What should be done to improve it? What should be done to reduce costs and increase income?

Stakeholder category: Funding agencies
Questions they are interested in: Does the database have an impact on the quality of the scientific research produced? How sustainable is the database after we stop funding?
Potential use of the answers: Should funding be continued, reduced or increased? In what areas is assistance needed?

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of a database extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders' key questions and the time and cost of accessing data. The focus of the evaluation might stem from your desire to improve the database, from low or declining use, from feedback from others, or from a need to cut costs. There might be concerns about the database in comparison with a competitor. Other examples of areas of focus include:
• a specific objective (i.e.
providing information on particular subjects)
• specific assessment criteria (e.g., relevance and impact of the database)
• the primary stakeholders or those experiencing the most problems
• a combination of all of these

Asking specific questions will help you to improve key aspects of the database. With each key question, you need to think about the indicators you will need to help you answer that question. Indicators also determine the type of data to collect. Remember to consider the interests of the stakeholders and the ease of access you have to the data you need in terms of time and cost.

Table 4.65 provides an example of the focus, key questions and indicators for a database. You can use the information here to develop a more specific set of questions, relevant to the focus of the database. Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. With reference to Table 4.65, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the database, or on the criteria of usability and accessibility because of their importance to the users. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the database. Drawing up a matrix like Table 4.65 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your database.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:
• that all relevant data will become available during the evaluation
• that no more data are collected than what is needed and can be analysed

Data collection therefore requires careful preparation.
It happens too often that too much data are collected, or that insufficient 'good' data are collected to draw the relevant conclusions. Properly defining the scope of the evaluation to use as the basis for designing the evaluation questions and indicators allows you to see clearly what data are required and the most appropriate data collection methods to use. It should be possible to generate automatically much of the data required to evaluate your database.

Table 4.65 Determining the focus, key questions and indicators for the evaluation of a database, using some evaluation criteria as examples

Impact
Focus: Change in the quality of research
Key questions: Have researchers/scientists been able to improve the quality of their research? Have they been able to produce the type of information needed to increase production?
Indicators: % of researchers/scientists who say that the quality of their research has improved; % of researchers producing information relevant for increasing production
Stakeholder interest: High. Data accessibility: Low

Relevance
Focus: Usefulness
Key questions: Do users find the database useful?
Indicators: % of users who find the database useful; % of users who find the information of little use
Stakeholder interest: High. Data accessibility: Medium

Access
Focus: Accessibility/reliability
Key questions: Are there any restrictions to accessing the database (e.g., language, browser required)? Is there a charge to access the database? How do charges compare with other services? Is the service reliable and reasonably fast to access? Is there online help? What is the source of the information? (If it is a bibliographic database, ask if it is from a reputable publisher.)
Indicators: Rules governing usage of the database; % of users who say that the database is easy to access and reliable
Stakeholder interest: High. Data accessibility: Medium

Access
Focus: Technology/media
Key questions: How often is the hardware updated/maintained? What software is used?
Indicators: Type of media used (CD-ROM, web-based, etc.); whether the following are available: thesaurus searching; support for non-traditional (non-Boolean) searching; possibility for Boolean operators to be used; nested Boolean searches accepted to support complex searches
Stakeholder interest: Low. Data accessibility: Medium/High

Usability
Focus: Learnability
Key questions: How easy is it for you to use the database the first time you use it?
Indicators: % of users who find it easy to use the database
Stakeholder interest: High. Data accessibility: High

Usability
Focus: Efficiency
Key questions: Once you use the database, how easy is it to find the information you need?
Indicators: Perceived time taken to find the information (does it take too long, is it adequate?)
Stakeholder interest: Medium. Data accessibility: Medium

Usability
Focus: Memorability
Key questions: When you return to the database after a period of not using it, how easily can you re-establish proficiency?
Indicators: Whether it was easy or difficult to use the database again
Stakeholder interest: Medium. Data accessibility: Medium

Usability
Focus: Satisfaction
Key questions: Is the database design pleasant to use?
Indicators: % of users who are satisfied with the database
Stakeholder interest: Medium. Data accessibility: Medium

Usability
Focus: Content
Key questions: Is there a full-text resource? Indexes and abstracts? Index only? What is the quality and accuracy of the information, and are the contents up to date?
Indicators: Type of database in place; % of users who say that the database is accurate and up to date
Stakeholder interest: High. Data accessibility: Medium

Usability
Focus: Retrieval
Key questions: What are the search features? Can indexes be browsed and terms selected for searching? Can a combination of fields be searched simultaneously? Can search limits be applied (by year, language, publication type)?
Indicators: Itemise
Stakeholder interest: High. Data accessibility: Low

Usability
Focus: User satisfaction
Key questions: Are the users satisfied with the information provided? Are the users satisfied with the personal service offered?
Indicators: % of users satisfied with the information provided; % of users satisfied with the personal service from those operating the database
Stakeholder interest: High. Data accessibility: High

Effectiveness
Focus: Awareness
Key questions: Do you know about the database?
Indicators: % of scientists/researchers who know about the database
Stakeholder interest: High. Data accessibility: Medium
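Many of these indicators are simple counts or percentages that routine records can supply automatically. As an illustration, the sketch below uses hypothetical figures to check a growth target of the kind used at purpose level in Table 4.61, where the number of users is expected to increase by 50% over three years.

```python
def growth_indicator(baseline_users, current_users, target_increase=0.50):
    """Check a logframe-style indicator: has the number of users grown
    by at least the target percentage since the baseline year?"""
    if baseline_users <= 0:
        raise ValueError("baseline must be positive")
    growth = (current_users - baseline_users) / baseline_users
    return {"growth": round(growth, 2), "target_met": growth >= target_increase}

# Hypothetical routine records: registered users per year
users_by_year = {2005: 400, 2006: 480, 2007: 550, 2008: 620}

result = growth_indicator(users_by_year[2005], users_by_year[2008])
print(result)  # growth of 0.55, so the 50% target is met
```

A calculation like this is only as good as the routine records behind it, which is one more reason to keep registration and usage records up to date, as the monitoring row of the logic model suggests.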
In addition, routine records on dates of updating and numbers of records should also be easily accessible. Table 4.66 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use, and you might have to find more creative ways to get the information you need, or perhaps decide against using that particular indicator.

Table 4.66 Example of data collection methods for evaluating a database

Data collection method: Desk study
Information source: Feedback from users (e.g., letters, emails)
Key issues / questions: Are users satisfied with the database? Is the database making a difference to the type and quality of research being done? Is the database addressing the objectives it set out to address?
Answers expected and concerns: Some users may not have access to a database. Users who give feedback may not be the target group. Need to test the findings by looking at the user profile (i.e., in this case researchers, scientists and students).

Data collection method: Desk study
Information source: Accounts department
Key issues / questions: How much time is involved in developing and updating the database? What is the cost of production?

Data collection method: Interviews / questionnaires / self-administered questionnaires / informal interviews
Information source: Selected users; staff, partners
Key issues / questions: Are you aware of the database? Is the database useful? How have you changed/improved your research as a result of the database? What is your opinion on how the database was produced and the content included?
Answers expected and concerns: Time consuming, expensive. A self-administered questionnaire is less expensive, but the respondent may not fill in the questionnaire correctly or at all. Need to ensure that feedback is confidential to encourage trust and openness.

Data collection method: Direct observation
Information source: Identify five users to test the database by assigning set tasks to do
Key issues / questions: How easy is it to find the information asked for? Were there problems in accessing the information requested?

Data collection method: Meetings
Information source: Staff, NGOs, government agencies
Key issues / questions: What do you think about the structure, content and interactive use of the database? How can collaboration be improved?
Answers expected and concerns: Difficulty in getting partners to work with your organisation.

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.67 will help you to start thinking in this way. For example, if you want to know what the users think about the database, you should already be thinking about ways to interview them.

In designing the data analysis for a database evaluation, you should ask the following questions:
• Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of users, a graph showing development of users over time, or specific analytical tools such as a problem tree, process chart or SWOT analysis)
• Which collection methods will provide the necessary data for each of the tools?
• What type of observations do you expect to make for each tool?
• What type of conclusions do you expect to reach from using the tools?

Once you have designed the data analysis, it is important to see if your data collection includes all elements. The results from some data collection methods (e.g., interviews) could be specifically meant for checking and/or interpreting other data. Explain instances where you were unable to gather certain data.
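One of the analytical tools mentioned above, a table featuring the type of questions versus the type of users, can be produced with a short script; a spreadsheet would serve equally well. The sketch below uses hypothetical questionnaire data and field names.

```python
from collections import defaultdict

def cross_tabulate(responses, question):
    """Tabulate answers to one question by user type, e.g., how
    researchers versus students answered 'Is the database useful?'."""
    table = defaultdict(lambda: defaultdict(int))
    for r in responses:
        table[r["user_type"]][r[question]] += 1
    # Convert nested defaultdicts to plain dicts for readability
    return {user: dict(answers) for user, answers in table.items()}

# Hypothetical questionnaire responses
responses = [
    {"user_type": "researcher", "useful": "yes"},
    {"user_type": "researcher", "useful": "yes"},
    {"user_type": "researcher", "useful": "no"},
    {"user_type": "student", "useful": "yes"},
    {"user_type": "student", "useful": "no"},
]

print(cross_tabulate(responses, "useful"))
# {'researcher': {'yes': 2, 'no': 1}, 'student': {'yes': 1, 'no': 1}}
```

Reading across such a table shows at a glance whether different user groups judge the database differently, which is exactly the kind of observation the analysis design is meant to anticipate.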
Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process, in order to ensure that the conclusions and recommendations from the evaluation are well understood.

Table 4.67 Example of a data analysis design for a database evaluation

Table 1: Type of users
- Results from collection method used: desk study
- Type of observations to make: categories of users that are over/under-represented
- Type of conclusions to reach: categories of users to give more attention to

Table 2: Relevance to the target group
- Results from collection method used: desk study, questionnaires, interviews
- Type of observations to make: how useful the database is, given the needs of the target group
- Type of conclusions to reach: whether the database has outlived its purpose

Table 3: Access (target group)
- Results from collection method used: desk study, questionnaires, interviews
- Type of observations to make: high and low satisfaction with the database as a whole; level of awareness of the database; level of access of the target group to the database
- Type of conclusions to reach: priorities for improvement, obstacles

Table 4: Usability (learnability, efficiency, memorability, satisfaction of target group, content, retrieval)
- Results from collection method used: interviews
- Type of observations to make: whether staff and partners are happy with the database
- Type of conclusions to reach: whether the database has to be modified

Table 5: Efficiency
- Results from collection method used: data from accounts
- Type of observations to make: cost to produce the database; time involved in the production process
- Type of conclusions to reach: potential areas for reducing costs/time

Figure 1: Process flow chart
- Results from collection method used: database management procedures manual, interviews
- Type of observations to make: bottlenecks in the process
- Type of conclusions to reach: process steps to improve

Table 6: Impact stories
- Results from collection method used: in-depth interviews, case studies, feedback from correspondence
- Type of observations to make: personal stories of how research has improved because of increased access to the database; statistics showing, for example, change in the type, quality and number of research papers produced and supported
- Type of conclusions to reach: whether there has been some change in the type and quality of research undertaken
In designing this communication strategy, you need to consider the following issues:
- Which communication methods will suit which stakeholders, especially the primary stakeholders?
- What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

An effective communication plan, for critical reflection during an evaluation and for reporting findings during and after the evaluation, helps to create common understanding among stakeholders, providing a good basis for implementing recommendations. If your stakeholders don’t accept the conclusions and recommendations resulting from the evaluation, it will probably be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions.

Table 4.68 Example of a communication plan for reporting the findings from a database evaluation

Critical reflection
- Stakeholders to report to: staff, selected partners
- Main issues to discuss / report on: critically reflect on results from the data collection

Summary report
- Stakeholders to report to: representatives from the users and partners
- Main issues to discuss / report on: results from data collection

Full report, including executive summary
- Stakeholders to report to: funding agency, database management
- Main issues to discuss / report on: impact, effectiveness and efficiency; conclusions and recommendations

Personal meetings
- Stakeholders to report to: database management
- Main issues to discuss / report on: conclusions and recommendations

Database promotional brochure
- Stakeholders to report to: target users
- Main issues to discuss / report on: main findings in broad terms and how the database has changed in response to the needs of users; examples of database impact stories

During the process of critical reflection and reporting, you should ask:
- Do stakeholders have the same views on the problems with the database?
- Do they have the same views on the solutions?
- Are they prepared to support the same solutions?
- What obstacles prevent them from implementing the solutions proposed?
- What can be done to address the obstacles they face?

There are various ways in which you can convey the results of the evaluation (e.g., providing a summary of your findings, organising a meeting, a memorandum, an internet page). You might also want to consider the type of information you make available to the various target groups. The communication plan you use will depend on your circumstances and the facilities available to you. You might also want to consider sharing the results of your evaluation with your users, so that if there are changes to the database they will not come as a surprise.

If the stakeholders are properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement it because of a lack of resources, time and/or capacity.

An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Box 4.17 Evaluating a database: guidelines checklist

These guidelines on evaluating a database are covered above:
- Database concept and objectives
- Data needs and stakeholder participation
- Evaluation focus, questions and indicators
- Data collection
- Data analysis
- The communication plan

For more on:
- data analysis, see Part 2, page 60 and Part 3, pages 157-162
- data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
- evaluation communication and follow-up, see Part 3, pages 163-173
- evaluation criteria (scope), see Part 2, pages 33-34
- indicators, see Part 3, pages 91-98
- logframe, see Part 3, pages 68-83
- logic model, see Part 3, pages 103-105
- stakeholder participation, see Part 1, pages 3-5
- terms of reference, see Part 2, pages 32-35

SELECTIVE DISSEMINATION OF INFORMATION (SDI) SERVICE

These guidelines relate to evaluating an SDI service, using the terms of reference (see Part 3, pages 99-103) as the basis of the evaluation and drawing on examples where appropriate. It is important to make the evaluation process as participatory as possible, involving colleagues and key stakeholders from the outset, including your primary stakeholders (see Part 1, pages 3-5). By doing this, you will find the experience more rewarding and more likely to result in a general acceptance of the evaluation findings, making it much easier to implement change. The notion of ‘critical reflection’ is discussed towards the end of the guidelines, but we advise you and your evaluation team to ‘reflect’ on the main findings and problems at each stage of the evaluation process, for learning purposes and to improve the way you conduct the evaluation (see Part 1, pages 5-6).

The background to the evaluation process in general – concepts, context, terms, trends and core ingredients – is described in Part 2.
Do read that section if you’re not familiar with certain concepts or terms that occur in these guidelines. And in Part 3 you will find more on the evaluation tools described here.

Before evaluating an SDI service, you need to be clear about:
- what the main elements of the SDI service are
- its concept(s) and objectives
- who the primary and secondary stakeholders are
- how to go about identifying the needs of the primary and secondary stakeholders
- what the focus of the evaluation is, the questions to ask and the indicators to use
- how to collect the data and analyse them
- how to communicate with your key stakeholders, to critically reflect upon the evaluation results and to report the results in such a way that they will be accepted and acted upon

What is an SDI service?

An SDI service is a current awareness service that provides users with up-to-date and relevant information on a regular basis: monthly, bi-monthly or quarterly. Libraries and information services often provide such a service. It is based on profiles that describe the users’ information needs. A profile is compiled by the user in conjunction with the information professionals who provide the service. The profile is a symbolic description of a documentary/information requirement, represented by a search equation. It can be a standard profile designed for a group of users sharing a common information need, or a personalised profile designed for an individual user. The profile is used as the starting point to select relevant information for the user. The profiles are run on multiple databases to ensure optimum recall based on subject, geographical and linguistic coverage. The service usually sends users newly recorded bibliographic records, including abstracts, corresponding to their respective profiles.
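To make the idea of a profile as a stored search equation concrete, here is a minimal Python sketch. The profile format (a set of keywords, any one of which may match) is a deliberate simplification of a real boolean search equation run on multiple databases, and all user names and records are hypothetical:

```python
# Minimal sketch of SDI profile matching. Each profile is a stored "search
# equation", simplified here to a set of keywords; a real service would run
# boolean queries against several bibliographic databases.
profiles = {
    "user_001": {"soil", "irrigation"},
    "user_002": {"aquaculture"},
}

new_records = [
    {"id": "rec-1", "title": "Drip irrigation for smallholders",
     "keywords": {"irrigation", "water management"}},
    {"id": "rec-2", "title": "Pond aquaculture in West Africa",
     "keywords": {"aquaculture", "fisheries"}},
]

def match(profile_terms, record):
    # A record matches if it shares at least one keyword with the profile.
    return bool(profile_terms & record["keywords"])

# Each user is alerted only to the newly added records matching their profile.
alerts = {user: [r["id"] for r in new_records if match(terms, r)]
          for user, terms in profiles.items()}
print(alerts)
```

The routine records this loop would naturally generate (which records were sent to which profile, and when) are exactly the kind of monitoring data the evaluation draws on later.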
An SDI service is often coupled with a document delivery service that allows users to order documents of particular interest. The service is usually provided by information centres, and is frequently run by larger organisations. The establishment of an SDI service represents a significant investment in both human and financial resources. It is therefore essential to ensure that the service is meeting the objectives for which it was established.

Determining the SDI service concept and objectives

You can’t evaluate your SDI service unless you are clear what it is about. The concept (idea) behind the service and the objectives it seeks to achieve need to be clearly stated, otherwise you can’t compare its actual performance with its intended performance. Also, without a clear concept it is difficult to make the right choices during the evaluation process.

In determining an SDI service concept, you should ask these questions:
- Why was the SDI service developed?
- What were the main objectives (expected results) and main problems to be addressed at the time?
- What is the goal of the service?
- What are its core values?
- What is the project purpose of the service?

In evaluating the SDI service, you need to ask yourself key questions related to the concept, such as:
- Do the expected results achieved reflect its core values and approach?
- Do they reflect the project purpose and contribute to the overall objective?
- Does the SDI service address the main objectives and problems it was supposed to?
- Are the correct messages being conveyed?

Box 4.18 Example of a concept behind an SDI service supporting economic development

Main problem: Researchers and scientists in developing countries have limited access to the requisite academic literature to support their research activities and to develop the appropriate technologies needed to support the development of their countries’ economies.
Goal: To improve the quality of research and the development of appropriate technologies, in order to improve the quality of life of the population.

Purpose: Access to information for researchers and scientists improved.

Expected results: An appropriate SDI service for the target group provided.

Main approach: To provide the service to key institutions concerned with research and scientific development within the country.

Core values: Relevance, usability, effectiveness.

The logical framework can help you clarify the SDI service objectives (see pages 68-83). Commonly known as the ‘logframe’, it helps to summarise a project in a logical sequence. If a project plan lacks a logframe, it is useful to construct one so that you have a good idea of your objectives and the consequent hierarchy of activities. An example of a logframe for an SDI service is given in Table 4.69.

Where the SDI service is not seen as a project, but as an ongoing activity, the logic model might be the more appropriate tool to use. It provides an overview of the activities associated with the service, the resources used, the outputs/results, the outcomes/effects and the impact. An example of a logic model applied to an SDI service is given in Table 4.70.

Clarifying data needs and stakeholder participation

When preparing for an evaluation, you will need to know who your primary and secondary stakeholders are, so that you can determine your data needs and involve stakeholder representatives in the evaluation. The stakeholders should be taken into account at every stage of the evaluation, from its planning and design, to being part of the team, serving as sources of information, and providing feedback on the evaluation results. It will be difficult to implement the evaluation recommendations if they are not acceptable to your stakeholders.
The range of stakeholders in an SDI service could include:
- users (primary stakeholders, e.g., researchers, lecturers, policy-makers, extension workers, farmers, groups/organisations including women’s groups and farmer associations, input suppliers, students, training institutions)
- partners (organisations with resources needed for the success of the SDI service)
- policy-makers (those who have approved the implementation of the service)
- implementing agency
- funding agencies (e.g., development organisations, NGOs, government ministries)

The stakeholders will have a vested interest in the evaluation itself, and should be consulted and involved in some way. The input of the different groups will vary, depending on the particular situation. You should therefore take into account:
- their interests when designing the evaluation
- how they can actively contribute to the evaluation process
- how to include them as part of the evaluation team
- their role as a source of data (e.g., via questionnaires, interviews and workshops)
- their feedback on evaluation results

One way of identifying these stakeholders is to conduct a stakeholder analysis. It helps you to identify the key stakeholders, how they benefit from and contribute to the SDI service, and how to include them in the evaluation. Table 4.71 provides an example of a stakeholder analysis.

When considering the stakeholders’ interests in the SDI service, be aware of the kind of questions that they would like to have answered. Table 4.72 gives an overview of the kind of key questions that might be asked. If you know your stakeholders really well, you can start the process by working with your colleagues to formulate the questions based on your knowledge of them. But the stakeholders must become involved at some stage to ensure that you have their views. You can do this using interviews, workshops and questionnaires with individual stakeholders or groups of stakeholders.
Table 4.69 Logical framework for an SDI service

Goal
- Intervention logic: to contribute to improved quality of life of the population through research and the development of appropriate technologies

SDI service purpose
- Intervention logic: availability of information to researchers and scientists improved
- Objectively verifiable indicators: after 3 years, the number of users has tripled
- Means of verification: routine records
- Assumptions: access to SDI contributes to increased production

Expected results
- Intervention logic: improved awareness among users and partners; improved availability of adequate information materials
- Objectively verifiable indicators: after 3 years, 90% of users and partners aware of the SDI; 75% of requests can be addressed with existing information resources; 50% increase in number of publications written
- Means of verification: survey; routine records
- Assumptions: increased awareness leads to increased use

Activities
- Intervention logic: 1.1 organise meeting with partners to discuss SDI and its promotion; 1.2 develop promotional campaign with partners; 1.3 implement campaign; 2.1 user needs survey and discussions with key stakeholders; 2.2 formulate project proposal; 2.3 generate donor interest and funding to strengthen SDI; 2.4 develop implementation plan; 2.5 acquire additional resources
- Objectively verifiable indicators: (sometimes a summary of resources/means is provided in this box)
- Means of verification: (sometimes a summary of costs/budget is provided in this box)
- Assumptions: (if the activities are completed, what assumptions must hold true to deliver the expected results)

Inputs
- Intervention logic: funding from donor agency; core funding from Ministry
- Objectively verifiable indicators: list resources available

Defining the evaluation focus, questions and indicators

Most evaluation exercises need to be limited in focus to reflect time and budget limitations. It is not possible to cover all elements of an SDI service extensively every time you carry out an evaluation, so you need to choose the focus of your evaluation, based on stakeholders’ key questions and the time and cost of accessing data.
Table 4.70 Logic model for an SDI service

Planning and development

1. User needs identification
- Resources: open-minded staff; transport, postage, photocopies; willingness of users to participate
- Activities: design and administer questionnaires/interviews; conduct meetings; analyse data and draw conclusions
- Outputs (indicators of results): list of user needs, list of ideas identified, list of stakeholders who should be involved
- Outcomes (indicators of effects): changes based on user needs analysis

2. Involving stakeholders and partners
- Resources: external-oriented staff; transport, copies; willingness of partners
- Activities: calls, visits, meetings with external partners
- Outputs: no. of calls and visits made and meetings organised; no. of new external agencies involved
- Outcomes: increased level of awareness of the role of the SDI; no. of information sources increased

3. Process development
- Resources: dedicated staff; a view from the client perspective; willingness to address bottlenecks
- Activities: developing agreements; process analysis; address bottlenecks; develop/update procedures and user profiles
- Outputs: bottlenecks reduced; procedures manual compiled; user profile database developed; evaluation form made; follow-up form made
- Outcomes: time and costs of providing the information; user satisfaction increased

4. Developing information sources
- Resources: own resources (books, reports, databases); partners; Internet
- Activities: identify and select information resources; develop access to database resources
- Outputs: broader range of literature offered
- Outcomes: % increase in the volume of information available

5. Budgeting and financing
- Resources: capable staff; support of management and donor agencies; organisational guidelines; donor financial guidelines
- Activities: make a financial plan; specify activities; calculate the budget
- Outputs: clear financial proposal; clear financial report
- Outcomes: funds available; expenditure within budget limits; staff and management agree on financial priorities

Implementation/operations

6. Promotional activities
- Resources: motivated, skilled staff; promotional materials (website, posters, brochures)
- Activities: posters/brochures developed; meetings organised; websites maintained; media informed
- Outputs: no. of brochures sent; no. of meetings and no. of participants in meetings; no. of website visitors; no. of press releases
- Outcomes: % of potential target groups aware of the service; no. of partners (experts involved) increases; % of users who share information received from the SDI
- Impact indicators: % change in the number of user profiles developed to meet the needs of users

7. Delivery of services
- Resources: experts; database of information sources, user profiles; alerting/message tools to regularly send out information; full-text documents
- Activities: liaise with experts; regularly send out information based on the user profiles; update user profiles
- Outputs: no. of requests for full-text papers increases; more, better targeted literature/bibliographic references offered
- Outcomes: % of users satisfied with the information received
- Impact indicators: % change in the volume of published and unpublished papers

Monitoring and evaluation

8. Monitoring and evaluation
- Resources: capable staff; clear procedures; adequate registration; indicators of quality, effectiveness, efficiency
- Activities: define indicators; design report formats; organise data collection; generate statistics and reports
- Outputs: routine records regularly updated (monthly, quarterly, yearly)
- Outcomes: improved effectiveness and efficiency of the service

Table 4.71 Stakeholder analysis for an SDI service

Users (e.g., researchers, scientists, students, farmers)
- Benefits from the project: improved knowledge on issues they are investigating
- Contributions / sacrifices: time to look through the information; cost to access the service
- Influence on the project: content, form of delivery, format of the information provided
- Potential involvement in the evaluation: represented on the team; involved in analysis and decisions on actions for change; an information source; informed about main findings

Policy-makers and partners (e.g., libraries and research institutes)
- Benefits from the project: better information for the scientific and research community
- Contributions / sacrifices: time and support for the service; costs are also involved
- Influence on the project: through contracts and agreements
- Potential involvement in the evaluation: source of information; represented on the team; informed about main findings

Implementing agency
- Benefits from the project: improved service; improved outreach; promotional effect
- Contributions / sacrifices: time to develop the SDI concept and convince the funding agency
- Influence on the project: through the project plan, project management and implementation, provision of the service
- Potential involvement in the evaluation: represented on the team; involved in the planning; should be informed of findings

Funding agencies (e.g., NGOs, ministries, development agencies)
- Benefits from the project: results to show to their clientele
- Contributions / sacrifices: finance and time
- Influence on the project: on project concept, planning, and monitoring and evaluation
- Potential involvement in the evaluation: should be informed of the findings

Table 4.72 Key questions stakeholders could be interested in

Users (e.g., researchers, scientists, students, farmers)
- Questions they are interested in: Is the information provided relevant, timely and applicable? Is the information provided in an appropriate format? Is the overall service satisfactory?
- Potential use of the answers: Will they continue to use the service? Are they likely to recommend it to others?

Policy-makers and partners (e.g., libraries, research institutes)
- Questions they are interested in: How effective and efficient is the service?
- Potential use of the answers: Should we continue to collaborate with the organisation implementing the service?

Implementing agency
- Questions they are interested in: How effective and efficient is the service? What should we do to improve it? What should we do to reduce costs and increase income?
- Potential use of the answers: Should we continue with the service? How can the service be financed in the future?

Funding agencies (e.g., NGOs, ministries, development agencies)
- Questions they are interested in: Does the service have an impact on the quality of research and development in the country? How sustainable is the service after we stop funding?
- Potential use of the answers: Should we continue, reduce or increase our funding? In what areas can/should we assist?

The focus of the evaluation might stem from your desire to improve the SDI service, from low or declining use, from feedback from others, or from a need to cut costs. There might be concerns about the SDI service in comparison with a competitor. Other examples of areas of focus include:
- a specific objective (i.e., providing information on particular subjects)
- specific assessment criteria (e.g., relevance and impact of the service)
- the primary stakeholders or those experiencing the most problems
- a combination of all of these

Asking specific questions will help you to improve key aspects of the SDI service. With each key question, you need to think about the indicators that will help you answer that question. Indicators also determine the type of data to collect. Remember to consider the interests of the stakeholders and the ease of access you have to the data you need, in terms of time and cost. Table 4.73 provides an example of the focus, key questions and indicators for an SDI service. You can use the information here to develop a more specific set of questions, relevant to the focus of the SDI service.
Using the feedback from your stakeholders and the data available to you, you will be able to determine which questions are important for your evaluation. With reference to Table 4.73, you may want to focus on one criterion (e.g., impact) because of its importance to the continuation of the SDI service, or on the criteria of usability and accessibility because of their importance to the users. If accountability is the main purpose of the evaluation, you may want to limit the scope of the evaluation to the relevance, effectiveness and efficiency of the service.

Drawing up a matrix like Table 4.73 shows how important it is to have access to good data to support your evaluation and, by extension, how important it is to maintain a good monitoring system of data relating to your SDI service. For an SDI service, it is particularly important to keep routine records, providing details of each user profile, when it was received, when updates were sent, the information provided (level, subject) and the means of delivery. This will provide the basis for the evaluation. However, certain aspects such as impact, relevance and awareness require an extensive survey among users and potential users. So, depending on what you want to know, you will need to have evaluation questions that focus on the users’ view of the service.

Collecting the data

When you have determined your scope, focus and indicators, you will need to decide how to collect the data. It is important to develop a data collection strategy that ensures:
- that all relevant data will become available during the evaluation
- that no more data are collected than what is needed and can be analysed

Data collection therefore requires careful preparation. It happens too often that too much data are collected, or that insufficient ‘good’ data are collected to draw the relevant conclusions.
Properly defining the scope of the evaluation, and using it as the basis for designing the evaluation questions and indicators, allows you to see clearly what data are required and the most appropriate data collection methods to use. Good SDI evaluations should collect information from both the service provider and the users, to ensure that decisions are based on comprehensive, accurate and relevant data. While absolute numbers are important (to determine the size of the need, and for comparison between groups and over time), you should also calculate percentages.

Table 4.73 Determining the focus, key questions and indicators for the evaluation of an SDI service, using some evaluation criteria as examples

Impact: change in quality and type of research produced
- Key questions: Is there more local applied research aimed at improving production?
- Indicators: % of producers who are able to improve their production after having access to locally produced research and technology
- Stakeholder interest: high; data accessibility: low

Relevance: information needs of users
- Key questions: Do the users find the service useful?
- Indicators: % of users who find the SDI useful; % of users who find the information irrelevant
- Stakeholder interest: high; data accessibility: medium

Accessibility: access
- Key questions: Are stakeholders able to subscribe to the service? Can users easily get the information they need from the service?
- Indicators: % of stakeholders who want to subscribe to the service but can’t; % of researchers/scientists who face difficulties accessing information from the service
- Stakeholder interest: high; data accessibility: medium

Accessibility: delivery
- Key questions: How was the service delivered?
- Indicators: no. of users who receive their information by paper or e-mail
- Stakeholder interest: medium; data accessibility: high

Accessibility: user satisfaction
- Key questions: Are the users satisfied with the information provided by the service? Are the users satisfied with the personal service offered?
- Indicators: % of users satisfied with the service; % of users who are satisfied with the personal service they received from those operating the service
- Stakeholder interest: high; data accessibility: medium

Accessibility: user profile
- Key questions: Are the profiles up to date, and do they cover user information needs?
- Indicators: no. of users who feel their profile reflected their regular information requirements
- Stakeholder interest: high; data accessibility: high

Effectiveness: awareness
- Key questions: Do you know about the service?
- Indicators: % of scientists/researchers who know about the service
- Stakeholder interest: high; data accessibility: medium

Effectiveness: timeliness
- Key questions: Is the information needed produced on a timely basis?
- Indicators: % of users who receive the information on a timely basis; no. of complaints received related to delays in receiving the right information
- Stakeholder interest: high; data accessibility: medium

Efficiency: expenditure (how much does it cost to have access to the information?)
- Key questions: How many profiles need to be developed/updated, and what are the costs and the time involved to meet the information needs based on the profiles?
- Indicators: What is the unit cost per profile? What is the average cost/time involved in producing the information for the profiles?
- Stakeholder interest: medium; data accessibility: high

Table 4.74 Example of data collection methods for evaluating an SDI service

Desk study
- Information source: SDI records; internal memoranda on meetings with colleagues, letters, emails
- Key issues / questions: number and type of users, subject interest; Is the service making a difference to research and scientific development? Is the service doing what it set out to do? Are the users satisfied?
- Answers expected and concerns: some researchers and scientists don’t have access to the service; responses ranging from very satisfied to very dissatisfied

Desk study
- Information source: accounts department
- Key issues / questions: How much time and money are involved in developing the database, producing the information and delivering the services, including the full-text documents?
- Answers expected and concerns: costs can vary depending on the number of requests for document deliveries

Interviews/questionnaires
- Information source: researchers, scientists
- Key issues / questions: Are the users using the information? Have they written more papers or done more research as a result of increased access to the SDI?
- Answers expected and concerns: time consuming, expensive; self-administered questionnaires are less expensive, but the respondent may not fill in the questionnaire correctly, or at all

Workshop
- Information source: partners, publishing houses
- Key issues / questions: strengths, weaknesses, opportunities and threats of the SDI; possibilities to improve collaboration

Table 4.74 gives some guidance on which data collection tools are most appropriate to use for the indicators identified. Note that some of the tools needed for some indicators might be too expensive to use, and you might have to find more creative ways to get the information you need, or decide against using that particular indicator.

Analysing the data

It is advisable to think about data analysis tools and methods before starting the data collection. This will help to ensure that the right types of data are being collected. Table 4.75 will help you to start thinking in this way. For example, if you want to know what users think about the SDI service, you should already be thinking about ways to interview them.

In designing the data analysis for an SDI service evaluation, you should ask the following questions:
- Which analytical tools should be used? (e.g., a table featuring the type of questions versus the type of users, a graph showing the increase in users over time, or specific analytical tools such as a problem tree, process chart or SWOT analysis)
- Which collection methods will provide the necessary data for each of the tools?
- What type of observations do you expect to make for each tool?
- What type of conclusions do you expect to reach from using the tools?
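The efficiency indicators in Table 4.73 (unit cost per profile, average cost/time involved in producing the information) can be computed directly once accounts data and routine records are at hand. A minimal Python sketch follows; all figures are hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical accounts and routine-record figures for the efficiency indicators:
total_cost = 12_000.0   # annual cost of running the SDI service (assumed)
profiles_served = 300   # active user profiles on record (assumed)
updates_sent = 1_800    # SDI mailings sent over the year (assumed)
staff_hours = 900.0     # staff time spent on the service (assumed)

unit_cost_per_profile = total_cost / profiles_served   # cost per profile
cost_per_update = total_cost / updates_sent            # cost per mailing sent
hours_per_profile = staff_hours / profiles_served      # staff time per profile

print(f"unit cost per profile  : {unit_cost_per_profile:.2f}")
print(f"cost per update sent   : {cost_per_update:.2f}")
print(f"staff hours per profile: {hours_per_profile:.1f}")
```

Tracked year on year, these ratios show whether efficiency is improving, and they feed directly into the "potential areas for reducing costs/time" conclusions of the analysis design.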
289 Smart Toolkit for Evaluating Information Projects, Products and Services Table 4.75 Example of a data analysis design for an SDI service ANALYTICAL TOOLS RESULTS FROM TYPE OF OBSERVATIONS TYPE OF CONCLUSIONS AND TABLES COLLECTION METHOD TO MAKE TO REACH USED Table 1: Desk study Categories of users that are over/under- Categories of users to give Type of questions and type represented more attention to of users Table 2: Desk study How useful is the service given the needs of the Whether or not the SDI is (target group) Questionnaires target group providing the ‘right’ information Relevance Interviews Table 3: Desk study Whether the target group is able to access the Need to devise ways of Accessibility Questionnaires service increasing access to target Interviews group Table 4: Desk study High and low satisfaction with the service as a Areas for improvement Usabilty Questionnaires whole User satisfaction Interviews Tables 5: Desk study Level of awareness of the service Priorities for improvement (target group) Questionnaires Level of access of the target group to the service Obstacles to improvement Effectiveness: Interviews Range of topics covered by the SDI - Awareness - Coverage Table 6: Data from accounts Cost to produce the information Potential areas for reducing Efficiency costs/time Figure 1: Questionnaires, interviews, Constraints in the process Process steps to improve Problem tree workshop Table 7: In-depth interviews, case Anecdotal evidence that locally produced scientific Continue/redirect the SDI Impact stories studies, feedback from users material is promoting development in certain areas of the economy Figure 2: Workshop with staff, Strengths, weaknesses, opportunities and threats of New strategy for the SDI SWOT analysis stakeholders the SDI Once you have designed the data analysis, it is important to see if your data collection includes all elements.The results from some data collection methods (e.g., interviews) could be specifically meant 
for checking and/or interpreting other data. Explain any instances where you were unable to gather certain data.

Preparing a communication plan

Designing the communication plan is an important activity in the evaluation process, as it ensures that the conclusions and recommendations from the evaluation are well understood and supported. In designing this plan, you need to consider these issues:
z Which communication methods will suit which stakeholders, especially the primary stakeholders?
z What are the main issues to be discussed and reported, taking into consideration the communication method and target group?

An effective communication plan, covering critical reflection during the evaluation and the reporting of findings both during and after it, helps to create common understanding among stakeholders and provides a good basis for implementing recommendations. If your stakeholders don't accept the conclusions and recommendations resulting from the evaluation, it will be almost impossible to motivate them to make the required changes. However, if they have been included throughout the process, they are more likely to accept the conclusions. During the process of critical reflection and reporting, you should ask:
z Do stakeholders have the same views on the problems with the service?
z Do they have the same views on the solutions?
z Are they prepared to support the same solutions?
z What obstacles prevent them from implementing the solutions proposed?
z What can be done to address the obstacles they face?

There are various ways in which you can convey the results of the evaluation (e.g., a summary of your findings, a meeting, a memorandum, an internet page). You might also want to consider the type of information you make available to the various target groups. The communication plan you use will depend on your circumstances and the facilities available to you.
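One simple way to work through the first question above is to write the plan down as data, mapping each communication method to the stakeholder groups it reaches, and then check that no group is left out. The method names and stakeholder groups below are illustrative only, not prescribed by the toolkit.

```python
# A communication plan sketched as data: each method maps to the stakeholder
# groups it is meant to reach. Names are hypothetical examples.
plan = {
    "summary report": ["users of the service"],
    "full report": ["funding agency", "SDI management"],
    "personal meeting": ["SDI management"],
    "promotional brochure": ["target users"],
}

# The stakeholder groups the evaluation must report to.
stakeholder_groups = {
    "users of the service",
    "funding agency",
    "SDI management",
    "target users",
}

# Verify that every group is reached by at least one communication method.
reached = {group for groups in plan.values() for group in groups}
missing = stakeholder_groups - reached
print("all stakeholder groups covered" if not missing
      else f"not yet covered: {sorted(missing)}")
```

The same structure can be extended with a column for the main issues to report to each group, mirroring Table 4.76.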
Whatever you decide, your findings and recommendations should be presented in a way that is easily understood and accepted by all, so that the desired changes can be implemented (see Table 4.76). If the stakeholders have been properly involved in the evaluation process in terms of participation and input, then getting their agreement on the relevance, reliability and quality of the data collected, the adequacy of the analysis provided and the conclusions drawn should not be difficult. Be aware, however, that although there might be general acceptance of a proposed solution, you might not be able to implement it because of a lack of resources, time and/or capacity.

Table 4.76 Example of a communication plan for reporting the findings from an SDI service

COMMUNICATION METHOD | STAKEHOLDERS TO REPORT TO | MAIN ISSUES TO DISCUSS / REPORT ON
Critical reflection | Staff, selected partners | Critically reflect on the results from the data collection
SWOT analysis | Representatives of the SDI, partners, other key stakeholders | Critically reflect on the results from the data collection
Brainstorming | Evaluation team | Problematic issues as they arise
Summary report | Users of the service | Results from the data collection in terms of whether the service is useful or not; positives; constraints
Full report, including executive summary | Funding agency, SDI management | Relevance, satisfaction, awareness, access, reach; impact indication; cost-effectiveness
Personal meeting | SDI management | Conclusions and recommendations; feedback on individual staff members
SDI promotional brochure | Target users | Examples of SDI impact stories

An important component of the communication process is to find out what difficulties the stakeholders expect and/or experience in implementing the recommendations. Where changes can be implemented without difficulty, it is advisable to implement them as quickly as possible.
Box 4.17 Evaluating an SDI service: guidelines checklist

These guidelines on evaluating an SDI service are covered above:
z SDI service concept and objectives
z Data needs and stakeholder participation
z Evaluation focus, questions and indicators
z Data collection
z Data analysis
z The communication plan

For more on:
z data analysis, see Part 2, page 60 and Part 3, pages 157-162
z data collection, see Part 2, pages 58-59 and Part 3, pages 106-115
z evaluation communication and follow-up, see Part 3, pages 163-173
z evaluation criteria (scope), see Part 2, pages 33-34
z indicators, see Part 3, pages 91-98
z logframe, see Part 3, pages 68-83
z logic model, see Part 3, pages 103-105
z stakeholder participation, see Part 1, pages 3-5
z terms of reference, see Part 2, pages 32-35