Monitoring and Evaluation (M&E) is an integrated activity and communication system within project supervision that must be properly planned, managed, and resourced in order to achieve the desired results.
M&E is undertaken to increase the impact of MDARRC's strategies, programs, projects, and studies; to provide reliable and useful feedback about ongoing and completed studies; and to increase the ability to analyze this feedback. Moreover, through the conduct of M&E, relevant strategic implications can be deduced, providing inputs on how to do better in the future. MDARRC's monitoring and evaluation challenge is to structure this learning process in a way that generates relevant knowledge for stakeholders and partners, while at the same time ensuring that this knowledge can and will be applied in practical and immediate ways.
The conduct of evaluation is based on the clear-cut and well-defined logical framework (logframe) of MDARRC's program and of individual projects and studies. In this way, tracking progress against carefully defined output indicators provides a clear basis for monitoring progress and verifying purpose and goals, so that evaluation is simplified.
This approach stresses the collaboration of multiple stakeholders within as well as outside MDARRC, and the learning and sharing of knowledge in a cooperative relationship with evaluation partners (our end users and technology adopters), in order to increase the likelihood of our partners adopting and using our technologies. MDARRC shall also adopt a sound and workable M&E system so that relevant and specific findings and recommendations are applied by researchers and partners to improve performance.
MDARRC's M&E shall stress the importance of thematic evaluation by revisiting existing operational policies and strategies, or formulating new ones, to increase the involvement of researchers and target beneficiaries in dealing with issues related to the evaluation of strategies and programs at various levels.
The center's evaluation process shall also stress the importance of learning for those who are expected to use the evaluation recommendations and lessons to improve performance, and who also participate in developing these recommendations. Target beneficiaries and partners entrusted with the implementation of activities who are involved in the evaluation process would also become motivated learners, able to translate into action what they have learned from evaluation work.
6.1 Program Level
The Logical Framework will be at the core of MDARRC’s monitoring and evaluation system to track the progress of the RDE Action Programs being implemented. This takes the form of a series of connected propositions:
- If these Activities are implemented, and these Assumptions hold, then these Outputs will be delivered.
- If these Outputs are delivered, and these Assumptions hold, then this Purpose will be achieved.
- If this Purpose is achieved, and these Assumptions hold, then this Goal will be achieved.
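For illustration only, the chain of propositions above can be sketched as a simple data structure, where each logframe level records its statement, its assumptions, and whether it has been achieved. All names, statements, and assumptions in this sketch are hypothetical placeholders, not actual MDARRC logframe entries.

```python
# Illustrative sketch of a logframe's if-then chain
# (Activities -> Outputs -> Purpose -> Goal). Hypothetical entries only.

from dataclasses import dataclass


@dataclass
class LogframeLevel:
    name: str          # e.g., "Activities", "Outputs", "Purpose", "Goal"
    statement: str     # what should be delivered or achieved at this level
    assumptions: list  # conditions that must hold to move up to the next level
    achieved: bool = False


def chain_holds(levels):
    """The next level is reachable only if every lower level is achieved.
    (Verifying the listed assumptions is a monitoring task in the field,
    not a computation, so they are recorded here only as reminders.)"""
    return all(level.achieved for level in levels)


# Hypothetical example entries
activities = LogframeLevel("Activities", "Conduct field trials",
                           ["Funds are released on time"], achieved=True)
outputs = LogframeLevel("Outputs", "Technologies generated",
                        ["Partners remain engaged"], achieved=True)
purpose = LogframeLevel("Purpose", "Technologies adopted by end users",
                        ["Policy support continues"])

# Purpose becomes verifiable only after Activities and Outputs are achieved
print(chain_holds([activities, outputs]))            # True
print(chain_holds([activities, outputs, purpose]))   # False
```

The point of such a representation is that each level's indicator can be checked independently during monitoring, while evaluation asks whether the chain as a whole still holds.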
As part of continuing Monitoring and Evaluation, MDARRC shall conduct an Annual In-House Review to assess and evaluate the performance of programs/projects/studies, with experts from the ERDB Central Office and partner organizations such as PCARRD, SMARRDEC, and SUCs serving as evaluators. This shall consist of a presentation of research results to provide baseline information on a project vis-à-vis a particular indicator. It will also provide a good venue to help explain whether or not changes are occurring in a particular project or study.
Annual accomplishment reports shall be prepared and submitted to determine whether the project remains relevant to the outcomes that the program/project/studies aim to achieve.
6.2 Project Level
For effective Monitoring and Evaluation, MDARRC shall employ direct observation through field visits at every stage of project implementation. This is to obtain useful and timely first-hand information by observing what people do, in order to help make decisions on improving project performance or to generate insights and findings that can serve as the basis for more focused studies. This method is crucial for complementing collected data; it helps explain the context in which the information was collected and can help explain results.
Monthly physical accomplishment reports and quarterly reports reflecting targets versus accomplishments, together with the corresponding financial utilization, shall be prepared and submitted regularly to determine the progress of implementation of the various component activities on a timely basis.
6.3 Study Level
This will measure bio-physical changes over time related to any indicator (e.g., no. of databases developed, no. of technologies generated, no. of thematic/GIS maps prepared, growth and survival data, etc.) using any accepted measurement unit and procedure. From an M&E perspective, bio-physical measurements can provide reliable, statistically verifiable data that form an important basis for measuring change and impact. The data shall be recorded in tables, diagrams, or graphs, using words or numbers. These data can then serve as a baseline framework so that comparisons can be made over time.
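As a minimal sketch of the comparison described above, the change in an indicator can be computed against its baseline measurement. The indicator name, years, and figures below are hypothetical examples, not MDARRC data.

```python
# Illustrative sketch only: tracking an indicator's change from a
# baseline measurement over time. All figures are hypothetical.

baseline_year = 2020
# Hypothetical record: seedling survival rate (%) per monitoring year
survival_rate = {2020: 62.0, 2021: 68.5, 2022: 74.0}


def change_from_baseline(series, baseline):
    """Return each year's change relative to the baseline measurement."""
    base = series[baseline]
    return {year: round(value - base, 1) for year, value in series.items()}


print(change_from_baseline(survival_rate, baseline_year))
# {2020: 0.0, 2021: 6.5, 2022: 12.0}
```

Tabulating the indicator this way makes the direction and magnitude of change explicit, which is what the comparison over time is meant to show.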