Introduction
This master course focuses on capturing the essential quantitative and qualitative changes of a programme or project, which is both important and challenging in today's operating environment. How do you design an M&E system that goes beyond tracking activities?
Learn how to demonstrate results and ensure your system also enables genuine learning by all actors. Learn to automate and build systems that cut down routine work time so you can focus on the quality aspects of M&E.
This course provides you with an extended set of relevant, practical and participatory M&E methodologies and tools to boost your programme and organisational accountability, decision-making and learning practices.
After attending this workshop, you will be able to:
- Understand and apply results-based M&E system principles to your M&E framework
- Automate processes, strengthen programme-level coverage and control, apply lean programme management, and reduce routine work time by 60% to 90%
- Know how to turn M&E into a participatory and systematic learning practice
- Design and use key performance indicators
- Apply Outcome Mapping and design and use progress markers
- Apply the core steps of the Most Significant Change qualitative evaluation methodology
- Develop effective systems for data collection, monitoring, storage, analysis, and reporting
- Manage projects and monitoring and evaluation plans through to results, including overseeing research firms, external evaluators and partners
- Build trust and long-term relationships with grantees to support and shape their learning and their ability to change based on that learning
- Identify ways to provide low-cost, high-value technical assistance on M&E to grantees
- Support the Director of R&D in building relationships with potential research and analytics firms and individuals
Who Should Attend?
- Mid- and senior-level officers in government and grant commissions
- Junior and experienced officials interested in professional growth and promotion
- Anyone who wants to start a career in M&E Engineering
Training Methodology
This training course will utilise a variety of proven learning techniques to ensure maximum understanding, comprehension and retention of the information presented.
It is an interactive mixture of lectures, discussion, activities and practice on several management skills, providing definitions, examples and exercises designed to build skills through interaction and discussion among participants.
Course Outline
Results-Based Management and Monitoring and Evaluation
- Monitoring and Evaluation in the Context of RBM
- Outcome Monitoring (WBS structure)
- Outcome Evaluation
- Relationship Between Outcome Monitoring and Evaluation
- Importance of Partnerships to Outcome M&E
- Significance of “Soft” Assistance for Outcome M&E
- Implications for the Country Office
- Changes in M&E Tools and Processes
- Roles and Responsibilities
- Practical Challenges for Programme Managers
Planning for Monitoring and Evaluation
- Key Principles for Planning M&E
- Overall Work planning
- Minimum Requirements
- Planning M&E at the Country Programme level
- The M&E Planning Process
- Planning Monitoring
- Planning Evaluation
- Project Work planning and M&E
- Using Excel in planning
- Vision building
- Output forecasting
- Building the programme logic, connecting it to the process and testing it
- Developing and pre-testing templates for collecting quality data
- Training field champions
The Monitoring Process
- Key Principles for Monitoring
- Conduct of monitoring
- Scope of monitoring
- Monitoring responsibilities
- Selecting the right monitoring tools
- The Building Blocks: Key Monitoring Tools and Mechanisms
- A modern, results-oriented approach to monitoring
- Field visits: SMART planning concepts (future planning)
- Annual project report (APR)
- Outcome groups
- Annual review (AR)
The Evaluation Process
- Preparing for an Evaluation
- Purpose and Timing
- Involving Partners and Stakeholders
- Revisiting the Outcome
- Defining the Scope
- Drafting the Terms of Reference
- Budgeting
- Organizing the Relevant Documentation
- Forming the Evaluation Focal Team (Appointing Champions)
- Selecting the Evaluation Team
- Managing an Evaluation
- Collecting and Analysing Data
- Backstopping and Feedback
- Reporting
- Following up
- Joint Evaluations
Performance Measurement
- Rating System
- Development
- Setting weightages
- Average setting
- Selecting Indicators
- Key Steps
- Indicator Planning
- Indicator development and automated tracking
- Using Indicators
- Involving Stakeholders
- Using Results Indicators for Monitoring
Knowledge and Learning – Use of Evaluative Evidence
- Knowledge and Learning from Experience
- Definitions
- RBM and Knowledge Management
- Feedback from Monitoring and Evaluation
- The Feedback Process
- Information Tools and Methods
- Applying the Recommendations from M&E Feedback
- Publication of Evaluative Evidence and Feedback Material
M&E Key Preparation
- Holistic preparation
- Linking M&E to the Project Life Cycle
- Ethics & Standards of M&E
- M&E Framework Development and Implementation
- Project Stakeholder Matrix management
- Planning for Data Collection (Quantitative & Qualitative)
- Planning for Data Utilisation, Analysis & Storage
- Preventing data leakage (risk)
- Budgeting in M&E
- Preparing M&E reports for various stakeholders (automation of reports and analysis)
- Preparing real-time reports with minimal staff
- Developing and preparing audit reports
- Lessons learnt, recommendations and highlights