Staff and structure

At a glance

Core function

Strategic Planning, Monitoring, and Evaluation is an advisory unit providing technical support to ensure that all of our work sustains meaningful impacts across the region.

Farid Ahmad

Head of Unit

About the Strategic Planning, Monitoring, and Evaluation Unit

Our Strategic Planning, Monitoring, and Evaluation (SPM&E) Unit focuses on performance and outcomes on the ground. Through participatory strategic planning, review, and evaluation processes, SPM&E provides guidance, advice, and technical support to integrate results-based planning, monitoring, and evaluation across all of our work so that together we can produce impactful results. SPM&E also ensures timely internal and external quality evaluations and impact assessments of institutional and programmatic achievements against set objectives and outcomes.

Download Reports

  • Fourth Quinquennial Review
  • Response to Quinquennial Review recommendations
  • External evaluation of our cryosphere programme
  • External evaluation of our HICAP initiative and atmosphere programme
  • Impact assessment of our apiculture project in Chitral, Pakistan
  • Impact assessment of allo value chain in Nepal
  • Monitoring and evaluation framework for gender integration

Our monitoring and evaluation approach

Impact Pathways Analysis

Our impact as a regional knowledge organisation is achieved through the utilisation and upscaling of knowledge generated with our vast network of working partners, and through knowledge sharing. Fundamentally important to us are three vital ‘I’s: Innovation, Integration, and Impact. Not everything can be known about how change occurs, yet it is important to have and share a vision of potential impact from the start. This is a fundamental tenet of innovation with regard to impact pathways.

Using a theory-based approach, the main impact pathways can be identified, making the theory of change explicit in terms of the different actors, users of outputs, and outcomes that lead to clear development impacts. With this in mind, we work on the twin elements of impact pathways: validating our contribution to changes in poverty, wellbeing, physical and social vulnerabilities, and ecosystem services in the HKH region; and improving understanding of the processes that enable change in complex biophysical and socio-cultural contexts. This approach provides us with improved evidence of impact within our strategic and results framework. The focus is to facilitate stakeholder accountability, support learning, and inform and influence policy.

Results-based monitoring and evaluation plans

Each of our regional programmes and initiatives has developed a results-based log-frame with a common set of specific, measurable, attainable, relevant, and time-bound indicators, as well as reporting mechanisms that satisfy diverse development partners and assist implementing partners in measuring, documenting, and reporting. Indicators incorporate gender equality, environmental sustainability, and other emerging cross-cutting issues such as governance, poverty, and economic analysis. Log-frame indicators are set to ensure measurement of both quantitative and qualitative outputs and outcomes. Each initiative develops a results-based monitoring and evaluation plan that serves as an important management instrument for implementing partners to track and report results. These initiative-level monitoring and evaluation plans consider detailed pathways to impact. To evaluate regional programme achievements, information collected at the initiative level is compiled and analysed systematically. Scientific quality, product use, and the upscaling of ICIMOD-generated knowledge are given special attention in monitoring and evaluation.
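As a purely illustrative sketch (the field names and figures below are hypothetical, not drawn from any actual ICIMOD log-frame), a results-based indicator can be thought of as a small record with a baseline, a target, and cross-cutting tags, which makes it straightforward to compile initiative-level results for programme-level reporting:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One log-frame indicator with a baseline, a target, and cross-cutting tags."""
    name: str
    unit: str                       # e.g. "households", "institutions"
    baseline: float
    target: float
    achieved: float = 0.0
    cross_cutting: list = field(default_factory=list)  # e.g. ["gender", "governance"]

    def progress(self) -> float:
        """Share of the target achieved, measured from the baseline."""
        span = self.target - self.baseline
        return 0.0 if span == 0 else (self.achieved - self.baseline) / span

# Hypothetical initiative-level indicators rolled up for programme reporting
indicators = [
    Indicator("Households adopting improved practices", "households", 0, 500, 320, ["gender"]),
    Indicator("Partner institutions using knowledge products", "institutions", 2, 12, 9),
]
for ind in indicators:
    print(f"{ind.name}: {ind.progress():.0%} of target")
```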

Our programme impact

Our mission and vision are focused on making a positive difference in the wellbeing of people and the environment through impact on poverty, people’s vulnerabilities, and ecosystem services. Impact is a central aim for us, and we work together with partners, seeking to benefit the women, men, and children of the HKH region. Our institutional theory of change involves a long chain of results for reaching all of our beneficiaries; we apply impact pathway analysis in the monitoring and evaluation of regional programmes (RPs) and initiatives, conduct external evaluations, and measure the impact of selected initiatives and projects.

We think about our reach at three levels:

  • Immediate reach – individuals reached directly through pilot and action research, and implementing partners directly involved in ICIMOD programmes
  • Intermediate reach – those reached by our programmes’ goods and services through upscaling
  • Ultimate reach – community members reached directly and indirectly through pilot or action research, spillover, and upscaling

We think of beneficiaries in two ways:

  • Direct beneficiaries are those who’ve benefitted from our programmes and can be verified physically (e.g. community members who’ve participated in pilot and action research, and institutions that have participated in implementing our initiatives). Which people benefit, and how, differs across RPs and initiatives depending on the nature of the programme and the development pathway followed.
  • Indirect beneficiaries are individuals and institutions who’ve benefitted from our programmes but cannot be easily verified physically (e.g. community members reached through upscaling, and institutions not directly working in ICIMOD programmes but using our products and services). A simple way of tagging these categories is sketched below.
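Purely as an illustrative sketch (the record and labels below are hypothetical, not an actual ICIMOD data model), the reach levels and beneficiary categories described above could be captured as simple enumerations used to tag monitoring records:

```python
from enum import Enum

class Reach(Enum):
    IMMEDIATE = "immediate"        # reached directly through pilot/action research and implementing partners
    INTERMEDIATE = "intermediate"  # reached by programme goods and services through upscaling
    ULTIMATE = "ultimate"          # reached directly or indirectly, including spillover and upscaling

class Beneficiary(Enum):
    DIRECT = "direct"      # physically verifiable, e.g. participants in pilot and action research
    INDIRECT = "indirect"  # not easily verifiable, e.g. users of products and services via upscaling

# Hypothetical monitoring record tagged with both classifications
record = {"partner": "district cooperative", "reach": Reach.IMMEDIATE, "beneficiary": Beneficiary.DIRECT}
print(record["reach"].value, record["beneficiary"].value)
```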

Our result linkages

Guided by our vision, we work on the multiple elements of impact pathways: validating our contributions to changes in regional poverty, wellbeing, physical and social vulnerabilities, and ecosystem services, and improving understanding of the processes that enable change in complex biophysical and socio-cultural contexts.

Multiple pathways are critical to achieving the expected outcomes and impacts defined in the strategic results framework: science-policy-practice impact pathways, regional cooperation pathways, and capacity building and communication impact pathways.

Evaluation principles

Our evaluation function serves both learning and accountability purposes for our stakeholders. Impact pathways developed for each regional programme and initiative are the main basis for evaluating our programmes. At the institutional level, internal and external mid-term and terminal evaluations, as well as external quinquennial reviews, are conducted based on terms of reference approved by our Board of Governors and the ICIMOD Support Group to assess the programmes in general and our overall performance in terms of relevance, efficiency, effectiveness, impact, and sustainability. Principles, methods, and tools used for evaluation follow international standards. Internal evaluations and impact studies mainly focus on learning from our programmes, while external evaluations serve the multiple purposes of independence, transparency, accountability, and learning.

The credibility of evaluations is ensured through independent expert evaluators and a transparent evaluation process. Each evaluation examines both intended and unintended results, positive and negative impacts, and external factors affecting the programmes, such as changes in the policy environment and in general economic and financial conditions. The evaluations reflect the different interests and needs of the many parties involved in development cooperation by also including perspectives on gender, governance, economic analysis, and poverty.

Programme evaluation is mandatory for all of our regional programmes and initiatives. All initiatives undergo mid-term and final evaluations in line with specific donor requirements.

Rigorous impact evaluations are given the highest priority. Impact evaluation designs are developed for each initiative according to the nature of its mandate. Randomised evaluation methodologies are applied wherever appropriate given an initiative’s population coverage, and both experimental and non-experimental evaluation methodologies are developed and applied.
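As a minimal sketch using invented data (community names and outcome values below are simulated, not taken from any ICIMOD study), a randomised evaluation design amounts to randomly assigning eligible communities to treatment and comparison groups and then comparing mean outcomes:

```python
import random
import statistics

# Hypothetical eligible communities and a simulated post-intervention outcome index
communities = [f"community_{i}" for i in range(20)]

random.seed(42)
random.shuffle(communities)
treatment, control = communities[:10], communities[10:]

# In practice outcomes come from baseline and endline surveys; here they are simulated
outcomes = {c: random.gauss(1.2 if c in treatment else 1.0, 0.3) for c in communities}

effect = (statistics.mean(outcomes[c] for c in treatment)
          - statistics.mean(outcomes[c] for c in control))
print(f"Estimated average treatment effect: {effect:.2f}")
```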

Dissemination of results and partner feedback

The purpose of monitoring and evaluation is both to learn and to provide a mechanism of accountability to various stakeholders. Our evaluation processes are transparent, with results made widely available. Learning from programme monitoring and evaluation is communicated both internally and externally. Evaluation findings are reported in a timely fashion to donors and the Board of Governors, and opportunities to disseminate learning to other stakeholders are continually reviewed, identified, and taken up.

Feedback mechanisms are built into the cycle of programme planning, implementation, monitoring, and evaluation. Review and planning meetings organised every four months help consolidate learning from each regional programme, and partners also regularly report results and learning.
