STEP UP – KPI Selection Guide

A Guide to Selecting Key Performance Indicators (KPIs)

A structured approach to defining meaningful, measurable indicators for the STEP UP project.

Use this guide to align KPIs with the DMF, GAP, PAM, and Risk Management Matrix, ensuring coherent monitoring across all project components.

Illustration: KPI Selection Funnel

Show how broad project documents and the results chain are progressively narrowed into a focused set of high-quality KPIs using the CREAM criteria (Clear, Relevant, Economic, Adequate, Monitorable).

Graphic placeholder: Funnel diagram with layers labeled “DMF/GAP/PAM/RMM”, “Results Chain”, “Brainstormed Indicators”, “CREAM Filter”, and “Final KPIs”. Icons for documents, logic model, ideas, filter, and dashboard at each stage.

The KPI selection process

Selecting the right KPIs is crucial for monitoring progress, evaluating impact, and ensuring the STEP UP project achieves its objectives. The process should be systematic and collaborative, involving the Project Management Unit (PMU), Implementing Agencies (IAs), and M&E specialists.

1. Review core project documents

Begin by thoroughly reviewing the foundational documents that outline the project's goals and expected results:

  • The Design and Monitoring Framework (DMF)
  • The Gender Action Plan (GAP)
  • The Project Administration Manual (PAM)
  • The Risk Management Matrix (RMM)

Graphic: Document Alignment Matrix

Table-style graphic with columns for DMF, GAP, PAM, RMM and rows for Impact, Outcome, Outputs. Cells show how each document contributes to KPI selection.

2. Understand the results chain

Clarify the project's logic, from high-level impact to specific outputs and activities:

  • Impact: Long-term goal (e.g., supporting Cambodia's human capital development).
  • Outcome: Direct effect of outputs (e.g., effectiveness of upper secondary education improved).
  • Outputs: Tangible products and services (e.g., teachers trained, systems developed).
  • Activities: Tasks undertaken to produce outputs (e.g., training workshops).

Graphic: Results Chain Diagram

Vertical or horizontal flowchart showing Activities → Outputs → Outcome → Impact, with example statements from STEP UP at each level.

3. Brainstorm potential indicators

For each level of the results chain—especially outcomes and outputs—brainstorm a list of potential indicators. Use the ToR tasks and responsibilities for each specialist (EdTech, STEM, GESI, etc.) to generate ideas.

4. Apply selection criteria

Vet the brainstormed list against a clear set of criteria to ensure the chosen indicators are robust and practical. The CREAM framework (Clear, Relevant, Economic, Adequate, Monitorable) is recommended for STEP UP.

5. Define baselines and targets

For each selected KPI, establish a baseline value and set realistic, time-bound targets. Baseline studies referenced in the ToR provide the starting point for measuring change.
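Progress against a KPI is often reported as the share of the baseline-to-target gap already covered. The sketch below is illustrative only (the function name and the assumption of a simple linear scale are ours, not part of the PMEF):

```python
def progress_toward_target(baseline: float, current: float, target: float) -> float:
    """Share of the baseline-to-target gap achieved so far.

    0.0 means the KPI is still at baseline; 1.0 means the target is met.
    Assumes a linear scale and caps at 1.0 so over-achievement reads as 100%.
    """
    gap = target - baseline
    if gap == 0:
        raise ValueError("Target must differ from baseline")
    return min((current - baseline) / gap, 1.0)

# Example: 40% of teachers trained at baseline, 70% now, 90% targeted
print(progress_toward_target(40, 70, 90))  # 0.6, i.e. 60% of the way to target
```

Reporting progress this way keeps indicators with very different units (percentages, counts, scores) comparable on a single dashboard.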

6. Develop a monitoring plan

Finalize the Project M&E Framework (PMEF) by detailing how each KPI will be measured, including data source, collection frequency, responsible party, and analysis method. Specify how data will be disaggregated (e.g., by gender, indigenous groups, province) as required by the GAP and ToR.
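Each PMEF entry can be thought of as one structured record per KPI. The field names below are an illustrative sketch, not the official PMEF template:

```python
from dataclasses import dataclass, field

@dataclass
class KpiRecord:
    """One row of a project M&E framework (illustrative fields only)."""
    name: str
    results_level: str            # "Impact", "Outcome", or "Output"
    baseline: float
    target: float
    data_source: str
    frequency: str                # e.g. "quarterly", "annual"
    responsible_party: str
    disaggregation: list[str] = field(default_factory=list)

# Hypothetical example drawn from the STEM capacity-development indicators
kpi = KpiRecord(
    name="Teachers completing CPD on the STEM Framework",
    results_level="Output",
    baseline=0,
    target=500,
    data_source="Training attendance records",
    frequency="quarterly",
    responsible_party="PMU M&E specialist",
    disaggregation=["gender", "province"],
)
print(kpi.name)
```

Capturing disaggregation as an explicit list makes the GAP and ToR requirements auditable: a record with an empty list is immediately visible as non-compliant.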

Graphic: KPI Monitoring Dashboard

Mock dashboard showing sample KPIs with baseline, target, current value, and disaggregation filters (gender, province, school type). Use traffic-light colors to indicate status.
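The traffic-light status in such a dashboard can be derived mechanically from progress against target. A minimal sketch, with thresholds that are purely illustrative:

```python
def status_color(baseline: float, current: float, target: float) -> str:
    """Map progress toward target to a traffic-light color (illustrative thresholds)."""
    progress = (current - baseline) / (target - baseline)
    if progress >= 0.9:
        return "green"   # on track
    if progress >= 0.5:
        return "amber"   # attention needed
    return "red"         # off track

print(status_color(40, 70, 90))  # "amber": 60% of the way to target
```

Agreeing on the thresholds with the PMU and IAs up front avoids disputes later about what counts as "on track."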

Criteria for selecting effective KPIs

Use the CREAM framework to evaluate and select the most effective indicators for the STEP UP project.

  • Clear: The indicator is precise and unambiguous; all stakeholders interpret it the same way.
    Guiding question: Will everyone understand what is being measured?
  • Relevant: The indicator is directly linked to project outputs, outcomes, or objectives.
    Guiding question: Does this indicator measure something central to the success of STEP UP?
  • Economic: Data can be collected and analyzed in a timely, cost-effective manner.
    Guiding question: Are the resources required to collect this data reasonable?
  • Adequate: The indicator provides sufficient information to assess performance without being overly complex.
    Guiding question: Does this indicator, alone or with others, give enough information to make decisions?
  • Monitorable: The indicator can be tracked regularly throughout the project lifecycle.
    Guiding question: Can we realistically and consistently collect this data over 5 years?
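One way to operationalize the vetting step is to score each candidate indicator on the five criteria (say, 1–5) and keep only those meeting a minimum on every axis. The scores and threshold below are illustrative, not prescribed:

```python
# Candidate indicators scored 1-5 on each criterion (illustrative scores)
CRITERIA = ("clear", "relevant", "economic", "adequate", "monitorable")

candidates = {
    "Teachers completing STEM CPD":
        {"clear": 5, "relevant": 5, "economic": 4, "adequate": 4, "monitorable": 5},
    "General improvement in school climate":
        {"clear": 2, "relevant": 3, "economic": 3, "adequate": 2, "monitorable": 2},
}

def passes_cream(scores: dict[str, int], threshold: int = 3) -> bool:
    """Keep an indicator only if every criterion meets the threshold."""
    return all(scores[c] >= threshold for c in CRITERIA)

selected = [name for name, scores in candidates.items() if passes_cream(scores)]
print(selected)  # ['Teachers completing STEM CPD']
```

Requiring a minimum on every axis, rather than a high average, reflects the logic of the framework: one fatal weakness (e.g., data that cannot be collected) disqualifies an otherwise strong indicator.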

Graphic: CREAM Radar Chart

Radar/spider chart with axes for Clear, Relevant, Economic, Adequate, Monitorable. Each candidate KPI is scored visually to compare strengths and weaknesses.

Potential KPI categories for STEP UP

1. STEM teaching and learning quality

  • Number of teachers and principals completing CPD on the school-based STEM Framework.
  • Percentage of target USSs implementing the updated STEM classroom/laboratory safety system.
  • Student performance scores in STEM subjects, disaggregated by gender and school type (USS/GTHS).
  • Number of Professional Learning Communities (PLCs) for STEM subjects established and active.
  • Qualitative assessment of STEM practicum improvements at the National Institute of Education (NIE).

2. Education technology (EdTech) integration

  • Number of target USSs with a fully deployed and operational OpenEMIS system.
  • Number of MoEYS, provincial, and school staff trained on OpenEMIS data collection, analysis, and reporting.
  • Number of digital education modules and e-books developed and made available.
  • Percentage of STEM teachers in target schools integrating EdTech equipment and digital resources into teaching.
  • Completion rate of facility needs assessments (e.g., electricity loads, space) for EdTech deployment.

3. Teacher and staff capacity development

  • Number of teachers, education specialists, and school managers trained, disaggregated by subject, gender, and role.
  • Post-training assessment scores or competency evaluation results.
  • Number of career guidance counsellors trained on the harmonized education and vocational training framework.
  • Percentage of trained staff reporting increased confidence and application of new skills.

4. Gender equality and social inclusion (GESI)

  • Percentage of female students enrolled in STEM subjects in target schools.
  • Number of teaching and learning materials reviewed and updated to integrate GESI principles.
  • Ratio of female-to-male participation in STEM extra-curricular programs and competitions.
  • Number of GESI-responsive STEM education practices showcased (e.g., case studies on outstanding female students).
  • All quantitative indicators disaggregated by sex and, where relevant, indigenous peoples and geographic location.

5. Educational leadership and management

  • Number of school leaders trained in educational leadership and school-based management.
  • Percentage of target schools with an approved and implemented school development plan incorporating STEM and EdTech.
  • M&E capacity assessment score for MoEYS departments and IAs, measured at baseline, midline, and endline.

6. Infrastructure and civil works

  • Number of classrooms/laboratories renovated, rehabilitated, or constructed according to design standards.
  • Percentage of civil works milestones completed on time and on budget.
  • Number of site inspections conducted with adherence to quality control and safeguards requirements.

7. Partnerships and frameworks

  • Development and approval of a strategic PPP plan for the Cambodia Science and Technology Center (CSTC).
  • Number of active partnerships established with tertiary education institutions and private sector industries.
  • Official endorsement of the harmonized MoEYS and MLVT education and vocational training framework.

Graphic: KPI Category Wheel

Circular diagram with seven segments representing each KPI category (STEM Quality, EdTech, Capacity, GESI, Leadership, Infrastructure, Partnerships). Each segment links visually to example indicators.