Create
- Create a Theory of Change: Build a theory of change showing how your program activities lead to outcomes and impact, with assumptions and risks identified.
- Create a Logframe: Build a logical framework with goals, outcomes, outputs, activities, indicators, means of verification, and assumptions.
- Create a MEL Plan: Develop a monitoring, evaluation, and learning plan with indicators, data collection methods, timelines, and responsibilities.
- Create an Evaluation Matrix: Build an evaluation matrix linking evaluation questions to criteria, indicators, data sources, and methods.
- Create SMART Indicators: Generate specific, measurable, achievable, relevant, and time-bound indicators from your program objectives.
- Create an Indicator Reference Sheet: Document how each indicator is defined, measured, disaggregated, and reported, with baselines and targets.
- Create a Learning Agenda: Develop learning questions, methods, and a plan for how your program will capture and use lessons.
- Create a Data Use Plan: Map out who needs what data, when, and how it will inform decisions at each level of your program.
- Create a Stakeholder Map: Identify and categorize stakeholders by their interest, influence, and role in your M&E system.
- Create an M&E Budget: Build a justified M&E budget with line items for data collection, staffing, analysis, and reporting.
- Create a Results Framework: Build a multi-level results chain showing how inputs lead to activities, outputs, outcomes, and impact.
- Create a Beneficiary Feedback System: Design channels for collecting, processing, and responding to feedback from the people your program serves.
- Create an Adaptive Management Plan: Develop an adaptive management plan with decision triggers, pivot criteria, and learning cycles for your program.
- Create an After-Action Review Template: Build an After-Action Review template for program activities or events, with structured reflection questions and documentation formats.
- Create a Learning Brief: Synthesize evidence from multiple M&E sources into a learning brief with actionable recommendations for program decision-makers.
- Create a Knowledge Product: Turn program data into a knowledge product (evidence brief, learning note, or practice paper) that shares lessons with internal and external audiences.
- Create a Most Significant Change Guide: Develop a Most Significant Change story collection and selection guide with interview protocols, selection criteria, and analysis methods.
- Create an Evaluation Utilization Plan: Map evaluation findings to decisions, audiences, and communication channels so that evidence drives action.
- Create a GESI Analysis Framework: Build a Gender Equality and Social Inclusion (GESI) analysis framework that integrates into your M&E system, with GESI-sensitive indicators, data collection guidance, and reporting requirements.
- Create an Environmental and Climate Screening Checklist: Build an environmental and climate screening checklist for program monitoring, identifying environmental risks, mitigation measures, and compliance requirements across program activities.
- Create Climate-Sensitive Indicators: Develop climate-sensitive indicators and a climate risk monitoring framework that track both program contributions to climate resilience and exposure to climate-related risks.
- Create a Protection Mainstreaming Monitoring Framework: Build a protection mainstreaming monitoring framework with safeguarding indicators, complaint mechanism tracking, and protection risk analysis tools for humanitarian or development programs.
- Create a Value for Money Assessment Framework: Build a Value for Money (VfM) assessment framework using the 4Es model (Economy, Efficiency, Effectiveness, Equity), with indicators, benchmarks, and reporting templates.
- Create a Locally Led M&E Plan: Develop a locally led M&E plan that centers community ownership of data, indicators, and evaluation processes, aligned with Grand Bargain localization commitments.
- Create a Beneficiary Accountability Framework: Build a beneficiary accountability framework with feedback loops, complaint mechanisms, response protocols, and community participation structures aligned with the Core Humanitarian Standard.
- Create a Contribution Analysis Framework: Build a contribution analysis framework with contribution claims, evidence assessment criteria, and systematic testing of alternative explanations for observed outcomes.
- Create a Most Significant Change Protocol: Build a complete Most Significant Change (MSC) data collection and selection protocol, including story collection guides, selection criteria, and domain definitions.
- Create an Outcome Harvesting Protocol: Build an Outcome Harvesting protocol with outcome description templates, substantiation questions, verification procedures, and analysis frameworks.
- Create a Developmental Evaluation Design: Build a Developmental Evaluation design for innovation or complex programming contexts, with real-time feedback loops, emergent learning questions, and adaptive evaluation architecture.
- Create a Process Tracing Protocol: Develop a process tracing protocol for causal inference in single-case or small-n evaluations, with hypothesis formulation, evidence tests, and Bayesian confidence updating.
- Create an Evaluability Assessment: Determine whether a program is ready for evaluation by examining program design clarity, data availability, stakeholder readiness, and feasibility constraints.
- Create a Stakeholder Engagement Plan: Build a comprehensive stakeholder engagement plan for an evaluation, including influence-interest mapping, tailored engagement methods, communication channels, and an implementation timeline.
- Create a Dissemination Strategy: Design an evaluation findings dissemination strategy with audience-specific products, communication channels, and a timeline for maximizing evaluation use.
- Create a Data Quality Audit Protocol: Build a data quality audit protocol with verification procedures, sampling strategy, scoring rubric, and corrective action framework for assessing M&E data reliability.
- Create an Enumerator Training Plan: Develop an enumerator training agenda and manual outline for a data collection exercise, covering methodology, tools, ethics, practice sessions, and field protocols.
- Create a Field Supervision Checklist: Build a field supervision checklist and data quality spot-check protocol for monitoring enumerator performance and data integrity during data collection.
- Create an Evaluation Proposal Scoring Matrix: Develop a technical scoring matrix for evaluating proposals from evaluation firms or M&E consultants, with weighted criteria, scoring scales, and consensus procedures.
- Create a Data Management Plan: Build a comprehensive data management plan covering collection protocols, storage, cleaning, analysis workflows, archiving, ethical safeguards, and data sharing agreements.
- Create a Data Storytelling Brief: Translate M&E findings into a structured data storytelling brief with a narrative arc, key messages, supporting data points, and audience-specific framing.
- Create M&E Infographic Content: Develop structured content for an M&E infographic with headline stats, supporting data points, visual hierarchy, and layout guidance organized for a designer.
- Create a Results Snapshot: Build a one-page results snapshot (factsheet) with headline indicators, progress-to-target tracking, and key achievements for a reporting period.
- Create an Evidence Brief: Synthesize findings across multiple evaluations or studies on a specific M&E topic into a structured evidence brief with strength-of-evidence ratings.
- Create an M&E Capacity Assessment Tool: Build an M&E capacity assessment tool that evaluates organizational and individual M&E competencies across key domains, with scoring rubrics and gap analysis.
- Create an M&E Capacity Building Plan: Develop an M&E capacity building plan with learning objectives, activities, timeline, responsible parties, and success indicators linked to assessment findings.
- Create an M&E Training Curriculum: Build a multi-day M&E training curriculum with detailed session plans, learning objectives, interactive exercises, and a complete materials list.
- Create a Partner M&E Assessment: Develop a partner M&E capacity assessment and strengthening plan for sub-grantees or implementing partners, with risk-based tiering and tailored support packages.
- Create an M&E Competency Framework: Build an M&E staff competency framework with job families, proficiency levels, behavioral indicators, and development pathways for career progression.
- Create a Mobile Data Collection Protocol: Develop a comprehensive mobile and digital data collection protocol covering device management, data syncing, offline procedures, quality checks, and troubleshooting.
- Create a Remote Monitoring Plan: Build a remote monitoring plan for contexts where direct field access is limited, covering phone surveys, satellite imagery, third-party monitoring, and remote verification methods.
- Create an Ethics Review / IRB Submission: Assemble an ethics review or Institutional Review Board submission package with protocol summary, risk assessment, informed consent forms, and data protection measures.
- Create an Informed Consent Form: Develop an informed consent form template for M&E data collection that meets ethical standards, is written in plain language, and covers voluntary participation, confidentiality, risks, and data use.
- Create a Data Protection Impact Assessment: Build a Data Protection Impact Assessment (DPIA) for an M&E data system covering data flows, privacy risks, legal basis, and mitigation measures aligned with GDPR and humanitarian data protection standards.
- Create a Health M&E Framework: Build a comprehensive M&E framework for a health program integrating HMIS and DHIS2 data systems, WHO and SPHERE benchmarks, and sector-standard health indicators.
- Create an Education M&E Framework: Build a comprehensive M&E framework for an education program aligned to INEE Minimum Standards, with indicators for access, retention, learning outcomes, and education system strengthening.
- Create a WASH M&E Framework: Build a comprehensive M&E framework for a WASH program using JMP service ladders, water quality monitoring protocols, and sustainability indicators.
- Create a Food Security and Livelihoods M&E Framework: Build a comprehensive M&E framework for a food security and livelihoods program using standard indicators including FCS, rCSI, HHS, HDDS, and IPC/FEWS NET severity classification.
- Create a Protection M&E Framework: Build a comprehensive M&E framework for a protection program covering child protection, GBV prevention and response, and psychosocial support, aligned to IASC and GPC standards with robust data protection protocols.
Draft
- Draft a Progress Report: Write a structured progress report for your donor with activities, results, challenges, and next steps.
- Draft an Evaluation Report: Write up evaluation findings into a professional report with methodology, results, and recommendations.
- Draft an Executive Summary: Create a concise 1-2 page summary of a longer M&E document for busy decision-makers.
- Draft a Case Study: Write a structured case study documenting a program success, challenge, or lesson learned.
- Draft Evaluation Terms of Reference: Write terms of reference for commissioning an external evaluation, including scope, questions, and methodology.
- Draft Lessons Learned: Capture and organize lessons from program implementation into a structured document for future use.
- Draft an M&E Section for a Proposal: Write the monitoring and evaluation section of a funding proposal with approach, indicators, and budget.
- Draft a Communication Plan for Findings: Plan how to share M&E findings with different audiences, including donors, staff, communities, and partners.
- Draft a Pause and Reflect Session Guide: Write a Pause and Reflect session facilitation guide with discussion prompts, data review activities, and adaptation planning.
- Draft a Lessons Learned Database Entry: Write a structured lessons learned database entry with context, lesson, evidence, and recommendations for organizational knowledge systems.
- Draft an Evaluation Inception Report: Prepare a structured evaluation inception report outline covering methodology, workplan, evaluation matrix, risk assessment, team composition, and logistics.
- Draft an M&E Scope of Work: Write a professional Scope of Work for hiring an M&E consultant or evaluation firm, including background, deliverables, qualifications, timeline, and evaluation criteria.
- Draft a Findings Presentation Deck Outline: Outline an M&E findings presentation deck with slide-by-slide content, visualization recommendations, and speaker notes.
- Draft a Stakeholder Brief: Write a concise 2-page stakeholder brief that summarizes evaluation or monitoring findings for a non-technical audience with clear recommendations.
- Draft a Community Feedback Report: Write a simplified community feedback report that presents M&E findings in plain language for beneficiary communities, with visual aids and actionable next steps.
- Draft a Training Facilitation Guide: Write a facilitation guide for an M&E training session including icebreakers, group exercises, case studies, debrief questions, and time management tips.
Review
- Review My Logframe: Get structured feedback on your logframe's logic, indicator quality, assumptions, and completeness.
- Review My MEL Plan: Get feedback on your MEL plan's completeness, feasibility, indicator quality, and alignment with program design.
- Review My Theory of Change: Get feedback on your ToC's causal logic, assumptions, evidence base, and completeness.
- Review My Indicators: Get each indicator assessed for SMART criteria, measurability, and alignment with your results framework.
- Review My Evaluation Design: Get feedback on your evaluation methodology, questions, sampling, and analysis plan before fieldwork.
- Review My Data Collection Tool: Get feedback on your survey, interview guide, or checklist for clarity, bias, skip logic, and completeness.
- Review My Proposal M&E Section: Get feedback on the M&E section of your proposal for technical quality, completeness, and donor alignment.
- Review My Report Draft: Get structured feedback on a report draft for clarity, evidence, logic, and actionable recommendations.
- Do No Harm Review of M&E Tools: Review an M&E plan or data collection tool for potential harm, ethical risks, safeguarding gaps, and unintended negative consequences for participants and communities.
- Review Ethical Compliance of a Data Collection Plan: Review an M&E data collection plan for ethical compliance against UNEG ethical guidelines, covering informed consent, do no harm, confidentiality, and vulnerable population protections.
- Review Data Quality: Review a dataset or data collection process for quality issues across five dimensions: completeness, accuracy, consistency, timeliness, and validity.
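The five data-quality dimensions in the last prompt lend themselves to simple automated spot checks before a full review. The sketch below, in plain Python, flags missing values (completeness) and out-of-range entries (validity); the field names and the age range are illustrative assumptions, not a standard.

```python
# Minimal data-quality spot checks for two of the five dimensions:
# completeness (no missing values) and validity (values in range).
# Field names ("age", "district") and the 0-120 range are illustrative.

def check_quality(records, required_fields, valid_ranges):
    """Return per-record quality issues, keyed by record index."""
    issues = {}
    for i, rec in enumerate(records):
        problems = []
        for field in required_fields:  # completeness check
            if rec.get(field) in (None, ""):
                problems.append(f"missing {field}")
        for field, (lo, hi) in valid_ranges.items():  # validity check
            value = rec.get(field)
            if isinstance(value, (int, float)) and not lo <= value <= hi:
                problems.append(f"{field}={value} outside [{lo}, {hi}]")
        if problems:
            issues[i] = problems
    return issues

records = [
    {"age": 34, "district": "North"},
    {"age": 212, "district": ""},  # invalid age, missing district
]
print(check_quality(records, ["age", "district"], {"age": (0, 120)}))
```

Checks for the remaining dimensions (accuracy, consistency, timeliness) usually need reference data or timestamps, so they are harder to sketch generically.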
Analyze
- Analyze Survey Data: Get help calculating frequencies, means, cross-tabulations, and interpreting patterns in your survey results.
- Analyze Interview Transcripts: Get help identifying themes, extracting key quotes, and summarizing findings from interview or FGD transcripts.
- Code Qualitative Data: Create a codebook and systematically code interview or focus group data into themes and sub-themes.
- Summarize a Long Report: Get a structured summary of a lengthy M&E document, covering key findings, recommendations, and action items.
- Extract Key Findings from a Document: Pull out the main findings, data points, and recommendations from a report or evaluation.
- Compare Baseline vs Endline Results: Analyze changes between baseline and endline data, identify significant shifts, and interpret what they mean.
- Identify Patterns Across Datasets: Find trends, outliers, and patterns across multiple data sources or reporting periods.
- Conduct a Gender Analysis: Analyze your M&E data through a gender lens: disaggregation, differential impacts, and inclusion gaps.
- Analyze a Pivot Decision: Analyze program data to recommend whether to continue, adapt, or stop an intervention based on evidence and decision criteria.
- Intersectional Data Analysis: Analyze program data through an intersectionality lens, examining how overlapping dimensions of identity (gender, age, disability, location, ethnicity) create compounding patterns of inclusion or exclusion.
- Conflict Sensitivity Analysis of Monitoring Data: Analyze program monitoring data through a conflict sensitivity and Do No Harm lens, identifying how the program may be interacting with conflict dynamics and recommending adjustments.
- Cost-Effectiveness Analysis Across Interventions: Analyze program cost-effectiveness by comparing unit costs across interventions, sites, or time periods, with benchmarking against sector standards and actionable efficiency recommendations.
- Analyze Qualitative Data with Structured Coding: Analyze qualitative data using a structured coding framework that combines deductive codes from your theory of change with inductive codes emerging from the data, culminating in thematic analysis.
- Analyze Data Using Realist Evaluation: Analyze program data using a realist evaluation approach to develop and refine Context-Mechanism-Outcome (CMO) configurations that explain what works, for whom, and under what circumstances.
- Analyze Chart Selection for M&E Data: Analyze a dataset description and recommend the best chart types for each variable or comparison, with justification grounded in data visualization best practices.
- Analyze Health Routine Data: Analyze HMIS or DHIS2 routine health data to identify trends, coverage gaps, performance outliers, and actionable recommendations for program improvement.
- Analyze Food Security Monitoring Data: Analyze food security monitoring data to calculate severity classifications, identify trends and geographic hotspots, and generate response recommendations using standard indicators.
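Several of the prompts above (survey frequencies, cross-tabulations) reduce to a few lines of code once responses are in tabular form. A minimal plain-Python sketch with made-up response data; in practice you would more likely reach for a spreadsheet or pandas, and the variable names here are hypothetical.

```python
from collections import Counter

# Illustrative survey responses; variable names are made up.
responses = [
    {"sex": "F", "satisfied": "yes"},
    {"sex": "F", "satisfied": "no"},
    {"sex": "M", "satisfied": "yes"},
    {"sex": "F", "satisfied": "yes"},
]

# Frequencies for a single variable.
freq = Counter(r["satisfied"] for r in responses)

# Simple cross-tabulation: counts per (sex, satisfied) pair.
crosstab = Counter((r["sex"], r["satisfied"]) for r in responses)

print(freq)                    # Counter({'yes': 3, 'no': 1})
print(crosstab[("F", "yes")])  # 2
```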
Design
- Design a Household Survey: Create a structured survey questionnaire for household or beneficiary data collection.
- Design an Interview Guide: Create a semi-structured interview guide with opening, probing, and closing questions.
- Design a Focus Group Guide: Create a focus group discussion guide with icebreakers, key questions, probes, and facilitation notes.
- Design an Observation Checklist: Create a structured checklist for field observations with clear criteria and rating scales.
- Design a Sampling Strategy: Determine how to select your study sample: method, size, stratification, and practical considerations.
- Design a Baseline Study: Plan your program's baseline data collection : what to measure, how, and what tools you need.
- Design a Rapid Assessment: Plan a quick assessment to get actionable data within 1-2 weeks for urgent decisions.
- Design a Mid-Term Evaluation: Plan a mid-term evaluation to check program progress, relevance, and early outcomes.
- Design a Learning Event: Develop a learning event agenda (workshop, webinar, or community of practice session) based on M&E findings to promote evidence-based practice.
- Design a Disability-Inclusive Survey: Build disability-inclusive data collection tools using the Washington Group Questions, with guidance on ethical protocols, accessible survey administration, and disability-disaggregated analysis.
- Design an Outcome Mapping Framework: Build an Outcome Mapping framework with boundary partners, progress markers, outcome journals, and strategy maps following the IDRC methodology.
- Design a Quasi-Experimental Evaluation: Plan a quasi-experimental evaluation with matching strategy, comparison group selection, difference-in-differences analysis plan, and threats-to-validity assessment.
- Design a Participatory Evaluation: Develop a participatory evaluation approach that engages beneficiaries and stakeholders as co-evaluators, with inclusive methods, power analysis, and capacity building components.
- Design a Mixed-Methods Evaluation: Plan a mixed-methods evaluation with integration points, sequencing decisions, a methods matrix, and a plan for combining quantitative and qualitative strands.
- Design a Monitoring Dashboard Brief: Build a monitoring dashboard specification with KPIs, visualization types, update frequencies, user roles, and data flow architecture.
- Design an M&E Mentoring Framework: Develop an M&E mentoring and coaching framework with competency milestones, structured session plans, progress tracking tools, and matching criteria.
- Design an M&E Community of Practice: Build a Community of Practice structure for M&E practitioners with meeting cadence, knowledge sharing protocols, engagement strategies, and a sustainability plan.
- Design a KoboToolbox/ODK Survey Form: Build a structured KoboToolbox or ODK-compatible survey form with question types, skip logic, validation rules, calculation fields, and deployment guidance.
- Design a Real-Time Monitoring System: Develop a real-time monitoring system with automated alerts, threshold triggers, escalation protocols, and dashboard specifications for timely program decision-making.
- Design a Health Household Survey: Build a health household survey instrument using validated question modules for immunization, nutrition, WASH, and maternal health, with sampling strategy and field protocols.
- Design a Learning Assessment Tool: Build a learning assessment tool for literacy and numeracy with grade-level benchmarks, scoring rubrics, administration protocols, and analysis specifications aligned to EGRA/EGMA methodology.
- Design a WASH Household Survey: Build a WASH household survey using JMP and WHO standard question modules for water service levels, sanitation access, hygiene practices, and water quality perceptions.
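Sampling questions like those in "Design a Sampling Strategy" or the household survey prompts often start from Cochran's formula for estimating a proportion. A sketch assuming simple random sampling, with a finite-population correction; the catchment size in the example is illustrative.

```python
import math

def cochran_sample_size(population, margin=0.05, confidence_z=1.96, p=0.5):
    """Sample size for estimating a proportion under simple random sampling.

    Cochran's formula n0 = z^2 * p * (1 - p) / e^2, then a
    finite-population correction. p = 0.5 is the conservative
    (maximum-variance) default when the true proportion is unknown.
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# e.g. a 2,000-household catchment at 95% confidence, +/-5% margin:
print(cochran_sample_size(2000))  # → 323
```

Stratification, clustering (design effects), and expected non-response all push the required sample up from this baseline, so treat the result as a floor, not a budget line.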
Learn
- Explain an M&E Concept: Get a clear, jargon-free explanation of any M&E concept, from theory of change to contribution analysis.
- Compare Evaluation Approaches: Understand the differences between evaluation methodologies and when to use each one.
- Suggest the Right Methodology: Describe your program and constraints, and get recommendations for the best evaluation approach.
- Help Me Choose Indicators: Describe your program objectives and get suggestions for appropriate indicators to track progress.
- Walk Me Through Data Analysis: Get step-by-step guidance on how to analyze your specific type of M&E data.
- Help Me Understand Contribution Analysis: Walk through the six steps of contribution analysis to build a credible case for your program's contribution to observed changes, without needing a control group.
- Choose the Right Evaluation Approach: Compare evaluation approaches and get a recommendation for which method best fits your program context, resources, and evidence needs.
AI Prompt Library
Expert AI prompts for M&E professionals. Copy, customize, and paste.
Create
Build frameworks, plans, and tools from scratch
Draft
Write documents and reports
Review
Get structured feedback on your work
Analyze
Make sense of your data
Design
Build data collection tools and studies
Learn
Understand M&E concepts and choose approaches
Scoring & Review
Rubrics for scoring and reviewing AI output quality
Logframe Quality Assessment
Score a logframe across five dimensions using AI. Paste your logframe into the prompt to get a structured quality assessment with dimension scores, evidence citations, and priority revisions.
MEL Plan Review
Score a Monitoring, Evaluation, and Learning plan across five dimensions using AI. Paste your MEL plan to get a structured quality assessment with dimension scores, evidence citations, and priority revisions.
Theory of Change Assessment
Score a Theory of Change across five dimensions using AI. Paste your ToC narrative or diagram description to get a structured quality assessment with dimension scores, evidence citations, and priority revisions.
Survey Instrument Review
Use AI to review a survey questionnaire before field deployment. Get dimension scores, flagged questions with corrections, and a ready-to-implement revision list in minutes.
Evaluation Report Scoring
Use AI to score an evaluation report across five quality dimensions. Get a structured verdict with evidence citations and a revision brief before accepting or publishing the report.
Donor Progress Report Review
Score a donor progress report across five dimensions using AI. Paste your narrative report to get a structured quality assessment with dimension scores, evidence citations, and priority revisions before submission.
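Each rubric above rolls dimension scores up into an overall verdict. If you weight dimensions differently, the overall score is just a weighted average; a minimal sketch with hypothetical dimension names and weights:

```python
def weighted_score(scores, weights):
    """Weighted average of rubric dimension scores (weights must sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[d] * w for d, w in weights.items())

# Hypothetical five-dimension assessment on a 1-5 scale;
# dimension names and weights are illustrative, not from any rubric above.
scores = {"logic": 4, "indicators": 3, "assumptions": 2, "targets": 4, "coherence": 5}
weights = {"logic": 0.3, "indicators": 0.25, "assumptions": 0.15, "targets": 0.15, "coherence": 0.15}
print(round(weighted_score(scores, weights), 2))  # → 3.6
```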