M&E + Design: Innovation Through Collaboration


Just six months after COVID-19 closed schools, Educate! launched its first distance learning model in Uganda, and 3,400 young people enrolled. We’re now remotely reaching just as many Ugandan youth as we were in schools pre-pandemic, with over 40,000 enrolled. This rapid adaptation and scale are made possible by our ongoing investment in collaborative research and development (R&D).  

Did you know that, on average, less than 1% of annual revenue in international development is spent on R&D, compared to as much as 20% at companies like Google and Intel?

Educate! believes R&D is critical to ensuring we continue to scale the most effective solutions, and these investments have been central to our success. This commitment to innovation is a key element of what we believe it takes to build a disruptive non-profit social enterprise.

Two teams drive R&D at Educate!: Design and Monitoring & Evaluation (M&E). They have developed a systematized partnership, and their collaboration allows us to build innovation directly into our solutions. M&E Manager Brian Okwir notes that this partnership enables us to closely monitor all models as they grow and change, which helps us to quickly discover and overcome any barriers to youth impact. 

Design and M&E continuously collaborate throughout the model development and pilot process:

1. Setup: Collaborating to Build and Pilot

At the Setup stage of a brand-new model, the Design and M&E Teams come together to consider how the new model might effectively build the critical skills proven to impact young people's lives in the long term.


The Design Team first considers how to create an engaging and effective learner experience and then works to refine it. They begin by asking themselves questions about the design elements that drive youth participation and engagement.

For example: What keeps some youth from enrolling in the model? Is the curriculum being delivered and received as designed?

The M&E Team produces a plan to track the implementation of the model and its impact on youth. They focus on assessing how often youth participate in the pilot's learning activities, as well as how effective those activities are.

For example: What skills are participants developing in those experiences, and how are those skills impacting their lives?


To answer these initial questions and to be able to measure the pilot’s success when it ends, the teams jointly develop key performance indicators for each aspect of the model and find a way to gather that data. For example, to better understand if a particular lesson within the curriculum is effective, the teams might ask youth if they’ve practiced a specific skill from that lesson within their small business or community project and how it might have impacted their work. 

Design and M&E are thought partners. Design comes up with ideas for how the model might be developed and delivered, and these are transformed into design strategies. M&E develops tools and systems to monitor and evaluate the implementation of the model, and shares information that tells the teams whether the model is working as designed.
— Aloysie Niyoyita, Regional Performance Metrics Manager

2. Integration: Building a Data-Driven Culture

At the Integration stage, Design and M&E meet with additional teams to talk through a pilot's design and goals, and how to assess its performance as the pilot launches and runs.

This performance indicator data is collected regularly and displayed in online dashboards that all teams can access to see the model's progress in real time. Design and M&E's role is to ensure others understand the dashboards' metrics and how those metrics relate to each staff member's work. For example, Design and M&E train the Program Implementation Team to interpret and effectively use youth participation data to better manage and support the staff facilitating the learning experience as the program runs.
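As an illustration of the kind of headline metric such a dashboard might surface, here is a minimal sketch in Python. The attendance records, field names, and the participation-rate calculation are all hypothetical, invented for this example rather than drawn from Educate!'s actual systems.

```python
from collections import defaultdict

# Hypothetical attendance records: (youth_id, session_id, attended).
records = [
    ("y001", "s01", True), ("y001", "s02", False),
    ("y002", "s01", True), ("y002", "s02", True),
    ("y003", "s01", False), ("y003", "s02", True),
]

def participation_rates(records):
    """Compute each participant's attendance rate across sessions."""
    attended = defaultdict(int)
    total = defaultdict(int)
    for youth_id, _session_id, was_present in records:
        total[youth_id] += 1
        attended[youth_id] += was_present  # bool counts as 0 or 1
    return {y: attended[y] / total[y] for y in total}

rates = participation_rates(records)
# A dashboard might display the cohort average as a headline indicator.
print(f"Average participation rate: {sum(rates.values()) / len(rates):.0%}")
```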

Samson Mbugua, Head of Performance Metrics, finds this step critical to Educate!’s alignment and culture:

Design and M&E work to establish the causal relationship between activities and impact, documenting this in a roadmap based on our Theory of Change. Within the broader team, this roadmap is used to center the teams on working toward impact in a measurable manner, creating a data-driven culture.
— Samson Mbugua, Head of Performance Metrics

3. Iteration: Making Evidence-Based Improvements to Strengthen Impact

At the Iteration stage, Design and M&E work together to answer the questions that guided their decisions during Setup. Each team produces a report highlighting where the model met its objectives and where there is room for improvement. If, for example, the Design Team tested a gamification strategy to increase youth engagement, it works with M&E to see whether the participants given a point system were motivated to complete more activities than their counterparts without this game element (see the sketch below).
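As a rough illustration of that comparison, the sketch below uses Welch's t-test to check whether a gamified group completed more activities than a control group. The activity counts are invented for the example, and the use of scipy here is an assumption for illustration, not a description of Educate!'s actual analysis tooling.

```python
from scipy import stats

# Hypothetical counts of learning activities completed per participant.
gamified = [9, 11, 8, 12, 10, 13, 9, 11]   # group with the point system
control = [7, 8, 6, 9, 8, 7, 10, 8]        # group without the game element

# Welch's t-test: did the gamified group complete significantly more activities?
result = stats.ttest_ind(gamified, control, equal_var=False)

print(f"Gamified mean: {sum(gamified) / len(gamified):.1f}")
print(f"Control mean:  {sum(control) / len(control):.1f}")
print(f"p-value:       {result.pvalue:.3f}")
```

A small p-value would suggest the difference in participation is unlikely to be due to chance, which is the kind of evidence that informs the design changes proposed next.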

Finally, Design and M&E propose design changes, brainstorm how to implement them effectively, and consider how to measure their impact. Then, the cycle begins all over again. Thanks to this strong partnership and a steadfast investment in R&D, Brian Okwir notes:

Educate! is able to build more complex models and systems with the confidence that they are working well, in a way that allows us to continuously learn and improve.
— Brian Okwir, Monitoring & Evaluation Manager