Educate! prioritizes ongoing innovation and learning in our core program model. One of our key learning mechanisms is the Build-Measure-Learn (BML) loop, a feedback loop in which we Build a new or different element into program design, Measure its impact, and Learn from the results. BML loops enable Educate! to test programmatic and operational components of our model and to make rapid adjustments based on the results of each loop. For example, a BML loop showed that visual aids were more effective in the classroom than writing on the chalkboard; after analyzing the results, we quickly integrated this lesson and brought visual aids into all of the classrooms where we work.
SMS and Smartphone Monitoring
Educate! uses SMS (text messaging) and smartphone surveys as key components of our M&E system. In 2014 we launched a mobile money and telecommunications system that allows for rapid turnaround, robust program management, and cost-effective data collection via SMS and smartphones. These technologies allow Educate! to gather regular information from remote parts of the country at low cost. The metrics that field staff provide through this system populate a web-based dashboard that updates in real time and allows Educate! to monitor our performance at scale. For example, we can track student businesses started on a trimesterly basis and adjust program quality as necessary if performance falls below target.
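The kind of roll-up behind such a dashboard can be sketched as follows. This is a hypothetical illustration, not Educate!'s actual system: the field names (`trimester`, `businesses_started`), the figures, and the target are all invented for the example.

```python
from collections import defaultdict

# Hypothetical sketch of dashboard aggregation: field staff submit
# per-school reports via SMS, and the dashboard rolls them up by
# trimester and flags shortfalls against a target. All field names
# and numbers here are illustrative only.

def businesses_by_trimester(reports):
    """Sum student businesses started, grouped by trimester."""
    totals = defaultdict(int)
    for r in reports:
        totals[r["trimester"]] += r["businesses_started"]
    return dict(totals)

def flag_shortfalls(totals, target):
    """Return trimesters whose totals fall below the target."""
    return [t for t, n in sorted(totals.items()) if n < target]

reports = [
    {"school": "A", "trimester": 1, "businesses_started": 12},
    {"school": "B", "trimester": 1, "businesses_started": 9},
    {"school": "A", "trimester": 2, "businesses_started": 7},
]
totals = businesses_by_trimester(reports)   # {1: 21, 2: 7}
print(flag_shortfalls(totals, target=15))   # [2]
```

In practice the reports would arrive continuously from the SMS gateway rather than as a static list, but the aggregation step is the same.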
Cross-Cutting Designs
Educate! uses experiments to inform and improve program design and implementation. A cross-cutting design is a mini-randomized controlled trial that allows us to rigorously test the impact of new program design components. For example, in 2014 we ran a cross-cutting design to test the effectiveness of group mentorship sessions, providing some of our Scholars with lessons only and others with lessons plus group mentorship. Our M&E team analyzed the data and reported to the Programs and Design & Training teams, who used the results to determine how group mentorship sessions would be implemented in the future.
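The core mechanics of a mini-RCT like this can be sketched in a few lines: randomly assign participants to a lessons-only arm and a lessons-plus-mentorship arm, then compare mean outcomes between arms. This is an illustrative sketch only; the function names, the outcome data, and the seed are invented, and a real analysis would also include standard errors and significance testing.

```python
import random
import statistics

# Illustrative sketch of a "cross-cutting design" (mini-RCT):
# randomly split Scholars into control (lessons only) and treatment
# (lessons + group mentorship), then compute the treatment-minus-
# control difference in mean outcomes. All data here is simulated.

def assign_arms(scholar_ids, seed=0):
    """Randomly split scholars into control and treatment arms."""
    rng = random.Random(seed)
    ids = list(scholar_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return ids[:half], ids[half:]   # (control, treatment)

def difference_in_means(outcomes, control, treatment):
    """Mean outcome difference: treatment minus control."""
    return (statistics.mean(outcomes[s] for s in treatment)
            - statistics.mean(outcomes[s] for s in control))

control, treatment = assign_arms(range(100), seed=42)
# Outcomes would come from follow-up surveys; simulated here with a
# fixed +0.3 effect for the treatment arm.
treated = set(treatment)
outcomes = {s: 1.0 + (0.3 if s in treated else 0.0) for s in range(100)}
print(round(difference_in_means(outcomes, control, treatment), 2))  # 0.3
```

Random assignment is what makes the comparison credible: because arms are statistically similar before the intervention, the difference in means can be attributed to the mentorship sessions rather than to pre-existing differences between groups.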