Reinvent the Wheel, Not the Flat Tire: 5 Takeaways from #MERLTech


By Kevin Flanagan and Yuting Liao

A few weeks ago, my colleague Yuting Liao and I had the opportunity to attend MERL Tech—an industry conference of sorts designed to bring together M&E practitioners, researchers, technologists, and development professionals—on behalf of the Monitoring, Evaluation, and Learning (MEL) team at the National Democratic Institute (NDI).

At NDI, the MEL team is always on the lookout for innovative M&E practices that can be effectively applied to the democracy and governance sector, and this event seemed like an excellent opportunity to explore the “big ideas” and take part in a larger discussion: what can information and communication technologies (ICTs) offer monitoring, evaluation, and learning work as the pressure to integrate ICTs into many aspects of development programming continues to rise?

With nearly thirty presentations, the event gave us ample opportunity to sit back and take in the perspectives of experts, as well as to contribute meaningfully to roundtable discussions and collaborative brainstorming activities. Here are our five takeaways:

1. More data does not necessarily mean more learning

ICT can make data collection easier; however, it’s crucial to ask: is this the data we need? “Big data” is enticing, and a common mistake of the novice researcher is to collect as much data as possible. But will that data answer your evaluation questions, or will it simply be distracting? While collecting larger volumes of data can certainly surface unexpected observations, data collection that is not strategically tied to your evaluation questions does not necessarily lead to better learning. Quality matters more than quantity.

2. ICT can increase the level of risk for the subjects of the evaluation

Data hacks happen, so start by being scared. Whether we want to admit it or not, ICT implementations introduce additional risks to M&E work, particularly when it comes to privacy and data security. And yet, too often M&E practitioners don’t address these risks until after a breach happens. Worry about this in advance and create a threat model to assess assets, risks, and vulnerabilities.
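
A threat model doesn’t have to be elaborate to be useful. As a minimal sketch (the assets, threats, and 1–5 scores below are our own hypothetical examples, not anything prescribed at the conference), even a simple structured inventory, ranked by likelihood times impact, makes it clear which risk to mitigate first:

```python
# Minimal threat-model inventory for an M&E data system.
# All assets, threats, vulnerabilities, and 1-5 scores are hypothetical.
threats = [
    {"asset": "survey responses (PII)", "threat": "database breach",
     "vulnerability": "unencrypted storage", "likelihood": 3, "impact": 5},
    {"asset": "enumerator tablets", "threat": "device theft",
     "vulnerability": "no PIN or remote wipe", "likelihood": 4, "impact": 4},
    {"asset": "shared dashboards", "threat": "over-broad access",
     "vulnerability": "no role-based permissions", "likelihood": 3, "impact": 3},
]

# Rank by a simple likelihood x impact score: the highest-scoring
# risks are the ones to worry about (and budget for) in advance.
for t in sorted(threats, key=lambda t: t["likelihood"] * t["impact"], reverse=True):
    score = t["likelihood"] * t["impact"]
    print(f"{score:>2}  {t['asset']}: {t['threat']} (via {t['vulnerability']})")
```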

3. Be a data-led organization, not just data-driven

While ICT does help improve data accuracy, organizations that embrace a “data-led” mentality empower their people to better understand data and to incorporate it into their decision-making processes. Successful learning initiatives rely on sound interpretation and analysis of data, and ICT for evaluation is useless without capable analysts and sector experts.

4. ICT can expand your sample size, but be mindful of sample bias

When collecting data, ICTs can expand the reach of your evaluation efforts, creating opportunities to capture data beyond the traditional “beneficiaries” of a program. However, the “digital divide” can introduce sample bias: your results may be valid only for the segments of the population with digital access.
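
One standard way to compensate (our illustration of a common survey technique, not a method endorsed by any particular presenter) is post-stratification: reweight respondents so each group counts in proportion to its share of the real population rather than its share of the online sample. A minimal sketch with made-up figures:

```python
# Post-stratification sketch: reweight a digitally collected sample so each
# stratum matches its known population share. All numbers are hypothetical.
population_share = {"urban": 0.45, "rural": 0.55}  # e.g., from census data
sample_counts = {"urban": 720, "rural": 280}       # who actually responded online

total = sum(sample_counts.values())
weights = {
    stratum: population_share[stratum] / (sample_counts[stratum] / total)
    for stratum in population_share
}
# Rural respondents, underrepresented online, get a weight above 1,
# so weighted estimates better reflect the whole population.
print(weights)  # urban ~0.625, rural ~1.96
```

Note that weights can only correct for groups that appear in the sample at all; people with no digital access whatsoever still require offline data collection.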

5. There’s no ICT “quick-fix” to improve monitoring & evaluation

While it’s possible to achieve a high level of methodological rigor through carefully designed ICT studies, it’s not always easy to do so: such studies are often technically complex, expensive, and time-consuming. Most importantly, effective ICT use is built on sound monitoring & evaluation strategies, and incorporating ICTs into M&E requires long-term institutional commitment and investment in evaluation capacity.

Despite the wide breadth of content, there was a common theme: “It’s ok to reinvent the wheel, not the flat tire.” These words, spoken by Susan Davis during a five-minute “lightning” presentation, struck an unexpected chord with the audience, attendees and presenters alike. Whether they are words of comfort for the tech-timid or caution for the tech-tenacious, Davis pointed us all to an indisputable fact: it’s okay to look to new technologies to address old problems in development, as long as we are aware that any new process, tool, or approach has just as much potential to fall flat as its predecessors did. The successful integration of M&E and ICT relies entirely on sound monitoring and evaluation strategies and realistic expectations.

 

Kevin Flanagan
Kevin Flanagan is a TechChange alum of the Technology for Monitoring and Evaluation course. He is a learning technologist on the Monitoring, Evaluation and Learning team at the National Democratic Institute.

Yuting Liao
Yuting Liao is a senior assistant for data analysis and visualization on the Monitoring, Evaluation and Learning team at the National Democratic Institute.

The National Democratic Institute is a nonprofit, nonpartisan organization working to support and strengthen democratic institutions worldwide through citizen participation, openness and accountability in government.

Featured image: Wayan Vota, Flickr
