In August, along with announcing our Tech for M&E Diploma program, we kicked off an M&E Professionals Series, where we talk one-on-one with M&E professionals to give you the inside scoop on the industry.
For this second post in the series, we are featuring an interview that one of our alumni, Stephen Giddings, conducted with Janet Kerley, Senior Director of the Monitoring and Evaluation Practice at Development and Training Services, Inc. (dTS), a Virginia-based consulting organization that does considerable work with USAID.
Janet Kerley is a master evaluator and an accomplished trainer in evaluation and performance measurement. As Senior Evaluator in the Monitoring and Evaluation Practice at dTS/Palladium, she provides technical leadership for evaluations in the M&E unit, directs design and field methods, and supervises the preparation of evaluation reports. As Chief of Evaluation, Research and Measurement for the Peace Corps, she established the agency’s impact evaluation system.
Ms. Kerley was the Team Leader for Monitoring and Evaluation in the Office of the Director of Foreign Assistance, US Department of State, leading a 200-member inter-agency team to develop standard indicators for the 2007 Foreign Assistance Reform reporting tool. At USAID, she worked in the Bureau for Policy and Program Coordination (CDIE) and as the Monitoring and Evaluation Officer in the Bureau for Africa and the Bureau for Europe and Eurasia. Prior to joining USAID, Ms. Kerley was a Senior Research Associate at Aguirre International. She has lived and worked in many countries in Latin America and Africa.
S: How has technology changed the way M&E is conducted over the past decade in international development?
J: The change has been remarkable! A decade ago, most data gathering and analysis was paper-based, making it difficult, time-consuming, and costly. Especially in overseas environments, it took considerable time and effort to gather, transcribe (and often translate), and analyze the data. Today, tech tools have made data collection and analysis more efficient, saving time and money.
However, there is still a considerable “digital divide” between much more tech-savvy young people and older professionals originally trained using SPSS (or even earlier) technologies.
S: Does paper-based data collection still have a place in M&E today?
J: Yes — in certain circumstances paper-based data collection may be preferred.
In very rural areas where electricity may not be available, where batteries for electronic devices cannot be charged, or where internet connections or mobile phone service are inconsistent or unavailable, paper-based data collection is still the best option.
Not everyone is comfortable with data collection using electronic devices, but such respondents may be more open to paper-based questionnaires.
S: What are some pitfalls of the popular tech-based data collection tools?
J: With so much tech available, it is easy to get carried away.
Some less experienced or less than fully trained data gatherers may lose sight of the fundamental questions the monitoring or evaluation is trying to get at. If evaluators lack sufficient training in sound principles of research, they may be tempted to substitute technology for sound reasoning and good judgment.
Some data collection tech tools also tend to collect too much data, some of which may be irrelevant to the task at hand. USAID, in particular, is burdened by data overload, where data management systems fail to filter out data that is of little use and that complicates monitoring and evaluation practice.
S: What challenges have USAID Missions faced when integrating new technologies into their M&E functions?
J: By and large, USAID Missions have been quite open to technological improvements to M&E functions. That said, there is still a “digital divide”: younger employees (including local staff) who have grown up in the digital age are more comfortable with and more adept at using new technologies to enhance M&E. But more senior USAID staff seem generally open to embracing and appreciating the advantages that new technologies can bring to M&E, while leaving the technical analysis and the new data-gathering tools to younger, more tech-oriented staff. USAID staff have generally been very receptive to training in using new M&E technologies to their advantage.
S: Have new evidence-based technologies made decision making by senior USAID staff easier and more informed?
J: Most USAID Mission Directors recognize the value that good evidence on performance can bring to the achievement of program results, and the added clarity that good data and visually well-presented documentation can bring to decision making.
Photo Source: UNDP in Kigali, Rwanda
S: What are the advantages of mixed methods evaluations?
J: The most important starting point for an evaluation is doing the research required to understand what questions you want answered. Only then should you begin to look at evaluation methodologies to acquire necessary information.
When done at a proper scale, well-executed quantitative data collection and analytical methods can bring statistical rigor and clarity. For example, the scale of some of the evaluations done for USAID’s food security (Feed the Future) programs has generally provided reliable data. Unfortunately, USAID Missions sometimes do not budget enough to ensure that sample sizes for quantitative methods are large enough to support reliable conclusions. This is where qualitative methods can help to fill gaps.
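To make that budget point concrete, here is a minimal sketch (ours, not from the interview) of the standard sample-size formula for estimating a proportion; the function name and the survey scenario in the comments are hypothetical, chosen only for illustration. Note how tightening the margin of error from 5 to 3 points nearly triples the required sample, which is exactly where underfunded evaluations fall short.

```python
import math

def sample_size_for_proportion(confidence_z=1.96, expected_p=0.5, margin_of_error=0.05):
    """Minimum respondents needed to estimate a proportion.

    Standard formula: n = z^2 * p * (1 - p) / e^2
    Defaults assume 95% confidence and a +/-5 point margin of error.
    """
    n = (confidence_z ** 2) * expected_p * (1 - expected_p) / (margin_of_error ** 2)
    return math.ceil(n)

# A hypothetical household survey estimating adoption of a new farming practice:
print(sample_size_for_proportion())                      # 385 respondents
print(sample_size_for_proportion(margin_of_error=0.03))  # 1068 respondents
```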
J: Storytelling is one of the most useful qualitative data collection tools. Sometimes quantitative data collection methods do not allow beneficiaries to open up and provide adequate and reliable information, but they react much more positively when they are allowed to tell a story. If you get enough good stories, they can provide insights and nuances that purely quantitative methods cannot. Thus mixed-methods evaluations can provide more reliable evidence of performance than quantitative or qualitative methods alone.
S: Do you think there is a bias toward quantitative methods in international development because of a lack of free and easy-to-use qualitative tools?
J: Not at all. Many USAID evaluations make good use of qualitative methodologies. A decade ago, there was an overuse of the “windshield wiper” approach (an evaluation team is not given time to do adequate fieldwork, so they report what they observe “through the windshield”), but more recently qualitative methodologies have become more sophisticated and reliable, and they can provide a lot of extremely useful information for decision makers.
S: What questions should we be asking to select the best technology for M&E?
J: Evaluation planning should begin with framing the research questions — what is it that we need to learn? The preferred technological solution should be the one that can best answer those questions, and it must also take cultural sensibilities into account. It is crucial that technology be viewed as a tool, not as a substitute for knowing the basic principles of research.
Stephen Giddings, a TechChange alum, served for 25 years as a Foreign Service Officer with USAID, retiring in late 2005. For most of his USAID career, he specialized in managing housing and urban development programs, serving in USAID offices in Panama, Kenya, Cote d’Ivoire, Russia, and Rwanda, as well as Washington, D.C. During his last four years with USAID he was Chief of the Policy Division in USAID’s Africa Bureau.
For the past ten years Mr. Giddings has been an independent consultant, providing assistance to the Development Assistance Committee (DAC) of the OECD and consulting with USAID and the International Real Property Foundation (IRPF), among other international development organizations. He serves on the Development Issues Committee of the USAID Alumni Association and is Co-Chair of the Africa Work Group of the Society for International Development’s Washington, D.C. Chapter (SID-Washington). Prior to his USAID career, Mr. Giddings managed low-income housing development programs at the U.S. Department of Housing and Urban Development and was Director of Planning and Development at the Boston Housing Authority. He received a BA in political science from Wesleyan University and an MPA from the Maxwell School of Citizenship and Public Affairs at Syracuse University.
——-
We hope you enjoyed the second installment of our M&E Professionals Series! Don’t forget to follow our blog for the next post in the series!
Interested in engaging in similar conversations with M&E professionals like Stephen and Janet? Join us in our upcoming course, TC211: Technology for Data Collection and Survey Design, which starts on October 19. If you want the whole package, you can join the second session of our Tech for M&E Diploma program.