In his recent article on MobileActive.org, Paul Currion posed the question, “If All You Have is a Hammer, How Useful is Humanitarian Crowdsourcing?” Currion argued that those working in disaster response “…don’t need more information, they need better information,” and pointed to Ushahidi’s Haiti deployment as an example of the failure of crowdsourcing to add value to disaster response efforts and ongoing humanitarian work. This pointed critique of Ushahidi and the use of social media in a humanitarian context resulted in an enthusiastic, and sometimes heated, debate in the article’s comments section.

Supporters of Currion’s view argued that efforts such as Ushahidi produced data that was biased and unreliable, and expressed concern that the system unfairly raised expectations among affected communities that each message would receive a corresponding response. Defenders of crowdsourcing countered that tools such as Ushahidi have proved particularly valuable in search-and-rescue situations, pointing to rescues facilitated by messages sent to the Ushahidi platform in the immediate aftermath of the Haitian earthquake. In addition, some defenders argued that tools like SwiftRiver now allow crowdsourced reports to be filtered, verified and made more actionable. Robert Kirkpatrick, Director of the United Nations Global Pulse program, wrote: “Like it or not, this kind of information will be generated increasingly by disaster-affected communities, and it will be available to the global community — including the organizations and individuals with an official mandate to coordinate response. These organizations may choose not to view the reports. Or they may choose to view them and then dismiss them as not actionable. Or they may choose to act upon them. Regardless, they will be held accountable for these decisions. The game has changed. We need to develop policies, processes and tools to deal with this information, because it isn’t going away.”

I tend to agree with this point. A new tool has been created; it is not appropriate in every context, but it can add value in many, especially in conjunction with more conventional data. I have already learned of numerous Ushahidi deployments that were likely never envisioned when the platform was created during the Kenyan post-election violence in early 2008, such as tracking sexual abuse and wildfires. As crowdsourcing applications such as Ushahidi and FrontlineSMS are further refined, and as their data is better organized and made more meaningful (e.g. with tools such as SwiftRiver), they will likely be deployed in a variety of new contexts. There is no mandate for agencies to use the data these tools produce, but judging by the attention paid by the UN Secretary General’s Office, UN OCHA, UNDP, UNHCR, the World Bank and others (as witnessed at ICCM 2010), the potential is clearly recognized.

Education and training will likely play an important role in the future of this technology. As practitioners become more familiar with the tools available to them, they will be better able to judge when those tools may truly add value to an effort and when they are simply inappropriate. Similarly, there may be significant value in training members of the “crowd”. While this is unrealistic in the immediate aftermath of a crisis, providing a minimum level of training to a select number of community members in the days and weeks that follow could have a significant impact. Efforts such as UNICEF’s use of RapidSMS, while not purely crowdsourcing, have shown the utility of teaching select representatives to send messages coded in such a way that they feed easily into an organized database. Likewise, the development of well-trained teams to deal with crowdsourced data, such as The International Task Force of Crisis Mappers, and efforts like Universities for Ushahidi could speed up the evolution of these tools.
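
To make the idea of coded messages more concrete, here is a minimal sketch of how a simple, pre-agreed SMS format might be parsed into database-ready fields. The keywords, field names and message format below are hypothetical illustrations, not the actual RapidSMS schema.

```python
# Hypothetical example: parsing a pre-agreed SMS format such as
# "NEED WATER LOC Carrefour QTY 200" into structured fields.
# Keywords and field names are illustrative, not a real RapidSMS format.

import re
from dataclasses import dataclass
from typing import Optional


@dataclass
class Report:
    need: str                 # e.g. "WATER", "FOOD", "MEDICAL"
    location: str             # free-text location reported by the sender
    quantity: Optional[int]   # optional estimate of units or people affected


def parse_report(sms_text: str) -> Optional[Report]:
    """Parse a coded SMS into a Report, or return None if it doesn't match."""
    pattern = r"NEED\s+(\w+)\s+LOC\s+(.+?)(?:\s+QTY\s+(\d+))?$"
    match = re.match(pattern, sms_text.strip(), flags=re.IGNORECASE)
    if not match:
        return None  # message does not follow the agreed format
    need, location, qty = match.groups()
    return Report(need=need.upper(),
                  location=location.strip(),
                  quantity=int(qty) if qty else None)


if __name__ == "__main__":
    print(parse_report("NEED WATER LOC Carrefour QTY 200"))
    print(parse_report("hello, we need help"))  # unstructured message -> None
```

Even a format this simple lets trained community reporters produce records that can be aggregated, mapped and prioritized without manual triage of every message.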

I recognize that Ushahidi and other humanitarian crowdsourcing applications should not replace tried-and-true methods of humanitarian data collection, and that challenges remain in how to handle the new kinds of information they provide. However, to continue an analogy used by one of the article’s commenters: why throw out the baby with the bathwater? Instead, I would argue that more questions should be asked about how best to build on these young but powerful new tools. Some of my own include:

• How can the “crowd” be best engaged (and perhaps trained) to provide more actionable information?

• Could other forms of media (e.g. radio) be better employed to engage community members, perhaps even those without access to more advanced technology?

• Can (and should) crowdsourcing applications such as Ushahidi be used to “push” information back out to the crowd? Could this be a means of increasing the quality of future data?

• Could Interactive Voice Response (IVR) technology be incorporated to respond to participants’ needs, perhaps by providing prerecorded messages corresponding to their stated needs? (A minimal sketch of this idea follows the list.)
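
On that last question, the sketch below shows one way a caller’s stated need, captured by a keypad menu or speech recognition, might be mapped to a prerecorded message. The keywords, audio file names and fallback behaviour are purely illustrative assumptions, not a reference to any existing IVR product.

```python
# Hypothetical IVR-style routing: map a caller's stated need to a
# prerecorded audio message. Keywords and file names are illustrative only.

PRERECORDED_MESSAGES = {
    "water": "audio/where_to_find_safe_water.mp3",
    "shelter": "audio/nearest_shelter_locations.mp3",
    "medical": "audio/first_aid_and_clinic_hours.mp3",
}

DEFAULT_MESSAGE = "audio/general_assistance_info.mp3"


def select_recording(stated_need: str) -> str:
    """Return the audio file to play back for a caller's stated need."""
    return PRERECORDED_MESSAGES.get(stated_need.strip().lower(), DEFAULT_MESSAGE)


if __name__ == "__main__":
    print(select_recording("Water"))     # -> audio/where_to_find_safe_water.mp3
    print(select_recording("blankets"))  # no match -> fallback message
```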
