(By Sam Bendett and Nick Chiosie) On June 20, 2013, the TIDES Project at the Center for Technology and National Security Policy (CTNSP), National Defense University, hosted Dr. Patrick Meier (QCRI) for a discussion on crowdsourcing information during natural and man-made disasters, as well as on the evolution of open-source, unclassified humanitarian technologies for disaster relief and emergency response.
This discussion was geared toward NDU faculty and staff and addressed the following:
• CTNSP’s mission of educating the DoD workforce and policy makers on emerging challenges related to technology, including open-source information-sharing platforms during disasters and emergencies.
• CTNSP’s mission of supporting civil-military affairs integration, which includes exploring new and existing technologies, policies, and practices for interagency and intergovernmental cooperation.
Patrick Meier is an internationally recognized thought leader on the application of new technologies for crisis early warning, humanitarian response, human rights, and civil resistance. He is currently Director of Social Innovation at the Qatar Computing Research Institute. Previously, he served as Director of Crisis Mapping at Ushahidi and co-directed Harvard’s Program on Crisis Mapping and Early Warning. Patrick holds a PhD from The Fletcher School, a Pre-Doctoral Fellowship from Stanford, and an MA from Columbia University. He was born and raised in Africa.
IN HIS OWN WORDS
The first generation of humanitarian technologies was powered by free, open-source software produced by organizations such as InSTEDD, Sahana, and Ushahidi. For example, Ushahidi (the name means “witness” or “testimony” in Swahili) developed an interactive-mapping platform linked to a live multimedia inbox and used it to document violence that erupted in Kenya after the disputed presidential elections of 2008. Eyewitnesses sent reports of ethnic attacks and other violent incidents to the Ushahidi Web site via e-mail and text message. Ushahidi then plotted the location of each incident on a Google map, creating a public record of events.
The Ushahidi platform was later used to crowdsource a live crisis map of the 2010 earthquake in Haiti. In the days and weeks following the earthquake, eyewitnesses submitted a large volume of text messages, tweets, photographs, video, and Web-based reports to the Ushahidi in-box. Once these reports were manually collated and plotted on the Ushahidi platform, they became a live crisis map of urgent humanitarian needs. For example, the map showed exactly where victims lay buried under the rubble of collapsed buildings, and where medical supplies needed to be delivered. The US Marine Corps, one of the first responders to the earthquake, has stated that the map helped them save hundreds of lives. The Ushahidi platform has since been used in response to dozens of other disasters worldwide.
After three years with the Ushahidi team, I began to look for a new home where I could help create the next generation of humanitarian-technology solutions. I found this home at the Qatar Computing Research Institute (QCRI) in Doha. QCRI was launched two years ago to carry out world-class R&D in multiple areas of advanced computing, including big-data analytics, distributed systems, and social computing. I was brought on as director of social innovation and given the task of harnessing the world-class expertise at QCRI to address major humanitarian challenges.
Introduction of Patrick Meier and his experience. Patrick’s blog, iRevolution.net, is where he writes about humanitarian technologies and their use in disaster relief and emergency response.
The bigger picture: crisis mapping has evolved continually since 2006 and needs ongoing study.
Work with the Ushahidi platform showed what crisis mapping could mean for the broader humanitarian assistance/disaster relief community. The platform was used extensively, in various capacities, during the 2010 Haiti earthquake.
The Ushahidi platform’s initial deployment revealed opportunities but also exposed specific limitations.
Volunteers who donated their time and effort received and combed through thousands of posts via text message, Twitter, and Facebook. They were inundated with more posts than they could manage; even with thousands of volunteers, it became very difficult to keep track of the vast amounts of data flowing back and forth.
Libya Crisis Map: the Standby Volunteer Task Force (SBTF) was created from crisis-mapping volunteers around the world; thousands of people in over 80 countries signed up. The effort was sustained over four weeks. Managed workflows were more efficient, but it was still difficult to get through the wealth of information even with so many people on board.
However, positive results were achieved: SBTF was told that it sped up response time for aid to hard-pressed individuals.
Real-time social media mapping depicts how a population reacts to a disaster, yet the vast majority of information on sites like Twitter is not relevant to aiding disaster response.
Over 20 million tweets were generated during Hurricane Sandy. Access to information is as important as food and water, and information is time-sensitive: what’s important now might not be relevant in an hour.
How do we deal with vast amounts of data: human computing vs. machine computing?
Big crisis data: the credibility of information can be in question. Verifying that numerous tweets are accurate and current is a difficult task, but newer tactics and procedures are evolving to meet the challenge.
One way to make a digital map relevant is to collect tweets that link to images; volunteers can review the captured geo-tags, and that information can be posted within 12 hours, hopefully helping the affected population.
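This filtering step can be sketched in a few lines. A minimal illustration, assuming tweets arrive as simple dictionaries; the field names here are hypothetical, not drawn from any specific API:

```python
# Toy sketch: keep only tweets that carry both an image link and a geo-tag,
# so volunteers can place them on a map. Field names are illustrative only.

def mappable(tweet):
    """A tweet is mappable if it links to an image and carries coordinates."""
    return bool(tweet.get("image_url")) and tweet.get("geo") is not None

def filter_mappable(tweets):
    """Return only the tweets volunteers could plot on a crisis map."""
    return [t for t in tweets if mappable(t)]

tweets = [
    {"text": "Bridge down on Main St",
     "image_url": "http://example.com/1.jpg", "geo": (18.54, -72.34)},
    {"text": "Thoughts and prayers", "image_url": None, "geo": None},
    {"text": "Shelter flooded",
     "image_url": "http://example.com/2.jpg", "geo": None},
]
print(len(filter_mappable(tweets)))  # prints 1: only the first tweet qualifies
```

Real deployments would of course pull these fields from the social media platform’s own metadata rather than hand-built dictionaries.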
Micro-tasking is a way to review incoming data and verify statements to the greatest degree of accuracy: looking at tweets and any attached pictures, attempting to confirm the locality, and then sending that information to the relevant aid organizations. In this scenario, three people look at a tweet, and all three must confirm that the specific damage or need embedded in the message is real. If all three agree on the accuracy of the message, it is passed to the next stage; if they do not all agree, the message is discarded and they move on to the next one.
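The unanimity rule described above can be expressed as a tiny function. A sketch only, under the assumption that each volunteer’s judgment arrives as a boolean vote:

```python
# Toy sketch of the micro-tasking rule described above: a report survives
# only if all three volunteers independently confirm it; otherwise it is
# discarded and the queue moves on.

def triage(report, votes):
    """votes: three booleans, one per volunteer. Unanimous agreement passes
    the report to the next stage; anything less discards it (returns None)."""
    assert len(votes) == 3, "the scheme described uses exactly three reviewers"
    return report if all(votes) else None

verified = triage({"text": "Clinic needs insulin"}, [True, True, True])
discarded = triage({"text": "Unconfirmed rumor"}, [True, True, False])
print(verified, discarded)  # the first report passes; the second is dropped
```

Requiring unanimity trades recall for precision: some genuine reports are lost, but what reaches aid organizations has been triple-checked.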
Big data is relative: micro-tasking still overwhelms volunteers because there are so many tweets and posts during each crisis or disaster.
Other avenues of collecting data include Geofeedia and MicroMappers, both of which are working with the UN.
Machine computing helps prioritize who wants help and who needs help by tagging tweets according to what is being reported: damage, injury, crime, etc. The algorithm learns as the crisis progresses, and humans are able to judge what is relevant and what is not; people can tweak the system accordingly as time goes on and assistance is still needed.
Such an algorithm can determine with roughly 80% accuracy whether the person tweeting was actually on the ground.
Machine computing: non-credible tweets leave very predictable markers.
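The human-in-the-loop tagging described above can be illustrated with a toy model. This is only a sketch of the feedback loop, not the classifiers QCRI actually builds: a keyword-weight scorer stands in for a real machine-learned model, and volunteer corrections nudge its weights as the crisis evolves.

```python
# Toy illustration of the human-in-the-loop loop described above: a tiny
# keyword-weight model scores tweets for relevance, and human feedback
# updates the weights as the crisis progresses. Real systems use proper
# machine-learned classifiers; this only sketches the adaptive idea.

from collections import defaultdict

class RelevanceTagger:
    def __init__(self):
        self.weights = defaultdict(float)  # word -> learned relevance weight

    def score(self, text):
        return sum(self.weights[w] for w in text.lower().split())

    def predict(self, text):
        """True means the model tags the tweet as relevant to responders."""
        return self.score(text) > 0

    def update(self, text, relevant):
        # A volunteer's label nudges every word's weight up or down,
        # so the model adapts as the crisis (and its vocabulary) shifts.
        step = 1.0 if relevant else -1.0
        for w in text.lower().split():
            self.weights[w] += step

tagger = RelevanceTagger()
tagger.update("building collapsed people trapped", relevant=True)
tagger.update("great party tonight", relevant=False)
print(tagger.predict("people trapped under building"))  # prints True
```

The same loop structure applies to credibility tagging: if non-credible tweets carry predictable markers, volunteer labels on those markers steadily push their weights negative.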
Experience and analysis methods from one disaster do not translate directly to another. Every disaster is unique: needs, and the responses to those needs, shift accordingly.
Patrick listed the following resources for those who want to get involved:
His own blog: iRevolution.net
Data verification platform: http://irevolution.net/2013/02/19/verily-crowdsourcing-evidence/
Crisis Mappers network: http://crisismappers.net/
He urged attendees to join the Standby Volunteer Task Force and to monitor Crisis Mappers discussions and calls for assistance.