The drive for data is a central motivation in large, dispersed ICT4D projects such as Bibliomist in Ukraine. The distances between project sites, and the sheer number of them, are vast and challenging, especially given the infrastructure realities in the less developed regions of the country. When it comes to data reported from the field, too much is never enough in my book, and I constantly look for tools to improve and streamline our reporting mechanisms. It might be tempting to think that the best route is simply to gather more data; in many cases, however, the better approach is to simplify the data gathering process.
Currently, our regional staff report to our central team by email, by updating a shared Google Document, or by a direct cell phone call. The problem is that data can be delayed for days by the reporting process or by a lack of an Internet connection: when visiting libraries in rural locations, reps must travel back to somewhere with connectivity before they can report on what they found at the library site. Reporting needs vary significantly, but this particular effort was to improve the efficiency of our pre-installation surveys. A regional rep must verify a number of conditions at each library that applies to our program to ensure that the facility meets our requirements. For example, this survey contains questions such as ‘What is the available Internet connection?’; ‘Are there burglar bars on the doors/windows?’; and ‘Is there enough physical space for the computers?’ I wanted a technological solution that would allow reps to report on this survey without transcribing answers into an email or worrying about the availability of an Internet connection. I also wanted the survey responses to be organized and parsed automatically, rather than by hand by our monitoring and evaluation team. Today, reps wait until they are home to do the actual reporting; this tool will allow them to report the survey results from the library site itself.
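To make that concrete, the survey answers could be packed into a single SMS using a simple key=value format. This is a sketch of one possible encoding; the field names and the `key=value;key=value` convention are my own illustration, not necessarily the format the project settled on:

```python
# Hypothetical sketch: packing a pre-installation survey into one SMS.
# The field names (conn, bars, space) and the key=value;key=value format
# are illustrative assumptions, not the project's actual convention.

def parse_survey_sms(text):
    """Split 'key=value;key=value' pairs into a dict of survey answers."""
    answers = {}
    for pair in text.split(";"):
        if "=" not in pair:
            continue  # skip malformed fragments rather than failing outright
        key, value = pair.split("=", 1)
        answers[key.strip().lower()] = value.strip()
    return answers

# Example message a rep might send from a library site:
msg = "conn=dsl 2mbit; bars=yes; space=no"
print(parse_survey_sms(msg))
# {'conn': 'dsl 2mbit', 'bars': 'yes', 'space': 'no'}
```

A format like this keeps the rep's side dead simple (one text message from any basic phone) while remaining trivial for a script to parse on arrival.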
With this goal in mind, I wanted to make it easier for my regional staff to report data to me and our impact team. Some of my key constraints:
- I did not want to reinvent the data organizing tools already in place by our impact team. (Don’t fix what isn’t broken)
- I did not want to mandate new technology for the regional staff. (No outfitting them with smartphones or requiring them to learn a new program)
- I wanted the tool to be accessible to all regional staff regardless of technology comfort level and language preferences. (Keep it simple)
- I did not want reporting to depend on an active Internet connection.
- I did not want to spend (much) money.
Taking these constraints into account, I began researching tools that are already out there. Constraint #1 meant feeding reports through Google Docs, which requires working with the Google Documents API. That works in this case because we do not require the power of a stand-alone database, and the cloud hosting inherent to Google Docs solves a lot of the backup and access worries. #2 meant bringing data gathering to a technology and process our regional staff already use every day. In this instance, my regional reps are my ‘customers’, and a quick user-needs study brought back a clear direction: the only common tool available to all 25 representatives is a basic mobile phone with only SMS (text message) and voice capabilities. Even Java-based form reporting (such as this example) would require buying new phones for some of the staff. #3 fell into line after #2; I knew I’d be crowd-sourcing SMS messages at this point. #4 amounts to the fact that it is very difficult to mandate that staff inform you the Internet is down at a particular library if the only way they can report it is through an Internet connection. While Internet connections can be unstable in rural Ukraine, cell phone coverage is more or less ubiquitous. For #5, Google Docs got me halfway home; there are plenty of open-source tools lying in wait out there, but which to choose? I settled on a trusty veteran of the text message world: Frontline:SMS.
I decided to build a reporting system that would tie together the ubiquitous functionality Frontline:SMS puts in the hands of each of my field staff with the cloud ‘database-on-the-cheap’ that is Google Docs. I wanted reps to be able to text message in the status of library locations, with the system parsing the message contents automatically. I used the following items in building my tool:
Frontline has a very nice feature where an incoming SMS can be used to trigger an external command. I wanted this command to be a call to a Python script I would write, with the command-line arguments carrying the data that the script would pass along to a Google Spreadsheet. To do this, I would need a few helpful Python tricks:
- gdata to give me all the Google API calls and authentication commands
- optparse to get data from the FL:SMS command-line call into my script
- urllib to clean up an encoding issue so that I can properly parse the text message’s contents
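The three pieces above fit together roughly as follows. This is a minimal sketch, not the actual script: the script name, flag names, credentials, and spreadsheet key are placeholders, and it assumes Frontline:SMS is configured to invoke the script with the sender and message as command-line arguments (written in modern Python 3, where `urllib`'s unquote lives in `urllib.parse`; the gdata Spreadsheets client shown here has since been retired by Google):

```python
# Sketch of the glue script, assuming Frontline:SMS is configured to run
# something like:  python sms_report.py --sender "%s" --message "%s"
# Flag names, credentials, and the spreadsheet key are placeholders.
import optparse
import urllib.parse  # in the Python 2 of the era, this was urllib.unquote

def parse_args(argv):
    """Pull the sender and raw message out of the Frontline:SMS command call."""
    parser = optparse.OptionParser()
    parser.add_option("--sender", dest="sender", help="phone number of the rep")
    parser.add_option("--message", dest="message", help="raw SMS contents")
    opts, _ = parser.parse_args(argv)
    # The handed-over text may arrive URL-encoded; decode before parsing.
    opts.message = urllib.parse.unquote(opts.message or "")
    return opts

def push_to_spreadsheet(row):
    """Append one row to a Google Spreadsheet via the old gdata client."""
    import gdata.spreadsheet.service  # imported lazily; only needed at push time
    client = gdata.spreadsheet.service.SpreadsheetsService()
    client.email = "reporter@example.com"   # placeholder credentials
    client.password = "app-password"
    client.ProgrammaticLogin()
    client.InsertRow(row, "SPREADSHEET_KEY", "WORKSHEET_ID")

# When triggered by an incoming SMS, the flow would be roughly:
#   opts = parse_args(sys.argv[1:])
#   push_to_spreadsheet({"sender": opts.sender, "message": opts.message})
```

The design keeps each concern separate: Frontline:SMS handles the radio side, optparse and urllib handle the hand-off and cleanup, and gdata handles the push to the cloud, so any one piece can be swapped without touching the others.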
In my next post I will walk through my code, give a tutorial on how I built this tool, and explain how to install it. In the meantime, you can download the completed, documented script here.