I. Theory

NOTE: This document is a very early sketch by a consultant/writer, placed online for collaborative writing. In no way should this document be viewed as a reflection of the overall team's sense of OpenDRI processes. DO NOT USE for operations.

2. Open Data for Resilience

Open Data for Resilience (OpenDRI) works with governments to harness the value of Open Data practices in service of more effective disaster risk management and climate change adaptation. An OpenDRI project offers a menu of DRM tactics for building high-resolution exposure data with collective action, including:

  • Collation of data and their publication in an open geodata catalogue. Data about the exposure of a nation to natural hazards are often fragmented across multiple institutions, frequently under policies that hinder the aggregation of those data into more comprehensive models. GIS and Data Management System (DMS) platforms that enable this kind of aggregation are also rare and were, until recently, very expensive.

  • Open Data Working Groups. The development of a community among DRM practitioners is critical for fostering information sharing, providing training, and creating the network of analysts who become the primary users (and sustainers) of risk data.

  • Collection of Exposure Data with Participatory Mapping. In many places, there is no geospatial database of the built infrastructure that aggregates key attributes about its vintage, construction materials, elevation, or number of stories. OpenDRI works with communities to build this asset database from the bottom-up using low-cost participatory mapping techniques, where collection and curation of the data is done by the communities that those data describe.

  • Software Development around Open Data. Once risk data exist as a public good, they create a strong incentive to build software applications and services around the use and analysis of those data. Some products are directly related to DRM. Others harness the information (maps, imagery, and data about the built environment) for uses beyond DRM, including navigation, logistics, and business analysis.

  • Living Labs. Building a community of technologists and organizers around risk data can be accelerated by using a living lab or innovation lab model, where a place serves as an incubator for new projects.

  • Risk Communication Tools. To raise awareness of the potential impact of disasters that have not occurred in living memory, it is important that DRM experts create mechanisms to communicate threats to decision makers and the general public. Risk communication tools provide a platform for this work.

OpenDRI also relies on parallel but separate efforts to improve the modeling of natural hazards, including weather forecasting and mid-term meteorological/climate forecasting. It also relies on cross-support from peers in risk assessment and modeling, who can take the data curation at the core of OpenDRI and turn that information resource into actionable recommendations for risk management. Note: risk assessment is not part of OpenDRI.

History

OpenDRI began from catastrophe. The 2010 earthquake in Haiti killed many of the staff of the national mapping agency (CNIGS) and destroyed a growing body of knowledge about the geography of Haiti. The building collapse also buried servers holding the sole copies of geographic data that would have aided the response to, and recovery from, one of the biggest humanitarian disasters of the last century. Then something unexpected happened.

Several satellite companies collected fresh high-resolution imagery of the damage and made the data available for free. The World Bank collected imagery via aircraft at even higher resolution. More than 600 volunteers from the OpenStreetMap community started tracing the imagery, creating a highly detailed map of Haiti. Volunteers made about 1.2 million edits to the map, performing an estimated year of cartographic work in about 20 days, all at no cost. This effort catalyzed a rethinking of community mapping and open data within the World Bank.

The practical reality of what had happened in Haiti fused with a growing movement around open data and open government. Within the GFDRR team, a question emerged: if community mapping could map most of a country in a crisis, what could be done before a disaster? Could GFDRR invest in collecting better data about the exposure of the built environment to natural hazards as a form of technical assistance? Could the communities then curate this data, creating the opportunity for better spatial and temporal resolution of the exposure of a country to threats?

Before the team could set about researching this question, another major disaster emerged: the 2011 famine in the Horn of Africa. GFDRR convened a meeting where partners in the response agreed to share their operational data using a shared data catalogue. This effort, Open Data for the Horn, created a shared catalogue of the various data being collected around the famine, from the Famine Early Warning System to regional maps, geospatial data, and satellite imagery. It has become one of the key points for coordinating activities among OCHA, the World Bank, RCMRD, and WFP. The Sahel Response data catalogue followed soon thereafter.

But the activity at GFDRR was only a small part of a larger movement in open data. In September 2011, the Open Government Partnership announced that eight governments had become the founding signatories of an agreement to make government data far more open to citizens than in the past. Subsequently, 47 more governments signed the declaration. One part of the declaration reads:

We commit to pro-actively provide high-value information, including raw data, in a timely manner, in formats that the public can easily locate, understand and use, and in formats that facilitate reuse... We recognize the importance of open standards to promote civil society access to public data, as well as to facilitate the interoperability of government information systems.

The World Bank itself announced an open data policy in April 2012. All its data and publications would be made available under the Creative Commons Attribution 3.0 License (CC BY), which permits free reuse and redistribution so long as the data or publication is attributed.

With communities, governments, and international institutions all pursuing open data, the natural next step was to explore packaging open data into a set of approaches around risk.

Defining Open

Open data are defined simply: “a piece of data or content is open if anyone is free to use, reuse, and redistribute it — subject only, at most, to the requirement to attribute and/or share-alike.”

Open-source software is a piece of software whose “source code is available to the general public for use and/or modification from its original design. Open source code is typically created as a collaborative effort in which programmers improve upon the code and share the changes within the community. Open source sprouted in the technological community as a response to proprietary software owned by corporations.”

Open standards/formats for data provide a free and openly available specification for “storing digital data, usually maintained by a standards organization, which can therefore be used and implemented by anyone. For example, an open format can be implementable by both proprietary and free and open source software, using the typical software licenses used by each.”
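
For example, a dataset locked in a proprietary spreadsheet can be exported to open, non-proprietary formats with a few lines of code. The sketch below is illustrative only: the file names are hypothetical, and it assumes the Python pandas library (with openpyxl) is available.

    # Minimal sketch: convert a proprietary spreadsheet into open formats.
    # File names are hypothetical; assumes pandas (with openpyxl) is installed.
    import pandas as pd

    # Read the original (proprietary) Excel workbook.
    exposure = pd.read_excel("exposure_survey.xlsx", sheet_name=0)

    # Write plain CSV, readable by any software without a licence fee.
    exposure.to_csv("exposure_survey.csv", index=False)

    # Write JSON, another open, machine-readable format.
    exposure.to_json("exposure_survey.json", orient="records")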

Packaging Open Data

In 2012, GFDRR began to package these open data efforts under one label: the Open Data for Resilience Initiative (OpenDRI). Teams from GFDRR began to offer World Bank regions and client governments technical assistance around how to use open data to catalyze better information on risks.

These projects centered on applying the principles of open data, open source software, and open standards to the disaster risk management cycle. The objective was to open several types of data for analysis to a wide range of stakeholders:

  • Hazard
  • Exposure
  • Vulnerability
  • Risk Information

Where appropriate data did not exist, OpenDRI would catalyze its collection and curation, where possible as open data. Where data was already part of an archive, OpenDRI staff would work to negotiate its release as open data, or establish the appropriate controls on the data with host nation officials. The resulting ecosystem would have far more data on which to base decisions about investing in DRM.


Building Partnerships

OpenDRI starts with the communities in which it is implemented. It has connected a wide range of partners:

  1. Government Clients
  2. Science Agencies
  3. Reinsurers
  4. Development Partners
  5. Local NGOs
  6. Voluntary Organizations
  7. Incubators/Social Entrepreneurs

Work Process Overview

Like open data initiatives, OpenDRI starts small and scales virally. It deploys in one site, and then another, expanding in utility as the amount of data increases. It might start in a smaller city, then migrate to other areas as understanding and trust in the process builds. In this way, OpenDRI is an iterative process.

In general, OpenDRI unfolds in five stages:

  1. Scoping: a dialogue to determine the risks, readiness, relationships, and use cases that would form the core of an OpenDRI pilot.
  2. Designing: a collaborative, multi-institutional process to customize the OpenDRI package to the unique context of a client government.
  3. Piloting: creating an initial presence for OpenDRI, seeding the initiative and building a sustainable community around DRM data.
  4. Scaling: expanding the open data ecosystem around the DRM cycle when an OpenDRI implementation sticks.
  5. Sustaining: creating the conditions to hand an OpenDRI initiative to the communities that built it.

Tools

OpenDRI uses a growing menu of tools to develop the open data ecosystem:

Building a DRM Data Catalogue

Building the trust to release data into public or semi-public data catalogues enables the strategic release of certain government datasets to the commons, where they can be curated, emended, amended, and (most importantly) reused in ways that governments alone cannot manage. A data catalogue does not mean that a government must release all of its data to the public.

The collation of links to critical data sets enables the first recommendation of Natural Hazards, UnNatural Disasters (NHUD):

First, governments can and should make information more easily accessible. People are often guided in their prevention decisions by information on hazards, yet the seemingly simple act of collecting and providing information is sometimes a struggle. While some countries attempt to collect and archive their hazard data, efforts are generally inconsistent or insufficient. Specifically, there are no universal standards for archiving environmental parameters for defining hazards and related data. Data exchange, hazard analysis, and hazard mapping thus become difficult.

Open data empower decision makers at all levels of government, as well as in the private sector. They create a common space where communities can gather around shared problems and co-develop solutions with a wide range of partners.
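
As a concrete illustration of what such a catalogue enables, the sketch below searches a CKAN-style catalogue for hazard-related datasets over its public web API. This is only one possible platform, and the catalogue URL and search query are hypothetical.

    # Minimal sketch: list hazard-related datasets from a CKAN-style catalogue.
    # The catalogue URL and search query are hypothetical.
    import requests

    CATALOGUE = "https://data.example.gov/api/3/action/package_search"

    resp = requests.get(CATALOGUE, params={"q": "hazard", "rows": 10}, timeout=30)
    resp.raise_for_status()
    result = resp.json()["result"]

    print(result["count"], "matching datasets")
    for dataset in result["results"]:
        # Each record carries a title plus links to its downloadable resources.
        print(dataset["title"])
        for res in dataset.get("resources", []):
            print("  ", res.get("format"), res.get("url"))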

Open Data: the Five Stars

★ Available on the web (whatever format) but with an open licence, to be Open Data

★★ Available as machine-readable structured data (e.g. excel instead of image scan of a table)

★★★ as (2) plus non-proprietary format (e.g. CSV instead of excel)

★★★★ All the above plus, Use open standards from W3C (RDF and SPARQL) to identify things, so that people can point at your stuff

★★★★★ All the above, plus: Link your data to other people’s data to provide context

source: http://www.w3.org/DesignIssues/LinkedData.html
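
To make the progression concrete, the snippet below writes the same building record at the three-star level (an open, non-proprietary CSV) and at the four/five-star level (RDF that uses URIs to identify things and links out to other people's data). All identifiers and URIs are hypothetical placeholders.

    # Minimal sketch: one exposure record expressed at different "star" levels.
    # All identifiers and URIs below are hypothetical placeholders.

    # 3 stars: open licence + machine-readable + non-proprietary format (CSV).
    csv_record = "building_id,levels,material\nKTM-0001,3,brick\n"

    # 4-5 stars: the same facts as RDF (Turtle), using URIs to identify things
    # and linking to other people's data (here, an OpenStreetMap way) for context.
    turtle_record = """
    @prefix ex:  <http://data.example.gov/building/> .
    @prefix exo: <http://data.example.gov/ontology#> .
    @prefix osm: <https://www.openstreetmap.org/way/> .

    ex:KTM-0001 exo:levels 3 ;
                exo:material "brick" ;
                exo:sameFeatureAs osm:123456789 .
    """

    with open("building.csv", "w") as f:
        f.write(csv_record)
    with open("building.ttl", "w") as f:
        f.write(turtle_record)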

Principles

For data to serve decision makers across a society, it needs to be fully open. This means:

  1. Technically Open: Many government datasets are locked in data formats that can only be read by proprietary software (and sometimes hardware, like obsolete magnetic tape backup drives). The data must be released in ways that allow any device or software to read it.
  2. Legally Open: the license under which the data is released must permit redistribution and reuse.
  3. Accessible: the data must be available at a public Internet address (URI)
  4. Interoperable: the data must follow open standards.
  5. Reusable: can be redistributed and reused in ways that were not necessarily anticipated by the curator of the original data.

Ten Principles of Open Government Data (OGD)

(src: Linked Open Data: The Essentials, Bauer and Kaltenböck)

  1. Data must be complete
  2. Data must be primary
  3. Data must be timely
  4. Data must be accessible
  5. Data must be machine-processable
  6. Access must be non-discriminatory
  7. Data formats must be non-proprietary
  8. Data must be license free
  9. Data must have permanence, be findable over time
  10. Usage costs must be de minimis

From the Sebastopol meeting on Open Government Data

How OpenDRI works with Governments

OpenDRI advises ministries on the collation, cleansing, and release of data related to risks. These datasets tend to be spread across governments. Sometimes, ministries sell them to each other (though the revenues tend to be low and the administrative/transaction costs of managing these sales tend to be high). OpenDRI partners work together to determine which data are appropriate for release. That said, rather than following the traditional method of aggregating data into a central web portal, OpenDRI recognizes that ministries wish to retain stewardship over their own data. So OpenDRI recommends that each ministry release its data using (free and open source) platforms that allow other ministries to subscribe to the data using web services (see the sketch after the list below). This model has a number of benefits:

  1. Politics: ministries retain control of their own information. Instead of adding a centralized umbrella web portal and the perception of a shift in data ownership, a government adds a free tool into existing workflows.
  2. Freshness: the data in the system is always flowing from the source and is as new as the ministry is able to release.
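
A minimal sketch of what “subscribing” can look like in practice: a ministry pulls the latest features straight from the publishing ministry's server over a standard OGC Web Feature Service (WFS) request. The server URL and layer name below are hypothetical.

    # Minimal sketch: pull current data from another ministry's server through a
    # standard OGC WFS GetFeature request. URL and layer name are hypothetical.
    import requests

    WFS_ENDPOINT = "https://geodata.ministry.example.gov/geoserver/wfs"

    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "drm:health_facilities",   # hypothetical published layer
        "outputFormat": "application/json",    # ask for GeoJSON output
    }

    resp = requests.get(WFS_ENDPOINT, params=params, timeout=60)
    resp.raise_for_status()
    features = resp.json()["features"]
    print("Received", len(features), "features, fresh from the source ministry.")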

Open Data Working Groups

A critical aspect of OpenDRI revolves around the development of an ecosystem of experts who use open data to create analytical products. Hosting open data working groups is a proven tactic to recruit, train, and connect individuals who need better access to DRM information. Working groups provide two critical functions:

  • Networking: In many cases, the network of GIS experts, DRM analysts, and disaster managers in a country lacks strong relationships. Individuals may also not be aware of the data that other agencies and ministries hold. Open data working groups provide a venue for individuals to become aware of each other’s problems and the data sets each curates and holds, and to build sustainable links between institutions. This trust becomes the bedrock on which a sustainable network of DRM professionals curates open data.

  • Training: periodic working group meetings provide a venue where the network of DRM professionals can build capacity around risk assessment. Through partnerships with international institutions, the working groups receive regular contact with DRM experts from around the world, who may present in country or via webinars on topics of interest to the host country. Training sessions also catalyze cross-pollination between countries.


Community Mapping of Exposure Data

Building better exposure data is very time intensive, but it need not be costly. It sometimes requires individuals to visit thousands of municipal buildings and locations of critical infrastructure, make a basic assessment about the construction of those sites, take pictures, and ask locals questions about the site. If performed by survey departments of the government or commercial ventures, the costs quickly spiral beyond the means of most governments and donors. In comparison, mapping Kathmandu under OpenDRI cost under $200,000 USD.

The approach taken by OpenDRI is to recruit and train community members to map their own cities. This method creates jobs for youth, trains them in modern geospatial tools, and prepares them for additional work curating the map of their cities. See OpenCities. It also creates a map that is free and open for all to use for any purpose.
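
Because the resulting map is open, anyone can pull those exposure attributes back out of OpenStreetMap. The sketch below queries the public Overpass API for buildings tagged with storey counts; the bounding box is an arbitrary example area, and the building:levels / building:material tags are common OSM conventions for exposure attributes.

    # Minimal sketch: retrieve community-mapped exposure attributes from
    # OpenStreetMap via the public Overpass API. Bounding box is illustrative.
    import requests

    query = """
    [out:json][timeout:60];
    way["building"]["building:levels"](27.70, 85.30, 27.72, 85.33);
    out tags 20;
    """

    resp = requests.post("https://overpass-api.de/api/interpreter",
                         data={"data": query}, timeout=90)
    resp.raise_for_status()

    for element in resp.json()["elements"]:
        tags = element.get("tags", {})
        print(element["id"],
              "levels:", tags.get("building:levels"),
              "material:", tags.get("building:material", "unknown"))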

Peta Gratis Untuk Semua (“Free Maps for All”; OpenStreetMap Indonesia)

Case study TBD for OSM in Indonesia, from outcast to integrated into BIG.

OpenStreetMap

OpenStreetMap aims to create a free and open map of the world. Akin to Wikipedia, it allows anyone to draw on “the map” using a wide range of software and devices, including handheld computers and smartphones. To ensure accuracy and data quality, the OpenStreetMap Foundation works with communities in each country to encourage editors and experienced users to review submissions, and provides software that makes it relatively easy for experienced users to correct the errors of anyone who has made a mistake or submitted inaccurate data. It is a community-managed map.

While some might expect the accuracy of such a map to be far lower than that of professional cartography, academic studies show that the map is within the margin of error of consumer GPS devices (see Muki Haklay, University College London, in this discussion of the accuracy and reliability of volunteered geographic information).


Software Development around Open Data

The release and creation of data under open licenses fosters innovation by a host country’s technology community. In many cases, small companies and non-profit organizations can build revenue models around adding value to open data. A good example is the application of OpenStreetMap data to build routing applications for other businesses, particularly around the navigation of complex and fast-changing urban environments.
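
As one hedged illustration of such a value-added service, the sketch below computes a driving route over OpenStreetMap data using the open-source OSMnx and NetworkX libraries (one option among many, not something prescribed by OpenDRI); the place name and coordinates are purely illustrative.

    # Minimal sketch: a routing service built on OpenStreetMap data, using the
    # open-source OSMnx and NetworkX libraries. Place and coordinates are
    # illustrative only.
    import osmnx as ox
    import networkx as nx

    # Download the drivable street network for a city from OpenStreetMap.
    G = ox.graph_from_place("Kathmandu, Nepal", network_type="drive")

    # Snap a hypothetical origin and destination (longitude, latitude) to nodes.
    orig = ox.distance.nearest_nodes(G, 85.312, 27.705)
    dest = ox.distance.nearest_nodes(G, 85.330, 27.717)

    # Shortest path by street length: the core of a navigation or logistics app.
    route = nx.shortest_path(G, orig, dest, weight="length")
    print("Route found with", len(route), "intersections.")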


Living Labs

To support small businesses and non-profits that emerge around open data, a proven tactic is to use a “living lab” or “innovation lab.” This innovation space provides an incubator where teams from multiple efforts can develop software in a co-working space. Even with basic curation, the environment can foster cross-pollination of ideas and interlinking of both data and business models. The living lab also provides a neutral space where government officials, international staff, local business, and community members can interact as peers and co-develop solutions to shared problems.


Risk Communication Tools

To change the mindset of planners at all levels of government, it is not enough to give them maps and open up government data; they must also have simple tools that allow them to visualize potential disaster scenarios. Because traditional risk assessment models require a great deal of training and expertise, a range of partners came together to build impact modeling tools that let a municipal government official pull hazard and exposure data from existing sources (for example, exposure data about a city from OpenStreetMap) and, with a few mouse clicks, show the potential impact of a hazard on the schools in that city.
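
A minimal sketch of the overlay behind such a tool, assuming hypothetical input files: a flood-hazard footprint and a point layer of schools (for example, extracted from OpenStreetMap), joined with the open-source GeoPandas library to count the schools that might be affected.

    # Minimal sketch: estimate how many schools fall inside a flood hazard zone.
    # File names are hypothetical; assumes the geopandas library is installed.
    import geopandas as gpd

    schools = gpd.read_file("schools.geojson")      # exposure (e.g. from OSM)
    flood = gpd.read_file("flood_hazard.geojson")   # hazard footprint polygons

    # Keep only the schools whose point falls within a hazard polygon.
    affected = gpd.sjoin(schools, flood, predicate="within")

    print(len(affected), "of", len(schools), "schools lie in the flood zone.")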



