Global Database of Disaster Responses

We coordinated a collaborative project with researchers in the business school and the department of computer and information science of a university in the United States to build what is arguably the largest database on disaster aid at the international level. The dataset covers every monetary and in-kind donation from firms, governments, multinational agencies, and non-governmental organizations reported in news media for relief and recovery from all major disasters worldwide from 1990 to 2019. Each disaster was tracked for one year from its official start date; for example, the window for the 2010 earthquake in Chile ran from February 27, 2010 to February 26, 2011. The coded corporate-aid data comprise 93,247 donations from 38,980 firms headquartered in 83 countries to 4,637 natural disasters that struck 204 countries in the period 2003-2018.
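For concreteness, the one-year coverage window can be computed as sketched below; the function name is ours and not part of the original pipeline:

```python
from datetime import date, timedelta

def search_window(start):
    """One-year coverage window beginning on the disaster's official start date.

    The window ends the day before the first anniversary, matching the
    Chile example (February 27, 2010 to February 26, 2011).
    """
    try:
        anniversary = start.replace(year=start.year + 1)
    except ValueError:
        # A February 29 start date has no anniversary in a non-leap year.
        anniversary = start.replace(year=start.year + 1, day=28)
    return start, anniversary - timedelta(days=1)

window = search_window(date(2010, 2, 27))
# window == (date(2010, 2, 27), date(2011, 2, 26))
```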
Collecting Data. We used the following procedure to track disaster donations:

  1. We obtained data on epidemic outbreaks, natural disasters, terrorist attacks, and technological accidents from a variety of sources. First, we used the International Disaster Database (EM-DAT) from the Centre for Research on the Epidemiology of Disasters, which registers disasters that meet at least one of the following criteria: 10 or more people killed, 100 or more people affected, a declaration of a state of emergency, or a call for international aid. Further information is available on the EM-DAT website. Second, to overcome data inaccuracies and missingness in EM-DAT, we obtained data from the reinsurance company Swiss Re and from the Financial Tracking Service (FTS) of the United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA).
  2. We wrote automated Python code to identify disaster donations in news reports using Factiva, Google, and LexisNexis. The search range was one year from the official start date of each disaster. A story was relevant to our database if its headline or body appeared in the results of a Boolean search combining the affected country, the type of disaster, and, in some cases, the name of the disaster.
    Specifically, the Boolean combinations are as follows:
    1. The affected country.
    2. Event. Derivations of:
      1. Epidemic: “pandemic” OR “epidemic”
      2. Mass movement: “landslide” OR “avalanche” OR “rockfall” OR “subsidence”
      3. Earthquake: “seismic” OR “quake” OR “earthquake” OR “tsunami”
      4. Flood: “flood”
      5. Storm: “storm” OR “typhoon” OR “cyclone” OR “hurricane” OR “tornado”
      6. Volcano: “volcano” OR “volcanic” OR “eruption”
      7. Technological accident: “accident” OR “explosion”
      8. Terrorism: “terrorist” OR “attack”
    3. Action. Derivations of: “donation” OR “donate” OR “donated” OR “donating” OR “pledge” OR “pledged” OR “pledging” OR “give” OR “gave” OR “given” OR “giving.”
    4. Disaster name, when available.

      An example of a Boolean combination is: [03/11/2011-03/11/2012]; (“Japan” or “Japanese” or “Japan’s” or “Japans”) and (“tsunami” or “earthquake” or “quake” or “disaster”) and (“donation” or “donate” or “pledge” or “pledging” or “give” or “gave” or “given” or “giving”).
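The assembly of such Boolean strings from the country, event, and action term lists above can be sketched as follows; the helper names are illustrative, and only two event types are shown:

```python
# Illustrative sketch of how the Boolean queries described above
# could be assembled; names and structure are our assumptions.
EVENT_TERMS = {
    "earthquake": ["seismic", "quake", "earthquake", "tsunami"],
    "storm": ["storm", "typhoon", "cyclone", "hurricane", "tornado"],
    # ... remaining event types as listed above
}

ACTION_TERMS = ["donation", "donate", "donated", "donating",
                "pledge", "pledged", "pledging",
                "give", "gave", "given", "giving"]

def or_group(terms):
    """Render a list of terms as a quoted, parenthesized OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

def build_query(country_terms, event_type, disaster_name=None):
    """Combine country, event, action, and optional disaster-name clauses."""
    parts = [or_group(country_terms),
             or_group(EVENT_TERMS[event_type]),
             or_group(ACTION_TERMS)]
    if disaster_name:
        parts.append(f'"{disaster_name}"')
    return " AND ".join(parts)

query = build_query(["Japan", "Japanese", "Japan's"], "earthquake")
```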

  3. To make the more than 2,310,000 electronic reports computationally tractable, we applied differential language analysis, using JavaScript Object Notation (JSON) and AJAX to parse the data. We coded the following fields for each article:

    1. Actor: Entity making the donation.
    2. Actual donation.
      1. For in-kind donations, we recorded the characteristics of the product or service (e.g., 1,000 bottles of water; a team of nine technicians) and monetized them using either current prices in the affected country (e.g., the average price of one liter of bottled water, the daily wage of a specific professional or technician) or an equivalent pecuniary value based on other firms’ reporting of their donations to the same disaster.
      2. Donations reported in a currency other than the U.S. dollar were converted using the exchange rate on the day of the donation.
    3. Employee-driven donation: A binary variable took the value 1 when the news article mentioned that the donation was an employee initiative (for example, the company matching whatever the employees collected).
    4. Direct impact: A binary variable took the value 1 when the news article mentioned that the disaster physically affected the organization (e.g., corporate assets such as buildings were damaged) and/or that employees were injured.
    5. To increase the relevance of the output (for example, some news reports were digests of articles individually irrelevant to the study but whose combination caused the report to be included in the results), we qualified the search with the following filtering process:
      1. The name of the country had to be within 50 words of the type of the disaster or the word “disaster.”
      2. Entities and the act of donating were parsed:
        1. The entities in each article were extracted and grouped into three categories: organization (e.g., Tepco), location (e.g., Canada), and individual (e.g., Barack Obama).
      3. The verb identifying the act of donating had to be within 30 words of an entity.
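The proximity rules above (country name within 50 words of the disaster term; donation verb within 30 words of an entity) can be sketched as below. The helper names are illustrative, and the simple keyword matching stands in for the named-entity extraction described in the text:

```python
import re

def word_positions(text, terms):
    """Word indices at which any of the given terms occurs (case-insensitive)."""
    words = re.findall(r"[\w']+", text.lower())
    wanted = {t.lower() for t in terms}
    return [i for i, w in enumerate(words) if w in wanted]

def within(text, terms_a, terms_b, max_gap):
    """True if a term from terms_a appears within max_gap words of one from terms_b."""
    pos_a = word_positions(text, terms_a)
    pos_b = word_positions(text, terms_b)
    return any(abs(i - j) <= max_gap for i in pos_a for j in pos_b)

article = "Chile was struck by a massive earthquake and several firms pledged aid."
relevant = (within(article, ["Chile"], ["earthquake", "disaster"], 50)
            and within(article, ["pledged", "donated", "gave"], ["firms"], 30))
```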
Assessing the Quality of Data. We used the following procedure to check the accuracy of our collection:

  1. We hired independent researchers to conduct two different procedures to verify the quality of the dataset against third-party sources such as company sustainability reports. We randomly selected five percent of the events (156) for the period 2003-2013, and the researchers searched reports using Google, LexisNexis, and Factiva. From this procedure, 5.1 percent of the selected events (8) had data inaccuracies. About 60 percent of these errors were associated with monetizing the in-kind value of donations, with less than 8 percent of the donations incorrectly marked. The remaining discrepancies were due to missing data on the nature of the donor’s business.
  2. We ran another random draw excluding previously evaluated cases, and the researchers repeated the analysis. No other discrepancies were found.
  3. We compared our data with third-party sources:
    1. We had access to exclusive donation information for the 2010 earthquake and tsunami in Chile via the Chilean government. By comparing our database with the list of donors provided by the Chilean government, we found that our dataset comprised 68 percent of the official source; our tracking did not capture donations from small and medium-sized Chilean enterprises without multinational operations. In terms of magnitude, our dataset accounted for 92 percent of the total corporate aid for the event.
    2. We worked with staff members of the United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA) to compare our database with the Financial Tracking Service (FTS), a global database that records self-reported international humanitarian aid for different humanitarian crises. The FTS covered about seven percent of our firm donations and 65 percent of our government and NGO donations.
    3. The U.S. Chamber of Commerce Foundation maintains Disaster Corporate Aid Trackers, self-reported records of corporate responses to disasters that focus on U.S. firms. Their data start in 2010 for selected disasters, particularly in the U.S., and account for 11 percent of our database.
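The two-round random verification draw described above can be sketched as follows; the function interface and the 3,120-event pool size (five percent of which gives the 156 events mentioned above) are illustrative assumptions:

```python
import random

def verification_sample(event_ids, fraction=0.05, exclude=(), seed=None):
    """Draw a random subset of events for manual verification.

    A second round can exclude already-checked events, as in the
    procedure above. Interface and defaults are illustrative.
    """
    rng = random.Random(seed)
    pool = [e for e in event_ids if e not in set(exclude)]
    k = max(1, round(len(event_ids) * fraction))
    return rng.sample(pool, min(k, len(pool)))

events = list(range(3120))  # hypothetical event pool; 5% of 3,120 is 156
first = verification_sample(events, seed=1)
second = verification_sample(events, exclude=first, seed=2)
# The second draw never repeats events from the first.
```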

  1. We covered newspapers, trade press, magazines, newswires, press releases, TV and radio transcripts, digital video and audio clips, corporate websites and reports, institutional websites and reports, and government websites and reports, among other sources.
  2. There were spelling mistakes in some articles.
  3. For information about the method of collection of FTS data and their verification, visit the following site:
  4. These data are available at