5.4 Automation
Automation aims to reduce manual tasks and errors, speed up data processing, and improve the accuracy and timeliness of decision-making. In humanitarian settings, automation can streamline repetitive and routine workflows, allowing IM personnel to focus on value-added analysis, coordination, and problem-solving. However, automation is not a silver bullet. It must be contextually appropriate, responsibly implemented, and maintained over time.
What Processes Can Be Automated?
For programme teams, many repetitive tasks in service delivery operations can be automated to improve efficiency, accuracy, and timeliness. These typically include data ingestion, transformation, harmonization, validation, and reporting. Application Programming Interfaces (APIs), which allow different software systems to talk to each other, are a common enabler of automation. Through APIs, data can be transferred from one system to another, supporting either real-time data transfer or scheduled updates. HDX, ReliefWeb, KoboToolbox, SurveyCTO and many other platforms and tools provide APIs and detailed documentation on how to use them.
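As a minimal sketch of what calling such an API looks like, the snippet below builds a query URL for ReliefWeb's public v1 reports endpoint using only the Python standard library. The `appname` value and search term are placeholders to replace with your own; consult the ReliefWeb API documentation for the full set of parameters:

```python
"""Sketch: building a ReliefWeb API query with the standard library."""
from urllib.parse import urlencode

RELIEFWEB_BASE = "https://api.reliefweb.int/v1"

def build_reports_url(search_term: str, limit: int = 20,
                      appname: str = "my-im-team") -> str:
    # appname identifies your application to ReliefWeb (required by the API);
    # "my-im-team" is a placeholder, not a registered name.
    params = {
        "appname": appname,
        "query[value]": search_term,
        "limit": limit,
    }
    return f"{RELIEFWEB_BASE}/reports?{urlencode(params)}"

url = build_reports_url("flood Mozambique", limit=5)
print(url)

# Fetching the results is then one call away, e.g.:
# import json, urllib.request
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

The same pattern (base URL, authentication or app identifier, query parameters) applies to most of the platforms named above, which is what makes scheduled, scripted data pulls practical.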
Example - MapAction's automated data pipeline
MapAction has developed a semi-automated data pipeline to accelerate the delivery of situational awareness products in emergencies. Their system connects to various open data sources, including the Humanitarian Data Exchange (HDX) platform for operational data, Google Earth Engine (GEE) for satellite imagery, and OpenStreetMap (OSM) for additional geospatial information. The automated processing of data from these sources reduces the time required for data preparation during the first 72 hours of a crisis—a critical window for decision-making. The team integrates metadata tagging, version control, and Python scripting to ensure outputs are both rapid and reliable, illustrating how automation can enable speed and consistency without compromising on data standards.
This model showcases the potential of combining automation with human oversight—using scripts for standard tasks, and analysts for interpretation and contextualisation.
What Should Not Be Automated?
Some processes require human intelligence, contextual understanding, and ethical reasoning, which makes them unfit for automation. Such processes may include:
Needs analysis that integrates qualitative insights from communities.
Interpretation and triangulation of data from multiple, possibly conflicting, sources.
Sensitive data handling that requires nuanced judgment and consent processes.
Context-specific decisions (e.g. indicator selection, narrative construction).
Scenario-based planning and anticipatory action—AI tools can assist, but not replace judgment.
Prerequisites for Automation
Before implementing automation, ensure the necessary conditions are in place. The steps below outline how to work towards them.
Steps to Work Towards Automation
Map and document your workflows. Identify repetitive, manual steps (e.g. copy-pasting data weekly into Excel reports). (see chapter 5.1)
Start with simple tools. For many teams, automation begins in Excel or Kobo using scheduled exports, Power Query scripts, or Power Automate.
Build reusable scripts and templates. Use version control (e.g. GitHub, SharePoint), and avoid hard-coded file paths or personal accounts.
Pilot small-scale automations. Test in one country or programme before scaling.
Create fallback options. Ensure manual workarounds exist for critical processes during tool failure.
Train and upskill staff. Identify the technical competencies needed and invest in building them.
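Two of the steps above—avoiding hard-coded paths or personal accounts, and keeping a fallback for tool failure—can be sketched together in a few lines of Python. Configuration comes from environment variables rather than paths baked into the script, and a cached copy of the last successful download serves as the fallback when the live source is unreachable. The names `DATA_DIR` and `SOURCE_URL` and the cache layout are illustrative assumptions, not a prescribed standard:

```python
"""Sketch: a reusable ingestion step with no hard-coded paths
and a cached fallback for when the live source fails."""
import json
import os
import urllib.request
from pathlib import Path

# Configuration comes from the environment, not from personal paths,
# so the same script runs unchanged on a teammate's machine or a server.
DATA_DIR = Path(os.environ.get("DATA_DIR", "data"))
SOURCE_URL = os.environ.get("SOURCE_URL", "https://example.org/indicators.json")

def fetch_with_fallback(url: str, cache_file: Path):
    """Try the live source first; on failure, reuse the last good download."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
        cache_file.parent.mkdir(parents=True, exist_ok=True)
        cache_file.write_text(json.dumps(data))  # refresh the cache
        return data
    except OSError:
        if cache_file.exists():
            return json.loads(cache_file.read_text())  # cached fallback
        raise  # no fallback available: fail loudly rather than silently

# Usage (once SOURCE_URL and DATA_DIR point at real resources):
# records = fetch_with_fallback(SOURCE_URL, DATA_DIR / "indicators.json")
```

Keeping such scripts in version control (as the steps above recommend) means the whole team can review, reuse, and roll back the automation rather than depending on one person's machine.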
References & Further Readings
510 (n.d.). The Netherlands Red Cross' 510 - GitHub Repository.
MapAction (2024). Accelerating Humanitarian Response: Inside MapAction’s Automated Data Pipeline.
OCHA (n.d.). Humanitarian Data Exchange - Resources for Developers.
ReliefWeb (n.d.). All About the ReliefWeb API.
IFRC (n.d.). Data Science - GitHub Repository.
IFRC (2023). Data Playbook Toolkit - Data Science and Emerging Technologies.