Across consultations with field teams from Country Offices across different organizations and beyond, a clear message has emerged: Information Management (IM) faces a range of persistent, practical challenges on the ground. These challenges are not minor technical issues; they often reflect deeper structural gaps, institutional ambiguity, and resource constraints. Understanding these realities is essential for designing IM systems that are not only theoretically sound but also practically feasible and responsive to frontline needs.
This section synthesizes field-based insights into five key categories of recurring IM challenges, reflecting what staff encounter every day when trying to implement responsible, consistent, and actionable data practices.
Structural and Role-Definition Challenges
A foundational challenge raised by many COs is the lack of clarity around what IM roles are supposed to do — and where they sit in the organizational structure. In many cases, there are no formal job descriptions for IM-specific roles, or existing ones are vague and inconsistent across country operations. Supervision lines are also unclear, with IM officers sometimes reporting through Programme teams, sometimes embedded within technical sectors, and occasionally operating as “floating” technical roles.
This ambiguity creates a knock-on effect: IM staff struggle to prioritize, programme teams don’t know who to go to for what, and IM work becomes reactive rather than strategic. The absence of global guidance on standard IM profiles or functional structures only deepens this fragmentation.
“We have someone doing IM, but no one is sure what her responsibilities actually include. So she ends up firefighting whatever is thrown at her.” – CO Programme Manager
Key Learnings: Defining roles clearly, aligning them with programme and MEAL structures, and institutionalizing IM responsibilities into team structures helped clarify expectations and reduce overlap. Having shared job aids and clear reporting lines improved coordination and sustainability of IM functions.
Guidance and Decision-Making Constraints
Even where IM roles exist, staff often report a lack of guidance on how IM should function day to day, especially in relation to programme design and decision-making. In some offices, responsibilities are blurred between IM and MEAL teams, leading to duplication, gaps, or misunderstandings. IM Officers (IMOs) are frequently consulted after decisions have been made, or expected to “clean up” flawed data systems retroactively.
One recurring frustration is the limited influence IM personnel have over upstream decisions. Despite their role in collecting and interpreting programme data, IMOs are rarely positioned to contribute meaningfully to strategic planning or targeting. In addition, organizational bureaucracy — including rigid approval processes — limits their ability to adapt tools or respond creatively to emerging needs.
Deduplication also remains a common pain point. Without integrated systems or shared identifiers, IMOs often resort to manual workarounds that are time-consuming and error-prone.
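To illustrate the kind of manual workaround IMOs describe, the sketch below shows one minimal way to flag possible duplicate records by normalizing names and dates of birth before comparison. The file name (registrations.csv) and column names (name, date_of_birth, record_id) are illustrative assumptions for this example, not taken from any specific office's system.

    import csv
    import unicodedata
    from collections import defaultdict

    def normalize(value: str) -> str:
        """Lower-case, strip accents and extra spaces so 'Amína ' matches 'amina'."""
        text = unicodedata.normalize("NFKD", value or "")
        text = "".join(ch for ch in text if not unicodedata.combining(ch))
        return " ".join(text.lower().split())

    def find_possible_duplicates(rows, keys=("name", "date_of_birth")):
        """Group records sharing the same normalized key fields; return groups of size > 1."""
        groups = defaultdict(list)
        for row in rows:
            key = tuple(normalize(row.get(k, "")) for k in keys)
            groups[key].append(row)
        return [records for records in groups.values() if len(records) > 1]

    if __name__ == "__main__":
        # Hypothetical input file and columns; adapt to the office's actual dataset.
        with open("registrations.csv", newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        for group in find_possible_duplicates(rows):
            print("Possible duplicate records:", [r.get("record_id") for r in group])

Even a basic routine like this only flags candidates for human review; without shared identifiers across systems, the final deduplication decision still rests with staff.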
Key Learnings: Mapping data workflows and jointly developing SOPs helped establish clarity and trust across teams. Embedding IM in review meetings and planning discussions increased the visibility of IM insights. Encouraging collaboration between IM, MEAL, and Programme teams improved overall data use in decision-making.
Digital Tools and System Limitations
Technology should enable IM — but in many field contexts, it becomes a bottleneck. Some COs lack access to standardized or advanced digital tools, while others face tool overload, with different units using different platforms for similar purposes. There is often little guidance on which tools are fit for which purpose or stage of the IM data lifecycle.
Where central platforms do exist, their adoption may be limited due to connectivity issues, lack of training, or misalignment with programme realities. As a result, teams fall back on Excel or paper-based systems, even when those are inefficient or insecure.
Key Learnings: Streamlining the number of tools in use, providing guidance on tool selection, and creating simple, standardized templates helped reduce confusion. Where advanced systems weren’t feasible, well-structured Excel tools proved effective and sustainable when combined with training and documentation.
Assessing the Quality of Data and Information
Several COs noted that even when data is collected on time and in full, questions remain about its reliability. Programme staff, particularly those without a data background, struggle to assess data quality or interpret survey results. There is often no simple checklist or toolkit for reviewing datasets before using them to inform implementation decisions, adjust activities, or coordinate with partners.
This gap leads to downstream issues, where flawed data is circulated without validation, and decisions are made on shaky foundations.
“There’s no quick way for me to know if the data we’ve been sent is good. I just use it because there’s a deadline.” – Field Manager
Key Learnings: Developing simple quality-check protocols, visual indicators for confidence levels, and basic reliability checklists enabled programme and MEAL teams to better assess the data they received. Clarifying who is responsible for data validation at each stage helped improve trust and use of information.
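As a purely illustrative sketch of how such a reliability checklist might be operationalized, the short script below reports record counts, completeness of required fields, and duplicate identifiers before a dataset is used. The file name, column names, and the 5% missing-data threshold are assumptions chosen for the example, not organizational standards.

    import csv

    # Illustrative rules; each office would define its own required fields and thresholds.
    REQUIRED_FIELDS = ["household_id", "location", "household_size"]
    MAX_MISSING_RATE = 0.05  # flag fields with more than 5% missing values

    def basic_quality_report(path: str) -> None:
        with open(path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        total = len(rows)
        print(f"Records: {total}")

        # Completeness check: how often is each required field empty?
        for field in REQUIRED_FIELDS:
            missing = sum(1 for r in rows if not (r.get(field) or "").strip())
            rate = missing / total if total else 0
            status = "OK" if rate <= MAX_MISSING_RATE else "REVIEW"
            print(f"{field}: {missing} missing ({rate:.1%}) -> {status}")

        # Duplicate check on the identifier column.
        ids = [r.get("household_id") for r in rows]
        duplicates = len(ids) - len(set(ids))
        print(f"Duplicate household_id values: {duplicates}")

    if __name__ == "__main__":
        basic_quality_report("assessment_data.csv")  # hypothetical input file

A one-page output like this gives non-specialist programme and MEAL staff a quick signal of whether a dataset needs review before it feeds into decisions.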
Transforming Data into Action
Perhaps the most complex challenge is not collecting or analyzing data — but using it. Several teams pointed to a “disconnect” between available information and actual decisions. Reports are generated, dashboards are updated, but there is little evidence that data is influencing programmatic direction.
Part of this stems from a lack of time or space to interpret data meaningfully. Part is due to trust — if data quality is poor or unclear, decision-makers may disregard it. And part is cultural — data is sometimes seen as a reporting obligation, not a tool for reflection or learning.
“We have more dashboards than we know what to do with — but they don’t seem to change what we actually do on the ground.” – Area Manager
Key Learnings: Framing data products around decision-making needs, rather than reporting templates, increased uptake. Visualization tools, simplified dashboards, and regular briefing sessions with decision-makers supported better alignment between data insights and operational choices. Establishing feedback loops between analysis and planning helped close the gap between data and action.
The challenges described in this chapter are not isolated incidents. They reflect systemic gaps in IM design, leadership, and support across the humanitarian sector. If IM is to serve its purpose — enabling data-driven, accountable, and adaptive programming — then these field realities must be acknowledged and addressed.
This requires more than new tools. It requires clarity of roles, investment in capacity, alignment between IM and decision-making structures, and a shift in how data is perceived and used.
As a starting point, organizations can:
Standardize IM job profiles and supervision structures
Involve IM staff in programme design and planning processes
Provide simple data quality tools and training for non-specialists
Clarify which tools are used when — and provide access and support
Build a culture where data is used for learning, not just compliance
By listening to field voices and acting on their insights, we can move from fragmented IM to integrated, field-responsive systems that work.