In September this year, the US magazine Government Technology published an article, ‘Hard-Won Experience: Lessons from America’s Biggest Disasters and Emergencies’, in which four technology leaders with roles in Hurricane Katrina, the Boston Bombings, and Superstorm Sandy were interviewed. As expected, their comments were experiential, insightful and useful.
Two things in particular struck me: the commonality with lessons from crisis events here in Australia, and the focus on collaboration. The article summarised their views into 12 ‘lessons learned’. By my assessment, six of those lessons, exactly half, relate directly to the need for collaboration:
- Apps with two-way communication are effective.
- During a crisis, get everyone in the same room if you can.
- Ask for help from the private sector; companies probably will respond quickly.
- Be prepared for help from the outside, including colleagues from other governments.
- Tap into the disaster recovery plans of the private sector when offered.
- Know who your vendors are before the emergency.
The reasons for, the need for, and the benefits of more, better, quicker collaboration are clear, as are the frustrations of responders and the community when it doesn’t occur. How familiar are these observations?
- The data is there somewhere, but is it readily available?
- Are changes to the data readily accessible?
- Is there a huge overhead in gathering and maintaining the data needed?
- Are there many different organisations constantly demanding the same information from the same people?
- Are all requests visible once they hit the system?
- Is the data actually out of date by the time it is collated and published?
- Does all of this mean different organisations end up with conflicting versions of the same data?
Sound all too familiar…?
So if collaboration is acknowledged as so important, why is it such a consistently recurring theme in lessons learned from disasters? Why can’t we get it right?
In my view the answer lies in data collaboration. While we understand the need to collaborate, the way we deal with data, the vital lifeblood of good decision-making, drives us back to the source of the problem: the impregnable data silos that are our organisations.
One of the lessons from the article is not explicitly about collaboration, so it didn’t make my list of six: “Brace for some chaos, it might be inevitable”. My only issue with the current paradigm is the word ‘might’: in my experience of current approaches, it IS inevitable.
The chaos means we spend so much time generating, capturing, validating, cleansing and responding to the data we own that we spend little time actually sharing it with those who need it. This is not by design; it is by necessity!
But does this have to be the case?
What if there were a way to efficiently and effectively share real-time data across multiple agencies and with the public? What if data changes appeared in your systems as they changed in those of your emergency management partners? What if you could still use best-of-breed systems for your own needs and still share data? What if this streamlined the multiple requests for your data and allowed it to be published automatically to those you have agreed can have it, perhaps with processes and rules tailored to their needs or your requirements?
All of this is possible with Data Exchange.
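To make the idea concrete, here is a minimal sketch of the data-exchange pattern described above: agencies subscribe to a shared exchange, each with its own filter rule, and any change one agency publishes is pushed automatically to every approved subscriber whose rule accepts it. All names here (`DataExchange`, `subscribe`, `publish`, the agency names) are illustrative assumptions, not a real product API.

```python
# Illustrative sketch of a publish/subscribe data exchange between agencies.
# Every name and rule below is hypothetical, for explanation only.
from typing import Any, Callable

Record = dict[str, Any]

class DataExchange:
    def __init__(self) -> None:
        # subscriber name -> (delivery callback, filter rule)
        self._subscribers: dict[
            str, tuple[Callable[[Record], None], Callable[[Record], bool]]
        ] = {}

    def subscribe(self, name: str,
                  callback: Callable[[Record], None],
                  rule: Callable[[Record], bool] = lambda r: True) -> None:
        """Register an agency; `rule` tailors which records it receives."""
        self._subscribers[name] = (callback, rule)

    def publish(self, source: str, record: Record) -> None:
        """Push a changed record to every other subscriber whose rule accepts it."""
        record = {**record, "source": source}
        for name, (callback, rule) in self._subscribers.items():
            if name != source and rule(record):
                callback(record)

# Usage: a fire agency's road-closure update reaches police automatically,
# but only records matching the rule police agreed to receive.
exchange = DataExchange()
received: list[Record] = []
exchange.subscribe("police", received.append,
                   rule=lambda r: r.get("type") == "road_closure")
exchange.publish("fire", {"type": "road_closure", "road": "Main St"})
exchange.publish("fire", {"type": "weather", "temp_c": 41})
```

The point of the sketch is the design choice, not the code: the publisher keeps using its own system and simply publishes once, while the exchange applies each recipient's agreed rules, so the same change never has to be reported separately to every organisation that asks for it.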
Technology continues to change the way that people and organisations engage. Networks and communities of interest are, more often than not, created organically around need, not strategy. This proves difficult for complex organisations. Detailed data is mined and analysed to identify trends, spot anomalies, and often predict behaviour before it happens. Applying data exchange principles to emergency management could be just the disruption needed to get collaboration right in an emergency.