There’s been an explosion in the amount of data available, and it’s growing all the time. So too are the insights it can potentially provide. In a world of increasing complexity, which frameworks can your organization implement to enhance situational awareness? What will help you see beyond the horizon?
As the amount of information and the number of available data points have surged, deriving insights has become a necessity. In the past, it was all about data coverage: bigger areas, lower latency, better resolution – resulting in more and more data points. But we are now past the point where more data equals greater clarity; we are in a world where more data actually means greater complexity.
This “illusion of knowledge” masks an inconvenient truth: there is too much data, the noise is getting louder, and we are all getting further from actual knowledge. This means we have to rethink our methods to separate the wheat from the chaff; to keep up with technology that has moved beyond the human mind’s capacity; and to develop new ways of collecting and exploiting big data.
So how can you implement this mindset to ensure the insights derived from analytics empower analysts and generate decision-ready, operational recommendations?
Take the following scenario: at the time of writing, there are at least 6,000 vessels in the Exclusive Economic Zone (EEZ) of the western United States. In the past 30 days, these ships made almost 30,000 port calls worldwide. They conducted 294 meetings and turned off their AIS transponders 728 times. How long would it take an analyst to effectively review every single vessel to identify bad actors? Where do you even start?
We now have all this information, but it’s hard to efficiently and effectively evaluate what’s important to an organization’s operations. What if all this information had been processed and prioritized in advance, so attention can start with the most likely threat and proceed down the list from there?
This is where context becomes key. It’s about finding the “so what” out of the millions of data points available. By establishing a consistent baseline of what constitutes routine behavior patterns – per area, per ship, per season – it is possible to create a clear operational profile. Creating this kind of context provides crucial insights into areas that merit additional focus and evaluation.
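The baselining idea above can be sketched in a few lines. The vessel names, event counts, and threshold below are hypothetical, and real systems would baseline many behaviors per area, per ship, and per season; this is a minimal illustration of flagging a vessel that deviates from its own routine profile.

```python
from statistics import mean, stdev

# Hypothetical history: AIS "dark" gaps per vessel per month (illustrative data).
history = {
    "VESSEL_A": [1, 0, 2, 1, 1, 0],
    "VESSEL_B": [0, 1, 0, 0, 1, 1],
    "VESSEL_C": [1, 1, 0, 2, 1, 0],
}
# Hypothetical counts for the current month.
current = {"VESSEL_A": 1, "VESSEL_B": 9, "VESSEL_C": 2}

def flag_anomalies(history, current, z_threshold=3.0):
    """Flag vessels whose current behavior deviates sharply from their own baseline."""
    flagged = []
    for vessel, past in history.items():
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            # Zero-variance baseline: any change at all is a deviation.
            z = float("inf") if current[vessel] != mu else 0.0
        else:
            z = (current[vessel] - mu) / sigma
        if z >= z_threshold:
            flagged.append(vessel)
    return flagged

print(flag_anomalies(history, current))  # ['VESSEL_B']
```

Only the vessel whose current month breaks with its own routine is surfaced; the rest never reach the analyst’s queue.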
To approach this new world of data overload, digital leaders in navies, law enforcement, and border protection agencies worldwide are looking at solutions, such as Artificial Intelligence (AI), to help analysts generate clarity from complexity.
Take the 6,000 vessels above: reviewing them manually is almost impossible. This is where machine assistance comes into play. It enables analysts to focus their efforts on the vessels with the highest probability of involvement in illicit activities – such as conducting ship-to-ship operations at night while their AIS is switched off. Automation reduces the time spent on data gathering and manual reviews. However, to surface such insights it must be trained – just like an analyst. And as with all training, it’s essential to have the right approach and methodology.
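One simple way to realize this prioritization is a weighted risk score over observed behaviors, so the review queue starts with the most suspicious vessel. The vessel records, behavior fields, and weights below are all hypothetical placeholders; a production system would learn weights from labeled cases rather than hand-pick them.

```python
# Hypothetical per-vessel event counts from the last 30 days.
vessels = [
    {"id": "V1", "ship_to_ship": 0, "night_meetings": 0, "ais_gaps": 1},
    {"id": "V2", "ship_to_ship": 3, "night_meetings": 2, "ais_gaps": 5},
    {"id": "V3", "ship_to_ship": 1, "night_meetings": 0, "ais_gaps": 0},
]

# Illustrative weights: behaviors that co-occur with illicit activity score higher.
WEIGHTS = {"ship_to_ship": 3.0, "night_meetings": 4.0, "ais_gaps": 2.0}

def risk_score(vessel):
    """Weighted sum of risk-relevant behavior counts."""
    return sum(WEIGHTS[field] * vessel[field] for field in WEIGHTS)

def prioritise(vessels):
    """Return vessels ordered from highest to lowest risk, for analyst review."""
    return sorted(vessels, key=risk_score, reverse=True)

for v in prioritise(vessels):
    print(v["id"], risk_score(v))
```

The analyst now starts at the top of a ranked list instead of scanning 6,000 tracks in arbitrary order.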
To fully unlock the potential of data – through analytics to insights – there are three key components to consider when designing such decision-support systems: Consistency, Capacity, and Collaboration.
When aggregating data to build context over time, a new challenge emerges: mistakes also get aggregated. Analysts are trained to cope with uncertainty and deception and to factor them into their workings. When monitoring a small region in real time, this issue is humanly manageable. When processing data in bulk – which can mean billions of data points every day – the process of data cleaning and verification has to be automated. But this also means mistakes can happen: one vessel could be misidentified as another; it might be detected visiting a port it never called at; and if a vessel changes flag, all of the historical data associated with it might be misplaced.
Over time this means the accuracy of the data will keep degrading; subsequent decisions may be compromised. This is why consistency in data quality must be a priority when designing decision-support systems.
There are many ways to increase consistency, such as creating redundancy of sources, developing protocols for cleaning data, and reviewing fusion decisions to retrain models. What all of these approaches have in common is that they aim to ensure every entity – ships and companies alike – has all available and relevant data assigned to it, and nothing that belongs to another.
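Redundancy of sources only pays off if disagreements between them are caught and routed to review. The sketch below cross-checks two hypothetical feeds keyed by IMO number and reports fields that conflict; the source names, identifiers, and fields are all assumptions for illustration.

```python
# Hypothetical vessel records from two independent sources, keyed by IMO number.
source_a = {
    "9300001": {"name": "ORION", "flag": "PA"},
    "9300002": {"name": "TAURUS", "flag": "LR"},
}
source_b = {
    "9300001": {"name": "ORION", "flag": "PA"},
    "9300002": {"name": "TAURUS", "flag": "MH"},  # flag disagrees with source_a
}

def cross_check(a, b):
    """Compare redundant sources; return conflicting fields per entity for review."""
    conflicts = {}
    for imo in a.keys() & b.keys():
        diffs = sorted(f for f in a[imo] if a[imo][f] != b[imo].get(f))
        if diffs:
            conflicts[imo] = diffs
    return conflicts

print(cross_check(source_a, source_b))  # {'9300002': ['flag']}
```

Each flagged conflict becomes a reviewable fusion decision, which is exactly the feedback loop needed to retrain the models mentioned above.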
Stable foundations will enable those at the forefront of data and insights to create solutions that are not only fit for purpose today, but ready for the next wave of data. Together with technological advances, this will open new ways of enhancing analytical capabilities.
The second part of this series will dive deeper into the next two tenets of creating an adaptable and agile approach: Capacity and Collaboration. This will provide an opportunity to further investigate what is needed for your organization to be at the forefront of not only data, but also finding the insights and the “so what” that matters to you.