BLUF – Predictive intelligence is a small, resource-expensive analysis component – highly sought after by leadership and decision-makers who often lack understanding of the ask. This post presents an initial assessment of predictive analysis requirements and considerations as a precursor to programme development.
Once upon a lifetime ago, I worked as an intelligence analyst in the liminal space we considered cyber. At the time, a small group of us tried to sort out the state of the possible for nation-states using cyber in conjunction with, or in lieu of, kinetic conflict. Trying to gain understanding from their doctrine and observed espionage activities, we had a Hanlon's Razor-esque debate: were they taking data with purpose, or was the purpose to take data?
In one case, operations, mainly humanitarian, were attached to the espionage victim, and unlike many of the scraping exfiltrations observed, this exfiltration was nearly surgical – with some of the data having direct implications for first-line operational support. This hadn't been seen before and genuinely changed the community's common assessment, as until then we had been unable to show purpose behind the data theft. What followed was three years and many rabbit holes with little to show for it – largely on the premise of predictive analysis.
The majority of intelligence is drawing conclusions from observed evidence: you see what's there and make educated inferences as to what x is, what x is for, why x is there, whom x belongs to, and so on. Follow-up and further collection are grounded in because – because we saw this, we want to follow where it goes.
Predictive analysis is – at best – a small subset of intelligence analysis. Unlike traditional intelligence, predictive intelligence looks for indicators and warnings (I&W), requiring analysts to take logical leaps based on assessments of possible markers indicating a likely decision or event. To do this, analysts must understand the options, choices, influences, and dynamics at play, and set a collection of markers which may (or may not) be sufficient to act upon independently. These are not evidence, but precursors meant to trigger responses – similar to sounding the alarm to save the dike when a trickle of water is seen in the wall. A Minority Report.
Wait, this sounds like assessing probability. Of course it does, as probability is very similar to predictive I&W, with one key difference. In probability assessment, the sum total of the indicators observed creates an averaged assessment for action (often taking time to collect and collate). In prediction, it may take only one instance to trigger a response. I'm not saying you are looking for the holy grail, but if you see someone has silver bullets, do you really need to see the weapon, the moon-phase listing, and the thermos to figure they are concerned with, and preparing for, something along the lines of werewolves?
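The distinction can be sketched in a few lines of code. This is a hypothetical illustration only – the function names, scores, and threshold are invented here, not drawn from any real system – but it shows the shape of the difference: one mode averages accumulated evidence against a bar, the other fires on a single high-signal marker.

```python
def probabilistic_assessment(indicators, threshold=0.7):
    """Aggregate evidence over time; act only when the average crosses a bar."""
    if not indicators:
        return False
    confidence = sum(indicators.values()) / len(indicators)
    return confidence >= threshold

def predictive_trigger(observed, high_signal_markers):
    """I&W mode: one high-signal marker alone is enough to sound the alarm."""
    return any(marker in observed for marker in high_signal_markers)

# The werewolf example: silver bullets alone fire the predictive trigger,
# even while the averaged picture remains below the probabilistic bar.
observations = {"silver_bullets": 0.9, "moon_phase_listing": 0.0, "thermos": 0.0}
print(probabilistic_assessment(observations))
print(predictive_trigger(observations, {"silver_bullets"}))
```

The design point is that the predictive path deliberately ignores the averaging step: the marker was pre-agreed as sufficient on its own, so no collation delay is paid.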
Creating predictive analysis requires a lot of effort, especially if you are looking from a threat perspective. Understanding where an adversary sits from a regional/operational perspective is key to mapping out their potential courses of action (COAs). This requires not just one's own expertise but inputs from a variety of experts – finding everyone's value, tracing out the paths, and identifying which markers to look for to understand what path is being taken. Complexity doesn't begin to describe what should be built in this mapping process, and here we have an excellent case of Cassandra's Paradox – when you don't know what you should have, but you sense there is more than you've found.
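A minimal sketch of the shape of that mapping exercise follows. The COAs and markers here are invented for illustration – a real map would be far larger and built from the many expert inputs described above – but the structure is the point: each path an adversary might take is traced to the markers that would signal it.

```python
# Illustrative only: invented COAs and markers, not a real adversary model.
courses_of_action = {
    "pre-position_for_disruption": ["staging_infrastructure", "ot_network_recon"],
    "sustained_espionage": ["credential_harvesting", "broad_scraping_exfil"],
    "surgical_collection": ["targeted_exfil_of_operational_data"],
}

def likely_coas(observed_markers, coa_map):
    """Return the courses of action consistent with the markers seen so far."""
    return [coa for coa, markers in coa_map.items()
            if any(m in observed_markers for m in markers)]

print(likely_coas({"targeted_exfil_of_operational_data"}, courses_of_action))
```

Even a toy version makes the complexity visible: markers overlap across paths, and the map is only as good as the experts who traced it.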
The contrary effort works from your known assets/risks/controls – a threat-agnostic approach. More akin to resiliency, you set the flags on intelligence-related critical-failure lynchpins. If you know operations will go down if x were taken from you, what indicators or warnings could you have for x? A threat-agnostic approach gives you actions to take and reduces your resource expenditure, but potentially requires you to expand your cyber ecosystem – as understanding external business continuity becomes need-to-know, not nice-to-know, and here we are talking about relationships between organisations, not maturity assessments.
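The threat-agnostic version can be sketched the same way, hanging the flags off what you cannot afford to lose rather than off who might take it. The asset names, warnings, and responses below are hypothetical examples, not recommendations for any particular organisation.

```python
# Hypothetical: flags keyed to lynchpin assets, each with pre-agreed responses.
critical_assets = {
    "customer_master_data": {
        "warnings": ["bulk_read_outside_hours", "new_export_job"],
        "response": "freeze exports and page the duty officer",
    },
    "build_signing_key": {
        "warnings": ["key_accessed_from_new_host"],
        "response": "revoke key and halt releases",
    },
}

def triggered_responses(events, assets):
    """Map observed events to the pre-agreed responses for each lynchpin asset."""
    return [spec["response"]
            for spec in assets.values()
            if any(w in events for w in spec["warnings"])]

print(triggered_responses({"new_export_job"}, critical_assets))
```

Note what this buys you: the response is decided before the event, which is exactly the resource saving over maintaining a full adversary COA map.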
Whilst we’ve only touched on some considerations in this post, there is much more to cover. Further posts can explore where to go from here. After all, we start somewhere.