
BLUF
The distortions created by cyber incidents, and the challenge of handling them, are amplified once liability enters the picture. However, a death – an irreversible and undesirable impact – shows we need to sort out who is accountable for what.
Background
For those who haven’t seen it yet, we have our first confirmed ransomware-related death. Essentially, the hospital was experiencing a ransomware incident that impacted its monitoring tools. A baby was born with a birth trauma that would normally have been preventable had those tools been available. The child died in intensive care months later. The family is understandably upset, warranting the lawsuit.
The question is: in complex situations such as this, who makes the call? The doctors may be well versed in care, but few are qualified to understand ransomware incident response. The cybersecurity teams can report a service’s status and approximate the delays, whilst having no idea of the relative need for those services or the circumstances in which their loss becomes critical. Executives and boards in healthcare must ask the question: when do you turn away those seeking care? Caregivers may be able to operate without the tools, but at what point should they defer to another facility, and how can they know without the tools they rely upon? In any operational decision made during cyber incident response, who makes the decision, who is responsible for what, and who is accountable for the outcomes, especially the undesirable and irreversible ones?
These decisions are not to be made lightly. Risk considerations from multiple perspectives need to be weighed, and they are often difficult to consider adequately prior to an actual event. Understanding the risk imposed on provisioning care (or not) is difficult enough when isolated to the healthcare facility or a practice within it, but the service that the facility offers the region’s inhabitants – particularly in emergencies, where time is of the essence – may not be one that another facility can absorb.
The circumstantial complexity creates a new set of enquiries when things go awry. Who is accountable, and for what part? A liability dilemma of this proportion may well make even malpractice attorneys give up. The doctor did not provide adequate care, but they could not account for the missing tools that they rely on the hospital to provision. The hospital was dealing with a cyber incident, but was not necessarily thinking about how to keep care unaffected. The facility’s cyber security and incident responders were focused on the task at hand, but should they take ownership of the mess and responsibility for the service loss incurred? Should leadership be held accountable for the relative size of the cyber security budget (which will now compete with the necessary defensive legal fees)? Is the person who clicked on the phishing lure responsible?
To an extent, everyone is partially responsible. But for purposes of liability, it will be hard to pin down who pays.
Determining operational impact and the way forward is a challenge for any organisation confronted with a cyber incident. However, when looking at accountability issues stemming from cyber incidents, some sectors carry greater impact than others. Recently, the Colonial Pipeline incident disrupted petrol distribution across the US east coast. Though the outage was temporary, dramatic cost increases and shortages affecting consumers and logistics created second- and third-order effects that potentially caused far more damage. Crisis tides aside, we have difficulty determining accountability within an organisation, and sometimes nothing short of a regulator can assign ownership of a problem. This lawsuit opens things up for change.
In Colonial Pipeline’s case, the concern was not the plant or operational floors being compromised, but the administrative functions. The exploitation of front office operations certainly creates a different sort of problem – just ask the City of Atlanta’s courts. As a question of the organisation’s operational resilience, how does an organisation continue provisioning goods and services if operations are fine, but billing is not? As any ICS/SCADA security expert will tell you, we spend a great deal protecting the infrastructure, only to be brought down by the comptroller.
The responsibility for cyber oversight and for establishing accountability rests with the boards of organisations. Cyber risk should be on every organisation’s risk register and an active agenda item for the board.
Boards could ask several questions about cyber risk management: What can the organisation not live without? What is the likely impact of a cyber incident? Is the impact limited to a portion of the organisation? What is the impact on the region/sector/clients/suppliers if the organisation is taken offline? What interdependencies (internal and external) will be affected? What is the direct loss? What are the indirect and second- and third-order losses?
In peacetime – that is, when not fixing the acute fallout from a cyber attack – boards should also consider: What is the risk culture in the organisation? What is the level of cyber awareness in the organisation? How does anyone report a cyber incident? What is the response time to such a report? If all communication systems go down, how do the accountable business leaders communicate with the board about it? What is the business continuity plan in case it all goes down and operations have to be shut down to contain damage? How fast can we be back up and running?
The distortions created by cyber incidents, and the challenge of handling them, are amplified once liability enters the picture. However, a death – an irreversible and undesirable impact – shows we need to sort out who is accountable for what. I’m sorry it came to this.
-scl