Digital Resilience & Recovery
What all of our examples have demonstrated is that the breakneck speed with which we are digitising our analogue world is precipitating both extraordinary opportunities and extraordinary threats. The efficiencies and services which arise from converting all types of information into bits and bytes for transfer over the ether are revolutionary. But what our examples have also shown is that our almost total dependence on digital systems leaves us very vulnerable. Vulnerable to whatever lies within the billions of lines of code that make our automated and/or AI systems work – or not work.
Buried in the Code (AI generated image)
Vulnerable to those with a different view of the world who exploit our freedoms while repressing those of their own citizens. Vulnerable to those actors within our digital economies who exploit their dominant positions to the ultimate detriment of their competitors and customers/consumers. Vulnerable, most of all, to ourselves – who, in our uncritical embrace of, acceptance of, and growing dependence upon this new world, seem unable to get off the digital treadmill set running by others. Others who are literally rewiring our world, our values, and our ways of thinking and doing so that we can graze upon an apparently infinite buffet of the ‘bespoke realities' referred to on page 7 (Rise of the Influencer – Death of Typography?). A digital treadmill with no speed control or off-switch available. Unless it is the off-switch of our own mind, but even that has consequences – consequences explored on page 10 with its theme of Digital Denial.
Digital Treadmill (AI generated image)
Much of the power of our digital revolution arises because much of the processing and storage takes place in the so-called ‘cloud', with our various personal or business devices acting as conduits for the transfer or receipt of data. For example, generative AI services like ChatGPT or Google's Gemini would quickly overwhelm the resources of even a top-level domestic device, e.g. a smartphone or desktop computer, and so what we see as an output on our device has been processed on banks of computer processors housed in some distant data centre, consuming vast quantities of electrical power and requiring extensive cooling and air conditioning. The image below may be visual hyperbole, but it nevertheless conveys the resource-consumption reality of what generating this very image in ‘the cloud' actually represents.
Hot Clouds (AI generated image)
The assertion underpinning the ‘cloud' industry is that data stored remotely from the point of use is always more secure since, theoretically, it is immune from whatever disaster may befall the local site, e.g. fire, flood, or theft. To that end, vast quantities of corporate, government, and agency data are now hosted on cloud sites. For the illustrative purposes of this post we will use the UK NHS as an example.
The British NHS has embraced a ‘cloud first’ policy, encouraging the use of cloud computing services for data storage and processing. Data is primarily hosted within the UK. However, hosting within the European Economic Area (EEA) or countries deemed adequate by the UK government is permissible, provided appropriate safeguards are in place. Certain NHS data is also stored in local data centers managed directly by NHS entities. This approach offers direct control over data and is often used for systems requiring low latency or specific security considerations. The NHS employs hybrid models, combining on-premises and cloud storage solutions to balance control, performance, and scalability. (Source: NHS Digital).
On the surface the NHS should be a reference model of coherent data utilisation and security. In reality the NHS has a very decentralised structure with a multiplicity of legacy information systems using outdated technology and varying data formats locked away in custom data silos, which makes it difficult or impossible for such data to interact with modern systems. A veritable digital Tower of Babel, ever aspiring to reach NHS heaven but forever confounded by the mutually confusing tongues that God, as described in the Book of Genesis, made the occupants of each part of the tower speak.
Digital Tower of Babel (AI generated image)
As our WannaCry malware example illustrated earlier in this essay, resilience was non-existent in many cases, so storing data in the ‘cloud' is by itself not enough. Nevertheless, the British NHS has vast stores of paper-based records currently being digitised, and these, once turned to bits and bytes, will take their place in one or more data centres sited wherever. Some see this digitised NHS data bank as a potential treasure trove of incredible value to academic researchers and medical companies. The benefits derived from access to this data bank include: precision medicine that tailors treatments to the individual; public health research and disease prevention/management; prediction of demand and resources required, e.g. operations; and remote monitoring and telemedicine.
But unless the NHS system becomes genuinely ‘a system', as opposed to what currently appears to be a broken data mosaic (see image below), then current debates about the ethics, security, and confidentiality issues related to this patient data will be rendered moot if it is simply too difficult and expensive to access it.
A 2019 survey by the Institute of Global Health Innovation (IGHI) at Imperial College London identified at least 21 different EPR systems in use across NHS trusts, with limited interoperability between them, potentially compromising patient safety. UK politicians have periodically announced targets for fully interoperable digital records, e.g. by 2024. At the time of writing, however, NHS trusts and organisations continue to use different Electronic Health Record (EHR) systems, such as Cerner, Epic, SystmOne, and EMIS.
Broken Data Mosaic (AI generated image)
And so, as the former Conservative government's 2024 deadline for EPR reform passed, the mantle has now passed to the new UK Labour government, which on 21st October 2024 announced the following:
In transforming the NHS from analogue to digital, the government will create a more modern NHS by bringing together a single patient record, summarising patient health information, test results, and letters in one place, through the NHS App. It will put patients in control of their own medical history, meaning they do not have to repeat it at every appointment, and that staff have the full picture of patients' health. New laws are set to be introduced to make NHS patient health records available across all NHS trusts, GP surgeries and ambulance services in England – speeding up patient care, reducing repeat medical tests and minimising medication errors. (Government issues rallying cry to the nation to help fix NHS, UK Government press release, 21 Oct 2024)
While we wait to see whether this iteration of political plans survives engagement with on-the-ground reality, the fact is that current EPR systems often do not integrate seamlessly with one another. We end up, therefore, in a situation similar to our earlier Air Traffic Control failure example, where data has to be converted to an agreed format so that different systems can use it, e.g. the Fast Healthcare Interoperability Resources (FHIR) standard. We should also note that commercial contracts are involved, and so a health provider may well find itself in contract to a vendor whose commercial interest lies in locking in the client rather than enabling an easy migration to competitor systems.
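To make the interoperability point a little more concrete, the short Python sketch below shows how a record held in one hypothetical legacy format might be mapped into a minimal FHIR-style Patient resource. The legacy layout, field values, and mapping function are invented for illustration; the identifier system URI shown is the one commonly used for NHS numbers in UK FHIR profiles, but the whole thing should be read as a sketch rather than a production mapping.

```python
import json

# Hypothetical legacy record as it might sit in one trust's data silo.
legacy_record = {
    "surname": "SMITH",
    "forename": "Jane",
    "dob": "1980-04-12",
    "nhs_no": "0000000000",   # dummy value for illustration only
}

def to_fhir_patient(rec: dict) -> dict:
    """Map a hypothetical legacy record into a minimal FHIR-style Patient resource."""
    return {
        "resourceType": "Patient",
        "identifier": [{
            "system": "https://fhir.nhs.uk/Id/nhs-number",
            "value": rec["nhs_no"],
        }],
        "name": [{
            "family": rec["surname"].title(),
            "given": [rec["forename"]],
        }],
        "birthDate": rec["dob"],
    }

print(json.dumps(to_fhir_patient(legacy_record), indent=2))
```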
But all of this discussion about ‘clouds', data formats, Electronic Patient Records, and concepts like interoperability standards leads to an interesting reflection point: who is the ‘owner' of this data, and where should it actually be stored? Why should everyone not have a local copy of their own Electronic Patient Record (EPR) but – unlike the 2024 Labour government announcement – provided, say, on a memory stick or data card rather than just via the NHS App? Would this local copy not provide a degree of resilience, because it would be in the patient's own interest to look after it?
Whose data? (AI generated image)
There are certainly some benefits to data records accompanying the person. Patients would have immediate access to their medical history and could share it with a health professional when required, ensuring continuity of care and making it easier to move between providers or seek second opinions. Having the complete record always to hand could also reduce unnecessary repeat tests and procedures.
But on the debit side a patientβs local copy could be lost, stolen, or accessed by unauthorized individuals, exposing sensitive health information. Consequently, strong encryption and secure storage mechanisms on personal devices would be critical. Such a hurdle could be overcome but again there would need to be just one agreed EPR data format. More significantly, such a local record assumes that all patients have the necessary technology, e.g. smartphones, tablets, or computers, and the technical knowledge to manage their own records securely. A local copy would need to periodically synchronised with healthcare providers’ systems to ensure the data it contained was updated to reflect changes in the patientβs health, treatments, and test results. The process for this synchronisation would need a lot of thought, e.g. as part of the consultation or treatment event. The robustness of the local device would be paramount so that it could survive extreme environmental factors, e.g. heat, submersion, being dropped. Some health professionals are likely to have concerns that without proper context and explanations patients may misinterpret complex medical data, leading to confusion, anxiety, or incorrect self-diagnosis through ‘doomscrolling’ on the internet. In some parts of the UK, however, patients already have access to their summary records online and so perhaps we just have to accept the trade-off and learn to mitigate any doomscrolling consequences by making it part of both digital media-literacy and health education.
Doomscrolling a Diagnosis (AI generated image)
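On the encryption point raised above, the sketch below shows, in outline only, how a local copy of a record might be protected with a symmetric key using Python's widely used cryptography library (Fernet). The record contents and key handling here are stand-ins; in practice, key management and synchronisation with the provider's systems are the genuinely hard parts.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key. In practice this would be derived from something
# the patient controls (e.g. a passphrase) and stored securely, not printed.
key = Fernet.generate_key()
cipher = Fernet(key)

# A stand-in for the patient's local Electronic Patient Record.
record = b'{"resourceType": "Patient", "birthDate": "1980-04-12"}'

token = cipher.encrypt(record)      # what would sit on the memory stick or card
restored = cipher.decrypt(token)    # what the clinician's system would read back

assert restored == record
print("Encrypted record begins:", token[:40], b"...")
```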
Another confounding issue is the legal position, because the UK General Data Protection Regulation (UK GDPR) is a stringent law that governs how personal data is processed for individuals in the UK. Giving patients full local control might complicate compliance and raise questions of data ownership and responsibility for any breaches that could arise, e.g. could a patient be held responsible for their own data breach?
Nevertheless, despite the challenges, and as per the UK government announcement highlighted earlier, a hybrid of ‘cloud' and ‘local' is likely to emerge from the use of technologies such as digital health wallets or apps like the NHS App. Eventually, even the use of decentralised systems, like the blockchain from the cryptocurrency world (see later), could enable secure, patient-controlled records while ensuring data integrity and accessibility. Such hybrids will allow patients to store and share at least selected parts of their medical records while maintaining security and data integrity. Meanwhile, concerns related to security, updates, and usability are limiting the wide implementation of such patient-held Electronic Patient Records.
There is a memorable aphorism (which is also a trademarked academic initiative) that applies equally to data stored in the cloud or on local devices, and to the biggest corporation, a government department, or someone with a data collection of some sort at home, i.e.
Lots of Copies Keeps Stuff Safe (LOCKSS)
Alternatively, there is the equally memorable:
No Clone Should be Alone
LOCKSS as a foundation principle for establishing digital resilience just makes sense: for any important data, create multiple copies and store them in geographically distributed locations. This approach protects against data loss due to technical failures, natural disasters, or other unforeseen events.
Lots of Copies Keeps Stuff Safe and No Clone Should be Alone (AI generated image)
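The principle, as distinct from the Stanford software described below, can be illustrated in a few lines of Python: copy a file to several stores, standing in for geographically separate locations, and keep a checksum so each copy can later be verified. The paths and file name are hypothetical, and the sketch assumes the source file exists.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(source: Path, destinations: list[Path]) -> str:
    """Copy `source` to every destination and return its checksum."""
    digest = sha256_of(source)
    for dest in destinations:
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, dest / source.name)
    return digest

# Hypothetical locations standing in for geographically separate stores.
stores = [Path("backup/site_a"), Path("backup/site_b"), Path("backup/site_c")]
checksum = replicate(Path("important_records.zip"), stores)
print("All copies should hash to:", checksum)
```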
Beyond being a memorable aphorism, however, LOCKSS is also a specific digital preservation initiative, using open-source software developed at Stanford University. The goal of the LOCKSS initiative is the long-term preservation of digital content, particularly scholarly materials, e.g. e-journals, electronic theses, government documents, and other scholarly resources. LOCKSS relies on a network of decentralised storage systems called LOCKSS boxes (nodes), in which multiple participating institutions maintain their own copies of the digital content. This ensures that even if one or more copies are lost or corrupted, the remaining copies can be used to recover the original data. LOCKSS has self-healing capabilities because it uses a peer-to-peer polling and repair protocol to detect and correct any corrupted or missing data. When discrepancies are found among copies, the system identifies the most widely agreed-upon version and repairs the corrupted ones. To do this the LOCKSS network continuously audits the stored content: nodes periodically exchange information to verify that their copies match. LOCKSS participants include academic libraries, government archives, and cultural heritage organisations and agencies.
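The polling-and-repair idea can be sketched as follows: each node votes with a hash of its copy, the most widely agreed-upon hash is treated as authoritative, and any dissenting node replaces its copy with the agreed version. The toy model below illustrates that voting concept only; it is not the actual LOCKSS protocol, and the node names and content are invented.

```python
import hashlib
from collections import Counter

def digest(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

# Toy network: three nodes hold copies; one copy has silently corrupted.
nodes = {
    "library_a": b"Journal of Examples, Vol. 1",
    "library_b": b"Journal of Examples, Vol. 1",
    "library_c": b"Journal of Exomples, Vol. 1",   # bit-rot
}

# Poll: each node votes with the hash of its copy.
votes = Counter(digest(content) for content in nodes.values())
agreed_hash, _ = votes.most_common(1)[0]

# Repair: any node whose copy disagrees adopts the agreed version.
good_copy = next(c for c in nodes.values() if digest(c) == agreed_hash)
for name, content in nodes.items():
    if digest(content) != agreed_hash:
        print(f"{name} repairing its copy")
        nodes[name] = good_copy
```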
The LOCKSS model of a self-healing network of decentralised nodes has echoes of the blockchain underpinning cryptocurrency transactions. Indeed, both share some conceptual similarities, though they serve different purposes and operate in distinct ways. Both rely on decentralised, distributed architectures to enhance data redundancy, reliability, and integrity, but their goals, mechanisms, and implementations differ: LOCKSS is focused on digital preservation, while blockchain emphasises secure and transparent record-keeping. Despite their differences, these technologies could potentially intersect in applications requiring both reliable preservation and immutable verification. So, ironically, it is the perceived wild west of the cryptocurrency arena that may contain the keys to building a much higher degree of resilience, and an ability to recover and rebuild beyond what we can currently achieve.
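The ‘immutable verification' that blockchain contributes can be illustrated with a toy hash chain: each entry records the hash of the entry before it, so silently altering any earlier record breaks every link that follows. The sketch below shows only that chaining idea, without the consensus machinery a real blockchain adds, and the ledger entries are invented.

```python
import hashlib
import json

def chain(entries: list[dict]) -> list[dict]:
    """Link entries so each block records the hash of the one before it."""
    blocks, prev_hash = [], "0" * 64
    for entry in entries:
        block = {"prev_hash": prev_hash, "data": entry}
        prev_hash = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        blocks.append(block)
    return blocks

def verify(blocks: list[dict]) -> bool:
    """Recompute the links and confirm nothing has been altered."""
    prev_hash = "0" * 64
    for block in blocks:
        if block["prev_hash"] != prev_hash:
            return False
        prev_hash = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return True

ledger = chain([{"event": "record created"}, {"event": "record accessed"}])
print(verify(ledger))                           # True
ledger[0]["data"]["event"] = "record deleted"   # tamper with history
print(verify(ledger))                           # False: the chain is broken
```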
For such is the thinness of the digital ice we are currently walking on that most western democracies have established an equivalent of the UK National Cyber Security Centre (NCSC), e.g. US: Cybersecurity and Infrastructure Security Agency (CISA); Canada: Canadian Centre for Cyber Security (CCCS); Australia: Australian Cyber Security Centre (ACSC); New Zealand: National Cyber Security Centre (NCSC); EU: European Union Agency for Cybersecurity (ENISA); NATO: NATO Cooperative Cyber Defence Centre of Excellence (CCDCOE); Sweden: Swedish Civil Contingencies Agency (MSB); and Ireland: National Cyber Security Centre (NCSC – Ireland).
The NCSC (UK) is part of the Government Communications Headquarters (GCHQ), one of the UK security services. Established in 2016 and headquartered in London, England, the NCSC in its public-facing form offers advice and guidance on topics such as defending democracy and mitigating malware and ransomware attacks. It also provides cyber-security advice for businesses, charities, clubs, and schools with up to 250 employees.
UK National Cyber Security Centre
The work and importance of the NCSC are only going to grow. Cyber-attacks on national and local infrastructure or governance – as well as on any targets of opportunity – are increasing in line with rising global political tensions, to the point that, for most organisations, it is not a question of if they will be attacked or disrupted but when.
National Cyber Security Centre (Respond and Recover)
Hostile nation states, either directly or indirectly (via sub-contractors or proxies, e.g. organised crime gangs), are probing for any technical weaknesses, seeking and stealing information or money, or disrupting technical or organisational functions. As our earlier examples illustrate, even very technically literate and well-resourced organisations or corporations are subject to disruption. Yet it is at the level of the small business or domestic setting, where human behaviour combines with a lack of knowledge or skill, that many vulnerabilities lie. At that level, the gateway to personal or financial information and transactions is the smartphone in the pocket or bag. To that we add the additional risks of the invisible embedded communication devices invited into our homes under the banner of the so-called ‘internet-of-things', and so the NCSC's curated offerings in this regard provide a useful primer for the uninitiated.
National Cyber Security Centre (Internet of Things)
Internet of Things (AI generated image)
The rapid entry of AI into the public domain now adds a new dimension of challenges, and here the NCSC has responded by offering curated sources of information and advice.
National Cyber Security Centre (Artificial Intelligence)
Generative AI systems like ChatGPT, Google Gemini, Meta's Llama, IBM's Granite, and Anthropic's Claude are the current progeny of the digital revolution. These manifestations of AI already have the capacity to be both daemons and demons, but on the technological horizon comes what may eventually displace digital as the dominant model, i.e. quantum computing. Our current digital computers process information as binary digits (bits), which can only ever be in a state of 0 or 1. Quantum computers, however, use quantum bits (qubits), which can exist in multiple states simultaneously, because that is how matter and energy behave at very small scales, such as the level of atoms and subatomic particles. A qubit is extremely delicate, to the point that even measuring it changes its state. Consequently qubits require highly controlled environments to function, e.g. low temperatures to reduce thermal noise, often close to absolute zero (0 kelvin, or -273.15°C). That is one of the major challenges in building reliable quantum computers.
Frozen Intelligence (AI generated image)
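For those who like to see the bookkeeping, a qubit can be written as a pair of complex amplitudes rather than a single 0 or 1; measurement returns 0 or 1 with probabilities given by the squared amplitudes and collapses the state to the observed value. The tiny simulation below is plain Python/NumPy, not quantum hardware, and is offered only as a way of making that idea tangible.

```python
import numpy as np

rng = np.random.default_rng()

# An equal superposition of |0> and |1>: amplitudes 1/sqrt(2) each.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

probabilities = np.abs(qubit) ** 2          # [0.5, 0.5]
outcome = rng.choice([0, 1], p=probabilities)

# Measurement collapses the state to the observed basis state.
qubit = np.array([1, 0], dtype=complex) if outcome == 0 else np.array([0, 1], dtype=complex)

print("Measured:", outcome, "post-measurement state:", qubit)
```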
Quantum computing is a very challenging concept, so a very sketchy analogy may help explain its potential. Our current digital processing, e.g. a ChatGPT query, is analogous to a very fast hare that, finding itself caught in a complex maze, has no alternative but to try to find the exit (or solution) by trying each potential route one by one until it succeeds or runs out of energy or time – time that consumes a lot of energy and generates a lot of heat in data centres. Alternatively, a number of digital hares could be set running at the same time (generating multiples of the heat), and if one finds the route out it lets the others know. Substitute digital processors for the hares and we pretty much have how demanding digital tasks like AI are enabled currently, with banks of digital processors in vast data centres setting their digital hares running.
In comparison, although currently a tortoise in processing-speed terms relative to digital processors – albeit a tortoise requiring a lot of delicate nurturing and attention – a quantum computer can perform multiple probabilistic calculations simultaneously, and so could compute the route out of the maze without having to explore each potential path sequentially. The time to solution could therefore be faster overall than for classical digital computers on certain tasks. Quantum computing employs unique computational, problem-solving, and search algorithms not possible within the constraints of the digital realm, which in turn enables the exploration of multiple possible outcomes in parallel when applied to complex problem areas. Such complex problem areas include: climate research; cryptography; cybersecurity; financial modelling; supply-chain logistics; and medical research, e.g. disease and treatment-outcome prediction, drug discovery, and molecular or genetic modelling.
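A rough sense of why this matters comes from comparing query counts: searching an unstructured space of N possibilities classically takes on the order of N checks, whereas Grover's quantum search algorithm needs on the order of the square root of N. The snippet below does nothing more than that arithmetic, so the numbers are indicative rather than a simulation of either machine.

```python
import math

# Compare the rough number of "looks into the maze" each approach needs.
for n in [1_000, 1_000_000, 1_000_000_000]:
    classical = n                  # worst case: check every path in turn
    grover = math.isqrt(n)         # Grover's algorithm: roughly sqrt(N) queries
    print(f"N={n:>13,}  classical ~{classical:>13,}  quantum ~{grover:>7,}")
```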
Quantum computers are still very much a work in progress, but they are slowly moving out of the research labs. Google, IBM, Microsoft, and Amazon, plus some emerging companies, all have early-stage quantum systems, some already available via cloud platforms. Academic and national labs, e.g. in China, Europe, and the US, are also key players. Marry quantum computing and AI and perhaps infinite possibilities await. For good or ill. Daemons and demons.
To share this post, other people can scan the QR code below directly from your phone screen. Alternatively, send the image to them via your preferred messaging system.