- Digital Economy Dispatch #128 -- Ethical Hacking and its Opposite
23rd April 2023
I read a lot of books. It is something I have done ever since I was young. As there was limited choice in my house, I’d read whatever I could get my hands on. It didn’t really make much difference to me. The latest potboiler or a classic hardback novel was equally welcome and just as interesting to my curious mind.
It’s a habit that has continued to this day. Looking at my bedside table, there are always at least half a dozen books on the go. A Spanish historical mystery by Carlos Ruiz Zafon sits on top of Philip Roth’s “American Pastoral”. A collection of humorous essays by David Sedaris shares space with a Zadie Smith novel observing life in multi-cultural London. All grist to the mill.
Despite this wide set of tastes, I have become fascinated by one area in particular. Over the past couple of years, I have found myself drawn repeatedly to spy novels, and particularly the work of the great John Le Carre. I am sure I must have read all his books several times by now. What attracts me is the complex world he paints in his novels. The political intrigue is very rarely as crass as “good” versus “evil”. You don’t build up feelings for one character as always in the “right” and others as “wrong”. Without resorting to tricks or artificial plot twists, the stories he tells place you in morally compromising situations where you are constantly questioning the path between actions and consequences.
This is particularly true in his later work. In the post-cold war era, he submerges the reader in a complex world of information gathering where it is almost impossible to determine what is true and what has been invented. Motives and missions are blurred. Intelligence is not so much about finding or delivering information. It is about determining whether it can be verified, understanding where it originated, and assessing if it can be trusted. Information and dis-information are not just two sides of the same coin, they are the alloy from which the coin is cast.
Cyber Warriors
It is within this context that I was rather surprised to see the recent announcements purporting to describe the way the UK cyber force operates. Set up in 2020, the UK’s National Cyber Force (NCF) is a joint effort between GCHQ and the Ministry of Defence and is responsible “for operating in and through cyberspace to counter threats, disrupting and contesting those who would do harm to the UK and its allies, to keep the country safe, and to protect and promote the UK’s interests at home and abroad”. In a document titled “NCF: Responsible Cyber Power in Practice”, it is now revealed - to a degree - how it does this.
The report is fascinating as it provides background on the NCF and the context in which it operates. Much of the report describes in outline the fundamental operational principles by which it works and sets out a basis for what it believes is a responsible approach to cyber warfare consistent with the UK’s values as a cyber power.
What’s interesting is that it is not a technical manual full of techno-babble on how to use digital technologies to collect and analyze data. Rather, the NCF describes its approach as based on a “doctrine of cognitive effect”. Essentially, in rather Le Carre style, it sees its role as sowing distrust, decreasing morale, and hindering operations of adversaries. It does this through the development and use of cyber capabilities to carry out operations including disrupting an adversary’s ability to make use of cyberspace and digital technology, influencing adversaries away from doing harm, and exposing hostile activity and wrongdoing.
The NCF’s role was further revealed in a rare interview with the NCF’s commander, James Babbage (I wonder if that’s his real name!). In an interview in The Economist, under the title “Cyber warfare is all in the mind”, he described the NCF’s role succinctly as to “disrupt, deny, degrade” the enemy. He views the role of the NCF in this ambiguous digital age as a form of psychological warfare. Rather than a “big red button” to launch an attack, he sees its work as staying in the shadows to induce distrust and paranoia in the enemy so that they feel they cannot rely on the information they receive, or are distracted by diverting much needed digital skills to protection and disambiguation tasks.
Our Spies are Better than Your Spies
The commentary and examples from Babbage are useful background. But at its core is a more important issue. Much of the discussion in the latest NCF report focuses on the more complex ethical challenges in cyber warfare: the basis for a responsible approach. The concept of “ethical hacking” is quite well developed, with consulting services readily available and training courses in place for those wanting to take up the reins.
However, Babbage in his interview takes this further. He contrasts the UK approach with the way the Russians are deploying cyber warfare in Ukraine. He emphasizes a clear difference. The UK’s offensive approach is precisely targeted, calibrated to avoid escalation, and clearly accountable to ministers and other officials. He believes that the Russian approach is intentionally broad, indiscriminate, and uncoordinated. As evidence he points to cyber attacks that disrupted thousands of wind turbines in Germany or disabled power facilities across Ukraine.
Interestingly, Babbage also points to differences in tactics between the UK and Russia in their use of cyber warfare. Over recent months, he states, UK tactics have shifted priority away from specific attacks, such as disabling a radar station. Instead, they now seek broader impact and psychological influence over planning and strategic decision making, using more subtle activities that destabilize the context in which generals and other leaders take decisions.
These lessons are echoed in a report from the European Cyber Conflict Research Institute. Their study praises the “incredible resilience and determination” shown by Ukraine in its cyberspace defences. A key part of this has been the Ukrainian “IT Army”, which has been central to the cyber warfare approach used to defend Ukraine from cyber attacks. This volunteer network of IT specialists has been critical to the way Ukraine has not only blocked Russian attempts to disrupt, but also mounted the offensive digital information campaign described by Babbage and the NCF.
You Ain’t Seen Nothing Yet
Of course, for many people the current experiences in cyber warfare are just a prelude to what we may soon be experiencing as we consider the impact of two key effects.
The first is the impact of AI advances such as the wealth of tools based on Large Language Models (LLMs) like ChatGPT. Already we are seeing increasing concern about the broad availability of and ready access to these technologies in three areas:
Phishing emails. With ChatGPT it is possible to create realistic, highly personalized messages that appear to come from companies, governments, and other institutions. Rather than the easy-to-spot fake requests common today, a new wave of sophisticated phishing emails is more likely to pass through filters and confuse those receiving them.
Malicious code generation. Barriers to using ChatGPT to create malicious code are easily sidestepped. As a result, even people without any technical knowledge can now create code to exploit holes in the security of deployed systems. IT systems administration is about to get a whole lot more challenging.
DDoS. A lot of text can be generated quickly using ChatGPT. It is becoming much easier to overwhelm systems in Distributed Denial of Service (DDoS) style attacks in a variety of ways. If you think spam is a problem today, just watch.
These present a broad threat to cyber security, bringing headaches to every IT systems administrator. However, a second, more acute problem is concern over the current and future digital ambitions of China. Some have described the digital technology advances in China as an “epoch defining” threat to security.
This concern was described very bluntly in a recent speech by Lindy Cameron, chief executive of the UK’s National Cyber Security Centre, part of GCHQ. She believes that “China is not only pushing for parity with western countries, it is aiming for technical supremacy. It will use its tech strength as a lever to achieve a dominant role in global affairs. What does this mean for cybersecurity? Bluntly, we cannot afford not to keep pace otherwise we risk China becoming the predominant power in cyberspace.” Different people have distinct views on the motivations and actions taking place in China. Nevertheless, while rather alarmist, these comments on China’s digital technology ambitions are clearly intended to expand efforts in the UK in this area.
Nothing Left to Lose
Digital technology advances are changing the way cyber warfare is being conducted and reshaping strategies for protecting information. A recent report by the UK’s National Cyber Force (NCF) highlights the need for a responsible approach to cyber warfare. But is this realistic in a digital age disrupted by AI advances and driven by geo-politics? We live in an ambiguous world. In what sounds like a novel from John Le Carre, we must now all acknowledge our responsibilities as digital citizens and ensure that digital transformation activities adopt a strong ethical approach.