Privacy vs security: where do we draw the line?
In February, the closing plenary of the National Security College cyberspace symposium was held at the ANU. In the immediate wake of the U.S. court decision ordering Apple to disable an iPhone security feature at the request of the FBI*, this was a timely opportunity for leading minds in cybersecurity to debate the public ethics of privacy versus security.
The privacy versus security debate is not new. Common law relating to the unauthorised interception of communications – at the time, referring to private post – has existed since the 19th century. Similar principles were (eventually) applied to electronic communications, enshrined in common law and legislation. In the cyber era, the question of privacy has become more complex.
Part of this complexity relates to the distinction between metadata and content, with collection of the former having historically been considered less invasive than the latter. When fixed-line telephony was the dominant communications technology, this distinction was simple and appropriate. In modern communications, this is no longer the case. With the variety of communications options available to users through numerous devices, the technological environment is becoming increasingly integrated and complex – so much so that ‘metadata’ and ‘content’ have almost become simpler to define by what they are not. ‘Content’ consists of information input by a user and specifically intended to be seen or heard by a recipient. ‘Metadata’, at its most basic, is information about the communication – that is, the when, where, how and to whom.
However, given the growth of metadata sources from numerous platforms, the when, where, how and to whom suddenly take on a new meaning. Taken in their entirety, these pieces of information, innocuous in isolation, are able to paint a comprehensive picture of you as a user. This is precisely why the protection of privacy should be a paramount consideration of security policies – opening the door to a security weakness, even in an ostensibly isolated case, could create opportunities for precisely the kinds of behaviours we are attempting to prevent.
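To make the point concrete, the following is a minimal sketch (using entirely hypothetical call records) of how even bare when/to-whom metadata, with no content at all, begins to profile a user once aggregated:

```python
from collections import Counter

# Hypothetical call-record metadata: no message content,
# only the hour of the call and the party contacted.
records = [
    {"to": "clinic", "hour": 9},
    {"to": "clinic", "hour": 9},
    {"to": "lawyer", "hour": 17},
    {"to": "clinic", "hour": 10},
]

# Simply counting contacts already sketches a picture of the user's life:
# repeated calls to a clinic suggest an ongoing medical matter.
contact_counts = Counter(r["to"] for r in records)
print(contact_counts.most_common(1))  # → [('clinic', 3)]
```

Each record in isolation is innocuous; the pattern across records is what becomes revealing, which is the crux of the metadata-versus-content distinction breaking down.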
With the context of the Apple decision as our backdrop, this raises numerous questions:
- From a public ethics perspective, which of privacy and security is the ‘lesser evil’ to compromise?
- Can we ever consider it appropriate for government to mandate a security weakness, even when that weakness is arguably in the public interest?
- Does the creation of that weakness, unable to be bound to an individual or even a geographic region, constitute arbitrary interference in the privacy of all other users of iPhones?
There is an argument to be made that Apple complying with the court ruling is the ‘lesser evil’ – that given the global and transient nature of the threat posed by violent extremism, the ends of achieving security justify the means. At the ANU symposium, one panel member argued that the ICT industry could not absolve itself of its responsibility to society; that it was obliged, if not to provide evidence, then at least not to withhold it.
Therein lies the problem. Apple is not withholding evidence. Apple is being compelled by the courts to compromise a key security feature, potentially impacting millions of iPhone users globally. The principle of proportionality demands that we ask whether such an action is necessary or appropriate.
At the ANU symposium, Professor Paul Cornish recalled a phrase used during the IRA’s peak: that there must be a threshold for ‘an acceptable level of violence’. How do we define our threshold in this new era, where would-be violent extremists are able to exploit the security features on which we depend? Do the ends justify the means? When do we put limits on the degrading of privacy and security, for the purpose of maintaining our security?
*Yesterday, the FBI announced it had successfully accessed the data on the iPhone of a suspect in the San Bernardino shooting without Apple’s assistance, and is dropping the case.