Over the weekend more excerpts and analysis from James Comey’s book, “A Higher Loyalty”, hit the media. And while much of the coverage focuses on his descriptions of the US President and his government, there were some other tidbits in there. In particular, Comey made some comments regarding the disconnect between the FBI and the tech community when it comes to encryption. And those comments matter for legislation that is at an advanced stage of development here.
In his book, Comey said “the leaders of the tech companies don’t see the darkness the FBI sees”, and “Of course the Silicon Valley types don’t see the darkness – they live where it’s sunny all the time and everybody is rich and smart”.
Those comments might sound like the angry rants of someone who didn’t get their own way, but they highlight something far deeper.
Law enforcement has always operated in a world where the work of industry, the machinations of government and the needs (or perhaps desires is a better word) of law enforcement have been able to stay in sync. The development cycles of new technology moved at a pace where new laws could be drafted in order to ensure new developments were used for good and their application by criminals was restricted.
In addition, most of the communications infrastructure we used was owned or controlled by government. Most of our communications were carried over the telephone network and postal services, which were government owned until privatisation swept the world. And even when email became ubiquitous, most of it was unencrypted, and hacking online accounts was possible for a skilled penetration tester or hacker.
But the messaging services we use more and more today are a different matter.
In 1996, a computer science graduate student, Daniel Bernstein, placed himself at the centre of a case regarding access to strong encryption. By the end of that three-year battle, the courts ruled that restrictions on the export of strong encryption were a violation of the First Amendment right to free speech, thereby allowing anyone to access these powerful tools.
Fifteen years after that decision, Apple and Google upped the ante by encrypting all data on their mobile devices by default and, most importantly, not holding the decryption keys. That meant law enforcement agencies couldn’t subpoena or force device makers to decrypt devices. In effect, the tech companies created a system where they simply could not comply with the law.
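To make that concrete, here is a deliberately simplified sketch (in Python, using the third-party cryptography library) of the kind of architecture described above: the key protecting the data is derived from the user’s passcode plus a secret that never leaves the device, so the manufacturer holds nothing it could hand over in response to a subpoena. The key derivation scheme, function names and figures are illustrative assumptions, not Apple’s or Google’s actual design.

```python
# Illustrative sketch only: device data encrypted under a key the vendor never holds.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: str, device_secret: bytes) -> bytes:
    """Stretch the user's passcode, salted with a device-bound secret, into a symmetric key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=device_secret, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))


device_secret = os.urandom(16)           # imagined as burned into the hardware, never leaving it
key = derive_key("123456", device_secret)
ciphertext = Fernet(key).encrypt(b"contacts, messages, photos ...")

# Without both the passcode and the device secret, the ciphertext is opaque —
# to the manufacturer, and to anyone serving them with a court order.
print(Fernet(derive_key("123456", device_secret)).decrypt(ciphertext))
```

The point of the design is not that the mathematics is exotic; it is that no third party, including the company that built the device, is ever in a position to produce the key.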
In Australia, according to a statement from the office of the Acting Attorney-General, Marise Payne, new laws are at an advanced stage of drafting. These would force tech companies to create a system whereby a law enforcement agency, presumably with the backing of a court-issued warrant, would be able to access encrypted communications. A spokesperson from the Attorney-General’s office said “The government is continuing to consult with key stakeholders”.
This is the “back door” privacy advocates are concerned about that could weaken systems and therefore compromise our privacy. The argument is that if the government has access to a tool that allows them to get through encryption then it’s inevitable that the tool will either fall into the wrong hands or be misused.
That will give privacy advocates significant concern and will have the tech community up in arms, unless they are too fatigued after losing the battle over metadata retention and the expansion of Peter Dutton’s Department of Home Affairs, which has oversight of ASIO, the AFP and Border Force.
Both sides of this important issue see only a binary outcome. Either encryption is 100% protected and law enforcement loses any ability to access communications or strong encryption is weakened so that it’s almost useless.
And I think, perhaps unwittingly, Comey explained exactly why this binary argument has evolved. Neither side has really taken the time to fully understand what the other side is saying. I’ve spoken with a number of senior law enforcement officials here and a few in the US. For the most part, the people making decisions in those agencies come from the era when they had, within the limits of the law and instruments like warrants, unfettered access to communications. By and large, the public knew that and accepted that it was possible for law enforcement to order a wire-tap on a phone line or to intercept your mail and read it.
Law enforcement hasn’t really moved on from that and adapted their operations to deal with the modern world.
The widespread availability and commoditisation of encryption, the decentralisation of services and the increased involvement of private enterprise in delivering communications have shifted the old equilibrium. The biggest mistakes law enforcement agencies and governments made were failing to understand the impact of that 1999 court decision in the US and what the privatisation of infrastructure would mean, and failing to get their legislative ducks in a row then. If they had made laws limiting how encryption worked before we all had it on our Android and iOS devices through software like WhatsApp, iMessage, Signal, Telegram and others, they wouldn’t be in the situation they face today.
That said, as a former colleague of mine used to say, “We are where we are”.
What I’d like to see is a group of tech-savvy law enforcement people sit in a room with what Comey condescendingly called “Silicon Valley types” and actually try to understand what each side really wants to accomplish. I’m not convinced there has to be a winner and a loser in this discussion. Both sides need to calmly discuss what they need and be prepared to concede that they cannot both get everything they want.
For example, the strength of encryption is not law enforcement’s actual problem. Their problem is access to the information they need, either to stop a criminal from breaking the law or to carry out a successful prosecution. Breaking encryption is one way to access that data. Another is to brute-force a device’s password. That could be made possible if device and software makers built a “law enforcement mode” requiring a specific piece of hardware that would allow an agency, with a warrant, to brute-force a device without the current limitations on the number of failed log-in attempts.
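To see why the retry limits matter far more than the mathematics of the cipher, here is a rough back-of-envelope calculation. The guess rate and lock-out delay below are assumptions picked purely for illustration, not any vendor’s real figures.

```python
# Back-of-envelope sketch: retry limits, not encryption strength, are the practical barrier.
# All figures are illustrative assumptions.
combinations = 10 ** 6             # 6-digit numeric passcode
guesses_per_second = 12            # assumed hardware-limited guess rate with no lock-outs
lockout_delay_seconds = 3600       # assumed enforced delay per attempt after repeated failures

worst_case_unthrottled = combinations / guesses_per_second
worst_case_throttled = combinations * lockout_delay_seconds

print(f"No retry limits:        ~{worst_case_unthrottled / 3600:.0f} hours")
print(f"Hour-long lock-outs:    ~{worst_case_throttled / (3600 * 24 * 365):.0f} years")
```

Under those assumptions an unthrottled attack finishes in about a day, while the throttled one takes more than a century. Whether removing those limits for warranted access is acceptable is exactly the kind of trade-off the two sides should be debating.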
I understand that might be unpalatable for some but it’s probably better than weakening encryption by using shared keys.
So much of the public discussion about this issue is focused on how one side is tone deaf to the needs of the other side. But I don’t see a lot of people trying to solve the problems – just lots of name calling and sabre rattling.
The government cannot legislate against access to encrypted services. That cat is well and truly out of the bag. Their only real option, short of banning the use of encrypted messaging services and effectively blocking the sale of almost every smartphone, is to look for some common ground. And technologists need to concede that bad guys do use encrypted services to do bad things.
In other words, they need to communicate.