James Comey's Book Highlights Why Governments And Techies Can't Agree On Encryption

Image: iStock

Over the weekend more excerpts and analysis from James Comey's book, "A Higher Loyalty", hit the media. While much of the coverage focuses on his descriptions of the US President and his government, there were some other tidbits in there. In particular, Comey made some comments regarding the disconnect between the FBI and the tech community when it comes to encryption. And those comments matter, because legislation on this very issue is at an advanced stage of development here.

In his book, Comey said "the leaders of the tech companies don't see the darkness the FBI sees", and "Of course the Silicon Valley types don't see the darkness - they live where it's sunny all the time and everybody is rich and smart".

Those comments might sound like the angry rants of someone who didn't get their own way but they highlight something far deeper.

Law enforcement has long operated in a world where the work of industry, the machinations of government and the needs (or perhaps desires is a better word) of law enforcement could stay in sync. The development cycle of new technology moved at a pace where new laws could be drafted in order to ensure new developments were used for good and their application by criminals was restricted.

In addition, most of the communications infrastructure we used was owned or controlled by government. Most of our communications were carried over the telephone network and postal services, which were government owned until privatisation swept the world. And even when email became ubiquitous, most of it was unencrypted, and hacking online accounts remained possible for a skilled penetration tester or hacker.

But the messaging services we use more and more today are a different matter.

In 1996, a computer science graduate student, Daniel Bernstein, placed himself at the centre of a case regarding access to strong encryption. By the end of that three-year battle, the courts ruled that restrictions on the export of strong encryption were a violation of the First Amendment right to free speech, thereby allowing anyone to access these powerful tools.

Fifteen years after that decision, Apple and Google upped the ante by encrypting all data on their mobile devices by default and, most importantly, not holding the decryption keys. That meant law enforcement agencies couldn't subpoena or force device makers to decrypt devices. In effect, the tech companies created a system where they simply could not comply with the law.

In Australia, according to a statement from the office of the Acting Attorney-General, Marise Payne, new laws are at an advanced stage of drafting. These would force tech companies to create a system whereby a law enforcement agency, presumably with the backing of a court-issued warrant, would be able to access encrypted communications. A spokesperson from the Attorney-General's office said "The government is continuing to consult with key stakeholders".

This is the "back door" privacy advocates are concerned about that could weaken systems and therefore compromise our privacy. The argument is that if the government has access to a tool that allows them to get through encryption then it's inevitable that the tool will either fall into the wrong hands or be misused.

That will give privacy advocates significant concern and will have the tech community up in arms, unless they are too fatigued after losing the battle over metadata retention and the increasing power of Peter Dutton's Department of Home Affairs where he has oversight of ASIO, the AFP and Border Force.

Both sides of this important issue see only a binary outcome. Either encryption is 100% protected and law enforcement loses any ability to access communications or strong encryption is weakened so that it's almost useless.

And I think, perhaps unwittingly, Comey explained exactly why this binary argument has evolved. Neither side has really taken the time to fully understand what the other side is saying. I've spoken with a number of senior law enforcement officials here and a few in the US. For the most part, the people making decisions in those agencies come from the era when they had, within the limits of the law and instruments like warrants, unfettered access to communications. By and large, the public knew that and accepted that it was possible for law enforcement to order a wire-tap on a phone line or they could intercept your mail and read it.

Law enforcement hasn't really moved on from that and adapted their operations to deal with the modern world.

The widespread availability and commoditisation of encryption, the decentralisation of services and the increased involvement of private enterprise in delivering communications have shifted the old equilibrium. The biggest mistakes law enforcement agencies and governments made were failing to understand the impact of that 1999 court decision in the US, and of the privatisation of infrastructure, and to get their legislative ducks in a row then. If they had made laws limiting how encryption worked before we all had it on our Android and iOS devices through apps like WhatsApp, iMessage, Signal and Telegram, they wouldn't be in the situation they face today.

That said, as a former colleague of mine used to say, "We are where we are".

What I'd like to see is a group of tech-savvy law enforcement people sit in a room with what Comey condescendingly called "Silicon Valley types" and actually try to understand what each side really wants to accomplish. I'm not convinced there has to be a winner and a loser in this discussion. Both sides need to calmly discuss what they need and be prepared to concede that they cannot both get everything they want.

For example, the strength of encryption is not law enforcement's actual problem. Their problem is access to the information they need to either stop a criminal from breaking the law or carry out a successful prosecution. Breaking encryption is one way to access that data. Another way might be to brute-force a device and access it by breaking the password. That could be made possible if device/software makers had a "law enforcement mode" requiring a specific piece of hardware that would allow an agency, with a warrant, to brute-force a device without the current limits on the number of failed log-in attempts.

I understand that might be unpalatable for some but it's probably better than weakening encryption by using shared keys.
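The trade-off behind that "law enforcement mode" idea comes down to simple arithmetic. A minimal sketch of it, where the guess rates and throttling delay are illustrative assumptions rather than real device figures:

```python
# Rough worst-case brute-force timing for a six-digit numeric passcode,
# comparing a device that throttles failed attempts against a
# hypothetical "law enforcement mode" with throttling disabled.
# All rates and delays below are illustrative assumptions.

def seconds_to_exhaust(keyspace: int, attempts_per_second: float) -> float:
    """Worst-case time to try every candidate passcode."""
    return keyspace / attempts_per_second

six_digit_pins = 10 ** 6  # 000000 through 999999

# Throttled: assume the device enforces an average 80-second delay per try.
throttled = seconds_to_exhaust(six_digit_pins, 1 / 80)

# Unthrottled: assume hardware-assisted guessing manages 100 tries/second.
unthrottled = seconds_to_exhaust(six_digit_pins, 100)

print(f"throttled:   {throttled / 86400 / 365:.1f} years")    # ~2.5 years
print(f"unthrottled: {unthrottled / 3600:.1f} hours")         # ~2.8 hours
```

Under those assumptions, removing the lockout turns a multi-year job into an afternoon's work, which is exactly why the same mode in the wrong hands is so dangerous.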

So much of the public discussion about this issue is focused on how one side is tone deaf to the needs of the other side. But I don't see a lot of people trying to solve the problems - just lots of name calling and sabre rattling.

The government cannot legislate against access to encrypted services. That cat is well and truly out of the bag. Their only real option, short of banning the use of encrypted messaging services and effectively blocking the sale of almost every smartphone, is to look for some common ground. And technologists need to concede that bad guys do use encrypted services to do bad things.

In other words, they need to communicate.


Comments

    Over the weekend more excerpts and analysis from James Comey's book, "A Hughes Loyalty" hit the media.

    It's a "A Higher Loyalty"


    There is always going to be some kind of problem with devices or encryption that tries to give lawful access to only the "good guys". The "law enforcement mode" that you describe, to allow attempts at brute-forcing passwords without being locked out, only shifts the problem: how do you ensure that only law enforcement officers with a warrant ever get to activate this mode? The possible answers to this question are functionally the same as the possible answers to "how do you ensure that only law enforcement officers with a warrant gain access to encrypted communications?"
    I know I'm coming down on the tech side of the debate here, and I'm not saying anything really new. The fact remains that there is, mathematically, no such thing as encryption that works securely until a good guy with a warrant comes along.

    This x1000.

    The author of this article, like the FBI and the government, seems to think it's foolish naivety that drives tech's refusal to provide the requested access.

    It is not. Those of us in tech understand the impossibility of what is being asked for. The only thing protecting us before was the difficulty and manpower required to tap an individual. Now, this friction is all but eliminated.

    Repeat after me: there is no such thing as encryption that can only be decrypted by the good guys.

    Any "law enforcement mode" is equally accessible to bad actors too... Just like your metadata is now, thanks to a blindly optimistic government that is promising, this time for real, that they'll definitely protect your data better than that Medicare stuff...

      Oh... It's also worth noting, we've yet to see the case or incident where unbreakable encryption has had a meaningful outcome on an investigation. By last account, the FBI turned out to be able to crack those iPhones after all...

    Another way might be to brute force attack a device and access it by breaking the password. That could be made possible if device/software makers had a "law enforcement mode" that required a specific piece of hardware that would allow an agency, with a warrant, to brute force attack a device without the current limitations on the number of failed log-in attempts.

    Not really. With a sufficiently secure password, brute forcing could take longer than the statute of limitations would allow the suspect to be charged for the crime they are accused of.

      It would take longer than the Earth is going to exist to break 256-bit encryption, even with the most powerful supercomputers.
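That claim is easy to check with back-of-the-envelope arithmetic. A minimal sketch, assuming an absurdly generous 10^18 guesses per second (well beyond any real hardware):

```python
# Worst-case time to exhaust a 256-bit keyspace at an assumed
# 10^18 guesses per second.
keyspace = 2 ** 256
guesses_per_second = 10 ** 18

seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)

print(f"{years:.2e} years")  # on the order of 10^51 years
```

For comparison, the universe is around 1.4 x 10^10 years old, so even this fantasy machine falls short by roughly forty orders of magnitude. In practice, of course, attackers target the password rather than the key, which is the previous comment's point.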

    I would rather a few crimes go unpunished than have my bank account secured with weak encryption.

    The FBI has only itself to blame if it can’t solve a crime without access to a phone.
