When former DARPA chief Regina Dugan announced on stage last month that Facebook planned to build a brain-computer interface to allow users to send their thoughts directly to the social network without a keyboard intermediary, it had all the Silicon Valley swagger of Facebook circa "move fast and break things". With the same audacity with which any other Facebook product might be announced, Dugan explained that the company hopes to have this revolutionary brain-hack ready to ship "within a few years".
It's an admirable goal, but there's a problem. The body is not a computer. It cannot be hacked, rewired, engineered or upgraded like one, and certainly not at the ruthless pace of a Silicon Valley startup.
Over the past decade, science has made some notable progress in using technology to defy the limits of the human form, from mind-controlled prosthetic limbs to a growing body of research indicating we may one day be able to slow the process of ageing. Our bodies are the next big candidate for technological optimisation, so it's no wonder that tech industry bigwigs have recently begun expressing interest in them. A lot of interest.
Facebook's announcement that it plans to build a brain-computer interface that types at 100 words per minute came on the heels of Tesla founder Elon Musk's announcement that he was forming a new venture, Neuralink, to develop a brain implant capable of telepathy, among other things. Other rich Silicon Valley types are investing big in nootropic pills to "hack" their brain chemistry, and in still other pills, diets, gut bacteria and DNA data-dives, in hopes of achieving a longer, healthier life. "Humans are the next platform," Geoff Woo, the co-founder of Andreessen Horowitz-funded nootropics company Nootrobox, told New York Magazine last spring.
Are we really, though?
Take the most computational part of the body, the brain. Our brains do not "store" memories as computers do, simply calling up a desired piece of information from a memory bank. If they did, you'd be able to effortlessly remember what you had for lunch yesterday, or exactly the way your high school boyfriend smiled. Nor do our brains process information like a computer. Our grey matter doesn't actually have wires that you can simply plug-and-play to overwrite depression a la Eternal Sunshine.
The body, too, is more than just a well-oiled piece of machinery. We have yet to identify a single biological mechanism for ageing or fitness that any pill or diet can simply "hack".
Research into some of these things is underway, but so far much of what it has uncovered is that the body and brain are incredibly complex. Scientists do hope, for example, that one day brain-computer interfaces might help alleviate severe cases of mental illnesses like depression, and DARPA is currently funding a $US65 million ($88 million) research effort aimed at using implanted electrodes to tackle some of the trickiest mental illnesses. Yet after decades of research, it's still unclear which areas of the brain even make the most sense to target for each illness.
But as Silicon Valley has begun to dip its toes in the realm of biology, it has brought along its hacker ethos. All you need to achieve ambitious feats of technological innovation are a few all-night hackathons, right?
Within a mere two years, Facebook thinks it will know whether its plan to send 100-word-per-minute status updates from our brains to our screens is possible. The current record for typing with a brain-computer interface, by the way, is somewhere around eight words per minute, achieved with an implant placed inside the brain.
And Musk, famous for taking on seemingly impossible moonshots with no clear deadline, said he imagines Neuralink's brain-computer interface making its debut within a decade. This is despite the fact that the brain-reading technology it relies upon is, at this point, little more than a fanciful blueprint. The technology available today can only measure a fraction of the neural activity necessary to link someone's entire brain to a computer, or allow them to communicate with another person without speaking.
In 2009, University of Wisconsin-Madison biomedical engineer Justin Williams oversaw an effort that successfully used a brain-computer interface to send messages from the brain to Twitter.
"It was both a small and a big step," he told Gizmodo. "Ten years later have we gotten much further? I'm not sure."
Something like an email or a Facebook post, he noted, is infinitely more complicated than a 140-character, text-only tweet. "Twitter is about as simple as you can get," he said. "Sending an email sounds easy, but take a minute to think about all the processes that are involved — filling out the subject, the address field, the body. From a biological and technological standpoint, that's really complicated. There are a lot of moving parts."
Just last spring, for the first time, a man was able not only to control a prosthetic arm with his mind, but to "feel" the arm move, too. That achievement, however, is still a long way from understanding all the brain's 100 billion neurons and their 100 trillion interconnections, let alone developing technology good enough to connect every single one of them to a machine.
The issue isn't in the technology — it's in the approach to it.
Startups like Nootrobox and Halo Neuroscience claim that they are already delivering consumer-ready products to make us smarter, faster and stronger. But with so much of this science still so uncertain, it's hard to either prove or disprove their claims. The complexity of human biology means that this research doesn't always move at the pace of Moore's Law, and yet when the tech industry approaches these problems, said Williams, "there is a get-it-done kind of attitude that's pretty pervasive."
Conflating machines with the body is a very old human habit. In the 1500s, automata powered by springs and gears led thinkers like René Descartes to suggest that humans are simply complex machines. In the 1800s, German physicist Hermann von Helmholtz compared the brain to a telegraph. In his 1958 book The Computer and the Brain, the mathematician John von Neumann stated explicitly that the human nervous system is "prima facie digital".
Over time, as technology has changed, so have the metaphors, but the gist is the same: The body is but a fancy machine. This mode of thinking has spawned a philosophical doctrine, generous research funding, and misleading jargon in the realms of both biology and computing (see the brain's "circuits" or deep learning's "neural networks", which have more in common with classical computational models than anything neurobiological).
But this point of view becomes especially troubling when the realms of biology and computing merge. We risk starting to treat the human body — in all its complexity, fragility, resilience and mystery — like the machines we compare it to. We risk over-promising on the deliverables, wasting time, money and public patience on far-out research we suggest we can hack together in a few years. And we risk compromising our health and well-being in the process.
Both Facebook and Neuralink have hired top scientific minds to tackle their respective projects. Facebook hired Dugan, and has partnered with UC Berkeley, Johns Hopkins Medicine and other leading academic research institutions. Likewise, Neuralink has brought on high-profile academic researchers. A flush of Silicon Valley funding into basic research at a time when funding is scarce could very well wind up doing more for scientific progress than every NIH grant combined.
At the same time, many scientists are sceptical that the "move fast and break things" approach will work very well when applied to us. After all, we are living, breathing organisms, not inanimate machines. That's something we should try to remember.