
    Will we ever control the world with our minds?

    August 15, 2019

    For decades, controlling computers by thought was the stuff of science fiction. But now we are tantalisingly close to a breakthrough. The question is, does it create more problems than it solves?

    By Mark Piesing

    Science fiction can sometimes be a good guide to the future. In the film Upgrade (2018), Grey Trace, the main character, is shot in the neck. His wife is shot dead. Trace wakes up to discover that not only has he lost his wife, but he now faces a future as a wheelchair-bound quadriplegic.

    He is implanted with a computer chip called Stem designed by famous tech innovator Eron Keen – any similarity with Elon Musk must be coincidental – which will let him walk again. Stem turns out to be an artificial intelligence (AI) and can “talk” to him in a way no one else can hear. It can even take over control of his body. You can guess the rest of the story.

    The reality of being a cyborg in 2019 is much less dramatic – but still incredible. In 2012, as part of a research programme led by Jennifer Collinger, a biomedical engineer at the University of Pittsburgh, and funded by the US government’s Defense Advanced Research Projects Agency (Darpa), Jan Scheuermann became one of a tiny handful of people to be implanted with a brain-computer interface. The 53-year-old woman, a quadriplegic due to the effects of a degenerative disorder, has two cables attached to box-like sockets in her head, which connect to what looks like a video game console.

    Scheuermann can use this brain-computer interface to control a robotic arm with her thoughts, well enough to feed herself chocolate. Three years later she successfully flew a fighter aircraft in a computer simulator.

    Darpa has been funding research into these interfaces since the 1970s, and now wants to move a step closer to the kind of world glimpsed in Upgrade. The goal of the Next-Generation Nonsurgical Neurotechnology (N3) programme launched earlier this year is to remove the need for electrodes, cables and brain surgery.

    Al Emondi, who manages the programme, has given scientists from six of the USA’s leading research institutes the task of developing a piece of hardware capable of reading your thoughts from the outside of your head and small enough to be embedded into a baseball cap or headrest. In an approach that has been compared to telepathy – or the creation of “a true brain-computer interface”, according to Emondi – the device has to be bi-directional, able to transmit information back to the brain in a form that the brain will understand.

    Emondi has given the scientists only four years to take the new technology from the laboratory to the point where it can be tested on humans. Even Elon Musk’s plan for an Upgrade-style brain-computer interface, Neuralink, still requires risky surgery to embed the chip in the brain, though it does replace cables with a form of wireless communication.

    “The ability to really change the world doesn't happen often in a career,” says Emondi. “If we can build a neural interface that’s not invasive, we will have opened up the door to a whole new ecosystem that doesn’t exist right now.”

    The only way that humans have evolved to interact with the world is through our bodies, our muscles and our senses – Michael Wolmetz

    “The most common applications are to help people who have lost the ability to move their arms and quadriplegics, paraplegics,” says Jacob Robinson, an electrical and computer engineer at Rice University, Houston, Texas, and the principal researcher of one of the teams. “Imagine then, if we can have the same kind of ability to communicate with our machines but without surgery, then we open up this technology to a broad user base, people who are otherwise able-bodied who just want faster ways to communicate with their devices.”

    Some other researchers think our fascination with brain-computer interfaces is about something more profound. “The only way that humans have evolved to interact with the world is through our bodies, our muscles and our senses, and we’re pretty good at it,” says Michael Wolmetz, a human and machine intelligence research lead at Johns Hopkins Applied Physics Laboratory in Laurel, Maryland. “But it’s also a fundamental limitation on our ability to interact with the world. And the only way to get outside of that evolutionary constraint is to directly interface with the brain.”

    Despite its slightly unnerving strapline of “creating breakthrough technologies and capabilities for national security”, Darpa has a history of pioneering technologies that shape the world that we civilians live in. The development of the internet, GPS, virtual assistants like Apple’s Siri and now AI was all sped up by the dollars the agency ploughed into these areas. Its funding of research into brain-computer interfaces suggests it could be a similarly game-changing technology. But it is not alone.

    Musk’s Neuralink is just one of a number of projects attracted by the potential of brain-computer interfaces. Major technology firms including Intel are also working in this area.

    And there are great rewards for those who manage to crack it – the market in neurological technology is expected to be worth $13.3bn (£10.95bn) in 2022.

    The quality of the information that you can transmit is limited by the number of channels – Jacob Robinson

    Brain-computer interfaces are possible today only because in the 1800s scientists tried to understand the electrical activity that had been discovered in the brains of animals. During the 1920s, Hans Berger developed the electroencephalograph (EEG) to detect electrical activity from the surface of the human skull and record it. Fifty years later, computer scientist Jacques Vidal’s research at the University of California Los Angeles (UCLA) led him to coin the term “brain-computer interface”.

    Scientists then had to wait for computing power, artificial intelligence and nanotechnology for their visions to be realised. In 2004, a quadriplegic patient was implanted with the first advanced brain-computer interface after a stabbing left him paralysed from the neck down. This allowed him to play ping pong on a computer just by thinking about it.

    Despite such successes, problems remain. “The quality of the information that you can transmit is limited by the number of channels,” says Robinson. “The interfaces require cutting a hole in the skull to put the electrode directly in contact with the brain. Your device might only operate for a limited amount of time before your body rejects it; or if the devices fail, it’s hard to get them out.”

    Millimetres in the skull is the equivalent of tens of metres in the ocean and kilometres in the atmosphere in terms of the clutter you have to face – David Blodgett

    To achieve the goal of an interface that works without the need for brain surgery, Emondi’s teams are exploring combinations of techniques such as ultrasound, magnetic fields, electric fields and light to read our thoughts and/or write back. One problem is how you tell useful neural activity from the cacophony of other noise the brain emits. The device also has to be able to pick up the signals through the skull and the scalp.

    “When you consider the problem of imaging through a scattering medium, millimetres in the skull is the equivalent of tens of metres in the ocean and kilometres in the atmosphere in terms of the clutter you have to face,” says David Blodgett, principal investigator for the Johns Hopkins University Applied Physics Laboratory team.

    “But we still believe that we can get very useful information,” says Emondi.
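
    To get a feel for the signal-reading half of the problem, the sketch below is a deliberately simplified, illustrative example: it assumes a simulated single-channel recording and uses the open-source scipy library to pull an 8-30 Hz band, where movement-related brain rhythms are typically found, out of much louder broadband noise. None of the parameters come from the N3 teams.

    import numpy as np
    from scipy.signal import butter, filtfilt

    # Illustrative assumptions only: the sampling rate, band and noise
    # levels are invented for this sketch, not taken from any N3 team.
    fs = 250.0                    # assumed sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)  # ten seconds of simulated data

    # A 12 Hz "neural" rhythm buried in much louder broadband noise
    rhythm = np.sin(2 * np.pi * 12 * t)
    recording = rhythm + 5.0 * np.random.randn(t.size)

    # Fourth-order Butterworth band-pass over the 8-30 Hz band
    b, a = butter(4, [8.0, 30.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, recording)  # zero-phase filtering

    # "filtered" recovers much of the rhythm while rejecting
    # out-of-band noise - the easy, single-channel version of the task.

    A real non-invasive device faces the same task with far weaker signals, many more channels and the skull’s scattering in the way.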

    Some teams are looking at what Emondi calls “minutely invasive surgery”. “You can still put something in the body, but you can’t do it through any surgical means,” he says. This means you have to eat something, inject it or squirt it up your nose. One team is looking at nanoparticles that act as “nanotransducers” when they reach their destination in the brain. These are very small particles, a fraction of the width of a human hair, that can transform external magnetic energy into an electric signal to the brain and vice versa. Another is looking at using viruses to inject DNA into cells to alter them to do a similar job.

    If these techniques work, then the performance of a minutely invasive interface should be able to match that of a chip surgically implanted into the body.

    Then there is the challenge of getting the information from the device to the computer and delivering a response in a split second.

    “If you were using a mouse with a computer, and you click it, and then you have to wait a second for it to do something, then that technology would never get off the ground,” says Emondi. “So, we’ve got to do something that’s going to be superfast.”

    The interfaces need to have “high resolution” and enough “bandwidth”, or channels of communication, to fly a real drone rather than move a robotic arm.

    But even if we can do it, how exactly do we communicate? Will we be communicating in words or in pictures? Will we be able to talk with a friend or pay bills online? How much will this be unique to each individual? No one really knows the answers to such questions because the rules haven’t been written yet.

    “All new interfaces take some practice to get used to,” says Patrick Ganzer, co-investigator on the project at Battelle. “It’s hard to say how easy this new brain-computer interface will be to use. We don’t want users to have to learn hundreds of rules. One attractive option is to have outputs from the user’s brain-computer interface communicate with a semi-autonomous device. The user will not need to control every single action but simply set a ‘process in motion’ in the computer system.”

    No one who is able-bodied has yet chosen to be embedded with an interface in order to play a video game like Fortnite

    Emondi goes further than this: “As the AI becomes better, the systems we are interoperating with are going to become more autonomous. Depending on the task, we may just have to say, ‘I want that ball’ and the robot goes and gets it itself.”
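
    A rough, entirely hypothetical sketch of that “set a process in motion” idea is below: a single decoded intent fans out into a sequence of actions that the semi-autonomous device plans for itself. Every name and function here is invented for illustration; none of it describes the N3 programme’s actual software.

    from dataclasses import dataclass

    @dataclass
    class Intent:
        verb: str    # e.g. "fetch"
        target: str  # e.g. "ball"

    def decode_intent(decoded_text: str) -> Intent:
        # Stand-in for the brain-computer interface: in reality this
        # would be the output of a neural decoder, not a string parser.
        verb, _, target = decoded_text.partition(" ")
        return Intent(verb=verb, target=target)

    def execute(intent: Intent) -> list[str]:
        # The semi-autonomous device expands one high-level intent
        # into low-level steps, so the user never steers each movement.
        if intent.verb == "fetch":
            return [f"locate {intent.target}",
                    f"plan path to {intent.target}",
                    f"grasp {intent.target}",
                    "return to user"]
        return ["await next intent"]

    # One decoded thought - "I want that ball" - becomes a sequence:
    for step in execute(decode_intent("fetch ball")):
        print(step)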

    The film Upgrade may have hinted at a problem, however: who exactly is in control?

    But there are some clues. “To date, most brain-computer interfaces have extracted detailed movement or muscle-related information from the brain activity even if the user is thinking more broadly about their goal,” says Jennifer Collinger. “We can detect in the brain activity which direction they want to move an object and when they want to close their hand, and the resulting movement is a direct path to the object that enables them to pick it up. The user does not have to think ‘right’, ‘forward’, ‘down’.”

    “The amount of mental effort required to operate a BCI varies between participants but has typically been greater for non-invasive interfaces. It remains to be seen whether any technologies that come out of N3 will allow the user to multi-task.”
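
    The broad approach Collinger describes is, at heart, a learned mapping from patterns of neural activity to intended movement. The toy sketch below illustrates the simplest version of that idea, a linear decoder fitted by least squares on simulated data; the channel counts and numbers are assumptions, not the Pittsburgh team’s actual method.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_channels = 2000, 96  # assumed 96-electrode array

    # Simulated ground truth: each channel's firing rate is "tuned"
    # to the intended two-dimensional movement velocity (vx, vy).
    true_tuning = rng.normal(size=(n_channels, 2))
    velocity = rng.normal(size=(n_samples, 2))
    rates = velocity @ true_tuning.T \
        + 0.5 * rng.normal(size=(n_samples, n_channels))

    # Fit a linear decoder: a least-squares map from rates to velocity
    decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

    # Decode a fresh sample of activity into a movement command
    new_velocity = rng.normal(size=(1, 2))
    new_rates = new_velocity @ true_tuning.T
    print("decoded:", new_rates @ decoder)  # ~ new_velocity

    A non-invasive N3-style device would have to learn a similar mapping from far noisier measurements taken through the skull.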

    There is an even more fundamental question than this. No one who is able-bodied has yet chosen to be embedded with an interface in order to play a video game like Fortnite or shop online – and no one knows whether their behaviour towards an interface would be different, nor whether it would change if the chip was in a baseball cap.

    The ethical dilemmas are tremendous. “The benefits coming out of that technology have to outweigh the risks,” says Emondi. “But if you’re not trying to regain some function that you’ve lost then that’s different: that’s why non-invasive approaches are so interesting.

    There is a question of at what point humans become the weakest link in the systems that we use – Michael Wolmetz

    “But just because it’s not invasive technology doesn’t mean that you aren’t causing harm to an individual – microwaves are non-invasive, but they wouldn’t be a good thing,” he adds. “So, there are limits. With ultrasound, you have to work within certain pressure levels. If it’s electric fields, you have to be within certain power levels.”

    The development of powerful brain-computer interfaces may even help humans survive the hypothetical technological singularity, when artificial intelligence surpasses human intelligence and is able to replicate itself. Humans could use technology to upgrade themselves to compete with these new rivals, or even merge with an AI, something Elon Musk has made explicit in his sales pitch for Neuralink.

    “Our artificial intelligence systems are getting better and better,” says Wolmetz. “And there is a question of at what point humans become the weakest link in the systems that we use. In order to be able to keep up with the pace of innovation in artificial intelligence and machine learning, we may very well need to directly interface with these systems.”

    In the end, it may not make any difference. At the close of the film Upgrade, Stem takes full control of Grey’s mind and body. The mechanic’s consciousness is left in an idyllic dream state in which he isn’t paralysed, and his wife is alive.
