Elon Musk’s claims about brain implants for artificial vision get a reality check

By Alan Boyle
Side-by-side images show a 45,000-pixel rendering of a cat image at left, and a simulation representing an image rendered by 45,000 electrodes in a brain implant at right. (Credit: Ione Fine / University of Washington)

If Elon Musk’s Neuralink brain-implant venture succeeds in its effort to create next-generation brain implants for artificial vision, the devices could bring about a breakthrough for the visually impaired — but probably wouldn’t match Musk’s claim that they could provide “better than normal vision,” University of Washington researchers report.

In a study published today by the open-access science journal Scientific Reports, UW psychologists Ione Fine and Geoffrey Boynton point out that the brain’s vision system relies on complex interactions between neurons that don’t directly translate into a pixel-by-pixel picture.

“Engineers often think of electrodes as producing pixels, but that is simply not how biology works,” Fine said in a news release. “We hope that our simulations based on a simple model of the visual system can give insight into how these implants are going to perform. These simulations are very different from the intuition an engineer might have if they are thinking in terms of pixels on a computer screen.”

For the past several years, Neuralink has been developing a system that relies on brain implants and high-level computer processing — with an initial goal of making it possible for quadriplegic patients to interact with their environment by controlling computerized tools with their minds.

One patient, Noland Arbaugh, was equipped with the implant in January as part of a clinical trial. In May, Arbaugh told ABC News that he was “very happy” to be part of the trial, even though the device’s performance had degraded somewhat. During an update in July, Musk said Neuralink’s roster of implant recipients could reach “high single digits this year,” depending on regulatory approvals.

Musk said the next application for Neuralink’s implants, known as Blindsight, would provide artificial vision. Test versions of the device already have been implanted in monkeys to produce single-pixel blips — “a flash here and a flash there” — that have elicited responses from the monkeys, he said.

Blindsight’s performance would have to ramp up significantly before the implants were ready for human clinical trials.

“The initial resolution for vision will be relatively low — something like Atari graphics sort of thing,” Musk said. “But over time, it could potentially be better than normal vision.” (Musk made a similar claim in March on his X social-media platform.)

Fine and Boynton put those performance claims to the test by simulating the sorts of images that could be created by combining inputs from tens of thousands of electrodes connected to individual neurons in the visual cortex. By comparison, Arbaugh’s implant has roughly 1,000 electrodes.

The researchers noted that each neuron in the visual system takes in information about imagery in a small region of space known as the receptive field, and not just a single point of light. Their simulations suggested that an image generated by a 45,000-electrode array wouldn’t be nearly as detailed as a 45,000-pixel image naturally generated by the eyes and the brain.
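The receptive-field idea can be illustrated with a toy numerical sketch. This is not the researchers’ actual model; it is a simplified assumption in which each electrode evokes a blurry Gaussian blob (a “phosphene”) covering a region of the visual field, rather than lighting up a single crisp pixel. Summing many overlapping blobs shows why an electrode array smears detail that a same-count pixel grid would preserve.

```python
import numpy as np

def pixel_rendering(image, n):
    """Ideal camera-like sampling: an n x n grid of crisp point samples."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, n).astype(int)
    xs = np.linspace(0, w - 1, n).astype(int)
    return image[np.ix_(ys, xs)]

def phosphene_rendering(image, n, sigma=2.0):
    """Render the same image as n*n overlapping Gaussian phosphenes.

    Each simulated electrode contributes a Gaussian blob whose brightness
    tracks the image intensity at the electrode's location -- a stand-in
    for a neuron's receptive field covering an area, not a point.
    """
    h, w = image.shape
    out = np.zeros((h, w), dtype=float)
    yy, xx = np.mgrid[0:h, 0:w]
    for ey in np.linspace(0, h - 1, n):
        for ex in np.linspace(0, w - 1, n):
            blob = np.exp(-((yy - ey) ** 2 + (xx - ex) ** 2) / (2 * sigma ** 2))
            out += image[int(ey), int(ex)] * blob
    return out / out.max()  # normalize to [0, 1]

# Toy image: a bright region above a dark diagonal edge.
img = np.fromfunction(lambda y, x: (x > y).astype(float), (64, 64))
pixels = pixel_rendering(img, 8)          # 8x8 crisp samples
phosphenes = phosphene_rendering(img, 8)  # 64 overlapping blurry blobs
```

Even with equal electrode and pixel counts, the phosphene render’s sharp edge is washed out by the overlapping blobs, which is the intuition behind the simulated cat images above.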

It would be a daunting task to re-create the codes that are used by thousands upon thousands of cells in the visual cortex to produce normal human vision, Fine said.

“Even to get to typical human vision, you would not only have to align an electrode to each cell in the visual cortex, but you’d also have to stimulate it with the appropriate code,” she said. “That is incredibly complicated because each individual cell has its own code. You can’t stimulate 44,000 cells in a blind person and say, ‘Draw what you see when I stimulate this cell.’ It would literally take years to map out every single cell.”

In a follow-up email, Fine told GeekWire that each person has a unique neuronal code for interpreting vision.

“It’s pretty easy to predict the spatial location and the size of the visual world that is represented by a neuron based on anatomy,” she said. “But I can’t think of any way to predict the orientation, or whether that neuron represents an on-cell (bright spot on dark background) or off-cell (dark spot on light background).”

Fine said researchers may someday come up with a conceptual breakthrough that provides a “Rosetta Stone” for visual processing in the brain. It’s also possible that users of an artificial-vision system like Blindsight could learn to adapt to an incorrect code in the system. “But my own research, and that of others, shows that there’s currently no evidence that people have massive abilities to adapt to an incorrect code,” Fine said.

The UW researchers said their computer-generated models may come in handy for assessing the potential performance of artificial-vision systems. Neuralink isn’t the only team working on such systems: For example, a team led by researchers at the Illinois Institute of Technology began a clinical trial of a 400-electrode brain implant known as the Intracortical Visual Prosthesis two years ago. This April, the Illinois team said the implants provided study participants with an improved ability to navigate and perform visually guided tasks.

Fine said artificial-vision simulations may also provide surgeons as well as patients and their families with more realistic expectations for the technology.

“Many people become blind late in life,” she said. “When you’re 70 years old, learning the new skills required to thrive as a blind individual is very difficult. There are high rates of depression. There can be desperation to regain sight. Blindness doesn’t make people vulnerable, but becoming blind late in life can make some people vulnerable. So, when Elon Musk says things like, ‘This is going to be better than human vision,’ that is a dangerous thing to say.”

The research described in the Scientific Reports study, “A Virtual Patient Simulation Modeling the Neural and Perceptual Effects of Human Visual Cortical Stimulation, From Pulse Trains to Percepts,” was funded by the National Institutes of Health.

Source: GeekWire, July 29, 2024 (https://ift.tt/5CGmOLH)