This makes me very excited! I have Spinal Muscular Atrophy type 2 [1] and I have lost most of my physical capabilities save for 2-3 fingers and of course speech. Although I am now on Nusinersen [2] treatments I am still becoming weaker over time, albeit extremely slowly.
It brings me comfort to know that such a fallback will eventually exist, should I need one.
Note that specialists are saying that another promising drug from Scholar Rock [3] would probably prevent any further weakening if used in conjunction with my current treatment. Unfortunately, the FDA takes a long time to approve new medications and I have heard this one is particularly special because there is potential for abuse by athletes.
I hope you can get on that promising drug ASAP and that it works wonders. I think the future looks bright for medicine with all these new breakthroughs coming out.
I love that one of the first things he said he did was binge on Civilization 6 until 6am. Welcome back, buddy!
Strangely, that simple example was the most powerful part for me. I've done that so many times and it was such a fun experience. Now he gets to re-live that joy (and follow-up shame) again!
Some clarity for people:
- Other techniques would likely work for this patient. Eye-gaze technology is pretty readily available, he has proximity switches for driving his chair, and let's not forget voice control. So what does BCI offer that's significantly better? I think this is the BIG problem BCI has: the gains are not enough for a lot of people compared to what the AT (assistive technology) sector can already offer. Please remember this. This guy could use eye gaze or voice control on pretty regular hardware, and no surgery needed.
- These types of BCI are effectively an array of switches. You typically map a motor thought, e.g. "move your arm up" => moving the cursor up. This may be how you then control a game such as chess, if it has keyboard shortcuts. Eye movement could be handled the same way, but there are easier options. Interestingly, to measure these motor commands you don't really need an intracortical BCI. You can do it with surface EEG. Sticking it inside your head, closer to the centres where you can measure intentional thought, makes the signal cleaner and more reliable.
- The big breakthrough is really making this intracortical stuff safer and long-term. It's getting there. But this isn't it.
The big wins out there are in speech BCI. That's hardcore. Even in the two main studies doing this, each of the participants requires a LOT of training time to make a machine learning model work efficiently.
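The "array of switches" idea above can be sketched as a toy decoder loop. Everything here is hypothetical (the thresholding "classifier", the label names, the step size); a real decoder would be a trained ML model, but the switch-to-action mapping is the point:

```python
# Toy sketch of the "array of switches" model: a classifier turns a window
# of neural samples into one of a few discrete motor-imagery labels, and
# each label is mapped to a cursor action, like a bank of switches.
from typing import Callable, Dict, List, Tuple

def classify_window(window: List[float]) -> str:
    """Stand-in for a trained motor-imagery classifier.
    Here we just threshold the mean amplitude; a real decoder
    would be a trained model over many channels."""
    mean = sum(window) / len(window)
    if mean > 0.5:
        return "imagine_arm_up"
    if mean < -0.5:
        return "imagine_arm_down"
    return "rest"

# Each "switch" (decoded thought) maps to a cursor action.
ACTIONS: Dict[str, Callable[[Tuple[int, int]], Tuple[int, int]]] = {
    "imagine_arm_up":   lambda pos: (pos[0], pos[1] - 5),  # cursor up
    "imagine_arm_down": lambda pos: (pos[0], pos[1] + 5),  # cursor down
    "rest":             lambda pos: pos,                   # no movement
}

def step(pos: Tuple[int, int], window: List[float]) -> Tuple[int, int]:
    """One tick of the control loop: classify, then apply the mapped action."""
    return ACTIONS[classify_window(window)](pos)

pos = (100, 100)
pos = step(pos, [0.9] * 10)   # "arm up" thought -> cursor moves up
```

The same skeleton works whether the labels come from surface EEG or an implant; the implant just makes `classify_window` far more reliable.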
Eye tracking does not work nearly as well as people imagine. It cannot directly replace a mouse pointer the way you want it to. The accuracy and reliability are not good enough and never will be due to physical constraints. This system is likely already working better than an eye tracker would for cursor control, and it will certainly improve.
Apple has done great stuff with eye tracking on Vision Pro, but it required completely rewriting the UI for literally everything. Not something we have the luxury of doing for accessibility for quadriplegics.
Source: built an eye tracker and eye-controlled UI at a startup that got acquired by Google.
He notes these other forms of HCI in the video, and I think you’re really underselling the main point of this being ease of use. All of those methods are significantly degraded experiences compared to normal ways of interacting with a computer. The potential for a quadriplegic to interface with a computer at a higher ease of use compared to a human without disabilities is huge.
I don't think the point for him was to simply find a way to control his mouse. He said at the end of the clip he specifically wanted to help out with Neuralink. He also said the surgery was easy and he was released a day later.
He appears to just think about where the mouse should go, and then is able to click and click-and-hold. That seems like multiple inputs, which an eye tracker wouldn't give you. Unless maybe it's just configured to click when the cursor pauses on a spot?
Also, the user experience seems better than attaching electrodes to your head. It just works wirelessly, it's always there, and he only has to recharge it sometimes.
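That "click when the cursor pauses" idea is known as dwell clicking and is common in eye-gaze software. A minimal sketch, purely illustrative and not how Neuralink's software actually works:

```python
import math

DWELL_RADIUS = 10.0   # pixels the cursor may wander and still count as "paused"
DWELL_TIME = 1.0      # seconds of pause required to fire a click

class DwellClicker:
    """Fire a click when the cursor stays within a small radius long enough."""

    def __init__(self):
        self.anchor = None   # (x, y) where the current dwell started
        self.start = None    # timestamp when the current dwell started

    def update(self, x, y, t):
        """Feed cursor samples; returns True when a dwell click fires."""
        if self.anchor is None or math.dist(self.anchor, (x, y)) > DWELL_RADIUS:
            self.anchor, self.start = (x, y), t   # cursor moved: restart dwell
            return False
        if t - self.start >= DWELL_TIME:
            self.anchor, self.start = None, None  # reset so it fires only once
            return True
        return False
```

A separately decoded "click" intention avoids the classic dwell problem: with dwell alone, anywhere you rest the cursor eventually clicks.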
> You can do it with surface EEG. Sticking it inside your head, closer to the centres where you can measure intentional thought, makes the signal cleaner and more reliable.
Further clarification: when doing conventional EEG, the signal quality is so fragile that even blinking can produce recording artifacts.
Also, there's the whole "put on a shower cap with conductive gel" thing that makes it very impractical for everyday use.
I think once this tech can be used to connect to a robot / exoskeleton, then it will be very useful for someone like him. Imagine him thinking that he wants an apple from the fridge in another room and the robot goes and grabs it for him.
Sorry but what you're saying simply is not true from my experience. The potential of this technology is the ability to perform multiple actions simultaneously e.g. having your character strafe while zooming in while firing a trigger. With eye tracking you are limited to (for the most part) one action at a time, which is what I do now.
You are correct in that one could add more inputs but that only works if you can use the inputs. The individual in the video has full control of his head which many people do not. All I can do, for example, is use like two fingers.
There’s a ton more that these implants can do other than cursor control. And not needing external visible hardware, like other forms of input require, is a bigger deal than you might think, especially in terms of the user wanting to feel more “normal” and less like a robot in a bundle of contraptions.
I haven't followed Neuralink too closely since it was announced, so I was not expecting to see what I just saw. I've seen a handful of breakthrough moments in my life - I think this will be remembered as one.
Why would this be remembered as a breakthrough? Playing games with a BCI is many years old at this point. Here is an article from 2020 about playing Sonic the Hedgehog, amongst other games [1]. And here is a man fist-bumping President Obama in 2016 with a brain-controlled robot arm.
I don't think the device itself is a breakthrough. The issue beforehand was that tissue in the brain tries to heal around the implant, which can be lethal. I don't know what they're doing to make a PERMANENT implant possible - to completely stop that area from "healing" so that the implant doesn't become a legitimate hazard.
Personally, I think the most exciting part of Neuralink and other companies working on BCIs is the fact that they're trying to keep these implants in long-term, and scale the deployment significantly. Most academic BCI research thus far has just been trials, without patients getting to keep the implants long term.
Moving the mouse wasn't that impressive. That he could turn off the music just like that, while the game was open, was impressive. And Civ 6 is way more complex to operate than chess. I assume that's not mouse driven.
Still early days for this tech but it seems impressive.
I never could get into Civ. Do you have any tutorials or a specific game edition to recommend? Maybe my issue is that it's too slow to get started with, but it seems like I'd enjoy it if I stuck with it.
A key differentiator of Neuralink is that the implant can both read and write through ~1000 channels (each of which is a tiny wire into the brain). So it's not really the same thing as external devices that read electrical activity from outside the brain, because those cannot write data. Not sure if the initial implant supports much of this, obviously you'd start with the simplest use cases.
Which is why I will never get anything like that installed unless I'm paralyzed or effectively so. Humans have no mechanism for recognizing that a sensation, thought, or emotion arising within themselves was actually inserted by malware. No thanks.
That's pretty amazing, the fact he's able to click the pause button with his brain alone is insane to me - that's like Apple Vision Pro without the gigantic goggles.
This is a good point. Seems like a likely candidate for an S curve in tech development. i.e. next 15 years of VR are improvements to camera, display, and tracking technology. Following 15 years are brain implants.
The sheer joy on this man’s face at being able to freely control a mouse again and engage with technology in general. If they’re able to make this a generally safe procedure, a lot of people will be interested in just that.
As soon as you can stimulate tactile impressions it's over. You can put on your VR Headset and be in a completely different world. Eventually the interface for eyes and ears will improve, but tactility would be a huge step towards being in a completely virtual world.
Semi related tangent incoming: I’m reminded of a book I read last year named Semiosis by Sue Burke.
Tiny spoiler warning I guess though not really, it’s just background world building that was used as motivation for side character’s growth. In the book, there was a Hitler-esque villain who existed long before the characters were born. The villain killed many billions of people. But through cloning, the societies of Earth punish this villain for their entire life by feeding them torturous scenarios through their brain implants. These were scenarios like being chased and eaten by a tiger, running naked through a frozen tundra, execution, etc.
The clone thought it was entirely real because it was all in their brain implant, even though they were safe in a jail cell. And as an extra Black Mirror-y twist, anyone in that society could tune in with their own implant to watch the clone being tortured.
I’m not really trying to cast doom and gloom on this brain implant tech, I think it’s neat. I was just reminded of the book I read when you mentioned simulating tactile impressions and virtual worlds. Pleasant simulations would be great, but even “benignly” scary ones like a virtual haunted house in your brain could be terrifying. (As someone who hates haunted houses.)
The inspiration for this (as well as for SpaceX) comes from the “Culture” series of novels by Iain M. Banks, which most people are apparently unfamiliar with. Specifically, the BCI there is called a “neural lace”; it grows along with the brain from a seed and covers its entire surface. It serves as an interface to access superhuman AIs and information in general, on demand, and only the hopeless luddites choose not to have it.
Sorry but surely the idea of having a brain computer interface predates the Culture novels by many decades. Similarly, how are these books the inspiration for SpaceX? Those ideas of traveling the stars (well, planets) _absolutely_ predate the culture series.
Seems pretty obvious to me that these ideas originated long ago in the scientific world and were (beautifully) expanded upon by science fiction authors (again, many decades ago).
They probably interviewed him to see if he’d be a good fit for PR reasons. And I imagine he got the implant for free, so he has plenty to be happy about with the situation.
I'd like to know if they're doing 'online training' - i.e. do the weights of the neural net which converts raw signal data into mouse movements update themselves every few seconds using historical data?
Such online training might be necessary to deal with brain plasticity - i.e. the optimal set of neurons to read to determine X/Y mouse movement right now might not be the same set it was an hour ago.
Such plasticity can be seen in regular humans too when they say 'whoa, I haven't used a pen for months - let me get used to writing again!'.
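A minimal sketch of what such online recalibration could look like, assuming the simplest possible setup: a linear decoder nudged by gradient steps on recent (signal, intended-movement) pairs. This is purely illustrative, not Neuralink's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels = 8                    # toy channel count; real implants have ~1000
W = np.zeros((2, n_channels))     # linear decoder: channel activity -> (dx, dy)
LR = 0.05                         # step size for the online updates

def decode(signal):
    """Map one vector of channel activity to a cursor velocity."""
    return W @ signal

def online_update(signal, intended):
    """One gradient step on squared error. Run continuously in the background,
    this lets the decoder track slow drift in which neurons encode what."""
    global W
    err = decode(signal) - intended
    W -= LR * np.outer(err, signal)

# Simulate recalibration against the brain's "current" encoding, true_W.
true_W = rng.normal(size=(2, n_channels))
for _ in range(1000):
    s = rng.normal(size=n_channels)
    online_update(s, true_W @ s)

print(np.allclose(W, true_W, atol=1e-3))  # the decoder has tracked the mapping
```

If `true_W` itself drifted slowly during the loop, the decoder would follow it, which is the plasticity problem the comment describes.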
But giving it the benefit of the doubt, this looks mind-blowing to me!
It was kind of known for a while already that the research and tech were almost there, but seeing the demonstration live like that - incredible!
But then it takes me back to those Musk companies - maybe it's just repackaged, already-available research presented in a nice way, making us believe it could be 'deployed' in the real world, while in reality it can only be done in a very controlled environment. And we are led to believe that we are '2 weeks away' from it being widely available. I hope we're wrong here.
I want to know how big the GPU crunching all the numbers is to make this work...
The mouse seems to move very nicely and smoothly (60 FPS?), which presumably means the neural net converting raw sensor data to mouse movements runs in ~15 milliseconds.
Most neural nets don't do a forward pass in 15ms unless they're either tiny or the GPU is very powerful.
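A back-of-the-envelope check, with a made-up network size (the real decoder's architecture isn't public): a small MLP over ~1000 channels is microseconds of work, so the ~16 ms frame budget is not hard to hit even on a CPU.

```python
import time
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical decoder size: 1024 input channels -> two hidden layers -> (dx, dy).
# Roughly 280k multiply-adds per pass - tiny by modern standards.
W1 = rng.normal(size=(1024, 256))
W2 = rng.normal(size=(256, 64))
W3 = rng.normal(size=(64, 2))

def forward(x):
    h = np.maximum(x @ W1, 0)   # hidden layer 1, ReLU
    h = np.maximum(h @ W2, 0)   # hidden layer 2, ReLU
    return h @ W3               # output: cursor velocity (dx, dy)

x = rng.normal(size=1024)
forward(x)                      # warm-up pass
t0 = time.perf_counter()
for _ in range(100):
    forward(x)
ms = (time.perf_counter() - t0) / 100 * 1000
print(f"~{ms:.3f} ms per forward pass")  # comfortably under a 60 FPS frame
```

So "tiny network" is the likely answer; no big GPU is needed at this scale.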
This video kinda looks like the patient really likes their new implant and its abilities, but is pretty frustrated that they now need to do more trials and make marketing videos for the implant when they just want to get on with their life...
Not exactly - it is a bunch of wires measuring differential voltages inside the brain, not on the surface of the skull. So you get much better specificity in the signals (think of it as resolution). Even if you put 1000 probes on the skull you wouldn't get much information.
It is fascinating. But looking at this guy's condition, playing games would IMHO not be the priority. I would hook the remote control up to a robotic exoskeleton, just to be able to function normally again. I guess that can also be detected (the intention to move the feet or hands in any direction).
Your idea is that the priority should have been, instead of first getting a mouse working on his laptop, to hook him up to some kind of robotic exoskeleton?
Of course, but that’s going to take so much more time and effort. You can see how much joy this man gets just from having the autonomy to move a mouse again. That’s honestly amazing.
I don't think it can simultaneously control the dozens of degrees of freedom that a robot would have. Yet. You only need three degrees of freedom for a mouse with button. That alone is transformational for a quadriplegic.
[1]: https://en.wikipedia.org/wiki/Spinal_muscular_atrophy
[2]: https://en.wikipedia.org/wiki/Nusinersen
[3]: https://scholarrock.com/our-pipeline/spinal-muscular-atrophy...
Look at the Starship program for an example of where you can get in 20 iterations.
[1] https://www.washingtonpost.com/video-games/2020/12/16/brain-...
Neuralink has now achieved product-market fit.
Let's see, I think after that, the next product should be Magic Missile. Or maybe Sanctuary?
But let's see, we are really at the beginning.
https://www.nature.com/articles/s41586-023-06094-5
https://www.cea.fr/english/Pages/News/brain-computer-interfa...