Some interesting takeaways:
1 - The device has scope in translation. They showed how it can recognise a visual cue like a banana and translate how the brain represents it as a waveform. In theory this would be language- and character-independent, as long as you know what object you're looking at. Could be like a Babel chip for the brain, perhaps.
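The "language-independent" idea can be sketched as a two-stage lookup: decode the waveform to a concept, then render that concept in any language. Everything below is hypothetical toy code (the decoder stub, the label table); the real decoding is obviously nothing like a threshold check.

```python
# Toy sketch: waveform -> concept ID -> word in any language.
# The concept table and decoder are invented for illustration only.

CONCEPT_LABELS = {
    "banana": {"en": "banana", "es": "plátano", "de": "Banane"},
}

def decode_concept(waveform):
    """Stand-in for a real neural decoder: here, a crude amplitude check."""
    return "banana" if max(waveform) > 0.9 else None

def translate(waveform, language):
    """Render the decoded concept in the requested language, if decoded."""
    concept = decode_concept(waveform)
    return CONCEPT_LABELS[concept][language] if concept else None

print(translate([0.1, 0.95, 0.3], "es"))  # prints "plátano"
```

The point of the structure: the decoder never touches words, so adding a language is just another column in the lookup table.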
2 - The manipulation of the pig's legs reminded me of the videogame brawler Toribash. You can move, compress and expand joints and get really jerky movements, but if you can do this for multiple joints in harmony, you get smooth motion. Neuralink's visual looked a lot like Toribash's.
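The jerky-vs-smooth contrast can be shown with a toy: stepping one joint at a time in coarse increments versus easing all joints along the same timeline. The joint names and numbers are made up; this is just to illustrate why coordinated timing looks smooth.

```python
import math

def ease(t):
    """Cosine ease-in/ease-out: 0 -> 1 smoothly, zero velocity at both ends."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

def jerky_trajectory(start, target):
    """Toribash-style: each joint snaps in coarse 15-degree steps, one joint at a time."""
    frames, angles = [], list(start)
    for joint in range(len(start)):
        while abs(angles[joint] - target[joint]) > 1e-9:
            step = max(min(target[joint] - angles[joint], 15.0), -15.0)
            angles[joint] += step
            frames.append(list(angles))
    return frames

def smooth_trajectory(start, target, steps):
    """All joints move together, each following the same eased timeline."""
    return [
        [s + (g - s) * ease(i / steps) for s, g in zip(start, target)]
        for i in range(1, steps + 1)
    ]

start = [0.0, 0.0, 0.0]          # hypothetical hip/knee/ankle angles (degrees)
target = [45.0, 90.0, 30.0]
print(smooth_trajectory(start, target, 5)[-1])  # lands exactly on target
```

Same start and end poses either way; the difference is purely in how the in-between frames are coordinated.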
3 - Wireless charging without heating the chip by more than 2 degrees (beyond which there's a risk of tissue damage) seems like a scary thing to have permanently enabled near your brain/head, but they've done it. Human trials will surely involve far more complications, though.
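One way to stay under a ceiling like that is duty-cycling: pause charging as the chip approaches the limit, resume once it cools. The heating/cooling rates and thresholds below are invented toy numbers, not anything Neuralink disclosed.

```python
def simulate_charge(minutes, ambient=37.0, limit=2.0):
    """Toy hysteresis controller: chip heats 0.1 C/min while charging,
    cools 0.2 C/min while paused. Returns (minutes charged, final temp rise)."""
    temp, charging, delivered = ambient, True, 0
    for _ in range(minutes):
        if temp - ambient >= limit:
            charging = False          # pause near the 2-degree ceiling
        elif temp - ambient <= limit * 0.5:
            charging = True           # resume once the chip has cooled back down
        if charging:
            temp += 0.1
            delivered += 1
        else:
            temp = max(ambient, temp - 0.2)
    return delivered, temp - ambient

delivered, rise = simulate_charge(120)
```

The hysteresis gap (resume at half the limit rather than just under it) avoids rapid on/off flapping right at the threshold.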