ROS is, in my opinion, dying on the industry front.
* It is a dependency hell
* It is resource-heavy on embedded systems
* It is too slow for real-time, high-speed control loops
* Huge chunks of it are maintained by hobbyists and far behind the state of the art (e.g. the entire navigation stack)
* As robotics moves toward end-to-end AI systems, data needs to stay in GPU memory, not get shuttled back and forth across processes through a networking stack.
* Decentralized messaging was the wrong call. A bunch of nodes running on a robot doesn't need a decentralized infrastructure. This isn't Bitcoin. Robots talking to each other, maybe, but not pieces of code on the same robot.
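To make the GPU-memory point concrete, here is a rough sketch (mine, not from the thread) of why cross-process messaging is costly: every hop serializes and copies the payload, while an in-process (or shared-GPU-memory) handoff just passes a reference. It uses only the Python standard library and a plain byte buffer as a stand-in for a tensor; a real ROS transport adds a full serialization layer and a network stack on top of this.

```python
# Sketch: cost of shuttling a large payload "between processes"
# (serialize + deserialize, as an IPC transport must) versus handing
# over an in-process reference. Stdlib only; numbers are illustrative.
import pickle
import time

payload = bytes(16 * 1024 * 1024)  # stand-in for a 16 MB image / feature map

# Inter-process style: every hop copies the payload at least twice.
t0 = time.perf_counter()
wire = pickle.dumps(payload)       # copy 1: serialize for the wire
restored = pickle.loads(wire)      # copy 2: deserialize on the other side
ipc_time = time.perf_counter() - t0

# In-process style: pass a reference, zero copies.
t0 = time.perf_counter()
ref = payload
ref_time = time.perf_counter() - t0

print(f"serialize round-trip: {ipc_time * 1e3:.1f} ms, "
      f"reference handoff: {ref_time * 1e6:.2f} us")
```

The same logic is why AI-heavy stacks lean on shared GPU memory (e.g. CUDA IPC handles) rather than piping tensors through a message bus: the copies, not the routing, dominate.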
Can you say more about the nav stack? I thought nav2 was considered one of the better, more mature packages in ROS2, but it's not my area of expertise.
| As robotics moves toward end-to-end AI systems, stuff needs to stay on GPU memory, not shuttled back and forth across processes through a networking stack.
Very interesting. There is nothing that would prevent PeppyOS nodes from running on the GPU. The messaging tech behind PeppyOS is Zenoh (it's swappable), and it can run on embedded systems (PeppyOS nodes will also support embedded targets in the future). That being said, at the moment the messaging system runs exclusively on the CPU.
What alternatives exist that can replace ROS? I imagine not all companies are using ROS, but I'm not in that field exactly, so I don't know. I always thought the quality of that code was mediocre at best.
Most companies in production are building their own purpose-built systems and not open-sourcing them. High-speed control loops usually run on some form of real-time OS, and AI-forward robots are starting to use fused CUDA kernels.