A version of this Q&A first appeared in Actuator, TechCrunch’s free robotics newsletter. Subscribe here.
We conclude our year-end robotics Q&A series with this entry from NVIDIA’s Deepu Talla, whom we visited at the chipmaker’s Bay Area headquarters in October. Talla has been with the company for more than 10 years and serves as vice president and general manager of embedded and edge computing. He offers unique insight into the current state of robotics in 2023 and where the field is headed. Over the past few years, NVIDIA has established itself as a leading platform provider for robotics simulation, prototyping, and deployment.
What role will generative AI play in the future of robotics?
We are already seeing productivity gains from generative AI across a variety of industries. It is clear that the impact of GenAI will be transformative across robotics, from simulation to design and more.
- Simulation: Generative models can accelerate simulation development by bridging the gap between 3D technical artists and developers, helping to build scenes and environments and to generate assets. These GenAI-created assets will see increased use in synthetic data generation, robot skill training, and software testing.
- Multimodal AI: Transformer-based models improve robots’ ability to understand the world around them, allowing them to operate in more environments and complete complex tasks.
- Robot (re)programming: Improved ability to define tasks and functions in simple, natural language, making robots more versatile and multipurpose (see the sketch after this list).
- Design: Novel mechanical designs (end effectors, etc.) to improve efficiency.
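To make the (re)programming idea above concrete, here is a minimal Python sketch, not NVIDIA’s tooling, of how a natural-language request might be turned into calls against a fixed robot skill library. The skill names, the JSON plan format, and the hardcoded plan standing in for a language model’s output are all hypothetical.

```python
import json

# Hypothetical skill library; a real robot stack would call motion planners,
# perception modules, etc. These stubs just print what they would do.
def move_to(location: str) -> None:
    print(f"Navigating to {location}")

def pick(obj: str) -> None:
    print(f"Picking up {obj}")

def place(obj: str, location: str) -> None:
    print(f"Placing {obj} at {location}")

SKILLS = {"move_to": move_to, "pick": pick, "place": place}

# In practice, a language model would translate a request such as
# "put the red box on shelf B" into a structured plan like this one.
llm_plan_json = """
[
  {"skill": "move_to", "args": {"location": "table_1"}},
  {"skill": "pick",    "args": {"obj": "red_box"}},
  {"skill": "move_to", "args": {"location": "shelf_B"}},
  {"skill": "place",   "args": {"obj": "red_box", "location": "shelf_B"}}
]
"""

def execute_plan(plan_json: str) -> None:
    """Validate and run each step of a structured plan against the skill library."""
    for step in json.loads(plan_json):
        skill = SKILLS.get(step["skill"])
        if skill is None:
            raise ValueError(f"Unknown skill: {step['skill']}")
        skill(**step["args"])

if __name__ == "__main__":
    execute_plan(llm_plan_json)
```

Because the plan is validated against a known skill library, a malformed or unsupported request fails safely rather than driving the robot with arbitrary generated code.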
What do you think about the humanoid form factor?
Designing autonomous robots is difficult, and humanoids are even more difficult. Unlike most AMRs (autonomous mobile robots), which primarily need to understand floor-level obstacles, humanoids combine locomotion with manipulation and require multimodal AI to build a deeper understanding of their surrounding environment. That demands a huge amount of sensor processing, advanced control, and skill execution.
Breakthroughs in generative AI and foundation models are making the robot skills needed for humanoids more attainable. In parallel, we are seeing advances in simulation that can be used to train AI-based control and perception systems.
What will be the next major category of robots after manufacturing and warehousing?
Markets where companies are feeling the effects of labor shortages and demographic shifts will continue to present the biggest opportunities for robotics. That spans a variety of industries, from agriculture to last-mile delivery to retail and more.
The main challenge in building any new category of autonomous robot is creating the 3D virtual worlds needed to simulate and test the stack. Here again, generative AI helps by allowing developers to build realistic simulation environments faster. Integrating AI into robotics will enable greater automation in environments that are less structured and less “robot friendly.”
How far away are true general-purpose robots?
We continue to see robots becoming more intelligent and able to perform multiple tasks in a given environment. We expect the focus to remain on mission-specific problems while making those robots more generalizable. True general-purpose embodied autonomy is further out still.
Will household robots (beyond vacuum cleaners) become commonplace within the next 10 years?
We expect to see useful personal assistants, lawn mowers, and robots that provide assistance to the elderly become available.
The trade-off that has held back home robots so far is between how much someone will pay for a robot and the value the robot delivers. Robot vacuums have long offered value at their price point, which is why they continue to grow in popularity.
And as robots become smarter, having an intuitive user interface will be key to increasing adoption. Robots that can map their environment and receive voice commands will be easier for home consumers to use than robots that require programming.
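As a hypothetical illustration of that kind of intuitive interface, the Python sketch below maps a spoken request to a navigation goal on a previously built map. The speech-to-text step is stubbed out, and the room names and coordinates are invented; a real home robot would plug in an actual speech model and its own map.

```python
# Minimal sketch of turning a spoken request into a navigation goal on a
# previously built map. Speech recognition is stubbed out with plain text;
# the room names and coordinates below are made up for illustration.

ROOM_MAP = {            # room name -> (x, y) goal on the robot's map, in meters
    "kitchen": (4.2, 1.0),
    "living room": (1.5, 3.8),
    "hallway": (0.0, 2.0),
}

def transcribe(audio) -> str:
    """Stand-in for a speech-to-text model; returns a hardcoded transcript."""
    return "please clean the living room"

def command_to_goal(transcript: str):
    """Match the transcript against known rooms and return a navigation goal."""
    text = transcript.lower()
    for room, goal in ROOM_MAP.items():
        if room in text:
            return room, goal
    return None

if __name__ == "__main__":
    result = command_to_goal(transcribe(audio=None))
    if result:
        room, (x, y) = result
        print(f"Sending robot to the {room} at map coordinates ({x}, {y})")
    else:
        print("Sorry, I didn't recognize a destination in that request.")
```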
The next category to take off will likely focus on things like automated outdoor lawn care. Other domestic robots, such as personal/healthcare assistants, also hold promise, but must address some of the indoor challenges encountered in dynamic, unstructured home environments.
What are some important robotics stories/trends that aren’t getting enough coverage?
The need for a platform approach. Many robotics startups struggle to scale because they build robots suited only to specific tasks or environments. Commercial success at scale requires more versatile robots: robots that can quickly add new skills or bring existing skills to new environments.
Robotics engineers need a platform with the tools and libraries to train and test AI for robots. The platform should provide simulation capabilities to train models and generate synthetic data, let developers run the entire robotics software stack, and support running modern generative AI models directly on the robot.
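As one small piece of that picture, here is a toy Python sketch of domain randomization for synthetic data generation. It is not the API of any real simulator; it simply randomizes object class, pose, texture, and lighting for each sample and records the ground-truth label a perception model would train on, where a real platform would render photorealistic images in a physics-based simulator.

```python
import random

# Toy domain-randomization loop: each sample randomizes object pose, texture,
# and lighting, and records the ground-truth label for perception training.

OBJECT_CLASSES = ["box", "pallet", "tote"]
TEXTURES = ["cardboard", "plastic", "metal"]

def generate_sample(sample_id: int) -> dict:
    return {
        "id": sample_id,
        "object_class": random.choice(OBJECT_CLASSES),  # ground-truth label
        "texture": random.choice(TEXTURES),             # randomized appearance
        "position_m": [round(random.uniform(-2.0, 2.0), 3) for _ in range(3)],
        "yaw_deg": round(random.uniform(0.0, 360.0), 1),
        "light_intensity_lux": round(random.uniform(200.0, 1500.0), 1),
    }

def generate_dataset(n: int) -> list[dict]:
    return [generate_sample(i) for i in range(n)]

if __name__ == "__main__":
    for sample in generate_dataset(3):
        print(sample)
```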
Tomorrow’s successful startups and robotics companies will need to focus on developing new robotic skills and automation tasks, and take full advantage of available end-to-end development platforms.
Source: techcrunch.com