It's been ten years since Google's autonomous car passed the first US state self-driving test in Nevada, a landmark in the field of self-driving cars, or autonomous vehicles, whose origins go back to the 1980s. Every year since Google's achievement, it has felt as if the dream of autonomous pods carrying us from place to place is just an arm's length away. Yet, in reality, we are still far from that dream.

However, AI and semiconductor giant NVIDIA is charting a different path on innovation in autonomous vehicles. At the GTC 2022 keynote earlier this week, NVIDIA founder and CEO Jensen Huang described how the company is advancing fundamental technologies and announced new breakthroughs in AI, the metaverse, gaming, data centres and high-performance computing, robotics, healthcare and, notably, autonomous vehicles.

"I described our work in two ways. One way is the end-to-end result from the Hyperion 8 sensor suite and computer architecture all the way into our perception, our localization, our mapping and our planning systems that led to the car driving by itself in an urban environment door-to-door, address to address. And so, on the one hand, I describe it end to end," said Jensen Huang to INDIAai.

One of the breakthrough announcements from the GTC keynote this year is the Neural Reconstruction Engine, which is a new AI toolset for the NVIDIA DRIVE Sim simulation platform that uses multiple AI networks to turn recorded video data into a simulation. The new pipeline uses AI to automatically extract the key components needed for simulation, including the environment, 3D assets and scenarios. These pieces are then reconstructed into simulation scenes that have the realism of data recordings but are fully reactive and can be manipulated as needed. Achieving this level of detail and diversity by hand is costly, time-consuming and not scalable.
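The pipeline described above has three conceptual stages: extracting scene components (environment, 3D assets, scenarios) from recorded video, reconstructing them into a simulation scene, and then manipulating that scene as needed. The following is a minimal illustrative sketch of that structure in plain Python; all class and function names here are hypothetical and do not correspond to any actual DRIVE Sim API.

```python
from dataclasses import dataclass, field

# Hypothetical data model for the three component types the article
# says the pipeline extracts from recorded drive video.
@dataclass
class Environment:
    name: str          # e.g. "urban_intersection"

@dataclass
class Asset:
    label: str         # e.g. "sedan", "cyclist"
    position: tuple    # (x, y) in scene coordinates

@dataclass
class Scenario:
    description: str   # e.g. "cyclist crosses from the right"

@dataclass
class SimulationScene:
    environment: Environment
    assets: list = field(default_factory=list)
    scenarios: list = field(default_factory=list)

    def move_asset(self, label: str, new_position: tuple) -> None:
        # The reconstructed scene is fully reactive: any extracted
        # asset can be repositioned to create scenario variations.
        for asset in self.assets:
            if asset.label == label:
                asset.position = new_position

def reconstruct(environment, assets, scenarios) -> SimulationScene:
    """Reassemble extracted components into a manipulable scene."""
    return SimulationScene(environment, list(assets), list(scenarios))

# Usage: components that, in the real pipeline, AI networks would
# extract automatically from video are stubbed in by hand here.
scene = reconstruct(
    Environment("urban_intersection"),
    [Asset("sedan", (10.0, 2.5)), Asset("cyclist", (4.0, -1.0))],
    [Scenario("cyclist crosses from the right")],
)
scene.move_asset("cyclist", (6.0, 0.0))
print(scene.assets[1].position)  # the cyclist's repositioned location
```

The point of the sketch is only the shape of the workflow: once recordings are decomposed into typed components, the same scene can be replayed or edited, which is what makes the generated data scalable compared with hand-built scenes.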

Even though the NVIDIA DRIVE Sim platform and AI toolsets such as the Neural Reconstruction Engine offer an end-to-end solution for autonomous driving, that isn't the only approach the company is taking.

"In almost every single case, I also describe it in its parts," said Huang. 

"For example, a customer could decide to use our synthetic data generation system. A customer could decide to use our simulation system. A customer could decide to use our mapping system, and this way, we could adapt the components of this end-to-end system, which is very complicated," he added.

"We could take the components of the end-to-end systems and let customers who are building maybe an AMR for a logistics warehouse, or an autonomous truck inside a campus, or an autonomous shuttle for a university campus or, you know, an airport. They could take the pieces of our technology and apply them to their own use case so that they don't have to build it. And they could adapt this robotics machine learning pipeline to their own application," he added.

In fact, with this approach in mind, NVIDIA introduced DRIVE Thor, which combines the transformer engine of Hopper, the GPU of Ada Lovelace and the CPU of Grace. The new Thor superchip delivers 2,000 teraflops of performance, replacing Atlan on the DRIVE roadmap and providing a seamless transition from DRIVE Orin, which delivers 254 TOPS and is currently in production vehicles.
