NVIDIA has introduced HOVER, a 1.5 million-parameter neural network designed to give humanoid robots advanced locomotion and manipulation. HOVER was trained in the company's Isaac simulation suite, a GPU-accelerated environment that runs physics simulations up to 10,000 times faster than real time. That speedup let the model absorb a full year of simulated training in roughly 50 minutes on a single GPU, and the resulting capabilities transfer to real-world applications without additional fine-tuning, cutting development time.
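As a rough sanity check, those numbers are internally consistent: one simulated year at a 10,000x speedup works out to a little over 50 wall-clock minutes.

```python
# Back-of-the-envelope check of the training-time claim (not NVIDIA code).
SIM_SPEEDUP = 10_000                   # simulated seconds per wall-clock second
seconds_per_year = 365 * 24 * 60 * 60  # ~31.5 million seconds
wall_clock_minutes = seconds_per_year / SIM_SPEEDUP / 60
print(f"{wall_clock_minutes:.1f} wall-clock minutes")  # ~52.6
```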
HOVER was engineered to accommodate a wide range of input devices, enabling it to respond to diverse high-level motion instructions. These inputs include head and hand poses from XR devices like the Apple Vision Pro, whole-body positions from motion capture or RGB cameras, joint angles from exoskeletons, and root velocity commands from joysticks. This adaptability allows HOVER to serve as a versatile, unified interface for teleoperating robots and collecting data for further training.
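One rough way to picture this unified interface is a single command structure in which each input device fills in only the targets it can provide, and a mask tells the controller which targets to track. The sketch below is purely illustrative; the class and field names are hypothetical and not HOVER's actual API.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class HumanoidCommand:
    """One unified command; each teleoperation device fills in what it knows."""
    head_pose: Optional[np.ndarray] = None       # from an XR headset (position + orientation)
    hand_poses: Optional[np.ndarray] = None      # from XR hand tracking
    body_keypoints: Optional[np.ndarray] = None  # from motion capture or an RGB camera
    joint_angles: Optional[np.ndarray] = None    # from an exoskeleton
    root_velocity: Optional[np.ndarray] = None   # from a joystick (vx, vy, yaw rate)

    def mask(self) -> dict:
        """Report which targets are present, so the controller knows what to track."""
        fields = ("head_pose", "hand_poses", "body_keypoints", "joint_angles", "root_velocity")
        return {name: getattr(self, name) is not None for name in fields}

# Example: joystick-only teleoperation supplies just a root velocity target.
cmd = HumanoidCommand(root_velocity=np.array([0.5, 0.0, 0.1]))
print(cmd.mask())  # only 'root_velocity' is True
```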
The model also integrates with upstream Vision-Language-Action (VLA) models, converting their high-level instructions into fine-grained motor signals at high frequency for responsive, nuanced robot control. Because it is compatible with any humanoid robot that can be simulated in NVIDIA's Isaac, HOVER gives developers a platform for translating robotic potential into practical applications across sectors such as healthcare, logistics, and customer service.
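Conceptually, that pairing resembles a two-rate control loop: a VLA model issues high-level commands a few times per second, while the whole-body policy converts the latest command into joint-level actions at a much higher rate. The sketch below only illustrates that structure; the rates, function names, and array sizes are assumptions, not NVIDIA's implementation.

```python
import numpy as np

PLAN_HZ = 2       # assumed rate at which the VLA model issues high-level commands
CONTROL_HZ = 200  # assumed rate at which the whole-body policy outputs joint targets

def vla_plan(observation) -> np.ndarray:
    """Placeholder for a VLA model: maps an observation to a high-level command."""
    return np.zeros(8)   # e.g. desired root velocity plus hand targets (illustrative)

def whole_body_policy(command: np.ndarray, robot_state) -> np.ndarray:
    """Placeholder for the low-level policy: command + state -> joint targets."""
    return np.zeros(28)  # one target per actuated joint (illustrative)

command = vla_plan(observation=None)
for step in range(CONTROL_HZ):                # one second of control
    if step % (CONTROL_HZ // PLAN_HZ) == 0:   # re-plan a few times per second
        command = vla_plan(observation=None)
    joint_targets = whole_body_policy(command, robot_state=None)
    # send joint_targets to the robot's actuators here
```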
Earlier this year, NVIDIA also announced Project GR00T, a general-purpose foundation model for humanoid robots capable of understanding natural language and emulating human movements. Robots powered by GR00T are designed to learn coordination and agility by observing human actions, accelerating their ability to interact effectively in the real world.
HOVER and GR00T reflect NVIDIA’s ongoing efforts to advance AI-driven robotics, positioning humanoid robots as adaptable, intelligent companions in real-world environments. This progress marks a significant step toward realizing robotics that can fluidly integrate with human environments, pushing the boundaries of AI in physical, interactive applications.
Source: Research article, GitHub