
Synthetic Manipulation Motion Generation for Robotics
Generate large, diverse sets of synthetic motion trajectories for robot manipulation from just a few human demonstrations.
Use-Case Description
Imitation learning lets robots learn skills by observing human demonstrations. But gathering enough high-quality real-world data can be challenging, costly, and time-consuming. Synthetic data, generated from physically accurate simulations, addresses this limitation by accelerating data collection and providing the diversity needed for robot learning models to generalize.
The NVIDIA Isaac GR00T-Mimic blueprint for synthetic manipulation motion generation is the ideal place to start. It is a reference workflow, built on NVIDIA Omniverse™ and NVIDIA Cosmos™, for creating large, diverse sets of synthetic motion trajectories for robot manipulation from a small number of human demonstrations.
First, developers use a spatial computing device such as the Apple Vision Pro to portal into a simulated robot digital twin and record motion demonstrations by teleoperating the simulated robot. These recordings are then used to generate a larger set of physically accurate synthetic motion trajectories. Finally, the blueprint augments the dataset further, producing a much larger, photorealistic, and diverse set of training data.
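The teleoperated recordings are typically stored as episodes in a robomimic-style HDF5 file. As a minimal sketch (the file name and key layout here are assumptions about the recording format, not the blueprint's documented output), the demonstrations can be inspected like this:

```python
# Minimal sketch: inspect teleoperated demonstrations stored in a
# robomimic-style HDF5 file. The file name "demo_dataset.hdf5" and the
# "data/<demo>/actions" key layout are assumptions about the recording
# format, not the blueprint's documented output.
import h5py

with h5py.File("demo_dataset.hdf5", "r") as f:
    demos = f["data"]  # one group per recorded episode
    print(f"{len(demos)} human demonstrations recorded")
    for name, demo in demos.items():
        actions = demo["actions"]  # (T, action_dim) teleop commands
        print(f"{name}: {actions.shape[0]} steps, action dim {actions.shape[1]}")
```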
Experience Walkthrough
The overall experience is divided into four distinct parts:
1. Choose from a pre-recorded set of human demonstrations.
2. View the synthetically generated motion.
3. Select from the list of pre-populated prompts to augment the generated motions.
4. Click "View Source Code" to retrieve the blueprint from GitHub.
Architecture Diagram
(Diagram: teleoperated demonstrations feed GR00T-Mimic trajectory generation in Isaac Lab, followed by GR00T-Gen and Cosmos-Transfer1 augmentation.)
What’s Included in the Blueprint
Sample Recorded Data
- Pre-recorded human demonstrations for a single-arm manipulation task
Robot Simulation and Training Frameworks
- NVIDIA Isaac™ Lab, an open-source, unified framework for robot learning, built on NVIDIA Isaac Sim and designed to help train robot policies (a minimal launch sketch follows below)
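For orientation, here is a minimal sketch of launching an Isaac Lab application. It assumes the Isaac Lab 2.x isaaclab package namespace (earlier releases used omni.isaac.lab) and a working Isaac Sim installation underneath; it is not blueprint-specific code.

```python
# Minimal sketch of starting an Isaac Lab application headlessly.
# Assumes the Isaac Lab 2.x "isaaclab" namespace (1.x used "omni.isaac.lab")
# and an Isaac Sim installation underneath.
import argparse
from isaaclab.app import AppLauncher

parser = argparse.ArgumentParser(description="Launch an Isaac Lab app.")
AppLauncher.add_app_launcher_args(parser)  # adds --headless, device flags, etc.
args = parser.parse_args()

app_launcher = AppLauncher(args)   # boots Isaac Sim with the given options
simulation_app = app_launcher.app  # handle used to step/close the app

# ... set up the scene, record or replay demonstrations here ...

simulation_app.close()
```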
Data Generation
- GR00T-Mimic, a feature in Isaac Lab that uses the recorded demonstrations as input to generate synthetic motion trajectories (the core retargeting idea is sketched below)
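GR00T-Mimic follows the MimicGen recipe: each demonstration is segmented into object-centric subtasks, and the end-effector motion of each segment is replayed relative to new, randomized object poses. A minimal numpy sketch of that core retargeting transform (names are illustrative, not the blueprint's API):

```python
# Sketch of the object-centric retargeting at the heart of MimicGen-style
# data generation: a recorded end-effector segment, expressed in the frame
# of the object it manipulates, is replayed against a new object pose.
# All names are illustrative; this is not the blueprint's API.
import numpy as np

def retarget_segment(eef_poses: np.ndarray,
                     src_obj_pose: np.ndarray,
                     new_obj_pose: np.ndarray) -> np.ndarray:
    """eef_poses: (T, 4, 4) homogeneous end-effector poses from one demo.
    src_obj_pose / new_obj_pose: (4, 4) object poses in the world frame."""
    # Express each recorded pose relative to the object seen in the demo...
    rel = np.linalg.inv(src_obj_pose) @ eef_poses
    # ...then re-anchor the segment to the object's new, randomized pose.
    return new_obj_pose @ rel
```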
Data Augmentation
- GR00T-Gen, a feature in Isaac Lab for augmenting 3D datasets to achieve the necessary photorealism and diversity
- NVIDIA Cosmos-Transfer1-7B model for prompt-driven, photoreal augmentation of the rendered rollouts (see the fan-out sketch below)
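The augmentation stage multiplies, rather than merely copies, the dataset: every rendered rollout can be paired with every pre-populated prompt. A small sketch of that combinatorial fan-out (paths and prompt strings are placeholders; the real blueprint drives the Cosmos-Transfer1-7B inference pipeline):

```python
# Sketch of prompt-driven fan-out: each rendered rollout is paired with
# every pre-populated prompt, so visual diversity scales multiplicatively.
# Paths and prompt strings are placeholders, not blueprint assets.
from pathlib import Path

PROMPTS = [
    "a cluttered industrial workbench under harsh fluorescent light",
    "a clean lab bench at dusk with warm ambient lighting",
    "a home kitchen counter with natural daylight",
]

videos = sorted(Path("rollouts").glob("*.mp4"))  # rendered synthetic rollouts
jobs = [(video, prompt) for video in videos for prompt in PROMPTS]
print(f"{len(videos)} rollouts x {len(PROMPTS)} prompts = {len(jobs)} augmented clips")
# Each (video, prompt) pair would then be passed to the Cosmos-Transfer1-7B
# inference pipeline to produce a photoreal variant of the rollout.
```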
File Deliverables
Input:
- Pre-selected collection of human demonstration recordings, captured with teleoperation in simulation
- Pre-populated prompts for augmenting the generated data
Output:
- Synthetically generated trajectories (a loading sketch follows this list)
- Augmented video displayed on the screen
- Jupyter notebook to recreate the end-to-end development experience
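To recreate the training side of the workflow, the generated trajectories can be wrapped as (observation, action) pairs for imitation learning. The sketch below assumes the same robomimic-style HDF5 layout as above; the observation key is a placeholder to adjust to the actual file contents.

```python
# Minimal sketch of consuming the generated trajectories for imitation
# learning: wrap the HDF5 episodes as (observation, action) pairs. The key
# names ("obs/eef_pos", "actions") are assumptions about the file layout.
import h5py
import numpy as np
import torch
from torch.utils.data import Dataset

class TrajectoryDataset(Dataset):
    def __init__(self, path: str, obs_key: str = "obs/eef_pos"):
        self.samples = []
        with h5py.File(path, "r") as f:
            for demo in f["data"].values():
                obs = np.asarray(demo[obs_key])      # per-step observations
                act = np.asarray(demo["actions"])    # per-step actions
                self.samples += list(zip(obs, act))  # flatten episodes

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        obs, act = self.samples[idx]
        return (torch.as_tensor(obs, dtype=torch.float32),
                torch.as_tensor(act, dtype=torch.float32))
```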
Minimum System Requirements
Hardware Requirements
GPU
- NVIDIA RTX 6000 Ada Generation, GeForce RTX 4090 or 5090, L40, L40S, L20, or A40, or any higher-class NVIDIA RTX™-capable GPU
- Cosmos: HGX node (1x H100); exact requirements TBD
CPU
- Intel Core i7 (7th Generation) or AMD Ryzen 5
OS Requirements
- Ubuntu 22.04
- Windows 11
Ethical Considerations
NVIDIA believes trustworthy AI is a shared responsibility, and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their supporting team to ensure the technologies meet requirements for the relevant industry and use case and address unforeseen product misuse. Please report security vulnerabilities or NVIDIA AI concerns here.
Licenses
Licensing information for Isaac Lab can be found here.
Licensing information for NVIDIA Cosmos can be found under the NVIDIA Open Model License.
Licensing information for GR00T-Mimic can be found here.
Terms of Use
Governing Terms: This trial service is governed by the NVIDIA API Trial Terms of Service.
