AgiBot, a robotics startup based in Shanghai, has released AgiBot World, which it describes as the largest humanoid manipulation dataset to date. The dataset is designed to facilitate the development of general-purpose robots for practical use in everyday settings.
The AgiBot World ecosystem includes a dataset, foundation models, standardized benchmarks, and a collaborative framework intended to give the community broad access to high-quality robotic data. The initiative aims to address limitations in current robot learning benchmarks, which often rely on low-quality data and are restricted to short-horizon tasks in controlled environments. These constraints hinder the development of adaptable robotic systems capable of functioning in dynamic, unstructured real-world settings.
AgiBot World comprises over one million trajectories gathered from 100 robots operating across 100 real-world scenarios within five target domains. The dataset emphasizes tasks such as fine-grained manipulation, tool usage, and multi-robot collaboration. These scenarios have been designed to reflect complex, real-world applications, offering a comprehensive foundation for advancing robotics research.
The dataset features advanced multimodal hardware, including array-based visual tactile sensors, 6-DoF robotic hands, and mobile dual-arm robots with whole-body control capabilities. The platform is positioned to support research in multimodal imitation learning, multi-agent collaboration, and adaptive manipulation.
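To make the kind of data described above concrete, the sketch below shows what a single timestep of a multimodal manipulation trajectory might look like when prepared for imitation learning. The field names, shapes, and the `to_training_batch` helper are illustrative assumptions, not AgiBot World's actual schema or API.

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical record for one timestep of a multimodal trajectory.
# Field names and array shapes are illustrative, not AgiBot's schema.
@dataclass
class TrajectoryStep:
    rgb: np.ndarray              # camera image, e.g. (H, W, 3)
    tactile: np.ndarray          # array-based tactile reading, e.g. (16, 16)
    joint_positions: np.ndarray  # proprioception, e.g. a 6-DoF hand
    action: np.ndarray           # commanded joint or end-effector targets

def to_training_batch(steps):
    """Stack per-step observations and actions into arrays suitable
    for behavior-cloning-style imitation learning."""
    obs = {
        "rgb": np.stack([s.rgb for s in steps]),
        "tactile": np.stack([s.tactile for s in steps]),
        "joints": np.stack([s.joint_positions for s in steps]),
    }
    actions = np.stack([s.action for s in steps])
    return obs, actions

# Example: a short synthetic trajectory of 3 steps.
steps = [
    TrajectoryStep(
        rgb=np.zeros((64, 64, 3), dtype=np.uint8),
        tactile=np.zeros((16, 16), dtype=np.float32),
        joint_positions=np.zeros(6, dtype=np.float32),
        action=np.zeros(7, dtype=np.float32),
    )
    for _ in range(3)
]
obs, actions = to_training_batch(steps)
print(obs["rgb"].shape, actions.shape)  # (3, 64, 64, 3) (3, 7)
```

Grouping each modality under a shared timestep, as sketched here, is what lets a policy condition jointly on vision, touch, and proprioception rather than on any single sensor stream.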
AgiBot envisions AgiBot World as a catalyst for a transformative leap in robotics, drawing comparisons to the “ImageNet Moment” in computer vision, which marked a significant advance in artificial intelligence research. By releasing this open-source platform, the company seeks to foster collaboration between academia and industry, advancing scalable robotic systems for real-world applications.