---
pretty_name: AgiBot World
size_categories:
  - 100K<n<2000K
task_categories:
  - other
language:
  - en
tags:
  - real-world
  - dual-arm
  - Robotics manipulation
---
# AgiBot World

## Key Features πŸ”‘

- One million+ trajectories from 100 robots.
- 100+ real-world scenarios across 5 target domains.
- Tasks involving:
  - Fine-grained manipulation
  - Long-horizon planning
  - Dual-robot collaboration
- Cutting-edge hardware:
  - Visual tactile sensors
  - 6-DoF dexterous hand / gripper
  - Mobile dual-arm robots with whole-body control

## Platform Release πŸ“…

- AgiBot World Beta: ~1,000,000 trajectories of high-quality robot data, coming by the end of Q1 2025.
- AgiBot World Colosseo: Comprehensive platform launching in 2025.
- 2025 AgiBot World Challenge

## Dataset Application Form πŸ“‘

## Get started πŸ”₯

### Dataset Structure

#### data format

```
data
β”œβ”€β”€ task_info
β”‚   β”œβ”€β”€ task_327.json
β”‚   β”œβ”€β”€ task_352.json
β”‚   └── ...
β”œβ”€β”€ observations
β”‚   β”œβ”€β”€ 327 [task_id]
β”‚   β”‚   β”œβ”€β”€ 648642 [episode_id]
β”‚   β”‚   β”‚   β”œβ”€β”€ depth
β”‚   β”‚   β”‚   └── videos
β”‚   β”‚   β”œβ”€β”€ 648649
β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ 352 [task_id]
β”‚   β”‚   β”œβ”€β”€ 648544 [episode_id]
β”‚   β”‚   β”‚   β”œβ”€β”€ depth
β”‚   β”‚   β”‚   └── videos
β”‚   β”‚   β”œβ”€β”€ 648564
β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   └── ...
β”‚   └── ...
β”œβ”€β”€ parameters
β”‚   β”œβ”€β”€ 327 [task_id]
β”‚   β”‚   β”œβ”€β”€ 648642 [episode_id]
β”‚   β”‚   β”‚   └── camera
β”‚   β”‚   β”œβ”€β”€ 648649
β”‚   β”‚   β”‚   └── camera
β”‚   β”‚   └── ...
β”‚   └── 352 [task_id]
β”‚       β”œβ”€β”€ 648544 [episode_id]
β”‚       β”‚   └── camera
β”‚       β”œβ”€β”€ 648564
β”‚       β”‚   └── camera
β”‚       └── ...
└── proprio_stats
    β”œβ”€β”€ 327 [task_id]
    β”‚   β”œβ”€β”€ 648642 [episode_id]
    β”‚   β”‚   └── proprio_stats.h5
    β”‚   β”œβ”€β”€ 648649
    β”‚   β”‚   └── proprio_stats.h5
    β”‚   └── ...
    β”œβ”€β”€ 352 [task_id]
    β”‚   β”œβ”€β”€ 648544 [episode_id]
    β”‚   β”‚   └── proprio_stats.h5
    β”‚   β”œβ”€β”€ 648564
    β”‚   β”‚   └── proprio_stats.h5
    β”‚   └── ...
    └── ...
```
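
As a quick orientation, the snippet below assembles the paths that belong to a single episode from the layout above. This is a minimal sketch: the root path, `task_id`, and `episode_id` values are illustrative, and the `episode_paths` helper is not part of the dataset tooling.

```python
from pathlib import Path

def episode_paths(root: str, task_id: int, episode_id: int) -> dict:
    """Collect the files/folders belonging to one episode (hypothetical helper)."""
    base = Path(root)
    obs = base / "observations" / str(task_id) / str(episode_id)
    return {
        "task_json": base / "task_info" / f"task_{task_id}.json",
        "videos": obs / "videos",
        "depth": obs / "depth",
        "camera_params": base / "parameters" / str(task_id) / str(episode_id) / "camera",
        "proprio": base / "proprio_stats" / str(task_id) / str(episode_id) / "proprio_stats.h5",
    }

print(episode_paths("/path/to/agibotworld/alpha", 327, 648642))
```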

#### json file format

In the `task_[id].json` file, we store the basic information of every episode along with its language instructions. Here, we explain several specific keywords in more detail.

- `error_lable`: Indicates whether the episode includes error recovery. The currently available values are "None" and "Failure recovery".
- `action_config`: A list of all action slices in the episode. Each action slice includes a start and end frame, the corresponding atomic skill, and a language instruction.
- `key_frame`: Annotations for keyframes, including the start and end frames of each keyframe and a detailed description.
[ {"episode_id": 649078,
   "task_id": 327,
   "task_name": "Picking items in Supermarket",
   "init_scene_text": "The robot is in front of the fruit shelf in the supermarket.",
   "lable_info":{
    "error_lable":"Failure recovery"
    "action_config":[
       {"start_frame": 0,
        "end_frame": 435,
        "action_text": "Pick up onion from the shelf."
        "skill": "Pick"
       },
       {"start_frame": 435,
        "end_frame": 619,
        "action_text": "Place onion into the plastic bag in the shopping cart."
        "skill": "Place"
       },
       ...
    ]
    "key_frame": [
      {"start": 0,
       "end": 435,
       "comment": "Failure recovery"
      }
    ]
},
...
]
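
These annotations can be read with nothing more than the standard library. A minimal sketch, assuming the field names shown in the example above (the file path is illustrative):

```python
import json

with open("data/task_info/task_327.json") as f:
    episodes = json.load(f)

for ep in episodes:
    print(ep["episode_id"], ep["task_name"])
    # Each action slice carries a frame range, an atomic skill, and an instruction.
    for action in ep["lable_info"]["action_config"]:
        print(f'  frames {action["start_frame"]}-{action["end_frame"]}: '
              f'[{action["skill"]}] {action["action_text"]}')
```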

#### h5 file format

In the `proprio_stats.h5` file, we store all of the robot's proprioceptive data.

```
|-- timestamp
|-- state
    |-- effector
        |-- force
        |-- position
    |-- end
        |-- angular
        |-- orientation
        |-- position
        |-- velocity
        |-- wrench
    |-- head
        |-- effort
        |-- position
        |-- velocity
    |-- joint
        |-- current_value
        |-- effort
        |-- position
        |-- velocity
    |-- robot
        |-- orientation
        |-- orientation_drift
        |-- position
        |-- position_drift
    |-- waist
        |-- effort
        |-- position
        |-- velocity
|-- action
    |-- effector
        |-- force
        |-- index
        |-- position
    |-- end
        |-- orientation
        |-- position
    |-- head
        |-- effort
        |-- position
        |-- velocity
    |-- joint
        |-- effort
        |-- index
        |-- position
        |-- velocity
    |-- robot
        |-- index
        |-- orientation
        |-- position
        |-- velocity
    |-- waist
        |-- effort
        |-- position
        |-- velocity
```
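
The groups can be browsed with h5py. A minimal sketch (the episode path is illustrative; group names follow the hierarchy above):

```python
import h5py

with h5py.File("data/proprio_stats/327/648642/proprio_stats.h5", "r") as f:
    # Print every group/dataset name, with shapes for datasets.
    f.visititems(lambda name, obj: print(name, getattr(obj, "shape", "")))

    timestamps = f["timestamp"][:]                 # [N], nanoseconds
    joint_position = f["state/joint/position"][:]  # [N, 14], rad
```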

## Terms

### State and action

1. **State**: State refers to the monitoring information from the robot's sensors and actuators.
2. **Action**: Action refers to the instructions sent to the hardware abstraction layer, to which the controllers respond. There is therefore a difference between the issued instructions and the actually executed state.

### Actuators

1. **Effector**: The end effector, for example a dexterous hand or a gripper.
2. **End**: The 6-DoF end pose at the robot flange.
3. **Head**: The robot's head has two degrees of freedom, pitch and yaw.
4. **Joint**: The dual arms have 14 degrees of freedom in total, 7 DoF each.
5. **Robot**: The robot's pose in its surrounding environment. Orientation and position refer to the robot's relative pose in the odometry coordinate system, whose origin is set where the robot was powered on.
6. **Waist**: The robot's waist has two degrees of freedom, pitch and lift.

### Common Fields

1. **Position**: Spatial position, encoder position, angle, etc.
2. **Velocity**: Speed.
3. **Angular**: Angular velocity.
4. **Effort**: Torque of the motor. Not available for now.
5. **Wrench**: Six-dimensional force: force in the xyz directions plus torque. Not available for now.

### Value shapes and ranges

| Group | Shape | Meaning |
| --- | --- | --- |
| /timestamp | [$N$] | timestamp in nanoseconds |
| /state/effector/position (gripper) | [$N$, 2] | left [:, 0], right [:, 1], gripper open range in mm |
| /state/effector/position (dexhand) | [$N$, 12] | left [:, :6], right [:, 6:], joint angle in rad |
| /state/end/orientation | [$N$, 2, 4] | left [:, 0, :], right [:, 1, :], flange quaternion with xyzw |
| /state/end/position | [$N$, 2, 3] | left [:, 0, :], right [:, 1, :], flange xyz in meters |
| /state/head/position | [$N$, 2] | pitch [:, 0], yaw [:, 1], rad |
| /state/joint/current_value | [$N$, 14] | left arm [:, :7], right arm [:, 7:] |
| /state/joint/position | [$N$, 14] | left arm [:, :7], right arm [:, 7:], rad |
| /state/robot/orientation | [$N$, 4] | quaternion in xyzw, yaw only |
| /state/robot/position | [$N$, 3] | xyz position in meters, where z is always 0 |
| /state/waist/position | [$N$, 2] | pitch [:, 0] in radians, lift [:, 1] in centimeters |
| /action/*/index | [$M$] | actions are not issued at every timestep; each index marks the frame at which the corresponding control source sent a command |
| /action/effector/position (gripper) | [$N$, 2] | left [:, 0], right [:, 1], 0 for fully open and 1 for fully closed |
| /action/effector/index | [$M_1$] | index when the control source for the end effector is sending control signals |
| /action/end/orientation | [$N$, 2, 4] | same as /state/end/orientation |
| /action/end/position | [$N$, 2, 3] | same as /state/end/position |
| /action/end/index | [$M_2$] | same as other indexes |
| /action/head/position | [$N$, 2] | same as /state/head/position |
| /action/head/index | [$M_3$] | same as other indexes |
| /action/joint/position | [$N$, 14] | same as /state/joint/position |
| /action/joint/index | [$M_4$] | same as other indexes |
| /action/robot/velocity | [$N$, 2] | velocity along x axis [:, 0], -1.6 to 1.6 m/s; yaw rate [:, 1], -1 to 1 rad/s |
| /action/robot/index | [$M_5$] | same as other indexes |
| /action/waist/position | [$N$, 2] | same as /state/waist/position |
| /action/waist/index | [$M_6$] | same as other indexes |

The definitions and data ranges in this section may change with software and hardware versions. Stay tuned.
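
Putting the table to work, the sketch below shows the slicing conventions and one way to line up the sparse /action/*/index entries with the dense [$N$]-length streams. It assumes the indices are frame positions into those streams, as described above; the file path is illustrative.

```python
import h5py

with h5py.File("proprio_stats.h5", "r") as f:
    joint_state = f["state/joint/position"][:]          # [N, 14], rad
    left_arm, right_arm = joint_state[:, :7], joint_state[:, 7:]

    gripper = f["state/effector/position"][:]           # [N, 2], mm (gripper variant)
    left_open, right_open = gripper[:, 0], gripper[:, 1]

    # Joint commands are stored densely, but were only issued at the frames
    # listed in the index; pick out the rows where commands were actually sent.
    joint_cmd = f["action/joint/position"][:]           # [N, 14]
    joint_idx = f["action/joint/index"][:].astype(int)  # [M4]
    issued_cmds = joint_cmd[joint_idx]                  # [M4, 14]
```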

## Dataset Preprocess

Our project relies solely on the lerobot library (dataset v2.0); please follow their installation instructions. Here, we provide a script for converting the data to the lerobot format:

```bash
python scripts/convert_to_lerobot.py --src_path /path/to/agibotworld/alpha --task_id 352 --tar_path /path/to/save/lerobot
```
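
After conversion, the result should load like any other lerobot dataset. A hedged sketch, assuming the dataset v2.0 API (import paths and constructor arguments vary across lerobot versions, and the repo_id below is illustrative):

```python
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

dataset = LeRobotDataset("agibotworld/task_352", root="/path/to/save/lerobot")
print(dataset.num_episodes, len(dataset))
frame = dataset[0]  # dict of tensors: camera frames, state, action, ...
```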

We would like to express our gratitude to the developers of lerobot for their outstanding contributions to the open-source community.

## License and Citation

All assets and code in this repository are under the MIT License unless specified otherwise. The data is under CC BY-NC-SA 4.0.