LeRobot.js
I always thought that robots were super‑expensive and extremely difficult to use. That perception flipped the moment I unboxed my first robot, the SO-100. Combined with Hugging Face’s LeRobot Python library, I was able to get my robot moving in minutes, because the onboarding was as simple as:
- Find the USB port the robot is connected to on your computer
- Calibrate every joint so the arm learns its full range of motion
- Teleoperate the arm with a keyboard or a leader arm
Everything worked, yet one thing kept nagging me: I live in JavaScript land. Why should web devs miss out on this fun just because the original tooling is in Python? So I asked myself: What’s stopping me from building LeRobot.js so anyone can interact with robots in JS? And the best way to learn something (at least for me) is to build it from the ground up.
That’s how LeRobot.js was born.

Why Bring LeRobot to JS?
From my point of view, robotics bridges the gap between AI and the real world, making it possible for AI to go beyond the screen. To make sure that as many people as possible can actually voice their opinion on how this should look, we need to break out of the Python ecosystem and provide tooling that feels native to the huge JS community.
The mission for LeRobot.js is therefore straightforward: same simplicity, same mental model, just JavaScript.
If you already know the Python API, LeRobot.js should feel instantly familiar. If you start in JS first, you’ll be able to jump to Python without re‑learning everything.
From Python to JavaScript
I began by re‑implementing the must‑have functions everyone touches on day one:
| Action | Python | JavaScript |
|---|---|---|
| Connect | `find_port()` | `findPort()` |
| Release motors (disable torque) | (automatic in calibration) | `releaseMotors()` |
| Calibrate | `calibrate(robot)` | `calibrate({ robot })` |
| Teleoperate | `teleoperate(robot, calib)` | `teleoperate({ robot, calibrationData })` |
Why `releaseMotors()`?
In Python LeRobot, the motors’ torque is switched off automatically when you enter calibration. In the browser we expose that step as an explicit helper so you can free‑move the arm whenever you like.
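Here is what that looks like in isolation — a minimal sketch, reusing the same `findPort()` connection flow you'll see in the Quick Start below:

```js
import { findPort, releaseMotors } from "@lerobot/web";

// Connect to the arm first
const findProcess = await findPort();
const [robot] = await findProcess.result;

// Switch torque off — the arm can now be moved freely by hand
await releaseMotors(robot);
```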
Quick Start
LeRobot.js currently ships as `@lerobot/web` (Node/CLI support will be released very soon) and supports the SO-100 in Chromium‑based browsers, because we need Web Serial to interact with the motors and WebUSB to read the serial number of each connected device.
But let's take a look at how this works:
import { findPort, releaseMotors, calibrate, teleoperate } from "@lerobot/web";
// 1. Find and connect to the hardware
const findProcess = await findPort();
const [robot] = await findProcess.result;
// 2. Disable torque so you can move joints by hand
await releaseMotors(robot);
// 3. Guide the arm through its full range to record min/max positions
const calibrationProcess = await calibrate({
robot,
onProgress: (msg) => console.log(msg),
onLiveUpdate: (data) => console.log("Live positions:", data),
});
calibrationProcess.stop();
const calibrationData = await calibrationProcess.result;
// 4. Control the arm with your keyboard
const teleop = await teleoperate({
robot,
calibrationData,
teleop: { type: "keyboard" },
});
teleop.start();
// 5. later → teleop.stop();
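One practical note: Chromium only shows the serial-port picker in response to a user gesture, so if `findPort()` prompts for a device, you'll want to start the flow from something like a click handler. A minimal sketch, assuming a hypothetical `#connect` button in your markup:

```js
import { findPort } from "@lerobot/web";

// Hypothetical button: <button id="connect">Connect robot</button>
// The browser's port picker can only be opened from a user gesture.
document.getElementById("connect")?.addEventListener("click", async () => {
  const findProcess = await findPort();
  const [robot] = await findProcess.result;
  // ...continue with releaseMotors / calibrate / teleoperate as shown above
});
```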
Browser Support
- Chromium-based browsers (like Chrome or Edge) version 89 or later (because of Web Serial)
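If you want to fail gracefully on other browsers, you can feature-detect the two APIs before calling into the library. A minimal sketch using the standard `navigator` checks (these are Web platform properties, not part of `@lerobot/web`):

```js
// Web Serial drives the motors, WebUSB reads each device's serial number
const canTalkToRobot = "serial" in navigator && "usb" in navigator;

if (!canTalkToRobot) {
  console.warn("Unsupported browser — use a Chromium-based browser (Chrome/Edge 89+).");
}
```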
Live Demo
Try the full flow right now in your browser: https://huggingface.co/spaces/NERDDISO/LeRobot.js
It lets you add your SO-100, calibrate it, and then control it with your keyboard or the UI (support for more hardware and more ways to teleoperate are coming soon). The space also contains the full docs and the roadmap of what is coming up next.

Under the Hood
LeRobot.js is built with TypeScript, bundled with Vite and published under Apache 2.0 on GitHub and npm.
The demo UI was prototyped in v0.dev using React, shadcn/ui and Tailwind.
Roadmap
My goal is to have the full LeRobot feature set in JS, but for now I'm focusing on these key milestones:
| Feature | Description | Status |
|---|---|---|
| Node/CLI | Release `@lerobot/node` | in progress |
| SO‑100 leader arm | Add the leader arm as a teleoperator for the follower arm | in progress |
| `record()` | Capture trajectories & sensor data | planned |
| `replay()` | Play back recorded episodes | planned |
| `train()` | Train a policy on recorded data | planned |
| `eval()` | Run inference for autonomy | planned |
How You Can Help
Right now LeRobot.js officially supports the SO-100. To broaden hardware coverage I need:
- Pull requests adding new devices
- Access to additional robots, so I can test and add support for new devices
If you have hardware you’d love to see here, open an issue or ping me on the LeRobot Discord (I’m NERDDISCO).
Any help getting training/inference working with ONNX would be very much appreciated as well.
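To make that ask a bit more concrete: in the browser, inference would likely mean loading an exported policy with something like onnxruntime-web and feeding it observations every control step. A rough sketch of that direction — the model path, input/output names, and shapes below are made-up placeholders, not an existing LeRobot.js API:

```js
import * as ort from "onnxruntime-web";

// Hypothetical: a policy exported to ONNX with one "observation" input and one "action" output
const session = await ort.InferenceSession.create("policy.onnx");

// e.g. six joint positions as a single batch
const observation = new ort.Tensor("float32", new Float32Array(6), [1, 6]);
const outputs = await session.run({ observation });

console.log("predicted action:", outputs.action?.data);
```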
Credits
None of this would be possible without the support and inspiration of these individuals:
- Partabot - thanks for providing me with the SO‑100 leader & follower arms
- Xenova - thanks for creating Transformers.js and showing the world that ML can live happily in JS
- BamBot - thanks for proving that browser‑first robot control is real