For Developers

Build physical AI like you build web apps

Clean APIs and SDKs. Simulation-first development. Deployment pipelines from sim to real. The catalog is your contract — pick any digital twin, write your application against it, swap hardware without changing code.

1

Pick from the Catalog

The catalog is your contract. Choose any digital twin — arms, AMRs, drones, sensors — and build against a consistent API.

2

Test in Simulation

Same code runs in simulation and production. Train AI models, validate behaviors, iterate safely before touching real hardware.

3

Deploy to Any Hardware

Push to edge with confidence. OTA updates, rollback, and fleet telemetry built in. Swap hardware later without changing your application.

quickstart.py
# The catalog is your contract
import cyberwave as cw
# Pick any device from the catalog
robot = cw.twin("universal_robots/UR7e")
# Same code: simulation or production
robot.move_to(x=0.3, y=0.2, z=0.1)
# Deploy to any site
cw.deploy(robot, site="warehouse-east")

Same SDK. Any robot, camera, or sensor.

As easy as importing a library in Python.

cyberwave-terminal
import cyberwave as cw

# Patrol a warehouse with the Go2
go2 = cw.twin("unitree/go2")

go2.patrol(
    waypoints=["entrance", "aisle_3", "loading_dock"],
    speed=0.8,
    loop=True,
)

# React to anomalies in real time
go2.on("anomaly_detected", lambda e:
    go2.stop().inspect(e.location)
)

The Developer Lifecycle: Evolved

Stop building around hardware constraints. Start building software-defined robotics.

Phase

Traditional Automation (Hardware-Led)

With Cyberwave (Software-Led)

01.

Hardware Selection

Vendor Lock-in

You choose a vendor, and you're stuck. Switching hardware means rewriting proprietary driver code from scratch.

Hardware Agnostic

Hardware is just an abstraction. Switch from a generic arm to an industrial cobot by changing one line of code.
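The "change one line" claim rests on a standard hardware-abstraction pattern: application code targets a common contract, and a catalog maps twin identifiers to interchangeable drivers. Here is a minimal, self-contained sketch of that pattern in plain Python — the `Arm` protocol, driver classes, and `twin()` helper are illustrative, not Cyberwave's actual API:

```python
from typing import Protocol


class Arm(Protocol):
    """Common contract every arm driver must satisfy."""
    def move_to(self, x: float, y: float, z: float) -> str: ...


class GenericArm:
    def move_to(self, x: float, y: float, z: float) -> str:
        return f"generic arm -> ({x}, {y}, {z})"


class IndustrialCobot:
    def move_to(self, x: float, y: float, z: float) -> str:
        return f"cobot -> ({x}, {y}, {z})"


# The catalog maps twin identifiers to drivers behind one contract.
CATALOG = {"generic/arm": GenericArm, "vendor/cobot": IndustrialCobot}


def twin(name: str) -> Arm:
    return CATALOG[name]()


# Swapping hardware is then a one-line change in application code:
robot = twin("generic/arm")  # or twin("vendor/cobot")
print(robot.move_to(0.3, 0.2, 0.1))
```

Because both drivers satisfy the same `Arm` contract, everything downstream of `twin()` is untouched by the swap.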

02.

Integration

"Driver Hell"

Weeks spent writing custom adapters just to make robots talk to sensors (Southbound integration).

Universal Abstraction

Instant connectivity via our open-source Edge Runtime. Sensors and actuators auto-connect to your standardized API.
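Auto-connection of this kind is typically built on a driver registry: each driver declares which devices it handles, and the runtime matches discovered hardware to a driver behind one standardized interface. A minimal sketch of that idea, with illustrative names rather than the Edge Runtime's real API:

```python
# Illustrative driver-registry sketch; not the real Edge Runtime API.
DRIVERS = {}


def driver(prefix: str):
    """Register a driver class for every device ID under a given prefix."""
    def register(cls):
        DRIVERS[prefix] = cls
        return cls
    return register


@driver("lidar")
class LidarDriver:
    def __init__(self, device_id):
        self.device_id = device_id

    def read(self):
        return {"device": self.device_id, "kind": "pointcloud"}


@driver("camera")
class CameraDriver:
    def __init__(self, device_id):
        self.device_id = device_id

    def read(self):
        return {"device": self.device_id, "kind": "frame"}


def connect(device_id: str):
    """Match a discovered device to its driver by ID prefix."""
    prefix = device_id.split("/", 1)[0]
    return DRIVERS[prefix](device_id)


sensor = connect("lidar/front")
print(sensor.read())  # same read() call regardless of vendor
```

The application only ever sees the standardized `read()` interface; adding a new sensor type means registering one more driver, not rewriting adapters.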

03.

Simulation

Static Visualization

Pretty 3D models that don't obey physics. Code written here rarely works on the real machine without heavy modification.

Physics-Based Digital Twins

High-fidelity, MuJoCo-powered physics. The code you write for the Twin is the exact same code that runs on the real robot.

04.

Intelligence

Rigid Logic Loops

Hard-coded "If-This-Then-That" scripts. If the environment changes unexpectedly, the system fails.

AI-Native Behaviors

Inject VLM and RL policies natively. Your robots perceive, reason, and adapt to dynamic environments in real-time.
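The `go2.on(...)` snippet earlier on this page hints at the underlying pattern: an event emitter where perception events trigger injected behaviors, rather than a fixed script. A tiny, self-contained sketch of that pattern (the `Robot` class and event names here are illustrative):

```python
class Robot:
    """Minimal event-driven robot: behaviors are injected as callbacks."""

    def __init__(self):
        self.handlers = {}
        self.log = []

    def on(self, event: str, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, payload):
        for handler in self.handlers.get(event, []):
            handler(payload)

    def stop(self):
        self.log.append("stop")
        return self  # chainable, like stop().inspect(...)

    def inspect(self, location):
        self.log.append(f"inspect:{location}")
        return self


go2 = Robot()
# Inject a policy: on anomaly, stop and inspect the reported location.
go2.on("anomaly_detected", lambda e: go2.stop().inspect(e["location"]))
go2.emit("anomaly_detected", {"location": "aisle_3"})
print(go2.log)
```

Swapping the lambda for a VLM- or RL-backed policy changes the behavior without touching the control loop; the emitter contract stays fixed.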

05.

Testing

High-Risk Physical Testing

Validating logic on expensive hardware. A single bug can cause costly damage to assets.

Sim-to-Real Validation

Run thousands of test cycles in the cloud. Validate logic safely in the digital twin before touching the physical edge.
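Batch validation in simulation is, at its core, a parameter sweep over a deterministic sim step. A minimal sketch, assuming a toy reach check stands in for a real physics rollout:

```python
import itertools


def simulate_reach(x: float, y: float, z: float, max_reach: float = 0.85) -> bool:
    """Toy stand-in for a physics rollout: is the target inside the workspace?"""
    return (x * x + y * y + z * z) ** 0.5 <= max_reach


# Sweep a grid of candidate targets -- a thousand cycles in milliseconds.
grid = [round(v * 0.1, 1) for v in range(0, 10)]
results = [
    ((x, y, z), simulate_reach(x, y, z))
    for x, y, z in itertools.product(grid, repeat=3)
]
passed = sum(1 for _, ok in results if ok)
print(f"{passed}/{len(results)} targets validated in simulation")
```

Every target that fails here fails in software, not on a robot arm; only the validated set ever reaches physical hardware.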

06.

Telemetry

Trapped Data

Logs are stuck on the device. Debugging requires manual retrieval or proprietary vendor tools.

Real-Time Observability

Bi-directional streaming via MQTT & WebRTC. Watch video feeds and sensor data from anywhere in the world.

07.

Deployment

Manual SSH Flashing

Updating a fleet requires logging into devices one by one. Version control is a nightmare.

OTA Fleet Orchestration

Push containerized updates to one robot or one thousand instantly. Manage your fleet like a modern SaaS stack.
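A staged OTA rollout with rollback reduces to version bookkeeping per device: push a new version, health-check, and revert only the devices that fail. An illustrative sketch of that bookkeeping (the `Fleet` class is not Cyberwave's real API):

```python
class Fleet:
    """Toy OTA manager: tracks (current, previous) version per device."""

    def __init__(self, devices):
        self.versions = {d: ("v1.0", None) for d in devices}

    def push(self, device: str, version: str):
        current, _ = self.versions[device]
        self.versions[device] = (version, current)

    def rollback(self, device: str):
        current, previous = self.versions[device]
        if previous is not None:
            self.versions[device] = (previous, current)

    def current(self, device: str) -> str:
        return self.versions[device][0]


fleet = Fleet(["go2-001", "go2-002"])
for device in ["go2-001", "go2-002"]:
    fleet.push(device, "v1.1")

# Health check fails on one device: revert just that one.
fleet.rollback("go2-002")
print(fleet.current("go2-001"), fleet.current("go2-002"))
```

Keeping the previous version alongside the current one is what makes rollback a constant-time operation instead of a manual re-flash.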

Ready to start building?

Get started in under 10 minutes. One SDK, any hardware, zero vendor lock-in.