Cyberwave Capability

Hardware Abstraction

One API surface for any robot, drone, sensor, or arm. Build against Cyberwave, run on any supported hardware.


One API for every robot

Quadrupeds walk, drones fly, arms manipulate, cameras perceive — but your application code shouldn't care. Cyberwave normalizes every device behind a capability-driven twin API. Write logic once, run it on any supported hardware.

app.py
import cyberwave as cw

client = cw.Cyberwave()

# Same API — different robots
dog   = client.twin("unitree/go2")           # → LocomoteCameraTwin
drone = client.twin("dji/mavic-3")           # → FlyingCameraTwin
arm   = client.twin("universal_robots/ur7")  # → GripperDepthCameraTwin

# Capability-specific methods appear automatically
dog.move_forward(2.0)          # Locomotion
drone.takeoff(altitude=5.0)    # Flight
arm.joints.shoulder_pan = 90   # Manipulation
arm.grip(force=0.8)            # Gripper

# Every twin shares: joints, navigation, camera, events
for twin in [dog, drone, arm]:
    twin.on("event", handle_event)
    print(twin.joints.list())
Automatic Class Selection

The SDK reads capability flags from the catalog and returns the right twin subclass. A quadruped gets move_forward(); a drone gets takeoff(); an arm gets grip(). Multi-capability robots compose automatically — a drone with a gripper becomes a FlyingGripperCameraTwin.
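One way to picture this selection mechanism is dynamic mixin composition. The sketch below is illustrative, not the SDK's actual implementation: the class and capability names mirror this page, but the flag keys, method bodies, and `twin_class_for` helper are all assumptions.

```python
# Illustrative sketch of capability-driven class composition.
# Names mirror the docs; the real SDK internals may differ.

class BaseTwin:
    def __init__(self, asset_id):
        self.asset_id = asset_id

class LocomoteMixin:
    def move_forward(self, meters):
        return f"move_forward {meters}m"

class FlyingMixin:
    def takeoff(self, altitude):
        return f"takeoff to {altitude}m"

class GripperMixin:
    def grip(self, force):
        return f"grip at {force}"

class CameraMixin:
    def stream(self):
        return "video stream"

# Hypothetical mapping from catalog capability flags to mixins
MIXINS = {
    "locomotion": LocomoteMixin,
    "flight": FlyingMixin,
    "gripper": GripperMixin,
    "camera": CameraMixin,
}

def twin_class_for(capabilities):
    """Compose a twin subclass from a list of capability flags."""
    bases = tuple(MIXINS[c] for c in capabilities) + (BaseTwin,)
    name = "".join(b.__name__.replace("Mixin", "") for b in bases[:-1]) + "Twin"
    return type(name, bases, {})

# A drone with a gripper and camera composes automatically
cls = twin_class_for(["flight", "gripper", "camera"])
drone = cls("dji/mavic-3")
print(cls.__name__)  # FlyingGripperCameraTwin
```

Because composition is driven by data rather than a fixed class hierarchy, any new capability combination in the catalog yields a working twin class without SDK changes.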

Locomotion
Wheeled, legged, aerial, tracked, or hybrid — one primitive.
LocomoteTwin
move_forward(), turn_left(), move_backward()
Manipulation
Grippers, end-effectors, and multi-DOF arms — exposed as joints.
GripperTwin
Perception
RGB, depth, LiDAR, IMU, thermal — every sensor streamable.
CameraTwin
Flight
Takeoff, land, hover — the flight primitive across any UAV.
FlyingTwin
Combination classes — composed automatically
FlyingCameraTwin · GripperCameraTwin · LocomoteCameraTwin · LocomoteGripperTwin · LocomoteGripperCameraTwin · FlyingGripperCameraTwin · FlyingGripperDepthCameraTwin
Unified Message Layer

Every twin gets its own topic namespace. Telemetry flows up, commands flow down, events propagate to workflows — same structure regardless of what hardware is running underneath.
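As a sketch, a consumer might bucket incoming topics by their channel suffix. The suffixes match the lists that follow; the parsing, the default bucket, and the prefix handling are assumptions.

```python
# Hypothetical topic router. Channel suffixes come from the docs; the broker
# prefix is elided there ("…"), so we split on the stable "/twin/" segment.

def classify(topic):
    """Bucket a per-twin topic into telemetry / commands / events / streams."""
    channel = topic.split("/twin/", 1)[1].split("/", 1)[1]
    if channel.endswith("command"):
        return "commands"
    if channel in {"event", "navigate/status"}:
        return "events"
    if channel in {"video", "depth", "map_update"}:
        return "streams"
    # position, joint/update, telemetry, edge_health (and, in this sketch,
    # anything unrecognized) land in telemetry
    return "telemetry"

print(classify("prefix/twin/abc-123/position"))          # telemetry
print(classify("prefix/twin/abc-123/navigate/command"))  # commands
```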

Telemetry
…/twin/{uuid}/position
…/twin/{uuid}/joint/update
…/twin/{uuid}/telemetry
…/twin/{uuid}/edge_health
Commands
…/twin/{uuid}/command
…/twin/{uuid}/navigate/command
…/twin/{uuid}/motion/command
…/twin/{uuid}/mission/command
Events
…/twin/{uuid}/event
…/twin/{uuid}/navigate/status
Streams
…/twin/{uuid}/video (WebRTC)
…/twin/{uuid}/depth
…/twin/{uuid}/map_update
Source provenance — Every message carries a source_type (edge, tele, sim, edit) so downstream consumers can filter. Teleop commands are gated separately from autonomous outputs; simulation data never reaches physical actuators.
MQTT: Cloud, UI, commands, events
Zenoh: Local ML, video frames, zero-copy
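The provenance gate described above can be sketched as a consumer-side filter. The `source_type` values (edge, tele, sim, edit) come from this page; the gate policy and message shape here are illustrative assumptions.

```python
# Sketch of provenance gating before physical actuation. Real gating also
# treats teleop separately, per the docs; this only shows the sim/edit cutoff.

ACTUATOR_SAFE = {"edge", "tele"}  # sim/edit data must never drive hardware

def may_actuate(message):
    """Return True only for messages whose provenance can reach actuators."""
    return message.get("source_type") in ACTUATOR_SAFE

commands = [
    {"source_type": "sim", "cmd": "navigate"},   # simulation output: dropped
    {"source_type": "edge", "cmd": "navigate"},  # on-robot autonomy: passed
]
safe = [m for m in commands if may_actuate(m)]
print(len(safe))  # 1
```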
Containerized Edge Drivers

Each robot connects through a driver — a Docker container that speaks the device's native protocol on one side and Cyberwave on the other. Edge Core picks the right driver from asset metadata and handles the rest.

Camera Driver
UVC, RealSense, IP cameras — WebRTC video + depth
ROS 2 Bridge
Go2, UR7, UGV Beast — YAML-configured topic mapping
SO-101 Arm
Feetech servo bus — joint telemetry + calibration
BaseEdgeNode SDK
Python ABC for custom drivers — lifecycle + MQTT built in

Adding a new ROS 2 robot means writing a YAML mapping — not a new driver. For non-ROS hardware, extend BaseEdgeNode and implement four hooks: _setup, _subscribe_to_commands, _build_health_status, and _cleanup.
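A custom driver along those lines might look like the following. Since the real `BaseEdgeNode` (with lifecycle and MQTT wiring built in) lives in the SDK and is not reproduced on this page, a minimal stand-in ABC is used here, and the servo driver itself is hypothetical; only the four hook names come from the docs.

```python
# Stand-in for the SDK's BaseEdgeNode, showing the four required hooks.
from abc import ABC, abstractmethod

class BaseEdgeNode(ABC):
    """Minimal stand-in: the real class adds lifecycle + MQTT plumbing."""

    def run(self):
        self._setup()
        self._subscribe_to_commands()
        status = self._build_health_status()
        self._cleanup()
        return status

    @abstractmethod
    def _setup(self): ...
    @abstractmethod
    def _subscribe_to_commands(self): ...
    @abstractmethod
    def _build_health_status(self): ...
    @abstractmethod
    def _cleanup(self): ...

class SerialServoDriver(BaseEdgeNode):
    """Hypothetical driver for a non-ROS servo bus."""

    def _setup(self):
        self.connected = True   # open the serial port here

    def _subscribe_to_commands(self):
        self.commands = []      # register command handlers here

    def _build_health_status(self):
        return {"ok": self.connected}

    def _cleanup(self):
        self.connected = False  # close the serial port here

driver = SerialServoDriver()
print(driver.run())  # {'ok': True}
```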

End-to-End Flow

From one SDK call to physical actuation — the same path for any robot.

1. SDK: twin.navigation.goto(x=5, y=3)
2. Platform: Validates, resolves controller policy, publishes MQTT command
3. Edge Driver: Translates to device-native protocol (ROS /cmd_vel, servo bus, etc.)
4. Telemetry: Position + joint states flow back; twin state updated in real time
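The driver-side translation step can be sketched as a small mapping function. The command payload shape, the `/cmd_vel` topic, and the placeholder Twist values are assumptions for illustration; a real driver would run a local planner rather than emit a fixed velocity.

```python
# Illustrative translation of a platform navigate command into a
# ROS-style Twist message. Payload shapes here are assumed, not the
# actual Cyberwave wire format.

def translate_navigate(command):
    """Map a navigate command onto a device-native /cmd_vel payload."""
    x, y = command["target"]["x"], command["target"]["y"]
    return {
        "topic": "/cmd_vel",
        # Placeholder velocity; a real driver plans toward (x, y)
        "msg": {"linear": {"x": 0.5}, "angular": {"z": 0.0}},
        "goal": (x, y),
    }

native = translate_navigate({"type": "navigate", "target": {"x": 5, "y": 3}})
print(native["topic"])  # /cmd_vel
```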

Build against capabilities, not brands

Start with any robot from the catalog. The SDK gives you the right interface automatically. Swap hardware later without changing a line of application logic.

Ready to get started?

See how Hardware Abstraction can transform your operations.