
Introducing Substrate Workbench: A Native Workbench for Robotics and Drones

Substrate Workbench is a native, GPU-accelerated workbench for robotics and drone engineers. It unifies code editing, simulation, telemetry, MAVLink inspection, and 3D visualization into a single tightly integrated workspace—without Electron or browser overhead.

[Screenshot: Substrate Workbench showing the code editor, 3D robot visualization, and telemetry dashboards in a single window]

We've been building Substrate Workbench because our own SITL workflow was annoying us. Running a robot in simulation—drone, rover, arm, it doesn't matter—means juggling multiple terminals: one for the simulator, one for the vehicle software, one for monitoring, one for whatever node you're actually developing. None of them talk to each other in any useful way. When something went wrong, we were correlating timestamps across four independent log streams. It got old fast.

What wasn't working

The existing tools cover individual parts of the problem well. Foxglove is good at inspecting ROS bags and plotting live telemetry. RViz works for 3D visualization inside a ROS stack. But both stop at visualization—they don't help you act on what you're seeing because neither has a code editor.

Foxglove is also expensive to run. During live plotting sessions we measured it consuming between 2.5 and 6.5 CPU cores depending on panel count and message rate. It's an Electron app, so it routes through a browser renderer to reach the GPU, which makes 3D work slow.

RViz is ROS-only. It doesn't handle MAVLink, doesn't play back bag files outside a ROS session, and on a remote machine over SSH it's painful—you're either setting up X forwarding or running VNC.

The bigger issue is that none of these tools run alongside your code editor in any integrated way. The feedback loop ends up being: edit code, restart SITL, wait, watch logs in three different places, switch back to editor. We wanted to close that loop.

What we built

Substrate Workbench puts a code editor next to a 3D viewport, telemetry plots, and a MAVLink inspector in a single window. Everything runs in one process. There's no Electron, no browser renderer, no WebSocket bridge between a UI and a backend.

The 3D view and all UI rendering go through wgpu directly—same GPU submission queue, no cross-process overhead. We built the UI on Floem, a Rust-native UI framework with fine-grained reactivity. Point cloud rendering uses instanced draws so lidar-heavy workloads don't tank frame rates.
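The data side of an instanced point-cloud draw can be sketched like this: pack each lidar return into a flat per-instance buffer, then issue a single draw whose instance count is the point count. The struct layout and names below are illustrative, not Substrate's actual types.

```rust
/// One lidar return as the GPU sees it. `#[repr(C)]` keeps the field
/// layout stable so the buffer can be uploaded byte-for-byte.
#[repr(C)]
#[derive(Clone, Copy)]
struct PointInstance {
    position: [f32; 3],
    intensity: f32,
}

/// Pack raw (position, intensity) pairs into a contiguous instance
/// buffer. One instanced draw over this buffer replaces a draw call
/// per point.
fn pack_instances(points: &[([f32; 3], f32)]) -> Vec<PointInstance> {
    points
        .iter()
        .map(|&(position, intensity)| PointInstance { position, intensity })
        .collect()
}

fn main() {
    let instances = pack_instances(&[([0.0, 1.0, 2.0], 0.5), ([3.0, 4.0, 5.0], 0.9)]);
    // In a real renderer this buffer would be uploaded once and drawn
    // with instance_count = instances.len().
    println!("{}", instances.len()); // prints 2
}
```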

The StreamSource trait

One design decision we're happy with: all data sources implement the same StreamSource trait. A MavlinkSource reads from a live vehicle over UDP, TCP, or serial. A GazeboSource subscribes to Gazebo topics via gz-transport. A McapFileSource replays recorded .mcap files. The UI doesn't care which one is connected—the same plot panels, 3D viewport, and MAVLink inspector work identically against any source.

In practice this means switching from a live flight session to reviewing a bag file is a source swap. You're not reconfiguring anything.
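A minimal sketch of what such a trait could look like. The method signature, message type, and the `ReplaySource` stand-in below are assumptions for illustration, not Substrate's real API.

```rust
// Hypothetical sketch of the StreamSource idea: every source exposes
// the same pull-based interface, so the UI never knows which one is
// connected.

/// A normalized message delivered to the UI, regardless of origin.
#[derive(Debug, Clone, PartialEq)]
enum StreamMessage {
    Telemetry { field: String, value: f64 },
    Eof,
}

/// Implemented by live MAVLink, Gazebo, and .mcap replay sources alike.
trait StreamSource {
    fn next_message(&mut self) -> Option<StreamMessage>;
}

/// A replay source backed by an in-memory buffer, standing in for an
/// .mcap file reader.
struct ReplaySource {
    buffered: std::vec::IntoIter<StreamMessage>,
}

impl StreamSource for ReplaySource {
    fn next_message(&mut self) -> Option<StreamMessage> {
        self.buffered.next()
    }
}

/// UI-side consumers only see the trait, so swapping a live source for
/// a replay source is a change at the call site, not in the panels.
fn drain(source: &mut dyn StreamSource) -> Vec<StreamMessage> {
    std::iter::from_fn(|| source.next_message()).collect()
}

fn main() {
    let mut replay = ReplaySource {
        buffered: vec![
            StreamMessage::Telemetry { field: "alt".into(), value: 12.5 },
            StreamMessage::Eof,
        ]
        .into_iter(),
    };
    println!("{}", drain(&mut replay).len()); // prints 2
}
```

The payoff of this shape is that plot panels and the 3D viewport take a `&mut dyn StreamSource` and nothing else, which is what makes the live-to-replay swap a one-line change.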

Integrated SITL launcher

Select your simulator and vehicle configuration, click launch. Substrate starts the simulation, connects the right data source, and opens the 3D view. No extra terminals. When the simulation exits, its logs are captured and attached to the session.
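The launch-and-capture flow can be sketched with `std::process` from the Rust standard library. The binary name and arguments below are placeholders, not Substrate's actual launcher.

```rust
use std::process::{Command, Stdio};

/// Spawn a simulator process with stdout and stderr piped, wait for it
/// to exit, and return its output so it can be attached to the session.
fn launch_sitl(binary: &str, args: &[&str]) -> std::io::Result<String> {
    let output = Command::new(binary)
        .args(args)
        .stdout(Stdio::piped())
        .stderr(Stdio::piped())
        .output()?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}

fn main() {
    // "echo" stands in for a simulator binary in this sketch.
    let log = launch_sitl("echo", &["sim started"]).unwrap();
    print!("{log}"); // echo's output, "sim started\n"
}
```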

Protocol translation

A local proxy server normalizes MAVLink streams, Gazebo topics, and Foxglove WebSocket traffic into Substrate's internal message bus before any of it reaches the UI. This keeps the frontend protocol-agnostic. The proxy binary can run on a remote machine—the onboard computer, for example—while the Substrate UI connects to it over SSH.
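The normalization step can be sketched as a single function that collapses the three wire formats into one internal shape. The frame and bus types here are assumptions for illustration; Substrate's real message bus will differ.

```rust
/// Raw frames as they arrive from each transport.
enum RawFrame {
    Mavlink { msg_id: u32, payload: Vec<u8> },
    GazeboTopic { topic: String, payload: Vec<u8> },
    FoxgloveWs { channel: u16, payload: Vec<u8> },
}

/// The single normalized form the frontend consumes.
#[derive(Debug, PartialEq)]
struct BusMessage {
    source: &'static str,
    key: String,
    payload: Vec<u8>,
}

/// The proxy's job in one function: three wire formats in, one
/// internal shape out, before anything reaches the UI.
fn normalize(frame: RawFrame) -> BusMessage {
    match frame {
        RawFrame::Mavlink { msg_id, payload } => BusMessage {
            source: "mavlink",
            key: format!("msg/{msg_id}"),
            payload,
        },
        RawFrame::GazeboTopic { topic, payload } => BusMessage {
            source: "gazebo",
            key: topic,
            payload,
        },
        RawFrame::FoxgloveWs { channel, payload } => BusMessage {
            source: "foxglove",
            key: format!("channel/{channel}"),
            payload,
        },
    }
}

fn main() {
    let m = normalize(RawFrame::Mavlink { msg_id: 30, payload: vec![1, 2] });
    println!("{} {}", m.source, m.key); // prints "mavlink msg/30"
}
```

Because the frontend only ever sees `BusMessage`, running this proxy on a remote onboard computer changes nothing for the UI beyond where the bus connection points.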

Early access

Substrate Workbench is in early access now. If you're working on autonomous systems and want to try it, reach out at hello@bedrockdynamics.com or request access through the portal.


Written by Bedrock Dynamics Team