# URDF Robot (integration)

Drive a URDF robot's joints from an m3u8 joint-angle stream, using the standalone `urdf-loader` package plus vuer-m3u's `useJointAnglesForUrdf` hook.

Compatible dtypes: `joint_angles`
## Why a composition, not a view
URDF rendering pulls in a 3D stack (`three`, `@react-three/fiber`, `urdf-loader`) that doesn't belong in vuer-m3u's core bundle. So this isn't a `*View` component — it's a pattern: vuer-m3u gives you the data bridge, and you pick a 3D renderer.
The glue is one hook plus ~30 lines of R3F:
```tsx
const jointValues = useJointAnglesForUrdf(engine, [
  'shoulder_pan', 'shoulder_lift', 'elbow',
  'wrist_1', 'wrist_2', 'wrist_3', 'gripper',
])
// → { shoulder_pan: 0.12, shoulder_lift: -1.45, ... }
```

Pass the record to `robot.setJointValues(jointValues)` whenever it changes.
## Dependencies
This demo uses five npm packages on top of React — the minimum to render a URDF in a browser:

```sh
pnpm add @react-three/fiber @react-three/drei three urdf-loader
```

`urdf-loader` is gkjohnson's standalone loader — no AppContext, no WebSocket, no dial controls. If you want the full `@vuer-ai/vuer` `<Urdf>` component with drag-to-manipulate joints and debug panels, install that package instead; but for read-only playback from an m3u8 stream, this is all you need.
## Data Schema
Same 7-DoF joint stream as `JointAngleView`. The i-th entry of `jointNames` is bound to `sample[i]`.

```jsonl
{"ts": 0.000, "data": [ 0.00, -0.70, 0.20, 1.60, 0.00, 0.70, 0.00]}
{"ts": 0.010, "data": [ 0.00, -0.69, 0.21, 1.59, 0.00, 0.69, 0.00]}
```

## Usage
```tsx
import { useEffect, useState } from 'react'
import { Canvas, useThree } from '@react-three/fiber'
import { OrbitControls } from '@react-three/drei'
import URDFLoader from 'urdf-loader'
import { Box3 } from 'three'
import {
  useTimeline,
  ClockProvider,
  TimelineController,
  usePlaylist,
  useJointAnglesForUrdf,
} from '@vuer-ai/vuer-m3u'

const URDF_URL =
  'https://raw.githubusercontent.com/unitreerobotics/unitree_rl_gym/main/resources/robots/g1_description/g1_23dof.urdf'
const JOINTS_URL = '/vuer-m3u-demo/humanoid/playlist.m3u8'

const DRIVEN_JOINTS = [
  'waist_yaw_joint',
  'left_shoulder_pitch_joint', 'left_shoulder_roll_joint',
  'left_shoulder_yaw_joint', 'left_elbow_joint', 'left_wrist_roll_joint',
  'right_shoulder_pitch_joint', 'right_shoulder_roll_joint',
  'right_shoulder_yaw_joint', 'right_elbow_joint', 'right_wrist_roll_joint',
  'left_hip_pitch_joint', 'left_hip_roll_joint', 'left_hip_yaw_joint',
  'left_knee_joint', 'left_ankle_pitch_joint', 'left_ankle_roll_joint',
  'right_hip_pitch_joint', 'right_hip_roll_joint', 'right_hip_yaw_joint',
  'right_knee_joint', 'right_ankle_pitch_joint', 'right_ankle_roll_joint',
]

// ~30 lines of our own, replaces @vuer-ai/vuer's heavy <Urdf> component.
function DrivenRobot({ robot }) {
  const invalidate = useThree((s) => s.invalidate)
  const { engine } = usePlaylist({ url: JOINTS_URL })
  const jointValues = useJointAnglesForUrdf(engine, DRIVEN_JOINTS)

  useEffect(() => {
    robot.setJointValues(jointValues)
    invalidate()
  }, [robot, jointValues, invalidate])

  return <primitive object={robot} />
}

export function UrdfRobotDemo() {
  const { clock, state, play, pause, seek, setPlaybackRate, setLoop } = useTimeline()

  // Load the URDF + all meshes at parent level so we can show a loading
  // overlay while ~19 MB of STLs download.
  const [robot, setRobot] = useState(null)
  useEffect(() => {
    const loader = new URDFLoader()
    let parsed = null
    // URDFLoader's load callback fires before STL meshes finish — wait for
    // manager.onLoad so the bounding-box measurement is valid.
    loader.manager.onLoad = () => {
      if (!parsed) return
      const r = parsed
      r.rotation.x = -Math.PI / 2 // Z-up URDF → Y-up three.js
      r.traverse((o) => {
        if (o.isMesh) { o.castShadow = true; o.receiveShadow = true }
      })
      r.updateMatrixWorld(true)
      r.position.y = -new Box3().setFromObject(r).min.y // feet on Y=0
      setRobot(r)
    }
    loader.load(URDF_URL, (r) => { parsed = r })
  }, [])

  return (
    <ClockProvider clock={clock}>
      <Canvas shadows camera={{ position: [3.2, 2.0, 3.2], fov: 48 }}>
        <hemisphereLight args={['#bcd3ff', '#3b2416', 0.55]} />
        <directionalLight position={[5, 8, 4]} intensity={1.6} castShadow />
        <OrbitControls makeDefault target={[0, 1.0, 0]} />
        {robot && <DrivenRobot robot={robot} />}
      </Canvas>
      <TimelineController
        state={state}
        onPlay={play} onPause={pause} onSeek={seek}
        onSpeedChange={setPlaybackRate} onLoopChange={setLoop}
      />
    </ClockProvider>
  )
}
```

Demo URDF: Unitree G1 (23-DoF), fetched live from unitreerobotics/unitree_rl_gym. URDFLoader derives `workingPath` from the `src` URL, so the relative `meshes/*.STL` refs resolve to raw.githubusercontent.com automatically — no `packages` map needed. First load pulls ~19 MB of STLs (29 files, cached after). Our mock stream drives all 23 joints in a looping dance: alternating arm raises + flex, hip sway with knees bending on the beat, waist twist.
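The resolution behaves like standard relative-URL resolution: a relative mesh reference inside the URDF lands next to the URDF file on the same host. A minimal sketch (the mesh filename here is hypothetical, and this mirrors the behavior conceptually rather than reproducing urdf-loader's internals):

```javascript
// Relative mesh refs resolve against the URDF's own URL, the same way
// the WHATWG URL constructor resolves a relative path against a base.
const URDF_URL =
  'https://raw.githubusercontent.com/unitreerobotics/unitree_rl_gym/main/resources/robots/g1_description/g1_23dof.urdf'

// A hypothetical <mesh filename="meshes/pelvis.STL"> reference in the file...
const meshRef = 'meshes/pelvis.STL'

// ...resolves beside the URDF: the last path segment (the .urdf file)
// is replaced by the relative path.
const resolved = new URL(meshRef, URDF_URL).href
// → '.../main/resources/robots/g1_description/meshes/pelvis.STL'
```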
## Coordinate conventions
URDF (and ROS) use Z-up; three.js uses Y-up. Rotate the loaded `URDFRobot` by -π/2 around X once at load time so +Z → +Y:

```tsx
r.rotation.x = -Math.PI / 2
```

Without this, the robot will appear lying on its side. If your scene already has a Z-up camera (e.g. `camera.up.set(0, 0, 1)`), skip the rotation.
## Swapping the URDF
Any URL reachable by the browser works.

Plain URDF (primitives only) — point `src` at it:

```tsx
<UrdfRobot src="https://.../robot.urdf" jointValues={jointValues} />
```

URDF with mesh references — the URDFLoader resolves relative mesh paths against the URDF's URL, so meshes colocated under the same host work out of the box. If the URDF uses ROS-style `package://pkgname/...`, set the loader's `packages` map:

```tsx
const loader = new URDFLoader()
loader.packages = { robot_description: 'https://.../robot_description' }
```

For strict-CORS hosts, add `loader.fetchOptions = { mode: 'cors' }`.
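Conceptually, the `packages` map is a prefix rewrite: the package name after `package://` is looked up in the map and the remainder of the path is appended. A sketch of that rewrite (the URL and mesh path are hypothetical; urdf-loader performs this lookup internally):

```javascript
// Hypothetical packages map, as you would assign to loader.packages.
const packages = { robot_description: 'https://example.com/robot_description' }

// Rewrite a ROS-style package:// URI using the map; leave other paths alone.
function resolvePackageUri(uri) {
  const m = uri.match(/^package:\/\/([^/]+)\/(.*)$/)
  if (!m) return uri                        // plain/relative path: untouched
  const [, pkg, rest] = m
  return `${packages[pkg]}/${rest}`
}

resolvePackageUri('package://robot_description/meshes/base_link.stl')
// → 'https://example.com/robot_description/meshes/base_link.stl'
```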
## Under the Hood
```
useJointAnglesForUrdf(engine, names)
  = useMergedTrack(engine)
  → .tracks.get('data')
  → useTrackSample(track, clock.time)   // Float32Array[stride]
  → { [names[i]]: sample[i] }           // Record
```

The resulting record is passed to `robot.setJointValues(...)` in a `useEffect`, which walks the URDF's joint tree once per update — no per-frame React reconciliation deep in the scene graph.
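The final mapping step is a plain zip of the joint-name list with the sampled `Float32Array`. A minimal sketch of that step in isolation (names and values are illustrative; the hook does this internally):

```javascript
// Zip a stride-N sample with the joint-name list to build the record
// handed to robot.setJointValues(). names[i] is bound to sample[i].
const names = ['shoulder_pan', 'shoulder_lift', 'elbow']
const sample = new Float32Array([0.12, -1.45, 0.3])

const jointValues = Object.fromEntries(names.map((n, i) => [n, sample[i]]))
// ≈ { shoulder_pan: 0.12, shoulder_lift: -1.45, elbow: 0.3 }
// (values carry float32 rounding, e.g. 0.11999999731779099)
```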