
Custom Views

Every pre-built view in the views catalog is a thin composition over the four core hooks: useSegment, useSegmentTrack, useMergedTrack, and useTrackSample. This page shows how to build each category yourself. Each sample is a minimal, self-contained implementation you can copy and modify.

If your first question is "which hook?", start with the decision table on the hooks page.


Sample 1 — Discrete events (useSegment)

A mini ActionLabelView. Each segment's JSONL is a list of events; we highlight whichever ones overlap the current time.

import {
  useClockValue,
  useClockContext,
  usePlaylist,
  useSegment,
  type TimelineClock,
} from '@vuer-ai/vuer-m3u';
 
type ActionEvent = { ts: number; te: number; label: string };
 
export function MyActions({ src, clock }: { src: string; clock?: TimelineClock | null }) {
  const resolvedClock = useClockContext(clock);
  const { engine } = usePlaylist({ url: src }, resolvedClock);
  const { data } = useSegment<ActionEvent[]>(engine, resolvedClock);
  const time = useClockValue(4, resolvedClock);
 
  const events = data ?? [];
  return (
    <ul>
      {events.map((e, i) => {
        const active = time >= e.ts && time < e.te;
        return (
          <li key={i} style={{ fontWeight: active ? 700 : 400 }}>
            {e.label} · {e.ts.toFixed(2)}–{e.te.toFixed(2)}s
          </li>
        );
      })}
    </ul>
  );
}

Key points:

  • useClockContext(clock) resolves the clock once at the top. Everything below threads the resolved value.
  • useClockValue(4, resolvedClock) is the only re-render driver — the list itself doesn't change mid-segment, only the highlight does, so 4 fps is plenty.
  • useSegment<ActionEvent[]> — the type argument documents the per-line JSONL shape.
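For concreteness, a segment for this view might contain JSONL like the following (values hypothetical):

```jsonl
{"ts": 0.00, "te": 1.25, "label": "reach"}
{"ts": 1.25, "te": 2.80, "label": "grasp"}
{"ts": 2.80, "te": 4.10, "label": "lift"}
```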

Sample 2 — Continuous time-series (useMergedTrack + useTrackSample)

A mini JointAngleView. useMergedTrack merges the current segment with its contiguous neighbors into columnar tracks; useTrackSample gives the value at the current clock time.

import {
  useClockValue,
  useClockContext,
  usePlaylist,
  useMergedTrack,
  useTrackSample,
  type TimelineClock,
} from '@vuer-ai/vuer-m3u';
 
export function MyJoints({
  src,
  clock,
  names,
}: {
  src: string;
  clock?: TimelineClock | null;
  names: string[];
}) {
  const resolvedClock = useClockContext(clock);
  const { engine } = usePlaylist({ url: src }, resolvedClock);
  const { tracks } = useMergedTrack(engine, resolvedClock);
  const time = useClockValue(15, resolvedClock);
  const sample = useTrackSample(tracks.get('data'), time);
 
  return (
    <table>
      <tbody>
        {names.map((name, i) => (
          <tr key={name}>
            <td>{name}</td>
            <td>{sample?.[i]?.toFixed(3) ?? '—'}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}

Key points:

  • useMergedTrack handles prefetch, normalization, contiguous merging, and gap safety; it returns a Map<string, TrackSamples> — reach for tracks.get('data') in the single-channel case.
  • useTrackSample(track, time) returns a reused Float32Array of length track.stride. Don't cache it across renders.
  • 15 fps is a good default for numeric displays. Push to 30 fps only when values change fast enough to read.
  • Only need the current chunk (no cross-boundary smoothing)? Swap useMergedTrack for useSegmentTrack — same shape, lighter footprint.
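To make the sampling concrete, here is an illustrative linear sampler over the columnar { times, values, stride } shape. This is a minimal sketch, not the library's implementation: the real hook reuses its output buffer and keeps a search hint instead of scanning from the start.

```typescript
// Illustrative only. Columnar track: times[i] pairs with the stride-wide
// slice values[i * stride .. (i + 1) * stride).
type TrackSamples = { times: Float32Array; values: Float32Array; stride: number };

function sampleLinear(track: TrackSamples, t: number): Float32Array {
  const { times, values, stride } = track;
  const n = times.length;
  const out = new Float32Array(stride);
  // Clamp to the track's time range.
  if (t <= times[0]) {
    out.set(values.subarray(0, stride));
    return out;
  }
  if (t >= times[n - 1]) {
    out.set(values.subarray((n - 1) * stride, n * stride));
    return out;
  }
  // Find the first keyframe at or after t (plain scan for clarity).
  let i = 1;
  while (times[i] < t) i++;
  const a = (t - times[i - 1]) / (times[i] - times[i - 1]);
  for (let c = 0; c < stride; c++) {
    const v0 = values[(i - 1) * stride + c];
    const v1 = values[i * stride + c];
    out[c] = v0 + (v1 - v0) * a; // lerp each channel independently
  }
  return out;
}
```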

Splitting one stream into multiple tracks (custom normalizer)

When a single JSONL stream encodes heterogeneous channels that need different interpolators (e.g., [x, y, z, qx, qy, qz, qw] in PoseView — position uses lerp, orientation uses slerpQuat), write a custom Normalizer that emits multiple tracks and pass it to useMergedTrack:

import type { Normalizer, ContinuousSample } from '@vuer-ai/vuer-m3u';
 
const poseNormalizer: Normalizer<ContinuousSample[]> = (samples) => {
  if (!samples?.length) return null;
  const n = samples.length;
  const times = new Float32Array(n);
  const pos = new Float32Array(n * 3);
  const quat = new Float32Array(n * 4);
  for (let i = 0; i < n; i++) {
    times[i] = samples[i].ts;
    const d = samples[i].data as number[];
    pos.set(d.slice(0, 3), i * 3);
    quat.set(d.slice(3, 7), i * 4);
  }
  return new Map([
    ['position', { times, values: pos, stride: 3 }],
    ['orientation', { times, values: quat, stride: 4 }],
  ]);
};
 
const { tracks } = useMergedTrack(engine, resolvedClock, { normalize: poseNormalizer });
const position = useTrackSample(tracks.get('position'), time);
const orientation = useTrackSample(tracks.get('orientation'), time, slerpQuat);

The same normalize option is available on useSegmentTrack when you don't need cross-segment merging.
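Assuming the ContinuousSample fields the loop reads (ts and data), a matching pose line would carry all seven channels in one array (values hypothetical):

```jsonl
{"ts": 0.033, "data": [0.12, -0.40, 0.95, 0.0, 0.0, 0.0, 1.0]}
```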


Sample 3 — Canvas at 60fps (sampleTrack)

For a smooth animation, skip useTrackSample (which only resamples when React renders) and drive drawing from clock.on('tick'). sampleTrack is the pure function behind the hook.

import { useEffect, useRef } from 'react';
import {
  useClockContext,
  usePlaylist,
  useMergedTrack,
  sampleTrack,
  lerp,
  type TimelineClock,
} from '@vuer-ai/vuer-m3u';
 
export function TrajectoryCanvas({ src, clock }: { src: string; clock?: TimelineClock | null }) {
  const resolvedClock = useClockContext(clock);
  const { engine } = usePlaylist({ url: src }, resolvedClock);
  const { tracks } = useMergedTrack(engine, resolvedClock);
  const canvasRef = useRef<HTMLCanvasElement>(null);
 
  useEffect(() => {
    const canvas = canvasRef.current;
    const track = tracks.get('data');
    if (!canvas || !track) return;
 
    const ctx = canvas.getContext('2d')!;
    const hint = { value: 0 };
    const out = new Float32Array(track.stride);
 
    const unsub = resolvedClock.on('tick', () => {
      sampleTrack(track, resolvedClock.time, lerp, hint, out);
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = '#38bdf8';
      ctx.beginPath();
      ctx.arc(out[0], out[1], 6, 0, Math.PI * 2);
      ctx.fill();
    });
    return unsub;
  }, [resolvedClock, tracks]);
 
  return <canvas ref={canvasRef} width={400} height={300} />;
}

Key points:

  • The clock.on('tick') subscription drives 60fps drawing without re-rendering React.
  • hint and out live inside the effect — one allocation per mount, amortized lookup.
  • When tracks updates (new chunks merged), the effect re-subscribes with fresh data.

Checklist — adding a new view

Before opening a PR:

  1. Pick the right hook. Re-read the decision table.
  2. Export a *Sample TypeScript type. Describe the JSONL line shape as a type — it's the contract your users type-check against.
  3. Write the file-level JSDoc. At minimum: one-line purpose, data schema (field table), hooks used, recommended source rate.
  4. Accept clock?: TimelineClock | null. Call useClockContext(clock) at the top. Never skip this — it's how the view composes inside <ClockProvider>.
  5. Add the doc page. New file at pages/vuer-m3u/views/<name>/+Page.mdx using the template from any existing view page: Purpose / Props / Data Schema (+ a short concrete JSONL example) / Usage / Under the Hood.
  6. Register in the views catalog. Add a row to the overview table in pages/vuer-m3u/views/+Page.mdx.
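A skeleton for items 2 and 3 might look like this (view name and schema hypothetical); the runtime guard is optional but handy in tests and custom normalizers:

```typescript
/**
 * GaugeView - renders a single scalar channel as a dial. (Hypothetical view.)
 *
 * Data schema (one JSONL line per sample):
 *   ts     number  timestamp in seconds
 *   value  number  scalar reading
 *
 * Hooks: useSegmentTrack + useTrackSample. Recommended source rate: 15 Hz.
 */
export type GaugeSample = { ts: number; value: number };

/** Narrowing guard for parsed JSONL lines. */
export function isGaugeSample(x: unknown): x is GaugeSample {
  return (
    typeof x === 'object' &&
    x !== null &&
    typeof (x as GaugeSample).ts === 'number' &&
    typeof (x as GaugeSample).value === 'number'
  );
}
```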