CLI Reference
upload
Upload a file or folder to DreamLake. File type is auto-detected from extension.
Single File
dreamlake upload <file> --episode <target> --to <path> [--type <override>] [--collection <names>]
Folder
dreamlake upload <directory> --episode <target> --to <path> [--yes] [--collection <names>]
Uploads all files in the directory (flat, no recursion). Mixed types are auto-detected.
Flags
| Flag | Type | Required | Description |
|---|---|---|---|
| <file or dir> | positional | Yes | File path or directory |
| --episode | string | Yes | Target in [namespace@]space[:episode] format |
| --to | string | Yes | Destination path within the episode (e.g. /camera/front) |
| --type | string | No | Override the auto-detected type (single file only) |
| --yes | flag | No | Skip the confirmation prompt (folder upload) |
| --collection | string | No | Comma-separated collection names. Files are added as members. Collections are auto-created if they don't exist. |
Auto-detection
| Extension | Type | Lambda Processing |
|---|---|---|
| .mp4, .mov, .avi, .mkv, .webm | video | HLS split |
| .wav, .mp3, .flac, .aac, .ogg | audio | HLS split (AAC) |
| .vtt, .srt | text-track | Time-windowed chunks |
| .jsonl, .csv | label-track | Time-windowed chunks |
Unknown extensions are skipped with a warning.
Folder Upload Flow
1. Scan & Summarize
Scanning ./data/ ...
Total 152 files
Video 45
Audio 38
Label Track 42
Text Track 25
Skipped 2 — unknown extension:
readme.txt, notes.md
Continue? [y/n] (y):
Shows counts per type. Skipped files (unknown extension) are listed, truncated to 10 when there are many. Use --yes to skip the prompt.
2. Upload with Progress
Uploading ━━━━━━━━━━━━━━━━━━━━━━━━━━ 82/150 run42.mp4 (video, 12.5 MB)
Files are uploaded sequentially. Each file internally uses 4 parallel chunk workers.
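The concurrency model (sequential across files, 4 parallel chunk workers within a file) can be sketched as below. `upload_chunk` and `chunk_files` are placeholder callables, not the real client API.

```python
from concurrent.futures import ThreadPoolExecutor

def upload_file(chunks, upload_chunk):
    # Within a single file, chunks are uploaded by 4 parallel workers.
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(upload_chunk, chunks))

def upload_folder(files, chunk_files, upload_chunk):
    # Across files, uploads are strictly sequential.
    for f in files:
        upload_file(chunk_files(f), upload_chunk)
```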
3. Final Summary
✓ 150/150 uploaded, 2 skipped
Or with failures:
✓ 148/150 uploaded, 2 skipped
Failed: 2
run42.mp4: upload failed
mic3.wav: upload failed
Re-run to retry failed files.
Resume
Folder uploads are resumable. State is saved to ~/.dreamlake/uploads/folder-{hash}.json after each file completes.
On re-run:
- Done files are skipped
- Failed files are retried
- New files in the folder are added as pending
Resuming (78/150 done, 1 failed, 71 pending)
Continue? [y/n] (y):
Individual files also resume at the chunk level (separate state at ~/.dreamlake/uploads/{hash}.json).
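The resume classification above can be sketched as follows, assuming the saved state maps each file path to a status string; the function name and field values are guesses, not the actual state-file schema.

```python
def plan_resume(state: dict[str, str], current_files: list[str]) -> dict[str, list[str]]:
    """Classify the folder's files against the saved per-file status map."""
    return {
        "done":    [f for f in current_files if state.get(f) == "done"],
        "retry":   [f for f in current_files if state.get(f) == "failed"],
        "pending": [f for f in current_files if f not in state],
    }
```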
What Happens (Single File)
- File is chunked (10 MB parts) and uploaded to BSS via S3 multipart upload
- File is registered in BSS
- Asset is registered in DreamLake Server — creates episode node, folder hierarchy, and asset leaf node
- Lambda is triggered via presigned URL for HLS segmentation
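The first step (10 MB chunking for S3-style multipart upload) can be sketched as below; `iter_chunks` is illustrative, not the CLI's actual code.

```python
CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB parts, per the pipeline description above

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield (part_number, bytes) pairs for a multipart upload.

    S3 multipart part numbers start at 1."""
    with open(path, "rb") as f:
        part = 1
        while data := f.read(chunk_size):
            yield part, data
            part += 1
```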
Episode Syntax
The --episode flag uses [namespace@]space[:episode]:
| Example | Namespace | Space | Episode |
|---|---|---|---|
| robotics | (current user) | robotics | (none — space-level) |
| alice@robotics | alice | robotics | (none — space-level) |
| alice@robotics:run-042 | alice | robotics | run-042 |
Without an episode, assets are uploaded to the project root.
Examples
# Single file
dreamlake upload ./run01.mp4 --episode alice@robotics:run-042 --to /camera/front
# Entire folder
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors
# Folder, skip prompt
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors --yes
# Force type override (single file only)
dreamlake upload ./data.jsonl --episode alice@robotics:run-042 --to /data --type text-track
# Upload and add to a collection (auto-created)
dreamlake upload ./run01.mp4 --episode alice@robotics:run-042 --to /camera/front \
--collection front-camera
# Upload to multiple collections
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors \
--collection training-set,batch-001
# Folder upload + collections
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors \
--collection front-camera,all-videos --yes
Collections
The --collection flag adds uploaded files to one or more collections (comma-separated). Collections are auto-created if they don't exist.
- Single file: the file's node ID is added to each collection
- Folder: all successfully uploaded files' node IDs are added after the batch completes
- Collections are linked to the same project as the upload target
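The membership step above amounts to: split the comma-separated names, auto-create any missing collection, and add every uploaded node ID. A minimal sketch, modeling collections as an in-memory dict rather than the real API:

```python
def add_to_collections(collections: dict[str, set], names: str, node_ids: list[str]) -> dict[str, set]:
    """Add node IDs to each named collection, creating missing collections."""
    for name in names.split(","):
        collections.setdefault(name, set()).update(node_ids)
    return collections
```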
See Collections API for managing collections via the API.
See Chunked Upload for the upload pipeline design.