DreamLake

CLI Reference

upload

Upload a file or folder to DreamLake. File type is auto-detected from extension.

Single File

dreamlake upload <file> --episode <target> --to <path> [--type <override>] [--collection <names>]

Folder

dreamlake upload <directory> --episode <target> --to <path> [--yes] [--collection <names>]

Uploads all files in the directory (flat, no recursion). Mixed types are auto-detected.

Flags

| Flag | Type | Required | Description |
|---|---|---|---|
| &lt;file or dir&gt; | positional | Yes | File path or directory |
| --episode | string | Yes | Target in [namespace@]space[:episode] format |
| --to | string | Yes | Destination path within the episode (e.g. /camera/front) |
| --type | string | No | Override the auto-detected type (single file only) |
| --yes | flag | No | Skip the confirmation prompt (folder upload) |
| --collection | string | No | Comma-separated collection names. Files are added as members; collections are auto-created if they don't exist. |

Auto-detection

| Extension | Type | Lambda Processing |
|---|---|---|
| .mp4, .mov, .avi, .mkv, .webm | video | HLS split |
| .wav, .mp3, .flac, .aac, .ogg | audio | HLS split (AAC) |
| .vtt, .srt | text-track | Time-windowed chunks |
| .jsonl, .csv | label-track | Time-windowed chunks |

Unknown extensions are skipped with a warning.

Folder Upload Flow

1. Scan & Summarize

Scanning ./data/ ...
 
  Total             152 files
  Video              45
  Audio              38
  Label Track        42
  Text Track         25
  Skipped             2  — unknown extension:
    readme.txt, notes.md
 
  Continue? [y/n] (y):

Shows counts per type. Skipped files (unknown extension) are listed — truncated to 10 if there are many. Use --yes to skip the prompt.
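The scan step can be sketched as counting per detected type and truncating the skipped listing (a hypothetical illustration; `TYPE_BY_EXT` here is an abbreviated stand-in for the full extension table):

```python
import os
from collections import Counter

# Abbreviated extension map; see the auto-detection table for the full set.
TYPE_BY_EXT = {".mp4": "video", ".wav": "audio",
               ".vtt": "text-track", ".jsonl": "label-track"}

def summarize(paths, max_listed=10):
    """Count files per type; collect unknown-extension files, truncating the listing."""
    counts = Counter()
    skipped = []
    for p in paths:
        t = TYPE_BY_EXT.get(os.path.splitext(p)[1].lower())
        if t:
            counts[t] += 1
        else:
            skipped.append(p)
    return counts, skipped[:max_listed], len(skipped)
```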

2. Upload with Progress

Uploading ━━━━━━━━━━━━━━━━━━━━━━━━━━ 82/150  run42.mp4 (video, 12.5 MB)

Files are uploaded sequentially. Each file internally uses 4 parallel chunk workers.
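The concurrency model (sequential files, parallel chunks within a file) can be sketched with a bounded worker pool. `put_chunk` is a hypothetical callable standing in for the actual chunk-upload call:

```python
from concurrent.futures import ThreadPoolExecutor

CHUNK_WORKERS = 4  # per the docs: 4 parallel chunk workers per file

def upload_file(chunks, put_chunk):
    """Upload one file's chunks through a pool of CHUNK_WORKERS workers.

    put_chunk(index, data) -> receipt (e.g. an ETag); results keep chunk order.
    """
    with ThreadPoolExecutor(max_workers=CHUNK_WORKERS) as pool:
        return list(pool.map(put_chunk, range(len(chunks)), chunks))

def upload_folder(files, put_chunk):
    """Files are processed one at a time; only chunks within a file run in parallel."""
    for _name, chunks in files:
        upload_file(chunks, put_chunk)
```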

3. Final Summary

150/150 uploaded, 2 skipped

Or with failures:

148/150 uploaded, 2 skipped
  Failed: 2
    run42.mp4: upload failed
    mic3.wav: upload failed
  Re-run to retry failed files.

Resume

Folder uploads are resumable. State is saved to ~/.dreamlake/uploads/folder-{hash}.json after each file completes.

On re-run:

  • Done files are skipped
  • Failed files are retried
  • New files in the folder are added as pending

Resuming (78/150 done, 1 failed, 71 pending)
 
  Continue? [y/n] (y):
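The re-run logic above can be sketched as reconciling the saved per-file state with the current folder contents (field names here are assumptions, not the actual state-file schema):

```python
def plan_resume(state: dict, current_files: set):
    """Split the folder into done / failed / pending from a saved status map.

    state maps filename -> "done" | "failed"; anything not in state is new.
    """
    done = {f for f, s in state.items() if s == "done" and f in current_files}
    failed = {f for f, s in state.items() if s == "failed" and f in current_files}
    new = current_files - state.keys()
    pending = failed | new  # failed files are retried; new files are added
    return done, failed, pending
```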

Individual files also resume at the chunk level (separate state at ~/.dreamlake/uploads/{hash}.json).

What Happens (Single File)

  1. File is chunked (10 MB parts) and uploaded to BSS via S3 multipart upload
  2. File is registered in BSS
  3. Asset is registered in DreamLake Server — creates episode node, folder hierarchy, and asset leaf node
  4. Lambda is triggered via presigned URL for HLS segmentation

Episode Syntax

The --episode flag uses [namespace@]space[:episode]:

| Example | Namespace | Space | Episode |
|---|---|---|---|
| robotics | (current user) | robotics | (none — space-level) |
| alice@robotics | alice | robotics | (none — space-level) |
| alice@robotics:run-042 | alice | robotics | run-042 |

Without an episode, assets are uploaded to the project root.
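A sketch of how a [namespace@]space[:episode] target could be parsed, matching the table above (illustrative; not the CLI's actual parser):

```python
def parse_episode(target: str, current_user: str):
    """Split [namespace@]space[:episode]; namespace defaults to the current user."""
    namespace, _, rest = target.rpartition("@")
    namespace = namespace or current_user
    space, _, episode = rest.partition(":")
    return namespace, space, episode or None  # None means space-level
```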

Examples

# Single file
dreamlake upload ./run01.mp4 --episode alice@robotics:run-042 --to /camera/front
 
# Entire folder
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors
 
# Folder, skip prompt
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors --yes
 
# Force type override (single file only)
dreamlake upload ./data.jsonl --episode alice@robotics:run-042 --to /data --type text-track
 
# Upload and add to a collection (auto-created)
dreamlake upload ./run01.mp4 --episode alice@robotics:run-042 --to /camera/front \
  --collection front-camera
 
# Upload to multiple collections
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors \
  --collection training-set,batch-001
 
# Folder upload + collections
dreamlake upload ./data/ --episode alice@robotics:run-042 --to /sensors \
  --collection front-camera,all-videos --yes

Collections

The --collection flag adds uploaded files to one or more collections (comma-separated). Collections are auto-created if they don't exist.

  • Single file: the file's node ID is added to each collection
  • Folder: all successfully uploaded files' node IDs are added after the batch completes
  • Collections are linked to the same project as the upload target
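The membership step can be sketched as expanding the comma-separated --collection value against the uploaded node IDs (a hypothetical illustration; the CLI's internal representation is not documented here):

```python
def collection_memberships(names_arg: str, node_ids: list):
    """Expand a comma-separated --collection value into (collection, node_id) pairs."""
    names = [n.strip() for n in names_arg.split(",") if n.strip()]
    return [(name, nid) for name in names for nid in node_ids]
```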

See Collections API for managing collections via the API.

See Chunked Upload for the upload pipeline design.