Cmaf Native Support #822
Conversation
Co-authored-by: Emil Santurio <emilsas@gmail.com>
Walkthrough: This pull request refactors container format handling from "legacy"/"fmp4" nomenclature to "native"/"cmaf", adds optional …

🚥 Pre-merge checks: ✅ Passed checks (3 passed)
Actionable comments posted: 18
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
rs/hang/src/catalog/root.rs (1)
292-381: Backward compatibility is not actually implemented despite the documentation claim.

The `container` field in both `AudioConfig` and `VideoConfig` lacks `#[serde(default)]`. Without this attribute, serde requires the field to be present in JSON during deserialization, even though `Container` has a `Default` implementation. The documentation stating "Defaults to 'native' for backward compatibility" is misleading.

To support backward compatibility with older catalogs missing the `container` field, add `#[serde(default)]` to the `container` field in both structs:

```rust
#[serde(default)]
pub container: Container,
```

Then add a test to verify older JSON without the field deserializes correctly.
justfile (1)
150-255: Exit-code propagation block won't run on `cargo run` failure under `set -e`.

If you want the explicit `EXIT_CODE` handling to work (and always run the post-`cargo run` lines), wrap the invocation with `set +e` / `set -e` (or remove the explicit propagation and rely on `trap cleanup EXIT`).

Proposed diff (keep explicit EXIT_CODE handling):

```diff
 # Run hang to ingest from local files
-if [ -n "$PASSTHROUGH_FLAG" ]; then
-    echo ">>> Running with --passthrough flag"
-    cargo run --bin hang -- publish --url "{{relay}}" --name "{{name}}" hls --playlist "$OUT_DIR/master.m3u8" --passthrough
-else
-    echo ">>> Running without --passthrough flag"
-    cargo run --bin hang -- publish --url "{{relay}}" --name "{{name}}" hls --playlist "$OUT_DIR/master.m3u8"
-fi
-EXIT_CODE=$?
+set +e
+if [ -n "$PASSTHROUGH_FLAG" ]; then
+    echo ">>> Running with --passthrough flag"
+    cargo run --bin hang -- publish --url "{{relay}}" --name "{{name}}" hls --playlist "$OUT_DIR/master.m3u8" --passthrough
+else
+    echo ">>> Running without --passthrough flag"
+    cargo run --bin hang -- publish --url "{{relay}}" --name "{{name}}" hls --playlist "$OUT_DIR/master.m3u8"
+fi
+EXIT_CODE=$?
+set -e

 # Cleanup after cargo run completes (success or failure)
 cleanup
```

rs/hang/src/import/hls.rs (1)
124-145: Avoid `Client::builder().build().unwrap()` and add an HTTP timeout for ingest.

The current code uses `.unwrap()` inside the closure, which panics on client creation failure instead of propagating the error through the `anyhow::Result` return type. Additionally, the HTTP client has no timeout configured, which is problematic for network-based streaming ingest operations.

Proposed diff:

```diff
 pub fn new(broadcast: BroadcastProducer, cfg: HlsConfig) -> anyhow::Result<Self> {
 	let base_url = cfg.parse_playlist()?;

-	let client = cfg.client.unwrap_or_else(|| {
-		Client::builder()
-			.user_agent(concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")))
-			.build()
-			.unwrap()
-	});
+	let client = match cfg.client {
+		Some(client) => client,
+		None => Client::builder()
+			.user_agent(concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION")))
+			.timeout(Duration::from_secs(10))
+			.build()
+			.context("failed to build reqwest client")?,
+	};

 	let passthrough = cfg.passthrough;

 	Ok(Self {
```
🤖 Fix all issues with AI agents
In @js/hang/src/catalog/container.ts:
- Line 8: Update the stale inline comment entry for "cmaf" in the container list
so it no longer says "(future)"; locate the "cmaf" item in
js/hang/src/catalog/container.ts and change its annotation to reflect that CMAF
is supported/active (consistent with the implementation in container/codec.ts
and MSE usage).
In @js/hang/src/util/mime.ts:
- Around line 22-24: isMimeTypeSupported currently calls
MediaSource.isTypeSupported directly and will throw in non-browser environments;
guard access by checking that typeof MediaSource !== "undefined" and that
MediaSource.isTypeSupported is a function before calling it, returning false
when the guard fails; update the isMimeTypeSupported function to perform this
safe check so imports in Node/SSR/tests do not trigger ReferenceError.
In @js/hang/src/watch/audio/emitter.ts:
- Around line 86-92: When readyState is below HTMLMediaElement.HAVE_METADATA you
add a loadedmetadata listener via mseAudio.addEventListener("loadedmetadata",
tryPlay, { once: true }) but never remove it if the effect unmounts or re-runs;
update the effect to capture the handler reference (e.g., const onLoaded =
tryPlay), attach with mseAudio.addEventListener("loadedmetadata", onLoaded, {
once: true }), and in the effect cleanup call
mseAudio.removeEventListener("loadedmetadata", onLoaded) (guarding that mseAudio
still exists and its readyState is still < HAVE_METADATA) so the listener is
removed if the component unmounts or the effect is disposed before the event
fires.
In @js/hang/src/watch/audio/source.ts:
- Around line 44-47: The mseAudioElement getter can hold a stale
HTMLAudioElement when the source is not using MSE; update the code paths that
switch or disable MSE (the places around the #mseAudioElement declaration and
the logic blocks referenced at 92-107) to explicitly clear the private signal by
setting #mseAudioElement.value = undefined whenever leaving the MSE path or when
disposing/replacing the MSE element so the UI/control state cannot reference a
stale element; ensure every branch that creates or assigns an MSE element also
handles the opposite branch to clear the signal.
- Around line 194-236: The per-track forwarding currently uses the global
this.#signals.effect which leaks subscriptions across track switches; change the
two this.#signals.effect(...) calls inside #runMSEPath to use the local effect
(the parameter named effect) so subscriptions are scoped to the current run and
teardown with the run's lifecycle: subscribe to mseSource.audioElement and
mseSource.stats using the passed-in effect and ensure you unregister in
effect.cleanup (or the effect's native unsubscribe API) so you set
this.#mseAudioElement and this.#stats only while that run is active and they are
removed when the effect ends.
- Around line 186-236: The fallback from #runMSEPath to #runWebCodecsPath is
invalid for container === "cmaf" because Frame.Consumer will still emit fMP4
fragments that WebCodecs.AudioDecoder cannot decode; update the catch in
#runMSEPath to avoid calling this.#runWebCodecsPath when config.container ===
"cmaf" (either select/signal a non-CMAF rendition/container before falling back
or abort and surface the error), or implement fMP4 demuxing inside
#runWebCodecsPath so it accepts CMAF fragments (handle fragment parsing to raw
codec frames before feeding to AudioDecoder); ensure changes reference
mseSource.runTrack, SourceMSE, #runWebCodecsPath, and the config.container check
so the fallback only happens when fragments are compatible.
In @js/hang/src/watch/video/source-mse.ts:
- Around line 77-81: The created video element (this.#video) is appended to
document.body before later initialization steps like addSourceBuffer, so if
those fail the element leaks; change the flow to ensure cleanup on any
initialization error by wrapping the init sequence (creation, append,
addSourceBuffer and subsequent setup) in a try/catch/finally or promise chain
that removes the element from the DOM and clears this.#video on failure
(document.body.removeChild(this.#video) and this.#video = null), or
alternatively defer appending until after addSourceBuffer succeeds; ensure the
rejection path for addSourceBuffer and related methods removes the element and
nulls the reference to avoid leaks.
In @js/hang/src/watch/video/source.ts:
- Around line 102-104: The optional callback property setAudioSync is being
assigned but not cleared on effect/component cleanup, which can leave stale
closures called after a track switch; update every place where setAudioSync is
assigned (the initial MSE assignment and the other assignment block) to return a
cleanup that sets setAudioSync to undefined (or removes the assignment) and also
clear it immediately before assigning a new callback when switching tracks so no
stale calls remain. Ensure you update both assignment sites that set
setAudioSync to always null out/undefine the previous callback in their teardown
and before reassigning.
- Around line 216-270: The forwarding effects registered on this.#signals inside
#runMSEPath are not torn down and will leak across track switches; capture each
disposer returned by this.#signals.effect when you create them and register a
cleanup via effect.cleanup that calls each disposer (or store them in an array
and call them all), so that when the spawned effect ends (or mseSource is
closed) those signal forwards are removed; apply this to every
this.#signals.effect call associated with mseSource (the
frame/display/bufferStatus/syncStatus/stats forwarders) and ensure
effect.cleanup also closes mseSource if not already handled.
- Around line 207-214: The fallback from MSE to WebCodecs in #runTrack is broken
because when config.container === "cmaf" the code calls #runWebCodecsPath with
CMAF-fragmented MP4 data (moof/mdat) which VideoDecoder.decode() cannot handle;
either stop falling back for CMAF containers or implement demuxing before
calling #runWebCodecsPath. Fix by detecting config.container === "cmaf" in the
MSE error path inside #runTrack (and the analogous audio path), and either (a)
avoid calling #runWebCodecsPath and surface the error, or (b) update the
fallback to run a demux step that extracts raw codec frames from CMAF fragments
(moof/mdat → raw frames) and pass those frames to the VideoDecoder/AudioDecoder
in #runWebCodecsPath. Ensure references to #runMSEPath, #runWebCodecsPath, and
config.container are used so the change targets the correct logic.
In @rs/hang/src/catalog/audio/mod.rs:
- Around line 66-68: Add #[serde(default)] above the pub container: Container
field in the struct (the Container field in rs/hang/src/catalog/audio/mod.rs) so
deserializing legacy JSON that omits container falls back to the default; if
Container does not already implement Default, implement Default for Container
(returning the "native" variant) or provide a serde default function and
reference it (e.g., #[serde(default = "default_container")]) so the documented
backward-compatible default is applied.
In @rs/hang/src/catalog/video/mod.rs:
- Around line 114-117: The VideoConfig struct's container field lacks the serde
default attribute, so deserializing older catalogs that omit container will
fail; add #[serde(default)] to the container field declaration in VideoConfig so
Serde will call Container::default() (which returns Container::Native) when the
field is missing, keeping backward compatibility with existing serialized data.
In @rs/hang/src/import/fmp4.rs:
- Around line 710-714: The warning in transport_fragment is misleading because
moov_bytes being Some at this point means init() was never called, not that the
init segment wasn't sent; update the tracing::warn! message in
transport_fragment (which checks self.moov_bytes.is_some()) to clearly say
"init() was not called; moov_bytes still present" (or similar), and optionally
adjust the log level if you want this to be an error or debug signal rather than
a warning; keep the check on self.moov_bytes and only change the log text/level
to accurately reflect the condition.
- Around line 99-110: The current decode method captures available_bytes using
buf.chunk(), which only takes the first contiguous slice and will miss data for
multi-chunk buffers; update the logic in decode (the available_bytes creation
when self.passthrough_mode is true) to collect all remaining bytes instead of
just buf.chunk()—for example by reading buf.remaining() into a new Bytes/Vec or
by iterating over chunks and concatenating them—so subsequent slice/offset math
uses a complete copy of the remaining bytes rather than a single chunk.
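A sketch of one way to gather every remaining byte with the `bytes` crate (a standalone illustration, not the actual decode method; `remaining_bytes` is a hypothetical helper name):

```rust
use bytes::{Buf, Bytes};

/// Copy *all* remaining bytes out of a possibly multi-chunk buffer.
/// `Buf::chunk()` only exposes the first contiguous slice, so offset math
/// based on it can silently truncate fragmented buffers.
fn remaining_bytes<B: Buf>(buf: &mut B) -> Bytes {
    // copy_to_bytes advances the cursor and concatenates every chunk.
    buf.copy_to_bytes(buf.remaining())
}

fn main() {
    // A buffer made of two non-contiguous chunks.
    let mut buf = (&b"moof"[..]).chain(&b"mdat"[..]);

    // chunk() would only see "moof"; remaining_bytes sees everything.
    let all = remaining_bytes(&mut buf);
    assert_eq!(&all[..], b"moofmdat");
}
```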
🧹 Nitpick comments (16)
rs/hang/src/model/track.rs (2)
22-29: New `is_init_segment_group` state looks fine, but be mindful it's now part of the producer's cloning semantics.

Since `TrackProducer` is `#[derive(Clone)]`, this flag becomes part of whatever clone behavior is relied on elsewhere (especially if clones can write). If only one writer clone is intended, consider documenting that invariant near `TrackProducer` (or make cloning/write patterns explicit).

Also applies to: 33-40
82-95: Init segment detection should be "first keyframe at timestamp 0", not just "timestamp 0".

As written, any keyframe-created group at `timestamp == 0` becomes the "init segment group". If that can happen beyond the true init segment (even briefly), you'll incorrectly keep a non-init group open.

A low-cost guard is to compute `is_first_keyframe` before you mutate `self.keyframe`, and only set `is_init_segment_group` when it's the first keyframe.

Proposed tweak (guard init detection):

```diff
 pub fn write(&mut self, frame: Frame) -> Result<(), Error> {
 	tracing::trace!(?frame, "write frame");

+	let is_first_keyframe = self.keyframe.is_none();
+
 	let mut header = BytesMut::new();
 	frame.timestamp.as_micros().encode(&mut header, lite::Version::Draft02);
@@
 	let mut group = match self.group.take() {
 		Some(group) => group,
 		None if frame.keyframe => {
 			let new_group = self.inner.append_group();
 			// Log when creating a new group, especially for init segment (timestamp 0)
-			if frame.timestamp.as_micros() == 0 {
+			if is_first_keyframe && frame.timestamp.as_micros() == 0 {
 				tracing::info!(timestamp = ?frame.timestamp, "creating new group for init segment (timestamp 0)");
 				// Mark this as the init segment group so we can keep it open
 				self.is_init_segment_group = true;
 			} else {
 				tracing::info!(timestamp = ?frame.timestamp, "creating new group for keyframe");
 			}
 			new_group
 		}
```
52-54: Clarify the logging message.

The message "HLS publish preserving original container format." is logged unconditionally, but the passthrough mode may be disabled. Consider making the message conditional or rephrasing it to reflect the actual state.
💡 Proposed improvement
```diff
-	tracing::info!(
-		passthrough = *passthrough,
-		"HLS publish preserving original container format."
-	);
+	tracing::info!(
+		passthrough = *passthrough,
+		"HLS publish configured with passthrough mode."
+	);
```

Or make it conditional:

```diff
-	tracing::info!(
-		passthrough = *passthrough,
-		"HLS publish preserving original container format."
-	);
+	if *passthrough {
+		tracing::info!("HLS publish preserving original CMAF container format (passthrough mode).");
+	} else {
+		tracing::info!("HLS publish decomposing CMAF fragments to native frames.");
+	}
```

rs/hang/src/catalog/container.rs (1)
3-18: Consider making `Container` forward-compatible (`#[non_exhaustive]`) and simplifying serde renames.

Adding future variants will otherwise break downstream exhaustive matches; also `rename_all = "camelCase"` already yields `"native" | "raw" | "cmaf"`, so explicit per-variant renames are redundant.

Proposed diff:

```diff
 #[derive(Serialize, Deserialize, Debug, Clone, Copy, PartialEq, Eq, Default)]
-#[serde(rename_all = "camelCase")]
+#[serde(rename_all = "camelCase")]
+#[non_exhaustive]
 pub enum Container {
-	#[serde(rename = "native")]
 	#[default]
 	Native,
-	#[serde(rename = "raw")]
 	Raw,
-	#[serde(rename = "cmaf")]
 	Cmaf,
 }
```
32-52: Optionally refuse to build MP4 MIME types unless `container === "cmaf"`. (Prevents accidental MSE attempts for native/raw.)

Proposed diff:

```diff
 export function buildVideoMimeType(config: Catalog.VideoConfig): string | undefined {
+	if (config.container !== "cmaf") return undefined;
 	const mimeType = buildMimeType(config.codec, "video");
 	if (isMimeTypeSupported(mimeType)) {
 		return mimeType;
 	}
 	return undefined;
 }

 export function buildAudioMimeType(config: Catalog.AudioConfig): string | undefined {
+	if (config.container !== "cmaf") return undefined;
 	const mimeType = buildMimeType(config.codec, "audio");
 	if (isMimeTypeSupported(mimeType)) {
 		return mimeType;
 	}
 	return undefined;
 }
```
27-48: `HlsConfig::new()` should ideally assert new fields in the unit test (incl. `passthrough: false`).

Also applies to: 41-48
177-209: Init path should also call `ensure_init_segment(...)` (like the steady-state path) to avoid extra playlist fetches inside `push_segment`.

Proposed diff:

```diff
 async fn consume_segments_limited(
 	&mut self,
 	kind: TrackKind,
 	track: &mut TrackState,
 	playlist: &MediaPlaylist,
 	max_segments: usize,
 ) -> anyhow::Result<usize> {
+	self.ensure_init_segment(kind, track, playlist).await?;
+
 	// Calculate segments to process
 	let next_seq = track.next_sequence.unwrap_or(0);
```

Also applies to: 322-364
465-503: `push_segment` doing a playlist refetch to initialize is surprising and adds network/file I/O to the "hot path".

Proposed diff (after making `consume_segments_limited` call `ensure_init_segment`):

```diff
 let mut bytes = self.fetch_bytes(url).await?;

-// Ensure the importer is initialized before processing fragments
-// Use track.init_ready to avoid borrowing issues
-if !track.init_ready {
-	// Try to ensure init segment is processed
-	let playlist = self.fetch_media_playlist(track.playlist.clone()).await?;
-	self.ensure_init_segment(kind, track, &playlist).await?;
-}
-
 // Get importer after ensuring init segment
 let importer = match kind {
 	TrackKind::Video(index) => self.ensure_video_importer_for(index),
 	TrackKind::Audio => self.ensure_audio_importer(),
 };

-// Final check after ensuring init segment
+// Final check
 if !importer.is_initialized() {
 	return Err(anyhow::anyhow!(
 		"importer not initialized for {:?} after ensure_init_segment - init segment processing failed",
 		kind
 	));
 }
```

js/hang/src/watch/audio/source-mse.ts (2)
40-88: `initialize()` assumes DOM + MSE availability; consider failing fast with a clearer error when unavailable.
251-275: `hasMoovAtom` MP4 box parsing is fragile (32-bit signed bitwise; no extended-size handling).

Proposed diff (unsigned size):

```diff
-const size =
-	(data[offset] << 24) | (data[offset + 1] << 16) | (data[offset + 2] << 8) | data[offset + 3];
+const size =
+	(((data[offset] << 24) |
+		(data[offset + 1] << 16) |
+		(data[offset + 2] << 8) |
+		data[offset + 3]) >>>
+		0);
```
314-319: Logging at `info` level may be too verbose for production.

Verifying the moov atom signature at `info` level on every init segment could be noisy. Consider using `debug` level for this verification step.

```diff
-tracing::info!(atom_type = %atom_type, "verifying moov atom signature in init segment");
+tracing::debug!(atom_type = %atom_type, "verifying moov atom signature in init segment");
```
200-207: Excessive `info`-level logging in hot path.

Multiple `tracing::info!` calls occur per fragment (lines 200-207, 788, 796, 815-817). For a live video stream at 30fps with 2-second GOPs, this would produce 15+ info log lines per second. Consider demoting to `debug` level.

Reduce log verbosity in hot path:

```diff
-tracing::info!(
+tracing::debug!(
 	moof_size = moof_bytes.len(),
 	mdat_size = mdat_bytes.len(),
 	total_fragment_size = fragment.len(),
 	"processing CMAF fragment (moof+mdat)"
 );

 self.transport_fragment(fragment, moof)?;

-tracing::info!("finished processing CMAF fragment, ready for next fragment");
+tracing::debug!("finished processing CMAF fragment");
```

Similarly for lines 788, 796, 815-817 in `transport_fragment`.

Also applies to: 788-788, 796-796, 815-817
376-380: Repetitive container assignment pattern could be extracted.

The same `if passthrough_mode { Container::Cmaf } else { Container::Native }` pattern is repeated 7 times across different codec configurations. Consider extracting to a helper:

```rust
fn container_for_mode(passthrough_mode: bool) -> Container {
	if passthrough_mode {
		Container::Cmaf
	} else {
		Container::Native
	}
}
```

Then use `container: container_for_mode(passthrough_mode)` in each config.

Also applies to: 396-400, 427-431, 464-468, 508-512, 545-549, 559-563
js/hang/src/watch/video/source-mse.ts (3)
83-83: Empty event handler serves no purpose.

The `waiting` event listener has an empty callback. Either add meaningful handling or remove it.

```diff
-this.#video.addEventListener("waiting", () => {});
+// Remove if no handling needed, or add meaningful logic:
+// this.#video.addEventListener("waiting", () => {
+//	this.bufferStatus.set({ state: "empty" });
+// });
```
504-515: Redundant `endOfStream` call.

`endOfStream()` is called in the `if` block (line 510) and again in the subsequent block (line 520) when `readyState === "open"`. If the first call succeeds, the second block's condition will be false. If the first call fails (caught), the state may have changed. Consider consolidating.

Consolidate cleanup:

```diff
-if (this.#sourceBuffer && this.#mediaSource) {
-	try {
-		if (this.#sourceBuffer.updating) {
-			this.#sourceBuffer.abort();
-		}
-		if (this.#mediaSource.readyState === "open") {
-			this.#mediaSource.endOfStream();
-		}
-	} catch (error) {
-		console.error("Error closing SourceBuffer:", error);
-	}
-}
-
-if (this.#mediaSource) {
-	try {
-		if (this.#mediaSource.readyState === "open") {
-			this.#mediaSource.endOfStream();
-		}
-		URL.revokeObjectURL(this.#video?.src || "");
-	} catch (error) {
-		console.error("Error closing MediaSource:", error);
-	}
-}
+try {
+	if (this.#sourceBuffer?.updating) {
+		this.#sourceBuffer.abort();
+	}
+	if (this.#mediaSource?.readyState === "open") {
+		this.#mediaSource.endOfStream();
+	}
+} catch (error) {
+	console.error("Error closing MediaSource/SourceBuffer:", error);
+}
+
+if (this.#video?.src) {
+	URL.revokeObjectURL(this.#video.src);
+}
```
517-526: Potential issue: `revokeObjectURL` with empty string.

If `this.#video?.src` is empty or undefined, passing an empty string to `URL.revokeObjectURL("")` is a no-op but could indicate the URL was already revoked or never set properly.

```diff
-URL.revokeObjectURL(this.#video?.src || "");
+if (this.#video?.src) {
+	URL.revokeObjectURL(this.#video.src);
+}
```
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (27)
- js/hang/src/catalog/audio.ts
- js/hang/src/catalog/container.ts
- js/hang/src/catalog/video.ts
- js/hang/src/container/codec.ts
- js/hang/src/frame.ts
- js/hang/src/util/mime.ts
- js/hang/src/watch/audio/emitter.ts
- js/hang/src/watch/audio/source-mse.ts
- js/hang/src/watch/audio/source.ts
- js/hang/src/watch/broadcast.ts
- js/hang/src/watch/video/source-mse.ts
- js/hang/src/watch/video/source.ts
- justfile
- rs/hang-cli/src/publish.rs
- rs/hang/examples/video.rs
- rs/hang/src/catalog/audio/mod.rs
- rs/hang/src/catalog/container.rs
- rs/hang/src/catalog/mod.rs
- rs/hang/src/catalog/root.rs
- rs/hang/src/catalog/video/mod.rs
- rs/hang/src/import/aac.rs
- rs/hang/src/import/avc3.rs
- rs/hang/src/import/fmp4.rs
- rs/hang/src/import/hev1.rs
- rs/hang/src/import/hls.rs
- rs/hang/src/import/opus.rs
- rs/hang/src/model/track.rs
🧰 Additional context used
📓 Path-based instructions (1)
**/*.rs
📄 CodeRabbit inference engine (CLAUDE.md)
In Rust crates, tests should be integrated within source files using inline test modules
Files:
- rs/hang/src/catalog/mod.rs
- rs/hang/src/import/hev1.rs
- rs/hang-cli/src/publish.rs
- rs/hang/src/import/opus.rs
- rs/hang/src/catalog/container.rs
- rs/hang/examples/video.rs
- rs/hang/src/import/hls.rs
- rs/hang/src/model/track.rs
- rs/hang/src/catalog/video/mod.rs
- rs/hang/src/import/avc3.rs
- rs/hang/src/catalog/audio/mod.rs
- rs/hang/src/import/aac.rs
- rs/hang/src/catalog/root.rs
- rs/hang/src/import/fmp4.rs
🧬 Code graph analysis (15)
rs/hang/src/import/opus.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
rs/hang/src/catalog/container.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
js/hang/src/util/mime.ts (1)
js/hang/src/catalog/video.ts (1)
VideoConfig(101-101)
js/hang/src/catalog/container.ts (1)
js/hang/src/catalog/index.ts (1)
DEFAULT_CONTAINER(5-5)
js/hang/src/watch/audio/source-mse.ts (2)
js/hang/src/watch/video/source-mse.ts (2)
SourceMSE (30-540), fragments (256-270)
js/signals/src/index.ts (1)
Getter(13-22)
rs/hang/src/catalog/video/mod.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
rs/hang/src/import/avc3.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
js/hang/src/watch/video/source.ts (2)
js/hang/src/catalog/video.ts (1)
VideoConfig (101-101)
js/hang/src/watch/video/source-mse.ts (2)
SourceMSE (30-540), stats (537-539)
rs/hang/src/catalog/audio/mod.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
js/hang/src/frame.ts (2)
rs/hang/src/catalog/root.rs (1)
next (250-270)
rs/moq-relay/src/web.rs (1)
next(304-317)
rs/hang/src/import/aac.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
js/hang/src/watch/video/source-mse.ts (4)
js/hang-ui/src/Components/stats/types.ts (2)
BufferStatus (29-31), SyncStatus (21-24)
js/hang/src/watch/video/source.ts (9)
VideoStats (41-45), effect (135-158), effect (160-176), effect (178-205), effect (207-214), effect (216-270), effect (272-413), effect (451-471), effect (473-483)
js/hang/src/watch/audio/source-mse.ts (2)
SourceMSE (17-377), fragments (125-139)
js/hang/src/publish/priority.ts (1)
PRIORITY(3-11)
js/hang/src/watch/audio/emitter.ts (1)
js/hang/src/watch/audio/source.ts (5)
effect (92-146), effect (148-158), effect (160-192), effect (194-236), effect (238-296)
rs/hang/src/catalog/root.rs (5)
js/hang/src/publish/broadcast.ts (1)
track (120-139)
js/hang/src/watch/broadcast.ts (1)
catalog (147-164)
js/hang/src/catalog/audio.ts (1)
AudioConfig (60-60)
js/hang/src/catalog/container.ts (1)
Container (12-12)
js/hang/src/catalog/video.ts (1)
VideoConfig(101-101)
rs/hang/src/import/fmp4.rs (4)
js/hang/src/catalog/container.ts (1)
Container (12-12)
rs/hang/src/import/aac.rs (2)
decode (127-146), new (15-21)
rs/hang/src/import/opus.rs (1)
decode (73-92)
js/hang/src/frame.ts (3)
decode (41-45), decode (240-275), Frame (12-17)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Check
🔇 Additional comments (31)
rs/hang/src/model/track.rs (1)
57-69: Fix the assumption: dropping the group closes the watch channel, preventing late subscribers from receiving the init segment.

When `drop(group)` is called, the `watch::Sender<GroupState>` is dropped, which closes the underlying channel. Late subscribers that call `consume()` will have an already-closed receiver, and any consumer attempting to read will encounter `Error::Cancel` (line 212-213 of `group.rs`) instead of gracefully reading existing frames.

To keep the init segment group truly open for new subscribers, explicitly call `group.close()` to set `state.closed = Some(Ok(()))`, allowing consumers to gracefully finish reading and get `Ok(None)` rather than `Error::Cancel`. Dropping without explicit closure defeats the intent.

js/hang/src/catalog/video.ts (1)
17-17: LGTM: Documentation update aligns with container refactoring. The comment correctly reflects the new default container value.

js/hang/src/catalog/audio.ts (1)

18-18: LGTM: Documentation update aligns with container refactoring. Consistent with the video.ts changes.
js/hang/src/container/codec.ts (4)
14-21: Case label updated correctly. The "legacy" → "native" rename is consistent with the container enum changes.

37-52: Case label updated; verify VarInt usage for CMAF decode. The "legacy" → "native" rename is consistent. However, the CMAF decode logic uses VarInt, which differs from the U64 encoding. See concerns raised in the encodeTimestamp review.

65-70: Case label updated correctly. The "legacy" → "native" rename and CMAF size (8 bytes) are consistent with VarInt maximum size.
18-21: The CMAF encode/decode asymmetry is intentional and safe.

The code comment (lines 46-49) clarifies the design: CMAF is used exclusively for receiving frames via Media Source Extensions (MSE). The server sends VarInt-encoded timestamp headers, which the client correctly decodes using `decodeVarInt`. The client never encodes with CMAF—it uses the default "native" container format for publishing (see `js/hang/src/publish/video/encoder.ts` and `js/hang/src/publish/audio/encoder.ts`). `Frame.Consumer` decodes CMAF data received from the server, while `Frame.encode` is never called with `container="cmaf"`. The actual media timestamp is extracted by MSE from the moof atom, so the timestamp header asymmetry poses no practical issue.

Minor clarification needed: The `getTimestampSize` comment for CMAF (line 70) states "VarInt maximum size" but should reflect that encoding produces fixed 8-byte output.

js/hang/src/frame.ts (4)
118-120: LGTM: Improved control flow clarity. Converting the single-line early exit to an explicit block improves readability and debugging.

152-154: LGTM: Consistent guard pattern. The explicit block aligns with the pattern established on lines 118-120.

248-250: LGTM: Explicit return improves clarity. Wrapping the conditional return in a block is consistent with the other control flow improvements in this file.

265-267: LGTM: Explicit promise resolution. Converting the concise arrow to a block statement improves consistency with the rest of the file.
js/hang/src/catalog/container.ts (1)
10-10: This review comment is incorrect. The `container.ts` file is newly created, not modified from a previous version containing `["legacy", "raw", "fmp4"]`. Git history confirms it was added as a new file with the current enum values `["native", "raw", "cmaf"]`.

Additionally, no references to `"legacy"` or `"fmp4"` as container enum values exist in the codebase. The code already implements backward compatibility through the `.default(DEFAULT_CONTAINER)` pattern where `DEFAULT_CONTAINER = "native"`, ensuring existing catalogs that lack a container field will work without validation failures.

Likely an incorrect or invalid review comment.
js/hang/src/watch/audio/emitter.ts (3)
48-53: LGTM! The enabled state is now correctly computed from both `paused` and `muted` signals, and the logic correctly sets `enabled = !paused && !muted` for the source.

61-96: MSE path handling is well-structured. The bifurcation between MSE and WebCodecs paths is clean. The early return at line 95 correctly prevents the WebCodecs path setup when MSE is active. The direct HTMLAudioElement control for volume, mute, and play/pause is appropriate for MSE-based playback.

118-133: LGTM! The guard at line 121 correctly skips gain transitions when no GainNode exists (MSE path), preventing errors when the WebCodecs path isn't active.
rs/hang/src/import/hev1.rs (1)
65-65: LGTM! Setting `container: Native` is correct for the H.265 inline SPS/PPS import path. This aligns with the container format documentation where `native` indicates QUIC VarInt + raw frames.

rs/hang/src/import/avc3.rs (1)
65-65: LGTM! Setting `container: Native` is correct for the H.264 inline SPS/PPS import path, consistent with the H.265 handler.

rs/hang/src/catalog/mod.rs (1)
10-10: LGTM! The new `container` module is properly declared and re-exported, following the established pattern for other catalog submodules.

Also applies to: 19-19
rs/hang/src/import/aac.rs (1)
110-110: LGTM! Setting `container: Native` is correct for the AAC audio import path, consistent with the video codec handlers.

js/hang/src/watch/broadcast.ts (1)
67-72: No action needed. The `setAudioSync` method is explicitly typed to accept `HTMLAudioElement | undefined` and correctly handles undefined values by storing the value and resetting the sync timer. The code is working as designed.

Likely an incorrect or invalid review comment.
rs/hang/examples/video.rs (1)
73-73: LGTM! The container field is correctly initialized with
Container::Native, demonstrating proper usage of the new field in video configuration.rs/hang/src/import/opus.rs (1)
56-56: LGTM!The container field is correctly set to
Container::Native, consistent with the native frame encoding used by the Opus importer.rs/hang/src/catalog/root.rs (1)
209-222: LGTM!The logging additions provide useful visibility into container field values during catalog publishing, which will be helpful for debugging container-aware handling.
rs/hang/src/import/hls.rs (1)
526-544: Passthrough propagation to per-track importers looks good.js/hang/src/watch/video/source.ts (1)
27-33: `RequiredDecoderConfig` change and "strip codedWidth/Height to avoid reloads" looks reasonable.

Also applies to: 173-176
js/hang/src/watch/audio/source-mse.ts (1)
343-376: Cleanup looks solid (abort SourceBuffer, end MediaSource, remove element).

rs/hang/src/import/fmp4.rs (2)
234-341: LGTM: Init segment handling logic is well-structured. The init method correctly:
- Captures passthrough mode before locking catalog
- Creates tracks before sending init segments
- Assembles ftyp+moov correctly
- Stores init segment for keyframe distribution
- Sends to all tracks with appropriate logging
780-818: Each track receives the complete moof+mdat containing all tracks' data.

In passthrough mode, the same `fragment` bytes (the complete moof+mdat) are written to every track in the moof via the loop at line 728. This means each track's stream contains the full fragment with data for all tracks, not just per-track data.

This appears to be the intended design of `transport_fragment()` (line 710), which is documented as transporting "complete CMAF fragments directly without decomposing." However, verify that sending full multi-track moof+mdat to individual track SourceBuffers is the correct MSE usage pattern, particularly whether the consumer SourceBuffer correctly parses and uses only its track's data from the multi-track moof.

js/hang/src/watch/video/source-mse.ts (3)
188-237: LGTM: Frame capture implementation is solid. Good use of `requestVideoFrameCallback` with `requestAnimationFrame` fallback. Proper VideoFrame timestamp conversion and cleanup of previous frames via `update` callback.

399-425: LGTM: Init segment detection is robust. The `isInitSegmentData` function correctly parses MP4 box structure and handles both ftyp+moov and moov-only init segments. Proper bounds checking prevents out-of-bounds reads.

256-270: LGTM: Fragment concatenation is efficient. Correctly handles single-fragment optimization and properly calculates total size for multi-fragment concatenation.
| * - "native": Uses QUIC VarInt encoding (1-8 bytes, variable length) | ||
| * - "raw": Uses fixed u64 encoding (8 bytes, big-endian) | ||
| * - "fmp4": Fragmented MP4 container (future) | ||
| * - "cmaf": Fragmented MP4 container (future) |
Update stale comment about CMAF availability.
The comment indicates CMAF is "(future)", but the implementation in js/hang/src/container/codec.ts (lines 18-21, 45-52) and usage in MSE source files show it's already active.
📝 Suggested fix
-- "cmaf": Fragmented MP4 container (future)
+- "cmaf": Fragmented MP4 container with moof+mdat fragments📝 Committable suggestion
```ts
export function isMimeTypeSupported(mimeType: string): boolean {
	return MediaSource.isTypeSupported(mimeType);
}
```
Guard MediaSource access to prevent crashes in non-browser contexts.
The function directly accesses MediaSource.isTypeSupported() without checking if MediaSource exists. Since this utility is publicly exported and could be imported in non-browser environments (Node.js, tests, SSR), it will throw ReferenceError: MediaSource is not defined. The codebase already uses this guard pattern extensively for browser APIs (e.g., WebTransport, AudioContext, CaptureController).
Proposed fix
export function isMimeTypeSupported(mimeType: string): boolean {
- return MediaSource.isTypeSupported(mimeType);
+ return typeof MediaSource !== "undefined" && MediaSource.isTypeSupported(mimeType);
}
```ts
// Try to play if we have metadata (HAVE_METADATA = 1), browser will start when ready
if (mseAudio.readyState >= HTMLMediaElement.HAVE_METADATA) {
	tryPlay();
} else {
	// Wait for loadedmetadata event if not ready yet
	mseAudio.addEventListener("loadedmetadata", tryPlay, { once: true });
}
```
Potential memory leak: event listener not cleaned up.
When mseAudio.readyState < HAVE_METADATA, a loadedmetadata event listener is added with { once: true }. However, if the effect is cleaned up before the event fires (e.g., the component unmounts or the effect re-runs), the listener remains attached to the element.
Consider cleaning up the event listener in the effect's cleanup phase:
Proposed fix
// Try to play if we have metadata (HAVE_METADATA = 1), browser will start when ready
if (mseAudio.readyState >= HTMLMediaElement.HAVE_METADATA) {
tryPlay();
} else {
// Wait for loadedmetadata event if not ready yet
mseAudio.addEventListener("loadedmetadata", tryPlay, { once: true });
+ effect.cleanup(() => mseAudio.removeEventListener("loadedmetadata", tryPlay));
}

Committable suggestion skipped: line range outside the PR's diff.
```ts
async appendFragment(fragment: Uint8Array): Promise<void> {
	if (!this.#sourceBuffer || !this.#mediaSource) {
		throw new Error("SourceBuffer not initialized");
	}

	// Don't queue fragments if MediaSource is closed
	if (this.#mediaSource.readyState === "closed") {
		return;
	}

	if (this.#appendQueue.length >= SourceMSE.MAX_QUEUE_SIZE) {
		const discarded = this.#appendQueue.shift();
		console.warn(
			`[MSE Audio] Queue full (${SourceMSE.MAX_QUEUE_SIZE}), discarding oldest fragment (${discarded?.byteLength ?? 0} bytes)`,
		);
	}

	const copy = new Uint8Array(fragment);
	this.#appendQueue.push(copy);

	this.#processAppendQueue();
}

#concatenateFragments(fragments: Uint8Array[]): Uint8Array {
	if (fragments.length === 1) {
		return fragments[0];
	}

	const totalSize = fragments.reduce((sum, frag) => sum + frag.byteLength, 0);
	const result = new Uint8Array(totalSize);
	let offset = 0;
	for (const fragment of fragments) {
		result.set(fragment, offset);
		offset += fragment.byteLength;
	}

	return result;
}

#processAppendQueue(): void {
	if (!this.#sourceBuffer || this.#sourceBuffer.updating || this.#appendQueue.length === 0) {
		return;
	}

	if (this.#mediaSource?.readyState !== "open") {
		return;
	}

	const fragment = this.#appendQueue.shift();
	if (!fragment) return;

	try {
		// appendBuffer accepts BufferSource (ArrayBuffer or ArrayBufferView)
		this.#sourceBuffer.appendBuffer(fragment as BufferSource);

		this.#stats.update((current) => ({
			bytesReceived: (current?.bytesReceived ?? 0) + fragment.byteLength,
		}));
	} catch (error) {
		console.error("[MSE Audio] Error appending fragment:", error);
	}
```
Append queue silently drops fragments when full; implement backpressure to prevent data loss.
The code discards the oldest fragment when the queue reaches MAX_QUEUE_SIZE (10) without pausing upstream processing. While an updateend listener exists to continue draining the queue, appendFragment() returns immediately without waiting, providing no backpressure to callers. When CMAF decoding outpaces SourceBuffer.appendBuffer, fragments are permanently lost rather than being throttled.
Make appendFragment() wait for the queue to drain below capacity before returning, or pause frame consumption at the source when the queue reaches a threshold. Consider sourceBuffer.remove() for long-running playback to avoid unbounded memory growth.
```ts
// For MSE path, expose the HTMLAudioElement for direct control
#mseAudioElement = new Signal<HTMLAudioElement | undefined>(undefined);
readonly mseAudioElement = this.#mseAudioElement as Getter<HTMLAudioElement | undefined>;
```
Clear mseAudioElement when not on the MSE path to avoid stale UI/control state.
Proposed diff
#runDecoder(effect: Effect): void {
const enabled = effect.get(this.enabled);
if (!enabled) {
return;
}
+ // Default to “no MSE element” unless we’re actively on the MSE path.
+ effect.set(this.#mseAudioElement, undefined);
const catalog = effect.get(this.catalog);

Also applies to: 92-107
```ts
#runMSEPath(effect: Effect, broadcast: Moq.Broadcast, name: string, config: RequiredDecoderConfig): void {
	console.log("[Video Stream] Subscribing to track", {
		name,
		codec: config.codec,
		container: config.container,
		width: config.codedWidth,
		height: config.codedHeight,
	});

	// Import MSE source dynamically to avoid loading if not needed
	effect.spawn(async () => {
		const { SourceMSE } = await import("./source-mse.js");
		const mseSource = new SourceMSE(this.latency);
		effect.cleanup(() => mseSource.close());

		// Forward signals using effects
		this.#signals.effect((eff) => {
			const frame = eff.get(mseSource.frame);
			eff.set(this.frame, frame);
		});

		this.#signals.effect((eff) => {
			const display = eff.get(mseSource.display);
			eff.set(this.display, display);
		});

		this.#signals.effect((eff) => {
			const status = eff.get(mseSource.bufferStatus);
			eff.set(this.bufferStatus, status, { state: "empty" });
		});

		this.#signals.effect((eff) => {
			const status = eff.get(mseSource.syncStatus);
			eff.set(this.syncStatus, status, { state: "ready" });
		});

		this.#signals.effect((eff) => {
			const stats = eff.get(mseSource.stats);
			eff.set(this.#stats, stats);
		});

		// Expose method to set audio element for synchronization
		this.setAudioSync = (audioElement: HTMLAudioElement | undefined) => {
			mseSource.setAudioSync(audioElement);
		};

		// Run MSE track
		try {
			await mseSource.runTrack(effect, broadcast, name, config);
		} catch (error) {
			console.error("MSE path error, falling back to WebCodecs:", error);
			// Fallback to WebCodecs
			this.#runWebCodecsPath(effect, broadcast, name, config);
		}
	});
}
```
MSE path forwarding registered on this.#signals likely leaks across track switches; scope it to the current run.
Proposed diff (scope forwarding)
effect.spawn(async () => {
const { SourceMSE } = await import("./source-mse.js");
const mseSource = new SourceMSE(this.latency);
effect.cleanup(() => mseSource.close());
+ const forward = new Effect();
+ effect.cleanup(() => forward.close());
+
// Forward signals using effects
- this.#signals.effect((eff) => {
+ forward.effect((eff) => {
const frame = eff.get(mseSource.frame);
eff.set(this.frame, frame);
});
- this.#signals.effect((eff) => {
+ forward.effect((eff) => {
const display = eff.get(mseSource.display);
eff.set(this.display, display);
});
- this.#signals.effect((eff) => {
+ forward.effect((eff) => {
const status = eff.get(mseSource.bufferStatus);
eff.set(this.bufferStatus, status, { state: "empty" });
});
- this.#signals.effect((eff) => {
+ forward.effect((eff) => {
const status = eff.get(mseSource.syncStatus);
eff.set(this.syncStatus, status, { state: "ready" });
});
- this.#signals.effect((eff) => {
+ forward.effect((eff) => {
const stats = eff.get(mseSource.stats);
eff.set(this.#stats, stats);
});
```rust
/// Container format for frame encoding.
/// Defaults to "native" for backward compatibility.
pub container: Container,
```
Add #[serde(default)] for backward compatibility.
The documentation states this field "Defaults to 'native' for backward compatibility," but the field lacks a #[serde(default)] attribute. Without it, deserializing older JSON catalog data that omits the container field will fail. Add the attribute to ensure seamless deserialization of legacy catalogs.
🔧 Proposed fix
// Some codecs include a description so the decoder can be initialized without extra data.
// If not provided, there may be in-band metadata (marginally higher overhead).
#[serde(default)]
#[serde_as(as = "Option<Hex>")]
pub description: Option<Bytes>,
/// Container format for frame encoding.
/// Defaults to "native" for backward compatibility.
+ #[serde(default)]
pub container: Container,
}
🤖 Prompt for AI Agents
In @rs/hang/src/catalog/audio/mod.rs around lines 66 - 68, Add #[serde(default)]
above the pub container: Container field in the struct (the Container field in
rs/hang/src/catalog/audio/mod.rs) so deserializing legacy JSON that omits
container falls back to the default; if Container does not already implement
Default, implement Default for Container (returning the "native" variant) or
provide a serde default function and reference it (e.g., #[serde(default =
"default_container")]) so the documented backward-compatible default is applied.
/// Container format for frame encoding.
/// Defaults to "native" for backward compatibility.
pub container: Container,
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Find and examine the Container enum definition
fd -t f "container.rs" rs/hang/src/catalog/ | head -20Repository: moq-dev/moq
Length of output: 87
🏁 Script executed:
#!/bin/bash
# Search for Container enum and its Default implementation
rg -n "enum Container|impl.*Default.*Container" rs/hang/src/catalog/ -A 5Repository: moq-dev/moq
Length of output: 373
🏁 Script executed:
#!/bin/bash
# Check the complete VideoConfig struct definition
rg -n "struct VideoConfig|pub container:" rs/hang/src/catalog/video/mod.rs -B 2 -A 10Repository: moq-dev/moq
Length of output: 809
🏁 Script executed:
#!/bin/bash
# Check the full Container enum definition and its derives
rg -n "enum Container" rs/hang/src/catalog/container.rs -B 5 -A 10Repository: moq-dev/moq
Length of output: 600
🏁 Script executed:
#!/bin/bash
# Verify Container has Default derive
rg -n "#\[derive" rs/hang/src/catalog/container.rs | head -5Repository: moq-dev/moq
Length of output: 134
Add #[serde(default)] to maintain backward compatibility.
The Container enum does implement Default (returning Native), but the container field in VideoConfig lacks the #[serde(default)] annotation. Without it, Serde will not use the default value during deserialization—any existing serialized catalog missing this field will fail to deserialize. Add #[serde(default)] to allow Serde to use Container::default() for missing fields:
/// Container format for frame encoding.
/// Defaults to "native" for backward compatibility.
+ #[serde(default)]
pub container: Container,
📝 Committable suggestion
- /// Container format for frame encoding.
- /// Defaults to "native" for backward compatibility.
- pub container: Container,
+ /// Container format for frame encoding.
+ /// Defaults to "native" for backward compatibility.
+ #[serde(default)]
+ pub container: Container,
🤖 Prompt for AI Agents
In @rs/hang/src/catalog/video/mod.rs around lines 114 - 117, The VideoConfig
struct's container field lacks the serde default attribute, so deserializing
older catalogs that omit container will fail; add #[serde(default)] to the
container field declaration in VideoConfig so Serde will call
Container::default() (which returns Container::Native) when the field is
missing, keeping backward compatibility with existing serialized data.
pub fn decode<T: Buf + AsRef<[u8]>>(&mut self, buf: &mut T) -> anyhow::Result<()> {
    // If passthrough mode, we need to extract raw bytes before parsing.
    let available_bytes = if self.passthrough_mode && buf.has_remaining() {
        let chunk = buf.chunk();
        Some(Bytes::copy_from_slice(chunk))
    } else {
        None
    };

    let mut cursor = std::io::Cursor::new(buf);
    let mut position = 0;
    let mut bytes_offset = 0;
Potential issue: Raw bytes capture may not work correctly with multi-chunk buffers.
The available_bytes is captured from buf.chunk() which only returns the first contiguous chunk. If the buffer has multiple chunks (common with network data), this will miss data, causing the slice operations later to fail or capture incorrect data.
Proposed fix: Capture all remaining bytes
- let available_bytes = if self.passthrough_mode && buf.has_remaining() {
- let chunk = buf.chunk();
- Some(Bytes::copy_from_slice(chunk))
+ let available_bytes = if self.passthrough_mode && buf.has_remaining() {
+ // Copy all remaining bytes, not just the first chunk
+ Some(Bytes::copy_from_slice(buf.as_ref()))
📝 Committable suggestion
- pub fn decode<T: Buf + AsRef<[u8]>>(&mut self, buf: &mut T) -> anyhow::Result<()> {
-     // If passthrough mode, we need to extract raw bytes before parsing.
-     let available_bytes = if self.passthrough_mode && buf.has_remaining() {
-         let chunk = buf.chunk();
-         Some(Bytes::copy_from_slice(chunk))
-     } else {
-         None
-     };
-     let mut cursor = std::io::Cursor::new(buf);
-     let mut position = 0;
-     let mut bytes_offset = 0;
+ pub fn decode<T: Buf + AsRef<[u8]>>(&mut self, buf: &mut T) -> anyhow::Result<()> {
+     // If passthrough mode, we need to extract raw bytes before parsing.
+     let available_bytes = if self.passthrough_mode && buf.has_remaining() {
+         // Copy all remaining bytes, not just the first chunk
+         Some(Bytes::copy_from_slice(buf.as_ref()))
+     } else {
+         None
+     };
+     let mut cursor = std::io::Cursor::new(buf);
+     let mut position = 0;
+     let mut bytes_offset = 0;
🤖 Prompt for AI Agents
In @rs/hang/src/import/fmp4.rs around lines 99 - 110, The current decode method
captures available_bytes using buf.chunk(), which only takes the first
contiguous slice and will miss data for multi-chunk buffers; update the logic in
decode (the available_bytes creation when self.passthrough_mode is true) to
collect all remaining bytes instead of just buf.chunk()—for example by reading
buf.remaining() into a new Bytes/Vec or by iterating over chunks and
concatenating them—so subsequent slice/offset math uses a complete copy of the
remaining bytes rather than a single chunk.
kixelated
left a comment
I think this needs another attempt.
- The contents of each MoQ frame should be a fMP4 fragment (aka CMAF segment). No timestamp prefix allowed.
- We should let MSE perform A/V sync.
- We shouldn't push init segments over the data tracks. Either a separate track, or reconstructed from the catalog.
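As a rough sketch of that shape on the player side, assuming the catalog carries a base64 init segment and the SourceBuffer is already open (the helper names here are illustrative, not the project's API):

// Sketch: the init segment comes from the catalog (not the data track), and every
// MoQ frame payload is appended to MSE as a bare CMAF fragment with no prefix.
async function playRendition(sb: SourceBuffer, initSegmentB64: string, fragments: AsyncIterable<Uint8Array>) {
    // Decode the catalog-provided init segment (ftyp+moov) once.
    const bin = atob(initSegmentB64);
    const init = Uint8Array.from(bin, (c) => c.charCodeAt(0));
    await append(sb, init);

    // Each payload is a complete moof+mdat fragment; MSE reads timing from
    // tfdt/trun, so the media element handles A/V sync on its own.
    for await (const fragment of fragments) {
        await append(sb, fragment);
    }
}

function append(sb: SourceBuffer, data: Uint8Array): Promise<void> {
    return new Promise((resolve, reject) => {
        sb.addEventListener("updateend", () => resolve(), { once: true });
        sb.addEventListener("error", () => reject(new Error("SourceBuffer append failed")), { once: true });
        sb.appendBuffer(data);
    });
}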
            return encodeU64(timestamp);
        case "fmp4":
            throw new Error("fmp4 container not yet implemented");
        case "cmaf": {
We shouldn't be encoding the timestamp here. The entire frame payload should be a fMP4 fragment.
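A minimal sketch of what the cmaf branch could reduce to, with types simplified and encodeU64 assumed to be the existing helper:

// Sketch: for "cmaf" the payload already is a moof+mdat fragment, so it is
// forwarded untouched; only "native" keeps the timestamp-prefixed layout.
function encodeFrame(container: "native" | "cmaf", timestamp: bigint, payload: Uint8Array): Uint8Array {
    switch (container) {
        case "native": {
            const header = encodeU64(timestamp); // assumed existing helper
            const out = new Uint8Array(header.byteLength + payload.byteLength);
            out.set(header, 0);
            out.set(payload, header.byteLength);
            return out;
        }
        case "cmaf":
            // No prefix: timing lives in the fragment's own tfdt/trun boxes.
            return payload;
        default:
            throw new Error("unsupported container");
    }
}

declare function encodeU64(value: bigint): Uint8Array; // placeholder for the real codec helper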
js/hang/src/util/mime.ts
Outdated
 * @param type - "video" or "audio"
 * @returns MIME type string (e.g., "video/mp4; codecs=\"avc1.42E01E\"")
 */
export function buildMimeType(codec: string, type: "video" | "audio"): string {
This is very MP4 specific, but it has a generic name.
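For example, a more specific name would make the MP4 assumption explicit at call sites (same signature as above; the body is an assumption about the current behaviour):

/**
 * Build an MP4/CMAF MIME string for MSE, e.g. video/mp4; codecs="avc1.42E01E".
 * Naming it buildMp4MimeType keeps the container assumption visible.
 */
export function buildMp4MimeType(codec: string, type: "video" | "audio"): string {
    return `${type}/mp4; codecs="${codec}"`;
}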
// For MSE path, expose the HTMLAudioElement for direct control
#mseAudioElement = new Signal<HTMLAudioElement | undefined>(undefined);
readonly mseAudioElement = this.#mseAudioElement as Getter<HTMLAudioElement | undefined>;
I don't think we should expose this. We should expose an AudioNode instead so it works regardless of the WebCodecs/MSE backend used.
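A sketch of how both backends could be normalised to an AudioNode using standard Web Audio APIs (the integration points are assumptions):

// Sketch: both decode paths end in an AudioNode, so volume, analysers and
// routing never need to know which backend produced the audio.
function audioNodeForMse(ctx: AudioContext, media: HTMLMediaElement): AudioNode {
    // MSE path: wrap the media element that owns the SourceBuffers.
    return ctx.createMediaElementSource(media);
}

function audioNodeForWebCodecs(worklet: AudioWorkletNode): AudioNode {
    // WebCodecs path: the existing AudioWorklet output already is an AudioNode.
    return worklet;
}

// Either way, consumers connect identically:
//   const gain = ctx.createGain();
//   node.connect(gain).connect(ctx.destination);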
js/hang/src/watch/video/source.ts
Outdated
// Expose method to set audio element for synchronization
this.setAudioSync = (audioElement: HTMLAudioElement | undefined) => {
    mseSource.setAudioSync(audioElement);
I think this approach is just wrong. There's no way you're going to get decent synchronization via currentTime
We should have a single MediaSource and add/remove audio/video SourceBuffers.
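A minimal sketch of that layout using only standard MSE APIs (the codec strings are placeholders, and wiring into the existing Source classes is left out):

// Sketch: one MediaSource drives one media element; audio and video each get
// their own SourceBuffer and the element's clock performs A/V sync for free.
async function openMediaSource(video: HTMLVideoElement) {
    const ms = new MediaSource();
    video.src = URL.createObjectURL(ms);
    await new Promise<void>((resolve) => ms.addEventListener("sourceopen", () => resolve(), { once: true }));

    // Placeholder codec strings; the real ones would come from the catalog.
    const videoBuffer = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    const audioBuffer = ms.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');

    // Disabling audio later removes only its SourceBuffer; the MediaSource and
    // the video buffer keep playing.
    const removeAudio = () => ms.removeSourceBuffer(audioBuffer);

    return { ms, videoBuffer, audioBuffer, removeAudio };
}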
}

const isMoovAtom = hasMoovAtom(frame.data);
const isInitSegment = isMoovAtom && !initSegmentReceived;
We should send init segments as a separate track, or embedded inside the catalog. Sending it at the start of each group is technically valid but goes against the spirit of them.
In fact, the description field consists of an avcC box which can be used to build a valid MP4 init segment, I believe. @littlespex would know more.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@js/hang/src/watch/source-mse.ts`:
- Around line 1155-1164: The QuotaExceededError handling in the catch block
re-queues the failed fragment with this.#videoAppendQueue.unshift(fragment)
which prioritizes stale fragments and, combined with appendVideoFragment's
overflow logic that drops oldest items with shift(), can cause live stream
staleness; change this behavior to either drop the fragment on
QuotaExceededError or implement a retry cap: track a retry count on fragment
objects (e.g., fragment.retryCount), increment it on QuotaExceededError, and
only requeue with unshift if retryCount is below a small threshold (otherwise
discard and log); update appendVideoFragment/#videoAppendQueue logic accordingly
so failing fragments do not preempt newer ones (a sketch of this retry cap follows after this list).
In `@rs/hang/src/model/track.rs`:
- Around line 89-99: Replace the fragile timestamp-as-zero check in the None if
frame.keyframe branch: stop using frame.timestamp.as_micros() == 0 to detect
init segments and instead detect init segments explicitly (either by
adding/using an explicit flag on the frame, e.g. frame.is_init_segment, or by
inspecting the frame payload for container init atoms like "ftyp"/"moov");
update the logic that sets self.is_init_segment_group and the tracing::info
messages to use that explicit check, and adjust callers that construct/forward
Frame values so they set the new is_init_segment flag (or ensure the
payload-inspection helper is used) before calling self.inner.append_group().
- Around line 62-74: The code incorrectly drops self.group for init segments
instead of properly closing or retaining it; when a GroupProducer is discarded
without calling group.close() the GroupState.closed remains None causing
GroupConsumer::next_frame() to see an error/cancel instead of Ok(None). Fix by:
when self.is_init_segment_group is true do not drop the group — either call
group.close() if you intend to end it, or keep the GroupProducer (retain the
group reference in self or another holder) to allow late subscribers to read a
cleanly closed/ended group; ensure is_init_segment_group is only cleared when
the retained/closed group state actually reflects that change. Mentioned
symbols: frame.keyframe, self.group, self.is_init_segment_group, group.close(),
GroupProducer, GroupState.closed, GroupConsumer::next_frame().
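For the QuotaExceededError item above, a hedged sketch of the retry cap; the queue and fragment shapes here are assumptions rather than the file's actual types:

// Sketch: retry a fragment at most MAX_RETRIES times on QuotaExceededError,
// then drop it so stale data never starves newer fragments in a live stream.
interface QueuedFragment {
    data: Uint8Array;
    retryCount?: number;
}

const MAX_RETRIES = 2;

function handleAppendError(queue: QueuedFragment[], fragment: QueuedFragment, error: unknown): void {
    const isQuota = error instanceof DOMException && error.name === "QuotaExceededError";
    if (!isQuota) throw error;

    const retries = (fragment.retryCount ?? 0) + 1;
    if (retries <= MAX_RETRIES) {
        // Re-queue at the front, but remember how often this fragment was tried.
        queue.unshift({ ...fragment, retryCount: retries });
    } else {
        // Give up; for live playback, fresher fragments matter more than this one.
        console.warn("dropping fragment after repeated QuotaExceededError");
    }
}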
♻️ Duplicate comments (7)
rs/hang/src/import/fmp4.rs (2)
98-105: Potential issue: Raw bytes capture may not work correctly with multi-chunk buffers.The
buf.chunk()only returns the first contiguous chunk. If the buffer has multiple chunks (common with network data), this will miss data, causing slice operations to fail or capture incorrect data.Proposed fix: Capture all remaining bytes
let available_bytes = if self.passthrough_mode && buf.has_remaining() { - let chunk = buf.chunk(); - Some(Bytes::copy_from_slice(chunk)) + // Copy all remaining bytes, not just the first chunk + Some(buf.copy_to_bytes(buf.remaining())) } else { None };
805-809: Misleading warning:moov_bytesbeingSomeis expected to indicateinit()was never called.At this point in
transport_fragment,moov_bytesshould have already been consumed byinit(). If it's stillSome, it indicatesinit()was never called, not that the init segment wasn't sent.Suggested clarification
if self.moov_bytes.is_some() { - tracing::warn!("transporting fragment but moov_bytes is still set - init segment may not have been sent"); + tracing::warn!("transporting fragment but moov_bytes is still set - init() may not have been called"); }rs/hang/src/catalog/audio/mod.rs (1)
66-68: Add#[serde(default)]for backward compatibility.The
containerfield documentation claims it "Defaults to 'native' for backward compatibility," but without#[serde(default)], deserializing older catalog JSON that omits this field will fail. TheContainerenum implementsDefault, so the attribute is all that's needed.🔧 Proposed fix
/// Container format for frame encoding. /// Defaults to "native" for backward compatibility. + #[serde(default)] pub container: Container,rs/hang/src/catalog/video/mod.rs (1)
115-117: Add#[serde(default)]for backward compatibility.Same issue as in
AudioConfig: thecontainerfield lacks#[serde(default)], which will cause deserialization failures for existing catalogs that don't include this field.🔧 Proposed fix
/// Container format for frame encoding. /// Defaults to "native" for backward compatibility. + #[serde(default)] pub container: Container,js/hang/src/watch/audio/emitter.ts (1)
86-92: Potential memory leak: event listener not cleaned up.When
readyState < HAVE_METADATA, aloadedmetadatalistener is added with{ once: true }. If the effect is cleaned up before the event fires (e.g., component unmounts), the listener remains attached.🔧 Proposed fix
if (mseAudio.readyState >= HTMLMediaElement.HAVE_METADATA) { tryPlay(); } else { // Wait for loadedmetadata event if not ready yet mseAudio.addEventListener("loadedmetadata", tryPlay, { once: true }); + effect.cleanup(() => mseAudio.removeEventListener("loadedmetadata", tryPlay)); }js/hang/src/watch/audio/source.ts (1)
255-268: Per-track forwarding should not register onthis.#signals(leaks across track switches).These
this.#signals.effect(...)calls insideeffect.spawn()will accumulate across track switches because they're registered on the long-lived#signalsinstance rather than scoped to the current run. Create a localEffectinstance for per-track forwarding.🔧 Proposed fix
effect.spawn(async () => { + // Scope forwarding effects to this specific run + const forward = new Effect(); + effect.cleanup(() => forward.close()); + // ... wait for video MSE source ... // For MSE path, audio plays through the video element - this.#signals.effect((eff) => { + forward.effect((eff) => { const latestMseSource = this.video?.mseSource ? eff.get(this.video.mseSource) : undefined; const mseSource = latestMseSource || videoMseSource; const videoElement = mseSource?.videoElement ? eff.get(mseSource.videoElement) : undefined; eff.set(this.#mseAudioElement, videoElement as HTMLAudioElement | undefined); }); // Forward stats - this.#signals.effect((eff) => { + forward.effect((eff) => { eff.set(this.#stats, { bytesReceived: 0 }); });js/hang/src/watch/video/source.ts (1)
237-270: Signal forwarding effects registered onthis.#signalswill accumulate across track switches.The forwarding effects for
frame,display,bufferStatus,syncStatus,mediaSource, andstatsare registered onthis.#signals(class-level Effect) rather than scoped to the current track'seffect. When tracks switch, these effects aren't cleaned up—they accumulate and may forward stale values from closedmseSourceinstances.Suggested fix: scope forwarding to current track
effect.spawn(async () => { const { SourceMSE } = await import("../source-mse.js"); const mseSource = new SourceMSE(this.latency); effect.cleanup(() => mseSource.close()); + const forward = new Effect(); + effect.cleanup(() => forward.close()); // Forward signals using effects - this.#signals.effect((eff) => { + forward.effect((eff) => { const frame = eff.get(mseSource.frame); eff.set(this.frame, frame); }); // ... apply same pattern to other forwarding effects
🧹 Nitpick comments (13)
js/hang/src/frame.ts (1)
33-35: Redundant ternary expression.Both branches of the ternary produce the same value (
source.byteLength), making the condition unnecessary.Suggested simplification
- const payloadSize = source instanceof Uint8Array ? source.byteLength : source.byteLength; + const payloadSize = source.byteLength;rs/hang/src/import/fmp4.rs (2)
387-411: Redundant init segment storage for combined audio/video case.When
has_video && has_audio, this block attempts to store init segments in configs that already received them in the preceding blocks (lines 355-383). Theif config.init_segment.is_none()check will always be false since the init segment was just stored above.Suggested removal of redundant block
- // If the init segment contains both tracks (e.g., from a direct fMP4 file), - // also store it in the other config type for convenience - if has_video && has_audio { - // Full init segment with both tracks - store in both configs - if let Some(video) = catalog.video.as_mut() { - for (name, config) in video.renditions.iter_mut() { - if config.init_segment.is_none() { - config.init_segment = Some(init_segment_bytes.clone()); - tracing::debug!( - video_track = %name, - "stored full init segment (with both tracks) in video config" - ); - } - } - } - if let Some(audio) = catalog.audio.as_mut() { - for (name, config) in audio.renditions.iter_mut() { - if config.init_segment.is_none() { - config.init_segment = Some(init_segment_bytes.clone()); - tracing::debug!( - audio_track = %name, - "stored full init segment (with both tracks) in audio config" - ); - } - } - } - }
888-893: Unnecessary clone of fragment bytes in loop.
fragment.clone()is called for each track in themoof.trafloop. For multi-track scenarios (audio + video), this creates unnecessary copies. SinceBytesuses reference counting, consider moving the clone outside the critical path or restructuring to avoid redundant clones.Suggested optimization
+ // Clone once per unique track that needs it, or use Arc if many tracks share data let frame = hang::Frame { timestamp, keyframe: should_be_keyframe, - payload: fragment.clone().into(), + payload: fragment.clone().into(), // Note: Bytes::clone is cheap (Arc increment) };Actually,
Bytes::clone()is already cheap (just an Arc increment), so this is acceptable. Consider adding a comment to clarify this for future readers.rs/hang/src/catalog/root.rs (1)
208-222: Consider demoting per-rendition logging todebuglevel.The
info!logging on lines 212 and 217 iterates over all video and audio renditions on every catalog publish. For broadcasts with multiple renditions, this could generate significant log noise at theinfolevel. Consider demoting todebugortraceto match the full catalog JSON logging on line 222.♻️ Proposed refactor
if let Some(video) = &self.catalog.video { for (name, config) in &video.renditions { - tracing::info!(track = name, container = ?config.container, "publishing catalog with container"); + tracing::debug!(track = name, container = ?config.container, "publishing catalog with container"); } } if let Some(audio) = &self.catalog.audio { for (name, config) in &audio.renditions { - tracing::info!(track = name, container = ?config.container, "publishing catalog with container"); + tracing::debug!(track = name, container = ?config.container, "publishing catalog with container"); } }js/hang/src/watch/audio/emitter.ts (1)
48-53: Remove unusedmutedread to avoid unnecessary re-renders.Line 50 reads
mutedviaeffect.get()but the value isn't used—enabledis computed solely from!paused. This causes the effect to re-run whenevermutedchanges, which is wasteful.♻️ Proposed fix
this.#signals.effect((effect) => { const paused = effect.get(this.paused); - const muted = effect.get(this.muted); const enabled = !paused; this.source.enabled.set(enabled); });js/hang/src/watch/audio/source.ts (3)
164-181: Remove redundant enabled checks.Lines 171-177 and 179-181 both check
!enabled && config?.container !== "cmaf". The second check is redundant after the first conditional block.♻️ Proposed fix
`#runDecoder`(effect: Effect): void { const enabled = effect.get(this.enabled); const config = effect.get(this.config); - // For CMAF, we need to add the SourceBuffer even if audio is disabled - // This ensures the MediaSource has both SourceBuffers before video starts appending - // We'll just not append audio data if disabled - if (config?.container === "cmaf") { - // Always initialize MSE for CMAF, even if disabled - // The SourceBuffer needs to be added before video starts appending - } else if (!enabled) { - // For non-CMAF, if disabled, don't initialize + // For non-CMAF containers, skip if disabled + if (!enabled && config?.container !== "cmaf") { return; } - - if (!enabled && config?.container !== "cmaf") { - return; - }
324-389: Complex retry logic is thorough but could benefit from extraction.The nested retry loops and MediaSource state checking are comprehensive for handling transient MSE states, but the complexity makes the method difficult to follow. Consider extracting the
resolveOpenMediaSourcehelper to a class method and simplifying the retry structure.
233-244: Use proper typing instead ofanyfor MSE source references.
videoMseSource(line 233) andlatestMseSource(line 283) are both typed asany, which defeats TypeScript's type safety. Import and use theSourceMSEtype from../source-mseinstead.♻️ Proposed approach
import type { SourceMSE } from "../source-mse"; // Then use: let videoMseSource: SourceMSE | undefined; let latestMseSource: SourceMSE | undefined;js/hang/src/watch/video/source.ts (1)
106-108: Consider adding a proper type formseSourceinstead ofany.Using
anyloses type safety and makes it harder to understand what methods/properties are available on the exposed MSE source. Consider defining an interface or using the actualSourceMSEtype.Suggested type improvement
+import type { SourceMSE } from "../source-mse.js"; + // Expose mseSource instance for audio to access coordination methods -#mseSource = new Signal<any>(undefined); -readonly mseSource = this.#mseSource as Getter<any>; +#mseSource = new Signal<SourceMSE | undefined>(undefined); +readonly mseSource = this.#mseSource as Getter<SourceMSE | undefined>;js/hang/src/watch/source-mse.ts (4)
158-252: The initialization polling logic is complex; consider simplifying with a promise-based wait helper.The method uses multiple nested timeout/interval patterns with duplicated state checking. This complexity makes the code hard to follow and maintain. Consider extracting the "wait for MediaSource to be open" pattern into a reusable helper.
Suggested helper extraction
async `#waitForMediaSourceOpen`(maxWaitMs = 5000, checkIntervalMs = 50): Promise<MediaSource> { const startTime = Date.now(); while (Date.now() - startTime < maxWaitMs) { const mediaSource = this.mediaSource.peek() || this.#mediaSource; if (mediaSource?.readyState === "open") { return mediaSource; } // Reset stale closed references if (mediaSource?.readyState === "closed" && this.#mediaSource === mediaSource) { this.#mediaSource = undefined; } await new Promise(resolve => setTimeout(resolve, checkIntervalMs)); } const state = this.mediaSource.peek()?.readyState || this.#mediaSource?.readyState || "not created"; throw new Error(`MediaSource not ready after ${maxWaitMs}ms (state: ${state})`); }
172-174: Arbitrary 10ms delay for signal propagation is fragile.This delay assumes signal updates propagate within 10ms, which may not hold under load. Consider using a reactive approach or explicitly waiting for the signal value.
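One event-driven alternative, sketched with the Effect/Signal helpers already used in this file (exact types are assumptions), is to resolve a promise when the signal actually carries the expected value:

// Sketch: resolve once the signal holds a value that satisfies `ready`, instead
// of sleeping a fixed 10ms and hoping propagation has finished.
function waitForSignal<T>(signal: Getter<T>, ready: (value: T) => boolean): Promise<T> {
    return new Promise((resolve) => {
        const watcher = new Effect();
        watcher.effect((eff) => {
            const value = eff.get(signal);
            if (ready(value)) {
                resolve(value);
                // Defer teardown so we do not close the effect from inside its own callback.
                queueMicrotask(() => watcher.close());
            }
        });
    });
}

// Usage sketch: await waitForSignal(this.mediaSource, (ms) => ms?.readyState === "open");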
1236-1240: Extract duplicate base64 decoding to a helper method.The same base64-to-Uint8Array conversion appears in both
runTrack(here) andrunAudioTrack(lines 1376-1380).Suggested helper
+#decodeBase64(base64: string): Uint8Array { + const binaryString = atob(base64); + const bytes = new Uint8Array(binaryString.length); + for (let i = 0; i < binaryString.length; i++) { + bytes[i] = binaryString.charCodeAt(i); + } + return bytes; +} // In runTrack: -const binaryString = atob(config.initSegment); -const fullInitSegment = new Uint8Array(binaryString.length); -for (let i = 0; i < binaryString.length; i++) { - fullInitSegment[i] = binaryString.charCodeAt(i); -} +const fullInitSegment = this.#decodeBase64(config.initSegment);
1282-1306: Consider extractingisInitSegmentDataas a class method.The nested helper function duplicates some of the atom-type checking logic. Extracting it as a private class method would improve consistency and allow reuse (e.g., the similar
hasMoovAtominrunAudioTrack).
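A sketch of what such a shared helper might look like, scanning top-level MP4 boxes with the standard 4-byte size plus 4-byte type layout:

// Sketch: walk top-level MP4 boxes and report whether the buffer looks like an
// init segment (ftyp/moov) rather than a media fragment (moof/mdat).
function isInitSegmentData(data: Uint8Array): boolean {
    const view = new DataView(data.buffer, data.byteOffset, data.byteLength);
    let offset = 0;
    while (offset + 8 <= data.byteLength) {
        const size = view.getUint32(offset); // big-endian, as MP4 requires
        const type = String.fromCharCode(data[offset + 4], data[offset + 5], data[offset + 6], data[offset + 7]);
        if (type === "ftyp" || type === "moov") return true;
        if (type === "moof" || type === "mdat") return false;
        if (size < 8) break; // 64-bit or malformed sizes are out of scope for this sketch
        offset += size;
    }
    return false;
}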
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (20)
js/hang/src/catalog/audio.tsjs/hang/src/catalog/video.tsjs/hang/src/container/codec.tsjs/hang/src/frame.tsjs/hang/src/watch/audio/emitter.tsjs/hang/src/watch/audio/source.tsjs/hang/src/watch/broadcast.tsjs/hang/src/watch/source-mse.tsjs/hang/src/watch/video/source.tsrs/hang/Cargo.tomlrs/hang/examples/video.rsrs/hang/src/catalog/audio/mod.rsrs/hang/src/catalog/root.rsrs/hang/src/catalog/video/mod.rsrs/hang/src/import/aac.rsrs/hang/src/import/avc3.rsrs/hang/src/import/fmp4.rsrs/hang/src/import/hev1.rsrs/hang/src/import/opus.rsrs/hang/src/model/track.rs
🚧 Files skipped from review as they are similar to previous changes (8)
- rs/hang/src/import/hev1.rs
- js/hang/src/catalog/video.ts
- rs/hang/src/import/aac.rs
- js/hang/src/watch/broadcast.ts
- js/hang/src/container/codec.ts
- rs/hang/examples/video.rs
- rs/hang/src/import/avc3.rs
- rs/hang/src/import/opus.rs
🧰 Additional context used
📓 Path-based instructions (3)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use
bunfor all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/frame.tsjs/hang/src/watch/audio/emitter.tsjs/hang/src/watch/audio/source.tsjs/hang/src/catalog/audio.tsjs/hang/src/watch/source-mse.tsjs/hang/src/watch/video/source.ts
rs/**/*.rs
📄 CodeRabbit inference engine (CLAUDE.md)
rs/**/*.rs: Usecargofor Rust-specific operations
Rust tests should be integrated within source files rather than in separate test files
Files:
rs/hang/src/catalog/root.rsrs/hang/src/catalog/audio/mod.rsrs/hang/src/catalog/video/mod.rsrs/hang/src/model/track.rsrs/hang/src/import/fmp4.rs
rs/hang/**/*.rs
📄 CodeRabbit inference engine (CLAUDE.md)
Media-specific encoding/decoding should be implemented in the hang layer on top of moq-lite
Files:
rs/hang/src/catalog/root.rsrs/hang/src/catalog/audio/mod.rsrs/hang/src/catalog/video/mod.rsrs/hang/src/model/track.rsrs/hang/src/import/fmp4.rs
🧠 Learnings (2)
📚 Learning: 2026-01-11T03:47:52.766Z
Learnt from: CR
Repo: moq-dev/moq PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T03:47:52.766Z
Learning: Applies to rs/hang/**/*.rs : Media-specific encoding/decoding should be implemented in the hang layer on top of moq-lite
Applied to files:
rs/hang/src/catalog/root.rsrs/hang/Cargo.tomljs/hang/src/watch/audio/source.tsjs/hang/src/watch/video/source.tsrs/hang/src/import/fmp4.rs
📚 Learning: 2026-01-11T03:47:52.766Z
Learnt from: CR
Repo: moq-dev/moq PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T03:47:52.766Z
Learning: Applies to rs/**/*.rs : Use `cargo` for Rust-specific operations
Applied to files:
rs/hang/Cargo.toml
🧬 Code graph analysis (6)
rs/hang/src/catalog/root.rs (5)
js/hang/src/publish/broadcast.ts (1)
track(120-139)js/hang/src/watch/broadcast.ts (1)
catalog(148-165)js/hang/src/catalog/audio.ts (1)
AudioConfig(66-66)js/hang/src/catalog/container.ts (1)
Container(12-12)js/hang/src/catalog/video.ts (1)
VideoConfig(107-107)
rs/hang/src/catalog/audio/mod.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
rs/hang/src/catalog/video/mod.rs (1)
js/hang/src/catalog/container.ts (1)
Container(12-12)
js/hang/src/watch/audio/emitter.ts (2)
js/hang/src/watch/audio/source.ts (5)
effect(96-150)effect(152-162)effect(164-218)effect(220-391)effect(393-451)js/hang/src/watch/video/source.ts (8)
effect(141-164)effect(166-182)effect(184-211)effect(213-220)effect(222-280)effect(282-423)effect(461-481)effect(483-493)
js/hang/src/watch/audio/source.ts (3)
js/hang/src/watch/video/source.ts (1)
Source(56-503)js/hang/src/watch/broadcast.ts (2)
catalog(148-165)Broadcast(34-177)js/hang/src/catalog/audio.ts (2)
AudioConfig(66-66)Audio(65-65)
js/hang/src/watch/source-mse.ts (2)
js/hang/src/watch/video/source.ts (9)
VideoStats(41-45)effect(141-164)effect(166-182)effect(184-211)effect(213-220)effect(222-280)effect(282-423)effect(461-481)effect(483-493)js/hang/src/publish/priority.ts (1)
PRIORITY(3-11)
🔇 Additional comments (18)
js/hang/src/catalog/audio.ts (1)
35-40: LGTM! Well-documented initSegment field addition.The new
initSegmentfield is properly typed as optional and clearly documents its purpose (base64-encoded ftyp+moov for MSE playback) and fallback behavior. This aligns well with the parallel changes in video configuration.rs/hang/Cargo.toml (1)
35-35: LGTM! Required feature addition for base64 serialization.Adding the
base64feature toserde_withis necessary to support the newinit_segmentfields that store base64-encoded fMP4 initialization data in the catalog.js/hang/src/frame.ts (1)
23-31: LGTM! Correct CMAF passthrough handling.CMAF fragments contain their own timing information (tfdt/trun boxes), so bypassing the timestamp header encoding is the correct approach for passthrough mode.
rs/hang/src/model/track.rs (1)
58-60: LGTM! Correct CMAF timestamp handling.Skipping timestamp encoding for CMAF is correct since CMAF fragments carry their own timing information in tfdt/trun boxes.
rs/hang/src/import/fmp4.rs (2)
233-259: LGTM! Well-structured init function with proper passthrough handling.The initialization flow correctly:
- Stores passthrough mode state
- Creates tracks with appropriate container types
- Propagates container configuration to TrackProducer
803-912: Good implementation of transport_fragment with proper keyframe handling.The fragment transport logic correctly:
- Validates moov presence before processing
- Determines keyframe status using the same logic as
extract()- Handles first-frame-must-be-keyframe requirement for group creation
- Manages audio keyframe timing relative to video keyframes
rs/hang/src/catalog/audio/mod.rs (1)
70-78: LGTM! Init segment field is well-designed.The
init_segmentfield correctly uses#[serde(default)]for optional deserialization andBase64encoding for serialization. The documentation clearly explains its purpose for MSE playback and the rationale for embedding in the catalog.rs/hang/src/catalog/video/mod.rs (1)
119-131: LGTM! Init segment field with good documentation.The
init_segmentfield is properly configured with#[serde(default)]andBase64serialization. The documentation helpfully notes the future optimization possibility of building init segments from the description field.rs/hang/src/catalog/root.rs (1)
288-361: LGTM! Test updates are consistent.The test correctly includes
container: "native"in the expected JSON and initializescontainer: Container::Nativeandinit_segment: Nonein the struct construction, ensuring serialization round-trip correctness.js/hang/src/watch/audio/emitter.ts (1)
61-116: LGTM! Clean MSE vs WebCodecs path separation.The bifurcation logic is well-structured: MSE path directly controls the
HTMLAudioElementfor volume/mute/pause, while WebCodecs path usesAudioWorkletwithGainNode. The early return on line 95 ensures paths are mutually exclusive.js/hang/src/watch/audio/source.ts (1)
393-451: LGTM! WebCodecs path implementation is clean.The WebCodecs path correctly initializes the
Frame.Consumerwith the container type and properly configures theAudioDecoderwith the codec description. The decode loop and stats tracking are straightforward.js/hang/src/watch/video/source.ts (2)
30-32: LGTM! Type refinement correctly separates reload-triggering config from optional dimensions.The use of
Omit+Partial<Pick<...>>properly allowscodedWidth/codedHeightto be present for logging while preventing unnecessary subscription reloads when only dimensions change.
214-220: LGTM! Clean routing based on container type.The branch logic correctly directs CMAF content to MSE and other containers (native/raw) to WebCodecs. This separation allows the appropriate decoder pipeline for each format.
js/hang/src/watch/source-mse.ts (5)
26-70: LGTM! Well-structured class with clear signal-based state management.The class properly encapsulates MSE state with appropriate signals for external observation (
frame,display,bufferStatus,syncStatus,mediaSource,videoElement) and private queues for fragment management.
87-156: LGTM! Robust MediaSource initialization with proper timeout handling.The initialization correctly:
- Creates a hidden video element for offscreen rendering
- Sets up MediaSource with sourceopen timeout (5s)
- Handles both success and error paths
- Cleans up timeout on completion
711-817: MP4 atom parsing is thorough and handles edge cases well.The implementation correctly:
- Parses ftyp and moov atoms
- Handles multiple tracks by filtering on handler type (vide/soun)
- Provides fallback handler types for compatibility
- Rebuilds moov/mvex with only the target track
Consider adding brief inline comments explaining the MP4 box structure for future maintainers.
1460-1525: LGTM! Comprehensive cleanup handling.The
close()method properly:
- Stores references before clearing fields to avoid race conditions
- Aborts updating SourceBuffers before ending the stream
- Revokes the object URL to prevent memory leaks
- Removes the video element from the DOM
- Closes the internal Effect for signal cleanup
1310-1352: LGTM! GOP-based fragment grouping is appropriate for MSE.Concatenating fragments by group before appending ensures complete GOPs are submitted to the SourceBuffer, which is the recommended pattern for MSE to avoid decoder issues with partial GOPs.
  if frame.keyframe {
      if let Some(group) = self.group.take() {
-         group.close();
+         // Don't close the init segment group - keep it open for new subscribers
+         if self.is_init_segment_group {
+             tracing::debug!("keeping init segment group open for new subscribers");
+             // Don't close it, just drop it (the group remains open)
+             drop(group);
+         } else {
+             tracing::info!(timestamp = ?frame.timestamp, "closing group and creating new one for keyframe");
+             group.close();
+         }
+         self.is_init_segment_group = false;
      }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Find GroupProducer implementation
rg -n "struct GroupProducer" --type rust
rg -n "impl.*GroupProducer" --type rust -A 3
rg -n "fn close" --type rust | grep -i groupRepository: moq-dev/moq
Length of output: 620
🏁 Script executed:
#!/bin/bash
# Check Drop implementation for GroupProducer
rg -n "impl Drop" --type rust -A 10 | head -100Repository: moq-dev/moq
Length of output: 5758
🏁 Script executed:
#!/bin/bash
# Look at the context of group usage in the track.rs file
fd "track.rs" -x cat -n {} | head -150Repository: moq-dev/moq
Length of output: 5909
🏁 Script executed:
#!/bin/bash
# Get the full GroupProducer struct definition and close method
sed -n '74,145p' rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 1692
🏁 Script executed:
#!/bin/bash
# Check if GroupProducer has a Drop impl or what happens on drop
rg -A 20 "pub struct GroupProducer" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 548
🏁 Script executed:
#!/bin/bash
# Get the close method implementation
sed -n '120,140p' rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 549
🏁 Script executed:
#!/bin/bash
# Check the full context around the group management in track.rs
sed -n '23,150p' rs/hang/src/model/track.rsRepository: moq-dev/moq
Length of output: 4015
🏁 Script executed:
#!/bin/bash
# Check if there are any other usages of group.close() or group dropping in track.rs
rg -n "group\." rs/hang/src/model/track.rs | head -30Repository: moq-dev/moq
Length of output: 721
🏁 Script executed:
#!/bin/bash
# Understand watch::Sender behavior - check if dropping it causes issues
rg -n "watch::Sender" rs/moq-lite/src/model/group.rs -B 2 -A 10Repository: moq-dev/moq
Length of output: 331
🏁 Script executed:
#!/bin/bash
# Check the GroupState definition to understand the semantics
rg -A 20 "struct GroupState" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 466
🏁 Script executed:
#!/bin/bash
# Verify: when a watch::Sender is dropped, can receivers still exist?
# Check how GroupConsumer uses the state and what happens on drop
rg -B 5 -A 15 "struct GroupConsumer" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 477
🏁 Script executed:
#!/bin/bash
# Check if there are any consumers that might be accessing the dropped group
# Search for how GroupConsumer is used and if it needs the sender to be alive
rg -n "GroupConsumer" rs/hang/src/model/track.rs -A 3 -B 3Repository: moq-dev/moq
Length of output: 860
🏁 Script executed:
#!/bin/bash
# Verify: understand watch::Sender drop semantics - check Tokio documentation or usage patterns
# Check if groups are consumed through a broadcast/watch mechanism
rg -n "\.consume()" rs/hang/src/model/track.rs -B 3 -A 5Repository: moq-dev/moq
Length of output: 458
🏁 Script executed:
#!/bin/bash
# Check how GroupConsumer handles an unclosed group
# Specifically, look at the read_frame or consumption logic
rg -B 5 -A 20 "fn read_frame" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 835
🏁 Script executed:
#!/bin/bash
# Check the full GroupConsumer implementation to see if it cares about closed state
rg -B 2 -A 30 "impl GroupConsumer" rs/moq-lite/src/model/group.rs | head -100Repository: moq-dev/moq
Length of output: 942
🏁 Script executed:
#!/bin/bash
# Look at how closed state is used - does it matter if group is never closed?
rg -n "\.closed" rs/moq-lite/src/model/group.rs -B 2 -A 2Repository: moq-dev/moq
Length of output: 753
🏁 Script executed:
#!/bin/bash
# Check if GroupProducer has a Drop impl that might close the group
rg -A 10 "impl Drop for GroupProducer" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 37
🏁 Script executed:
#!/bin/bash
# Verify: when a group is dropped without close(), what state are consumers left in?
# Check the next_frame logic to see if it handles unclosed groups
rg -B 3 -A 15 "loop {" rs/moq-lite/src/model/group.rs | grep -A 15 "state.borrow"Repository: moq-dev/moq
Length of output: 130
🏁 Script executed:
#!/bin/bash
# Look for Drop impl for GroupProducer (first search had no matches)
# This confirms GroupProducer does NOT implement Drop
rg "impl Drop" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 37
🏁 Script executed:
#!/bin/bash
# Check the next_frame logic more carefully
sed -n '165,220p' rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 1363
🏁 Script executed:
#!/bin/bash
# Understand: when watch::Sender is dropped, what happens to state.closed?
# Look at how the group is consumed and what happens if closed is None
rg -B 5 -A 10 "state.borrow_and_update" rs/moq-lite/src/model/group.rsRepository: moq-dev/moq
Length of output: 396
Group must be properly closed, not dropped, before being discarded.
When GroupProducer is dropped without calling close(), the closed field in GroupState remains None. This causes consumers reading from the group to eventually encounter state.changed().await.is_err() in GroupConsumer::next_frame(), which returns Error::Cancel instead of the proper Ok(None) termination signal. This breaks the consumer's ability to distinguish between a cleanly ended group and a forcibly cancelled stream.
If the intent is to keep the init segment group open for late subscribers, the group reference must be retained (perhaps stored elsewhere in the producer), not dropped. Simply dropping it without closing leaves the group in an incomplete state from the consumer's perspective.
Additionally, setting is_init_segment_group = false after the drop is problematic—this flag should reflect the actual state of groups that exist, not be cleared after dropping one.
🤖 Prompt for AI Agents
In `@rs/hang/src/model/track.rs` around lines 62 - 74, The code incorrectly drops
self.group for init segments instead of properly closing or retaining it; when a
GroupProducer is discarded without calling group.close() the GroupState.closed
remains None causing GroupConsumer::next_frame() to see an error/cancel instead
of Ok(None). Fix by: when self.is_init_segment_group is true do not drop the
group — either call group.close() if you intend to end it, or keep the
GroupProducer (retain the group reference in self or another holder) to allow
late subscribers to read a cleanly closed/ended group; ensure
is_init_segment_group is only cleared when the retained/closed group state
actually reflects that change. Mentioned symbols: frame.keyframe, self.group,
self.is_init_segment_group, group.close(), GroupProducer, GroupState.closed,
GroupConsumer::next_frame().
None if frame.keyframe => {
    let new_group = self.inner.append_group();
    // Log when creating a new group, especially for init segment (timestamp 0)
    if frame.timestamp.as_micros() == 0 {
        tracing::info!(timestamp = ?frame.timestamp, "creating new group for init segment (timestamp 0)");
        // Mark this as the init segment group so we can keep it open
        self.is_init_segment_group = true;
    } else {
        tracing::info!(timestamp = ?frame.timestamp, "creating new group for keyframe");
    }
    new_group
Fragile init segment detection using timestamp 0.
Detecting the init segment by checking timestamp.as_micros() == 0 is fragile. A legitimate media frame could have timestamp 0 (e.g., the first frame of a stream starting at time 0). Consider using an explicit flag or checking the payload content for ftyp/moov atoms instead.
🤖 Prompt for AI Agents
In `@rs/hang/src/model/track.rs` around lines 89 - 99, Replace the fragile
timestamp-as-zero check in the None if frame.keyframe branch: stop using
frame.timestamp.as_micros() == 0 to detect init segments and instead detect init
segments explicitly (either by adding/using an explicit flag on the frame, e.g.
frame.is_init_segment, or by inspecting the frame payload for container init
atoms like "ftyp"/"moov"); update the logic that sets self.is_init_segment_group
and the tracing::info messages to use that explicit check, and adjust callers
that construct/forward Frame values so they set the new is_init_segment flag (or
ensure the payload-inspection helper is used) before calling
self.inner.append_group().
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@js/hang/src/watch/audio/source.ts`:
- Around line 380-387: The code may call initializeAudio(config) twice when
video restarts because after initializing the original latestMseSource you then
detect a restart and call initializeAudio(config) again on verifyMseSource; fix
by aborting/cleaning up the first initialization before retrying or by
short-circuiting the second call: add an initialization cancelation or disposed
flag on the MSE instance (e.g., a cancel token or an isDisposed/isInitializing
property on the object returned/managed by latestMseSource), ensure
effect.get(video.mseSource) is compared against that token, call the instance
cleanup/dispose method (or set cancelled) on latestMseSource before calling
initializeAudio on verifyMseSource, or instead return and trigger a full MSE
re-init flow so there is only one active initializeAudio call at a time (a sketch of this follows after this list).
In `@rs/hang/src/import/fmp4.rs`:
- Around line 826-833: When computing flags for keyframe detection inside
transport_fragment() (the block using traf.trun.first(), trun.entries.first(),
entry.flags and tfhd.default_sample_flags), fall back to the `trex` box
default_sample_flags from moov.mvex before using a hard default; i.e. if
entry.flags is None use tfhd.default_sample_flags if present, otherwise look up
the matching trex (by track_id from traf.tfhd.track_id) and use
trex.default_sample_flags, and only then use a final default value. Update the
flags assignment so it checks entry.flags -> tfhd.default_sample_flags ->
moov.mvex.trex[track_id].default_sample_flags -> default.
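For the audio-source item above, a hedged sketch of the cancellation idea using a generation counter; the class and method shapes are assumptions, not the file's actual structure:

// Sketch: each (re)initialisation bumps a generation counter; an attempt that
// awaited across a video restart sees a newer generation and stops, so only one
// initializeAudio call ever owns the SourceBuffer.
class AudioMseCoordinator {
    #generation = 0;

    async initialize(mseSource: { initializeAudio(config: unknown): Promise<void> }, config: unknown): Promise<void> {
        const generation = ++this.#generation;

        await mseSource.initializeAudio(config);

        if (generation !== this.#generation) {
            // A newer call started while we were awaiting; leave its state alone.
            return;
        }
        // ...continue wiring signals for the still-current source...
    }
}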
♻️ Duplicate comments (11)
rs/hang/src/model/track.rs (2)
62-74: Dropping the init segment group doesn't preserve it for subscribers.The comment states the intent to "keep it open for new subscribers," but calling
drop(group)discards theGroupProducerreference. Without callingclose(), consumers will see an error (Error::Cancel) rather than clean termination when they finish reading. To actually preserve the init segment for late subscribers, the group reference should be retained (stored in a separate field) rather than dropped.
89-100: Init segment detection based on timestamp 0 is unreliable.Using
frame.timestamp.as_micros() == 0to identify init segments is fragile—the first media frame of a stream legitimately starts at timestamp 0. Consider checking for CMAF container type combined with payload inspection for ftyp/moov box signatures, or add an explicitis_init_segmentflag toFrame.rs/hang/src/import/fmp4.rs (2)
99-105:buf.chunk()only captures the first contiguous slice, not all remaining bytes.This issue was flagged in a previous review and remains unaddressed. When the buffer contains multiple chunks (common with network data),
buf.chunk()returns only the first chunk, causing subsequent slice operations (lines 121-122, 142-143, 171-172, 184-185) to potentially access incorrect data or fail.Proposed fix: Copy all remaining bytes
let available_bytes = if self.passthrough_mode && buf.has_remaining() { - let chunk = buf.chunk(); - Some(Bytes::copy_from_slice(chunk)) + Some(buf.copy_to_bytes(buf.remaining())) } else { None };Note: If you need to preserve the buffer for later cursor operations, you may need to copy and then reset, or work with the full byte slice via
buf.as_ref()ifT: AsRef<[u8]>is guaranteed to return contiguous data.
787-789: Warning message still references "init segment" instead ofinit()call.The previous review suggested clarifying this message since
moov_bytesbeingSomeat this point indicatesinit()was never called (which would have cleared it at line 399), not that the init segment wasn't sent.Suggested clarification
if self.moov_bytes.is_some() { - tracing::warn!("transporting fragment but moov_bytes is still set - init segment may not have been sent"); + tracing::warn!("transporting fragment but moov_bytes is still set - init() may not have been called"); }js/hang/src/watch/audio/emitter.ts (1)
85-91: Potential memory leak: event listener not cleaned up.When
mseAudio.readyState < HAVE_METADATA, aloadedmetadatalistener is added with{ once: true }. If the effect is cleaned up before the event fires (e.g., component unmounts), the listener remains attached.Proposed fix
// Try to play if we have metadata (HAVE_METADATA = 1), browser will start when ready if (mseAudio.readyState >= HTMLMediaElement.HAVE_METADATA) { tryPlay(); } else { // Wait for loadedmetadata event if not ready yet mseAudio.addEventListener("loadedmetadata", tryPlay, { once: true }); + effect.cleanup(() => mseAudio.removeEventListener("loadedmetadata", tryPlay)); }js/hang/src/watch/source-mse.ts (1)
1201-1210: QuotaExceededError retry could prioritize stale fragments over fresh ones.When
QuotaExceededErroroccurs, the fragment is re-queued at the front withunshift(). Combined with the queue overflow logic inappendVideoFragmentthat discards the oldest fragment (viashift()), this creates a scenario where old fragments may be retried while newer ones are dropped.For live streaming, consider dropping the failing fragment instead of retrying, or implementing a retry counter.
js/hang/src/watch/video/source.ts (2)
237-270: Signal forwarding effects registered onthis.#signalsmay leak across track switches.The forwarding effects are registered on
this.#signalswhich persists across track switches. When a new track starts, these effects from the old track remain active, potentially causing stale data or memory leaks.Proposed fix: scope forwarding to current effect
effect.spawn(async () => { const { SourceMSE } = await import("../source-mse.js"); const mseSource = new SourceMSE(this.latency); effect.cleanup(() => mseSource.close()); + const forward = new Effect(); + effect.cleanup(() => forward.close()); + // Forward signals using effects - this.#signals.effect((eff) => { + forward.effect((eff) => { const frame = eff.get(mseSource.frame); eff.set(this.frame, frame); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const display = eff.get(mseSource.display); eff.set(this.display, display); }); // ... apply same change to remaining forwarding effects
274-278: MSE→WebCodecs fallback for CMAF is broken—data cannot be decoded without demuxing.When the MSE path errors, the fallback calls
#runWebCodecsPathwith the sameconfigwhereconfig.container === "cmaf". The WebCodecs path passes CMAF-formatted data (moof/mdat atoms) directly toVideoDecoder.decode(), which expects raw codec frames. This will fail.Either prevent fallback for CMAF containers, or surface the error without fallback.
Proposed fix: prevent fallback for CMAF
try { await mseSource.runTrack(effect, broadcast, name, config); } catch (error) { - console.error("MSE path error, falling back to WebCodecs:", error); - // Fallback to WebCodecs - this.#runWebCodecsPath(effect, broadcast, name, config); + console.error("MSE path error:", error); + if (config.container !== "cmaf") { + console.log("Falling back to WebCodecs path"); + this.#runWebCodecsPath(effect, broadcast, name, config); + } else { + // CMAF data requires MSE - cannot fallback to WebCodecs without demuxing + throw error; + } }js/hang/src/watch/audio/source.ts (3)
46-49: StalemseAudioElementsignal when switching away from MSE path.The
#mseAudioElementsignal is never explicitly cleared when the audio source switches from MSE to WebCodecs path or when disabled. This can leave a stale reference that external consumers (like the emitter) may incorrectly use.Additionally, at line 262, an
HTMLVideoElementis cast toHTMLAudioElement. While they share volume/mute APIs, the type assertion is misleading and could cause issues if audio-specific properties are accessed.Proposed fix
`#runDecoder`(effect: Effect): void { const enabled = effect.get(this.enabled); const config = effect.get(this.config); + + // Clear MSE element when not on MSE path + if (config?.container !== "cmaf") { + effect.set(this.#mseAudioElement, undefined); + }
256-269: Per-track forwarding registered onthis.#signalsleaks across track switches.Using
this.#signals.effect()registers effects at the class level. When the track switches or MSE restarts, these effects persist and can forward stale data from old sources. The forwarding should be scoped to the current run's lifecycle.Proposed fix - scope to local effect
- this.#signals.effect((eff) => { + effect.effect((eff) => { // Get latest SourceMSE instance in case video restarted const latestMseSource = this.video?.mseSource ? eff.get(this.video.mseSource) : undefined; const mseSource = latestMseSource || videoMseSource; const videoElement = mseSource?.videoElement ? eff.get(mseSource.videoElement) : undefined; // Expose as audioElement for emitter compatibility (HTMLVideoElement works the same as HTMLAudioElement for volume/mute) eff.set(this.#mseAudioElement, videoElement as HTMLAudioElement | undefined); }); // Forward stats (audio stats are not currently tracked in unified SourceMSE, but we can add them later) // For now, just set empty stats - this.#signals.effect((eff) => { + effect.effect((eff) => { eff.set(this.#stats, { bytesReceived: 0 }); });
247-251: WebCodecs fallback is invalid for CMAF containers.When
config.container === "cmaf"and video MSE source is unavailable, falling back to#runWebCodecsPathwill fail because WebCodecsAudioDecoderexpects raw codec frames, not fMP4 fragments. TheFrame.Consumerwill still emit fMP4 fragments sinceconfig.containerremains"cmaf".This should either fail gracefully with an error message or attempt to select a non-CMAF rendition.
Proposed fix
if (!videoMseSource) { - console.error("[Audio Source] Video MSE source not available, falling back to WebCodecs"); - this.#runWebCodecsPath(effect, broadcast, name, config, catalog); + console.error("[Audio Source] Video MSE source not available for CMAF audio - cannot play audio"); + // WebCodecs cannot decode fMP4 fragments, so we cannot fallback return; }
🧹 Nitpick comments (6)
rs/hang/src/import/fmp4.rs (2)
199-206: Consider reducing log verbosity for production.
tracing::info!on every fragment may be excessive for production streams. Consider usingtracing::debug!for the per-fragment logs, or at minimum remove the redundant "finished processing" log at line 206.Suggested change
- tracing::info!( + tracing::debug!( moof_size = moof_bytes.len(), mdat_size = mdat_bytes.len(), total_fragment_size = fragment.len(), "processing CMAF fragment (moof+mdat)" ); self.transport_fragment(fragment, moof)?; - tracing::info!("finished processing CMAF fragment, ready for next fragment");
868-888: Fragment is cloned for each track in multi-track files.
fragment.clone()at line 871 and 884 inside thetrafloop means multi-track fragments are cloned unnecessarily. For typical HLS streams (one track per init), this is fine, but for direct fMP4 files with multiple tracks, consider cloning only when needed.Optimization for multi-track files
+ let traf_count = moof.traf.len(); + let mut remaining_trafs = traf_count; // Loop over all of the traf boxes in the moof. for traf in &moof.traf { + remaining_trafs -= 1; // ... existing code ... - let frame = hang::Frame { - timestamp, - keyframe: should_be_keyframe, - payload: fragment.clone().into(), - }; + let payload = if remaining_trafs == 0 { + fragment.clone() // Last iteration, can move + } else { + fragment.clone() + }; + let frame = hang::Frame { + timestamp, + keyframe: should_be_keyframe, + payload: payload.into(), + };Actually, since we're iterating by reference and
fragmentis consumed after the loop, you could restructure to avoid clones in the last iteration. However, given HLS typically has one track per fragment, this optimization may not be worth the complexity.js/hang/src/watch/source-mse.ts (2)
175-178: Timing hack may be fragile.Using
setTimeout(resolve, 10)for signal propagation is a timing-dependent workaround. While it may work in practice, it's not guaranteed and could cause intermittent issues on slow devices or under heavy load.Consider using a more deterministic approach, such as waiting for the signal to actually change or using a proper event-based mechanism.
1283-1287: Base64 decoding could use a utility function.The manual base64 decoding with
atoband character iteration is verbose. Consider extracting to a utility function since this pattern is repeated at line 1406-1410.Suggested utility
function base64ToUint8Array(base64: string): Uint8Array { const binaryString = atob(base64); const bytes = new Uint8Array(binaryString.length); for (let i = 0; i < binaryString.length; i++) { bytes[i] = binaryString.charCodeAt(i); } return bytes; }js/hang/src/watch/audio/source.ts (2)
169-182: Redundant and confusing early-return logic.The conditional checks are duplicated and create confusion:
- Lines 172-178: Comment says initialize for CMAF even if disabled, but the
else if (!enabled)branch returns- Lines 180-182: Duplicate check of the same condition (
!enabled && config?.container !== "cmaf")The intent appears to be "always proceed for CMAF, only proceed for non-CMAF if enabled", but the implementation is convoluted.
Simplified logic
`#runDecoder`(effect: Effect): void { const enabled = effect.get(this.enabled); const config = effect.get(this.config); - // For CMAF, we need to add the SourceBuffer even if audio is disabled - // This ensures the MediaSource has both SourceBuffers before video starts appending - // We'll just not append audio data if disabled - if (config?.container === "cmaf") { - // Always initialize MSE for CMAF, even if disabled - // The SourceBuffer needs to be added before video starts appending - } else if (!enabled) { - // For non-CMAF, if disabled, don't initialize - return; - } - - if (!enabled && config?.container !== "cmaf") { - return; - } + // For non-CMAF containers, skip if disabled + // CMAF always proceeds to set up MSE coordination (actual data appending is gated later) + if (!enabled && config?.container !== "cmaf") { + return; + }
236-244: Consider extracting timeout constants for maintainability.Multiple hardcoded timeout values are scattered throughout the MSE path (50ms, 100ms, 150ms, 200ms, 2000ms, 5000ms). Extracting these as named constants would improve readability and make tuning easier.
const MSE_POLL_INTERVAL = 50 as Time.Milli; const MSE_INIT_TIMEOUT = 2000 as Time.Milli; const MSE_READY_TIMEOUT = 5000 as Time.Milli;
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
js/hang/src/frame.tsjs/hang/src/watch/audio/emitter.tsjs/hang/src/watch/audio/source.tsjs/hang/src/watch/broadcast.tsjs/hang/src/watch/source-mse.tsjs/hang/src/watch/video/source.tsrs/hang/src/import/fmp4.rsrs/hang/src/model/track.rs
🧰 Additional context used
📓 Path-based instructions (3)
js/**/*.{ts,tsx,js,jsx,json}
📄 CodeRabbit inference engine (CLAUDE.md)
Use
bunfor all TypeScript/JavaScript package management and script execution (not npm, yarn, or pnpm)
Files:
js/hang/src/watch/broadcast.tsjs/hang/src/watch/audio/source.tsjs/hang/src/watch/video/source.tsjs/hang/src/watch/source-mse.tsjs/hang/src/frame.tsjs/hang/src/watch/audio/emitter.ts
rs/**/*.rs
📄 CodeRabbit inference engine (CLAUDE.md)
rs/**/*.rs: Usecargofor Rust-specific operations
Rust tests should be integrated within source files rather than in separate test files
Files:
rs/hang/src/model/track.rsrs/hang/src/import/fmp4.rs
rs/hang/**/*.rs
📄 CodeRabbit inference engine (CLAUDE.md)
Media-specific encoding/decoding should be implemented in the hang layer on top of moq-lite
Files:
rs/hang/src/model/track.rsrs/hang/src/import/fmp4.rs
🧠 Learnings (1)
📚 Learning: 2026-01-11T03:47:52.766Z
Learnt from: CR
Repo: moq-dev/moq PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-11T03:47:52.766Z
Learning: Applies to rs/hang/**/*.rs : Media-specific encoding/decoding should be implemented in the hang layer on top of moq-lite
Applied to files:
js/hang/src/watch/audio/source.tsjs/hang/src/watch/video/source.tsrs/hang/src/import/fmp4.rs
🧬 Code graph analysis (2)
js/hang/src/watch/broadcast.ts (2)
js/hang/src/catalog/video.ts (1)
Video(106-106)js/hang/src/catalog/audio.ts (1)
Audio(65-65)
js/hang/src/watch/audio/emitter.ts (1)
js/hang/src/watch/audio/source.ts (5)
effect(97-151)effect(153-163)effect(165-219)effect(221-419)effect(421-479)
🔇 Additional comments (25)
rs/hang/src/model/track.rs (5)
27-31: LGTM!The new fields
is_init_segment_groupandcontainerare well-documented and appropriately typed for the CMAF passthrough feature.
35-43: LGTM!Constructor properly accepts and stores the container type, with correct initialization of
is_init_segment_group.
108-116: LGTM!Correctly avoids writing an empty header chunk when CMAF bypasses timestamp encoding, and properly closes the frame after writing all chunks.
132-136: LGTM!Defaulting to
Container::Nativein theFromimplementation maintains backward compatibility for existing callers.
58-60: Correct bypass of timestamp encoding for CMAF.CMAF fragments carry timing information in their moof atoms, so skipping the VarInt timestamp header is appropriate.
ContainerderivesPartialEq, confirming the!=comparison is valid.rs/hang/src/import/fmp4.rs (6)
43-65: Well-structured state management for passthrough mode.The new fields are appropriately typed and well-documented. The use of
HashMap<u32, bool>forfirst_frame_sentandOption<Bytes>for the raw byte caches is idiomatic.
72-96: LGTM!Constructor properly initializes all new fields, and
set_passthrough_modeprovides a clean API for toggling passthrough behavior.
233-409: Well-designed init segment handling for HLS per-track initialization.The logic correctly handles:
- Per-track init segments (HLS sends separate video/audio inits)
- Storing init segments only for tracks created in the current
init()call- Clearing cached bytes after storage to avoid repeated warnings
The extensive validation logging will help debug integration issues.
411-548: Container selection logic is consistent across all video codecs.The passthrough mode correctly sets
Container::Cmafwhile non-passthrough usesContainer::Native. Theinit_segment: Noneis appropriate since init segments are stored in the catalog duringinit().
550-587: LGTM!H.265 initialization follows the same passthrough pattern consistently.
589-646: LGTM!Audio initialization correctly applies the same passthrough container logic.
js/hang/src/frame.ts (2)
23-31: LGTM - CMAF passthrough encoding correctly bypasses timestamp header.The CMAF early return is correct since CMAF fragments already contain timing information in the moof atoms. The defensive copy when source is not a Uint8Array is appropriate.
125-130: Minor formatting improvements with explicit blocks.These changes add explicit braces around control flow statements, improving code clarity without changing behavior.
Also applies to: 162-164, 258-260, 275-277
js/hang/src/watch/audio/emitter.ts (2)
48-52: Enablement logic simplified correctly.Enablement now derives solely from
pausedstate, allowing audio to be downloaded even when muted (for visualizations). This aligns with the comment at line 19-20.
60-95: MSE path branching is well-structured.The early return pattern for MSE path (controlling HTMLAudioElement directly) vs WebCodecs path (using GainNode) is clean and maintainable.
js/hang/src/watch/source-mse.ts (4)
95-99: Hidden video element appended to document body.The video element is created with
display: noneand appended todocument.body. While functional, this approach has some considerations:
- The element remains in the DOM until explicitly removed
- If
close()isn't called, the element leaksThe cleanup in
close()(lines 1534-1538) does handle removal, which is good.
647-711: Frame capture implementation looks correct.The frame capture loop properly:
- Uses
requestVideoFrameCallbackwith fallback torequestAnimationFrame- Closes previous frames before updating to prevent memory leaks
- Updates stats and display dimensions appropriately
1092-1102: Track ID extraction handles both tkhd versions correctly.The code correctly handles both version 0 (32-bit fields) and version 1 (64-bit fields) of the tkhd atom, with appropriate offset calculations. The bounds check at line 1095 prevents out-of-bounds access.
1476-1541: Close method handles cleanup thoroughly.The cleanup sequence is well-ordered:
- Clears queues and resets flags
- Cancels frame callbacks
- Closes previous VideoFrame
- Aborts updating SourceBuffers
- Ends MediaSource stream and revokes object URL
- Removes video element from DOM
js/hang/src/watch/video/source.ts (2)
103-109: MSE signal exposure enables audio coordination.The new signals
#mseMediaSourceand#mseSourceproperly expose the MSE internals needed for audio to coordinate CMAF playback through a shared MediaSource.
213-220: Container-based routing is correct.The routing logic correctly directs CMAF containers to the MSE path and native/raw containers to the WebCodecs path.
js/hang/src/watch/broadcast.ts (1)
65-71: Video-first initialization enables MSE coordination.Creating video before audio allows audio to access video's MediaSource for CMAF playback where both tracks share the same MediaSource instance. The property assignment
this.audio.video = this.videoestablishes this coordination. TheAudio.Sourceclass correctly defines an optionalvideoproperty of typeVideo.Sourceto support this coordination pattern.js/hang/src/watch/audio/source.ts (3)
8-9: LGTM!Type-only imports for
SourceMSEandVideoare appropriate for the new MSE coordination feature.
107-111: LGTM!Correctly bypasses worklet creation for CMAF containers since MSE handles audio playback directly through the browser.
421-479: LGTM!The WebCodecs path is correctly implemented with proper resource cleanup and decoder management. The container default comment aligns with the backward compatibility approach.
✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.
rs/hang/src/import/fmp4.rs
Outdated
```rust
if let Some(trun) = traf.trun.first() {
    if let Some(entry) = trun.entries.first() {
        let tfhd = &traf.tfhd;
        let flags = entry.flags.unwrap_or(tfhd.default_sample_flags.unwrap_or_default());
        // https://chromium.googlesource.com/chromium/src/media/+/master/formats/mp4/track_run_iterator.cc#177
        let keyframe_flag = (flags >> 24) & 0x3 == 0x2; // kSampleDependsOnNoOther
        let non_sync = (flags >> 16) & 0x1 == 0x1; // kSampleIsNonSyncSample
        is_keyframe = keyframe_flag && !non_sync;
```
Missing trex default sample flags lookup.
The extract() method (line 678) retrieves default_sample_flags from the trex box in moov.mvex, but transport_fragment() falls back directly to unwrap_or_default(). This inconsistency could cause incorrect keyframe detection for files that rely on trex defaults.
Proposed fix: Add trex lookup
+ let trex = moov
+ .mvex
+ .as_ref()
+ .and_then(|mvex| mvex.trex.iter().find(|trex| trex.track_id == track_id));
+ let default_sample_flags = trex.map(|trex| trex.default_sample_flags).unwrap_or_default();
+
// Determine keyframe status (reuse logic from extract())
let is_keyframe = if trak.mdia.hdlr.handler == b"vide".into() {
// For video, check sample flags in trun entries
let mut is_keyframe = false;
if let Some(trun) = traf.trun.first() {
if let Some(entry) = trun.entries.first() {
let tfhd = &traf.tfhd;
- let flags = entry.flags.unwrap_or(tfhd.default_sample_flags.unwrap_or_default());
+ let flags = entry.flags.unwrap_or(tfhd.default_sample_flags.unwrap_or(default_sample_flags));🤖 Prompt for AI Agents
In `@rs/hang/src/import/fmp4.rs` around lines 826 - 833, When computing flags for
keyframe detection inside transport_fragment() (the block using
traf.trun.first(), trun.entries.first(), entry.flags and
tfhd.default_sample_flags), fall back to the `trex` box default_sample_flags
from moov.mvex before using a hard default; i.e. if entry.flags is None use
tfhd.default_sample_flags if present, otherwise look up the matching trex (by
track_id from traf.tfhd.track_id) and use trex.default_sample_flags, and only
then use a final default value. Update the flags assignment so it checks
entry.flags -> tfhd.default_sample_flags ->
moov.mvex.trex[track_id].default_sample_flags -> default.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@js/hang/src/container/codec.ts`:
- Around line 58-66: getTimestampSize currently returns 8 for the "cmaf"
container which causes callers that preallocate buffers to insert eight zero
bytes even though encodeTimestamp produces an empty array for CMAF; change the
"cmaf" branch in getTimestampSize (function name: getTimestampSize, file:
codec.ts) to return 0 and update the inline comment to note CMAF uses no
timestamp padding so callers won't preallocate padding bytes.
In `@js/hang/src/watch/audio/source.ts`:
- Around line 288-294: The comment says to "Always initialize" the audio
SourceBuffer, but the early return caused by checking isEnabled via
effect.get(this.enabled) prevents that; change the logic so initializeAudio (and
any pre-adding of the SourceBuffer) runs regardless of isEnabled, and only skip
subscribing to track updates when disabled. Specifically, relocate or remove the
early return around the isEnabled check in the block containing
effect.get(this.enabled) so that initializeAudio (and SourceBuffer setup) is
called unconditionally, while keeping the subscription logic conditional, or
update the comment to reflect that initialization is conditional if you intend
to keep the early return.
♻️ Duplicate comments (11)
rs/hang/src/import/fmp4.rs (3)
98-105: Passthrough raw-byte capture drops multi-chunk buffers.
Buf::chunk()only returns the first contiguous chunk; if the buffer is chained, raw fragment slices can be incomplete and later offsets can fail.Does bytes::Buf::chunk() return only the first contiguous chunk of remaining data?🐛 Proposed fix
- let available_bytes = if self.passthrough_mode && buf.has_remaining() { - let chunk = buf.chunk(); - Some(Bytes::copy_from_slice(chunk)) + let available_bytes = if self.passthrough_mode && buf.has_remaining() { + Some(Bytes::copy_from_slice(buf.as_ref())) } else { None };
724-729: Clarify the moov_bytes warning.At this point it indicates
init()was not called, not just that the init segment wasn’t sent.✏️ Suggested clarification
- if self.moov_bytes.is_some() { - tracing::warn!("transporting fragment but moov_bytes is still set - init segment may not have been sent"); - } + if self.moov_bytes.is_some() { + tracing::warn!("transporting fragment but moov_bytes is still set - init() may not have been called"); + }
758-767: Keyframe detection should fall back totrexdefaults.
extract()usestrexdefaults;transport_fragment()should mirror that to avoid misclassifying keyframes whentfhdomits flags.In ISOBMFF, when trun entry flags and tfhd defaults are absent, should trex default_sample_flags be used?🐛 Proposed fix
+ let trex = moov + .mvex + .as_ref() + .and_then(|mvex| mvex.trex.iter().find(|trex| trex.track_id == track_id)); + let default_sample_flags = trex.map(|trex| trex.default_sample_flags).unwrap_or_default(); + // Determine keyframe status (reuse logic from extract()) let is_keyframe = if trak.mdia.hdlr.handler == b"vide".into() { // For video, check sample flags in trun entries let mut is_keyframe = false; if let Some(trun) = traf.trun.first() && let Some(entry) = trun.entries.first() { let tfhd = &traf.tfhd; - let flags = entry.flags.unwrap_or(tfhd.default_sample_flags.unwrap_or_default()); + let flags = entry.flags.unwrap_or(tfhd.default_sample_flags.unwrap_or(default_sample_flags));rs/hang/src/model/track.rs (2)
62-74: Group dropped withoutclose()leaves consumers in an incomplete state.When
is_init_segment_groupis true, the group is dropped without callingclose(). This leavesGroupState.closedasNone, causingGroupConsumer::next_frame()to returnError::Cancelinstead ofOk(None)when the watch sender is dropped. This breaks the consumer's ability to distinguish between a cleanly ended group and a forcibly cancelled stream.If the intent is to keep the init segment group available for late subscribers, the
GroupProducerreference must be retained (stored elsewhere), not dropped. If the group should be ended, callgroup.close()explicitly.
89-99: Fragile init segment detection using timestamp 0.Detecting the init segment by checking
frame.timestamp.as_micros() == 0is fragile. A legitimate media frame could have timestamp 0 (e.g., the first frame of a stream starting at time 0). Consider using an explicit flag on the frame or inspecting the payload for ftyp/moov atoms to reliably identify init segments.js/hang/src/watch/source-mse.ts (1)
613-620: QuotaExceededError retry re-queues stale fragment at front.When
QuotaExceededErroroccurs,unshift(fragment)re-queues the fragment at the front. Combined with the overflow logic that discards the oldest fragment viashift(), this can prioritize stale fragments over fresh ones in live streaming scenarios. For live content, consider dropping the failing fragment instead of retrying, or implementing a retry count limit.js/hang/src/watch/video/source.ts (2)
225-255: Scope MSE signal forwarding to the track lifecycle.
Usingthis.#signals.effecthere will keep forwarders alive across track switches and can leak/overwrite state after teardown. Use a per-runEffect(or dispose the returned unsubscribers) so these forwarders are cleaned up with the track.♻️ Suggested fix (scope forwarders to the run)
effect.spawn(async () => { const { SourceMSE } = await import("../source-mse.js"); const mseSource = new SourceMSE(this.latency); effect.cleanup(() => mseSource.close()); + const forward = new Effect(); + effect.cleanup(() => forward.close()); // Forward signals using effects - this.#signals.effect((eff) => { + forward.effect((eff) => { const frame = eff.get(mseSource.frame); eff.set(this.frame, frame); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const display = eff.get(mseSource.display); eff.set(this.display, display); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const status = eff.get(mseSource.bufferStatus); eff.set(this.bufferStatus, status, { state: "empty" }); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const status = eff.get(mseSource.syncStatus); eff.set(this.syncStatus, status, { state: "ready" }); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const mediaSource = eff.get(mseSource.mediaSource); eff.set(this.#mseMediaSource, mediaSource); }); - this.#signals.effect((eff) => { + forward.effect((eff) => { const stats = eff.get(mseSource.stats); eff.set(this.#stats, stats); });
256-263: Avoid CMAF → WebCodecs fallback unless demuxing is implemented.
#runMSEPathis only entered forcontainer === "cmaf". Falling back to WebCodecs with CMAF fragments will fail unless you demux to raw frames first. Either disable fallback for CMAF or add a demux step before#runWebCodecsPath.🐛 Suggested guard
} catch (error) { console.error("MSE path error, falling back to WebCodecs:", error); // Fallback to WebCodecs this.#mseSource.set(undefined); - this.#runWebCodecsPath(effect, broadcast, name, config); + if (config.container !== "cmaf") { + this.#runWebCodecsPath(effect, broadcast, name, config); + } }Does WebCodecs VideoDecoder.decode accept fragmented MP4 (moof+mdat), or does it require raw codec frames (Annex B / OBU)?js/hang/src/watch/audio/source.ts (3)
46-49: ClearmseAudioElementwhen leaving the MSE path.
Without an explicit reset, UI/control state can keep a stale element after switching to WebCodecs or disabling audio.♻️ Suggested reset (example in `#runDecoder`)
`#runDecoder`(effect: Effect): void { const enabled = effect.get(this.enabled); const config = effect.get(this.config); + if (config?.container !== "cmaf") { + effect.set(this.#mseAudioElement, undefined); + }
271-274: CMAF → WebCodecs fallback is likely invalid here too.
Ifcontainer === "cmaf"and the MSE source is unavailable, falling back to WebCodecs will still deliver fMP4 fragments. Unless you demux, AudioDecoder won’t accept them.🐛 Suggested guard
if (!videoMseSource) { console.error("[Audio Source] Video MSE source not available, falling back to WebCodecs"); - this.#runWebCodecsPath(effect, broadcast, name, config, catalog); + if (config.container !== "cmaf") { + this.#runWebCodecsPath(effect, broadcast, name, config, catalog); + } return; }Does WebCodecs AudioDecoder.decode accept fragmented MP4 (moof+mdat), or does it require raw codec frames?
277-285: Scope MSE forwarders to the run to avoid leaks.
this.#signals.effecthere persists across track switches. Use a per-runEffect(or dispose unsubscribers) so the forwarded audio element and stats don’t outlive the track.♻️ Suggested fix (scope forwarders)
effect.spawn(async () => { // ... + const forward = new Effect(); + effect.cleanup(() => forward.close()); // Expose video element as "audioElement" for compatibility with emitter - this.#signals.effect((eff) => { + forward.effect((eff) => { const videoElement = videoMseSource.videoElement ? eff.get(videoMseSource.videoElement) : undefined; eff.set(this.#mseAudioElement, videoElement as HTMLAudioElement | undefined); }); // Forward stats - this.#signals.effect((eff) => { + forward.effect((eff) => { eff.set(this.#stats, { bytesReceived: 0 }); });
🧹 Nitpick comments (4)
js/hang/src/watch/source-mse.ts (3)
145-146: Arbitrary delay for signal propagation is a code smell.Using
setTimeout(resolve, 10)to "wait for signal propagation" is fragile and could introduce race conditions. If the signal system requires time to propagate, consider using a proper synchronization mechanism or documenting why this specific delay is needed.
664-669: Consider documenting or making the audio buffer wait timeout configurable.The loop waits up to 1000ms (10 iterations × 100ms) for the audio SourceBuffer to be ready. This magic number could be extracted to a constant or made configurable to aid debugging and tuning.
805-814: URL revocation may pass empty string.
URL.revokeObjectURL(this.#video?.src || "")will callrevokeObjectURL("")if#videoorsrcis falsy. While this shouldn't cause an error, it's cleaner to check first.Suggested fix
if (this.#mediaSource) { try { if (this.#mediaSource.readyState === "open") { this.#mediaSource.endOfStream(); } - URL.revokeObjectURL(this.#video?.src || ""); + if (this.#video?.src) { + URL.revokeObjectURL(this.#video.src); + } } catch (error) { console.error("Error closing MediaSource:", error); } }js/hang/src/watch/broadcast.ts (1)
167-207: Consider removing self-described "redundant" backup mechanism.The comments explicitly state this method is "redundant" since
audio.#runDecoderwatchesvideo.mseSourcedirectly. Keeping redundant code as a "backup mechanism" adds maintenance burden and can mask issues in the primary mechanism. If the direct watching approach is reliable, consider removing this method; otherwise, document why the backup is necessary.Additionally, the toggle pattern (
enabled.set(false)followed byqueueMicrotask(() => enabled.set(true))) is a timing-dependent workaround that could be fragile if effect processing changes.
```ts
export function getTimestampSize(container: Catalog.Container = DEFAULT_CONTAINER): number {
	switch (container) {
		case "legacy":
		case "native":
			return 8; // VarInt maximum size
		case "raw":
			return 8; // u64 fixed size
		case "fmp4":
			throw new Error("fmp4 container not yet implemented");
		case "cmaf":
			return 8; // VarInt maximum size (same as native)
	}
```
CMAF header size should be 0 to avoid padding.
encodeTimestamp returns an empty array for CMAF; if callers preallocate using getTimestampSize() == 8, they can accidentally prefix zero bytes to the fragment.
🐛 Proposed fix
- case "cmaf":
- return 8; // VarInt maximum size (same as native)
+ case "cmaf":
+ return 0; // No timestamp header for CMAF📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| export function getTimestampSize(container: Catalog.Container = DEFAULT_CONTAINER): number { | |
| switch (container) { | |
| case "legacy": | |
| case "native": | |
| return 8; // VarInt maximum size | |
| case "raw": | |
| return 8; // u64 fixed size | |
| case "fmp4": | |
| throw new Error("fmp4 container not yet implemented"); | |
| case "cmaf": | |
| return 8; // VarInt maximum size (same as native) | |
| } | |
| export function getTimestampSize(container: Catalog.Container = DEFAULT_CONTAINER): number { | |
| switch (container) { | |
| case "native": | |
| return 8; // VarInt maximum size | |
| case "raw": | |
| return 8; // u64 fixed size | |
| case "cmaf": | |
| return 0; // No timestamp header for CMAF | |
| } |
🤖 Prompt for AI Agents
In `@js/hang/src/container/codec.ts` around lines 58 - 66, getTimestampSize
currently returns 8 for the "cmaf" container which causes callers that
preallocate buffers to insert eight zero bytes even though encodeTimestamp
produces an empty array for CMAF; change the "cmaf" branch in getTimestampSize
(function name: getTimestampSize, file: codec.ts) to return 0 and update the
inline comment to note CMAF uses no timestamp padding so callers won't
preallocate padding bytes.
```ts
		// Check if audio is enabled
		const isEnabled = effect.get(this.enabled);

		// Only subscribe to track and initialize SourceBuffer if enabled
		if (!isEnabled) {
			return;
		}
```
“Always initialize” comment conflicts with the early return.
If the intent is to pre-add the audio SourceBuffer even when disabled, the if (!isEnabled) return; prevents that. Either move the enabled check after initializeAudio or update the comment/intent.
🤖 Prompt for AI Agents
In `@js/hang/src/watch/audio/source.ts` around lines 288 - 294, The comment says
to "Always initialize" the audio SourceBuffer, but the early return caused by
checking isEnabled via effect.get(this.enabled) prevents that; change the logic
so initializeAudio (and any pre-adding of the SourceBuffer) runs regardless of
isEnabled, and only skip subscribing to track updates when disabled.
Specifically, relocate or remove the early return around the isEnabled check in
the block containing effect.get(this.enabled) so that initializeAudio (and
SourceBuffer setup) is called unconditionally, while keeping the subscription
logic conditional, or update the comment to reflect that initialization is
conditional if you intend to keep the early return.
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@js/hang/src/watch/source-mse.ts`:
- Around line 420-473: The capture loop in `#startFrameCapture` (captureFrame)
currently constructs new VideoFrame on every tick which will throw repeatedly in
browsers without VideoFrame support; change the function to first detect support
(e.g., typeof VideoFrame === "function" or globalThis.VideoFrame) and if
unsupported either use a canvas/ImageBitmap fallback to capture frames or stop
attempting VideoFrame creation and set bufferStatus/state accordingly; implement
a one-time fallback flag (e.g., this.#videoFrameSupported) so you don't
retry/new VideoFrame on every frame, and ensure subsequent logic
(this.frame.update, this.#stats.update, display sizing, and scheduling via
this.#video.requestVideoFrameCallback/requestAnimationFrame) uses the chosen
safe path instead of calling new VideoFrame when unsupported.
♻️ Duplicate comments (2)
js/hang/src/util/mp4-mime.ts (1)
22-24: GuardMediaSourceaccess for non‑browser runtimes.Direct access can throw in SSR/tests. Return
falsewhen unavailable.Proposed fix
export function isMimeTypeSupported(mimeType: string): boolean { - return MediaSource.isTypeSupported(mimeType); + return ( + typeof MediaSource !== "undefined" && + typeof MediaSource.isTypeSupported === "function" && + MediaSource.isTypeSupported(mimeType) + ); }js/hang/src/watch/source-mse.ts (1)
587-621: Don’t re‑queue quota‑failed fragments ahead of newer ones.Re‑queueing with
unshiftcan stall live playback by starving newer fragments. Prefer dropping or retry‑capping.Proposed fix (drop on quota for live streams)
} catch (error) { if (error instanceof DOMException && error.name === "QuotaExceededError") { - console.warn("[MSE] QuotaExceededError - browser will manage buffer automatically"); - this.#videoAppendQueue.unshift(fragment); + console.warn("[MSE] QuotaExceededError - dropping fragment to avoid stalling live playback"); } else { console.error("[MSE] Error appending video fragment:", error); } } @@ } catch (error) { if (error instanceof DOMException && error.name === "QuotaExceededError") { - console.warn("[MSE] QuotaExceededError for audio - browser will manage buffer automatically"); - this.#audioAppendQueue.unshift(fragment); + console.warn("[MSE] QuotaExceededError for audio - dropping fragment to avoid stalling live playback"); } else { console.error("[MSE] Error appending audio fragment:", error); } }Also applies to: 623-649
```ts
#startFrameCapture(): void {
	if (!this.#video) return;

	const captureFrame = () => {
		if (!this.#video) return;

		try {
			const frame = new VideoFrame(this.#video, {
				timestamp: this.#video.currentTime * 1_000_000,
			});

			this.#stats.update((current) => ({
				frameCount: (current?.frameCount ?? 0) + 1,
				timestamp: frame.timestamp,
				bytesReceived: current?.bytesReceived ?? 0,
			}));

			this.frame.update((prev) => {
				prev?.close();
				return frame;
			});

			if (this.#video.videoWidth && this.#video.videoHeight) {
				this.display.set({
					width: this.#video.videoWidth,
					height: this.#video.videoHeight,
				});
			}

			if (this.#video.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA) {
				this.bufferStatus.set({ state: "filled" });
				if (this.#video.paused && this.#video.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA) {
					this.#video.play().catch(() => {
						// Ignore autoplay errors
					});
				}
			}
		} catch (error) {
			console.error("Error capturing frame:", error);
		}

		if (this.#video.requestVideoFrameCallback) {
			this.#frameCallbackId = this.#video.requestVideoFrameCallback(captureFrame);
		} else {
			this.#frameCallbackId = requestAnimationFrame(captureFrame) as unknown as number;
		}
	};

	if (this.#video.requestVideoFrameCallback) {
		this.#frameCallbackId = this.#video.requestVideoFrameCallback(captureFrame);
	} else {
		this.#frameCallbackId = requestAnimationFrame(captureFrame) as unknown as number;
	}
}
```
🧩 Analysis chain
🏁 Script executed:
cat js/hang/src/watch/source-mse.ts | sed -n '420,473p'
Repository: moq-dev/moq
Length of output: 1590
🏁 Script executed:
wc -l js/hang/src/watch/source-mse.ts
Repository: moq-dev/moq
Length of output: 90
🌐 Web query:
VideoFrame API browser support compatibility web availability
💡 Result:
Short answer — VideoFrame (part of WebCodecs) is widely available in modern desktop and mobile browsers but not universal:
- Desktop: supported in Chrome/Chromium (94+), Edge (94+), Firefox (≈130+), Safari (16.4+).
- Mobile: supported in recent Chrome for Android and Safari on iOS (same Safari versions), and many modern Android browsers that track Chromium.
- Requirements: WebCodecs/VideoFrame requires a secure context (HTTPS) and is available in Dedicated Web Workers in supported browsers.
- Not supported in older browsers/IE and in versions earlier than the numbers above — feature detection (e.g., "if ('VideoFrame' in window)") is recommended.
Sources: MDN documentation and Can I use support table. [1][2]
References
[1] MDN — VideoFrame (WebCodecs).
[2] Can I use — VideoFrame API (VideoFrame() / properties).
Avoid a tight error loop when VideoFrame isn't supported.
The current code will throw and catch errors from new VideoFrame(...) on every frame in unsupported browsers (prior to Chrome 94, Firefox ~130, Safari 16.4, etc.), spamming the console with error logs and incurring unnecessary overhead.
Guard against this by checking for VideoFrame availability:
Proposed fix
const captureFrame = () => {
if (!this.#video) return;
try {
- const frame = new VideoFrame(this.#video, {
- timestamp: this.#video.currentTime * 1_000_000,
- });
-
- this.#stats.update((current) => ({
- frameCount: (current?.frameCount ?? 0) + 1,
- timestamp: frame.timestamp,
- bytesReceived: current?.bytesReceived ?? 0,
- }));
-
- this.frame.update((prev) => {
- prev?.close();
- return frame;
- });
+ if (typeof VideoFrame !== "undefined") {
+ const frame = new VideoFrame(this.#video, {
+ timestamp: this.#video.currentTime * 1_000_000,
+ });
+
+ this.#stats.update((current) => ({
+ frameCount: (current?.frameCount ?? 0) + 1,
+ timestamp: frame.timestamp,
+ bytesReceived: current?.bytesReceived ?? 0,
+ }));
+
+ this.frame.update((prev) => {
+ prev?.close();
+ return frame;
+ });
+ }📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
The suggested version of `#startFrameCapture` (the unchanged original is quoted earlier in this comment):

```ts
#startFrameCapture(): void {
	if (!this.#video) return;

	const captureFrame = () => {
		if (!this.#video) return;

		try {
			if (typeof VideoFrame !== "undefined") {
				const frame = new VideoFrame(this.#video, {
					timestamp: this.#video.currentTime * 1_000_000,
				});

				this.#stats.update((current) => ({
					frameCount: (current?.frameCount ?? 0) + 1,
					timestamp: frame.timestamp,
					bytesReceived: current?.bytesReceived ?? 0,
				}));

				this.frame.update((prev) => {
					prev?.close();
					return frame;
				});
			}

			if (this.#video.videoWidth && this.#video.videoHeight) {
				this.display.set({
					width: this.#video.videoWidth,
					height: this.#video.videoHeight,
				});
			}

			if (this.#video.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA) {
				this.bufferStatus.set({ state: "filled" });
				if (this.#video.paused && this.#video.readyState >= HTMLMediaElement.HAVE_CURRENT_DATA) {
					this.#video.play().catch(() => {
						// Ignore autoplay errors
					});
				}
			}
		} catch (error) {
			console.error("Error capturing frame:", error);
		}

		if (this.#video.requestVideoFrameCallback) {
			this.#frameCallbackId = this.#video.requestVideoFrameCallback(captureFrame);
		} else {
			this.#frameCallbackId = requestAnimationFrame(captureFrame) as unknown as number;
		}
	};

	if (this.#video.requestVideoFrameCallback) {
		this.#frameCallbackId = this.#video.requestVideoFrameCallback(captureFrame);
	} else {
		this.#frameCallbackId = requestAnimationFrame(captureFrame) as unknown as number;
	}
}
```
🤖 Prompt for AI Agents
In `@js/hang/src/watch/source-mse.ts` around lines 420 - 473, The capture loop in
`#startFrameCapture` (captureFrame) currently constructs new VideoFrame on every
tick which will throw repeatedly in browsers without VideoFrame support; change
the function to first detect support (e.g., typeof VideoFrame === "function" or
globalThis.VideoFrame) and if unsupported either use a canvas/ImageBitmap
fallback to capture frames or stop attempting VideoFrame creation and set
bufferStatus/state accordingly; implement a one-time fallback flag (e.g.,
this.#videoFrameSupported) so you don't retry/new VideoFrame on every frame, and
ensure subsequent logic (this.frame.update, this.#stats.update, display sizing,
and scheduling via this.#video.requestVideoFrameCallback/requestAnimationFrame)
uses the chosen safe path instead of calling new VideoFrame when unsupported.
Add passthrough mode for CMAF/HLS import and MSE playback support
This PR adds comprehensive CMAF support with passthrough mode for ingestion and Media Source Extensions (MSE) for browser playback.
Changes
Rust/CLI Side
- `--passthrough` flag to `hang publish hls` command
- `just pub-hls` command to support passthrough mode via optional parameter

TypeScript/Browser Side
- `SourceMSE` class for both audio and video tracks
- For `cmaf`, playback automatically uses MSE instead of WebCodecs
- `setAudioSync` method, eliminating `any` type usage

Technical Details
Passthrough Mode:
The passthrough flag enables transport of complete CMAF fragments without re-encoding or decomposing them. This is particularly useful when you want to preserve the original CMAF structure.
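As a rough illustration of the framing difference (a hypothetical TypeScript helper; names and types here are assumptions, not the actual hang codec):

```ts
// Sketch only: how a frame payload differs between containers.
type Container = "native" | "raw" | "cmaf";

function buildFramePayload(container: Container, timestampHeader: Uint8Array, data: Uint8Array): Uint8Array {
	if (container === "cmaf") {
		// Passthrough: the complete moof+mdat fragment is the payload.
		// Timing already lives in the fragment's moof box, so no header is prepended.
		return data;
	}
	// native/raw: the encoded timestamp header precedes the raw frame bytes.
	const out = new Uint8Array(timestampHeader.length + data.length);
	out.set(timestampHeader, 0);
	out.set(data, timestampHeader.length);
	return out;
}
```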
Container Format:
- `native`: Uses QUIC VarInt encoding (1-8 bytes, variable length), raw frame payloads (default)
- `raw`: Uses fixed u64 encoding (8 bytes, big-endian), raw frame payloads
- `cmaf`: Fragmented MP4 container - frames contain complete moof+mdat fragments

MSE Implementation:
- `MediaSource` API for CMAF playback

Usage
```bash
# Publish HLS with passthrough mode
just pub-hls bbb passthrough

# Publish HLS without passthrough (default)
just pub-hls bbb
```
The browser will automatically use MSE for playback when the container is `cmaf`, providing more efficient decoding and lower latency compared to WebCodecs for CMAF content.
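On the playback side, the rough shape of the MSE path is to open a `MediaSource`, add a `SourceBuffer`, append the init segment once, and then append each frame as a complete moof+mdat fragment. A minimal sketch, assuming an illustrative codec string and fragment source (not the actual `SourceMSE` implementation):

```ts
async function playCmaf(video: HTMLVideoElement, init: Uint8Array, fragments: AsyncIterable<Uint8Array>) {
	const mediaSource = new MediaSource();
	video.src = URL.createObjectURL(mediaSource);
	await new Promise<void>((resolve) =>
		mediaSource.addEventListener("sourceopen", () => resolve(), { once: true }),
	);

	// One SourceBuffer per track type; the MIME/codec string would come from the catalog.
	const buffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');

	const append = (data: Uint8Array) =>
		new Promise<void>((resolve, reject) => {
			buffer.addEventListener("updateend", () => resolve(), { once: true });
			buffer.addEventListener("error", () => reject(new Error("append failed")), { once: true });
			buffer.appendBuffer(data);
		});

	await append(init); // ftyp+moov init segment first
	for await (const fragment of fragments) {
		await append(fragment); // each frame is a complete moof+mdat fragment
	}
}
```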