Subtitles

Subtitles and LIVE ingest

Important

Toggling subtitles on and off at the encoder side is not supported: streams can only be announced once and must remain the same for the duration of the event.

Unified Origin - Live supports ingest of subtitle samples stored in a fragmented MP4 container. The encoder should POST the subtitles to the publishing point, one track per language. The track language is read from the track's media header.
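
As a rough sketch of such a POST (all names are hypothetical and the exact URL layout depends on the encoder and ingest setup), a pre-fragmented English subtitle track could be pushed to the publishing point like this:

curl -X POST --data-binary @subtitles_eng.ismt "http://origin.example.com/live/channel.isml/Streams(subtitles_eng)"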

The subtitle format can be fragmented WebVTT or fragmented TTML. In the first case, fragments consist of VTTCue samples (text/wvtt), while the XML-based TTML fragments can use timing attributes that are either track-relative (subt/stpp) or fragment-relative (text/dfxp, to which the deprecated signaling text/ttml is remapped as well).
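
For the WebVTT case, the VTTCue samples carry the cue timing and payload that, written out as a plain WebVTT file, would look like this minimal (hypothetical) example:

WEBVTT

00:00:01.000 --> 00:00:03.500
First subtitle line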

For TTML, the following profiles are supported: DFXP, SMPTE-TT, EBU-TT-D, SDP-US, CFF-TT and the IMSC1 Text Profile. The timing attributes @begin and @end are expected on the sibling <p> elements under tt/body/div.
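
A minimal TTML document illustrating this structure could look as follows (a hand-written sketch; the text and timings are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<tt xmlns="http://www.w3.org/ns/ttml" xml:lang="en">
  <body>
    <div>
      <p begin="00:00:01.000" end="00:00:03.500">First subtitle line</p>
      <p begin="00:00:04.000" end="00:00:06.000">Second subtitle line</p>
    </div>
  </body>
</tt>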

Note

It is possible to POST subtitles from a source other than the encoder that POSTs the video and audio tracks, but keep in mind that the timestamps of all tracks will need to be synchronized, which is challenging. Therefore, we do not recommend this kind of setup.

Play-out formats

Depending on whether the input for Origin is fragmented TTML or fragmented WebVTT, it outputs subtitles in different formats for MPEG-DASH, Apple HLS and Microsoft Smooth Streaming. The overview in the table below applies to both Origin VOD and Origin Live:

Protocol | Fragmented TTML input                     | Fragmented WebVTT input
DASH     | Fragmented TTML + (VOD only) TTML sidecar | Fragmented WebVTT + (VOD only) WebVTT sidecar
HLS      | WebVTT Segments                           | WebVTT Segments
Smooth   | Fragmented TTML                           | N/A
HDS      | N/A                                       | N/A

Embedded subtitles (captions)

CEA-608 or CEA-708 closed captions may be embedded in the AVC video stream.

Because CEA-608 closed captions are carried within the video track, the language of the closed captions is set to the language of the video track.

When using this technique to provide the subtitle tracks, there is a limit of two language tracks. Because the tracks are embedded, they are not available for styling or manipulation by the player. Where the language tracks differ, it is possible to use the 'mul' language code to signal that multiple languages are available.

Because of these limitations, TTML- or WebVTT-based subtitles are recommended for the best cross-format compatibility and player support.

Subtitle playback

Support for playback of subtitles varies greatly from player to player. Please make sure you are using the latest version of your player and make yourself aware of its limitations regarding subtitle support.

Subtitles for MPEG-DASH

TTML-based subtitles are presented using the stpp codec, with timing attributes relative to the start of the track (usually 1970-01-01 00:00:00 UTC). The wvtt codec follows ISO/IEC 14496-30:2014, Web Video Text Tracks.
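
For illustration, a DASH MPD typically advertises such a TTML subtitle track along the following lines (a hand-written sketch, not Origin's literal output, with segment details omitted and all values hypothetical):

<AdaptationSet contentType="text" lang="en" mimeType="application/mp4" codecs="stpp">
  <Role schemeIdUri="urn:mpeg:dash:role:2011" value="subtitle"/>
  <Representation id="textstream_eng" bandwidth="1000"/>
</AdaptationSet>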

Subtitles for HTTP Live Streaming

Important

Subtitles for HLS require at least version 4 of the HLS protocol.

Make sure to set this using the --hls.client_manifest_version option:

--hls.client_manifest_version=4
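
For example, when generating the server manifest with mp4split, the option can be passed on the command line (a sketch; all filenames are hypothetical):

mp4split -o presentation.ism --hls.client_manifest_version=4 \
  video.ismv audio.isma subtitles_eng.ismt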

If both WebVTT and TTML-based subtitles are ingested for a given language, both tracks will be advertised in the master playlist. To exclude the TTML-based subtitles from the playlist, use dynamic track selection, for instance: .../.m3u8?filter=FourCC=="wvtt"||type!="textstream"
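
With the TTML track filtered out, the master playlist then advertises the WebVTT subtitles roughly as follows (an excerpt sketched by hand; names, URIs and bitrates are hypothetical):

#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="textstream",LANGUAGE="en",NAME="English",AUTOSELECT=YES,URI="channel-textstream_eng.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=1000000,CODECS="avc1.64001f,mp4a.40.2",SUBTITLES="textstream"
channel-video_1000k.m3u8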

Subtitles for HTTP Smooth Streaming

Enabling subtitles in the Silverlight player (MMP Player Framework 2.7) is done by adding the following value to the InitParams parameter:

<param name="InitParams" value="enablecaptions=true, selectedcaptionstream=textstream_eng, mediaurl=..." />

Note that the value of the selectedcaptionstream parameter is the name of the text track, as given by the @Name attribute in the Smooth Streaming client manifest.
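
In the client manifest, that name is carried on the text StreamIndex, roughly like this (a trimmed sketch; all values are hypothetical):

<StreamIndex Type="text" Name="textstream_eng" Language="eng" Url="QualityLevels({bitrate})/Fragments(textstream_eng={start time})">
  <QualityLevel Index="0" Bitrate="1000" FourCC="TTML"/>
</StreamIndex>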

Subtitles for HTTP Dynamic Streaming

Not supported.