Unified Origin supports input of fragmented TTML subtitles.
New in version 1.7.31.
From version 1.7.31, input of fragmented WebVTT subtitles is also supported.
In other words, the ingested subtitles must consist of either TTML or WebVTT samples stored in a fragmented MP4 (.ismt or .cmft) container.
You can prepare your subtitles for ingest with Unified Packager, which supports packaging of SRT, WebVTT and TTML. For TTML the profiles it supports are DFXP, SMPTE-TT, EBU-TT-D, SDP-US, CFF-TT and the IMSC1 Text Profile.
Please read Packaging Subtitles for more information on how to package and prepare subtitles.
Overall, the workflow for adding subtitles to a VOD presentation is as follows:
Package supported subtitles (SRT, WebVTT, TTML) in a fragmented MP4, as described in Packaging Subtitles
Add the fragmented MP4 that contains the subtitles to the VOD server manifest like any other track, as described below
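As a sketch of the first step, assuming a typical mp4split invocation with hypothetical filenames (see Packaging Subtitles for the exact options your version supports):

```bash
#!/bin/bash

# Sketch: package an SRT file as fragmented MP4 (.ismt) subtitles.
# Filenames are illustrative, not authoritative.
mp4split -o tears-of-steel-en.ismt \
  tears-of-steel-en.srt
```

The resulting .ismt can then be referenced in the server manifest alongside the audio and video tracks.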
Depending on whether the input for Origin is fragmented TTML or fragmented WebVTT, it outputs subtitles in different formats for MPEG-DASH, Apple HLS and Microsoft Smooth Streaming. The overview in the table below applies to both Origin VOD and Origin Live:

Input                MPEG-DASH output
Fragmented TTML      Fragmented TTML + (VOD only) TTML sidecar
Fragmented WebVTT    Fragmented WebVTT + (VOD only) WebVTT sidecar
When the output format of the subtitles is the same as the source (i.e., no conversion takes place), all styling information contained in the source is passed through. When conversion from TTML to WebVTT or vice versa does take place, only basic styling information is kept. Note that it is up to the player to interpret and apply this styling information when displaying the subtitles.
Generating TTML sidecars is disabled by default. To enable it, see: TTML sidecar (enabled with --mpd.sidecar_ttml).
Subtitles stored in an MP4 container are treated similarly to audio and video tracks: they have comparable properties, such as 'language' and 'bitrate', and can simply be added to the list of inputs on the command line when creating the server manifest file.
#!/bin/bash

mp4split -o tears-of-steel.ism \
  tears-of-steel-avc1-400k.ismv \
  tears-of-steel-avc1-750k.ismv \
  tears-of-steel-en.ismt
If you have multiple subtitles, you simply add the various subtitle files:
#!/bin/bash

mp4split -o tears-of-steel.ism \
  tears-of-steel-avc1-400k.ismv \
  tears-of-steel-avc1-750k.ismv \
  tears-of-steel-en.ismt \
  tears-of-steel-zh-hans.ismt
For DASH and Smooth output, the length of the subtitle segments defaults to the length of the fragments in the subtitle source. Depending on how the source was packaged, this means that the segments either vary in length to match the actual cues, or have a constant length (e.g., 60 or 10 seconds).
For HLS, by default the length of the subtitle segments is the same as the default length for the audio and video segments, which is 4 seconds.
In all cases (DASH, Smooth, HDS and HLS), using --[iss|hls|hds|mpd].minimum_fragment_length will override the default behavior (and set the target length of all segments to the specified value).
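For example, to target 6-second subtitle segments for HLS output when generating the server manifest (a sketch; the filenames are hypothetical and the option placement may differ in your setup):

```bash
#!/bin/bash

# Sketch: override the default segment length for HLS output.
mp4split -o tears-of-steel.ism \
  --hls.minimum_fragment_length=6 \
  tears-of-steel-avc1-400k.ismv \
  tears-of-steel-en.ismt
```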
If properties are missing or set incorrectly in the source file of the subtitles, this should preferably be corrected when packaging the subtitles in an fMP4 container. This ensures that mp4split can infer and add the correct information to the server manifest when running a command like the one above.
However, if correcting the properties when packaging the subtitles in an fMP4 is not possible, they can also be specified when generating the server manifest. For example, you can use --track_language to set the 'language' attribute of the subtitle track:
#!/bin/bash

mp4split -o presentation.ism \
  tears-of-steel-avc1-400k.ismv \
  tears-of-steel-avc1-750k.ismv \
  tears-of-steel-en.ismt --track_language=eng
Support for playback of subtitles varies greatly from player to player. Please make sure you are using the latest version of your player and make yourself aware of its limitations regarding subtitle support.
The MIME type of the subtitle track listed in the MPD is application/mp4, the @contentType attribute is text, and the @codecs attribute is stpp for TTML-based samples and wvtt for VTTCue samples.
If a server manifest includes both WebVTT and TTML-based subtitles for a given language, both tracks will be advertised in the MPD (with the @codecs attribute set to wvtt and stpp, respectively). While this allows the player to select the most suitable representation, it may be desirable to exclude one or the other. This can be done with dynamic track selection, using a filter expression.
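As an illustration only (the exact attribute names and syntax are defined in the dynamic track selection documentation, so treat this as a sketch), a filter that removes the WebVTT track so that only the TTML-based track remains might look like:

```
(FourCC != "wvtt")
```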
For DVB-DASH the input format must be EBU-TT-D; see page 27 of the DVB-DASH specification (ETSI TS 103 285).
Please see the Unified Streaming Demo for an example that uses subtitles in DASH.
New in version 1.7.31.
WebVTT sidecar subtitles are automatically generated when the subtitles input format is fragmented WebVTT. The WebVTT sidecar subtitles are signaled as a separate adaptation set with MIME type text/vtt:
<!-- fMP4 containing VTTCue samples -->
<AdaptationSet contentType="text" lang="ru" mimeType="application/mp4" codecs="wvtt">

<!-- WebVTT -->
<AdaptationSet contentType="text" lang="ru" mimeType="text/vtt">
TTML sidecar (enabled with --mpd.sidecar_ttml)
New in version 1.10.28.
TTML sidecar subtitles are generated when the subtitles input format is fragmented TTML and the option --mpd.sidecar_ttml is specified.
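By analogy with the WebVTT sidecar example above, the MPD would then advertise two adaptation sets for the track. The snippet below is illustrative; the sidecar's MIME type is an assumption based on the standard TTML media type:

```
<!-- fMP4 containing TTML samples -->
<AdaptationSet contentType="text" lang="ru" mimeType="application/mp4" codecs="stpp">

<!-- TTML sidecar (assumed MIME type) -->
<AdaptationSet contentType="text" lang="ru" mimeType="application/ttml+xml">
```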
Subtitles for HLS require at least version 4 of the HLS protocol.
Make sure to set this using the --hls.client_manifest_version option:
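A sketch of such a command, with hypothetical filenames:

```bash
#!/bin/bash

# Sketch: request HLS client manifest version 4, required for fMP4 subtitles.
mp4split -o tears-of-steel.ism \
  --hls.client_manifest_version=4 \
  tears-of-steel-avc1-400k.ismv \
  tears-of-steel-en.ismt
```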
When you have multiple subtitles, the player normally picks the one it finds most suitable (e.g., when the language of the subtitle track matches the language of the device). If no match is found, the default track is taken. As with video and audio tracks, Origin VOD marks the first track in a group as the default.
To control which track is the first in a group (and thus which track is the default), you need to combine the tracks for the various languages into a single fMP4 before generating the server manifest. This fMP4 needs to be an .ismt, as .cmft only supports one track per file, as specified in CMAF. The order in which the tracks are added is respected, so that the first track becomes the default; see Define default track when preparing content (track order).
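As a sketch (assuming mp4split combines multiple subtitle inputs into one output file; filenames are hypothetical), the following would make the English track the default:

```bash
#!/bin/bash

# Sketch: combine subtitle tracks into a single .ismt.
# The first input listed becomes the default track.
mp4split -o tears-of-steel-subs.ismt \
  tears-of-steel-en.ismt \
  tears-of-steel-zh-hans.ismt
```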
Please see the Unified Streaming Demo for an example that uses subtitles in HLS.
Enabling subtitles in the Silverlight player (MMP Player Framework 2.7) is done by adding the following to the player's initialization parameters:

<param name="InitParams" value="enablecaptions=true, selectedcaptionstream=textstream_eng, mediaurl=..." />
Note that the value given for selectedcaptionstream is the name of the text track, as given by the @Name attribute in the Smooth Streaming client manifest.
Please see the Unified Streaming Demo for an example that uses subtitles in HSS.
Not supported. However, some players can load a subtitle file (for instance, an .srt) as a 'sidecar' file and display subtitles from the loaded file. An example is Flowplayer.