Recent FCC regulations require that any content shown on television with closed captions must also be captioned when streamed online. IBM Video Streaming or Enterprise Video Streaming customers with supported encoders now have the option of passing their captioned content through to the IBM Video Streaming player.
Once a channel has caption support enabled, the IBM Video Streaming platform and player will automatically detect and display the 608 closed captions. Viewers can mouse over the player to reveal the CC button, where they can toggle the captions on and off. Caption data is preserved and synced with the recorded file for on-demand viewing.
Note: Closed Captions are only available for our Enterprise broadcasting users.
For additional information on closed captions, please see the related blog post:
How do closed captions work?
It is important to first make the distinction between “Closed Captions” and “Subtitles”. Closed Captions serve as an accessibility feature for the deaf or hard of hearing, while Subtitles are an international viewer/listener feature.
Examples: Actions like “window slams shut” or “footsteps approach” are present in closed captioning to explain what is happening. Subtitles are used simply to translate speech to text, such as providing French subtitles for English content.
There are two ways closed captions can be delivered:
- Embedded within video
- Stored as a separate file
NOTE: for adding separate file captions, please see the article: Adding VTT closed captions files to videos
Embedded captions typically use one of the following formats: CEA-608, CEA-708, DVB-T, DVB-S, WST. These formats can be embedded into the video stream or written directly into a video file.
Separately stored captions typically use one of the following formats: DFXP, SAMI, SMPTE, TTML, XML, WebVTT, SRT, SCC, EBU-STL. These formats transmit caption data to the player alongside the video, as opposed to embedded within it. This is most common with browser-based video playback (Flash, HTML5).
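For reference, a minimal WebVTT file (one of the separately stored formats above) consists of a `WEBVTT` header followed by timed cues. The timings and text below are purely illustrative:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
[door slams shut]

00:00:05.000 --> 00:00:08.000
Hello, and welcome to the stream.
```

Each cue pairs a start and end timestamp with the text the player should display during that interval.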
Embedding Closed Captions: Live Streaming
The following are a few of the most commonly used methods and formats for delivering embedded closed captions.
CEA-608
Originally the standard for analog broadcasts, CEA-608 captions can also be embedded in digital broadcasts. 608 data is inserted into Line 21 of the video signal. Due to character-set limitations, 608 captions can only be encoded in English, Spanish, Portuguese, French, Italian, German and Dutch, and Line 21 provides only two available fields for languages. This is often considered an older delivery method.
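CEA-608 transmits two bytes per video frame on Line 21, and each byte carries 7 data bits plus an odd-parity bit in the most significant bit. As a minimal sketch (not part of the IBM Video Streaming platform), computing that parity bit looks like this:

```python
def with_odd_parity(b: int) -> int:
    """Set bit 7 so the total number of 1 bits in the byte is odd,
    as required for CEA-608 Line 21 data bytes."""
    data = b & 0x7F  # keep the 7 data bits
    ones = bin(data).count("1")
    # If the data bits already have an odd count, leave bit 7 clear;
    # otherwise set it to make the overall count odd.
    return data | (0x80 if ones % 2 == 0 else 0)

# 'A' (0x41) has two 1 bits, so the parity bit is set: 0xC1.
print(hex(with_odd_parity(0x41)))  # → 0xc1
```

A decoder performs the reverse check: any received byte whose 1-bit count is even is treated as a transmission error.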
CEA-708
Introduced for use with digital broadcasts, this newer delivery method supports far more advanced appearance standards. 708 data is inserted via the H.264 stream.
RTMP onCaptionInfo metadata
This is an Action Message Format (AMF) onCaptionInfo message with a type of "708". It carries Base64-encoded CEA-608 or CEA-708 caption data.
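As a rough sketch of what a player does with such a message, the Base64 field is decoded back into raw caption bytes before being handed to a 608/708 decoder. The dictionary shape and byte values below are illustrative assumptions, not the actual AMF wire format:

```python
import base64

# Hypothetical onCaptionInfo payload: a type field of "708" and a
# Base64-encoded string of caption bytes (values here are made up).
on_caption_info = {
    "type": "708",
    "data": base64.b64encode(bytes([0x14, 0x20, 0x14, 0x20])).decode("ascii"),
}

# Player side: decode the Base64 string back into the raw
# CEA-608/708 byte pairs for the caption decoder.
raw = base64.b64decode(on_caption_info["data"])
print(list(raw))  # → [20, 32, 20, 32]
```

The point is simply that the caption bytes travel as text metadata beside the stream, and the player reverses the encoding on arrival.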
IBM Video Streaming supports CEA-608 captions for live content delivered either as onCaptionInfo metadata or embedded in the H.264 stream via SEI NALUs.
In order to stream CEA-608 or onCaptionInfo data, your encoder will need to actively send these formats.
We are working to support all encoders capable of passing through 608 captions.
Please contact us if you are currently using another encoder that supports embedded captions but is not properly passing them through to the player.
Watson Live Captions for Video Streaming/Enterprise Video Streaming
If you are currently a Watson Live Captions for Video Streaming or Enterprise Video Streaming customer, you can turn automatic captions on via your dashboard under Caption Settings.
You can also check your overall live captioning usage on your subscription overview page.
NOTE: Watson Live Captions are available in English only.