HTTP Live Streaming

HTTP Live Streaming (also known as HLS) is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. and released in 2009. Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers. An annual video industry survey has consistently found it to be the most popular streaming format.

HLS resembles MPEG-DASH in that it works by breaking the overall stream into a sequence of small HTTP-based file downloads, each downloading one short chunk of an overall potentially unbounded transport stream. A list of available streams, encoded at different bit rates, is sent to the client using an extended M3U playlist.
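To make the playlist format concrete, the sketch below builds a minimal extended-M3U media playlist in Python. The segment names and durations are invented for illustration; real playlists carry additional tags.

```python
# Hypothetical sketch: build a minimal HLS media playlist (extended M3U).
# Segment URIs and durations are invented for illustration.

def build_media_playlist(segments, target_duration):
    """Return an extended-M3U media playlist listing each (uri, duration) segment."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for uri, duration in segments:
        lines.append(f"#EXTINF:{duration:.1f},")  # per-segment duration
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")  # marks a bounded (VOD) playlist
    return "\n".join(lines)

playlist = build_media_playlist(
    [("seg0.ts", 6.0), ("seg1.ts", 6.0), ("seg2.ts", 4.5)], target_duration=6
)
print(playlist)
```

For a live (unbounded) stream, the server would omit the `EXT-X-ENDLIST` tag and append new segments as they are produced.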

Based on standard HTTP transactions, HTTP Live Streaming can traverse any firewall or proxy server that lets through standard HTTP traffic, unlike UDP-based protocols such as RTP. This also allows content to be offered from conventional HTTP servers and delivered over widely available HTTP-based content delivery networks. The standard also includes a standard encryption mechanism and secure-key distribution using HTTPS, which together provide a simple DRM system. Later versions of the protocol also provide for trick-mode fast-forward and rewind and for integration of subtitles.

Apple has documented HTTP Live Streaming as an Internet Draft (Individual Submission), the first stage in the process of publishing it as a Request for Comments (RFC). As of December 2015, the authors of that document have requested the RFC Independent Stream Editor (ISE) to publish the document as an informational (non-standard) RFC outside of the IETF consensus process. In August 2017, RFC 8216 was published, describing version 7 of the protocol.

Architecture
HTTP Live Streaming uses a conventional web server that implements support for HTTP Live Streaming (HLS) to distribute audiovisual content, and requires specific software, such as OBS, to fit the content into a proper format (codec) for transmission in real time over a network. The service architecture comprises:


 * Server
 * Encodes and encapsulates the input video stream in a format suitable for delivery, then prepares it for distribution by segmenting it into separate files. During intake, the video is encoded and segmented to generate video fragments and an index file.
 * Encoder: encodes video in H.264 format and audio in AAC, MP3, AC-3 or EC-3, encapsulated in an MPEG-2 Transport Stream or MPEG-4 Part 14 container for carriage.
 * Segmenter: divides the stream into fragments of equal length. It also creates an index file, saved as .m3u8, that contains references to the fragmented files.


 * Distributor
 * Formed by a standard web server, it accepts requests from clients and delivers all the resources (the .m3u8 playlist file and .ts segment files) needed for streaming.


 * Client
 * Requests and downloads all the files and resources, assembling them so that they can be presented to the user as a continuous video stream. The client software first downloads the index file through a URL and then the media files it references. The playback software assembles the sequence to allow continuous display to the user.
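The client's role described above can be sketched in a few lines of Python: fetch the index file, extract the segment URIs (every non-blank line that is not a #-prefixed tag), then fetch and concatenate the segments. The in-memory "server" dictionary stands in for HTTP and is purely illustrative.

```python
# Hypothetical client-side sketch: fetch the index file, then fetch each
# media segment it lists, in order. fetch() is a stand-in for an HTTP GET.

def segment_uris(playlist_text):
    """Return segment URIs from a media playlist: every line that is
    neither blank nor a #-prefixed tag/comment."""
    return [ln for ln in playlist_text.splitlines() if ln and not ln.startswith("#")]

def play(fetch, index_url):
    """Download the index, then concatenate the segments it references."""
    playlist = fetch(index_url)
    return b"".join(fetch(uri) for uri in segment_uris(playlist))

# Toy in-memory "server" standing in for a real web server.
resources = {
    "index.m3u8": "#EXTM3U\n#EXTINF:6.0,\nseg0.ts\n#EXTINF:6.0,\nseg1.ts\n#EXT-X-ENDLIST",
    "seg0.ts": b"AAAA",
    "seg1.ts": b"BBBB",
}
stream = play(resources.__getitem__, "index.m3u8")
```

A real player decodes and renders each segment as it arrives rather than concatenating bytes, but the request order is the same.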

Features
HTTP Live Streaming provides mechanisms for players to adapt to unreliable network conditions without causing user-visible playback stalling. For example, on an unreliable wireless network, HLS allows the player to use a lower quality video, thus reducing bandwidth usage. HLS videos can be made highly available by providing multiple servers for the same video, allowing the player to swap seamlessly if one of the servers fails.

Adaptability
To enable a player to adapt to the bandwidth of the network, the original video is encoded in several distinct quality levels. The server serves an index, called a master playlist, of these encodings, called variant streams. The player can then choose between the variant streams during playback, changing back and forth seamlessly as network conditions change.
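The variant-selection step can be sketched as follows, assuming a toy master playlist (bandwidths, resolutions, and URIs invented): parse each EXT-X-STREAM-INF entry, then pick the highest-bandwidth variant that fits the measured throughput.

```python
# Hypothetical sketch: parse a master playlist's variant streams and pick
# the best one for the measured network throughput. Playlist values invented.
import re

MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def parse_variants(master_text):
    """Return (bandwidth, uri) pairs; each EXT-X-STREAM-INF tag applies
    to the URI on the following line."""
    variants, lines = [], master_text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            bw = int(re.search(r"BANDWIDTH=(\d+)", line).group(1))
            variants.append((bw, lines[i + 1]))
    return variants

def choose_variant(variants, measured_bps):
    """Highest-bandwidth variant not exceeding throughput; lowest as fallback."""
    fitting = [v for v in variants if v[0] <= measured_bps]
    return max(fitting) if fitting else min(variants)

variants = parse_variants(MASTER)
```

A real player re-runs this choice continuously as it re-measures throughput, which is what allows the seamless back-and-forth switching described above.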

Using fragmented MP4
At WWDC 2016 Apple announced the inclusion of byte-range addressing for fragmented MP4 files, or fMP4, allowing content to be played via HLS without the need to multiplex it into MPEG-2 Transport Stream. The industry considered this as a step towards compatibility between HLS and MPEG-DASH.

Low Latency HLS
Two unrelated HLS extensions with a Low Latency name and corresponding acronym exist:


 * Apple Low Latency HLS (ALHLS), which was announced by Apple at WWDC 2019
 * Community LHLS (LHLS) which predated Apple's publication and is allegedly simpler

The remainder of this section describes Apple's ALHLS. It reduces the glass-to-glass delay when streaming via HLS by shortening the time to start playback of a live stream and maintaining that latency throughout a live-streaming event. It works by adding partial media segment files into the mix, much like MPEG-CMAF's fMP4. Unlike CMAF, ALHLS also supports partial MPEG-2 TS transport files. A partial media segment is a standard segment (e.g. 6 seconds) split into equal parts of less than a second (e.g. 200 milliseconds). The first standard segment is replaced by the series of partial segments; subsequent segments are of the standard size. HTTP/2 is required to push the segments along with the playlist, reducing the overhead of establishing repeated HTTP/TCP connections.

Other features include:
 * Playlist Delta Updates: send only what changed between playlists, which typically fits in a single MTU, making it more efficient to load playlists that, with large DVR windows, can be quite large.
 * Blocking playlist reload: when a client requests a live media playlist, wait until the next segment is also ready and return both at the same time (saving additional HTTP/TCP requests)
 * Rendition Reports: add metadata about other media renditions to make switching between ABR renditions faster
 * New tags added: EXT-X-SERVER-CONTROL / EXT-X-PART / EXT-X-SKIP / EXT-X-RENDITION-REPORT
 * New _HLS_-prefixed URL query parameters (e.g. _HLS_msn and _HLS_part) added for directing playlist requests
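As an illustration of the partial-segment mechanism, the sketch below emits the tail of a low-latency media playlist in which the newest segment is advertised as 200 ms parts via the EXT-X-PART tag. Tag attributes are simplified and the URIs invented; real ALHLS playlists carry additional attributes such as INDEPENDENT.

```python
# Hypothetical sketch: emit the tail of a low-latency media playlist where
# the newest segment is advertised as 200 ms partial segments via the
# EXT-X-PART tag listed above. Attribute sets simplified, URIs invented.

def low_latency_tail(segment_index, part_duration=0.2, parts_ready=4):
    """Advertise the parts of the in-progress segment as they become ready."""
    lines = [f"#EXT-X-PART-INF:PART-TARGET={part_duration}"]
    for p in range(parts_ready):
        lines.append(
            f'#EXT-X-PART:DURATION={part_duration},'
            f'URI="seg{segment_index}.part{p}.mp4"'
        )
    return "\n".join(lines)

tail = low_latency_tail(segment_index=42)
```

Once all parts of a segment have been delivered, the server advertises the complete standard-size segment and the player no longer needs its parts.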

Apple also added new tools: tsrecompressor produces and encodes a continuous low latency stream of audio and video. The mediastreamsegmenter tool is now available in a low-latency version. It is an HLS segmenter that takes in a UDP/MPEG-TS stream from tsrecompressor and generates a media playlist, including the new tags above.

Support for low-latency HLS is available in the tvOS 13 beta and in iOS and iPadOS 14. On April 30, 2020, Apple added the low-latency specifications to the second edition of the main HLS specification.

Dynamic ad insertion
Dynamic ad insertion is supported in HLS using splice information based on the SCTE-35 specification. The SCTE-35 splice message is inserted into the media playlist file using the EXT-X-DATERANGE tag. Each SCTE-35 splice_info_section is represented by an EXT-X-DATERANGE tag with a SCTE35-CMD attribute. A SCTE-35 splice out/in pair signaled by splice_insert commands is represented by one or more EXT-X-DATERANGE tags carrying the same ID attribute. The SCTE-35 splice out command should have the SCTE35-OUT attribute and the splice in command should have the SCTE35-IN attribute.

Between the two EXT-X-DATERANGE tags that contain the SCTE35-OUT and SCTE35-IN attributes respectively there may be a sequence of media segment URIs. These media segments normally represent ad programs which can be replaced by local or customized ads. The ad replacement does not require the replacement of the media files; only the URIs in the playlist need to be changed to point to different ad programs. The replacement can be done on the origin server or on the client's media playing device.
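A minimal sketch of that replacement step, assuming an invented playlist fragment: swap the segment URIs that fall between the SCTE35-OUT and SCTE35-IN tags for replacement ad URIs, leaving everything else untouched.

```python
# Hypothetical sketch of ad replacement: swap the segment URIs between the
# SCTE35-OUT and SCTE35-IN EXT-X-DATERANGE tags for replacement ad URIs.
# Playlist text and URIs are invented; in a real implementation the
# replacement segments must match the original durations.

def replace_ad_segments(playlist_text, replacement_uris):
    out_lines, in_break, repl = [], False, iter(replacement_uris)
    for line in playlist_text.splitlines():
        if line.startswith("#EXT-X-DATERANGE") and "SCTE35-OUT" in line:
            in_break = True            # ad break starts
        elif line.startswith("#EXT-X-DATERANGE") and "SCTE35-IN" in line:
            in_break = False           # ad break ends
        elif in_break and line and not line.startswith("#"):
            out_lines.append(next(repl))  # swap the ad segment URI
            continue
        out_lines.append(line)
    return "\n".join(out_lines)

ORIGINAL = """#EXTINF:6.0,
content1.ts
#EXT-X-DATERANGE:ID="break1",SCTE35-OUT=0xFC30
#EXTINF:6.0,
national_ad1.ts
#EXT-X-DATERANGE:ID="break1",SCTE35-IN=0xFC30
#EXTINF:6.0,
content2.ts"""

patched = replace_ad_segments(ORIGINAL, ["local_ad1.ts"])
```

Because only playlist lines change, the same origin media files can serve every viewer while each client or edge server splices in its own ads.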

Server implementations
Notable server implementations supporting HTTP Live Streaming include:
 * Adobe Media Server supports HLS for iOS devices (HLS) and Protected HTTP Live Streaming (PHLS).
 * Akamai supports HLS for live and on-demand streams.
 * AT&T supports HLS in all formats live or on-demand.
 * Axis Communications IP cameras support HLS via the CamStreamer ACAP application.
 * Instart supports HLS for on-demand streams.
 * Amazon CloudFront supports HLS for on-demand streams.
 * Bitmovin supports HLS for on-demand and live streaming.
 * CDNetworks supports HLS for live and on-demand streams.
 * Cisco Systems supports full end-to-end delivery for Live/TSTV/VOD/HLS and Cloud DVR services.
 * Cloudflare supports HLS for live and on-demand streams.
 * EdgeCast Networks supports cross-device streaming using HLS.
 * Fastly supports HLS for live and on-demand streams.
 * Helix Universal Server from RealNetworks supports iPhone OS 3.0 and later for live and on-demand HTTP Live or On-Demand streaming of H.264 and AAC content to iPhone, iPad and iPod.
 * IIS Media Services from Microsoft supports live and on-demand Smooth Streaming and HTTP Live Streaming.
 * Level 3 supports HLS live and on-demand streams.
 * Limelight Networks supports HLS for some accounts.
 * Nginx with the nginx-rtmp-module supports HLS in live mode. The commercial version, Nginx Plus, which includes the ngx_http_hls_module module, also supports HLS/HDS VOD.
 * Nimble Streamer supports HLS in live and VOD modes; the Apple Low Latency HLS spec is also supported.
 * Node.js with the hls-server package supports HLS in live mode and conversion of local files.
 * Storm Streaming Server supports HLS as a backup mode for its Media Source Extensions player.
 * Tata Communications CDN supports HLS for live and on-demand streams.
 * TVersity supports HLS in conjunction with on-the-fly transcoding for playback of any video content on iOS devices.
 * Unreal Media Server supports low latency HLS as of version 9.5.
 * Ustream supports HLS delivery of live broadcasts. The ingested stream is re-transcoded if the original audio and video codec falls outside HLS requirements.
 * VLC Media Player supports HLS for serving live and on-demand streams as of version 2.0.
 * Wowza Streaming Engine from Wowza Media Systems supports HLS and encrypted HLS for live (with DVR), on-demand streaming and Apple Low Latency HLS spec.

Usage

 * Google added HTTP Live Streaming support in Android 3.0 (Honeycomb).
 * HP added HTTP Live Streaming support in webOS 3.0.5.
 * Microsoft added support for HTTP Live Streaming in EdgeHTML rendering engine in Windows 10 in 2015.
 * Microsoft added support for HTTP Live Streaming in IIS Media Services 4.0.
 * Yospace added HTTP Live Streaming support in Yospace HLS Player and SDK for flash version 1.0.
 * Sling Media added HTTP Live Streaming support to its Slingboxes and its SlingPlayer apps.
 * In 2014/15, the BBC introduced HLS-AAC streams for its live internet radio and on-demand audio services, and supports those streams with its iPlayer Radio clients.
 * Twitch uses HTTP Live Streaming (HLS) to transmit and scale the live streaming to many concurrent viewers, also supporting multiple variants (e.g., 1080p, 720p, etc.).

Supported players and servers
HTTP Live Streaming is natively supported in the following operating systems:
 * Windows 10 version 1507 to 2004 (Microsoft Edge Legacy) (no longer supported)
 * Windows 11 Media Player
 * macOS 10.6+ (Safari and QuickTime)
 * iOS 3.0+ (Safari)
 * Android 4.1+ (Google Chrome)

Windows 10 used to have native support for HTTP Live Streaming in EdgeHTML, a proprietary browser engine used in Microsoft Edge (now referred to as Edge Legacy) before the transition to the Chromium-based Blink browser engine. Edge Legacy was included in Windows 10 up to version 2004 and was replaced by the Chromium-based Edge in version 20H2. Along with Windows 11, Microsoft released an updated Media Player that supports HLS natively.