As of 2022, an annual video industry survey has consistently found HLS to be the most popular streaming format.
Apple has documented HTTP Live Streaming as an Internet Draft (Individual Submission), the first stage in the process of publishing it as a Request for Comments (RFC).[9]
HTTP Live Streaming uses a conventional web server that supports HLS to distribute audiovisual content, and requires specific software, such as OBS, to encode and package the content into a suitable format (codec) for transmission in real time over a network.
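What the web server actually hosts is a short plain-text playlist plus the media segments it references. For illustration, a minimal media playlist (the segment names and durations here are invented for the example) might look like:

  #EXTM3U
  #EXT-X-VERSION:3
  #EXT-X-TARGETDURATION:6
  #EXT-X-MEDIA-SEQUENCE:0
  #EXTINF:6.0,
  segment0.ts
  #EXTINF:6.0,
  segment1.ts
  #EXTINF:6.0,
  segment2.ts
  #EXT-X-ENDLIST

The client downloads the playlist over ordinary HTTP and then fetches the listed segments in order.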
The service architecture comprises a server component that encodes and segments the media, a distribution component consisting of standard web servers, and the client software. HTTP Live Streaming provides mechanisms for players to adapt to unreliable network conditions without causing user-visible playback stalling.
For example, on an unreliable wireless network, HLS allows the player to use a lower quality video, thus reducing bandwidth usage.
To enable a player to adapt to the bandwidth of the network, the original video is encoded in several distinct quality levels.
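These quality levels are advertised to the player through a multivariant (master) playlist that lists one media playlist per encoding. An illustrative sketch (the bitrates, resolutions, codec strings and paths are invented) might read:

  #EXTM3U
  #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
  low/media.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
  mid/media.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
  high/media.m3u8

The player estimates its available throughput and switches between the listed variants as network conditions change.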
At WWDC 2016, Apple announced[11] the inclusion of byte-range addressing for fragmented MP4 files, or fMP4, allowing content to be played via HLS without the need to multiplex it into an MPEG-2 Transport Stream.[12][13]
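The sketch below shows how a media playlist can address byte ranges of a single fragmented MP4 file; the file name, byte offsets and durations are invented, and the EXT-X-MAP tag points at the initialization section of the file:

  #EXTM3U
  #EXT-X-VERSION:7
  #EXT-X-TARGETDURATION:6
  #EXT-X-MAP:URI="main.mp4",BYTERANGE="720@0"
  #EXTINF:6.0,
  #EXT-X-BYTERANGE:1258000@720
  main.mp4
  #EXTINF:6.0,
  #EXT-X-BYTERANGE:1310000@1258720
  main.mp4
  #EXT-X-ENDLIST

Byte-range requests let the player retrieve each of these sub-ranges directly from the single file over HTTP.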
Two unrelated HLS extensions with a "Low Latency" name and corresponding acronym exist: Apple's Low-Latency HLS (ALHLS) and a community-developed protocol likewise called Low-Latency HLS (LHLS). The remainder of this section describes Apple's ALHLS.
Among other features, ALHLS defines a set of new media playlist tags for low-latency delivery. Apple also added new tools: tsrecompressor produces and encodes a continuous low-latency stream of audio and video, and a companion HLS segmenter takes in the UDP/MPEG-TS stream from tsrecompressor and generates a media playlist that includes the new low-latency tags.
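Purely as an illustrative sketch (the segment names, sequence numbers and durations are invented, and required tags such as EXT-X-VERSION are omitted for brevity), a low-latency media playlist combines these tags roughly as follows:

  #EXTM3U
  #EXT-X-TARGETDURATION:4
  #EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
  #EXT-X-PART-INF:PART-TARGET=0.334
  #EXT-X-MEDIA-SEQUENCE:100
  #EXTINF:4.0,
  segment100.mp4
  #EXT-X-PART:DURATION=0.334,URI="segment101.part0.mp4",INDEPENDENT=YES
  #EXT-X-PART:DURATION=0.334,URI="segment101.part1.mp4"
  #EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment101.part2.mp4"

Here EXT-X-PART advertises partial segments of the segment currently being produced, EXT-X-PRELOAD-HINT tells the player which part to request next, and EXT-X-SERVER-CONTROL allows blocking playlist reloads so the server can hold a request until new data is available.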
In HLS, a SCTE-35 splice out/in pair signaled by a pair of splice_insert() commands is represented by one or more EXT-X-DATERANGE tags carrying the same ID attribute.
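As a rough sketch based on the SCTE35-OUT and SCTE35-IN attributes defined in the HLS specification (the ID, dates, duration and the shortened hexadecimal payloads are placeholders), such a pair might appear in a media playlist as:

  #EXT-X-DATERANGE:ID="splice-1",START-DATE="2024-01-01T00:10:00Z",PLANNED-DURATION=60.0,SCTE35-OUT=0xFC302000FFF0
  # (media segments covering the ad break omitted)
  #EXT-X-DATERANGE:ID="splice-1",START-DATE="2024-01-01T00:10:00Z",DURATION=60.0,SCTE35-IN=0xFC302000FFF1

The hexadecimal values carry the binary SCTE-35 splice_info_section, and the matching ID attribute ties the splice-out and splice-in signals together.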