2 years ago

#61921


Max Hay

Does AVPlayer support live footage served directly from a fragmented MP4 file?

Overview

I have a server generating a livestream of video that is exposed as a fragmented MP4 file.

That file is served to an iOS Simulator that tries to play the video using react-native-video, which, I believe, uses AVPlayer under the hood.

The first request the Simulator makes is a range request for bytes 0-1. I record the X-Playback-Session-Id and respond with 206 Partial Content, the two requested bytes, and Content-Range: bytes 0-1/*. According to the specification, a complete length of * indicates that the total size is unknown.
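Concretely, the initial exchange looks like this (the path, session id, and exact header set are illustrative; the 206 status, the 0-1 range, and the */unknown size are what I actually send):

    GET /stream.mp4 HTTP/1.1
    Range: bytes=0-1
    X-Playback-Session-Id: 00000000-0000-0000-0000-000000000000

    HTTP/1.1 206 Partial Content
    Content-Range: bytes 0-1/*
    Content-Length: 2
    Content-Type: video/mp4
    Accept-Ranges: bytes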

I then receive an error from AVPlayer stating that the server is not correctly configured. According to the Apple docs, this error indicates that the server does not support range requests.

I have implemented support for range requests. As an experiment, I set the Content-Range to report a very large size instead of * (bytes 0-1/17179869176), which works to an extent: AVPlayer follows up with range requests for further byte ranges (0-17179869175), though sometimes it requests only a single range. Playback then buffers for a while and displays nothing until I stop the server (with a breakpoint); a short while after that, the video stops buffering (without closing any active connections) and plays what it has loaded so far. Given that this is a livestream, that's not acceptable.
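For reference, the experimental response and AVPlayer's follow-up request look like this (17179869176 is an arbitrary large number, not a real file size):

    HTTP/1.1 206 Partial Content
    Content-Range: bytes 0-1/17179869176
    Content-Length: 2
    Content-Type: video/mp4

    GET /stream.mp4 HTTP/1.1
    Range: bytes=0-17179869175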

Playing the livestream in Chrome or in an Android emulator works exactly as I'd expect: the video plays as soon as it has the necessary data. Chrome, moreover, doesn't require any byte-range support at all to play the video.

I can understand that without any source of content length AVPlayer is unable to make range requests, as it doesn't know where the file ends. However, since the media I'm exposing is a live stream, I don't have a meaningful Content-Length to give it. So there must be something I can specify, either in the response headers on the server or in AVPlayer settings on the client, stating that the video is a livestream and so cannot be handled through range requests, or that it must request chunks of footage at a time.
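The closest client-side hook I've found is AVAssetResourceLoaderDelegate, which lets you describe the content to AVPlayer yourself, including whether byte-range access is supported. Below is a minimal Swift sketch of that idea; the custom scheme and URL are made up, and whether AVPlayer will actually accept a progressive MP4 with no known length through this path is exactly what I don't know:

    import AVFoundation

    // Sketch: intercept loading for a custom scheme so we can describe the
    // stream to AVPlayer ourselves instead of relying on response headers.
    final class LiveStreamLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            if let info = loadingRequest.contentInformationRequest {
                info.contentType = AVFileType.mp4.rawValue
                // Ask AVPlayer not to issue byte-range requests at all.
                info.isByteRangeAccessSupported = false
                // contentLength is left at its default; there is no meaningful
                // total length for a live stream.
            }
            // Fetch bytes from the real HTTP endpoint and feed them in as they
            // arrive via loadingRequest.dataRequest?.respond(with: chunk),
            // then call loadingRequest.finishLoading() when the stream ends.
            return true
        }
    }

    // "live-fmp4" is a made-up scheme; AVPlayer only consults the delegate
    // for schemes it doesn't handle natively, so plain https won't work here.
    let url = URL(string: "live-fmp4://example.com/stream.mp4")!
    let asset = AVURLAsset(url: url)
    let loader = LiveStreamLoader()
    asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "stream-loader"))
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

The catch is that this requires dropping down to native code; as far as I can tell, react-native-video doesn't expose the resource loader from the JS side.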

I've looked online and found some useful documents on livestreaming, though all of them revolve around HLS and m3u8 playlist files. Changing the back end to generate m3u8 playlists, and to decode the video in order to work out correct segment durations, would probably take weeks or months of additional development time, and I don't understand why it would be necessary, given that I'm only exposing a single resolution of a single video stream that does not need to seek, and that it already works perfectly well on Android.
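For context, the HLS route those documents describe would mean serving a small rolling playlist alongside the fragments, regenerated as each new segment is produced; something like the following, where the segment names and durations are illustrative:

    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-TARGETDURATION:4
    #EXT-X-MEDIA-SEQUENCE:120
    #EXT-X-MAP:URI="init.mp4"
    #EXTINF:4.0,
    segment-120.m4s
    #EXTINF:4.0,
    segment-121.m4s
    #EXTINF:4.0,
    segment-122.m4s

The absence of a closing #EXT-X-ENDLIST tag is what marks the playlist as live, and the per-segment #EXTINF durations are the part that would require decoding the video, hence the estimate above.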

After having spent so long on this and having run into so many hard-to-resolve issues, it's starting to feel like I've somehow gone down the wrong path and am going about this completely the wrong way.

My question is twofold:

Does AVPlayer support live footage served directly from a fragmented MP4 file?

If so, how do I implement it?

Tags: ios, video, avplayer, live-streaming, react-native-video

0 Answers

