HTTP live streaming server on iPhone

0 votes
I am attempting to set up an HTTP Live Streaming server on an iPhone that feeds an HTML5 client (which supports HTTP Live Streaming) with the video stream captured by the camera.

Here is what I have working so far:

The HTTP Live Streaming server on iOS (written in Node.js) dynamically updates the index file from the list of Transport Stream (video/MP2T) files produced by the video capture module.
The video capture module uses AVCaptureMovieFileOutput to continually produce a sequence of 10-second QuickTime files (there is a slight gap between them, but that is acceptable for my application).

What I still need is an on-the-fly converter that turns each QuickTime file into a Transport Stream file (no re-encoding needed; only a different container), which would connect the two components above.
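The index file the Node.js server keeps rewriting is just a text playlist. A minimal sketch of generating a sliding-window live m3u8 (the segment names, window size, and helper names here are assumptions for illustration, not from the question):

```javascript
// Build a live HLS playlist covering only the newest few TS segments.
function buildPlaylist(segments, { targetDuration = 10, window = 3 } = {}) {
  const live = segments.slice(-window);
  // Media sequence tells the player how many segments have already
  // slid out of the window.
  const mediaSequence = segments.length - live.length;
  const lines = [
    '#EXTM3U',
    '#EXT-X-VERSION:3',
    `#EXT-X-TARGETDURATION:${targetDuration}`,
    `#EXT-X-MEDIA-SEQUENCE:${mediaSequence}`,
  ];
  for (const seg of live) {
    lines.push(`#EXTINF:${seg.duration.toFixed(3)},`, seg.uri);
  }
  // No #EXT-X-ENDLIST: the stream is live, so players keep polling.
  return lines.join('\n') + '\n';
}

const playlist = buildPlaylist([
  { uri: 'segment0.ts', duration: 10.0 },
  { uri: 'segment1.ts', duration: 10.0 },
  { uri: 'segment2.ts', duration: 10.0 },
  { uri: 'segment3.ts', duration: 10.0 },
]);
```

The server would call something like this each time the capture module hands over a new segment, and write the result over the old index file.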

From my research (I've done a lot of reading on this subject, and I'm 99% sure), this is the only way to make use of the iPhone's hardware video encoder. Please correct me if I'm wrong.

A couple of people have mentioned ffmpeg, but I'd prefer to develop something from scratch (and open-source it under the MIT licence) or use a much smaller MIT-licensed library, if one exists.

I'm fairly new to this media container stuff, so if anyone could point me in the right direction (sample code, open-source projects, documentation), that would be greatly appreciated.
Sep 20 in IOS by Soham
• 9,670 points
93 views

1 answer to this question.

0 votes
Making an MPEG-TS from AVCaptureVideoDataOutput is not an easy procedure unless you encode the raw H.264 yourself with x264 or a similar library. But let's suppose for a moment that getting MPEG-TS files were simple. Once they were listed in an m3u8 playlist, a small web server could be launched to deliver the files. As far as I'm aware, serving from localhost on the device is not a major issue, and numerous apps do so. So while it's possible to serve HLS from the device, I doubt the results would be good.

Now for method number 2. You capture the frames using AVCaptureVideoDataOutput and then wrap them in a cute little protocol of your own.

Open a socket and deliver the data to your server using JSON, or perhaps something more compact such as bencode. Good luck with that: you'd better have a fast network, because sending uncompressed frames will eat up bandwidth even over WiFi.
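Whatever the payload encoding (JSON, bencode, raw pixels), you need framing on the TCP stream so the receiver can find message boundaries. A minimal length-prefix sketch (the 4-byte header is an assumption for illustration, not a standard):

```javascript
// Prefix each message with a 4-byte big-endian length so the receiver
// can split the TCP byte stream back into discrete frames.
function encodeFrame(payload) {
  const body = Buffer.from(payload);
  const header = Buffer.alloc(4);
  header.writeUInt32BE(body.length, 0);
  return Buffer.concat([header, body]);
}

// Pull as many complete frames as possible out of the accumulated buffer;
// returns the frames plus whatever partial data is left over for next time.
function decodeFrames(buf) {
  const frames = [];
  while (buf.length >= 4) {
    const len = buf.readUInt32BE(0);
    if (buf.length < 4 + len) break; // frame not complete yet
    frames.push(buf.subarray(4, 4 + len));
    buf = buf.subarray(4 + len);
  }
  return { frames, rest: buf };
}
```

On the receiving end you would append each socket 'data' chunk to a buffer and run decodeFrames over it, carrying `rest` forward.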

The third technique is next.

You use AVAssetWriter to write a new movie and read it back from the temp file with conventional C routines. This works, but what you can read while the file is still being written is essentially raw H.264: the MP4 is incomplete, because the moov atom is only written when the file is finalised. This is where the fun begins: producing that header yourself. Good luck.

Moving on to technique 4, which appears to have some value.

We create not one but two AVAssetWriters, and we control how they are used with a GCD dispatch queue.

An AVAssetWriter can only be used once, so we start the first on a timer set for, say, 10 seconds, then start the second while tearing down the first. We now have a string of .mov files that each contain a complete moov atom and compressed H.264 video. The server can receive these and stitch them together into a single video feed. Alternatively, a straightforward streamer could take the mov files, encapsulate them in the RTMP protocol using librtmp, and send them to a media server.
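The alternation itself is simple bookkeeping. A sketch of the scheduling rule (in JavaScript since the question's server is Node.js; the function names are hypothetical, and on the device this logic would live in the GCD timer handler):

```javascript
// With 10-second segments and two writers used alternately, any capture
// timestamp t (in seconds since recording started) maps to a segment
// index and an active writer.
const SEGMENT_SECONDS = 10;

function segmentIndex(t) {
  return Math.floor(t / SEGMENT_SECONDS);
}

function activeWriter(t) {
  // Even segments go to writer 0, odd segments to writer 1. While one
  // writer records, the other is finalised (flushing its moov atom)
  // and re-created, since each AVAssetWriter is single-use.
  return segmentIndex(t) % 2;
}
```

The key point is that a writer is only torn down after its replacement has started, so no frames fall between segments.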

That question has been misread several times, but: could we simply send each individual MOV file to another Apple device for device-to-device communication?

It is feasible and relatively simple to connect devices on the same subnet over WiFi. Finding another device over a cellular connection is practically impossible; if it can be done at all, it only works on carrier networks that assign addressable IP addresses, which not all popular carriers do.

Even if you could, there would still be a problem: none of the AVFoundation video players can handle switching between that many separate movie files. You would have to write your own streaming player, most likely based on ffmpeg decoding. (That actually works rather nicely.)
answered Sep 20 by Aditya
• 7,660 points
