H.264 live stream - InputStream

After researching for a few days, I'm still lost on this issue:
I have a webcam connected over WiFi to my Android device.
I wrote an Android app to connect to a specified Socket of the webcam (IP and port). From this Socket I get an InputStream which is already encoded in H.264. Then I redirect this InputStream from the Android device to my server, where I managed to decode it to images/frames using Xuggler.
I would like to stream my webcam live to the internet, to a Flash player or something similar.
I know I have to use Wowza, FMS or RED5 for this.
My problem is that I don't understand how to proceed with the InputStream I have. All the examples I've read need an MP4/FLV or other container file to stream from... but I have a continuous live InputStream.
Some other examples suggest using a Flash encoder, but my InputStream is already encoded in H.264.
This is a general understanding question. Please advise me on how to solve this.
Thank you

You have the following options:
Encode into an FLV container. Yes, you can transmit a live stream in an FLV container: set the 'duration' field in the header to an arbitrarily long value. YouTube, for example, uses this trick for live streaming.
Encode the stream into RTMP. ffmpeg has RTMP code which can be used to understand the protocol, and I believe there are other open-source RTMP muxers available (see the sketch below).
Convert the stream into HLS; there are Flash-based HLS players available.
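Since your H.264 InputStream already lives in Java on the server, one low-effort way to try the RTMP option is to hand the raw stream to ffmpeg and let it do the FLV/RTMP muxing. This is only a rough sketch under a few assumptions: ffmpeg is installed on the server, the camera IP/port and the RTMP URL (rtmp://localhost/live/webcam) are placeholders for your own Wowza/FMS/RED5 endpoint, and the camera really emits a raw H.264 elementary stream.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class CamToRtmp {
    public static void main(String[] args) throws Exception {
        // The webcam socket you already read from on the server side (placeholder IP/port).
        Socket camera = new Socket("192.168.1.50", 5000);
        InputStream h264 = camera.getInputStream();

        // ffmpeg reads the raw H.264 elementary stream on stdin, copies it without
        // re-encoding into an FLV container and publishes it to the RTMP server.
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg",
                "-f", "h264", "-i", "pipe:0",
                "-c:v", "copy",
                "-f", "flv", "rtmp://localhost/live/webcam")
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start();

        // Forward the live stream into ffmpeg until the camera disconnects.
        try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            h264.transferTo(toFfmpeg);
        }
    }
}
```

The same idea works with Xuggler or another Java muxer instead of an external process; the key point is that no intermediate file is needed, the container is written on the fly around the live stream.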

Why Flash, if I may ask? I hope you know that the HTML5 video tag now directly accepts H.264-encoded video.

Related

Change codec of HLS stream in Nginx

Currently I am trying to play the HLS stream that is being sent to my nginx server via RTMP, in Unity with this plugin: AVPro. It is working on my PC, but I want to play it on Android, and it seems like the codec is not supported on Android regardless of which Video API I use. Currently I'm sending the stream with Streamlabs, if that is important.
So I wondered if it is possible to somehow change the codec that nginx is using for the HLS stream.
When I watch the stream on my PC with VLC Media Player and look at the codec information, it tells me this:
Video codec: H264 - MPEG-4 AVC (part 10) (h264)
Audio codec: ADTS
I'm using nginx for Windows, this version: nginx 1.7.11.3 Gryphon.
And if it is not possible with nginx, I would like to know if there is an alternative that can be used as a streaming server. I haven't really found anything yet.

nginx-rtmp video stream seeking functionality

I am trying to build a video streaming service, using nginx-rtmp to read the stream and serve it as DASH/HLS streaming media.
I used this link for live streaming media.
I used OBS to send the stream to nginx.
But it is a live stream, and the user can't seek within it.
So, is there a way for users to actually seek it?
Or is there a better way to stream video using DASH/HLS?

AntMedia HLS streaming delay

I'm sending an RTMP stream named "testStream" to my AntMedia server. This stream can be viewed correctly on the page:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I would like to get the URL of the HLS stream to view the video within a native Android and iOS app. I've never done this before; I assume (indeed, I hope) that HLS is natively supported by both operating systems.
To get the HLS stream, I tried this URL:
https://MYDOMAIN:5443/WebRTCAppEE/streams/testStream.m3u8
It works; I tried that URL with VLC.
The only drawback is the delay: the video stream lags by about ten seconds. Opening the same video in a browser, at the address:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I don't notice any delay, and if there is, it's negligible.
Am I doing something wrong? I welcome advice on how to embed the video into a native Android Studio or Xcode app without delay, keeping the code as simple as possible. Thank you.
Thank you for your question,
https://MYDOMAIN:5443/WebRTCAppEE/player.html
plays the stream with WebRTC, so there is no delay.
You can use https://MYDOMAIN:5443/WebRTCAppEE/play.html?id=testStream on your mobile device to play with WebRTC.
Check the doc below for other options (WebRTC, HLS, etc.):
https://github.com/ant-media/Ant-Media-Server/wiki/Embedded-Web-Player
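If you still want plain HLS inside the native Android app, a minimal sketch is shown below. It assumes a layout file containing a VideoView with the id video_view, and it plays the same testStream.m3u8 URL from the question. Android's MediaPlayer has handled HLS natively since API 14, so no extra library is required, but keep in mind that the roughly ten-second latency is inherent to segmented HLS; that is why the WebRTC player above shows no delay.

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class HlsPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_hls_player);   // hypothetical layout with a VideoView

        VideoView videoView = findViewById(R.id.video_view);
        videoView.setVideoURI(Uri.parse(
                "https://MYDOMAIN:5443/WebRTCAppEE/streams/testStream.m3u8"));
        videoView.setOnPreparedListener(mp -> videoView.start());   // start once buffered
    }
}
```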

Consume RTMP and distribute via WebSocket

I have a Linux PC which streams video (with audio) from a webcam to an RTMP server (nginx). The nginx RTMP server then converts the video into HLS, and that HLS stream is shown in browsers. Everything works well. The only problem is the delay due to the HLS protocol (10-20 seconds depending on the HLS playlist size).
I am looking for an alternative to HLS which can run in most major browsers. I cannot use WebRTC due to the lack of audio, and I cannot use Flash due to the lack of support in mobile browsers. So my question is: is there any way to consume the RTMP stream, then distribute it via WebSocket and play it in modern WebSocket-capable browsers without any additional plugin? I am using ffmpeg to publish the RTMP stream from the Linux PC. If required, the source stream can easily be changed to another live streaming protocol such as RTSP. So if there is some other solution that can solve this problem without RTMP, I can go for that too.
Thanks in advance.
Yes, this is possible, but there's an even simpler solution: just stream the data over HTTP.
WebSockets are only needed for bi-directional communication. You're just sending the video to the client.
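To make the "just stream it over HTTP" idea concrete, here is a minimal sketch using the JDK's built-in com.sun.net.httpserver package: every client that requests /live gets the live bytes relayed with chunked transfer encoding. The liveSource() helper and the localhost:9000 source are hypothetical stand-ins for whatever produces the media bytes (typically ffmpeg remuxing your RTMP feed into a browser-friendly container); the Content-Type is assumed to be MPEG-TS here.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.Socket;

public class HttpRelay {

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/live", HttpRelay::handle);
        server.start();
    }

    static void handle(HttpExchange exchange) {
        try {
            exchange.getResponseHeaders().add("Content-Type", "video/mp2t"); // assumed MPEG-TS payload
            exchange.sendResponseHeaders(200, 0);   // length 0 => chunked transfer encoding
            try (InputStream src = liveSource();
                 OutputStream out = exchange.getResponseBody()) {
                src.transferTo(out);                // push bytes until the client disconnects
            }
        } catch (IOException clientGone) {
            // The viewer closed the connection; nothing else to do.
        } finally {
            exchange.close();
        }
    }

    // Hypothetical source of live media bytes -- here a TCP socket on localhost:9000.
    static InputStream liveSource() throws IOException {
        return new Socket("localhost", 9000).getInputStream();
    }
}
```

On the browser side, such a stream is usually fed into Media Source Extensions by a small JavaScript player rather than being pointed at directly by a video tag.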

How do I set up a live audio streaming http server?

I was hoping to build an application that streams audio (mp3, ogg, etc.) from my microphone to a web browser.
I think I can use the html5 audio tag to read/play the stream from my server.
The area I'm really stuck on is how to set up the streaming HTTP endpoint. What technologies will I need, and how should my server be structured to get the live audio from my mic and make it accessible from my server?
For example, for streaming mp3, do I constantly respond with mp3 frames as they are recorded?
Thanks for any help!
First off, let's split this problem up into a few parts. You have the audio capture (recording), the encoding/codec, the server, and the receiving clients.
Capture -> Codec -> Server -> Several Clients
For audio capture, you will need to use the Web Audio API along with getUserMedia. This will allow you to get 32-bit floating-point PCM samples from the recording device. This data stream takes up a ton of bandwidth... a few megabits per second for a stereo stream. It is not directly playable in an HTML5 audio tag, and while you could play it on the receiving end with the Web Audio API, it uses too much bandwidth to be practical. You need a codec to get the bandwidth usage down.
The codecs you want to look at include MP3, AAC (and its variants such as HE-AAC), and Opus. Not all browsers support all codecs. MP3 is the most widely compatible, but AAC provides better quality at a given bitrate. Opus is a free and open codec but still doesn't have the greatest client adoption. In any case, there isn't yet a codec that you can run in-browser with any real stability. (Although it's being worked on! There are a lot of test projects made with Emscripten.) I solved this problem by reducing the bit depth of my samples to 16-bit signed integers and sending this PCM stream to a server for encoding, over a binary WebSocket.
This encoding server took the PCM stream and ran it through a codec server-side. Here you can use whatever you'd like, such as a licensed codec binary or a tool like FFmpeg which encapsulates multiple codecs.
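As a concrete illustration of that server-side encoding step, here is a rough sketch that accepts the incoming 16-bit PCM and pipes it through ffmpeg to produce MP3 frames. The details are assumptions: it listens on a plain TCP port 9000 rather than a WebSocket to keep the example short, the sample rate and channel count are hard-coded, ffmpeg with libmp3lame must be installed, and the MP3 output is written to a local file where a real server would hand it to Icecast or to connected HTTP clients.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class PcmToMp3Encoder {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(9000);
             Socket client = server.accept()) {

            // ffmpeg reads raw 16-bit little-endian PCM from stdin and writes MP3 frames to stdout.
            Process ffmpeg = new ProcessBuilder(
                    "ffmpeg",
                    "-f", "s16le", "-ar", "44100", "-ac", "2", "-i", "pipe:0",
                    "-codec:a", "libmp3lame", "-b:a", "128k",
                    "-f", "mp3", "pipe:1")
                    .redirectError(ProcessBuilder.Redirect.INHERIT)
                    .start();

            // Feed the incoming PCM to ffmpeg on a separate thread.
            Thread feeder = new Thread(() -> {
                try (InputStream pcm = client.getInputStream();
                     OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
                    pcm.transferTo(toFfmpeg);
                } catch (IOException ignored) {
                    // The capture client went away; ffmpeg will see EOF and finish.
                }
            });
            feeder.start();

            // A real server would relay these MP3 frames to Icecast or to HTTP
            // listeners; this sketch just writes them to a local file.
            try (InputStream mp3 = ffmpeg.getInputStream();
                 OutputStream out = new FileOutputStream("stream.mp3")) {
                mp3.transferTo(out);
            }
        }
    }
}
```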
Next, this server streamed the data to a real streaming media server like Icecast. SHOUTcast and Icecast servers take the encoded stream and relay it to many clients over an HTTP-like connection. (Icecast is HTTP compliant, whereas SHOUTcast is close but not quite there, which can cause compatibility issues.)
Once you have your streaming server set up, it's as simple as referencing the stream URL in your <audio> tag.
Hopefully that gets you started. Depending on your needs, you might also look into WebRTC, which does all of this for you but doesn't give you options for quality and doesn't scale beyond a few users.
