I'm sending an RTMP stream named "testStream" to my Ant Media server. The stream plays correctly on the page:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I would like to get the URL of the HLS stream so I can view the video in native Android and iOS apps. I've never done this before; I assume (and indeed hope) that HLS is natively supported by both operating systems.
To get the HLS stream, I tried this URL:
https://MYDOMAIN:5443/WebRTCAppEE/streams/testStream.m3u8
It works; I tested that URL with VLC.
The only drawback is the latency: the video stream arrives with a delay of about ten seconds. Opening the same video in a browser, at the address:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I don't notice any delay, and if there is one, it's negligible.
Am I doing something wrong? Any advice on embedding the video in a native Android Studio or Xcode app without the delay, keeping the code as simple as possible, is welcome. Thank you.
Thank you for your question.
https://MYDOMAIN:5443/WebRTCAppEE/player.html
plays the stream with WebRTC, so there is no delay.
You can use https://MYDOMAIN:5443/WebRTCAppEE/play.html?id=testStream on your mobile device to play with WebRTC.
Check the doc below for the other options (WebRTC, HLS, etc.):
https://github.com/ant-media/Ant-Media-Server/wiki/Embedded-Web-Player
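Your assumption about native support is correct: iOS plays HLS natively through AVPlayer (it accepts an .m3u8 URL directly), and Android's built-in media stack plays HLS as well. As a minimal sketch on the Android side, assuming the HLS URL from the question, the platform VideoView is enough (ExoPlayer/Media3 is the more robust choice in practice):

    // Minimal HLS playback with the stock Android VideoView (Kotlin).
    // The URL is the testStream.m3u8 address from the question.
    import android.app.Activity
    import android.net.Uri
    import android.os.Bundle
    import android.widget.VideoView

    class PlayerActivity : Activity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            val videoView = VideoView(this)
            setContentView(videoView)
            videoView.setVideoURI(
                Uri.parse("https://MYDOMAIN:5443/WebRTCAppEE/streams/testStream.m3u8"))
            videoView.setOnPreparedListener { it.start() }  // start once buffered
        }
    }

Note that this keeps the HLS segmenting delay; only a WebRTC-based player removes it.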
Related
I am currently trying to play an HLS stream, which is sent to my nginx server via RTMP, in Unity with the AVPro plugin. It works on my PC, but I want to play it on Android, and it seems the codec is not supported on Android regardless of which Video API I use. I am currently sending the stream with Streamlabs, if that is important.
So I wondered whether it is possible to somehow change the codec that nginx uses for the HLS stream.
When I watch the stream on my PC with VLC Media Player and look at the codec information, it tells me this:
Video codec: H264 - MPEG-4 AVC (part 10) (h264)
Audio codec: ADTS
I'm using nginx for Windows, version nginx 1.7.11.3 Gryphon.
And if it is not possible with nginx, I would like to know whether there is an alternative streaming server I could use. I haven't really found anything yet.
I have a Linux PC which streams video (with audio) from a webcam to an RTMP server (nginx). The nginx RTMP server then converts the video into HLS, and that HLS stream is shown in browsers. Everything works well. The only problem is the delay introduced by the HLS protocol (10-20 seconds, depending on the HLS playlist size).
I am looking for an alternative to HLS that runs on most major browsers. I cannot use WebRTC due to the lack of audio, and I cannot use Flash due to its lack of support in mobile browsers. So my question is: is there any way to consume the RTMP stream, distribute it via WebSocket, and play it in modern WebSocket-supporting browsers without any additional plugin? I am using ffmpeg to publish the RTMP stream from the Linux PC. If required, the source stream can easily be changed to another live-streaming protocol such as RTSP. So if there is some other solution that solves this problem without RTMP, I can go for that too.
Thanks in advance.
Yes, this is possible, but there's an even simpler solution: just stream the data over HTTP.
WebSockets are only needed for bi-directional communication; here you're just sending video to the client.
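As a rough sketch of that idea, assuming ffmpeg is on the PATH and using a hypothetical rtmp://localhost/live/stream source: pull the RTMP stream, remux it (no transcode) to MPEG-TS, and pipe the bytes straight into an HTTP response with chunked transfer encoding. The browser then needs a player that accepts the container (e.g. a JS decoder such as jsmpeg, or MSE); this handles a single client only:

    import com.sun.net.httpserver.HttpServer
    import java.net.InetSocketAddress

    fun main() {
        val server = HttpServer.create(InetSocketAddress(8080), 0)
        server.createContext("/live") { exchange ->
            exchange.responseHeaders.add("Content-Type", "video/mp2t")
            exchange.sendResponseHeaders(200, 0)  // length 0 => chunked encoding
            // Remux the incoming RTMP stream to MPEG-TS on stdout, no re-encode.
            val ffmpeg = ProcessBuilder(
                "ffmpeg", "-i", "rtmp://localhost/live/stream",
                "-c", "copy", "-f", "mpegts", "pipe:1"
            ).start()
            ffmpeg.inputStream.copyTo(exchange.responseBody)  // blocks while streaming
            exchange.close()
        }
        server.start()
    }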
After researching for a few days, I'm still lost with this issue:
I have a webcam connected over WiFi to my Android device.
I wrote an Android app that connects to a specified socket of the webcam (IP and port). From this socket I get an InputStream that is already encoded in H.264. I then redirect this InputStream from the Android device to my server, where I managed to decode it into images/frames using Xuggler.
I would like to stream my webcam live to the internet, to a Flash player or something similar.
I know I have to use Wowza, FMS, or Red5 for this.
My problem is that I don't understand how to proceed with the InputStream I have. All the examples I've read need an MP4/FLV or other container file to stream from... but I have a continuous live InputStream.
Some other examples suggest using a Flash encoder, but my InputStream is already encoded in H.264.
This is a general understanding question. Please advise me on how to solve this.
Thank you
You have the following options:
Encode into an FLV container. Yes, you can transmit a live stream in an FLV container: set the 'duration' field in the header to an arbitrarily large value. YouTube, for example, has used this trick for live streaming.
Mux the stream into RTMP. ffmpeg contains RTMP code you can study, and I believe there are other open-source RTMP muxers available (see the sketch after this list).
Convert the stream into HLS; there are Flash-based HLS players available.
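A hedged sketch of the second option, assuming ffmpeg is installed, the InputStream carries a raw Annex-B H.264 elementary stream, and rtmp://localhost/live/cam is a placeholder publish URL: feed the bytes into ffmpeg over stdin and let it mux to FLV and push RTMP, with no re-encoding:

    import java.io.InputStream

    // Pump a live raw-H.264 InputStream into ffmpeg, which muxes to FLV
    // and publishes it to an RTMP server. Blocks until the input ends.
    fun publishToRtmp(h264Input: InputStream) {
        val ffmpeg = ProcessBuilder(
            "ffmpeg",
            "-f", "h264", "-i", "pipe:0",   // raw H.264 arrives on stdin
            "-c", "copy",                   // remux only, no transcode
            "-f", "flv", "rtmp://localhost/live/cam"
        ).redirectErrorStream(true).start()

        h264Input.copyTo(ffmpeg.outputStream)  // camera bytes -> ffmpeg stdin
        ffmpeg.outputStream.close()
    }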
Why Flash, if I may ask? I hope you know that the HTML5 video tag now directly accepts H.264-encoded video.
I have some output from a program that I'd like to stream live to an HTML5 video tag. So far I've used VLC to capture the screen, transcode it to Ogg, and stream it using its built-in HTTP server. It works insofar as I see the desktop image in the browser window.
The catch is this: every time I refresh the page, the video starts from the top, whereas I'd like to see only the current screen, so that I can use it to build a sort of limited remote-desktop solution that lets me control the Ubuntu desktop program from the browser.
I was thinking of WebSockets to send the mouse events to the program, but I'm stuck on how to get the live picture instead of the whole stream.
Thanks in advance!
If you are building the server side as well, I would suggest handling that operation yourself.
What you can do is use MJPEG for the HTML streaming. Write a server application that accepts HTTP connections, sends the MJPEG stream header, and then sends each new picture as it is produced. That way you get a real-time stream in the browser.
This option is good because it gives you control over the stream from the server side, while on the client side it is just an img tag pointing at the MJPEG stream.
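A minimal sketch of such a server, where grabFrame() is a placeholder for however you capture and JPEG-encode the desktop, and the port and frame rate are arbitrary:

    import java.net.ServerSocket

    fun grabFrame(): ByteArray = TODO("capture the desktop and encode one JPEG")

    fun main() {
        val server = ServerSocket(8090)
        while (true) {
            val client = server.accept()
            Thread {
                try {
                    client.outputStream.use { out ->
                        // One HTTP response that never ends: each part replaces
                        // the previous image in the browser.
                        out.write(("HTTP/1.1 200 OK\r\n" +
                            "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n\r\n")
                            .toByteArray())
                        while (true) {
                            val jpeg = grabFrame()
                            out.write(("--frame\r\n" +
                                "Content-Type: image/jpeg\r\n" +
                                "Content-Length: ${jpeg.size}\r\n\r\n").toByteArray())
                            out.write(jpeg)
                            out.write("\r\n".toByteArray())
                            out.flush()
                            Thread.sleep(100)  // ~10 fps
                        }
                    }
                } catch (_: Exception) {
                    // client disconnected
                }
            }.start()
        }
    }

On the client side, an img element whose src points at this port is all that is needed.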
Regarding WebSockets: yes, you can build that, but you will have to implement the input-device control on the remote computer side yourself.
Here is an MJPEG streaming server that might be interesting to you: http://www.codeproject.com/Articles/371955/Motion-JPEG-Streaming-Server
Is HTTP streaming possible without using any streaming servers?
Of course. You can output and flush: the data reaches the client before you end the script, so it is streaming.
For live streaming, only segmented delivery like Apple HLS works; other variants of segmented streaming (like OSMF) are not widely supported at the moment.
IIS from Microsoft can also do Smooth Streaming (and Apple HLS as well).
Apple HLS can be served from any web server: pre-segment the stream into chunks and just upload them to a web-server path.
For VoD streaming, there are lots of modules for all the major web servers.
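A sketch of that pre-segmenting step, assuming ffmpeg is installed and the paths are placeholders: let ffmpeg's HLS muxer cut the source into .ts chunks plus an .m3u8 playlist, then upload the output directory to any plain web server:

    // Repackage (no transcode) a source into HLS segments + playlist.
    fun segmentToHls(input: String, outDir: String) {
        ProcessBuilder(
            "ffmpeg", "-i", input,
            "-codec", "copy",        // keep the existing codecs
            "-f", "hls",
            "-hls_time", "4",        // ~4-second segments
            "-hls_list_size", "0",   // keep every segment in the playlist
            "$outDir/stream.m3u8"
        ).inheritIO().start().waitFor()
    }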
Yes, although libraries have varying levels of support. What needs to be used is HTTP chunking, so that the library does not try to buffer the whole request/response in memory (to compute the Content-Length header) and instead indicates that the content comes in chunks.
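To illustrate the mechanism, a tiny sketch using the JDK's built-in HTTP server (port and payload are arbitrary): passing 0 as the response length switches it to Transfer-Encoding: chunked, so every flush reaches the client immediately instead of being buffered:

    import com.sun.net.httpserver.HttpServer
    import java.net.InetSocketAddress

    fun main() {
        val server = HttpServer.create(InetSocketAddress(8081), 0)
        server.createContext("/ticks") { exchange ->
            exchange.sendResponseHeaders(200, 0)  // 0 => chunked, unknown length
            exchange.responseBody.use { body ->
                repeat(10) { i ->
                    body.write("chunk $i\n".toByteArray())
                    body.flush()                  // one chunk out per flush
                    Thread.sleep(1000)
                }
            }
        }
        server.start()
    }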
Yes. Not only is it possible, it has been implemented by various media-server companies; the only reason they still use dedicated servers is commercial. Basically, the content you want to stream is divided into chunks/packets, and the client machine can then request those chunks via simple HTTP GET requests.
Well, if you have WebSockets available, you can get quite low-latency streaming for low-fps scenarios by sending the video frames as JPEGs.
You can also send the audio separately and play it using WebAudio in the browser. I imagine this could work for scenarios where you do not require perfect audio-video sync.
Another approach is to stream MPEG chunks through WebSockets, decode them in JS with jsmpeg, and render to a canvas. You can find more here (video-only):
http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets
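A hedged sketch of the JPEG-over-WebSocket variant, assuming the third-party org.java-websocket:Java-WebSocket library (any WebSocket server would do) and a placeholder grabJpegFrame() that captures and encodes one frame; each frame goes out as one binary message, which the browser can draw onto a canvas:

    import org.java_websocket.WebSocket
    import org.java_websocket.handshake.ClientHandshake
    import org.java_websocket.server.WebSocketServer
    import java.net.InetSocketAddress

    fun grabJpegFrame(): ByteArray = TODO("encode the current video frame as a JPEG")

    class FrameServer(port: Int) : WebSocketServer(InetSocketAddress(port)) {
        override fun onOpen(conn: WebSocket, handshake: ClientHandshake) {}
        override fun onClose(conn: WebSocket, code: Int, reason: String, remote: Boolean) {}
        override fun onMessage(conn: WebSocket, message: String) {}
        override fun onError(conn: WebSocket?, ex: Exception) = ex.printStackTrace()
        override fun onStart() {}
    }

    fun main() {
        val server = FrameServer(9000)
        server.start()
        while (true) {
            server.broadcast(grabJpegFrame())  // one JPEG per binary message
            Thread.sleep(66)                   // ~15 fps
        }
    }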
Yes, the answer to your problem with HTTP streaming is MPEG-DASH.