Proxy an RTMP stream - nginx

How can I proxy an RTMP stream?
I have two Raspberry Pis streaming live video from raspicams on my LAN. Each Raspberry Pi pipes the video to ffmpeg, which wraps it in FLV and sends it to crtmpserver.
A third server, running nginx, hosts a static HTML page with two instances of jwplayer, each pointing to one Raspberry Pi.
The setup is just like this one.
The web server uses authentication, and I'd like the streams not to be public either.
I'm thinking of trying nginx-rtmp-module, but I am not sure if it would help me. Also, it seems dormant and has many open issues.
I'm open to suggestions, thanks in advance!

You can use MonaServer with this client (copy it into the www/ directory of MonaServer), which listens on UDP port 6666, waits for an FLV stream, and publishes it under the name "file".
Then you should already be able to play your stream with jwplayer (with the address rtmp:///file) or with any other player. MonaServer supports HTTP, so you can host your HTML page without nginx if you want.
Now, if you want to filter subscriptions to "file", you need to write a client:onSubscribe function in your main.lua script, like this:
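As a quick sanity check of the publishing side, the UDP push can be sketched in a few lines of Python. This is a hypothetical helper (the host, file path, and chunk size are illustrative); in practice you would simply point ffmpeg's output at a udp:// URL for port 6666 instead:

```python
import socket

def send_flv_udp(path, host="127.0.0.1", port=6666, chunk=1316):
    """Push a local FLV file to a UDP listener in small datagrams.

    host/port/chunk are assumptions for illustration; MonaServer's
    listener in the setup above would be on UDP port 6666.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break  # end of file
            sock.sendto(data, (host, port))
            sent += len(data)
    sock.close()
    return sent
```

For a live camera you would not do this by hand; ffmpeg can write its FLV output directly to a UDP destination.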
function onConnection(client)
    INFO("Connection from ", client.address)
    function client:onSubscribe(listener)
        INFO("Subscribing to ", listener.publication.name, "...")
        if not client.right then
            error("no rights to play it")
        end
    end
end
(Here you would replace the "not client.right" check with your own authentication logic.)
Going further, you could use another Flash video client that supports RTMFP in order to handle a large number of clients. Contact me (jammetthomas AT gmail.com) for more information.

Related

Serve HLS with raspap/lighttpd rather than nginx

I currently serve audio/video from a Raspberry Pi using the picam project, along with nginx to stream it as HLS (HTTP Live Streaming), as detailed in the project page. Thus, in /etc/nginx/sites-available/default I add:
location /hls/ {
    root /run/shm;
}
Then, I can access my stream (for example with VLC player) at http://mypi.local/hls/index.m3u8.
However, I no longer wish to rely on my Internet box to stream; I would like my client(s) to connect directly to the Pi. I have therefore recently tried RaspAP to turn my Raspberry Pi into a hotspot.
However, as RaspAP seems to use lighttpd as its web server, I am wondering how I can still serve my audio/video stream the way it is currently done with picam and nginx.
In lighttpd, the equivalent of the nginx location block is an alias:
server.modules += ("mod_alias")
alias.url = ( "/hls/" => "/run/shm/" )

Media (video) negotiation in Asterisk

First, let me describe the call flow and the nodes involved.
UA1 <--------------> Proxy1 (Kamailio)/RTPProxy1 <-------------------> Asterisk <-------------> Proxy2 (Kamailio)/RTPProxy2 <---------> UA2
Currently, Asterisk acts as a B2BUA, and location lookup/registration is handled by the proxies. Asterisk sits in both the signaling path and the media (audio) path.
Problem Statement:
Asterisk should be in the audio path but not the video path when the call is an audio+video call. So audio should flow UA1 to RTPProxy1 to Asterisk to RTPProxy2 to UA2 and back, while video should flow UA1 to RTPProxy1 to RTPProxy2 to UA2, bypassing Asterisk.
Question:
Can Asterisk be configured/programmed so that for video it negotiates the IP/port of RTPProxy1/2, while for audio it negotiates its own IP and port as it currently does?
Thanks
Abhijit
No; Asterisk's video support is very limited. The negotiation options are the same as for audio, so video will be handled the same way as audio.
If you want to treat them differently, create TWO calls: one audio-only call and one video call without audio.
However, if you use Kamailio as the proxy, it is in theory POSSIBLE to do what you want, but it is very unlikely that your UA supports it (at least I have never heard of anything like that).

HLS Nginx and Streaming Media

This is a process question more than anything.
I've been reading up on HTTP streaming (HLS) over the past week.
My goal is to be able to deliver content from my NGINX web server using HLS.
I have looked at Clappr with HLS.js as a player, but I'm just unclear on what I need to do to deliver the content. Do I need a streaming media server? Just a web server?
I think I can use ffmpeg to make the HLS streams.
Eventually I'm hoping to be able to record incoming streams for processing later. Right now I just want to be able to put out HLS streams.
Any advice or infographic or something to put this in perspective would be appreciated.
Do I need a streaming media server? Just a web server?
One of the main advantages of HLS is that you can serve it with any HTTP server: the content is effectively a playlist plus the media chopped into chunk files. No special streaming server is needed. Nginx is fine.
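To make that concrete, here is a hypothetical sketch of what an HLS media playlist contains. The segment names are made up; in practice ffmpeg's hls muxer writes both the .ts segments and the .m3u8 playlist for you:

```python
# Minimal sketch: an HLS "stream" is just a playlist plus media segments,
# so any static web server (nginx included) can deliver it.
# The segment file names below are hypothetical.

def make_playlist(segments, target_duration=4, ended=True):
    """Build a minimal HLS media playlist (.m3u8) as a string."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for name, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")  # segment duration in seconds
        lines.append(name)                         # relative URL of the segment
    if ended:
        lines.append("#EXT-X-ENDLIST")  # marks a finished (VOD) playlist
    return "\n".join(lines) + "\n"

playlist = make_playlist([("seg0.ts", 4.0), ("seg1.ts", 4.0), ("seg2.ts", 2.5)])
print(playlist)
```

The player (Clappr with HLS.js) simply fetches the playlist and then each segment over plain HTTP, which is why a static web server is all you need on the delivery side.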

TCP > COM1 for receiving messages and displaying on POS display pole

I currently have a Java Applet running on my web page that communicates to a display pole via COM1. However since the Java update I can no longer run self-signed Java Applets and I figure it would just be easier to send an AJAX request back to the server and have the server send a response to a TCP port on the computer...the computer would need a TCP > COM virtual adapter. How do I install a virtual adapter to go from a TCP port to COM1?
I've looked into com0com and that is just confusing as hell to me, and I don't see how to connect any ports to COM1. I've tried tcp2com but it doesn't seem to install the service in Windows 7 x64. I've tried com2tcp and the interface seems like it WOULD work (I haven't tested), but I don't want an app running on the desktop...it needs to be a service that runs in the background.
So to summarize how it would work:
Web page on comp1 sends AJAX request to server
Server sends text response to comp1 on port 999
comp1 has virtual COM port listening on port 999, sends data to COM1
pole displays data
EDIT: I'm using Win 7 x64 and tcp2com doesn't work as a service. I tried using srvany, but I get an error stating that the application started then stopped. If I use PowerShell and pass tcp2com as an argument, it doesn't quit, but it also doesn't run. So I nixed the whole 'service' deal and used the command powershell -windowstyle hidden "tcp2com --test tcp/999 com1", and it works... sort of. The characters that get sent are all effed. I can write "echo WTF > COM1" on another computer which has COM2TCP (a different vendor) and it'll come up as a single block on the POS display pole. However, if I use COM2TCP on both the server and client machines, everything works fine... but that's only a trial version, and it costs several hundred dollars!
On another note, is there a way to send the raw text over IP without having to use another virtual COM > IP adapter on the other computer? Sort of like how curl works, but different...?
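For what it's worth, the TCP-to-COM relay itself is simple enough to sketch. The version below accepts one connection and copies the incoming bytes to any writable object; in production that object would be a pyserial handle such as serial.Serial("COM1", 9600) (the port name and baud rate are assumptions, and pyserial itself is not exercised here):

```python
import socket

def bridge_once(listen_port, com_port):
    """Accept one TCP connection and copy its bytes to com_port.

    com_port is any object with a write() method; in production this
    would be an open serial port handle for COM1.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", listen_port))
    srv.listen(1)
    conn, _ = srv.accept()
    try:
        while True:
            data = conn.recv(4096)
            if not data:
                break             # client closed the connection
            com_port.write(data)  # forward the bytes to the display pole
    finally:
        conn.close()
        srv.close()
```

Run as a background thread (or wrapped as a Windows service), this would sit on port 999 and push whatever the server sends straight to the display pole, with no character translation in between.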
After a somewhat exhaustive search, I came across a program called 'piracom'. It's a very simple app that lets you specify port settings for the express purpose of connecting a serial port to a listening port over the network. So this is IP > Serial. For Serial > IP I used HW-VSP3-Single, as even the piracom website says it's compatible. I've tested it and it works!
I just put a shortcut to piracom in the startup folder of my user account; the app runs off of a .ini that it updates every time you make a change...so if you run the server and hide it, on the next reboot of the pc it'll start up running and hidden with all prior settings. Easy.
Now it's a matter of installing HW-VSP3 on the server and adding a method to the Rails app that writes to the virtual COM port. The only issue I can see right now is that writing echo \14Test This! > COM3 actually prints the \14; if I do that in my Java applet, it sends the "go to beginning" signal.
Addendum 1: The \14 problem was fixed by using the serialport gem for RoR. I created a controller method that returns head :no_content and then sends the data to the COM port. Calls to this method are made via jQuery's $.ajax using the "HEAD" HTTP method. Apparently, though, I had to add the GET verb in the Rails routes because HEAD isn't supported for some gimpy reason.
Addendum 2: Some garbage data was being sent to the display pole at the end of the string...turns out I needed to turn off the "NVT" option in HW-VSP3. Also keep in mind that firewalls need to be modified to allow communication.

How to live stream a desktop to html5 video tag

I have some output from a program I'd like to stream live to a html5 video tag. So far I've used VLC to capture the screen, transcode it to ogg, and stream it using its built-in http server. It works insofar that I see the desktop image in the browser window.
The catch is this: every time I refresh the page, the video starts from the top, whereas I'd like to see only the current screen, so that I can build a sort of limited remote-desktop solution that lets me control the Ubuntu desktop program from the browser.
I was thinking websockets to send the mouse events to the program, but I'm stuck on how to get the live picture instead of the whole stream.
Thanks in advance!
If you are building the server side as well, I would suggest handling that operation yourself.
What you can do is use MJPEG for HTML streaming: write a server application that accepts HTTP connections, sends the MJPEG stream header once, and then sends a new picture for every update. That way you get a realtime stream in the browser.
This option is good because it gives you control over the stream on the server side, while on the client side it is just an <img> tag pointing at the MJPEG stream.
Regarding WebSockets: yes, you can build that, but you will have to implement input-device control on the remote computer side.
Here is server of streaming MJPEG that might be interesting to you: http://www.codeproject.com/Articles/371955/Motion-JPEG-Streaming-Server
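The multipart framing such a server sends is small enough to show. A minimal sketch (the boundary string is arbitrary, and the JPEG bytes would come from your screen-capture code, which is not shown):

```python
BOUNDARY = b"frame"  # arbitrary boundary string, echoed in the header

def mjpeg_headers():
    """HTTP response header telling the browser to expect an MJPEG stream."""
    return (b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: multipart/x-mixed-replace; boundary=" + BOUNDARY +
            b"\r\n\r\n")

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG image as a multipart part; send one per screen update."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n"
            b"\r\n" + jpeg_bytes + b"\r\n")
```

The server writes mjpeg_headers() once on the accepted socket, then mjpeg_part(frame) for every captured frame; the browser keeps replacing the image in the <img> tag with the latest part it receives.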
