React Native WebRTC remote stream not working - Firebase

I'm creating a video chat app. I just cloned this repo to understand how it works, but it isn't showing the remote stream.
In Firebase there are 4 ICE candidates (2 video, 2 audio) from the caller side, but only 2 audio candidates from the callee side and none for video.
There are no errors in the console.
The STUN servers used are:
iceServers: [
  {
    urls: ['stun:stun1.l.google.com:19302', 'stun:stun2.l.google.com:19302']
  },
],
I'm completely new to WebRTC, so please help me figure out what the problem could be.
Thank you.
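(A note for readers with the same symptom: audio-only ICE candidates from the callee usually mean the callee never attached a local video track before creating its answer. Below is a minimal sketch of a callee flow using a recent react-native-webrtc; the function name and the Firebase signaling details are assumptions, not taken from the cloned repo.)

import { mediaDevices, RTCPeerConnection } from 'react-native-webrtc';

const pc = new RTCPeerConnection({
  iceServers: [
    { urls: ['stun:stun1.l.google.com:19302', 'stun:stun2.l.google.com:19302'] },
  ],
});

// Hypothetical callee flow: capture audio AND video before answering.
// If video is false here, or the camera permission is denied, the answer
// SDP carries no video m-line and no video ICE candidates are gathered.
async function answerCall(remoteOffer) {
  const localStream = await mediaDevices.getUserMedia({ audio: true, video: true });
  localStream.getTracks().forEach(track => pc.addTrack(track, localStream));

  await pc.setRemoteDescription(remoteOffer);
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  return answer; // send back to the caller via Firebase
}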

Related

AntMedia HLS streaming delay

I'm sending an RTMP stream named "testStream" to my AntMedia server. This stream can be viewed correctly on the page:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I would like to get the URL of the HLS stream to view the video within native Android and iOS apps. I've never done this before; I assume (indeed, I hope) that HLS is natively supported by both operating systems.
To get the HLS stream, I tried this URL:
https://MYDOMAIN:5443/WebRTCAppEE/streams/testStream.m3u8
It works; I tried that URL with VLC.
The only drawback is the delay: the video stream lags by about ten seconds. Opening the same video with a browser, at the address:
https://MYDOMAIN:5443/WebRTCAppEE/player.html
I don't notice any delay, and if there is one, it's negligible.
Am I doing something wrong? I'm open to advice on embedding the video into a native Android Studio or Xcode app without delay, keeping the code as simple as possible. Thank you.
Thank you for your question.
https://MYDOMAIN:5443/WebRTCAppEE/player.html
plays the stream with WebRTC, so there is no delay. The ten-second lag you see is inherent to HLS, which buffers several video segments before playback starts.
You can use https://MYDOMAIN:5443/WebRTCAppEE/play.html?id=testStream on your mobile device to play with WebRTC.
Check the doc below for the other options (WebRTC, HLS, etc.):
https://github.com/ant-media/Ant-Media-Server/wiki/Embedded-Web-Player
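If a fully native player turns out to be more than you need, one low-effort option is to load that WebRTC play page in a WebView. A rough sketch with react-native-webview (the component and props are from that library, but treat the whole approach as an assumption rather than an Ant Media recommendation):

import React from 'react';
import { WebView } from 'react-native-webview';

// Loads the Ant Media WebRTC player page, avoiding the HLS segment delay.
export default function LivePlayer() {
  return (
    <WebView
      source={{ uri: 'https://MYDOMAIN:5443/WebRTCAppEE/play.html?id=testStream' }}
      mediaPlaybackRequiresUserAction={false} // let the stream autoplay
      allowsInlineMediaPlayback={true}        // iOS: play inline, not fullscreen
    />
  );
}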

Converting HTTP requests to MQTT and back again for smart home integration

We have an MQTT setup already running for communication between smart home devices and a remote server, for remotely controlling the devices. Now we want to integrate our devices with Google Home and Alexa. These two use HTTP for communication with third-party device clouds.
I have implemented this for Google Home: after the device cloud receives the request, it is converted to MQTT and sent to the smart home device. The device cloud then waits a few seconds for a reply from the device. If no reply arrives within the predefined time, it sends a failure HTTP response to Google Home; otherwise it forwards the received reply.
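For illustration, here is a minimal sketch of that request/reply flow in Node.js with the express and mqtt packages; the broker address, topic layout, timeout, and payload shapes are all assumptions:

const express = require('express');
const mqtt = require('mqtt');

const app = express();
const client = mqtt.connect('mqtt://broker.example.com'); // hypothetical broker

app.use(express.json());

app.post('/smarthome', (req, res) => {
  const deviceId = req.body.deviceId;
  const replyTopic = `devices/${deviceId}/reply`; // hypothetical topic layout

  const onMessage = (topic, payload) => {
    if (topic !== replyTopic) return;
    cleanup();
    res.json(JSON.parse(payload.toString())); // forward the device's reply
  };
  const cleanup = () => {
    clearTimeout(timer);
    client.removeListener('message', onMessage);
    client.unsubscribe(replyTopic);
  };
  // Send a failure response if the device does not answer in time.
  const timer = setTimeout(() => {
    cleanup();
    res.status(504).json({ status: 'ERROR', reason: 'device timeout' });
  }, 5000);

  client.on('message', onMessage);
  client.subscribe(replyTopic);
  client.publish(`devices/${deviceId}/command`, JSON.stringify(req.body.command));
});

app.listen(3000);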
Is there a better way to handle this? Since this is a commercial project, I want to implement it the right way.
Any help will be appreciated.
Thanks
We're using AWS IoT and I think it's a good way to handle IoT issues. Some of its features:
- Certification: each device is a "thing" with its own attached policy, which gives you security.
- Shadow: a JSON document holding the device's current state. The Device Shadow service acts as an intermediary, allowing devices and applications to retrieve and update a device's shadow (a minimal example follows this list).
- Serverless: we use Lambda to build the skill and the servers, which is flexible.
- Rules: we use them to intercept MQTT messages so that we can report device state changes to Google and Alexa. By the way, for Google, Report State implementation has become mandatory for all partners to launch and certify.
- Protocols: you can choose either MQTT or HTTP.
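For the Shadow point above, a minimal shadow state document for a hypothetical smart plug looks like this; AWS IoT computes the delta between desired and reported and pushes it to the device:

// "desired" is what apps request; "reported" is what the device last said.
const shadow = {
  state: {
    desired:  { power: 'on',  brightness: 80 },
    reported: { power: 'off', brightness: 80 },
  },
};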
It’s time-consuming but totally worth it! We've sold 8k+ products, so far so good.
At least Google Home doesn't really require synchronous operation there. Once you get the EXECUTE intent via their API, you just need to send it to your device (which doesn't necessarily have to report its state back synchronously).
Once its state changes, you either store it for later QUERY intents or provide this data to the Google Homegraph server using the "Report State" interface.
I'm developing gBridge.io as a project providing quite similar functionality (but for another target group). There, it is strictly split as described: an HTTP endpoint listener reacts to commands from Google Home and sends them to a cache, from where they are eventually forwarded to the matching MQTT topic. Another worker listens to the users' MQTT topics and stores their information in the cache, so it can be sent back to Google when required.
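To make that split concrete, here is a rough Node.js sketch of the MQTT-side worker and a cache-backed QUERY handler; the broker address, topic layout, and payloads are invented for illustration:

const mqtt = require('mqtt');

const stateCache = new Map(); // deviceId -> last reported state

const client = mqtt.connect('mqtt://broker.example.com'); // hypothetical broker
client.subscribe('devices/+/state');                      // hypothetical topics
client.on('message', (topic, payload) => {
  const deviceId = topic.split('/')[1];
  stateCache.set(deviceId, JSON.parse(payload.toString()));
});

// The HTTP side can then answer a Google Home QUERY intent straight
// from the cache, without waiting on the device.
function handleQueryIntent(deviceIds) {
  const devices = {};
  for (const id of deviceIds) {
    devices[id] = stateCache.get(id) || { online: false };
  }
  return { devices };
}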

How to push a local video when there is no livestream

I'm currently looking at an nginx RTMP server which sends a video stream to Twitch.
But (obviously there is a but), I'd like the server to do the following:
- If no one is sending a stream with OBS or XSplit, the server sends a local video to Twitch.
- If someone sends a stream, the server switches to it without disconnecting from Twitch.
- The server switches back to the local video when the stream stops.
I think on_play or on_connect are involved, but I'm not sure how they work.
If anybody has any idea, that would be great.
P.S.: A small offline delay during the switch is acceptable, because I could have a computer take the stream from the server and then send it to Twitch, so I'd just get a few seconds of black screen, I guess.
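One way to wire this up: nginx-rtmp's on_publish / on_publish_done callbacks can drive a small controller that loops a local file with ffmpeg whenever nobody is publishing. A rough Node.js sketch; the stream names, file path, and ports are made up, and the nginx config that actually pushes to Twitch is omitted:

const express = require('express');
const { spawn } = require('child_process');

const FALLBACK_NAME = 'fallback'; // stream key used only by the looped file
let fallback = null;

function startFallback() {
  // -re paces the file in real time; -stream_loop -1 repeats it forever.
  fallback = spawn('ffmpeg', [
    '-re', '-stream_loop', '-1', '-i', '/var/media/fallback.mp4',
    '-c', 'copy', '-f', 'flv', `rtmp://localhost/live/${FALLBACK_NAME}`,
  ]);
}

function stopFallback() {
  if (fallback) { fallback.kill('SIGKILL'); fallback = null; }
}

const app = express();
app.use(express.urlencoded({ extended: false })); // nginx-rtmp posts form data

// A real publisher (OBS/XSplit) arrived: stop the looped file.
app.post('/on_publish', (req, res) => {
  if (req.body.name !== FALLBACK_NAME) stopFallback();
  res.sendStatus(200);
});

// The real publisher left: resume the looped file.
app.post('/on_publish_done', (req, res) => {
  if (req.body.name !== FALLBACK_NAME) startFallback();
  res.sendStatus(200);
});

startFallback();
app.listen(8080);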

Proxy an RTMP stream

How could I proxy an RTMP stream?
I have two Raspberry Pis streaming live video from Raspicams on my LAN. Each Raspberry Pi sends the video to ffmpeg, which wraps it in FLV and sends it to crtmpserver.
A third server, using nginx, has a static HTML page with two instances of jwplayer, each pointing to one Raspberry Pi.
The setup is just like this one.
The web server uses authentication, and I'd like the streams not to be public either.
I'm thinking of trying nginx-rtmp-module, but I'm not sure if it would help me. Also, it seems dormant and has many open issues.
I'm open to suggestions, thanks in advance!
You can use MonaServer with this client (copy it into the www/ directory of MonaServer), which listens on UDP port 6666 and waits for an FLV file to publish it with the name "file".
Then you should already be able to play your stream with jwplayer (at the address rtmp:///file) or with any other player. MonaServer supports the HTTP protocol, so you can host your HTML page without nginx if you want.
Now, if you want to filter the subscriptions to "file", you need to write a client:onSubscribe function in your main.lua script, like this:
function onConnection(client)
  INFO("Connection from ", client.address)
  -- called each time this client subscribes to a publication
  function client:onSubscribe(listener)
    INFO("Subscribing to ", listener.publication.name, "...")
    if not client.right then
      error("no rights to play it") -- raising an error rejects the subscription
    end
  end
end
(Here you need to replace the "not client.right" check with your own authentication function for your purpose.)
Going further, you could use another Flash video client that supports RTMFP in order to handle a large number of clients. Contact me (jammetthomas AT gmail.com) for more information.

How to live stream a desktop to an HTML5 video tag

I have some output from a program that I'd like to stream live to an HTML5 video tag. So far I've used VLC to capture the screen, transcode it to Ogg, and stream it using its built-in HTTP server. It works insofar as I see the desktop image in the browser window.
The catch is this: every time I refresh the page, the video starts from the beginning, whereas I'd like to see only the current screen, so that I can build a sort of limited remote desktop solution that lets me control the Ubuntu desktop program from the browser.
I was thinking of WebSockets to send the mouse events to the program, but I'm stuck on how to get the live picture instead of the whole stream.
Thanks in advance!
If you are building the server side as well, I would suggest handling that operation yourself.
What you can do is use MJPEG for the HTML streaming: write a server application that accepts HTTP connections, sends the header of an MJPEG stream, and then sends a new picture on every update. That way you get a realtime stream in the browser.
This option is good because it gives you control over the stream from the server side, and on the client side it is just an <img> tag pointing at the MJPEG stream.
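As a concrete illustration of that approach, here is a minimal Node.js MJPEG server; the port and the capture source (a screenshot file re-read on a timer) are placeholders:

const http = require('http');
const fs = require('fs');

const BOUNDARY = 'frame';
const clients = new Set();

// Every connected browser gets a multipart stream whose parts replace
// each other; each part is one JPEG frame.
http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': `multipart/x-mixed-replace; boundary=${BOUNDARY}`,
    'Cache-Control': 'no-cache',
  });
  clients.add(res);
  req.on('close', () => clients.delete(res));
}).listen(8081);

// Push one JPEG frame to all connected clients.
function pushFrame(jpegBuffer) {
  for (const res of clients) {
    res.write(`--${BOUNDARY}\r\nContent-Type: image/jpeg\r\n` +
              `Content-Length: ${jpegBuffer.length}\r\n\r\n`);
    res.write(jpegBuffer);
    res.write('\r\n');
  }
}

// Demo source: re-send a screenshot file a few times per second.
setInterval(() => pushFrame(fs.readFileSync('/tmp/screen.jpg')), 200);

On the client side, <img src="http://yourserver:8081/"> is all that's needed.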
Regarding WebSockets: yes, you can build that, but you will have to implement input device control on the remote computer's side.
Here is server of streaming MJPEG that might be interesting to you: http://www.codeproject.com/Articles/371955/Motion-JPEG-Streaming-Server
