I have a video that I want to broadcast over a network using the HTTP protocol.
I know libVLC can do that, but I haven't been able to figure out how. I checked the Doxygen documentation, but it hasn't helped me. Can you help me, please?
Thanks
libVLC is the library for developing your own application; the VLC player is the client AND server application built on it. If you just need to stream, use VLC media player as a server. You can find the command-line / GUI steps if you google "vlc how to stream".
Basically, in the file-open dialog you get the option either to load a stream from another source or a local file, OR to run your own instance as a streaming server.
The play button at the bottom of the open dialog has a small button on the right to select "Stream" instead of play. But you need to have configured all options correctly to set up the type of stream you are looking for.
Lastly, you can run another instance of VLC as a client to test your stream locally.
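If you specifically want to drive this from your own code with libVLC, the VLM API can set up an HTTP broadcast. Here is a minimal sketch; the input path, port, and transcode options are placeholders to adapt:

    // libVLC HTTP streaming sketch (placeholder path/port, Linux-style build):
    //   g++ stream.cpp -o stream -lvlc
    #include <vlc/vlc.h>
    #include <unistd.h>

    int main(void)
    {
        libvlc_instance_t *vlc = libvlc_new(0, NULL);
        if (!vlc)
            return 1;

        // Transcode to H.264/MP3 and serve as MPEG-TS over HTTP on port 8080;
        // clients then open http://<server-ip>:8080/stream in another VLC.
        libvlc_vlm_add_broadcast(vlc,
            "mybroadcast",                         // media name
            "file:///path/to/video.avi",           // input (placeholder)
            "#transcode{vcodec=h264,acodec=mp3}"
            ":standard{access=http,mux=ts,dst=:8080/stream}",
            0, NULL,                               // no extra options
            1,                                     // enabled
            0);                                    // do not loop

        libvlc_vlm_play_media(vlc, "mybroadcast");

        sleep(60);                                 // stream for a while

        libvlc_vlm_stop_media(vlc, "mybroadcast");
        libvlc_vlm_release(vlc);
        libvlc_release(vlc);
        return 0;
    }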
I would like to know where I can find (if it exists) the video file for a call with video.
I already have everything running properly:
Call recording enabled for extensions as Force
Call recording enabled for routes as Force
Video Support enabled
Video Codecs checked
3CX Softphone calling with video
3CX recording the video locally
WAV file available in FreePBX CDR Reports
Now I would like to find the video file (MPEG, H.264, etc.).
Am I missing some config, or can it not be done?
I'm running on FreePBX 13.0.194.2 and Asterisk 13.
Thanks in advance,
I think that at the moment MixMonitor (from version 1.8 on) uses audiohooks, so it is not able to record video.
Try the Record application (not sure, but it may work).
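As a rough dialplan sketch of that idea (the extension and path are hypothetical; whether Record() actually captures the video stream depends on your Asterisk version and which format modules are loaded):

    ; extensions.conf (untested sketch; 1234 and the path are placeholders)
    exten => 1234,1,Answer()
     same => n,Record(/var/spool/asterisk/monitor/video-test.h264)
     same => n,Hangup()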
I'm developing an ASP.NET application that is required to stream on-demand videos from server to client. I am considering using DirectShow to do some processing work before the video is transmitted over the Internet. Following this article, I know I can transfer the video stream over the network through WMAsfWriter after it has been processed by DirectShow, and the output is a URL that the client can access through Windows Media Player. But in my ASP.NET application, I want the video stream played on a web page in the client's browser, such as Chrome. I'm not sure whether the output URL can be parsed by the browser and the video stream played there directly, so I want to ask: is it possible? If not, what extra steps do I need to take to achieve my goal?
I think you can make a WebRTC streamer DirectShow filter and open the stream in the browser. Approaches like WMP / the VLC player plugin require ActiveX, which is really a dead technology now; even Microsoft Edge does not support it anymore. WebRTC is the most common way today; the web version of Skype and a lot of other apps use it.
I am building an app with a JavaScript layer on top that uses QtWebKit to occasionally access web pages; however, if I start my app without a wireless connection and then set up a connection (using ifup, then a device connect, then dhcpcd to get the gateway set up), the JavaScript remains oblivious to the gateway to the wider world.
QNetworkConfigurationManager in my Qt app reports that the connection is up after I call updateConfigurations(), and the IP and netmask are reported correctly as well.
wget from the command prompt happily obtains whatever webpage I ask it to get.
But if I create an instance of QNetworkAccessManager, then
manager->get(QNetworkRequest(QUrl("http://www.google.com/index.html")));
replies that it cannot reach the page.
Do I need to poke QtWebKit somehow for the JS layer to update its configuration as well?
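For reference, a minimal self-contained version of the failing check (a sketch assuming Qt 4.x with the QtNetwork module; only the URL comes from above, the rest is boilerplate):

    // repro sketch (Qt 4.x): issue the GET and print why it failed.
    // Build with qmake and QT += network in the .pro file.
    #include <QtCore/QCoreApplication>
    #include <QtCore/QUrl>
    #include <QtCore/QDebug>
    #include <QtNetwork/QNetworkAccessManager>
    #include <QtNetwork/QNetworkRequest>
    #include <QtNetwork/QNetworkReply>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        QNetworkAccessManager manager;
        QNetworkReply *reply =
            manager.get(QNetworkRequest(QUrl("http://www.google.com/index.html")));

        // Leave the event loop once the request has finished.
        QObject::connect(reply, SIGNAL(finished()), &app, SLOT(quit()));
        app.exec();

        if (reply->error() != QNetworkReply::NoError)
            qDebug() << "request failed:" << reply->errorString();
        else
            qDebug() << "got" << reply->readAll().size() << "bytes";

        reply->deleteLater();
        return 0;
    }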
The problem has been found: the Qt for Embedded build we have contains an error in its support for our implementation of uClibc. It has been patched by the (name withheld for security reasons) company that supplied us with the libraries for our embedded solution.
In Flex you can stream microphone audio to an FMS/Red5 server using NetStream.attachAudio, which requires a Microphone object. Is it possible to stream audio through the NetStream from somewhere other than a Microphone? For example, from a file/embedded resource?
The reason I'm asking is that I'd like to be able to run automated tests that don't require using an actual microphone.
Well, it looks like this isn't possible. My workaround is to use SoundFlower to route audio-file playback (invoked outside of Flash) into a virtual microphone, which Flash then streams to the media server. From Flash's point of view, it's just as if you were manually speaking into the mic.
Then save it on the server side; is that possible?
Flash applications can do this, if the user allows them to use the microphone.
You need to use some client-side technology that can communicate with the client hardware (the mic), like Flash or Java (Java: I'm not so sure about that). On the server side you need to implement something like a media/streaming server which can record/stream/save client streams.
I have done this task using Flash on the client side and Red5 on the server side.
I would use a Java applet. The advantage is that you don't have to use any special server-side software like Flash Media Server or Red5. You can process the recorded sound using a simple PHP script - the applet can send a WAV file to the script using the HTTP protocol (something that Flash cannot do, as Adobe wants you to purchase their Flash Media Server).