I am trying to run my Ubuntu machine as a VLC server, where I run the command below to stream a local video over HTTP:
vlc 1.avi :sout=#transcode{vcodec=theo,vb=800,acodec=vorb,ab=128,channels=2,samplerate=44100}:duplicate{dst=http{dst=:8080/test.ogg}} :sout-all :sout-keep
Below is the VLC client command to display the HTTP stream; it always stops after about 10 seconds, and subsequent attempts fail with "failed to find url":
vlc http://localhost:8080/test.ogg
Please suggest a workaround. Also, please let me know if I should switch to ffmpeg if this is a legacy problem, and if so, please suggest the command as well.
Note: I am using the latest VLC.
Thanks in advance!
This was a VLC version mismatch; once I used the same VLC version on both client and server, it worked perfectly.
I have an app controlling my AVR on a local network, and I'm trying to embed some of its functionality into another app I wrote myself. I started up Wireshark and controlled the volume, which shows up as:
GET /ctrl-int/1/setproperty?dmcp.device-volume=-15.750000 HTTP/1.1
I'm not totally up on this type of HTTP control, but I'd like to know whether this is enough data to send the same request via a browser or terminal, etc.
cheers
Without knowing the AVR you can't really tell, but you should be able to send the command via
http://<avr-ip>/ctrl-int/1/setproperty?dmcp.device-volume=-15.750000
in the browser or from your app. The IP should be in the Wireshark logs as well.
If that works, it was enough information.
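To drive this from code rather than the browser, the captured request can be replayed as a plain HTTP GET. A minimal Python sketch, assuming the AVR's address (192.168.1.50 here is a placeholder; use the one from the Wireshark capture) and that no extra headers are required; if the AVR rejects the request, compare the remaining headers in the capture:

```python
from urllib.request import urlopen

def build_volume_request(avr_ip, volume_db):
    # Rebuilds the exact path seen in the Wireshark capture;
    # dmcp.device-volume takes a float formatted to six decimals.
    return (f"http://{avr_ip}/ctrl-int/1/setproperty"
            f"?dmcp.device-volume={volume_db:.6f}")

url = build_volume_request("192.168.1.50", -15.75)  # placeholder IP
print(url)
# Sending it is just a GET, e.g.:
# urlopen(url, timeout=5)
```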
When I type wget http://yahoo.com:80 in a Unix shell, can someone explain what exactly happens, from entering the command to reaching the Yahoo server? Thank you very much in advance.
The RFCs provide you with all the details you need, and they are not tied to a tool or OS.
wget in your case uses HTTP, which is based on TCP, which in turn uses IP; at the bottom, most of the time you will encounter Ethernet frames.
In order to understand what happens, I urge you to install Wireshark and have a look at the dissected frames; you will get an overview of which data belongs to which network layer. That is the easiest way to visualize and learn what happens. Besides this, if you really like (irony) funny documents (/irony), have a look at the corresponding RFCs: RFC 2616 for HTTP, for example; for the others, have a look at the external links at the bottom of the Wikipedia articles.
1. The program uses DNS to resolve the host name to an IP address. The classic API call is gethostbyname, although newer programs should use getaddrinfo to be IPv6-compatible.
2. Since you specify the port, the program can skip looking up the default port for HTTP. But if you hadn't, it would call getservbyname to look up the default port (then again, wget may just hard-code port 80).
3. The program uses the network API to connect to the remote host. This is done with socket and connect.
4. The program writes an HTTP request to the connection with a call to write.
5. The program reads the HTTP response with one or more calls to read.
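The steps above map almost one-to-one onto the sockets API. A minimal Python sketch (not what wget literally does internally, but the same resolve/connect/write/read sequence, using getaddrinfo so it is IPv6-clean):

```python
import socket

def http_get(host, port=80, path="/"):
    """Mirror what wget does: resolve, connect, write a request, read the reply."""
    # 1. DNS: getaddrinfo is the IPv6-capable successor to gethostbyname.
    family, socktype, proto, _, sockaddr = socket.getaddrinfo(
        host, port, type=socket.SOCK_STREAM)[0]
    # 2. socket() + connect() establish the TCP connection.
    s = socket.socket(family, socktype, proto)
    s.connect(sockaddr)
    # 3. write() the HTTP request bytes.
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    s.sendall(request.encode("ascii"))
    # 4. read() until the server closes the connection.
    chunks = []
    while data := s.recv(4096):
        chunks.append(data)
    s.close()
    return b"".join(chunks)

# e.g. http_get("yahoo.com", 80) returns the raw status line, headers and body
```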
Suppose I want to transfer just a portion of a file over FTP - is it possible using a standard FTP protocol?
In HTTP I could use a Range header in the request to specify the byte range of the remote resource. If it's a 1 MB file, I could ask for the bytes from 600K to 700K.
Is there anything like that in FTP? I am reading the FTP RFC, don't see anything, but want to make sure I'm not missing anything.
There's a Restart command in FTP - would that work?
Addendum
After getting Brian Bondy's answer below, I wrote a read-only Stream class that wraps FTP. It supports Seek() and Read() operations on a resource that is read via FTP, based on the REST verb.
Find it at http://cheeso.members.winisp.net/srcview.aspx?dir=streams&file=FtpReadStream.cs
It's pretty slow to Seek(), because setting up the data socket takes a long time. Best results come when you wrap that stream in a BufferedStream.
Yes, you can use the REST command.
REST sets the point at which a subsequent file transfer should start. It is usually used for restarting interrupted transfers. The command must come right before a RETR or STOR, and thus after a PORT or PASV.
From FTP's RFC 959:
RESTART (REST) The argument field represents the server marker at which file transfer is to be restarted. This command does not cause file transfer but skips over the file to the specified data checkpoint. This command shall be immediately followed by the appropriate FTP service command which shall cause file transfer to resume.
Read more:
http://www.faqs.org/rfcs/rfc959.html#ixzz0jZp8azux
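For what it's worth, Python's ftplib exposes REST directly: retrbinary() takes a rest argument and issues "REST <offset>" right before the RETR. A sketch of an HTTP-Range-style fetch over FTP; note that FTP has no native "end of range", so you stop collecting and abort once you have enough bytes (host, credentials and path below are placeholders):

```python
from ftplib import FTP

def range_to_rest(start, end):
    # An HTTP byte range [start, end] (inclusive, like "Range: bytes=start-end")
    # becomes a REST offset plus a count of bytes to keep from the resumed RETR.
    return start, end - start + 1

def fetch_range(host, user, password, path, start, end):
    """Fetch bytes start..end of a remote file; all arguments are placeholders."""
    offset, length = range_to_rest(start, end)
    got = bytearray()

    def collect(chunk):
        got.extend(chunk)
        if len(got) >= length:
            # We have enough; bail out of the transfer.
            raise EOFError

    ftp = FTP(host)
    ftp.login(user, password)
    try:
        # rest=offset makes ftplib send "REST <offset>" just before the RETR.
        ftp.retrbinary(f"RETR {path}", collect, rest=offset)
    except EOFError:
        ftp.abort()  # tell the server to stop sending the remainder
    finally:
        ftp.close()
    return bytes(got[:length])
```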
You should check out how GridFTP does parallel transfers. It uses the sort of techniques you want (and it might actually be better to borrow that code rather than implement it from scratch yourself).
We are currently working on a Flex application that needs to connect to a set of traffic-detection cameras via RTSP. Being totally new to the world of video streaming in general, I was wondering whether that is possible.
AFAIK it is not possible to consume an RTSP feed in the Flash Player, so I'm thinking that we would need some sort of converter on the server that takes the RTSP stream and converts it to RTMP so we can consume the feed in our Flex app. We were hoping that Red5 could help us do that.
Am I correct in my assumption and has anyone done this?
Wowza Media seems to support RTSP-to-RTMP conversion: http://www.wowzamedia.com/comparison.html
There is also Xuggle (http://www.xuggle.com/), a general video-stream transcoder based on Red5 and FFmpeg.
You could try restreaming it via Red5 and connecting your Flex app to the Red5 server.
Read more at: http://red5wiki.com/wiki/SteamStream
Based on this work, I tried to convert an H.264 signal to an SWF stream that can easily be displayed in Flash. Here is the recipe. (This recipe is for Linux.)
Download Live555 streaming media, from http://www.live555.com/liveMedia/
The src file you have is usually named live555-latest.tar.gz
Unpack and compile:
Unpack: tar xzvf live555-latest.tar.gz. This will create a directory named live.
cd live
./genMakefiles linux (if you have a 32-bit system) or ./genMakefiles linux-64bit (if your system is 64-bit)
make, and after a while you'll have brand new compiled code
Live555 has a lot of good stuff, but we are only interested in the "testProgs"
directory, where openRTSP resides. openRTSP will let us receive a signal and send it
to ffmpeg, a program which feeds ffserver. ffserver is a server that receives
the signal from ffmpeg and converts it to SWF (and other formats).
Download, unpack, configure and install ffmpeg
Download ffmpeg from http://www.ffmpeg.org/. The version I tested is 0.6.1: http://www.ffmpeg.org/releases/ffmpeg-0.6.1.tar.gz
Unpack: tar xzvf ffmpeg-0.6.1.tar.gz. This will create a directory named ffmpeg-0.6.1.
cd ffmpeg-0.6.1
All the funny video-streaming things are packaged in VideoLAN, so you'd better
install VideoLAN right now. Go to http://www.videolan.org/ and see how easy it is to
install. You may be surprised that the package dependencies contain ffmpeg libraries.
After installing VideoLAN, do ./configure and then make.
After 3 or 4 hours you will have ffmpeg and ffserver compiled and working.
Now we are almost ready to stream the whole world. First of all, let's
get openRTSP working.
Go to your "live" directory (remember 3.2) and do: cd testProgs
Try this: ./openRTSP -v -c -t rtsp://<hostname>:<port>/<cam_path>
First of all, you'll see logs that say something like:
- opening connection blah blah.
- sending DESCRIBE blah blah.
- receiving streamed data.
If all goes well, your console will start printing a lot of strange characters very quickly.
These characters are bytes of video, but you can't see the video (yet). If you don't see your screen
printing characters, there is something wrong with your configuration. Check the steps up
to now.
We got the signal! Now let's send it to a useful component: ffmpeg, which is bound to
ffserver. We need to create a configuration file for ffserver.
Use your favorite editor to create this text file:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
CustomLog -
NoDaemon
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
ACL allow 127.0.0.1
</Feed>
<Stream testFlash.swf>
Feed feed1.ffm
Format swf
VideoFrameRate 25
VideoSize 352x288
VideoIntraOnly
NoAudio
</Stream>
<Stream stat.html>
Format status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
Name the file, for example, ffserver.conf. Save it anywhere, for example in the same directory as ffserver.
So, ffserver will be bound to port 8090, for both input and output. The <Feed> tag configures the
input stream; the name of the configured feed in this case is feed1.ffm. Remember it for step 14.
<Stream> contains the configuration for the output stream. In this case the name will be testFlash.swf (remember it too), and the format will be SWF. The video frame rate will be 25 fps, the size 352x288, and it won't contain audio. The last stream is an HTML file (stat.html) that shows you the status of the server.
Start ffserver: ./ffserver -f ffserver.conf (or wherever you have left the config file). The -f parameter indicates
that the configuration is loaded from a custom file.
Open a browser and go to http://localhost:8090/stat.html. A status page of the server will show up, and we'll see a line of information about our testFlash.swf stream. It seems very quiet now, so let's feed this stream with the output of openRTSP (from step 7).
Do this:
<path to openRTSP>/openRTSP -v -c -t rtsp://<hostname>:<port>/<cam_path> | <path to ffmeg>/ffmpeg -i - http://localhost:8090/feed1.ffm
The first path (before the "|") is the same as in step 9. "|" is the pipe symbol, which connects the output of
openRTSP (the sequence of video signal, a.k.a. strange chars) to the input of ffmpeg. "-i -" means that
the input of ffmpeg is taken from the pipe, and http://localhost:8090/feed1.ffm is the destination (output)
of ffmpeg, which is basically the input of ffserver.
So with this command we have connected openRTSP -> ffmpeg -> ffserver.
When you enter this command, a lot of information will be shown. It is important to note that both the input params
and the output params are shown, and these params NEED to be compatible. In my case, this is shown:
Input #0, h264, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0.0: Video: h264, yuv420p, 352x288, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
Metadata:
encoder: Lavf52.64.2
Stream #0.0: Video: FLV, yuv420p, 352x288, q=2-31, 200 kb/s, 1000k tbn, 25 tbc
Stream mapping:
Stream #0.0 -> #0.0
And then the stream begins to play. You will see the numbers in the last line CONSTANTLY changing,
telling you the live frame rate at each moment. Something like:
frame= 395 fps= 37 q=31.7 Lsize= 1404kB time=15.80 bitrate= 727.9kbits/s
If you don't see this line of metrics, then there is something wrong with your output configuration. Go back and change the parameters of testFlash.swf.
Everything is done. You can watch the video at http://localhost:8090/testFlash.swf. You can use this URL to embed a Flash movie or, as in my case, a Flex application.
Red5 is fine, especially now with Xuggle support, which allows off-the-shelf integration of FFmpeg and provides great integration with Red5 (e.g., a great tutorial on doing live stream conversion).
If you're familiar with programming in Flex, or whatever it takes to bring it into a SWF, you could try to implement RTSP over TCP; AFAIK UDP isn't available in Flash.
I just tested this with Wowza.
Input was RTSP stream from Etrovision H264 DVS.
Take a look at this thread and use the Application.xml file from there if you want to try it:
http://96.30.11.104/forums/showthread.php?p=24806
The video plays in the Flash Player, but the price is a 5-second delay for a single stream, with all equipment on the office LAN and the server running on a Core2Duo/2.8GHz/3GB RAM.
Not sure if it can go faster or whether that's the expected transcoding cost for this setup...
While its public/open-source version is a bit dated now, you can have a look at this project, which did RTSP- and image-based camera transcoding into RTMP streams with Xuggler and Red5.
http://sourceforge.net/projects/imux/
(disclaimer: I worked for the company that created the original source)