Embed video from USB webcam into web page using ffserver and ffmpeg - http

I need to stream images from a USB webcam to a webpage on my embedded system. The operating system used is Linux.
I successfully installed ffserver and ffmpeg, and also mplayer.
This is my /etc/ffserver.conf (it's not definitive, I am just testing it):
# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090
# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0
# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2
# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 2
# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000
# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -
# Suppress that if you want to launch ffserver as a daemon.
NoDaemon
<Feed feed1.ffm>
File /tmp/feed1.ffm # when commented out, no file is being created and the stream keeps working!!
FileMaxSize 200K
# Only allow connections from localhost to the feed.
ACL allow 127.0.0.1
</Feed>
<Stream test.swf>
# the output stream format - SWF = flash
Format swf
# this must match the ffmpeg -r argument
VideoFrameRate 5
# another quality tweak
VideoBitRate 320
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 3
VideoSize 640x480
# webcams don't have audio
NoAudio
</Stream>
# FLV output - good for streaming
<Stream test.flv>
# the source feed
Feed feed1.ffm
# the output stream format - FLV = FLash Video
Format flv
VideoCodec flv
# this must match the ffmpeg -r argument
VideoFrameRate 5
# another quality tweak
VideoBitRate 320
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 3
VideoSize 640x480
# webcams don't have audio
NoAudio
</Stream>
<Stream stat.html>
Format status
</Stream>
<Redirect index.html>
# credits!
URL http://ffmpeg.sourceforge.net/
</Redirect>
From the shell I can execute:
# ffserver -f /etc/ffserver.conf
and
# ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0 http://127.0.0.1:8090/test.flv
No errors are reported during execution. That sounds good, but maybe it's not OK at all.
Then, in the webpage, I wrote this simple code:
<video controls>
<source src="http://127.0.0.1:8090/test.flv">
</video>
I read in another thread here on Stack Overflow (I lost the link) that this code should be enough, but it's not working for me.
I can see that the file /tmp/feed1.ffm has been created, though, so I think I can use this stream to show the camera images on my webpage. Am I right?
What is the simplest solution?
Thank you.
EDIT
I allowed the connections in ffserver's configuration file:
<Feed feed1.ffm>
File /tmp/feed1.ffm # when commented out, no file is being created and the stream keeps working!!
FileMaxSize 200K
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.2.2 192.168.2.10
</Feed>
But it still does not work.

ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0 http://127.0.0.1:8090/test.flv
As described in the documentation, you should stream to the feed1.ffm file, not to the test.flv file. The ffmpeg -> ffserver communication uses the .ffm file, and the ffserver -> web browser communication uses the .flv file.
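Concretely, the capture command would point at the feed URL instead, keeping the original device, size and frame-rate options (a sketch based on the config above):
ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0 http://127.0.0.1:8090/feed1.ffm
The browser (or player) then requests http://<server-ip>:8090/test.flv, which ffserver produces from that feed.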

I think HTML doesn't like pseudo-files like pipes or .ffm :)
Maybe you could use the <embed> tag from HTML5.
<embed type="video/flv" src="http://127.0.0.1:8090/test.flv" width="320" height="240">
Or however you want to set the size.
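Either way, before touching the HTML it is worth confirming that ffserver is actually serving something; the stat.html status stream defined in the config above gives a quick check (my suggestion, not part of the original answer):
curl http://127.0.0.1:8090/stat.html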

Related

capture multiline events with rsyslog and storing them to file

We have a centralized rsyslog infrastructure capturing events sent over TCP by devices around the world, using the imtcp module.
The idea is to read from syslog (TCP) and store the events to disk, one line per event. The events are later processed by other consumers.
As far as we can see, some events are split into multiple events once they are stored on disk, breaking the rest of our processing.
Capturing a single packet with tcpdump, we confirmed that the source syslog is sending us the whole event containing multiple lines (typical Java exceptions).
[root@xx xx.xx.xx.xx]# tcpdump -i bond0 tcp port 50520 -A -c 1
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on bond0, link-type EN10MB (Ethernet), capture size 262144 bytes
12:12:26.062110 IP xx.xx.xx.xx.com.41444 > xx.xx.xx.com.50520: Flags [P.], seq 3270590174:3270590613, ack 2646946316, win 27, options [nop,nop,TS val 3937801207 ecr 2623497312], length 439
E....`#.<.ML..A....N...X..>...2......q.....
....._d`<13> xxx #2.0.#2021 02 10 12:19:50:898#+00#Info#com.xx.xx.xx.xx.xx#
##JavaEE/xx#xx#xx#JavaEE/xx#com.xx.xx.xx.xx.APIServiceHandler#xx#xx##xx#xx##0#Thread[HTTP Worker [#xx],5,Dedicated_Application_Thread]#Plain##
Is the user getting thru SSO? xx:true#
1 packet captured
44 packets received by filter
2 packets dropped by kernel
As this is a global system, we cannot request the device owners to modify the format, all the actions should take place on our side.
This is our rsyslog.conf file
$MaxMessageSize 128k
# Global configuration/modules
module(load="imtcp" MaxListeners="100")
module(load="imfile" mode="inotify")
module(load="impstats" interval="10" resetCounters="on" format="cee" ruleset="monitoring")
module(load="mmjsonparse")
module(load="mmsequence")
module(load="omelasticsearch")
module(load="omudpspoof")
# Include all conf files
$IncludeConfig /etc/rsyslog.d/*.conf
And this is our template that reads from tcp and writes to file (etc/rsyslog.d/template.conf)
template(name="outjsonfmt_device" type="list") {
constant(value="{")
property(outname="device_ip" name="fromhost-ip" format="jsonf")
constant(value=",")
property(outname="time_collect" name="timegenerated" dateFormat="rfc3339" format="jsonf")
constant(value=",")
constant(value="\"device_type\":\"device\"")
constant(value=",")
property(outname="collector_id" name="$myhostname" format="jsonf")
constant(value=",")
property(outname="msg" name="rawmsg-after-pri" format="jsonf" )
constant(value="}\n")
}
template(name="device-out-filename" type="string" string="/data1/input/device/%fromhost-ip%/device_%$now-utc%_%$hour-utc%.log")
ruleset(name="writeRemoteDataToFile_device") {
action(type="omfile" dynaFileCacheSize="10000" dirCreateMode="0700" FileCreateMode="0644" dirOwner="user" dirGroup="logstash" fileOwner="user" fileGroup="user" dynafile="device-out-filename" template="outjsonfmt_device")
}
input(type="imtcp" port="50520" ruleset="writeRemoteDataToFile_device")
How can we configure rsyslog to escape line breaks in the middle of an event, prior to writing the event to disk? We already tried $EscapeControlCharactersOnReceive and other similar parameters, with no success.
The imtcp module has a parameter DisableLFDelimiter which you could try setting to "on" to ignore line-feed delimiters, assuming your input has an octet-count header. The documentation says: "This mode is non-standard and will probably come with a lot of problems."
module(load="imtcp" MaxListeners="100" DisableLFDelimiter="on")
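As a quick sanity check (my own suggestion, not from the rsyslog documentation), you can look at the start of a captured frame to see whether the sender actually prefixes each message with an octet count, which this mode relies on:
# octet-counted framing starts each frame with "<length> <PRI>...", e.g. "439 <13>..."
sudo tcpdump -i bond0 tcp port 50520 -A -c 1 | grep -E '[0-9]+ <[0-9]+>'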

How to send RTP stream to Janus from NGINX RTMP module?

This is my first post here, even though this platform has already helped me a lot.
So, I'm trying to create a stream and display it in a browser. I have already configured NGINX with the rtmp module and my stream works very well with HLS (between 5 and 10 seconds of latency).
Now I would like to set up a low-latency stream, which is why I have installed the janus-gateway WebRTC server, which takes an RTP stream as input and provides a WebRTC stream as output.
Here's the schema I'd like to follow:
OBS -> RTMP -> Nginx-rtmp-module -> ffmpeg -> RTP -> Janus -> webRTC -> Browser
But I have a problem with this part: "nginx-rtmp-module -> ffmpeg -> janus"
In fact, my Janus server is running and the streaming demos work very well on localhost, but when I try to provide an RTP stream, Janus doesn't detect it in the demos (it shows "No remote video available").
Can anyone help me, please?
Resources:
My janus.plugin.streaming.jcfg configuration:
rtp-sample: {
type = "rtp"
id = 1
description = "Opus/VP8 live stream coming from external source"
metadata = "You can use this metadata section to put any info you want!"
audio = true
video = true
audioport = 5002
audiopt = 111
audiortpmap = "opus/48000/2"
videoport = 5004
videopt = 100
videortpmap = "VP8/90000"
secret = "adminpwd"
}
My nginx.conf application :
application test {
deny play all;
live on;
on_publish http://localhost/test/backend/sec/live_auth.php;
exec ffmpeg -i rtmp://localhost/test/$name -an -c:v copy -flags global_header -bsf dump_extra -f rtp rtp://localhost:5004;
}
If you need anything more to help me, don't hesitate! Thank you in advance, and sorry for my bad English :)
I finally solved this problem with the following command:
sudo ffmpeg -y -i "rtmp://127.0.0.1/app/stream" -c:v libx264 -profile:v main -s 1920x1080 -an -preset ultrafast -tune zerolatency -g 50 -f rtp rtp://127.0.0.1:5004
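Before digging into Janus itself, it can also help to confirm that RTP packets are actually reaching the mountpoint's video port (a generic check of mine, not from the original answer); since ffmpeg and Janus run on the same host here, watch the loopback interface:
# should show UDP packets on the videoport configured in janus.plugin.streaming.jcfg
sudo tcpdump -i lo udp port 5004 -c 5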
Unfortunately, when I use -c:v copy, it doesn't work. It only works when encoding with libx264, which adds latency: I got between 3 and 4 seconds.
However, when I installed Janus, my goal was to do better than HLS, a protocol with which I reach 2.5 seconds of latency.
So Janus did not meet my need. Moreover, I was warned that it is not a streaming server. After some research I came across the Oven Media Engine project on GitHub, a streaming server that offers a latency of less than 1 s. The documentation on the dedicated site is complete, and a player (Oven Media Player) adapted to this server is available under the MIT license. The server is under the GPLv2 license.
Here is the current schema of my architecture:
OBS -> Nginx (used to authorize streaming with on_publish, because OME doesn't support that yet; the stream is then pushed to the OME server) -> OME -> Transcoding in different bitrates and resolutions (optional) -> OME -> Edge OME (optional) -> player.
If you have any questions, don't hesitate; the support is very friendly!
Hope it helps

Raspberry Pi : use VLC to stream webcam : Logitech C920 [H264 Video without transcoding + Audio + LED control] - SpyCam / BabyCam

I have a RaspberryPi and a Logitech C920 Webcam.
I want to use these devices as a surveillance / baby cam, i.e. stream audio + video over HTTP (or any other protocol) without CPU-intensive video transcoding.
The C920 webcam is able to stream H264 natively, so theoretically I won't need to ask the RaspberryPi+VLC to transcode the video stream.
The built-in C920 Microphone stream does not seem to be included in the webcam stream.
Cam and microphone are 2 separate devices.
The C920 also has a built-in LED indicator. I want to control that, to prevent the LED from lighting up while recording.
How can I achieve that?
This solution is tested and working with versions indicated below.
Using this method, the RaspberryPi3 is always around 5% CPU.
edit 2018-11-18:
One can also see the all-in-one solution prototype on RaspiVWS project homepage (for curious people, see GitHub project)
0. Preliminary checks
1. Webcam video configuration
2. Microphone identification
3. Stream using VLC
4. Make RaspberryPi3+ a Wifi access point
(If you have no existing network to connect your Pi to)
5. Script at startup or as a service
6. [EDIT] Additional commands : infinite loop recording & split video
7. [EDIT] Program execution at a given instant
8. [EDIT] TROUBLESHOOTING
0. Preliminary checks
This answer works with Raspbian 9.4 Stretch.
Check your version with the following command:
lsb_release -a
You should see:
No LSB modules are available.
Distributor ID: Raspbian
Description: Raspbian GNU/Linux 9.4 (stretch)
Release: 9.4
Codename: stretch
We can rely on the following tools:
v4l, which lets us control the webcam. It offers the command v4l2-ctl, which we will use to control and configure the webcam.
VLC, which is not only a video player but also has powerful streaming capabilities.
You can install them with the following commands:
sudo apt-get install vlc
sudo apt-get install v4l-utils
Once everything is installed, you can configure your C920 webcam.
1. Webcam video configuration
v4l2-ctl --all lists all available devices and their config
pi@raspberrypi:~ $ v4l2-ctl --all
Driver Info (not using libv4l2):
Driver name : uvcvideo
Card type : HD Pro Webcam C920
Bus info : usb-3f980000.usb-1.5
Driver version: 4.14.30
Capabilities : 0x84200001
Video Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
Width/Height : 1920/1080
Pixel Format : 'H264'
Field : None
Bytes per Line : 3840
Size Image : 4147200
Colorspace : sRGB
Transfer Function : Default
YCbCr/HSV Encoding: Default
Quantization : Default
Flags :
Crop Capability Video Capture:
Bounds : Left 0, Top 0, Width 1920, Height 1080
Default : Left 0, Top 0, Width 1920, Height 1080
Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1920, Height 1080
Selection: crop_bounds, Left 0, Top 0, Width 1920, Height 1080
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 30.000 (30/1)
Read buffers : 0
brightness (int) : min=0 max=255 step=1 default=-8193 value=128
contrast (int) : min=0 max=255 step=1 default=57343 value=128
saturation (int) : min=0 max=255 step=1 default=57343 value=128
white_balance_temperature_auto (bool) : default=1 value=1
gain (int) : min=0 max=255 step=1 default=57343 value=255
power_line_frequency (menu) : min=0 max=2 default=2 value=2
white_balance_temperature (int) : min=2000 max=6500 step=1 default=57343 value=4822 flags=inactive
sharpness (int) : min=0 max=255 step=1 default=57343 value=128
backlight_compensation (int) : min=0 max=1 step=1 default=57343 value=0
exposure_auto (menu) : min=0 max=3 default=0 value=3
exposure_absolute (int) : min=3 max=2047 step=1 default=250 value=333 flags=inactive
exposure_auto_priority (bool) : default=0 value=1
pan_absolute (int) : min=-36000 max=36000 step=3600 default=0 value=0
tilt_absolute (int) : min=-36000 max=36000 step=3600 default=0 value=0
focus_absolute (int) : min=0 max=250 step=5 default=8189 value=0 flags=inactive
focus_auto (bool) : default=1 value=1
zoom_absolute (int) : min=100 max=500 step=1 default=57343 value=100
led1_mode (menu) : min=0 max=3 default=3 value=3
led1_frequency (int) : min=0 max=255 step=1 default=0 value=0
The last two lines give us what we need to control the built-in LED indicator, for instance to deactivate it.
The -d0 parameter indicates which device the modification should be applied to (useful if you have several cams or the device name has changed):
v4l2-ctl -d0 --set-ctrl=led1_mode=0
v4l2-ctl -d0 --set-ctrl=led1_frequency=30
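If you want to verify the change took effect, or restore the default later, v4l2-ctl can also read a control back (same tool, just the query form):
v4l2-ctl -d0 --get-ctrl=led1_mode
# 3 is the default value reported in the listing above; restore it with:
v4l2-ctl -d0 --set-ctrl=led1_mode=3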
2. Microphone identification
The command arecord -l will give us the list of ALSA devices (ALSA is the audio manager on the Raspberry Pi).
pi@raspberrypi:~ $ arecord -l
**** List of CAPTURE Hardware Devices ****
card 1: C920 [HD Pro Webcam C920], device 0: USB Audio [USB Audio]
Subdevices: 1/1
Subdevice #0: subdevice #0
This means that the built-in microphone is located on card 1, device 0 (hw:1,0). You can check that on the command line with alsamixer -c 1 -V capture
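A quick way to confirm the microphone actually captures sound (my own check, assuming the standard ALSA utilities are installed) is to record a few seconds from card 1 and play them back:
# plughw lets ALSA convert the rate/format if the C920 doesn't support them natively
arecord -D plughw:1,0 -f cd -d 5 /tmp/mic_test.wav
aplay /tmp/mic_test.wav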
3. Stream using VLC
VLC can be launched from the command line.
Since we do not have video and audio already mixed together in a single stream, we need to ask VLC to do that.
That is the role of VLC's transcoding feature.
Stream over HTTP
We also want to stream over HTTP; VLC can achieve that too.
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
Explanation
v4l2:///dev/video0:chroma=h264 gives VLC its input: it grabs the video stream from /dev/video0 and declares that it is H264-encoded (if your webcam is not the 0th video device, the number may differ; refer to the v4l2-ctl --all command)
:input-slave=alsa://hw:1,0 tells VLC to take another input stream along with the video. It is the audio stream identified with arecord above
--sout tells VLC how to handle the output stream
#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1} tells VLC to convert the audio to the mpga codec, 128 kbit/s, 2 channels, 44100 Hz sampling, using all 4 RaspberryPi3+ cores. audio-sync is optional. It took me some time to realize this: the webcam's H264 video stream is kept as provided (no video transcoding).
:standard{access=http,mux=ts,mime=video/ts,dst=:8099} tells VLC to provide stream over HTTP on port 8099 with the TS muxing format.
On any other device, you can use VLC to access your RaspberryPi3+ VLC stream :
vlc http://<raspberrypi-ip>:8099
It works with any VLC client :
windows
unix
mac
confirmed with iPhone 7 (v11.2.1 (15C153)) with VLC app (3.0.3 (305))
NB: Having the video already coming out of the webcam as H264 1920x1080 30fps saves a lot of RaspberryPi3+ CPU.
Different containers
You can also record to various containers, or even containers + stream, here are some examples:
record to MKV
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=file,mux=mkv,dst='/home/pi/Webcam_Record/MyVid.mkv'}'
record to MP4
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/MyVid.mp4'}'
record + stream
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:duplicate{dst=standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/MyVid.mp4'},dst=standard{access=http,mux=ts,mime=video/ts,dst=:8099}}'
Format filenames, timestamps
You can also use formatted strings for filenames. Prefix the command like this:
cvlc --sout-file-format v4l2:///dev/video0:<...> dst='/home/pi/Webcam_Record/%F_%T_MyVid.mp4'}
It will produce a file named YYYY-MM-DD_HH:MM:SS_MyVid.mp4 (colons are allowed in Unix filenames, but not in Windows filenames)
4. Make RaspberryPi3+ a Wifi access point
If you have no existing network to connect your Pi to:
You can follow the instructions from the official Raspberry Pi website: https://www.raspberrypi.org/documentation/configuration/wireless/access-point.md
Otherwise, if you already have a network, you can connect to your Pi using its IP.
See part 3
On any other device, you can use VLC to access your RaspberryPi3+ VLC stream: vlc http://<raspberrypi-ip>:8099
5. Script at startup or as a service
You can put many commands into a bash file my_bash_file.sh.
For instance :
#!/bin/bash
# auto stream launch + led off
#cvlc -vvv for verbose debug
# change this value to adapt to your webcam device number
deviceNb=0
# force video format + led off
v4l2-ctl -d${deviceNb} --set-fmt-video=width=1920,height=1080,pixelformat=1 --set-ctrl=led1_mode=0
# if delay needed
# cvlc v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 :live-caching=2500 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
# no delay
cvlc v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
Basic method
You can then make the rc.local script call your custom script so that it is executed at startup.
You can follow the instructions from the official Raspberry Pi website: https://www.raspberrypi.org/documentation/linux/usage/rc-local.md
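With that method, the custom script is simply called from /etc/rc.local before the final exit 0, backgrounded so that boot is not blocked (a minimal sketch, assuming the script lives in /home/pi/Webcam_Record/ like the service script below):
# /etc/rc.local (excerpt)
/home/pi/Webcam_Record/my_bash_file.sh &
exit 0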
Another method: create a daemon service
We will create a "webcam-stream" service, assuming all necessary bash commands are located in /home/pi/Webcam_Record/vlc_webcam_stream_service.sh
cd /lib/systemd/system/
sudo nano webcam-stream.service
And write in it:
[Unit]
Description=Custom Webcam Streaming Service
After=multi-user.target
[Service]
Type=simple
ExecStart=/home/pi/Webcam_Record/vlc_webcam_stream_service.sh
Restart=on-abort
[Install]
WantedBy=multi-user.target
Make the service file and the script executable:
sudo chmod 644 /lib/systemd/system/webcam-stream.service
chmod +x /home/pi/Webcam_Record/vlc_webcam_stream_service.sh
Allow VLC to be executed as root:
sudo sed -i 's/geteuid/getppid/' /usr/bin/vlc
Reload the daemons and enable our service:
sudo systemctl daemon-reload
sudo systemctl enable webcam-stream.service
Check it is recognized and working:
sudo service webcam-stream status
sudo service webcam-stream start
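While debugging, the service's console output (including any VLC errors) can be followed with the standard systemd journal tooling; nothing here is specific to this setup:
journalctl -u webcam-stream.service -f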
You can check with another computer that the video is correctly streamed.
Note that the webcam won't be available while the service is running.
Once you're done, you can connect to the RaspberryPi3+ wifi access point and access your video stream.
6. [EDIT] Additional commands : infinite loop recording & split video
The following bash script allows infinite recording of 15 s long videos with timestamped filenames, plus streaming:
#!/bin/bash
# auto stream launch + led off
#cvlc -vvv for verbose debug
# adapt to video device name
deviceNb=1
# loop duration
duration=15
#infinite recording
#loopOption=
loopOption=--loop
# force video format + led off
v4l2-ctl -d ${deviceNb} --set-fmt-video=width=1920,height=1080,pixelformat=1 --set-ctrl=led1_mode=0
# if delay needed :live-caching=2500
cvlc --sout-file-format --run-time=${duration} ${loopOption} v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:duplicate{dst=standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/%F_%T_Spy.mp4'},dst=standard{access=http,mux=ts,mime=video/ts,dst=:8099}}'
7. [EDIT] Program execution at a given instant
EDIT 04 aug 2018
To launch the execution today at 14:00, you can use the following command:
echo "./my_vlc_webcam_script.sh" | at 1400
See the at command manual for further details.
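Standard at tooling also lets you list and cancel pending jobs, which is handy if you change your mind:
atq            # list scheduled jobs with their job numbers
atrm <job-id>  # remove a job by its number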
8. TROUBLESHOOTING
EDIT 07 jul 2018
I recently ran into a VLC error after a dist-upgrade:
VLC media player 2.2.6 Umbrella (revision 2.2.6-0-g1aae78981c)
[00acb230] pulse audio output error: PulseAudio server connection failure: Connection refused
The solution I found is to launch VLC in GUI mode and change the default audio device to ALSA (instead of Automatic). It can also be done from the command line.
See the solution found here: VLC issues with PulseAudio
cvlc -A alsa,none --alsa-audio-device default
You need the vcodec= for video to work and deinterlace if you want that.
cvlc v4l2:///dev/video0:chroma=h264
:input-slave=alsa://hw:1,0
:live-caching=2500
--sout '#transcode{
deinterlace,
vcodec=mpgv,
acodec=mpga,
ab=128,
channels=2,
samplerate=44100,
threads=4,
audio-sync=1}
:standard{
access=http,
mux=ts,
mime=video/ts,
dst=0.0.0.0:8099}'

IIS: Download large file with wget - connection always fails with Bit Rate Throttling

I have a problem getting IIS to allow a file download with Bit Rate Throttling enabled. I set the limit to 100 kB/s. There is no problem without the bitrate limitation, but with the limit there is.
I'm using a code similar to described in this article:
Securing Large Downloads Using C# and IIS 7
I also tried switching off IIS Bit Rate Throttling and controlling the bitrate "by hand", calculating the bitrate with TimeSpan and using Thread.Sleep(10) in a while loop...
But all my attempts were useless; I don't get any exceptions.
To test the download I use wget, this way:
wget -t 1 http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
(you can try it with wget for Windows)
this is a 240 MB text file; wget always stops at a random position of the download, 5%-60%, and throws this error message:
Read error at byte ... (Connection reset by peer).
Maybe the problem is not with IIS, because on my localhost it works well, just not online on the highly loaded server.
Solved with these parameters specified in the wget command:
wget -t 1 --header="Keep-Alive: 30000" -nv http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
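If the connection still drops occasionally, wget's resume and retry options can also help paper over it (standard wget flags, not part of the original fix):
# -c resumes a partial download, -t 0 retries indefinitely
wget -c -t 0 --header="Keep-Alive: 30000" -nv http://db.realestate.ru/yrl/RealEstateExportToYandex.xml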

How can I test an outbound connection to an IP address as well as a specific port?

OK, we all know how to use PING to test connectivity to an IP address. What I need to do is something similar, but test whether my outbound request to a given IP address as well as a specific port (in the present case 1775) is successful. The test should preferably be performed from the command prompt.
Here is a small site I made that allows testing any outgoing port. The server listens on all available TCP ports.
http://portquiz.net
telnet portquiz.net XXXX
If there is a server running on the target IP/port, you could use Telnet. Any response other than "can't connect" would indicate that you were able to connect.
To automate the awesome portquiz.net service, I wrote a bash script:
NB_CONNECTION=10
PORT_START=1
PORT_END=1000
for (( i=$PORT_START; i<=$PORT_END; i=i+NB_CONNECTION ))
do
iEnd=$((i + NB_CONNECTION))
for (( j=$i; j<$iEnd; j++ ))
do
#(curl --connect-timeout 1 "portquiz.net:$j" &> /dev/null && echo "> $j") &
(nc -w 1 -z portquiz.net "$j" &> /dev/null && echo "> $j") &
done
wait
done
If you're testing TCP/IP, a cheap way to test remote addr/port is to telnet to it and see if it connects. For protocols like HTTP (port 80), you can even type HTTP commands and get HTTP responses.
e.g. (command, IP address, port):
telnet 192.168.1.1 80
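Once the session to port 80 is open, you can type a minimal request followed by a blank line and watch the raw response come back (a transcript sketch, not from the original answer):
GET / HTTP/1.1
Host: 192.168.1.1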
The fastest / most efficient way I found to do this is with nmap and portquiz.net, described here: http://thomasmullaly.com/2013/04/13/outgoing-port-tester/ This scans the top 1000 most used ports:
# nmap -Pn --top-ports 1000 portquiz.net
Starting Nmap 6.40 ( http://nmap.org ) at 2017-08-02 22:28 CDT
Nmap scan report for portquiz.net (178.33.250.62)
Host is up (0.072s latency).
rDNS record for 178.33.250.62: electron.positon.org
Not shown: 996 closed ports
PORT STATE SERVICE
53/tcp open domain
80/tcp open http
443/tcp open https
8080/tcp open http-proxy
Nmap done: 1 IP address (1 host up) scanned in 4.78 seconds
To scan them all (took 6 sec instead of 5):
# nmap -Pn -p1-65535 portquiz.net
The bash script example from @benjarobin for testing a sequence of ports did not work for me, so I created this minimal not-really-one-line (command-line) example, which writes the output of the open ports from the sequence 1-65535 (all applicable communication ports) to a local file and suppresses all other output:
for p in $(seq 1 65535); do curl -s --connect-timeout 1 portquiz.net:$p >> ports.txt; done
Unfortunately, this takes 18.2 hours to run, because the minimum connection timeout my older version of curl allows is 1 second (integer values only). If you have a curl version >= 7.32.0 (type "curl -V"), you might try smaller decimal values, depending on how fast you can connect to the service. Or try a smaller port range to minimise the duration.
Furthermore, it will append to the output file ports.txt, so if you run it multiple times you might want to remove the file first.
