I have a problem getting IIS to allow a file download with Bit Rate Throttling enabled. I set the limit for the file to 100 KB/s. Without the bitrate limitation there is no problem, but with the limit the download fails.
I'm using code similar to what is described in this article:
Securing Large Downloads Using C# and IIS 7
I also tried switching off IIS Bit Rate Throttling and controlling the bitrate "by hand", calculating the rate with TimeSpan and calling Thread.Sleep(10) in a while loop.
But all my attempts were useless, and I don't get any exceptions.
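For reference, a minimal sketch of the kind of hand-rolled throttling loop I mean, assuming an ASP.NET IHttpHandler; the handler name, file path, buffer size, and 100 KB/s slice maths are illustrative, not my exact code:

using System;
using System.IO;
using System.Threading;
using System.Web;

public class ThrottledDownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        const int bytesPerSecond = 100 * 1024;          // target rate: 100 KB/s
        byte[] buffer = new byte[bytesPerSecond / 10];  // send in ~100 ms slices
        using (var stream = File.OpenRead(context.Server.MapPath("~/files/export.xml")))
        {
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                DateTime sliceStart = DateTime.UtcNow;
                context.Response.OutputStream.Write(buffer, 0, read);
                context.Response.Flush();
                // If this slice finished early, sleep off the remainder
                // so the average rate stays near the target.
                TimeSpan elapsed = DateTime.UtcNow - sliceStart;
                if (elapsed < TimeSpan.FromMilliseconds(100))
                    Thread.Sleep(TimeSpan.FromMilliseconds(100) - elapsed);
            }
        }
    }

    public bool IsReusable { get { return false; } }
}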
To test the download I use wget, this way:
wget -t 1 http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
(you can try it with wget for Windows)
This is a 240 MB text file; wget always stops at a random position of the download, anywhere from 5% to 60%, and throws this error message:
Read error at byte ... (Connection reset by peer).
Maybe the problem is not with IIS, because on my localhost it works well; it just doesn't work online on the highly loaded server.
Solved with these parameters specified in the wget command:
wget -t 1 --header="Keep-Alive: 30000" -nv http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
Related
My institute recently installed a new proxy server for our network. I am trying to configure my Cygwin environment to be able to run wget and download data from a remote repository.
Browsing the internet I have found two different solutions to my problem, but neither of them seems to work in my case.
The first one I tried was to follow these instructions, so in Cygwin:
cd /cygdrive/c/cygwin64/etc/
nano wgetrc
At the end of the file, I added:
use_proxy = on
http_proxy=http://username:password@my.proxy.ip:my.port/
https_proxy=https://username:password@my.proxy.ip:my.port/
ftp_proxy=http://username:password@my.proxy.ip:my.port/
(of course, using my user and password)
The second approach was what was suggested by this SO post, so in my Cygwin environment:
export http_proxy=http://username:password@my.proxy.ip:my.port/
export https_proxy=https://username:password@my.proxy.ip:my.port/
export ftp_proxy=http://username:password@my.proxy.ip:my.port/
In both cases, if I try to test my wget, I get the following:
$ wget http://www.google.com
--2020-01-30 12:12:22-- http://www.google.com/
Resolving my.proxy.ip (my.proxy.ip)... 10.1XX.XXX.XX
Connecting to my.proxy.ip (my.proxy.ip)|10.1XX.XXX.XX|:8XXX... connected.
Proxy request sent, awaiting response... 407 Proxy Authentication Required
2020-01-30 12:12:22 ERROR 407: Proxy Authentication Required.
It looks as if my user and password are not OK, but I actually checked them in my browsers and my credentials work just fine.
Any idea on what this could be due to?
This problem was solved thanks to the suggestion of a user of the AskUbuntu community.
Basically, instead of editing the global configuration file wgetrc, I should have created a new .wgetrc with my proxy configuration in my Cygwin home directory.
In summary:
Step 1 - Create a .wgetrc file:
nano ~/.wgetrc
Step 2 - Record the proxy info in this file:
use_proxy=on
http_proxy=http://my.proxy.ip:my.port
https_proxy=https://my.proxy.ip:my.port
ftp_proxy=http://my.proxy.ip:my.port
proxy_user=username
proxy_password=password
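After saving the file, rerunning the earlier test should now authenticate through the proxy; wget reads ~/.wgetrc automatically, so no extra flags are needed:

$ wget http://www.google.com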
I have a small nginx-based test application that I want to run inside a Docker container, so I followed the example given here: docker installation
I have a folder named restartTest and it contains an index.html file that has a single line in it that says Docker Test 1. I mount this as my volume during runtime for the Docker container, so the command I use is
docker run -dP -v /Users/Sachin/restartTest/:/usr/share/nginx/html --name engine2 nginx
And it runs fine. I use curl to verify that the volume has mounted properly and the application is running as desired. Now what I do is change the content of the index.html file (from my localhost) to Docker Test 2 and then restart the container. I execute the following command to verify that the content has indeed changed inside the Docker container:
docker exec engine2 cat /usr/share/nginx/html/index.html
And as expected, the file reads Docker Test 2. However, when I use curl to see if the webpage also reflects the change, I still get Docker Test 1 as the response: index.html reflects the change, yet whether I run curl or access the app from the browser, I get the same stale result (the exact check I run is sketched after the list below). I have tried the following, but to no avail:
Restart the service
Stop and start the container
Stop and start the boot2docker VM and docker daemon.
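For reference, this is the kind of check I run with curl; the mapped host port below is illustrative, since -P publishes container port 80 on a random host port, and the VM address comes from boot2docker:

$ docker port engine2 80
0.0.0.0:32769
$ curl http://$(boot2docker ip):32769/
Docker Test 1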
I have no clue as to why this is happening.
So I found that this is a known bug with the VirtualBox VM that is used for running Docker on a Mac.
The bug only shows up when content is shared between the host machine and the VirtualBox VM. There is an optimisation in web servers like nginx and Apache (and apparently vert.x): whenever we request a static file from the server, it uses sendfile to deliver the file. The bug is that with VirtualBox shared folders (in the scenario described above) we always get the first version of the file, no matter what we try. The workaround for nginx and Apache is to turn sendfile off (see the nginx sketch after the list below). For vert.x, however, there is a hack that we use:
Rename the file, say login.html, to login.html.moved (anything)
curl :/….../login.html (we won’t get anything)
Rename the file back from login.html.moved to login.html
Hard refresh the page (Command + Shift + R).
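For nginx, a minimal sketch of the sendfile workaround, assuming the stock default.conf layout (everything except the sendfile line is illustrative boilerplate):

server {
    listen       80;
    server_name  localhost;
    sendfile     off;   # work around the VirtualBox shared-folder bug
    location / {
        root   /usr/share/nginx/html;
        index  index.html;
    }
}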
I assume it is a caching problem. Did you try setting expires -1 in your index.html location configuration to disable caching for static files?
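For example, something like this (a hypothetical location block; the root path is illustrative):

location = /index.html {
    root    /usr/share/nginx/html;
    expires -1;   # negative value sends Cache-Control: no-cache, so clients revalidate
}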
I used this code from the command prompt on a Windows box (my Linux machine is at work):
ftp -u ftp://cran.R-project.org/incoming/ qdap_0.1.0.tar.gz
I used the info from:
https://github.com/hadley/devtools/wiki/Release
http://cran.r-project.org/doc/manuals/R-exts.html#Submitting-a-package-to-CRAN
I expected to see it show up here: ftp://cran.r-project.org/incoming/ but I do not see it.
Am I just being impatient or did my package not upload? Here is the command line output:
C:\Users\trinker\GitHub>ftp -u ftp://cran.R-project.org/incoming/ qdap_0.1.0.tar.gz
Transfers files to and from a computer running an FTP server service
(sometimes called a daemon). Ftp can be used interactively.
FTP [-v] [-d] [-i] [-n] [-g] [-s:filename] [-a] [-A] [-x:sendbuffer] [-r:recvbuffer] [-b:asyncbuffers] [-w:windowsize] [host]
-v Suppresses display of remote server responses.
-n Suppresses auto-login upon initial connection.
-i Turns off interactive prompting during multiple file
transfers.
-d Enables debugging.
-g Disables filename globbing (see GLOB command).
-s:filename Specifies a text file containing FTP commands; the
commands will automatically run after FTP starts.
-a Use any local interface when binding data connection.
-A login as anonymous.
-x:send sockbuf Overrides the default SO_SNDBUF size of 8192.
-r:recv sockbuf Overrides the default SO_RCVBUF size of 8192.
-b:async count Overrides the default async count of 3
-w:windowsize Overrides the default transfer buffer size of 65535.
host Specifies the host name or IP address of the remote
host to connect to.
Notes:
- mget and mput commands take y/n/q for yes/no/quit.
- Use Control-C to abort commands.
(This was previously a comment and is being transferred to an answer here.)
Make sure you are not looking at a page cached earlier by your browser.
To perform the actual upload you might want to try the free cross platform FileZilla FTP software. You can upload and concurrently view the contents of the source directory on your machine (in the left pane) and the target directory on CRAN (in the right pane) and view a log of what is happening in the top pane and a progress indicator in the bottom pane. It also has a site manager to store the sites you upload to so you don't need to keep typing in their URL each time you do an upload.
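As for why nothing was uploaded: the output you pasted is just the client's help text. The Windows command-line ftp client does not understand URL-style arguments (note that -u is not among the listed options), so the command never connected. If you want to stay with the built-in client, a sketch of a scripted upload using the -s option from the help text above (the e-mail address used as the anonymous password is a placeholder):

C:\Users\trinker\GitHub>ftp -n -s:upload.txt cran.R-project.org

where upload.txt contains:

user anonymous
you@example.com
binary
cd incoming
put qdap_0.1.0.tar.gz
quit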
I have a server where I store data from Mac A and Mac B.
I use rsync to keep the files updated between my Macs.
I run the following code unsuccessfully:
#!/bin/zsh
# to copy files from my server to my folder
rsync -Pav $Masi:~/private/ ~/Dropbox/Courses/math/
# to copy files from my folder to my server
rsync -Pav ~/Dropbox/Courses/math $Masi:~/private/
I get the following error message
ssh: connect to host port 22: Connection refused
rsync: connection unexpectedly closed (0 bytes received so far) [receiver]
rsync error: unexplained error (code 255) at io.c(600) [receiver=3.0.5]
ssh: connect to host port 22: Connection refused
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(600) [sender=3.0.5]
I have ssh keys in place so the connection should work, since I can use scp without problems.
How can I use rsync between my server and one of my Macs?
I used to do a lot of this. I just ran a test; a few suggestions:
Spell out your entire user@host pattern
Run the ssh connection sans the rsync first; you may need to first approve the host's fingerprint
You do not seem to pass a flag to protect extended attributes; this can yield broken files on OS X. If you do not need resource forks you are OK, but most of the time you do need them.
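For the second suggestion, a quick way to test the plain ssh connection (using the same hypothetical host as the test case below):

$ ssh me@remote.example.com echo ok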
My test case:
$ rsync -Pav ~/Desktop/ me@remote.example.com:~/rsyc-test
In that case, all the files within ~/Desktop were copied to the remote host, into my home dir. Since the directory 'rsyc-test' did not exist, it was made for me. I had a .app on my Desktop; it made it over and, surprisingly, it works. Even some .webloc files made it and appear to work, though I do not trust them.
I would strongly suggest adding in the -E flag
-E, --extended-attributes
Apple specific option to copy extended attributes, resource
forks, and ACLs. Requires at least Mac OS X 10.4 or suitably
patched rsync.
I ran a new test: I moved an Interarchy bookmark to my desktop; I know for a fact these break if they are copied sans resource forks. Running without the -E versus with the -E, there is a difference of 152 bytes in transferred data. The first file on the remote machine did not work; the second transferred file did work.
I cannot help but notice that in your example one of your paths is ~/Dropbox, so this may all not matter, since Dropbox, the app, currently does not support resource forks at all, though I hear there are plans to in the future.
You also are not sending the --delete flag. If your end goal is a mirror of your data, you are not getting that; if your end goal is a backup that continually grows, keeping everything that was ever on the source, the lack of --delete is good.
Other notes:
You can exclude those silly .DS_Store files
--exclude '.DS_Store'
You can also set rsync up in a way to be a true mirror, so you would not need to run your other command, see the man page for details.
My final working command to shove the Desktop of my laptop to a remote machine:
$ rsync -PEav --delete --exclude '.DS_Store' ~/Desktop/ me@remote.example.com:~/rsycn-test
Check "$Masi". Is that the hostname you are trying to reach?
Try the following command to debug it:
rsync -e 'ssh -v' -Pav $Masi:~/private/ ~/Dropbox/Courses/math/
The Connection refused usually happens when there is a connection issue to the remote host (e.g. a firewall).
In your case the problem is that the $Masi variable is empty. If it's not meant to be a variable, use Masi literally.
As per this error:
ssh: connect to host port 22: Connection refused
Notice the double space above after the host word.
The connect to host message doesn't say which host, so you're trying to connect to an empty host. It sounds like a typo or an unset variable in the host name.
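A minimal sketch of the fix, assuming the server is reachable as server.example.com (a hypothetical name; substitute your own):

#!/bin/zsh
Masi=user@server.example.com   # define the host the script relies on
# to copy files from my server to my folder
rsync -Pav $Masi:~/private/ ~/Dropbox/Courses/math/
# to copy files from my folder to my server
rsync -Pav ~/Dropbox/Courses/math $Masi:~/private/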
When trying to run the Flex Builder 3 profiler on any application, I don't get the profiler dialog window, and after a few seconds I get "Socket timeout" in the console window. Any ideas why it can't connect?
I've got the latest debug version of Flash player and have tried shutting off my firewall.
I'm running it on XP from the local drive, i.e. not through localhost.
Thanks,
Alex
It looks like the browser (Firefox in my case) has to be shut down before the profiler is started. Step 1 in the livedocs even says this -- wish I had read it earlier. :)
http://livedocs.adobe.com/flex/3/html/help.html?content=profiler_3.html
Make sure that your firewall does not block port 9999; you can customize the port number too: open Preferences->Flex->Profiler->Connections.
Check /etc/hosts (C:\Windows\System32\drivers\etc\hosts on Windows), and see if it contains the line:
127.0.0.1 localhost
In my case, it was somehow changed to ::1 localhost, and that's why it stopped working.
Thanks to Ziv for the (poorly formatted) answer.
While trying to run my Flex Profiler I got an error message.
In the Flash application I got the following exception:
Error #2044: Unhandled securityError:. text=Error #2048: Security sandbox violation:
file:///C|%2Fwork%2Flabsense%2Fbranches%2Frel%5F1%5F2%5F5%5FEA%2Fsources%2Fui%2F.metadata%2F.plugins%2Fcom.adobe.flash.profiler%2FProfilerAgent.swf?host=localhost&port=9999
cannot load data from localhost:9999.
at ProfilerAgent()[C:\SVN\branches\3.2.0\modules\profiler3\as\ProfilerAgent.as:127]
And in the Flex Profiler console (in Eclipse) I got: Socket timeout.
I am running on Windows Vista,
Flex Builder: 3.2
Flash debugger: 10,0,22,87
Things that I have done to resolve this issue:
Switch the connection port of the profiler to 9998 (and back)
Remove and reinstall the flash debugger player.
Install flex builder 3.2 (instead of 3.0)
Delete all the entries in the mm.cfg file
Add an entry to the mm.cfg:
PreloadSwf=C:\work\labsense\Sources\ui\.metadata\.plugins\com.adobe.flash.profiler\ProfilerAgent.swf?host=localhost&port=9999
or
PreloadSwf=C:\work\labsense\Sources\ui\.metadata\.plugins\com.adobe.flash.profiler\ProfilerAgent.swf?host=localhost&port=9998
or
PreloadSwf=C:/work/labsense/Sources/ui/.metadata/.plugins/com.adobe.flash.profiler/ProfilerAgent.swf?host=localhost&port=9999
or with spaces:
PreloadSwf=C: \ work \ labsense \ Sources \ ui \ .metadata \ .plugins \ com.adobe.flash.profiler \ ProfilerAgent.swf?host=localhost&port=9999
or
C:\work\labsense\Sources\ui\.metadata\.plugins\com.adobe.flash.profiler\ProfilerAgent.swf?
or add all or some of these entries:
TraceOutputFileName=C:\Users\zivo\AppData\Roaming\Macromedia\Flash Player\Logs\flashlog.txt
ErrorReportingEnable=1
MaxWarnings=0
TraceOutputFileEnable=1
ProfilingFileOutputEnable=1
Turn on and off the vista firewall
Add exception for port 9999 in the vista firewall
Try to run the profiler SWF separately
Same result.
Try one last thing:
Because I had experienced a somewhat similar problem before with the Flash debugger, where the resolution was:
Right click on flash player (debugger),
choose “Debugger”,
choose “other machine”
add “127.0.0.1”
click ok
then, it solved the issue (though apparently it connects to the debugger with host 127.0.0.1 instead of localhost, which is the same).
So now I added the following entry to the mm.cfg file:
PreloadSwf=C:/work/labsense/branches/rel_1_2_5_EA/sources/ui/.metadata/.plugins/com.adobe.flash.profiler/ProfilerAgent.swf?host=127.0.0.1&port=9999
Then, after saving, I ran the profiler, and it works!!
And the reason for all this was:
Some program changed the file C:\Windows\System32\drivers\etc\hosts to:
# Copyright (c) 1993-2006 Microsoft Corp.
#
# This is a sample HOSTS file used by Microsoft TCP/IP for Windows.
#
# This file contains the mappings of IP addresses to host names. Each
# entry should be kept on an individual line. The IP address should
# be placed in the first column followed by the corresponding host name.
# The IP address and the host name should be separated by at least one
# space.
#
# Additionally, comments (such as these) may be inserted on individual
# lines or following the machine name denoted by a '#' symbol.
#
# For example:
#
# 102.54.94.97 rhino.acme.com # source server
# 38.25.63.10 x.acme.com # x client host
::1 localhost
127.0.0.1 iDBO # LMS GENERATED LINE
This means that localhost does not lead to 127.0.0.1!!!
Fixing it is easy:
# ::1 localhost
# 127.0.0.1 iDBO # LMS GENERATED LINE
127.0.0.1 localhost
(i.e., comment out the problem lines and restore the correct mapping).
After trying all the other suggestions here, this post on Adobe's forum clued me in.
Adobe forum
When the Flash debug player launches, it looks for mm.cfg in %HOMEDRIVE%%HOMEPATH%. On this particular computer that path is not my home directory on C: but on the file server mapped to I:. So once I created I:\mm.cfg with the contents
PreloadSwf=C:\Users\ehedstrom\Documents\FLEXBU~1\.metadata\.plugins\com.adobe.flash.profiler\ProfilerAgent.swf?host=localhost&port=9999
everything magically started working!
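If you want to check where the player will look on your machine, print the path at a command prompt (standard Windows environment variables; the value shown is illustrative):

C:\>echo %HOMEDRIVE%%HOMEPATH%
I:\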
Check your browser tabs; make sure you have the latest debug player, as you said you did; also make sure that the port is correct, as for some reason the port sometimes changes (to 1001 or 20957) from the default 9999; be sure that your mm.cfg has ProfilingFileOutputEnable=1 and that BitTorrent isn't on.
hth