Parallel downloading of a file using the command line and lftp over HTTP

I am looking into how to use lftp to download a file in parallel over HTTP. I see this example:
lftp -c "pget -n 10 http://example.com/foo.bar"
However, I am not finding any information on how to specify custom HTTP headers and cookie values here. Any help would be appreciated.
Thanks!

For cookies there is the http:cookie setting; see the man page. Arbitrary custom headers are not supported yet, but a few specific headers can be set via the http:* settings.
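A minimal sketch of how that could look, assuming the http:cookie and http:user-agent settings from the lftp man page (the cookie value itself is just an illustration):

lftp -c 'set http:cookie "session=abc123"; set http:user-agent "MyClient/1.0"; pget -n 10 http://example.com/foo.bar'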

Related

How can I get nginx fastcgi to run requests in parallel?

I'm trying to get nginx to serve more than one connection at a time, with a fastcgi backend.
This Stack Overflow answer might contain the solution, but it neglects to say where that option should be configured. All the options I see are in config files. Where would I put command-line options like "-c 2"? It's not nginx -c; that's the config file. I don't see anywhere that looks like it would take command-line options.
OK, it looks like I don't need the answer linked above. The setting is
FCGI_CHILDREN
The reason I had a bit of trouble finding this is that the setting is not in nginx's config, it's in fcgiwrap's config. That is (on my machine) in /etc/init.d/fcgiwrap. Change FCGI_CHILDREN to something larger than 1.
FCGI_CHILDREN="5"
Just changing that to greater than one allowed me to run more than one request at a time.
The linked answer mentions
if you are using spawn-fcgi to launch fcgiwrap then you need to use -f "/usr/bin/fcgiwrap -c 5"
but I did not have to do that.
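For reference, the relevant part of /etc/init.d/fcgiwrap (the Debian-style init script; variable names and defaults may differ between distributions) looks roughly like this:

# excerpt from /etc/init.d/fcgiwrap
FCGI_CHILDREN="5"                        # number of fcgiwrap worker processes
FCGI_SOCKET="/var/run/fcgiwrap.socket"   # the socket nginx's fastcgi_pass points at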

Change Jenkins basepath

I'm trying to serve a stock Jenkins installation (on an Amazon Linux AMI) through myjenkinsinstance:8080/jenkins (rather than myjenkinsinstance:8080), and then proxy it with e.g. Nginx (over HTTP).
This question has been 'answered' before, but the solution doesn't seem to be relevant anymore.
#admins: I would prefer to comment on that thread (specifically this 'answer') rather than opening a duplicate, but I am not allowed to per my 'reputation' score (my comment would not be a solution at all, but a further request for help).
From the closest thing to an answer I've seen:
Go to the Jenkins home directory (I have mine in C:\Jenkins)
Edit jenkins.xml
Add this --prefix=/jenkins to the end of the arguments as shown below and restart the Jenkins service. All worked OK for me!
Example: <arguments>-Xrs -Xmx256m -Dhudson.lifecycle=hudson.lifecycle.WindowsServiceLifecycle -jar "%BASE%\jenkins.war" --httpPort=8080 --prefix=/jenkins</arguments>
Open the URL http://localhost:8080/jenkins; this should bring up the home page of Jenkins.
However, there is no 'jenkins.xml' in the $JENKINS_HOME directory (there is a config.xml),
there is no <arguments/> entry in that config.xml,
and there seems to be no other configuration for the initial installation.
There's also a 'Jenkins Location > Jenkins URL' setting in the "Configure System" settings (myjenkinsinstance/configure), but modifying this seems to have no noticeable effect.
The end goal would be to automate this installation via e.g. CloudFormation (as part of the EC2's UserData).
Any suggestions would be greatly appreciated.
On your Linux system, you need to find the Jenkins default config file, located at
/etc/default/jenkins
and then add the following arguments according to your requirements. This is a rough idea:
JENKINS_ARGS="--webroot=/var/cache/jenkins/war --prefix=/jenkins
--httpPort=$HTTP_PORT --ajp13Port=$AJP_PORT"
This will most likely work. If it doesn't, please update your question with the arguments currently present. This works fine on Debian/Ubuntu.
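After editing the file, Jenkins has to be restarted for the change to take effect, e.g. (on init-script based systems):

sudo service jenkins restart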
Also, are you running Jenkins on a Windows machine or Linux?
So my 'solution' was to use sed to insert some lines into /etc/nginx/nginx.conf and /etc/init.d/jenkins.
e.g.
sed -i '/^ location \/ {/aproxy_pass http://127.0.0.1:8080/;' /etc/nginx/nginx.conf
sed -i '/^PARAMS=/ s/"$/ --prefix=\/jenkins"/' /etc/init.d/jenkins
I highly doubt this is anywhere near a 'best practice', but it seems to work for now. (What happens if I update with yum, I'm not sure, but the plan is to back the instance with an Elastic File System, which will hopefully let us treat the Jenkins instance as ephemeral anyway.)
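For clarity, the first sed command above just drops a proxy_pass directive into the existing default server, so the resulting nginx config ends up containing a block roughly like this (assuming the stock nginx.conf layout on Amazon Linux, where a location / { block already exists):

location / {
    proxy_pass http://127.0.0.1:8080/;
}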

How can I apply a patch.diff file from sourceforge.net?

I found a patch that will fix a bug in my PyOpenGL program here.
But I have no idea how to install this patch on Arch Linux.
Can anyone help me?
First of all, this has nothing to do with Python or OpenGL, so you might want to remove those tags.
You are looking for the patch utility. Then, in the directory that contains the src folder, do something like
patch -p0 -b < patch-py33-import.diff
The -b switch creates backup files before patching, and the -p0 switch tells patch to use the complete path/filename given in the patch file.
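If you want to see what the patch would change before actually applying it, GNU patch also supports a dry run:

patch -p0 --dry-run < patch-py33-import.diff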

Setting nosignal option for CURL command line?

From the libcurl documentation, I understand that NOSIGNAL has to be set to 1 when using libcurl in a multi-threaded program.
However, if I am calling curl from a command line, I don't see a NOSIGNAL switch/option. How do I set nosignal when calling curl directly?
You can't: as you said, there is no such option documented in curl's help (and you can also verify that there is no reference to CURLOPT_NOSIGNAL in the command-line tool's source code; see src/tool_operate.c).
Now the question is why you would need this, as Celada asked. Using libcurl in a multi-threaded context makes sense (e.g. you may want to create/reuse curl handles from several threads), and you should then follow the best practices, but I don't see the point for the CLI tool.

Cannot load a plain text file generated by a PHP script using the curl utility

I am on a Mac OS X system and I cannot get around a simple command-line problem: using the command curl http://mureakuha.com/dl.php?type=1&id=1234 I get no data from an (obviously) PHP script that generates plain text files.
I expect the solution to be a matter of passing the right flags to curl, yet I have no clue where to start. Any help is much appreciated.
Try curl 'http://mureakuha.com/dl.php?type=1&id=1234'. The problem is the unquoted & in the URL: the shell treats & as "run the preceding command in the background", so curl only sees the URL up to type=1, and id=1234 is interpreted as a separate shell variable assignment.
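Double quotes work just as well, and if you prefer to build the query string explicitly, curl can URL-encode the parameters for you with -G and --data-urlencode:

curl "http://mureakuha.com/dl.php?type=1&id=1234"
curl -G "http://mureakuha.com/dl.php" --data-urlencode "type=1" --data-urlencode "id=1234"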
