Vim: edit file over HTTP

I am trying to do this:
vim http://mysite.com/x.html
I have run chmod 777 on the file to make sure full access is granted. I can open the file without a problem,
but when I try to save my edit, I get this error:
"http://mysite.com/x.html" E212: Can't open file for writing

You (obviously) can't upload that file to the server via http.
Use ssh/scp or ftp.
See :help netrw.

HTTP is the wrong protocol here. This makes for a good read: http://www.w3.org/blog/2008/10/understanding-http-put/ - the HTTP verbs (PUT, POST, GET, etc.) do not dictate how the server is going to handle the request you send. In fact, HTTP "defines the intended semantics of the communication... (but) does not define how either side fulfills those expectations".
You could quite easily run
vim http://stackoverflow.com/questions/19476683/vim-edit-file-over-http
but you won't be able to edit this page.
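To see this for yourself, you can hand-craft a PUT request with curl; a server that doesn't implement PUT for that resource will typically just answer 405 Method Not Allowed (or 403):
curl -i -X PUT --data-binary @x.html http://mysite.com/x.html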
See http://vim.wikia.com/wiki/Editing_remote_files_via_scp_in_vim for working on files via ssh / ftp.
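For example, if you have SSH access to the server, netrw lets you edit and save the file in place (the user name and server-side path here are placeholders):
vim scp://user@mysite.com//var/www/x.html
Note the double slash: the second slash makes the path absolute on the remote machine.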

Related

wget won't download files I can access through browser

I am an amateur historian trying to access newspaper archives. The server where the scans are located "works" using an outdated tif viewer that doesn't seem to actually work at all anymore. I can access the files individually in chrome without logging in, but when I try to use wget or curl, I'm told that viewing the file is unauthorized, even when I use my login info, and even when using my cookies from chrome.
Here is an example of one of the files: https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
When I put this into chrome, it automatically downloads the file even though I cannot access the directory itself, but when I use wget, I get the following response: "401 unauthorized Username/Password Authentication Failed."
This is the basic wget command I'm using (if I can get it to work at all, then I'll input a list of the other files):
wget --no-check-certificate https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
I've tried variations with and without cookies, with a blank user, and with and without login credentials. As I'm sure you can tell, I'm new to this sort of thing but eager to learn.
From what I can see, authentication on your website is done with HTTP Basic. This kind of authentication does not use HTTP cookies; it uses the HTTP Authorization header. You can pass HTTP Basic credentials to wget with the following arguments:
wget --http-user=YourUsername --http-password=YourPassword https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
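For reference, the curl equivalent uses -u to send the same Authorization header (same placeholder credentials as above):
curl -u YourUsername:YourPassword -o 78101001.TIF https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF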

Error occurring when trying to upload file via FTP using Qt

I have a file in a folder and I want to send it to my embedded Linux device via FTP (much like in several other questions I've seen). I know the steps to do it, but I'm failing when it comes to creating the correct QUrl: when I call put, I always get error 301:
QNetworkReply::ProtocolUnknownError 301 the Network Access API cannot honor the request because the protocol is not known
As details: I want to save the file in a specific directory located on an SD card in the device, /media/mmcblk0p2/bin, and the connection doesn't have, at least for now, a user name and password defined¹. It's also interesting to note that I'm not able to connect via FTP from a terminal; it always says "421 Service not available, remote server has closed connection", which is not the same problem AFAIK. (By the way, I am able to connect via SSH using FileZilla, so it's not a hardware/physical problem.)
So where is the problem? I have exactly the same code as in the examples mentioned. As of now, the URL I'm using is
ftp://10.1.25.10/media/mmcblk0p2/bin/center.png
(as printed from the QUrl object with qDebug), and I'm not able to make it work.
Any help would be appreciated.
¹: By the way, I remember reading somewhere that when one doesn't use a user name for connecting to FTP, the system only allows the client to connect to the /ftp folder. Is that true? If so, would just calling QUrl::setUserName("root"); suffice?
I finally discovered my problem: since I was copying the code from examples that upload to HTTP servers, I was using the function post, which is specific to HTTP, instead of put, which was the correct function to use.
Regarding the QUrl, I used QUrl urlTemp("//10.1.25.10/test.info"); and told it to use FTP by setting the scheme: urlTemp.setScheme("ftp");
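As a side note, the same upload can be sanity-checked outside Qt with curl, assuming curl is available on the development machine (the root user name is hypothetical; adjust to the device's FTP setup):
curl -T center.png ftp://root@10.1.25.10/media/mmcblk0p2/bin/
The trailing slash tells curl to keep the original file name.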

Error when uploading package to CRAN incoming: 550 access denied

I'm trying to upload a package on CRAN for its first release, but I can't get past the FTP upload.
It seems I do not have write access to ftp://cran.r-project.org/incoming:
550 Access is denied.
Could not download /home/roudierp/Documents/CODE/lhs/fresh_meat/clhs_0.4-2.tar.gz from local filesystem
There were 1 files or directories that could not be transferred. Check the log for which items were not properly transferred.
I tried with two file browsers (Dolphin and Konqueror), two GUI-based FTP clients (FireFTP and GFTP) and with good ol' ftp command line interface, with no success.
I used anonymous as the user name, and my email address or nothing at all as passwords.
I also tried to use curl and explicitly disable EPSV as per this post:
curl --disable-epsv -T clhs_0.4-2.tar.gz ftp://cran.R-project.org/incoming/
But I still get an access denial error:
curl: (25) Failed FTP upload: 550
Any idea what I'm doing wrong?
I finally managed to upload my package: as expected, it was a problem on my side.
It seems that the proxy I am behind while at work was to blame and somehow blocked the upload. Weird, as I've uploaded files to external FTP sites in the past, and as I'm pretty sure port 21 is not blocked.
Anyway, I managed to upload my archive to ftp://cran.R-project.org/incoming/ from a direct internet connection without a problem.
Thanks,
Pierre

Intercept and use local files in http requests

I'm trying to find a tool that will allow non-programmers to test files on a live server.
For example, they could modify an image on their computer, reload a webpage, then see the results of their work immediately.
I've tried finding a tool for this, because it seems obvious enough that someone must have thought of it, but a lot of the software I see doesn't quite fit. A tool called Fiddler does this (they call it autoresponding), but it's Windows-only. I could change the hosts file to redirect to a local instance of nginx or something, but that seems difficult to maintain when all I really want is a simple tool that will do something like this:
http://someserver.com/css/(.*) -> /home/user/localcss/$1
Does anybody have any recommendations?
Edit: Redirect clarification
Fiddler has this feature; just click the AutoResponder tab and map URLs to local files. Thousands of people do this every day.
See also video #5 here: http://www.fiddlerbook.com/fiddler/help/video/default.asp
I found Charles Proxy very useful for this
http://www.charlesproxy.com/documentation/tools/map-local/
Max's PAC solution was a life-saver, so I'm providing more details (can't upvote yet).
To use a local version of, say, css files, create a file 'proxy.pac', which contains this function:
function FindProxyForURL(url, host)
{
    // use a regex to match requests ending with '.css'
    // and redirect them to a proxy on localhost
    // (PAC "PROXY" entries take host:port)
    var regexpr = /\.css$/;
    if (regexpr.test(url))
    {
        return "PROXY localhost:80";
    }
    // or else connect directly:
    return "DIRECT";
}
Save 'proxy.pac' and point your browser to this file. In Firefox this is in Options > Advanced > Connection > Settings > Automatic Proxy Configuration URL
For best practice, also add a MIME type to your web server: map '.pac' to type 'application/x-ns-proxy-autoconfig'.
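With Apache, for example, that mapping is a one-line directive (assuming the PAC file is served by Apache):
AddType application/x-ns-proxy-autoconfig .pac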
All requests to .css files will now be routed to localhost. Don't forget to ensure the file structure is the same on the proxy server.
In the case of CSS, it may well be easier to override CSS by using a local chrome. For example in Firefox, chrome/userContent.css. See http://kb.mozillazine.org/UserContent.css
It's been a while since I asked this question, and I have a good technique that wasn't suggested.
PAC files are supported by all major browsers and let you write a script that redirects any individual request to a proxy server. So, for example, the proxy server can serve a PAC file that redirects whitelisted URLs to the proxy server itself, which then returns the local versions of those files. It can even support HTTPS.
Beware of one gotcha: Internet Explorer. It helpfully "caches" the results of this script incorrectly, so that if one URL on a domain is proxied, all URLs on that domain will be proxied. This feature can be disabled, however.
You can do this with the modify response rule in Requestly.
Using the local file option you can specify any file to be used as the response for the intercepted request.
According to their documentation it also supports hot reloading, i.e., as long as the file path remains the same, the rule will pick up the changes that you made.
As for dynamic URL matching, they have support for regex and wildcards in their source filters.
Note: This is currently only available in their desktop app.
If you want to implement this using their Chrome extension, which is what I personally did, you can use the Redirect rule paired with a mock server. Here is a page explaining this.
You can set up a mock server / mock file endpoint within Requestly instead of using something like nginx or a local server. But this works only for text-based content, not images.
This would also bypass any setup on the tester's local machine. They would only need to install the extension. All you would have to do is send them the endpoint for your mock server and the redirect rule that you created.
Actually, you can't do this, because browsers don't allow pages loaded over http:// to access files on the local machine (just think for a moment about what would happen if a malicious webpage could load private files from your computer).
Some browsers (e.g. Safari) allow files loaded over file:// to access other file:// files, others don't, but no browser allows http:// to access file://.
Firefox has a feature called "Signed scripts", which are scripts digitally signed with a trusted certificate. They can ask the user to grant them access to the local hard drive. Look at this: http://www.mozilla.org/projects/security/components/signed-scripts.html
Do you mean the Fiddler Web Proxy (www.fiddler2.com)? There is a commercial Java-based alternative named Charles Web Proxy that may fit your needs.

How to use http-delete from the shell

Is it possible to send an HTTP DELETE request from the shell and, if so, how?
curl -X DELETE URL
Note that HTTP method names are case-sensitive, so use uppercase DELETE. See the curl manual for the -X/--request option.
There are probably a lot of tools that can do this, but the easiest, if you don't want to depend on those tools being available, might be to create your DELETE request manually and send it to the server via telnet. I think the syntax is pretty much like this, although I've never used the DELETE command manually myself, only GET and POST.
DELETE /my/path/to/file HTTP/1.1
Host: www.example.com
Connection: close
There must be an extra newline (blank line) after the headers to terminate the request. Store the request in a file (or paste it into the console if you don't want to use a script), then simply do
telnet www.example.com 80 < myRequest.txt
Of course, you can use a here-document as well.
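For example, a sketch of the here-document version (note the blank line before EOF, which terminates the request; HTTP expects CRLF line endings, which telnet usually sends in line mode):
telnet www.example.com 80 <<'EOF'
DELETE /my/path/to/file HTTP/1.1
Host: www.example.com
Connection: close

EOF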
