I'm trying to upload a package to CRAN for its first release, but I can't get past the FTP upload.
It seems I do not have write access to ftp://cran.r-project.org/incoming:
550 Access is denied.
Could not download /home/roudierp/Documents/CODE/lhs/fresh_meat/clhs_0.4-2.tar.gz from local filesystem
There were 1 files or directories that could not be transferred. Check the log for which items were not properly transferred.
I tried with two file browsers (Dolphin and Konqueror), two GUI-based FTP clients (FireFTP and gFTP), and the good ol' ftp command-line interface, with no success.
I used anonymous as the user name, and my email address or nothing at all as the password.
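For reference, the command-line attempt looked roughly like this (a reconstruction rather than the exact session):
ftp cran.r-project.org
# Name: anonymous, password: my email address
cd incoming
binary
put clhs_0.4-2.tar.gz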
I also tried to use curl and explicitly disable EPSV as per this post:
curl --disable-epsv -T clhs_0.4-2.tar.gz ftp://cran.R-project.org/incoming/
But I still get an access denial error:
curl: (25) Failed FTP upload: 550
Any idea what I'm doing wrong?
I finally managed to upload my package: as expected, it was a problem on my side.
It seems the proxy I'm behind while at work was to blame, and somehow blocked the upload. Weird, as I've uploaded files to external FTP sites in the past, and as I'm pretty sure port 21 is not blocked.
Anyway, I managed to upload my archive to ftp://cran.R-project.org/incoming/
from a direct internet connection without problem.
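If anyone else hits this and can't get off the proxy, it may be worth checking whether curl is picking up a proxy from the environment. A sketch that forces a direct connection (untested here, and it only helps if direct outbound FTP is allowed at all):
# Ignore any ftp_proxy/all_proxy environment variables
curl --noproxy '*' --disable-epsv -T clhs_0.4-2.tar.gz ftp://cran.R-project.org/incoming/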
Thanks,
Pierre
I am trying unsuccessfully to publish from SBT into a Nexus repository running on my network. Attempts to publish fail with a forbidden error
If I look at the Nexus side of things with trace debugging on, I can see the request, but there is no Authorization header in the request.
This is my build.sbt
And this is my credentials file
I have used curl to see what the Realm should be, which hopefully I have reflected in my credentials file
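For anyone checking the same way, the realm comes back in the WWW-Authenticate header of an unauthenticated request; a sketch with an illustrative host (the realm value shown is the usual Nexus default, not necessarily yours):
# Ask for headers only; the 401 response carries the realm to copy
curl -I http://nexus.example.com:8081/nexus/content/repositories/releases/
# WWW-Authenticate: BASIC realm="Sonatype Nexus Repository Manager"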
But nothing I do seems to get the Authorization header in the PUT request. Is there something obvious I am missing? I feel like I am spinning my wheels.
Thanks for any help
This did end up being how I had set up my files. My build.sbt was fine. However, in my credentials file the host value ended with ":8081", and it looks like IvyAuthenticator matches on the host name without the port information, so my credentials were never applied. I spotted this in an error message when running through the sbt shell in IntelliJ.
After updating my credentials file so that the host value was just the machine name, without any port details, my publish succeeded.
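For anyone with the same symptom, a minimal sketch of the two files that worked, with illustrative host and credentials; the important part is that host carries no port:

~/.sbt/.credentials:
realm=Sonatype Nexus Repository Manager
host=nexus.example.com
user=deployment
password=deployment123

and the line in build.sbt that loads it:
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")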
I am an amateur historian trying to access newspaper archives. The server where the scans are located "works" through an outdated TIF viewer that doesn't seem to actually work at all anymore. I can access the files individually in Chrome without logging in, but when I try to use wget or curl, I'm told that viewing the file is unauthorized, even when I use my login info, and even when using my cookies from Chrome.
Here is an example of one of the files: https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
When I put this into Chrome, it automatically downloads the file even though I cannot access the directory itself, but when I use wget, I get the following response: "401 unauthorized Username/Password Authentication Failed."
This is the basic wget command I'm using (if I can get it to work at all, then I'll input a list of the other files):
wget --no-check-certificate https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
I've tried variations with and without cookies, with a blank user, and with and without login credentials. As I'm sure you can tell, I'm new to this sort of thing but eager to learn.
From what I can see, authentication on this website is done with HTTP Basic auth. This kind of authentication does not use HTTP cookies; it uses the HTTP Authorization header. You can pass HTTP Basic credentials to wget with the following arguments.
wget --http-user=YourUsername --http-password=YourPassword https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF
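If you'd rather stay with curl, which you also tried, the equivalent is -u for the same Authorization header, plus -k to mirror wget's --no-check-certificate:
curl -k -u YourUsername:YourPassword -O https://ulib.aub.edu.lb/nahar/images2/7810W2/78101001.TIF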
I have a file in my folder and I want to send it to my embedded Linux device via FTP (much like this and this and this). I know the step-by-step procedure, but I'm failing when it comes to creating the correct QUrl: when I call put, I always get error 301:
QNetworkReply::ProtocolUnknownError 301 the Network Access API cannot honor the request because the protocol is not known
As details, I want to save the file in a specific directory located on an SD card in the device, /media/mmcblk0p2/bin, and the connection doesn't have, at least for now, a user name or password defined¹. It's also interesting to note that I'm not able to connect via FTP from the terminal; it always says "421 Service not available, remote server has closed connection", which is not the same problem AFAIK. (Btw, I am able to connect via SSH using FileZilla, so it's not a hardware/physical problem.)
So where is the problem? I have exactly the same code as in the mentioned links. As of now, the URL I'm using is
ftp://10.1.25.10/media/mmcblk0p2/bin/center.png
(when printing the QUrl object with qDebug), and I can't make it work.
Any help would be appreciated.
¹: Btw, I remember reading somewhere that when one doesn't provide a user name for an FTP connection, the system only allows the client to access the /ftp folder. Is that true? In that case, would just calling QUrl::setUserName("root"); suffice?
I finally discovered my problem: since I was copying the code from examples that upload to HTTP servers, I was using the post function, which is specific to HTTP, instead of put, which was the correct function to use.
Regarding the QUrl, I used QUrl urlTemp("//10.1.25.10/test.info"); and told it to use FTP by setting the scheme with urlTemp.setScheme("ftp");
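For completeness, a minimal sketch of the working upload, assuming Qt 5 (FTP support was removed from QNetworkAccessManager in Qt 6); the IP and target path are the ones from the question, and the "root" user name is the hypothetical fix from the footnote:

#include <QCoreApplication>
#include <QDebug>
#include <QFile>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QUrl>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QFile file("center.png");
    if (!file.open(QIODevice::ReadOnly))
        return 1;

    // Build the FTP URL the same way as described above
    QUrl url("//10.1.25.10/media/mmcblk0p2/bin/center.png");
    url.setScheme("ftp");
    url.setUserName("root"); // hypothetical, see the footnote

    QNetworkAccessManager manager;
    // put(), not post(): post() is specific to HTTP
    QNetworkReply *reply = manager.put(QNetworkRequest(url), &file);
    QObject::connect(reply, &QNetworkReply::finished, [&]() {
        if (reply->error() != QNetworkReply::NoError)
            qDebug() << "Upload failed:" << reply->errorString();
        reply->deleteLater();
        app.quit();
    });

    return app.exec();
}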
I am trying to do this:
vim http://mysite.com/x.html
I ran chmod 777 on the file to make sure full access is granted, and I can open the file without a problem,
but when I try to save my edit, I get an error:
"http://mysite.com/x.html" E212: Can't open file for writing
You (obviously) can't upload that file to the server via http.
Use ssh/scp or ftp.
See :help netrw.
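For example, netrw can write the file back over scp (user and server path here are illustrative; note the double slash for an absolute path):
vim scp://user@mysite.com//var/www/x.html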
HTTP is the wrong protocol here. This makes for a good read: http://www.w3.org/blog/2008/10/understanding-http-put/ - the HTTP 'verbs' (PUT, POST, GET, etc.) do not dictate how the server is going to handle the request you send. In fact, HTTP "defines the intended semantics of the communication... (but) does not define how either side fulfills those expectations".
You could quite easily run
vim http://stackoverflow.com/questions/19476683/vim-edit-file-over-http
but you won't be able to edit this page.
See http://vim.wikia.com/wiki/Editing_remote_files_via_scp_in_vim for working on files via ssh / ftp.
I'm trying to do a very simple CFHTTP GET call to a local website running on IIS7, however it throws a 408 Connection Failure.
I've done all the obvious things:
The site is listed in the hosts file locally
I've added the CFHTTPPARAM tags for IIS compression issues (deflate;q=0); see the sketch after this list
Surfing to the URL in the browser works fine
Doing a CFHTTP call to google.com works fine; no local sites work at all.
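For reference, the compression workaround mentioned in the list typically looks like this (URL and result name illustrative):
<cfhttp url="http://mylocalsite/" method="get" result="httpResult">
  <!--- Refuse compressed responses to dodge IIS compression issues --->
  <cfhttpparam type="header" name="Accept-Encoding" value="deflate;q=0">
  <cfhttpparam type="header" name="TE" value="deflate;q=0">
</cfhttp>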
Searching on Google turns up others who have had this, but no solutions.
Anyone successfully got through this issue?
If you are using a private or not-well-known certificate provider, you may need to add the provider's public key to the JRun keystore.
Here's more info on how to do that:
http://cfmasterblog.com/2008/11/09/adding-a-certificate-to-the-coldfusion-keystore/
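The gist of that article is Java's keytool; a sketch, where the keystore path is a placeholder that varies by ColdFusion version (changeit is the stock cacerts password):
keytool -import -alias myprovider -file provider-ca.crt \
  -keystore /path/to/coldfusion_jre/lib/security/cacerts \
  -storepass changeit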
You may just need to restart CF if you changed your HOSTS file after CF was started. It caches DNS entries pretty greedily.
CFHTTP is a bad implementation. Use cfx_http instead.