So I'm trying to download a file that works fine in the browser, but will simply not work using curl:
$ curl http://www.partner.viator.com/partner/admin/tools/links_feeds/downloadFeed.jspa?feed=Products&PUID=10869 -L --O full_viator_product_list.zip
I get:
[1] 10097
-L: command not found
What am I doing wrong?
(Just to prove I've done some homework, the issue here did not help.)
See the & in the URL? That's where it goes wrong. On the Linux command line, an unquoted & basically means 'run everything before it in the background, and continue with the next command, if any'.
If you put the entire URL in quotes, it will work (note also that curl's output-file flag is -o, not --O):
curl 'http://www.partner.viator.com/partner/admin/tools/links_feeds/downloadFeed.jspa?feed=Products&PUID=10869' -L -o full_viator_product_list.zip
When URLs like this appear in a command, make sure you always either quote them or escape the special characters. If you don't, nastier things than this problem can happen.
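To make that concrete, here is a minimal sketch of the difference (example.com and feed.zip are placeholders, not the real feed URL or file name):
# Unquoted: the shell backgrounds 'curl ...feed=Products', treats PUID=10869 as a
# temporary variable assignment, and then tries to run '-L' as a command,
# hence "-L: command not found"
curl http://example.com/feed.jspa?feed=Products&PUID=10869 -L -o feed.zip
# Single quotes: the whole URL, including the &, reaches curl intact
curl -L -o feed.zip 'http://example.com/feed.jspa?feed=Products&PUID=10869'
# Escaping the ampersand works as well
curl -L -o feed.zip http://example.com/feed.jspa?feed=Products\&PUID=10869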
I'm doing some bioinformatics analysis in RStudio, but something strange happens when using system(). I'm also using Windows Subsystem for Linux, so I can run a UNIX executable in my Windows cmd like so:
bash -c "./parasail-master/build/parasail_aligner -a sw_trace_striped_sat -f SSWtemplate.fa -q SSWtest.fa -O EMBOSS -d >OUT.txt"
Don't worry about the specifics: what's important is that I use bash -c to indicate I want to use the UNIX bash, and I'm running the executable parasail_aligner. It all works out, and I get the nice output file "OUT.txt".
Now, since I'm doing my analysis in RStudio, I want to execute this directly from an R script, like so:
system('bash -c "./parasail-master/build/parasail_aligner -a sw_trace_striped_sat -f SSWtemplate.fa -q SSWtest.fa -O EMBOSS -d >OUTER.txt"')
So I just pass the whole command as an argument to system(). But this gives the following error:
input file, query file, and stdin detected; max inputs is 2
This is obviously an error specifically generated by parasail_aligner. The funny thing is: I don't get this error at all from cmd directly, but I do get it when running the command in R using system(). Does anyone have any idea why something like this can happen at all? I would expect system() to just hand its argument to cmd, but clearly it doesn't do this... Running the command in a terminal opened in RStudio also works fine; it is specifically system() that seems to mess things up.
I'm terribly sorry if this question is vague, but I can't give you a simple example which you could use to replicate the error. I've been using system() for a while now and I've never had this kind of problem. I am on Windows, and I've found some people online who say you should use shell() instead of system(), but doing so just gives me the same error.
Maybe it has something to do with this "stdin" thing the error mentions and how R/RStudio handles this, I don't know. But parasail seems to think I give it an extra input "stdin": it is true I give an Input File and a Query File (see error message), but I don't know what this "stdin" is.
If anyone has any ideas about what could be behind this strange behaviour of system(), I'm all ears. I understand that helping me is difficult since I can't give a simple example in which the problem occurs, but I hope someone might know what the problem could be anyway.
UPDATE (answer?): So, I managed to resolve the issue, like so:
system('bash -c "./parasail-master/build/parasail_aligner -a sw_trace_striped_sat < SSWtemplate.fa -f SSWtest.fa -O EMBOSS -d >OUTER.txt"')
I did some searching about stdin, and (forgive me if this sounds amateur, I'm not really familiar with UNIX or the command line) found out that < is the shell operator that redirects a file to stdin. As you can see in the code above, I changed the way I pass in my inputs "SSWtemplate" and "SSWtest", feeding one of them via "<", and this solves the problem.
I have no idea why this happens, especially since it only happens when calling the command from inside RStudio, and not when doing so from cmd. If anyone can clarify this further (i.e. why and how functions like system() and shell() seem to mess with stdin), it would be a big help. Otherwise, I'll just post this as an answer to my own question and leave it at that.
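One way to test that hypothesis (just a sketch; run it exactly the way you run parasail_aligner, once from cmd via bash -c and once via system() in R) is to ask bash what the child process's stdin looks like:
# [ -t 0 ] is true when file descriptor 0 (stdin) is a terminal; if this prints
# "stdin is a pipe/file" only under system(), that extra non-terminal stdin is
# presumably what parasail_aligner counts as a third input
bash -c 'if [ -t 0 ]; then echo "stdin is a terminal"; else echo "stdin is a pipe/file"; fi'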
I have a very unnecessary scripting question: how can I make the command fortune run along with any other command? So, for example, instead of running
something
I want it to ALWAYS effectively run something like (but not exactly)
fortune && something
wherein the fortune command finishes before the other command begins.
Is there a way to do this in Mac OS X Yosemite?
I don't know exactly what you are trying to achieve, but you can try the following.
E.g. you want to run 'hostname' and 'pwd' whenever the 'hostname' command is fired. Add the line below to your bash profile:
alias hostname="hostname;pwd;"
You will get output like this:
[user@host ~] hostname
host1.example.com
/data/data2/new
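If you only want this for one specific command rather than globally, a small wrapper function is another option; a sketch, where 'something' stands in for your real command:
# Run fortune first, then the real 'something' binary;
# 'command' bypasses this function so it does not call itself
something() {
  fortune && command something "$@"
}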
On my Unix server, I execute this command to copy all content from folderc via the Unix shell.
wget -r -nH --accept=ismv,ismc,ism,jpg --cut-dirs=5 --level=0 --directory-prefix="/root/sstest" -o /root/sstest2.log http://site.com/foldera/folderb/folderc/
All the content from folderc is actually copied to /root/sstest.
However, wget does not exit after copying and return me to the command prompt.
What could be causing this behaviour?
I had the same problem, and I just added a single quote to the front and the end of the URL.
This step resolved the issue for me.
It's possible that the HTTP server miscommunicates the length of a response, so that Wget keeps waiting for more data. It could be due to a bug in Wget or in the server (or a software component running on the server) which you don't notice in an interactive web browser.
To debug this, make sure you are running the latest version of Wget. If the problem persists, use the -d flag to collect the debug output, and send a report about the misbehavior to Wget developers at bug-wget@gnu.org. Be sure to strip the sensitive data, such as passwords or internal host names, from the report before sending it.
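For the command from the question, a debug run could look roughly like this (a sketch; it only adds -d to the original invocation, and sstest_debug.log is just a placeholder log name):
# -d enables wget's debug output; -o writes the (now very verbose) log to a file
wget -d -o /root/sstest_debug.log -r -nH --accept=ismv,ismc,ism,jpg --cut-dirs=5 \
     --level=0 --directory-prefix="/root/sstest" http://site.com/foldera/folderb/folderc/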
I observe a similar problem when downloading files from dropbox with wget:
the download finishes (file is complete)
wget (or curl, depending on what I use for the download) does not show up in the list of running processes anymore after the file is complete
wget (or curl) does not return to the command prompt
returning to the command prompt can be "forced" by simply hitting enter; I do not have to actually kill any process to get back to the prompt, it's just kind of stuck until I press enter one more time.
The problem is not wget-specific; it also occurs when I try to download the same file from the same location with curl. The problem does not occur at all if I download the same file from several Unix web servers, neither with wget nor with curl.
I have tried using timeout (with a sufficiently long time) to force wget/curl to return to the command prompt, but they do not even return to the command prompt after timeout kills them.
I am sitting on a Mac OS X system and I cannot get around a simple problem from the domain of working with the command line: using the command curl http://mureakuha.com/dl.php?type=1&id=1234 I get no data from an (obviously) PHP script that generates plain text files.
I expect the solution to be a matter of passing right flags to curl, yet I have no clue where to start. Any help much appreciated.
Try curl 'http://mureakuha.com/dl.php?type=1&id=1234'. The problem here is the unquoted & symbol in the URL.
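Quoting is the simplest fix. As an alternative sketch, you can also let curl assemble the query string itself, so the shell never sees an &:
# -G appends the --data-urlencode pairs to the URL as ?type=1&id=1234
curl -G --data-urlencode 'type=1' --data-urlencode 'id=1234' http://mureakuha.com/dl.php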
I recently started using Zsh for some of its integrated shell-prompt support for my Git status, etc.
When I type in:
ruby -v
to confirm the version of ruby I'm running, Zsh asks if I want to change the command to _ruby. Even after saying no at the prompt, and with the command completing as expected, I continue to get the question every time, despite confirming that my command is correct.
I'm assuming there is a completion file or something of the sort.
Thanks
Update:
The shell is no longer trying to correct to _ruby; it stopped prompting after I closed the shell a few times, somehow.
I tried to clean the file up several times, but there is an "opts" variable that is 50 or more lines long and the lines all run together, some more than 150 characters long. Maybe I could email it to you as an attachment if you still want to see it.
I sincerely apologize for the messy post.
This is command autocorrection, activated by the correct option (setopt correct). It has nothing to do with completion. You're seeing _ruby because zsh thinks there is no ruby command and offers _ruby as the nearest existing match.
If you've just installed ruby, it's possible that zsh memorized the list of available commands earlier, and it won't always check whether the command has appeared in the meantime. In that case, run hash -rf. Future zsh sessions won't have this problem since the ruby command already existed when they started.
Sometimes, when you change your PATH, zsh forgets some hashed commands. The option hash_list_all helps against this. As above, you can force zsh to refresh its command cache with hash -rf.
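In practice that means something like the following (a sketch), either typed once in the running shell or put in ~/.zshrc:
hash -rf               # empty and rebuild zsh's command hash table in this session
setopt hash_list_all   # hash the full command path before correction/completion decisions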
You could make an alias:
alias ruby='nocorrect ruby'
That's what I did when zsh kept asking me if I meant .meteor when I typed meteor; auto-correction is still useful from time to time, so I didn't want to disable it entirely.
I find the autocorrect feature can get annoying at times, so I set this in my ~/.zshrc:
DISABLE_CORRECTION="true"
I had the same problem, even when the command was not installed.
I solved it using the CORRECT_IGNORE variable in my .zshrc:
# OPTs to enable
setopt HASH_LIST_ALL
setopt CORRECT
# Zsh variable to determine what to ignore,
# in this case everything starting with _ or .
CORRECT_IGNORE="[_|.]*"
I hope this helps you or anyone else with this issue.
Some time ago, after an update, I ended up with command auto-correction enabled, which I don't want. If the same happened to you and you want to revert it, in the ~/.zshrc file you'll have to change it to:
# Uncomment the following line to enable command auto-correction.
ENABLE_CORRECTION="false"
or comment it out as below:
# Uncomment the following line to enable command auto-correction.
# ENABLE_CORRECTION="true"
Just a note: on my zsh (version 5.7.1 on macOS), the DISABLE_CORRECTION setting didn't work.
I saw the following two lines in my .zshrc file, which I then commented out:
setopt CORRECT
setopt CORRECT_ALL
That did it for me.