I am in a student job where I am required to work with a DB, but it really isn't my domain.
The documentation says to enter the line
GET /_cat/health?v
This returns the error
-bash: GET: command not found
It also offers a "Copy as cURL" option, and the command that works is
curl -XGET 'localhost:9200/_cat/health?v&pretty'
How can I make the command "GET /_cat/health?v" work?
GET is a request method of the HTTP protocol. Unless you are writing HTTP server or client software yourself, you don't have to deal with it explicitly.
The command line
curl -XGET 'localhost:9200/_cat/health?v&pretty'
tells curl to request the URL http://localhost:9200/_cat/health?v&pretty using the GET request method.
GET is the default method, so you don't need to specify it explicitly.
Also, the second argument you provide to curl is not a full URL. curl is nice and completes it to a correct URL, but other programs that expect URLs might not do the same (for various reasons). It's better to always specify complete URLs to get the behaviour you expect.
Your command line should be:
curl 'http://localhost:9200/_cat/health?v&pretty'
The apostrophes around the URL are required because it contains a character that is special to the shell (&). A string enclosed in apostrophes tells the shell not to interpret any special characters inside it.
Without the apostrophes, the shell thinks the curl command ends at the &, treats pretty as a separate command, and the result is not what you expect.
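If you prefer not to quote the whole URL, escaping just the ampersand also works. This is only a sketch of the same request, assuming the default local Elasticsearch install on port 9200:
curl http://localhost:9200/_cat/health?v\&pretty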
Behind the scenes, curl uses HTTP to connect to the server localhost on port 9200 and sends it this HTTP request:
GET /_cat/health?v&pretty
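If you want to see exactly what curl puts on the wire, the -v flag prints the request and response headers; a quick check, again assuming Elasticsearch is listening on localhost:9200:
curl -v 'http://localhost:9200/_cat/health?v&pretty'
The lines starting with > in the output are the request curl sent, including the GET line shown above.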
When you start working with Elasticsearch, one of the first things the documentation asks you to do to test your install is a GET /_cat/health?v.
They fail to tell you that this will not work in a terminal, as Ravi Sharma has explained above. Maybe the Elasticsearch team should clarify this a bit. At least they supply a "Copy as cURL" link. It is just frustrating for someone new to this.
The GET command is provided by the libwww-perl package:
sudo apt install libwww-perl
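Note that even with libwww-perl installed, the GET program expects a full URL rather than the bare path from the documentation. A sketch, assuming the same local Elasticsearch instance as above:
GET 'http://localhost:9200/_cat/health?v'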
Related
I am trying to make a curl GET request to my API from the command line.
curl http://localhost:8080/getList?id=100&mrp=50&discount=0
But when I log the request in my API I get:
&{GET /getList?id=100 HTTP/1.1 1 1
which means that only id is being sent through the request. I don't understand why this is happening.
When you run the curl command with multiple request parameters separated by &, Unix treats the & as a sign to execute the preceding command in the background. Everything following it is treated as a separate command.
Wrap the URL in quotes when sending the request:
curl "http://localhost:8080/getList?id=100&mrp=50&discount=0"
Here is why only id gets passed in the curl GET request:
If you look at your curl command, you will notice that you are using & to pass multiple values to the request parameters in this GET call:
curl http://localhost:8080/getList?id=100&mrp=50&discount=0
In the Linux/Unix environment, & has a pre-defined meaning: it runs the preceding command in the background. So if & follows some text, that text is interpreted as a command and the & tells the shell to run it in the background.
Any text after the & is treated as a new command. So your curl GET request above is interpreted by Linux as 3 different commands:
i. curl http://localhost:8080/getList?id=100&
ii. mrp=50&
iii. discount=0
Solution: To avoid this interpretation by Unix/Linux, surround your URL with double quotes:
curl "http://localhost:8080/getList?id=100&mrp=50&discount=0"
This passes all three parameter values to your curl GET request.
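If you would rather not think about quoting at all, curl can also assemble the query string for you: with -G the request stays a GET and each --data-urlencode value is appended to the URL. A sketch using the same hypothetical endpoint from the question:
curl -G http://localhost:8080/getList --data-urlencode "id=100" --data-urlencode "mrp=50" --data-urlencode "discount=0"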
I have a mini program/server built on one of my computers (Machine1) and I am trying to create or overwrite a file through cURL on another computer (Machine2). So Machine2 is connected to Machine1. I've been looking through cURL's documentation, as well as Stack Overflow, for a command that will do this but have had no luck.
https://curl.haxx.se/docs/manpage.html
I have also tried the examples on this SO post:
HTTP POST and GET using cURL in Linux
Any idea what the command might be from the command prompt (the equivalent of a POST command)? So far I have tried -O, -K, -C and a multitude of others, which have not worked.
On the command line, all you need to do is use curl --form to send a multipart/form-data POST request:
curl --form "testfile=@thefilename.jpg" http://<Machine2>/<Path>
testfile is the field name used for the form; if you don't care about it, just use any word.
@ is what makes curl attach the file thefilename.jpg to the POST as a file upload. Refer to the curl man page.
On the server side, something must be listening at http://<Machine2>/<Path>. When curl sends the POST request above, the server-side program should receive it, extract the attached file (thefilename.jpg), and save it to disk.
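If the server on Machine2 accepts HTTP PUT instead of a form POST, curl's -T option uploads a file directly to a URL. This is only a sketch; whether the file is created or overwritten depends entirely on how the server handles PUT:
curl -T thefilename.jpg http://<Machine2>/<Path>/thefilename.jpg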
I'm working on OAuth 2.0 stuff using the following curl command, which works fine in my terminal:
curl -u testclient1:testpass1 http://localhost/oauth2-server/token.php -d 'grant_type=password&username=bshaffer&password=brent123'
I want to know the equivalent HTTP request for the above curl command so that I can use Guzzle (comparatively easier) to make the HTTP request and get the token. I've tried a lot of combinations but haven't found the right way to do it.
Finally, after a lot of googling, I managed to find the equivalent HTTP request for that curl command. Here it is:
http://testclient1:testpass1@localhost/oauth2-server/token.php?grant_type=password&username=bshaffer&password=brent123
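For reference, the -u and -d flags of the original command translate to an Authorization: Basic header and a POST body rather than query parameters. A sketch of the same request spelled out explicitly, in case your server does not accept the GET-style URL above (the base64 value is just testclient1:testpass1 encoded):
curl -H "Authorization: Basic $(printf 'testclient1:testpass1' | base64)" -H "Content-Type: application/x-www-form-urlencoded" -d 'grant_type=password&username=bshaffer&password=brent123' http://localhost/oauth2-server/token.php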
I am in my Terminal and I want to send a POST request to a given URL. I have tested this with a REST client so I know that the parameters work.
So let's say I want to POST the following parameters:
username=tony
password=secret
To my URL: https://exmaple.com/login/
I tried the following curl command in my Terminal (I am using OSX Lion)
curl --data "username=tony&password=secret" http://exmaple.com/login/
I get a 500 Server Error back from the server, so I am now wondering what could be different between the REST client and the curl command.
Thanks for your help
Update: I am using an HTTPS service. Do I have to adjust my curl command to account for this?
Try this
curl -F username=tony -F password=secret http://exmaple.com/login/
-F should do roughly the same as --data, except that it sends the fields as multipart/form-data instead of application/x-www-form-urlencoded. Possibly the problem is in the webapp.
Maybe the app you are hitting uses basic auth for authentication? Try this one:
curl --user name:password http://exmaple.com/login/
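Since the update says the service is HTTPS, it may also help to send the request to the https:// URL and run curl verbosely to see the full server response; a sketch (add -k only if the server uses a self-signed certificate you trust):
curl -v --data "username=tony&password=secret" https://exmaple.com/login/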
How do I download only existing files with curl from the command line? I have a command like this:
curl http://host.com/photos/IMG_4[200-950].jpg -u user:pass -o IMG_4#1.jpg
This command downloads all images from IMG_4200.jpg to IMG_4950.jpg, even the ones that do not exist.
Use -f. From the curl man page:
(HTTP) Fail silently (no output at all) on server errors. This is mostly done to better enable scripts etc to better deal with failed attempts. In normal cases when a HTTP server fails to deliver a document, it returns an HTML document stating so (which often also describes why and more). This flag will prevent curl from outputting that and return error 22.
This method is not fail-safe and there are occasions where non-successful response codes will slip through, especially when authentication is involved (response codes 401 and 407).
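Applied to the command from the question, it is just a matter of adding -f; a sketch that also quotes the URL and the output pattern so the shell leaves the brackets and # alone:
curl -f -u user:pass -o 'IMG_4#1.jpg' 'http://host.com/photos/IMG_4[200-950].jpg'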