library(RCurl)

curlHandle <- getCurlHandle(ssl.verifypeer = FALSE)
url <- "http://www.google.com"
Responseresult <- postForm(url, curl = curlHandle, style = "POST")
This is the code I have used. I want to get the status code and the success or failure message sent by the server, using only RCurl.
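One way to read that back, sketched under the assumption that the request itself completes: RCurl's basicHeaderGatherer() collects the response headers, whose parsed value includes the status code and the server's status message, and getCurlInfo() reports details of the last transfer on the handle, including response.code.

library(RCurl)

curlHandle <- getCurlHandle(ssl.verifypeer = FALSE)
url <- "http://www.google.com"

# Gather the response headers while the request runs on the same handle.
headers <- basicHeaderGatherer()
Responseresult <- postForm(url, curl = curlHandle, style = "POST",
                           .opts = list(headerfunction = headers$update))

# Status code and server message taken from the gathered status line.
headers$value()[["status"]]         # HTTP status code, e.g. 200
headers$value()[["statusMessage"]]  # e.g. OK

# The handle can also be queried directly after the request.
getCurlInfo(curlHandle)[["response.code"]]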
I used the following Python code to retrieve a web page behind a login page successfully for some years:
import requests
from urllib.parse import quote

username = 'user'
password = 'pass'
login_url = 'https://company.com/login?url='
redirect_url = 'https://epaper.company.com/'
data = { 'email' : username, 'pass' : password }

initial_url = login_url + quote(redirect_url)
response = requests.post(initial_url, data=data)
Then something changed at company.com about 2 months ago, and the request returned status code 400. I tried changing the data parameter to json (response = requests.post(initial_url, json=data)) which gave me a 200 response telling me a wrong password was provided.
Any ideas what I could try to debug?
Thanks,
Jan
Update: I just tried using a requests session to retrieve the csrf_token from the login page (as suggested here), so now my code reads:
from bs4 import BeautifulSoup

with requests.Session() as sess:
    response = sess.get(login_url)
    signin = BeautifulSoup(response.text, 'html.parser')
    data['csrf_token'] = signin.find('input', {'name': 'csrf_token'})['value']
    response = sess.post(initial_url, data=data)
Unfortunately, the response is still 400 (and 200/wrong password with the json parameter).
First: when you send data=data, requests uses the header {"Content-Type": "application/x-www-form-urlencoded"}; when you send json=data, it uses {"Content-Type": "application/json"}. Check which of the two the endpoint now expects.
Second: perhaps a redirect has been added. Try adding:
response = sess.post(initial_url, data=data)
print("URL you expect:", initial_url)
print("Last request URL:", response.url)
Be sure to check:
print(sess.cookies.get_dict())
print(response.headers)
If you get an unexpected result when checking, change the code like this:
response = sess.post(initial_url, data=data, allow_redirects=False)
I am trying to access a RapidAPI with the httr package in R as follows:
library(httr)
url <- "https://extract-news.p.rapidapi.com/v0/article"
queryString <- list(url = "https://www.theverge.com/2020/4/17/21224728/bill-gates-coronavirus-lies-5g-covid-19")
response <- VERB("GET", url, addheaders(x_rapidapi_key ="my-api-key", x_rapidapi_host = "extract-news.p.rapidapi.com"), query = queryString, contenttype("application/octet-stream"))
content(response, "text")
I tried accessing other APIs too. However, I keep getting this error message:
Status: 401
Could you please help me solve this issue?
Thanks a lot in advance!
Try using content_type() and add_headers() instead of contenttype and addheaders to tell the server what sort of data you are sending. Because the RapidAPI header names contain hyphens, quote them when passing them to add_headers(). Read more about the VERB method here.
library(httr)
url <- "https://extract-news.p.rapidapi.com/v0/article"
queryString <- list(url = "https://www.theverge.com/2020/4/17/21224728/bill-gates-coronavirus-lies-5g-covid-19")
response <- VERB("GET", url,
                 add_headers("x-rapidapi-host" = "extract-news.p.rapidapi.com",
                             "x-rapidapi-key" = "*************************"),
                 query = queryString,
                 content_type("application/octet-stream"))
content(response, "text")
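If the corrected headers are accepted, the response can also be checked before parsing. A small follow-up sketch, assuming response comes from the VERB() call above (status_code() and content() are httr functions; whether you get 200 or another 401 depends on your API key):

if (status_code(response) == 200) {
  # Extract the article payload as text only when the request succeeded.
  article <- content(response, "text")
} else {
  warning("Request failed with status ", status_code(response))
}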
Can anyone help me get partial data from a URL? This code is giving me a "failed to decode" error.
import requests
url = "http://tools.ietf.org/rfc/rfc2822.txt"
start=24
end=30
headers = {"Range":f"bytes={start}-{end}"}
r = requests.get(url,stream=True,headers=headers)
print(r.text)
You ask for a range of the resource, and the server does respond with one, but because of the default header Accept-Encoding: "gzip, deflate" it returns that byte range of the encoded (compressed) resource, which cannot be decoded on its own. To retrieve the bytes of the unencoded resource, drop the Accept-Encoding header:
import requests
url = "http://tools.ietf.org/rfc/rfc2822.txt"
start=24
end=30
headers = {"Range":f"bytes={start}-{end}", "Accept-Encoding": None}
r = requests.get(url,stream=True,headers=headers)
print(r.text)
I am attempting to retrieve an access token for the YouTube API using my refresh token, client ID, and client secret, in R code.
This is Google's example of how to POST the request.
POST /o/oauth2/token HTTP/1.1
Host: accounts.google.com
Content-Type: application/x-www-form-urlencoded

client_id=21302922996.apps.googleusercontent.com&client_secret=XTHhXh1SlUNgvyWGwDk1EjXB&refresh_token=1/6BMfW9j53gdGImsixUH6kU5RsR4zwI9lUVX-tqf8JXQ&grant_type=refresh_token
This was my R code:
library(httr)
url<- paste("https://accounts.google.com/o/oauth2/token?client_id=", client_id, "&client_secret=", client_secret, "&refresh_token=", refresh_token, "&grant_type=access_token", sep="")
POST(url)
And I keep getting this response:
Response [https://accounts.google.com/o/oauth2/token?client_id=xxxxxxxxxx&client_secret=xxxxxxxx&refresh_token=xxxxxxxxxxxxxxxxxxxxxx&grant_type=refresh_token]
Date: 2015-09-02 16:43
Status: 400
Content-Type: application/json
Size: 102 B
{
  "error" : "invalid_request",
  "error_description" : "Required parameter is missing: grant_type"
}
Is there a better way to do this? Maybe using RCurl? If so, what would the format of the request be? I would appreciate help on this!
The RAdwords package has a function that uses the refresh token to retrieve a new access token. If you don't want to depend on the entire package, you can just add the following code to your script.
refreshToken = function(google_auth) {
  # This function refreshes the access token.
  # The access token expires after one hour and has to be updated
  # with the refresh token.
  #
  # Args:
  #   google_auth: access$refresh_token and credentials as input
  # Returns:
  #   New access token with corresponding time stamp
  rt = rjson::fromJSON(RCurl::postForm('https://accounts.google.com/o/oauth2/token',
                                       refresh_token = google_auth$access$refresh_token,
                                       client_id = google_auth$credentials$c.id,
                                       client_secret = google_auth$credentials$c.secret,
                                       grant_type = "refresh_token",
                                       style = "POST",
                                       .opts = list(ssl.verifypeer = FALSE)))
  access <- rt
  access
}
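A hypothetical usage sketch: the google_auth list below is shaped to match the fields the function body above references (access$refresh_token, credentials$c.id, credentials$c.secret). The values are the ones from Google's example request; the actual RAdwords authentication object may be structured differently.

# Hypothetical structure matching the fields used by refreshToken().
google_auth <- list(
  access = list(refresh_token = "1/6BMfW9j53gdGImsixUH6kU5RsR4zwI9lUVX-tqf8JXQ"),
  credentials = list(
    c.id = "21302922996.apps.googleusercontent.com",
    c.secret = "XTHhXh1SlUNgvyWGwDk1EjXB"
  )
)

token <- refreshToken(google_auth)
token$access_token  # the refreshed access token returned by the endpoint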
I'm creating a QuickTest script that gets all URLs on a page and checks whether the HTTP request for each one returns status code 200. My question is: how do I get the HTTP status code for a URL?
This was my solution:
Set Http = CreateObject("Microsoft.XMLHttp")

' Collect every Link object on the page
Set oDesc = Description.Create()
oDesc("micclass").Value = "Link"
Set EditCollection = Browser("Browser").Page("Page").ChildObjects(oDesc)

' Send a synchronous GET request for each link and report its HTTP status code
For i = 0 To EditCollection.Count - 1
    URL_TEMP = EditCollection(i).GetROProperty("href")
    Http.Open "GET", URL_TEMP, False
    Http.Send
    MsgBox Http.Status
Next