nginx keeps passing the same http_cookie to uwsgi - nginx

I have a small python app running via uwsgi with requests served by nginx.
I'm printing the environment variables, and it looks like after a couple of normal requests, nginx sends the same HTTP_COOKIE param for unrelated requests.
For example:
{'UWSGI_CHDIR': '/ebs/py',
 'HTTP_COOKIE': 'ge_t_c=4fcee8450c3bee709800920c',
 'UWSGI_SCRIPT': 'server',
 'uwsgi.version': '1.1.2',
 'REQUEST_METHOD': 'GET',
 'PATH_INFO': '/redirect/ebebaf3b-475a-4010-9a72-96eeff797f1e',
 'SERVER_PROTOCOL': 'HTTP/1.1',
 'QUERY_STRING': '',
 'x-wsgiorg.fdevent.readable': <...>,
 'CONTENT_LENGTH': '',
 'uwsgi.ready_fd': None,
 'HTTP_USER_AGENT': 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)',
 'HTTP_CONNECTION': 'close',
 'HTTP_REFERER': 'http://www.facebook.com/',
 'SERVER_NAME': 'pixel.domain.com',
 'REMOTE_ADDR': '10.load.bal.ip',
 'wsgi.url_scheme': 'http',
 'SERVER_PORT': '80',
 'wsgi.multiprocess': True,
 'uwsgi.node': 'py.domain.com',
 'DOCUMENT_ROOT': '/etc/nginx/html',
 'UWSGI_PYHOME': '/ebs/py',
 'uwsgi.core': 127,
 'HTTP_X_FORWARDED_PROTO': 'http',
 'x-wsgiorg.fdevent.writable': <...>,
 'wsgi.input': <...>,
 'HTTP_HOST': 'track.domain.com',
 'wsgi.multithread': False,
 'REQUEST_URI': '/redirect/ebebaf3b-475a-4010-9a72-96eeff797f1e',
 'HTTP_ACCEPT': 'text/html, application/xhtml+xml, */*',
 'wsgi.version': (1, 0),
 'x-wsgiorg.fdevent.timeout': None,
 'HTTP_X_FORWARDED_FOR': '10.load.bal.ip',
 'wsgi.errors': <...>,
 'REMOTE_PORT': '36462',
 'HTTP_ACCEPT_LANGUAGE': 'en-US',
 'wsgi.run_once': False,
 'HTTP_X_FORWARDED_PORT': '80',
 'CONTENT_TYPE': '',
 'wsgi.file_wrapper': <...>,
 'HTTP_ACCEPT_ENCODING': 'gzip, deflate'}
and
{'UWSGI_CHDIR': '/ebs/py',
 'HTTP_COOKIE': 'ge_t_c=4fcee8450c3bee709800920c',
 'UWSGI_SCRIPT': 'server',
 'uwsgi.version': '1.1.2',
 'REQUEST_METHOD': 'GET',
 'PATH_INFO': '/redirect/2391e658-95ef-4300-80f5-83dbb1a0e526',
 'SERVER_PROTOCOL': 'HTTP/1.1',
 'QUERY_STRING': '',
 'x-wsgiorg.fdevent.readable': <...>,
 'CONTENT_LENGTH': '',
 'uwsgi.ready_fd': None,
 'HTTP_USER_AGENT': 'Mozilla/5.0 (iPad; CPU OS 5_1_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9B206 Safari/7534.48.3',
 'HTTP_CONNECTION': 'close',
 'HTTP_REFERER': 'http://www.facebook.com/',
 'SERVER_NAME': 'pixel.domain.com',
 'REMOTE_ADDR': '10.load.balancer.ip',
 'wsgi.url_scheme': 'http',
 'SERVER_PORT': '80',
 'wsgi.multiprocess': True,
 'uwsgi.node': 'py.domain.com',
 'DOCUMENT_ROOT': '/etc/nginx/html',
 'UWSGI_PYHOME': '/ebs/py',
 'uwsgi.core': 127,
 'HTTP_X_FORWARDED_PROTO': 'http',
 'x-wsgiorg.fdevent.writable': <...>,
 'wsgi.input': <...>,
 'HTTP_HOST': 'fire.domain.com',
 'wsgi.multithread': False,
 'REQUEST_URI': '/redirect/2391e658-95ef-4300-80f5-83dbb1a0e526',
 'HTTP_ACCEPT': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
 'wsgi.version': (1, 0),
 'x-wsgiorg.fdevent.timeout': None,
 'HTTP_X_FORWARDED_FOR': '10.load.bal.ip',
 'wsgi.errors': <...>,
 'REMOTE_PORT': '39498',
 'HTTP_ACCEPT_LANGUAGE': 'en-us',
 'wsgi.run_once': False,
 'HTTP_X_FORWARDED_PORT': '80',
 'CONTENT_TYPE': '',
 'wsgi.file_wrapper': <...>,
 'HTTP_ACCEPT_ENCODING': 'gzip, deflate'}
These are two distinct clients. I opened an incognito session, confirmed that no cookie was sent in the request headers, and the uwsgi log still shows that it received the same HTTP_COOKIE.
How can I make sure that nginx only passes the information belonging to the current request, without regard to other requests?
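The symptom can be confirmed from the application side by logging only request-scoped fields. A minimal WSGI sketch (the `app` name and log format are mine, not from the original setup):

```python
# Minimal WSGI app (sketch) that logs only request-scoped fields, which makes
# a stale HTTP_COOKIE easy to spot in the uwsgi log: the cookie should change
# (or disappear) together with the client, never persist across clients.
def app(environ, start_response):
    line = "%s cookie=%r" % (environ.get("PATH_INFO", ""),
                             environ.get("HTTP_COOKIE"))
    print(line)  # goes to the uwsgi log
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [line.encode("utf-8")]
```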

Figured it out...
I had to add this line to uwsgi_params in /etc/nginx/:
uwsgi_param HTTP_COOKIE $http_cookie;
Without it, the HTTP_COOKIE value that reaches the uwsgi/Python app cannot be trusted.
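For context, a stock uwsgi_params file carries the standard CGI variables, and the fix simply extends it. A sketch of the relevant fragment, assuming the default file layout:

```nginx
# Fragment of /etc/nginx/uwsgi_params (sketch). The standard file sets the
# usual CGI variables; the HTTP_COOKIE line is the one that had to be added,
# so the cookie is re-read from the current request instead of going stale.
uwsgi_param  REQUEST_METHOD  $request_method;
uwsgi_param  REQUEST_URI     $request_uri;
uwsgi_param  QUERY_STRING    $query_string;
uwsgi_param  HTTP_COOKIE     $http_cookie;
```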

Related

Posting httr request to API with xmlquery in body

I am trying to retrieve data from an API with stock data from this site: nasdaqomxnordic.com.
So far, I have successfully constructed a curl call by inspecting the page in Google Chrome. The curl call works great when I send it from reqbin.com, and I receive the expected data.
However, when I try to translate it into an httr request in R, I can't make it work. It seems that httr connects successfully to the server, but nothing happens after that: the command just keeps running forever in R until I cancel it manually. I suspect that the issue lies in how the body of the request (the xmlquery) is sent, so I have tried various formats and curlconverter.com, but with no luck (see code below).
Do you guys have any idea about what could be wrong?
The curl code looks like this (this works perfectly on reqbin.com):
curl 'https://www.nasdaqomxnordic.com/webproxy/DataFeedProxy.aspx' \
-H 'authority: www.nasdaqomxnordic.com' \
-H 'accept: */*' \
-H 'accept-language: se-SE,se;q=0.9,en-US;q=0.8,en;q=0.7' \
-H 'content-type: application/x-www-form-urlencoded; charset=UTF-8' \
-H 'origin: https://www.nasdaqomxnordic.com' \
-H 'referer: https://www.nasdaqomxnordic.com/aktier/microsite?Instrument=CSE1158&name=Novo%20Nordisk%20B&ISIN=DK0060534915' \
-H 'sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"' \
-H 'sec-ch-ua-mobile: ?0' \
-H 'sec-ch-ua-platform: "Windows"' \
-H 'sec-fetch-dest: empty' \
-H 'sec-fetch-mode: cors' \
-H 'sec-fetch-site: same-origin' \
-H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' \
-H 'x-requested-with: XMLHttpRequest' \
--data-raw 'xmlquery=%3Cpost%3E%0A%3Cparam+name%3D%22Exchange%22+value%3D%22NMF%22%2F%3E%0A%3Cparam+name%3D%22SubSystem%22+value%3D%22History%22%2F%3E%0A%3Cparam+name%3D%22Action%22+value%3D%22GetDataSeries%22%2F%3E%0A%3Cparam+name%3D%22AppendIntraDay%22+value%3D%22no%22%2F%3E%0A%3Cparam+name%3D%22Instrument%22+value%3D%22CSE1158%22%2F%3E%0A%3Cparam+name%3D%22FromDate%22+value%3D%222021-12-25%22%2F%3E%0A%3Cparam+name%3D%22ToDate%22+value%3D%222022-12-25%22%2F%3E%0A%3Cparam+name%3D%22hi__a%22+value%3D%220%2C5%2C6%2C3%2C1%2C2%2C4%2C21%2C8%2C10%2C12%2C9%2C11%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt%22+value%3D%22%2FnordicV3%2Fhi_csv.xsl%22%2F%3E%0A%3Cparam+name%3D%22OmitNoTrade%22+value%3D%22true%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_lang%22+value%3D%22en%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_options%22+value%3D%22%2Cadjusted%2C%22%2F%3E%0A%3Cparam+name%3D%22ext_contenttype%22+value%3D%22application%2Fms-excel%22%2F%3E%0A%3Cparam+name%3D%22ext_contenttypefilename%22+value%3D%22NOVO_B-2021-12-25-2022-12-25.csv%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_hiddenattrs%22+value%3D%22%2Civ%2Cip%2C%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_tableId%22+value%3D%22historicalTable%22%2F%3E%0A%3Cparam+name%3D%22DefaultDecimals%22+value%3D%22false%22%2F%3E%0A%3Cparam+name%3D%22app%22+value%3D%22%2Faktier%2Fmicrosite%22%2F%3E%0A%3C%2Fpost%3E' \
--compressed
My best try at an httr equivalent looks like this:
require(httr)
headers = c(
`authority` = 'www.nasdaqomxnordic.com',
`accept` = '*/*',
`accept-language` = 'se-SE,se;q=0.9,en-US;q=0.8,en;q=0.7',
`content-type` = 'application/x-www-form-urlencoded; charset=UTF-8',
`origin` = 'https://www.nasdaqomxnordic.com',
`referer` = 'https://www.nasdaqomxnordic.com/aktier/microsite?Instrument=CSE1158&name=Novo%20Nordisk%20B&ISIN=DK0060534915',
`sec-ch-ua` = '"Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"',
`sec-ch-ua-mobile` = '?0',
`sec-ch-ua-platform` = '"Windows"',
`sec-fetch-dest` = 'empty',
`sec-fetch-mode` = 'cors',
`sec-fetch-site` = 'same-origin',
`user-agent` = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/105.0.0.0 Safari/537.36',
`x-requested-with` = 'XMLHttpRequest'
)
data = 'xmlquery=%3Cpost%3E%0A%3Cparam+name%3D%22Exchange%22+value%3D%22NMF%22%2F%3E%0A%3Cparam+name%3D%22SubSystem%22+value%3D%22History%22%2F%3E%0A%3Cparam+name%3D%22Action%22+value%3D%22GetDataSeries%22%2F%3E%0A%3Cparam+name%3D%22AppendIntraDay%22+value%3D%22no%22%2F%3E%0A%3Cparam+name%3D%22Instrument%22+value%3D%22CSE1158%22%2F%3E%0A%3Cparam+name%3D%22FromDate%22+value%3D%222021-12-25%22%2F%3E%0A%3Cparam+name%3D%22ToDate%22+value%3D%222022-12-25%22%2F%3E%0A%3Cparam+name%3D%22hi__a%22+value%3D%220%2C5%2C6%2C3%2C1%2C2%2C4%2C21%2C8%2C10%2C12%2C9%2C11%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt%22+value%3D%22%2FnordicV3%2Fhi_csv.xsl%22%2F%3E%0A%3Cparam+name%3D%22OmitNoTrade%22+value%3D%22true%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_lang%22+value%3D%22en%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_options%22+value%3D%22%2Cadjusted%2C%22%2F%3E%0A%3Cparam+name%3D%22ext_contenttype%22+value%3D%22application%2Fms-excel%22%2F%3E%0A%3Cparam+name%3D%22ext_contenttypefilename%22+value%3D%22NOVO_B-2021-12-25-2022-12-25.csv%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_hiddenattrs%22+value%3D%22%2Civ%2Cip%2C%22%2F%3E%0A%3Cparam+name%3D%22ext_xslt_tableId%22+value%3D%22historicalTable%22%2F%3E%0A%3Cparam+name%3D%22DefaultDecimals%22+value%3D%22false%22%2F%3E%0A%3Cparam+name%3D%22app%22+value%3D%22%2Faktier%2Fmicrosite%22%2F%3E%0A%3C%2Fpost%3E'
http_response <- httr::POST(url = 'https://www.nasdaqomxnordic.com/webproxy/DataFeedProxy.aspx', httr::add_headers(.headers=headers), body = data, verbose())
I have also tried this alternate formatting of the data-part of the request, but with no better luck:
data = list(
`xmlquery` = '<post>\n<param name="Exchange" value="NMF"/>\n<param name="SubSystem" value="History"/>\n<param name="Action" value="GetDataSeries"/>\n<param name="AppendIntraDay" value="no"/>\n<param name="Instrument" value="CSE1158"/>\n<param name="FromDate" value="2021-12-25"/>\n<param name="ToDate" value="2022-12-25"/>\n<param name="hi__a" value="0,5,6,3,1,2,4,21,8,10,12,9,11"/>\n<param name="ext_xslt" value="/nordicV3/hi_csv.xsl"/>\n<param name="OmitNoTrade" value="true"/>\n<param name="ext_xslt_lang" value="en"/>\n<param name="ext_xslt_options" value=",adjusted,"/>\n<param name="ext_contenttype" value="application/ms-excel"/>\n<param name="ext_contenttypefilename" value="NOVO_B-2021-12-25-2022-12-25.csv"/>\n<param name="ext_xslt_hiddenattrs" value=",iv,ip,"/>\n<param name="ext_xslt_tableId" value="historicalTable"/>\n<param name="DefaultDecimals" value="false"/>\n<param name="app" value="/aktier/microsite"/>\n</post>'
)
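One way to rule the client out is to inspect exactly what goes on the wire before sending anything. A sketch in Python for comparison (names are mine, not from the original code): `requests.Request.prepare()` builds the request offline, and when the body is an already-encoded string it is sent verbatim, so it must agree with the `content-type` header.

```python
import requests

# The same pre-encoded form body the curl call sends (truncated to its
# prefix here); a plain string body goes on the wire verbatim.
data = "xmlquery=%3Cpost%3E%0A%3Cparam+name%3D%22Exchange%22+value%3D%22NMF%22%2F%3E"
req = requests.Request(
    "POST",
    "https://www.nasdaqomxnordic.com/webproxy/DataFeedProxy.aspx",
    headers={"content-type": "application/x-www-form-urlencoded; charset=UTF-8"},
    data=data,
)
prepared = req.prepare()  # builds the request without sending it
print(prepared.headers["content-type"])
print(prepared.body[:9])  # -> xmlquery=
```

If the prepared body and headers match what reqbin.com sends, the remaining difference is in transport behavior (redirects, cookies, compression), not in the payload.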

How to web scrape AQI from airnow?

I am trying to scrape the current AQI for my location with Beautiful Soup 4.
import requests
from bs4 import BeautifulSoup

url = "https://www.airnow.gov/?city=Burlingame&state=CA&country=USA"
header = {
    "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.125 Safari/537.36",
    "Accept-Language": "en-GB,en-US;q=0.9,en;q=0.8"
}
response = requests.get(url, headers=header)
soup = BeautifulSoup(response.content, "lxml")
aqi = soup.find("div", class_="aqi")
When I print aqi, it is just an empty div. However, on the website there should be an element inside this div containing the AQI number that I want.
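The div comes back empty because airnow.gov renders the AQI number client-side with JavaScript; `requests` only ever sees the initial static HTML. A minimal demonstration with a stand-in for that static markup (the HTML string below is illustrative, not the real page source):

```python
from bs4 import BeautifulSoup

# Stand-in for the static HTML that requests receives: the container exists,
# but the number is only injected later by JavaScript in the browser.
static_html = '<div class="aqi"></div>'
soup = BeautifulSoup(static_html, "html.parser")
aqi = soup.find("div", class_="aqi")
print(repr(aqi.get_text()))  # -> '' : nothing to scrape in the static page
```

The usual ways around this are driving a real browser (e.g. Selenium) or calling the JSON endpoint the page itself fetches, which is visible in the browser's network tab.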

Python requests POST returns 400 instead 200

I'm expecting 200, but 400 gets returned.
Does anyone see what I'm doing wrong in my request?
Code:
import requests
import json
import lxml.html
from lxml.cssselect import CSSSelector
from lxml.etree import fromstring
SELECTOR = CSSSelector('[type=hidden]')
BASE_URL = 'https://www.bonuscard.ch/myos/en/login'
LOGIN_URL = BASE_URL+'1.IFormSubmitListener-homePanel-loginPanel-loginForm'
# headers copied from chromium (returns 200)
headers = {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
    "Accept-Encoding": "gzip, deflate, br",
    "Accept-Language": "en,de;q=0.9",
    "Cache-Control": "no-cache",
    "Connection": "keep-alive",
    "Content-Length": "151",
    "Content-Type": "application/x-www-form-urlencoded",
    "DNT": "1",
    "Host": "www.bonuscard.ch",
    "Origin": "https://www.bonuscard.ch",
    "Pragma": "no-cache",
    "Referer": "https://www.bonuscard.ch/myos/en/login",
    "Upgrade-Insecure-Requests": "1",
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.181 Safari/537.36"
}
with requests.Session() as session:
    response = session.get(BASE_URL)
    tree = lxml.html.fromstring(response.content)
    keyOnly_token = [e.get('id') for e in SELECTOR(tree)][0]
    payload = {
        keyOnly_token: "",
        "userName-border:userName-border_body:userName ": "jon#doe.com",
        "password-border:password-border_body:password ": "123",
        "login ": ""
    }
    response = session.post(LOGIN_URL, headers=headers, data=payload)
    # Returns 400
    print(response)
These changes made no difference either:
POST without headers
POST with json=payload instead of data=payload
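For what it's worth, headers copied verbatim from a browser often include values that must not be replayed: Content-Length here is hard-coded to 151, while requests computes it from the payload it actually encodes. A sketch (field names taken from the question; the mismatch check is mine):

```python
import requests

payload = {
    "userName-border:userName-border_body:userName ": "jon#doe.com",
    "password-border:password-border_body:password ": "123",
    "login ": "",
}
req = requests.Request("POST", "https://www.bonuscard.ch/myos/en/login",
                       data=payload)
prepared = req.prepare()
# requests derives Content-Length from the body it actually encodes; a stale
# browser-copied value (151) would disagree and can trigger a 400.
print(prepared.headers["Content-Length"] == str(len(prepared.body)))  # -> True
```

Dropping the copied Content-Length (and Host) and letting the library fill them in is usually safer than replaying them.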
Thanks to Ivan's direction, I found this curl-to-requests converter, which was the solution: https://curl.trillworks.com/#

Erlang HTTP problems

I can't make an HTTP request from Erlang/Cowboy at all. I can make one from the Erlang shell, but not when running a Cowboy release. I've tried using the hackney library as well:
hackney:start(),
{ok, _, _, Ref} = hackney:request(
get, <<"http://www.youtube.com">>, [], <<>>, [{pool, default}]
),
{ok, Body} = hackney:body(Ref),
io:format("body: ~p~n~n", [Body]),
Error:
Error in process <0.361.0> on node 'cta_erlang_backend#127.0.0.1' with exit value:
{[{reason,undef},
{mfa,{hello_handler,handle,2}},
{stacktrace,[{hackney,start,[],[]},
{hello_handler,handle,2,
[{file,"src/hello_handler.erl"},{line,18}]},
{cowboy_handler,handler_handle,4,
[{file,"src/cowboy_handler.erl"},{line,111}]},
{cowboy_protocol,execute,4,
[{file,"src/cowboy_protocol.erl"},
{line,442}]}]},
{req,[{socket,#Port<0.267>},
{transport,ranch_tcp},
{connection,keepalive},
{pid,<0.361.0>},
{method,<<"POST">>},
{version,'HTTP/1.1'},
{peer,{{10,0,0,1},40049}},
{host,<<"10.0.0.103">>},
{host_info,undefined},
{port,8080},
{path,<<"/">>},
{path_info,undefined},
{qs,<<>>},
{qs_vals,undefined},
{bindings,[]},
{headers,[{<<"host">>,<<"10.0.0.103:8080">>},
{<<"connection">>,<<"keep-alive">>},
{<<"content-length">>,<<"4">>},
{<<"cache-control">>,<<"no-cache">>},
{<<"origin">>,
<<"chrome-extension://fdmmgilgnpjigdojojpjoooidkmcomcm">>},
{<<"user-agent">>,
<<"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/39.0.2171.65 Chrome/39.0.2171.65 Safari/537.36">>},
{<<"content-type">>,<<"text/plain;charset=UTF-8">>},
{<<"accept">>,<<"*/*">>},
{<<"accept-encoding">>,<<"gzip, deflate">>},
{<<"accept-language">>,<<"en-GB,en-US;q=0.8,en;q=0.6">>}]},
{p_headers,[{<<"connection">>,[<<"keep-alive">>]}]},
{cookies,undefined},
{meta,[]},
{body_state,waiting},
{buffer,<<"asdf">>},
{multipart,undefined},
{resp_compress,false},
{resp_state,waiting},
{resp_headers,[]},
{resp_body,<<>>},
{onresponse,undefined}]},
{state,{state}}],
[{cowboy_protocol,execute,4,[{file,"src/cowboy_protocol.erl"},{line,442}]}]}
hello_handler.erl:
-module(hello_handler).
-behaviour(cowboy_http_handler).

-export([init/3]).
-export([handle/2]).
-export([terminate/3]).

-record(state, {}).

init(_, Req, _Opts) ->
    hackney:start(),
    {ok, Req, #state{}}.

handle(Req, State) ->
    {Method, Req2} = cowboy_req:method(Req),
    case Method of
        <<"POST">> ->
            {ok, _, _, Ref} = hackney:request(get, <<"http://www.youtube.com">>,
                                              [], <<>>, [{pool, default}]),
            {ok, Body} = hackney:body(Ref),
            io:format("body: ~p~n~n", [Body]),
            ResponseBody = <<"Hello Erl POST!">>;
        <<"GET">> ->
            ResponseBody = <<"Hello Erlang1!">>
    end,
    {ok, Req3} = cowboy_req:reply(200,
                                  [{<<"content-type">>, <<"text/plain">>}],
                                  ResponseBody,
                                  Req2),
    {ok, Req3, State}.

terminate(_Reason, _Req, _State) ->
    ok.
{[{reason,undef},
{mfa,{hello_handler,handle,2}},
{stacktrace,[{hackney,start,[],[]},
{hello_handler,handle,2,
[{file,"src/hello_handler.erl"},{line,18}]},
{cowboy_handler,handler_handle,4,
[{file,"src/cowboy_handler.erl"},{line,111}]},
{cowboy_protocol,execute,4,
[{file,"src/cowboy_protocol.erl"},
{line,442}]}]},
The crash is at cowboy_handler.erl line 111: https://github.com/ninenines/cowboy/blob/1.1.x/src/cowboy_handler.erl#L111
Reason: hello_handler:handle/2 is undef.
So:
make sure your hello_handler.erl is in the src dir;
compile it with rebar compile;
restart the server, or run l(hello_handler) in the Erlang shell.

how to login to website using HTTP Client in Delphi xe

I am trying to implement the HTTP client in my project, but I can't log in to my account; I get "Forbidden!". With TIdHTTP it works well. What is missing or wrong in my code?
NetHTTPClient1 properties:
Connectiontimeout = 30000
AllowCookies = True
HandleRedirects = True
UserAgent = Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.76 Mobile Safari/537.36
NetHTTPRequest1 Properties :
Method String = POST
URL = https://www.instagram.com/accounts/web_create_ajax/attempt/
Code:
procedure TForm2.Button1Click(Sender: TObject);
var
  Params: TStrings;
  lHTTP: TIdHTTP;
  IdSSL: TIdSSLIOHandlerSocketOpenSSL;
  N: Integer;
  Token, email, S: string;
  Reply: string; // TIdHTTP.Post returns the response body as a string
  Cookie: TIdCookie;
begin
  lHTTP := TIdHTTP.Create(nil);
  try
    IdSSL := TIdSSLIOHandlerSocketOpenSSL.Create(lHTTP);
    IdSSL.SSLOptions.Method := sslvTLSv1;
    IdSSL.SSLOptions.Mode := sslmClient;
    lHTTP.IOHandler := IdSSL;
    lHTTP.ReadTimeout := 30000;
    lHTTP.HandleRedirects := True;
    lHTTP.Request.UserAgent := 'Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.76 Mobile Safari/537.36';
    lHTTP.Get('https://www.instagram.com', TStream(nil));
    Cookie := lHTTP.CookieManager.CookieCollection.Cookie['csrftoken', 'www.instagram.com'];
    if Cookie <> nil then
      Token := Cookie.Value;
  finally
  end;
  try
    Params := TStringList.Create;
    Params.Add('username=' + 'myusername');
    Params.Add('password=' + 'mypassword');
    NetHTTPClient1.CustomHeaders['X-CSRFToken'] := Token;
    NetHTTPClient1.CustomHeaders['X-Instagram-AJAX'] := '1';
    NetHTTPClient1.CustomHeaders['X-Requested-With'] := 'XMLHttpRequest';
    NetHTTPClient1.CustomHeaders['Referer'] := 'https://www.instagram.com/';
    Memo1.Lines.Add(NetHTTPRequest1.Post('https://www.instagram.com/accounts/login/ajax/', Params).StatusText);
  finally
  end;
  // Login with TIdHTTP // works
  lHTTP.Request.CustomHeaders.Values['X-CSRFToken'] := Token;
  lHTTP.Request.CustomHeaders.Values['X-Instagram-AJAX'] := '1';
  lHTTP.Request.CustomHeaders.Values['X-Requested-With'] := 'XMLHttpRequest';
  lHTTP.Request.Referer := 'https://www.instagram.com/';
  lHTTP.Request.ContentType := 'application/x-www-form-urlencoded';
  lHTTP.Request.UserAgent := 'Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.76 Mobile Safari/537.36';
  Reply := lHTTP.Post('https://www.instagram.com/accounts/login/ajax/', Params);
  Memo1.Lines.Add(Reply);
end;
TNetHTTPClient is buggy with HandleRedirects and POST: https://quality.embarcadero.com/browse/RSP-14671
After you log in, you receive cookies (the session key, in effect), and you must send those cookies with every future request.
"TNetHTTPClient is buggy with handleRedirect and post."
This was already fixed in version 10.2 Tokyo Release 2.
