Mocking cookies with responses - python-requests

I have a piece of code that deals with cookies set by the server as part of an HTTP response. I'm attempting to test it using responses, like so:
responses.add(responses.GET, "http://invalid/cookies",
              adding_headers={
                  "Set-Cookie": "foo=bar; " +
                                "domain=.invalid; " +
                                "expires=Fri, 01-Jan-2055 00:00:00 GMT; " +
                                "path=/; " +
                                "HttpOnly",
              })
I would then expect this to return my cookie, but no such luck.
session = requests.Session()
session.get('http://invalid/cookies')
session.cookies['foo'] # KeyError
Indeed, this works outside of the context of responses.activate:
session = requests.Session()
session.get('https://httpbin.org/cookies/set?foo=bar')
session.cookies['foo'] # 'bar'
How can I mock cookies with responses?

The python3-responses package in Ubuntu 16.04 is out of date. You will need to use a newer version from pip/pip3, after which this behaviour works as expected.
Do note however that this, too, is buggy at the time of writing:
responses.add(responses.GET, 'http://invalid/cookies',
              adding_headers={
                  "set-cookie": "foo=bar; " +
                                "domain=.invalid; " +
                                # "expires=Fri, 01-Jan-2055 00:00:00 GMT; " +
                                "path=/; " +
                                "" # "HttpOnly",
              })
session = requests.Session()
response = session.get('http://invalid/cookies')
dict(response.cookies)  # this should have one cookie
{'foo': 'bar', 'path': '/', 'domain': '.invalid'}
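As an aside, you can sanity-check how that Set-Cookie string parses with the standard library's http.cookies module. This only demonstrates the header format itself; it is independent of the responses library:

```python
from http.cookies import SimpleCookie

# The same Set-Cookie value used in the mock above
raw = ("foo=bar; domain=.invalid; "
       "expires=Fri, 01-Jan-2055 00:00:00 GMT; path=/; HttpOnly")

cookie = SimpleCookie()
cookie.load(raw)

print(cookie["foo"].value)        # bar
print(cookie["foo"]["domain"])    # .invalid
print(cookie["foo"]["path"])      # /
```

Note how domain, path, and the HttpOnly flag come back as attributes of the single morsel "foo", rather than as separate cookies, which is how a correct client should treat them.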

Related

How to quickly reject a multipart/form-data POST request?

I have an HTTP server with a multipart/form-data protocol interface for uploading files.
The protocol includes a verification step; how can I reject the request before the whole file stream has been uploaded?
The code looks like this:
@require_http_methods(['POST'])
@csrf_exempt
def Upload(request):
    func_name = sys._getframe().f_code.co_name
    request_id = str(uuid.uuid1())
    logger = logging.LoggerAdapter(init_logger, {})
    response_instance = Response(request_id, func_name, logger)
    logger.info("====== request " + func_name + " " + request_id + " : " + str(request.POST) + " ======")
    cert = request.POST.get("cert", None)
    if cert is None or not check_cert():
        logger.error("check cert failed")
        return JsonResponse(response_instance.generator(ERROR_ERROR_DENIED, "permission denied"))
    download_file = '/root/downloads/test.zip'
    package = request.FILES.get("firmware_package", None)
    destination = open(download_file, 'wb')
    for chunk in package.chunks():
        destination.write(chunk)
    destination.close()
When I request the server with a wrong cert and a big file, I have to wait a long time before the server returns "permission denied".
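One way to reject early (a sketch of the idea, not a drop-in fix) is to carry the cert in a request header rather than a POST field: headers arrive before the body, whereas accessing request.POST or request.FILES forces Django to consume the entire multipart stream; in the code above, even logging str(request.POST) already triggers the full body read. A minimal stdlib WSGI illustration, where the X-Client-Cert header name and the validity check are made up:

```python
def upload_app(environ, start_response):
    # Headers are parsed before any of the body is read, so this check
    # happens without consuming environ["wsgi.input"] at all.
    cert = environ.get("HTTP_X_CLIENT_CERT")   # hypothetical header
    if cert != "valid-cert":                   # stand-in for check_cert()
        start_response("403 Forbidden", [("Content-Type", "application/json")])
        return [b'{"error": "permission denied"}']
    # Only a request that passed the check should read the upload stream here.
    start_response("200 OK", [("Content-Type", "application/json")])
    return [b'{"status": "ok"}']
```

In Django terms, the same idea means checking request.META before ever touching request.POST. Be aware that many clients will still transmit the whole body unless the server closes the connection or the client uses Expect: 100-continue.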

ERR_CONNECTION_RESET when downloading PDF with POST from ASP.Net Core controller

I created an endpoint that takes a JSON body from a POST request, uses that data to fill in the fields of a PDF, and sends the filled-out file back. When I try to send a POST request using Send and Download in Postman, I initially get a 200 OK back and Postman goes into a loading state, showing how much time the download has taken so far.
After about 2 minutes of this, I get an ECONNRESET error:
Thinking this was just a problem with Postman, I updated a React project of mine to hit the endpoint. I was expecting making the request would start the browser's built in file download feature. Instead, I got a similar error: ERR_CONNECTION_RESET.
I debugged my controller and it seems to be parsing the request body correctly. It also doesn't seem to take too long to return from the endpoint's function. I'm using the File method from ControllerBase to make the response from the file stream, and I make sure not to dispose of the file stream too early.
Per Ali Vahidinasab's request, here is the Postman request exported to C#:
var client = new RestClient("https://localhost:44398/api/charactersheet/download");
client.Timeout = -1;
var request = new RestRequest(Method.POST);
request.AddHeader("Content-Type", "application/json");
request.AddHeader("Accept", "application/pdf");
var body = @"{
" + "\n" +
@"    ""abilityScores"": {
" + "\n" +
@"        ""strength"": 10,
" + "\n" +
@"        ""dexterity"": 11,
" + "\n" +
@"        ""constitution"": 12,
" + "\n" +
@"        ""intelligence"": 13,
" + "\n" +
@"        ""wisdom"": 14,
" + "\n" +
@"        ""charisma"": 15
" + "\n" +
@"    }
" + "\n" +
@"}";
request.AddParameter("application/json", body, ParameterType.RequestBody);
IRestResponse response = client.Execute(request);
Console.WriteLine($"Status Code: {response.StatusCode}");
When I ran this client code, the response had a status code of 0, which I understand means that there is no response from the server.
I guess the crux of this question is: is this an issue with the server or the client? And what can I do to fix it?
I found the issue. It was on the server side: I had to set the position of the MemoryStream I was returning to 0 before returning it.
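For anyone who hits the same symptom in another stack: the pitfall is generic to in-memory streams, not specific to ASP.NET Core. The Python equivalent with io.BytesIO shows it:

```python
import io

buf = io.BytesIO()
buf.write(b"%PDF-1.7 fake pdf bytes")  # generate the file into the stream

print(buf.read())   # b'' -- the position is at the end after writing
buf.seek(0)         # rewind; the ASP.NET equivalent of stream.Position = 0
print(buf.read())   # b'%PDF-1.7 fake pdf bytes'
```

Without the rewind, whatever serializes the stream sees zero bytes remaining and sends an empty (or truncated) body.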

Cookie available in cookie header but not from getCookies()?

I'm seeing some odd behavior with a cookie on the server side, and would like to understand why.
On the client:
document.cookie = 'test_cookie=' + '[AB]cd|ef-gh[IJ]' + '; path=/;';
document.cookie = 'test_cookie2=' + 'cd|ef-gh' + '; path=/;';
On the server:
headers = httpServletRequest.getHeaders()
// iterate and print headers
cookies = httpServletRequest.getCookies();
// iterate and print cookies
Output:
// Both are there on the header, so tomcat doesn't block it:
...
header: cookie: test_cookie=[AB]cd|ef-gh[IJ]; test_cookie2=cd|ef-gh
// Only one shows up from getCookies()
...
cookie: test_cookie2=cd|ef-gh
// no test_cookie ???
Why am I not able to see test_cookie?
I could URI-encode the value before setting it on the client, but I thought '[' and ']' were allowed cookie characters?
Is there a more correct way to set it?
Here's the way to set the cookie correctly on the frontend:
document.cookie = 'test_cookie="[AB]cd|ef-gh[IJ]"; path=/';
Note the double quotes around the cookie value that contains the special characters. By default, Tomcat's cookie parser treats HTTP separator characters such as '[' and ']' as delimiters, so an unquoted value containing them is dropped; wrapping the value in double quotes makes it a valid quoted string.
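If you'd rather not rely on quoted values, percent-encoding the value before setting it (and decoding on read) also works for arbitrary characters. A stdlib-Python sketch of the round trip, for illustration:

```python
from urllib.parse import quote, unquote

value = "[AB]cd|ef-gh[IJ]"
encoded = quote(value, safe="")

print(encoded)            # %5BAB%5Dcd%7Cef-gh%5BIJ%5D
# On the client: document.cookie = 'test_cookie=' + encoded + '; path=/'
print(unquote(encoded))   # [AB]cd|ef-gh[IJ]
```

The encoded form contains only cookie-safe characters, so it survives any compliant cookie parser; the server just has to URL-decode the value after reading it.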

Debugging RCurl-based authentication & form submission

SourceForge Research Data Archive (SRDA) is one of the data sources for my dissertation research, and I'm having difficulty debugging the following issue with SRDA data collection.
Data collection from SRDA requires authenticating and then submitting a Web form with an SQL query. Upon successful processing of the query, the system generates a text file with the query results. While testing my R code for SRDA data collection, I changed the SQL request to make sure that the results file was being regenerated. However, I discovered that the file contents stay the same (they correspond to the previous query). I think the lack of refresh could be due to a failure of either authentication or the query form submission. The following is the debug output from the code (https://github.com/abnova/diss-floss/blob/master/import/getSourceForgeData.R):
make importSourceForge
Rscript --no-save --no-restore --verbose getSourceForgeData.R
running
'/usr/lib/R/bin/R --slave --no-restore --no-save --no-restore --file=getSourceForgeData.R'
Loading required package: RCurl
Loading required package: methods
Loading required package: bitops
Loading required package: digest
Retrieving SourceForge data...
Checking request "SELECT *
FROM sf1104.users a, sf1104.artifact b
WHERE a.user_id = b.submitted_by AND b.artifact_id = 304727"...
* About to connect() to zerlot.cse.nd.edu port 80 (#0)
* Trying 129.74.152.47... * connected
> POST /mediawiki/index.php?title=Special:Userlogin&action=submitlogin&type=login HTTP/1.1
Host: zerlot.cse.nd.edu
Accept: */*
Content-Length: 37
Content-Type: application/x-www-form-urlencoded
* upload completely sent off: 37 out of 37 bytes
< HTTP/1.1 200 OK
< Date: Tue, 11 Mar 2014 03:49:04 GMT
< Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.25 with Suhosin-Patch
< X-Powered-By: PHP/5.2.4-2ubuntu5.25
* Added cookie wiki_db_session="c61...a3c" for domain zerlot.cse.nd.edu, path /, expire 0
< Set-Cookie: wiki_db_session=c61...a3c; path=/
< Content-language: en
< Vary: Accept-Encoding,Cookie
< Expires: Thu, 01 Jan 1970 00:00:00 GMT
< Cache-Control: private, must-revalidate, max-age=0
< Transfer-Encoding: chunked
< Content-Type: text/html; charset=UTF-8
<
* Connection #0 to host zerlot.cse.nd.edu left intact
[1] "Before second postForm()"
* Re-using existing connection! (#0) with host zerlot.cse.nd.edu
* Connected to zerlot.cse.nd.edu (129.74.152.47) port 80 (#0)
> POST /cgi-bin/form.pl HTTP/1.1
Host: zerlot.cse.nd.edu
Accept: */*
Cookie: wiki_db_session=c61...a3c
Content-Length: 129
Content-Type: application/x-www-form-urlencoded
* upload completely sent off: 129 out of 129 bytes
< HTTP/1.1 500 Internal Server Error
< Date: Tue, 11 Mar 2014 03:49:04 GMT
< Server: Apache/2.2.8 (Ubuntu) PHP/5.2.4-2ubuntu5.25 with Suhosin-Patch
< Vary: Accept-Encoding
< Connection: close
< Transfer-Encoding: chunked
< Content-Type: text/html
<
* Closing connection #0
Error: Internal Server Error
Execution halted
make: *** [importSourceForge] Error 1
I've tried to figure this out using debug output as well as Network protocol analyzer from Firefox embedded Developer Tools, but so far without much success. Would appreciate any advice and help.
UPDATE:
if (!require(RCurl)) install.packages('RCurl')
if (!require(digest)) install.packages('digest')
library(RCurl)
library(digest)
# Users must authenticate to access Query Form
SRDA_HOST_URL <- "http://zerlot.cse.nd.edu"
SRDA_LOGIN_URL <- "/mediawiki/index.php?title=Special:Userlogin"
SRDA_LOGIN_REQ <- "&action=submitlogin&type=login"
# SRDA URL that Query Form sends POST requests to
SRDA_QUERY_URL <- "/cgi-bin/form.pl"
# SRDA URL where the query results file is placed
SRDA_QRESULT_URL <- "/qresult/blekh/blekh.txt"
# Parameters for result's format
DATA_SEP <- ":" # data separator
ADD_SQL <- "1" # add SQL to file
curl <<- getCurlHandle()

srdaLogin <- function (loginURL, username, password) {
    curlSetOpt(curl = curl, cookiejar = 'cookies.txt',
               ssl.verifyhost = FALSE, ssl.verifypeer = FALSE,
               followlocation = TRUE, verbose = TRUE)
    params <- list('wpName1' = username, 'wpPassword1' = password)
    if (url.exists(loginURL)) {
        reply <- postForm(loginURL, .params = params, curl = curl,
                          style = "POST")
        #if (DEBUG) print(reply)
        info <- getCurlInfo(curl)
        return (ifelse(info$response.code == 200, TRUE, FALSE))
    }
    else {
        stop("Can't access login URL!")  # base R has no error(); use stop()
    }
}

srdaConvertRequest <- function (request) {
    return (list(select = "*",
                 from = "sf1104.users a, sf1104.artifact b",
                 where = "b.artifact_id = 304727"))
}

srdaRequestData <- function (requestURL, select, from, where, sep, sql) {
    params <- list('uitems' = select,
                   'utables' = from,
                   'uwhere' = where,
                   'useparator' = sep,
                   'append_query' = sql)
    if (url.exists(requestURL)) {
        reply <- postForm(requestURL, .params = params, #.opts = opts,
                          curl = curl, style = "POST")
    }
}

srdaGetData <- function (request) {
    resultsURL <- paste(SRDA_HOST_URL, SRDA_QRESULT_URL,
                        collapse = "", sep = "")
    results.query <- readLines(resultsURL, n = 1)
    return (ifelse(results.query == request, TRUE, FALSE))
}

getSourceForgeData <- function (request) {
    # Construct SRDA login and query URLs
    loginURL <- paste(SRDA_HOST_URL, SRDA_LOGIN_URL, SRDA_LOGIN_REQ,
                      collapse = "", sep = "")
    queryURL <- paste(SRDA_HOST_URL, SRDA_QUERY_URL, collapse = "", sep = "")
    # Log into the system
    if (!srdaLogin(loginURL, USER, PASS))
        stop("Login failed!")
    rq <- srdaConvertRequest(request)
    srdaRequestData(queryURL,
                    rq$select, rq$from, rq$where, DATA_SEP, ADD_SQL)
    if (!srdaGetData(request))
        stop("Data collection failed!")
}

message("\nTesting SourceForge data collection...\n")
getSourceForgeData("SELECT *
FROM sf1104.users a, sf1104.artifact b
WHERE a.user_id = b.submitted_by AND b.artifact_id = 304727")

# clean up
close(curl)
UPDATE 2 (no functions version):
if (!require(RCurl)) install.packages('RCurl')
library(RCurl)
# Users must authenticate to access Query Form
SRDA_HOST_URL <- "http://zerlot.cse.nd.edu"
SRDA_LOGIN_URL <- "/mediawiki/index.php?title=Special:Userlogin"
SRDA_LOGIN_REQ <- "&action=submitlogin&type=login"
# SRDA URL that Query Form sends POST requests to
SRDA_QUERY_URL <- "/cgi-bin/form.pl"
# SRDA URL where the query results file is placed
SRDA_QRESULT_URL <- "/qresult/blekh/blekh.txt"
# Parameters for result's format
DATA_SEP <- ":" # data separator
ADD_SQL <- "1" # add SQL to file
message("\nTesting SourceForge data collection...\n")
curl <- getCurlHandle()
curlSetOpt(curl = curl, cookiejar = 'cookies.txt',
ssl.verifyhost = FALSE, ssl.verifypeer = FALSE,
followlocation = TRUE, verbose = TRUE)
# === Authentication ===
loginParams <- list('wpName1' = USER, 'wpPassword1' = PASS)
loginURL <- paste(SRDA_HOST_URL, SRDA_LOGIN_URL, SRDA_LOGIN_REQ,
collapse="", sep="")
if (url.exists(loginURL)) {
    postForm(loginURL, .params = loginParams, curl = curl, style = "POST")
    info <- getCurlInfo(curl)
    message("\nLogin results - HTTP status code: ", info$response.code, "\n\n")
} else {
    stop("\nCan't access login URL!\n\n")  # base R has no error(); use stop()
}
# === Data collection ===
# Previous query was: "SELECT * FROM sf0305.users WHERE user_id < 100"
query <- list(select = "*",
              from = "sf1104.users a, sf1104.artifact b",
              where = "b.artifact_id = 304727")
getDataParams <- list('uitems' = query$select,
                      'utables' = query$from,
                      'uwhere' = query$where,
                      'useparator' = DATA_SEP,
                      'append_query' = ADD_SQL)
queryURL <- paste(SRDA_HOST_URL, SRDA_QUERY_URL, collapse="", sep="")
if (url.exists(queryURL)) {
    postForm(queryURL, .params = getDataParams, curl = curl, style = "POST")
    resultsURL <- paste(SRDA_HOST_URL, SRDA_QRESULT_URL,
                        collapse = "", sep = "")
    results.query <- readLines(resultsURL, n = 1)
    request <- paste(query$select, query$from, query$where)
    if (results.query == request)
        message("\nData request is successful, SQL query: ", request, "\n\n")
    else
        message("\nData request failed, SQL query: ", request, "\n\n")
} else {
    stop("\nCan't access data query URL!\n\n")  # base R has no error(); use stop()
}
close(curl)
UPDATE 3 (server-side debugging)
Finally, I was able to get in touch with the person responsible for the system, and he helped me narrow the issue down to what appears to be cookie management. Here's the error log record corresponding to a run of my code:
[Fri Mar 21 15:33:14 2014] [error] [client 54.204.180.203] [Fri Mar 21
15:33:14 2014] form.pl: /tmp/sess_3e55593e436a013597cd320e4c6a2fac:
at /var/www/cgi-bin/form.pl line 43
The following is the snippet of the server-side Perl script that generated the error (line 1 of the script is the interpreter directive, so the reported line 43 most likely corresponds to line 44 below):
42 if (-e "/tmp/sess_$file") {
43 $session = PHP::Session->new($cgi->cookie("$session_name"));
44 $user_id = $session->get('wsUserID');
45 $user_name = $session->get('wsUserName');
The following is the session information (1) after authentication and (2) after submitting the data request, obtained by tracing manual authentication and manual data request form submission:
(1) "wiki_dbUserID=449; expires=Sun, 20-Apr-2014 21:04:14 GMT;
path=/wiki_dbUserName=Blekh; expires=Sun, 20-Apr-2014 21:04:14 GMT;
path=/wiki_dbToken=deleted; expires=Thu, 21-Mar-2013 21:04:13 GMT"
(2) wiki_db_session=aaed058f97059174a59effe44b137cbc;
_ga=GA1.2.2065853334.1395410153; EDSSID=e24ff5ed891c28c61f2d1f8dec424274; wiki_dbUserName=Blekh;
wiki_dbLoggedOut=20140321210314; wiki_dbUserID=449
Would appreciate any help in figuring out the problem with my code!
Finally, finally, finally! I have figured out what was causing this problem, which gave me so much headache (figuratively and literally). It forced me to spend a lot of time reading various Internet resources (including many SO questions and answers), debugging my code and communicating with people. The time was not spent in vain, though, as I learned a lot about RCurl, cookies, Web forms and the HTTP protocol.
The cause turned out to be much simpler than I thought. While the direct reason for the form submission failure was cookie management, the underlying reason was using the wrong parameter names (IDs) for the authentication form fields. The two pairs were very similar, and it took only one extra character to trigger the whole problem.
Lesson learned: when facing issues, especially ones dealing with authentication, it's very important to check all names and IDs carefully, multiple times, to make sure they correspond to the ones you are supposed to use. Thank you to everyone who helped or tried to help me with this issue!
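One way to avoid this class of bug is to scrape the field names from the login form itself rather than typing them by hand. A stdlib-Python sketch, for illustration (the HTML below is a made-up stand-in for the fetched login page):

```python
from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    """Collect the name attribute of every <input> element."""
    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        if tag == "input":
            attrs = dict(attrs)
            if "name" in attrs:
                self.fields.append(attrs["name"])

login_page = """
<form action="/mediawiki/index.php" method="post">
  <input type="text" name="wpName1">
  <input type="password" name="wpPassword1">
  <input type="submit" name="wpLoginattempt" value="Log in">
</form>
"""  # stand-in for the real page

parser = FormFieldParser()
parser.feed(login_page)
print(parser.fields)   # ['wpName1', 'wpPassword1', 'wpLoginattempt']
```

Building the POST parameters from the scraped names makes a one-character mismatch impossible, and the scrape fails loudly if the form ever changes.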
I've simplified the code still further:
library(httr)
base_url <- "http://srda.cse.nd.edu"
loginURL <- modify_url(
base_url,
path = "mediawiki/index.php",
query = list(
title = "Special:Userlogin",
action = "submitlogin",
type = "login",
wpName1 = USER,
wpPassword1 = PASS
)
)
r <- POST(loginURL)
stop_for_status(r)
queryURL <- modify_url(base_url, path = "cgi-bin/form.pl")
query <- list(
uitems = "user_name",
utables = "sf1104.users a, sf1104.artifact b",
uwhere = "a.user_id = b.submitted_by AND b.artifact_id = 304727",
useparator = ":",
append_query = "1"
)
r <- POST(queryURL, body = query, multipart = FALSE)
stop_for_status(r)
But I'm still getting a 500. I tried:
setting extra cookies that I see in the browser (wiki_dbUserID, wiki_dbUserName)
setting header DNT to 1
setting referer to http://srda.cse.nd.edu/cgi-bin/form.pl
setting user-agent the same as chrome
setting accept "text/html"
The following, from RFC 2616 (the HTTP/1.1 specification), provides clarification for the error situation:
10.5 Server Error 5xx
Response status codes beginning with the digit "5" indicate cases in
which the server is aware that it has erred or is incapable of
performing the request. Except when responding to a HEAD request, the
server SHOULD include an entity containing an explanation of the error
situation, and whether it is a temporary or permanent condition. User
agents SHOULD display any included entity to the user. These response
codes are applicable to any request method.
10.5.1 500 Internal Server Error
The server encountered an unexpected condition which prevented it from
fulfilling the request.
My interpretation of section 10.5 is that there should be a more detailed explanation of the error situation beyond the one provided in section 10.5.1. However, I recognize that it may well be that the message for status code 500 (section 10.5.1) is considered sufficient. Confirmation of either interpretation is welcome!

How to decompress/inflate an XML response from ASP

Can anyone provide some insight into how I'd go about decompressing an XML response in classic ASP? We've been handed some code and asked to get it working:
Set oXMLHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
URL = HttpServer + re_domain + ".do;jsessionid=" + ue_session + "?" + data
oXMLHttp.setTimeouts 5000, 60000, 1200000, 1200000
oXMLHttp.open "GET", URL, false
oXMLHttp.setRequestHeader "Accept-Encoding", "gzip"
oXMLHttp.send()
if oXMLHttp.status = 200 Then
    if oXMLHttp.responseText = "" then
        htmlrequest_get = "Empty Response from Server"
    else
        htmlrequest_get = oXMLHttp.responseText
    end if
else
    ...
Apparently, now that the response is compressed using gzip, we have to decompress the XML response before we can start to work with the data.
How should I go about this?
ServerXMLHTTP does not support compression.
You may however try to use a GZip component:
http://www.vclcomponents.com/ASP/File_Manipulation/File_Management/GZip_Component-info.html
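For reference, the decompression step itself is trivial in environments that ship a gzip library; Python's stdlib, for example, handles it in one call (shown here only to illustrate what the component has to do for you on the ASP side):

```python
import gzip

# A gzip-compressed XML payload, like the one ServerXMLHTTP would receive
compressed = gzip.compress(b"<response><data>42</data></response>")

xml = gzip.decompress(compressed)
print(xml)   # b'<response><data>42</data></response>'
```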
Oooops, didn't check the date of the questions! :)
