I'm using curl to perform a POST request, but I can't assume curl is available on my target platform, so I'm trying to rewrite my curl request as a plain HTTP request (which is guaranteed to be available). My knowledge of both curl and HTTP is very limited, so I'm hoping someone can point out what I'm doing wrong.
My curl request (command line):
curl.exe POST https://xxxxxx.ingest.sentry.io/api/xxxxxxx/minidump/?sentry_key=xxxxxxxxxxxxxxxxxxxxxxx -F upload_file_minidump=@"C:\path\Minidump.dmp" -F upload_file_log=@"C:\path\program.log"
A relevant part of curl's output is shown below. This is after connecting to the server and sending the POST request: the server lets the client know the first file can be sent (100 Continue), and curl responds by first sending the part's headers and then the file data (truncated here).
<= Recv header, 23 bytes (0x17)
0000: HTTP/1.1 100 Continue
=> Send data, 175 bytes (0xaf)
0000: --------------------------f2a4a742c08bf427
002c: Content-Disposition: form-data; name="upload_file_minidump"; fil
006c: ename="UE4Minidump.dmp"
0085: Content-Type: application/octet-stream
00ad:
=> Send data, 16384 bytes (0x4000)
0000: MDMP..a..... .......m/S`.........................;..............
0040: 8Z......T...=...........`.......8...........T................[..
0080: .........\...........]..........= ..............................
00c0: ....................................aJ.......`......Lw..........
0100: ............T........?..i/S`........ ... ... ............ ......
0140: ............G.M.T. .S.t.a.n.d.a.r.d. .T.i.m.e...................
0180: ................................G.M.T. .D.a.y.l.i.g.h.t. .T.i.m.
01c0: e...................................................1.9.0.4.1...
..etc..
By reading the verbose output of curl, I've created an HTTP request that looks like this (C++ code using Unreal Engine 4 libraries):
TSharedRef<IHttpRequest, ESPMode::ThreadSafe> httpRequest = FHttpModule::Get().CreateRequest();
httpRequest->SetURL(TEXT("https://xxxxxx.ingest.sentry.io/api/xxxxxx/minidump/?sentry_key=xxxxxxxxxxxxxxxxxxxxxxxx"));
httpRequest->SetVerb(TEXT("POST"));
const FString boundary(TEXT("------------------------f2a4a742c08bf427"));
httpRequest->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=") + boundary);
const FString fileName(FPaths::Combine(path, crashToReport.folderName, TEXT("UE4Minidump.dmp")));
ensure(FPaths::FileExists(fileName));
const FString prefixBoundary(TEXT("\r\n--") + boundary + TEXT("\r\n"));
const FString fileHeader(TEXT("Content-Disposition: form-data; name=\"upload_file_minidump\"; filename=\"UE4Minidump.dmp\"\r\nContent-Type: application/octet-stream\r\n\r\n"));
FString fileContents;
FFileHelper::LoadFileToString(fileContents, *fileName);
const FString suffixBoundary(TEXT("\r\n--") + boundary + TEXT("--\r\n"));
const FString content(prefixBoundary + fileHeader + fileContents + suffixBoundary);
httpRequest->SetContentAsString(content);
This works to a degree: the server now accepts the request and receives the file. However, the file ends up being unreadable server-side, which leads me to think I'm not sending it in the right format.
What kind of data is expected in a multipart/form-data request?
One thing I notice is that curl sends the part's headers separately from the file data (the first chunk of 175 bytes). I would love some information on how to achieve that!
I finally figured it out. I'm not sure exactly what I was doing wrong, but I think it had to do with what happens under the hood in httpRequest->SetContentAsString(..), which reliably caused the backend to fail to interpret the binary file I was trying to send. I ended up reading the binary file .. as a binary file:
TArray<uint8> dumpFileData;
FFileHelper::LoadFileToArray(dumpFileData, *FPaths::Combine(path,crashToReport.folderName, TEXT("UE4Minidump.dmp")));
Then I send it via the POST request, much as before, but add the complete form-data part as binary data:
TSharedRef<IHttpRequest, ESPMode::ThreadSafe> httpRequest = FHttpModule::Get().CreateRequest();
httpRequest->SetURL(TEXT("https://xxxxx.ingest.sentry.io/api/xxxxxx/minidump/?sentry_key=xxxxxxxxxxxxxxxxxxxxxxxxxxx"));
httpRequest->SetVerb(TEXT("POST"));
const FString boundary(TEXT("------------------------bb33b671b1212234"));
httpRequest->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=") + boundary);
httpRequest->SetHeader(TEXT("Accept"), TEXT("*/*"));
httpRequest->SetHeader(TEXT("Expect"), TEXT("100-continue"));
{
    const FString prefixBoundary(TEXT("--") + boundary + TEXT("\r\n"));
    const FString fileHeader(TEXT("Content-Disposition: form-data; name=\"upload_file_minidump\"; filename=\"UE4Minidump.dmp\"\r\nContent-Type: application/octet-stream\r\n\r\n"));
    const FString suffixBoundary(TEXT("\r\n--") + boundary + TEXT("--\r\n"));
    TArray<uint8> CombinedContent;
    CombinedContent.Append(FStringToUint8(prefixBoundary + fileHeader));
    CombinedContent.Append(dumpFileData);
    CombinedContent.Append(FStringToUint8(suffixBoundary));
    httpRequest->SetContent(CombinedContent);
}
httpRequest->ProcessRequest();
For completeness, FStringToUint8 is defined as follows:
// Convert FString to UTF8 and put it in a TArray
TArray<uint8> FStringToUint8(const FString& InString)
{
    TArray<uint8> OutBytes;
    // Handle empty strings
    if (InString.Len() > 0)
    {
        FTCHARToUTF8 Converted(*InString); // Convert to UTF8
        OutBytes.Append(reinterpret_cast<const uint8*>(Converted.Get()), Converted.Length());
    }
    return OutBytes;
}
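To answer my own "what kind of data is expected" question: a multipart/form-data body is just a byte stream laid out as a boundary line, the part's headers, a blank line, the raw file bytes, and a closing boundary. As a minimal sketch outside UE4 (Python; dump.dmp and the boundary value are placeholders, not my real paths), the body curl builds looks roughly like this:
# Sketch of the raw multipart/form-data body (illustrative only).
boundary = b"------------------------f2a4a742c08bf427"

with open("dump.dmp", "rb") as f:  # binary read: no text decoding anywhere
    file_bytes = f.read()

body = (
    b"--" + boundary + b"\r\n"
    b'Content-Disposition: form-data; name="upload_file_minidump"; filename="UE4Minidump.dmp"\r\n'
    b"Content-Type: application/octet-stream\r\n"
    b"\r\n"                                 # blank line separates the part headers from the data
    + file_bytes                            # the file's bytes go in untouched
    + b"\r\n--" + boundary + b"--\r\n"      # closing boundary ends with an extra "--"
)

# The request headers would then declare:
#   Content-Type: multipart/form-data; boundary=------------------------f2a4a742c08bf427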
I’m trying to rework a script I found online to control a Panasonic TV, which requires a secure/encrypted pairing to occur so I can control it remotely. (The full code here -> https://forum.logicmachine.net/showthread.php?tid=232&pid=16580#pid16580)
Because it seems to be built on LuaJIT and uses some other proprietary Lua elements, I'm trying to find alternatives that will allow it to work with the Lua 5.1 install on a Vera Home Automation controller (a relatively closed system).
Also, and perhaps most importantly, I'd like as much of the converted code as possible to have minimal requirements on external modules. I should add that I've only recently started learning Lua, but one way I like to learn is to convert/repurpose code I find online.
So far I've managed to find alternatives for a number of the modules being used, e.g.:
encdec.base64dec -> Lua Base64 Encode
lmcore.hextostr -> https://github.com/tst2005/binascii/blob/master/binascii.lua
storage.set -> Alternative found in Vera Home Controllers
storage.get -> Alternative found in Vera Home Controllers
bit.band -> Bitware module in Vera Home Controllers
bit.bxor -> Bitware module in Vera Home Controllers
Where I’m stuck is with the following..
aes:new
aes.cipher
user.aes
encdec.hmacsha256
Here’s an extract of the code where the above are used.
function encrypt_soap_payload(data, key, hmac_key, iv)
    payload = '000000000000'
    n = #data
    payload = payload .. string.char(bit.band(bit.rshift(n, 24), 0xFF))
    payload = payload .. string.char(bit.band(bit.rshift(n, 16), 0xFF))
    payload = payload .. string.char(bit.band(bit.rshift(n, 8), 0xFF))
    payload = payload .. string.char(bit.band(n, 0xFF))
    payload = payload .. data
    aes_cbc, err = aes:new(key, nil, aes.cipher(128, 'cbc'), { iv = iv }, nil, 1)
    ciphertext = aes_cbc:encrypt(payload)
    sig = encdec.hmacsha256(ciphertext, hmac_key, true)
    encrypted_payload = encdec.base64enc(ciphertext .. sig)
    return encrypted_payload
end

function decrypt_soap_payload(data, key, hmac_key, iv)
    aes_cbc, err = aes:new(key, nil, aes.cipher(128, 'cbc'), { iv = iv }, nil, 0)
    decrypted = aes_cbc:decrypt(encdec.base64dec(data))
    decrypted = string.gsub(string.sub(lmcore.strtohex(decrypted), 33), '%x%x', function(value) return string.char(tonumber(value, 16)) end)
    return decrypted
end
I can get to the point where I can create the parameters for the payload encrypt request (example below); it's the encryption/decryption I can't do.
data="1234"
key="\\S„ßÍ}/Ìa5!"
hmac_key="¹jz¹2¸F\r}òcžÎ„ 臧.ª˜¹=¤µæŸ"
iv=" {¬£áæ‚2žâ3ÐÞË€ú "
I've found an aes.lua module online, but that requires loads of other modules, most notably ffi.lua. Ideally I'd like to avoid using that. I also came across this aes128.lua, but I'm not sure how that handles all the other parameters, e.g. CBC etc. Finally, there's this aes256ecb.lua script; could that be converted to AES-128-CBC and then used in the above?
Is anyone aware of (or does anyone have) a Lua script that can handle the AES-CBC requirements above?
Many thanks !
In the end I found out that I could do aes.cbc by calling openssl from the command line, e.g.
local payload = "ENTER HERE"
local key = "ENTER HERE"
local iv = "ENTER HERE"
local buildsslcommand = "openssl enc -aes-128-cbc -nosalt -e -a -A "..payload.." -K "..key.." -iv "..iv
-- print("Command to send = " ..buildsslcommand)
local file = assert(io.popen(buildsslcommand, 'r'))
local output = file:read('*all')
file:close()
-- print(string.len(output)) --> just count what's returned.
-- print(output) -- > Prints the output of the command.
FYI: it looks like I could do encdec.hmacsha256 via OpenSSL as well, but I've not been able to get that working :-(
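In case it helps anyone cross-checking a pure-Lua port: here is a rough Python sketch of what encrypt_soap_payload above does. It is only an illustration under assumptions: it uses the pycryptodome package for AES-128-CBC, and it assumes the trailing 1 passed to aes:new enables standard PKCS#7 padding.
import base64
import hashlib
import hmac
import struct
from Crypto.Cipher import AES  # pycryptodome

def encrypt_soap_payload(data, key, hmac_key, iv):
    # 12 ASCII zeros + 4-byte big-endian length of the data, then the data itself
    payload = b"000000000000" + struct.pack(">I", len(data)) + data
    # PKCS#7 padding to the 16-byte AES block size (assumption, see above)
    pad = 16 - len(payload) % 16
    payload += bytes([pad]) * pad
    ciphertext = AES.new(key, AES.MODE_CBC, iv).encrypt(payload)
    # HMAC-SHA256 of the ciphertext, appended before base64 encoding
    sig = hmac.new(hmac_key, ciphertext, hashlib.sha256).digest()
    return base64.b64encode(ciphertext + sig).decode()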
I have a compressed file.
How do I send it to the browser and make it decode it?
ASP.NET Core supports compression like this:
services.AddResponseCompression();
...
app.UseResponseCompression();
app.Run(async (context) =>
{
    var file = System.IO.File.ReadAllText("file.txt");
    await context.Response.WriteAsync(file);
});
But my file is already compressed. I tried just setting the headers, reading the file and sending the content, but it messes with the encoding.
app.Run(async (context) =>
{
    context.Response.Headers.Add("Content-Type", "text/plain");
    context.Response.Headers.Add("Content-Encoding", "gzip");
    context.Response.Headers.Add("Vary", "Accept-Encoding");
    var file = System.IO.File.ReadAllText("compressed.gz");
    await context.Response.WriteAsync(file);
});
If I just send the file, Firefox and Chrome try to download it, but do not decompress it.
Proof of that:
$ cat compressed.gz | hexdump | head -1
0000000 8b1f 0808 3768 5bc6 0302 6966 656c 742e
# that's the gzip magic number 1f 8b plus the compression method byte 08 (hexdump prints 16-bit words byte-swapped)
$ curl -X GET http://localhost:5000/ -H 'Accept-Encoding: gzip' -H 'Cache-Control: no-cache' --output - | hexdump | head -1
0000000 ef1f bdbf 0808 3768 bfef 5bbd 0302 6966
How do I prevent encoding problems?
As #t.niese pointed out, the problem is reading the compressed file as text: bytes that aren't valid UTF-8 (such as 0x8b) get replaced with the UTF-8 replacement character ef bf bd, which is exactly what the second hexdump shows.
The proper way to do this task is to return a FileStreamResult.
Example:
Response.Headers["Content-Encoding"] = "gzip";
return new FileStreamResult(new FileStream(fileName, FileMode.Open), "text/plain");
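The key point is language-agnostic: read the already-compressed file as raw bytes and send them unchanged, with Content-Encoding: gzip so the browser decompresses it. Outside ASP.NET, a minimal sketch of the same idea (Python's standard http.server, assuming a compressed.gz in the working directory):
from http.server import BaseHTTPRequestHandler, HTTPServer

class GzipHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open("compressed.gz", "rb") as f:  # bytes, never text
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Encoding", "gzip")  # browser decompresses transparently
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8000), GzipHandler).serve_forever()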
I am working on a simple R package to submit hashes for trusted timestamping and to get timestamp info back through OriginStamp. I manage to get the information, but I do not manage to POST a hash.
I am using the httr package in R. My package ROriginStamp is on github, and the function which I do not get to work is store_hash_info().
Whenever I execute it, I get:
> store_hash(hash = "c7be1ed902fb8dd4d48997c6452f5d7e509fbcdbe2808b16bcf4edce4c07d14e")
Error in store_hash(hash = "c7be1ed902fb8dd4d48997c6452f5d7e509fbcdbe2808b16bcf4edce4c07d14e") :
Bad Request (HTTP 400).
3.
stop(http_condition(x, "error", task = task, call = call))
2.
httr::stop_for_status(result$response) at store_hash.R#29
1.
store_hash(hash = "c7be1ed902fb8dd4d48997c6452f5d7e509fbcdbe2808b16bcf4edce4c07d14e")
>
The function is defined as follows:
store_hash <- function(
    hash,
    error_on_fail = TRUE,
    information = NULL
) {
    result <- new_OriginStampResponse()
    ##
    url <- paste0("https://api.originstamp.org/api/", hash)
    request_body_json <- jsonlite::toJSON( information, auto_unbox = TRUE )
    result$response <- httr::POST(
        url,
        httr::add_headers(
            Authorization = get_option("api_key"),
            body = request_body_json
        ),
        httr::content_type_json()
    )
    if (error_on_fail) {
        httr::stop_for_status(result$response)
    }
    ##
    try(
        {
            result$content <- httr::content(
                x = result$response,
                as = "text"
            )
            result$content <- jsonlite::fromJSON( result$content )
        },
        silent = TRUE
    )
    ##
    return(result)
}
The function get_option("api_key") just returns my api key.
Any suggestions what I am doing wrong?
Edits
Thanks to Thomas Hepp, here is a curl command which does work:
curl 'https://api.originstamp.org/api/ff55d7bc3fe6cb2958e4bdda3d4a4a8e528fb67d9194991e9539d97a55cda2a3' \
-H 'authorization: YOUR API KEY' \
-H 'content-type: application/json' \
-H 'accept: application/json' \
-H 'user-agent: OriginStamp cURL Test' \
--data-binary '{"url":null,"email":null,"comment":"this is a test","submit_ops":["multi_seed"]}'
I’m not familiar with R. But, it’s possible to timestamp a file using opentimestamps.org, by posting a hash of the file to one of their calendar servers, using the following methodology. There is no need for an API key, and this procedure can be used to prove the existence of the file at a point in time, via a reference to a value stored in the OP_RETURN field of a bitcoin transaction, in a block in the bitcoin blockchain.
As an example, first, create a test file:
$ echo -n 'this is a test... this is only a test...' > file.txt
Now, take a sha256 hash of the file:
$ sha256sum file.txt
This produces:
c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca
Now, to create a timestamp of the file, post the raw bytes of the hash to one of the calendar servers (e.g. https://a.pool.opentimestamps.org/). This can be done using curl, like so:
$ echo -n 'c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca' | xxd -r -p | curl -X POST --data-binary - https://a.pool.opentimestamps.org/digest > out.ots
[Optional: If you don’t want to disclose the hash of the file that you are timestamping to Opentimestamps, you can add a random salt to the file hash, then do sha256(original file hash + salt) and post the result of this.]
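A small sketch of that optional salting step (values hypothetical):
import hashlib, os

filehash = bytes.fromhex('c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca')
salt = os.urandom(16)                    # keep this salt private, alongside the file
salted_digest = hashlib.sha256(filehash + salt).digest()
# post salted_digest to the calendar server instead of the raw file hash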
The response from the curl request above is redirected to a file out.ots. To get the status of the timestamp, we need to parse the raw bytes of out.ots.
First, view the raw bytes of the file using a hex editor, or xxd:
$ xxd out.ots
00000000: f010 95ee b35a b002 5b8b 5e76 3522 6970 .....Z..[.^v5"ip
00000010: 886c 08f1 0462 be3e 32f0 081a ff1c ae94 .l...b.>2.......
00000020: 4f00 4600 83df e30d 2ef9 0c8e 2e2d 6874 O.F..........-ht
00000030: 7470 733a 2f2f 616c 6963 652e 6274 632e tps://alice.btc.
00000040: 6361 6c65 6e64 6172 2e6f 7065 6e74 696d calendar.opentim
00000050: 6573 7461 6d70 732e 6f72 67 estamps.org
Some of the bytes represent instructions, as follows:
f0 xx: append xx bytes
f1 xx: prepend xx bytes
08: sha256 hash
00: stop
Start with the hash that we timestamped:
c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca
then, proceed by parsing the response of the POST request. The first two bytes are f0 10. This means append the next 16 bytes (10 in hex is 16 in decimal). This produces:
c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca95eeb35ab0025b8b5e7635226970886c
Continuing parsing the POST response, the next byte is 08. This means take the sha256 hash of the above.
$ echo -n 'c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca95eeb35ab0025b8b5e7635226970886c' | xxd -p -r | sha256sum
produces:
f7d0917a163a8df26066cd669eb12e2d0d59bb5f454aaee338dcc0694ef35090
Continuing, we have f1 04. This means prepend the next 4 bytes to the above. This produces:
62be3e32f7d0917a163a8df26066cd669eb12e2d0d59bb5f454aaee338dcc0694ef35090
Next, we have f0 08. Append the next 8 bytes. This produces:
62be3e32f7d0917a163a8df26066cd669eb12e2d0d59bb5f454aaee338dcc0694ef350901aff1cae944f0046
Finally, we have 00. This means stop. At this point, skip the next 10 bytes, then extract starting from this point to get the URL that we’ll need to get the status of the timestamp:
https://alice.btc.calendar.opentimestamps.org
Then concatenate '/timestamp/' followed by the result above:
https://alice.btc.calendar.opentimestamps.org/timestamp/62be3e32f7d0917a163a8df26066cd669eb12e2d0d59bb5f454aaee338dcc0694ef350901aff1cae944f0046
The status of the timestamp can be accessed by making a GET request to the above URL:
curl https://alice.btc.calendar.opentimestamps.org/timestamp/62be3e32f7d0917a163a8df26066cd669eb12e2d0d59bb5f454aaee338dcc0694ef350901aff1cae944f0046
returns:
Pending confirmation in Bitcoin blockchain
If you wait a few hours, the confirmation will be written to the bitcoin blockchain, and the above GET request will return a much longer proof like the one above, eventually chaining-up to a value that is written to the OP_RETURN field of a bitcoin transaction, in a block in the bitcoin blockchain. By saving this proof, you can verify the existence of the file at the point in time that the block was written to the blockchain, without the need to query the Opentimestamps servers.
The following python script automates the above procedure:
import hashlib
import requests

filehash=bytes.fromhex('c16d7c8e23baf68525cf0a42fff6b394fdba1791db9817fd601b3f73e2f5fbca')
server='https://a.pool.opentimestamps.org/digest'

print('posting ' + filehash.hex() + ' to ' + server)
response=requests.post(url=server, data=filehash)
bytearray=response.content

print('saving response to ./out.ots')
f=open('./out.ots', 'wb')
f.write(bytearray)
f.close()

print('analysing response')
i=0
ptr=0x00
result=filehash
while(True):
    print(i)
    print('ptr:', hex(ptr))
    nextinstruction=bytearray[ptr]
    if(nextinstruction==0xf0):
        #append
        ptr+=1
        numberofbytes=bytearray[ptr]
        ptr+=1
        bytesegment=bytearray[ptr:ptr+numberofbytes]
        print('append ', bytesegment.hex())
        result=result+bytesegment
        ptr+=numberofbytes
    elif(nextinstruction==0xf1):
        #prepend
        ptr+=1
        numberofbytes=bytearray[ptr]
        ptr+=1
        bytesegment=bytearray[ptr:ptr+numberofbytes]
        print('prepend ', bytesegment.hex())
        result=bytesegment+result
        ptr+=numberofbytes
    elif(nextinstruction==0x08):
        #sha256
        print('sha256')
        result=hashlib.sha256(result).digest()
        ptr+=1
    elif(nextinstruction==0xff):
        #fork
        print('fork')
        ptr+=1
    elif(nextinstruction==0x00):
        #stop
        print('stop')
        ptr+=11
        url=bytearray[ptr:].decode()
        url=url + '/timestamp/' + result.hex()
        print('url: ', url)
    else:
        print('invalid ots file format')
        quit()
    print('result:', result.hex())
    print('-----')
    i+=1
    if(nextinstruction==0x00): break

print('to get status of timestamp, make a GET request to ' + url)
print('GET ' + url)
response=requests.get(url)
print(response.text)
I am trying to read from a TCP connection which contains HTTP/2 data. Below is the code for reading the HEADERS frame:
framer := http2.NewFramer(conn, conn)
frame, _ := framer.ReadFrame()
fmt.Printf("fh type: %s\n", frame.Header().Type)
fmt.Printf("fh type: %d\n", frame.Header().Type)
fmt.Printf("fh flag: %d\n", frame.Header().Flags)
fmt.Printf("fh length: %d\n", frame.Header().Length)
fmt.Printf("fh streamid: %d\n", frame.Header().StreamID)
headersframe := frame.(*http2.HeadersFrame)
fmt.Printf("stream ended? %v\n", headersframe.StreamEnded())
fmt.Printf("block fragment: %x\n", headersframe.HeaderBlockFragment())
I send a request using curl as follows:
curl -v https://127.0.0.1:8000/ -k --http2
This is the output I get (after reading the connection preface and SETTINGS) if I read from the conn using the above code:
fh type: HEADERS
fh type: 1
fh flag: 5
fh length: 30
fh streamid: 1
stream ended? true
block fragment: 828487418a089d5c0b8170dc6c4d8b7a8825b650c3abb6f2e053032a2f2a
I understand the output, except for the block fragment part: how do I decode it into an ASCII string? I want to know the protocol/method/URL path information.
The "header block fragment" is encoded using HPACK.
Go has an implementation to encode and decode HPACK, so you don't have to write your own.
You can find here an example of using both the encoder and decoder Go API.
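As a quick cross-check outside Go, the same fragment can also be decoded with the Python hpack package (a sketch; assumes pip install hpack and uses the block fragment printed above):
from hpack import Decoder  # pip install hpack

fragment = bytes.fromhex(
    "828487418a089d5c0b8170dc6c4d8b7a8825b650c3abb6f2e053032a2f2a")
for name, value in Decoder().decode(fragment):
    print(name, value)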
I figured it out using the Go hpack library (https://godoc.org/golang.org/x/net/http2/hpack):
decoder := hpack.NewDecoder(2048, nil)
hf, _ := decoder.DecodeFull(headersframe.HeaderBlockFragment())
for _, h := range hf {
    fmt.Printf("%s\n", h.Name + ":" + h.Value)
}
This prints:
:method:GET
:path:/
:scheme:https
:authority:127.0.0.1:5252
user-agent:curl/7.58.0
accept:*/*
I'm trying to use Python requests to PUT a .pmml model to a local openscoring server.
This works (from the directory containing DecisionTreeIris.pmml):
curl -X PUT --data-binary @DecisionTreeIris.pmml -H "Content-type: text/xml" http://localhost:8080/openscoring/model/DecisionTreeIris
This doesn't:
import requests
file = '/Users/weitzenfeld/IntelliJProjects/openscoring/openscoring-server/etc/DecisionTreeIris.pmml'
r = requests.put('http://localhost:8080/openscoring/model/DecisionTreeIris', files={'file': open(file, 'rb')})
r.text
returns:
u'<html>\n<head>\n<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>\n<title>Error 415 </title>\n</head>\n<body>\n<h2>HTTP ERROR: 415</h2>\n<p>Problem accessing /openscoring/model/DecisionTreeIris. Reason:\n<pre> Unsupported Media Type</pre></p>\n<hr /><i><small>Powered by Jetty://</small></i>\n</body>\n</html>\n'
I also tried:
r = requests.put('http://localhost:8080/openscoring/model/DecisionTreeIris', files={'file': open(file, 'rb')}, headers={'Content-type': 'text/xml', 'Accept': 'text/xml'})
r.text
which returns:
u'<html>\n<head>\n<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>\n<title>Error 406 </title>\n</head>\n<body>\n<h2>HTTP ERROR: 406</h2>\n<p>Problem accessing /openscoring/model/DecisionTreeIris. Reason:\n<pre> Not Acceptable</pre></p>\n<hr /><i><small>Powered by Jetty://</small></i>\n</body>\n</html>\n'
Note that my python attempt is the same as in the accepted answer to this question: Using Python to PUT PMML.
Also, someone with >1500 rep should consider making an 'openscoring' tag.
You should check the annotations of the method org.openscoring.service.ModelResource#deploy(String, HttpServletRequest) for valid request/response MIME types.
The first request fails because the server only accepts application/xml and text/xml payloads. The second request fails because the server emits application/json payloads, but your client is only willing to accept text/xml payloads.
Solution was to PUT the data, not the file handle: with data=, requests sends the raw file bytes as the request body (matching curl's --data-binary), whereas files= builds a multipart/form-data body.
r = requests.put('http://localhost:8080/openscoring/model/DecisionTreeIris', data=open(file, 'rb'), headers={'Content-type': 'text/xml', 'Accept': 'text/xml'})