Retrofit Response Parsing

I'm looking for a way to control when the response will be parsed as JSON.
I call an endpoint whose result I can't modify, and it returns a plain 'OK' from a PHP script. The problem is that the JSON parser tries to parse this and fails:
D/Retrofit( 6334): <--- HTTP 200 https://somewhere.com/endpoint.php?idfv=android_id&UserInterfaceIdiom=hammerhead&systemVersion=1.0&status=Not+Set&batteryLevel=100%25&localizedModel=Nexus+5&systemName=Android+OS&bundleShortVersion=1&language=eng&batteryState=USB&bundeIdentifier=com.packagename&bundleVersion=1.0 (475ms)
D/Retrofit( 6334): : HTTP/1.1 200 OK
D/Retrofit( 6334): Connection: Keep-Alive
D/Retrofit( 6334): Content-Type: text/html
D/Retrofit( 6334): Date: Tue, 07 Oct 2014 14:49:20 GMT
D/Retrofit( 6334): Keep-Alive: timeout=5, max=100
D/Retrofit( 6334): OkHttp-Received-Millis: 1412693360928
D/Retrofit( 6334): OkHttp-Response-Source: NETWORK 200
D/Retrofit( 6334): OkHttp-Selected-Protocol: http/1.1
D/Retrofit( 6334): OkHttp-Sent-Millis: 1412693360859
D/Retrofit( 6334): Server: Apache
D/Retrofit( 6334): Vary: Accept-Encoding
D/Retrofit( 6334): X-Powered-By: PHP/5.4.4-14+deb7u14
D/Retrofit( 6334): OK
D/Retrofit( 6334): <--- END HTTP (2-byte body)
My Rx subscription gets called with an error saying that the parsing went wrong:
retrofit.RetrofitError: com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was STRING at line 1 column 1
I can't, for example, set the JSON parser/converter to null; passing null results in a NullPointerException being thrown by Retrofit.

Use Response as the return type (or Callback generic parameter) which won't trigger parsing the body using the specified Converter. This object gives you a representation of the HTTP response where you can query the status code, headers, and body directly, if you need to.
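For illustration, a minimal sketch assuming Retrofit 1.x with RxJava (matching the log output above); the interface name, method name, and query parameters are placeholders, not the actual API:

import retrofit.client.Response;
import retrofit.http.GET;
import retrofit.http.Query;
import rx.Observable;

public interface StatusService {
    // Declaring Response (or Observable<Response>) as the return type means the
    // body is never handed to the Gson Converter, so the plain-text "OK" cannot
    // cause a JsonSyntaxException.
    @GET("/endpoint.php")
    Observable<Response> sendStatus(@Query("idfv") String idfv,
                                    @Query("systemName") String systemName);
}

If you ever need the raw body, you can read it yourself from response.getBody().in() (a plain InputStream) instead of going through a Converter.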

Related

Loadrunner - Getting message in HTTP response "HTML parsing not performed for Content-Type "application/xml""

The server I work with sends and receives content of type "application/xml".
In my init section I added the line below to automatically add this header to all my requests:
web_add_auto_header("Content-Type","application/xml");
When I run the script, the response header shows the correct content type, but in the log I get this:
351-byte response headers for "http://172.29.67.68/svc/bw/cti/monitor/event/bw_perfuser1000_60a439f7-599d-4fe1-baa6-598391312954" (RelFrameId=1, Internal ID=5)
HTTP/1.1 200 OK\r\n
Date: Mon, 11 Mar 2019 18:20:09 GMT\r\n
Content-Length: 681\r\n
Content-Type: application/xml\r\n
X-Frame-Options: SAMEORIGIN\r\n
Expires: Thu, 01 Jan 1970 00:00:00 GMT\r\n
Cache-Control: no-cache, private, must-revalidate, max-stale=0, post-check=0, pre-check=0, no-store\r\n
Pragma: no-cache\r\n
Keep-Alive: timeout=15, max=96\r\n
Connection: Keep-Alive\r\n
The message I get:
HTML parsing not performed for Content-Type "application/xml" ("ParseHtmlContentType" Run-Time Setting is "TEXT").
To fix this issue, I have to add the line below before each request:
web_add_header("Content-Type","application/xml");
Can anyone please explain why I need to explicitly set the content type before each request even though I used the web_add_auto_header() function?
In the HTTP protocol, you need to specify the request header fields on each HTTP request. For details on HTTP headers, see the Wikipedia article on HTTP header fields.

Getting strange http response codes, but the site is actually working

When I view either of the URLs below in a browser, the page displays fine, and I don't see anything unusual in the network tab when I press F12. With the code below, however, I get response codes 403 or 400. When I use the response code checker at http://httpstatus.io/ it comes back fine with a 200 response for both URLs.
I get a 403 for http://psychsignal.com/ using my code below.
URL u = new URL("http://www.nasdaqomxnordic.com/"); //returns 400 response code
//u.toURI(); //to check the syntax
HttpURLConnection huc = (HttpURLConnection)u.openConnection();
huc.setRequestMethod("GET");
//huc.setRequestMethod("HEAD");
huc.connect();
System.out.println(huc.getResponseCode());
Thanks if anyone has any ideas! This is actually my first post!
My guess is that there are restrictions placed on the client's User-Agent. Some testing seems to support my theory:
If I use the curl default user agent:
# curl -I -H "User-Agent: curl/7.35.0" "http://www.nasdaqomxnordic.com/"
HTTP/1.1 400 Bad Request
Content-Type: text/html; charset=UTF-8
Cache-Control: no-cache
Pragma: no-cache
Expires: 0
Connection: close
If I use a hacked up standard browser agent string:
# curl -I -H "User-Agent: Mozilla/5.0" -0 "http://www.nasdaqomxnordic.com/"
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Content-Length: 0
Content-Type: text/html;charset=UTF-8
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Server: Microsoft-IIS/7.5
X-Powered-By: ASP.NET
Date: Wed, 22 Jul 2015 15:06:22 GMT
Connection: close
And then if I use a Java agent string (which is my guess as to what you're using):
# curl -I -H "User-Agent: Java/1.6.0_26" "http://www.nasdaqomxnordic.com/"
HTTP/1.1 400 Bad Request
Content-Type: text/html; charset=UTF-8
Cache-Control: no-cache
Pragma: no-cache
Expires: 0
Connection: close
Only the "browser" user agent gets through. I'd try tweaking your code to set the user agent string to something commonly found in a web browser.
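As a hedged illustration, the snippet from the question could set a browser-like agent before connecting; the exact agent string below is arbitrary:

import java.net.HttpURLConnection;
import java.net.URL;

public class UserAgentCheck {
    public static void main(String[] args) throws Exception {
        URL u = new URL("http://www.nasdaqomxnordic.com/");
        HttpURLConnection huc = (HttpURLConnection) u.openConnection();
        huc.setRequestMethod("GET");
        // Replace Java's default "Java/1.x" agent, which the server appears to reject.
        huc.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
        huc.connect();
        System.out.println(huc.getResponseCode()); // expect 200 instead of 400
    }
}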

HTTP Get not using Port

I am trying to call a page in PHP with http_get:
$url = "http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446";
http_get($url, $appelOptions, $appelInfos);
My problem is that it does not work every time.
I installed Wireshark to see what I'm really sending and I found an odd thing. Sometimes, the port is not used for the HTTP request.
When it works, I have:
Hypertext Transfer Protocol
GET http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446 HTTP/1.1\r\n
Request Method: GET
Request URI: http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446
Request Version: HTTP/1.1
User-Agent: PECL::HTTP/1.6.5 (PHP/5.2.4-2ubuntu5.7)\r\n
Host: mysite.fr:9090\r\n
Pragma: no-cache\r\n
Accept: */*\r\n
Proxy-Connection: Keep-Alive\r\n
Keep-Alive: 300\r\n
Connection: keep-alive\r\n
Date: Fri, 15 Jun 2012 16:40:46 +0200\r\n
Accept-Charset: utf-8\r\n
Accept-Encoding: gzip;q=1.0,deflate;q=0.5\r\n
\r\n
And when it's not:
Hypertext Transfer Protocol
GET http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446 HTTP/1.1\r\n
Request Method: GET
Request URI: http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446
Request Version: HTTP/1.1
User-Agent: PECL::HTTP/1.6.5 (PHP/5.2.4-2ubuntu5.7)\r\n
Host: mysite.fr\r\n
Pragma: no-cache\r\n
Accept: */*\r\n
Proxy-Connection: Keep-Alive\r\n
Keep-Alive: 300\r\n
Connection: keep-alive\r\n
Date: Fri, 15 Jun 2012 16:40:34 +0200\r\n
Accept-Charset: utf-8\r\n
Accept-Encoding: gzip;q=1.0,deflate;q=0.5\r\n
\r\n
I tried to call the page with wget and it always works:
wget http://mysite.fr:9090/neolane-webservice/campagnesclient/Coclico=1135446
So I'm guessing that my problem is due to the Apache config, but I don't know where to look. Could you help me, please?
You will need to set the port in the $appelOptions array.
$appelOptions['port']=9090;
http_get($url, $appelOptions, $appelInfos);
Unfortunately, http_get does not seem to respect the :port syntax in the URL.

Why does my Twitter OAuth API call to update status fail, but other calls work?

This is the raw HTTP call that I make to verify authentication. It returns the expected response:
GET /1/account/verify_credentials.xml HTTP/1.1
Authorization: OAuth oauth_token="12556442-pndSo1mf2i1ToPSbAyLH4qBBDHmtyutjbvMLckGER",oauth_consumer_key="ih75ityikrTdIwB9kQ",oauth_nonce="6wIbdfxL",oauth_signature_method="HMAC-SHA1",oauth_signature="7DUW5TLtntryndfhU5dSXARg%3D",oauth_version="1.0",oauth_timestamp="1267805254"
Host: api.twitter.com
This is the call that I try to make, which is intended to update the user's status:
POST /1/statuses/update.xml HTTP/1.1
Authorization: OAuth oauth_token="1252356242-pndSo1mf2i1ToPSfghfghfQoMLckGER",oauth_consumer_key="ih75i83BXdfhnfghnfgQ",oauth_nonce="CJ9dfgXs",oauth_signature_method="HMAC-SHA1",oauth_signature="bSD7aXUdfghdfghfghfghoU%3D",oauth_version="1.0",oauth_timestamp="1267235407"
Content-Type: application/x-www-form-urlencoded
Host: api.twitter.com
Content-Length: 11
Connection: Keep-Alive
status=blah
The response that I get back from Twitter for this second request is as follows:
HTTP/1.1 401 Unauthorized
Date: Fri, 05 Mar 2010 16:17:18 GMT
Server: hi
Status: 401 Unauthorized
WWW-Authenticate: Basic realm="Twitter API"
Content-Type: application/xml; charset=utf-8
Content-Length: 135
Cache-Control: no-cache, max-age=1800
Set-Cookie: guest_id=12672352252251; path=/
Set-Cookie: _twitter_sess=BAh7CDoPY3JlYXRlZF9hdGsdgsdhdrhvdrthvdthd0%250ANDdkZTEyZjczZTY3ZGE4YmQ5IgpmbGFzaElDOidBY3Rpb25Db250cm9sbGVy%250AOjpGbGFzaDo6Rmxhc2hIYXNoewAGOgpAdXNlZHsA--0eb657ba0esdrvthdtdtgcdrtgc0ece8f1460; domain=.twitter.com; path=/
Expires: Fri, 05 Mar 2010 16:47:17 GMT
Vary: Accept-Encoding
Connection: close
<?xml version="1.0" encoding="UTF-8"?>
<hash>
<request>/1/statuses/update.xml</request>
<error>Incorrect signature</error>
</hash>
Any idea what could be going wrong?
Note, the OAuth tokens and stuff have been scrambled of course.
Solved:
Even though I had to make this call a POST request, the actual parameters couldn't be in the POST body. I put the status variable in the query string on the request and it worked fine.
I got a similar problem (401 Unauthorized) using PHP. I was following the examples in
http://www.snipe.net/2009/07/writing-your-first-twitter-application-with-oauth/
until I discovered that the parameters 'POST' and $data were swapped in the line
$content = $to->OAuthRequest('https://twitter.com/statuses/update.xml','POST',$data);
Maybe that is your case too? Maybe not; either way, I found that tutorial very useful.
POST parameters are fed into the hash that produces the signature, so you have to make sure they are included when you calculate it.
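To make that concrete, here is a hedged Java sketch of how the OAuth 1.0a HMAC-SHA1 signature base string is assembled; note that status=blah sits alongside the oauth_* parameters. All keys, secrets, and tokens are placeholders, and URLEncoder only approximates RFC 3986 percent-encoding:

import java.net.URLEncoder;
import java.util.Base64;
import java.util.Map;
import java.util.TreeMap;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class OAuthSignatureSketch {
    // Approximation of RFC 3986 percent-encoding (good enough for this sketch).
    static String enc(String s) throws Exception {
        return URLEncoder.encode(s, "UTF-8").replace("+", "%20");
    }

    public static void main(String[] args) throws Exception {
        // Sorted map = lexicographically ordered parameters, as the spec requires.
        TreeMap<String, String> params = new TreeMap<>();
        params.put("oauth_consumer_key", "CONSUMER_KEY");
        params.put("oauth_nonce", "NONCE");
        params.put("oauth_signature_method", "HMAC-SHA1");
        params.put("oauth_timestamp", "1267235407");
        params.put("oauth_token", "ACCESS_TOKEN");
        params.put("oauth_version", "1.0");
        params.put("status", "blah"); // the POST body parameter must be included here

        StringBuilder normalized = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (normalized.length() > 0) normalized.append('&');
            normalized.append(enc(e.getKey())).append('=').append(enc(e.getValue()));
        }
        String baseString = "POST&"
                + enc("http://api.twitter.com/1/statuses/update.xml")
                + "&" + enc(normalized.toString());

        // Signing key = encoded consumer secret + "&" + encoded token secret.
        String signingKey = enc("CONSUMER_SECRET") + "&" + enc("TOKEN_SECRET");
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(signingKey.getBytes("UTF-8"), "HmacSHA1"));
        String signature = Base64.getEncoder()
                .encodeToString(mac.doFinal(baseString.getBytes("UTF-8")));
        System.out.println("oauth_signature=" + enc(signature));
    }
}

If status is left out of the base string but still sent in the POST body, the server recomputes the signature with it included, the values no longer match, and you get exactly the "Incorrect signature" 401 shown above.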

Http protocol content-length

I am working on a simple download application. When requesting the following file, neither Firefox nor my application gets the Content-Length field. But if I make the request using wget, the server does send the Content-Length field. I changed wget's user-agent string to test, and it still got the Content-Length field.
Any ideas why this is happening?
wget request
---request begin---
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
User-Agent: test
Accept: */*
Host: media.defcon.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.0 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 04:40:08 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Content-Length: 104223909
Connection: keep-alive
firefox request
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Host: media.defcon.org
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; en-US; rv:1.9.0.8) Gecko/2009032608 Firefox/3.0.8
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://www.defcon.org/html/links/defcon-media-archives.html
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.x 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 05:20:12 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Transfer-Encoding: chunked
Update:
Is there a header I can send that will tell lighttpd not to use chunked encoding? My original problem is that I am using URLConnection to grab the file in my Java application, which automatically sends an HTTP/1.1 request.
I would like to know the size of the file so I can update my progress percentage.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Firefox is performing an HTTP/1.1 GET request. lighttpd understands that the client supports chunked transfer encoding and returns the content in chunks, with each chunk reporting its own length.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
wget, on the other hand, performs an HTTP/1.0 GET request. lighttpd, understanding that the client doesn't support HTTP/1.1 (and thus chunked transfer encoding), returns the content in a single piece, with the length reported in the response header.
Looks like it's because of the chunked transfer encoding:
Transfer-Encoding: chunked
This sends the video down in chunks, each reporting its own size. Chunked encoding is defined in HTTP/1.1, which is what Firefox is using, while wget uses HTTP/1.0, which doesn't support it, so the server has to send the whole file at once.
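As a hedged illustration of what that means for an HTTP/1.1 client: when the response is chunked there is no Content-Length header, so java.net.HttpURLConnection simply reports -1 (whether the server still answers this particular URL with chunked encoding is, of course, up to the server):

import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://media.defcon.org/dc-13/video/"
                + "2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // With "Transfer-Encoding: chunked" there is no Content-Length to report.
        System.out.println(conn.getHeaderField("Transfer-Encoding")); // e.g. "chunked"
        System.out.println(conn.getContentLengthLong());              // -1 when chunked
        conn.disconnect();
    }
}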
I was having the same problem and found a solution that works regardless of the HTTP version:
First, send a HEAD request to the server, which responds with just the HTTP headers and no content. These headers include the wanted Content-Length (size in bytes) of the file to download.
Then proceed with the GET request to download the file (the headers of the GET response fail to include Content-Length).
An Objective-C language example:
NSString *zipURL = @"http://1.bp.blogspot.com/_6-cw84gcURw/TRNb3PDWneI/AAAAAAAAAYM/YFCZP1foTiM/s1600/paragliding1.jpg";
NSURL *url = [NSURL URLWithString:zipURL];
// Configure the HTTP request for HEAD header fetch
NSMutableURLRequest *urlRequest = [NSMutableURLRequest requestWithURL:url];
urlRequest.HTTPMethod = @"HEAD"; // Default is "GET"
// Define response class
__autoreleasing NSHTTPURLResponse *response;
// Send HEAD request to server
NSData *contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Header response field
NSDictionary *headerDeserialized = response.allHeaderFields;
// The contents length
int contents_length = [(NSString*)headerDeserialized[@"Content-Length"] intValue];
//printf("HEAD Response header: %s\n",headerDeserialized.description.UTF8String);
printf("HEAD:\ncontentsData.length: %d\n",contentsData.length);
printf("contents_length = %d\n\n",contents_length);
urlRequest.HTTPMethod = @"GET";
// Send "GET" to download file
contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Header response field
headerDeserialized = response.allHeaderFields;
// The contents length
contents_length = [(NSString*)headerDeserialized[@"Content-Length"] intValue];
printf("GET Response header: %s\n",headerDeserialized.description.UTF8String);
printf("GET:\ncontentsData.length: %d\n",contentsData.length);
printf("contents_length = %d\n",contents_length);
return;
And the output:
HEAD:
contentsData.length: 0
contents_length = 146216
GET:
contentsData.length: 146216
contents_length = 146216
(Note: this example URL does correctly provide the Content-Length header in the GET response, but it shows the idea in case it didn't.)
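Since the original update mentions java.net.URLConnection, here is a hedged Java sketch of the same HEAD-then-GET idea; the URL is a placeholder, and it assumes the server includes Content-Length in the HEAD response:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ContentLengthProbe {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/big-file.mp4");

        // 1) HEAD request: headers only, no body is transferred.
        HttpURLConnection head = (HttpURLConnection) url.openConnection();
        head.setRequestMethod("HEAD");
        long total = head.getContentLengthLong(); // -1 if the server omits Content-Length
        head.disconnect();

        // 2) GET request: stream the body and report progress against the HEAD size.
        HttpURLConnection get = (HttpURLConnection) url.openConnection();
        try (InputStream in = get.getInputStream()) {
            byte[] buffer = new byte[8192];
            long downloaded = 0;
            int read;
            while ((read = in.read(buffer)) != -1) {
                downloaded += read;
                if (total > 0) {
                    System.out.printf("%.1f%%%n", 100.0 * downloaded / total);
                }
            }
        } finally {
            get.disconnect();
        }
    }
}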
