An ASP.NET MVC4 application is running on Mono under Apache with mod_mono.
Pages should be cached only in the browser or in proxy servers, to save server memory.
The browser fetches the product list from the server with this request:
Request URL:http://example.com/store/Store/Category/ANDRORI?root=1&open=True&total=2.76
Request Method:GET
Status Code:200 OK
Remote Address:1.2.3.4:80
The server returns the product list page with these headers:
HTTP/1.1 200 OK
Date: Wed, 06 Jan 2016 16:52:47 GMT
Server: Apache/2.2.22 (Debian)
X-AspNet-Version: 4.0.30319
Cache-Control: private
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 8668
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=utf-8
The Chrome Developer Tools page audit shows a warning about this page:
The following resources are explicitly non-cacheable. Consider making them cacheable if possible:
and the browser re-fetches the page from the server every time. The response contains
Cache-Control: private
so the page should be cached in the browser.
How can I fix this so that the page is cached on the client?
I tried adding this attribute to the controller:
[OutputCache(Location = System.Web.UI.OutputCacheLocation.Client, Duration = 20 * 60)]
but in this case the page is cached on the server and the query string parameters are ignored:
the same page is served for all query string values.
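Would an explicit VaryByParam help here? Something like the following is what I mean (a sketch only, untested on Mono; the action signature is my guess):
[OutputCache(Location = System.Web.UI.OutputCacheLocation.Client,
             Duration = 20 * 60,
             VaryByParam = "*")] // vary the cached copy by every query string parameter
public ActionResult Category(string id, int root, bool open, decimal total)
{
    // build and return the product list for the requested category (hypothetical body)
    return View();
}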
Related
I have a .svc service hosted on a Windows Server 2019 machine with IIS 10.
I'm trying to send a POST request with Postman, sending JSON with the following headers:
Postman-Token: <calculated when request is sent>
Content-Length: <calculated when request is sent>
Host: <calculated when request is sent>
Accept: */*
Content-Type: application/json
and this in the body:
{
"orderNumber": "12345",
"idTransaction":"1",
"authorization":"auth",
"amount":"14.95",
"paymentDate":"2020-01-01",
"currency":"euro",
"resultCode": "ok"
}
The service responds correctly and reads the JSON, but the status is "406 Not Acceptable".
As you can see, I set the Accept header to */*, which means every content type is accepted, and this is the response I get:
response headers:
Content-Type: application/json; charset=utf-8
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Mon, 04 Jan 2021 09:54:27 GMT
Content-Length: 69
response body
"Not found any payment whit numeroOrdine=12345."
The server/service reads the JSON and responds correctly (order number 12345 was an example and doesn't exist),
but with status code 406 Not Acceptable.
I also tried adding the header Accept-Charset: UTF-8 to the request, but the situation doesn't change.
I checked IIS on the server, and in the MIME types list I can see that application/json is listed:
(screenshot: IIS MIME type list)
Maybe I have to enable it somewhere else, or specify the MIME type in the web.config file?
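Something like this is what I had in mind, just a guess (I am not sure it even applies to a .svc response):
<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".json" mimeType="application/json" />
  </staticContent>
</system.webServer>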
Or could it be something else?
Thanks in advance.
Context
My GitHub Pages site is not refreshing. After diagnosing, my conclusion is that it's a server-side caching effect.
What I did + diagnostic results
The site is working OK.
I made a change in index.html in my local repo, then committed and pushed.
I completely cleared my browser cache (also using cache-clearing plugins, and with Chrome dev tools set to bypass the cache).
Reloaded the page with Ctrl+F5 and Ctrl+R (the change is not applied).
Checked index.html via github.com: the change is there, committed.
Monitored the traffic with Fiddler. The request for index.html is sent and a full response is received, but the content is the old, unchanged version.
Examined the response headers with Fiddler (see the header exhibit below).
Reverse diagnostic
I issued a request with the usual trick, typing index.html?v001orAnythingYouWant, and I got the new version of the page.
Problem
One could say the problem is solved, but it is not: when I update images, CSS, and JS, this effect will still prevent me from seeing the new result.
Question
How can I configure around or overcome this server-side caching, of course only for development/testing time?
Response header exhibit
HTTP/1.1 200 OK
Server: GitHub.com
Content-Type: text/html; charset=utf-8
Last-Modified: Fri, 06 May 2016 12:24:29 GMT
Access-Control-Allow-Origin: *
Expires: Fri, 06 May 2016 12:45:44 GMT
Cache-Control: max-age=600
X-GitHub-Request-Id: B91F111E:5AA6:47804:572C8F9F
Content-Length: 43752
Accept-Ranges: bytes
Date: Fri, 06 May 2016 12:35:57 GMT
Via: 1.1 varnish
Age: 13
Connection: keep-alive
X-Served-By: cache-fra1238-FRA
X-Cache: HIT
X-Cache-Hits: 1
Vary: Accept-Encoding
X-Fastly-Request-ID: 1758f53052edbfb40a0044407d53d5654ad1e983
I have made a request for a video, which returns the video with an ETag.
When I request the same video again, I can see the If-None-Match header sent from the browser with the ETag, but instead of a 304 being returned, the video is downloaded again with a 200 OK response.
In Fiddler, for the very first request for the video, the response is:
HTTP/1.1 200 OK
Cache-Control: max-age=10
Content-Length: 76278442
Content-Type: video/mp4
Last-Modified: Wed, 21 Aug 2013 08:47:29 GMT
ETag: "2117329216"
Server: Microsoft-IIS/7.5
X-Mod-H264-Streaming: version=2.2.7
X-Powered-By: ASP.NET
Date: Fri, 23 Aug 2013 21:20:34 GMT
On the second request, the GET headers are:
GET http://test/video.mp4 HTTP/1.1
Accept: */*
Accept-Language: en-GB
x-flash-version: 11,8,800,94
Accept-Encoding: gzip, deflate
If-Modified-Since: Wed, 21 Aug 2013 08:47:29 GMT
If-None-Match: "2117329216"
Connection: Keep-Alive
But in this case I get the whole video downloaded rather than a 304 Not Modified response.
I noticed that X-Mod-H264-Streaming is being used; I'm not sure whether this has something to do with it.
Edit
I used the URL of the video directly in IE 10 (not through the Flex application we were using before) and I get the same behaviour: on the first request I get the complete video, and after hitting F5 the whole video is returned again rather than a 304 response.
I have a script on GAE that requests an XML feed from a partner; it's typically 40 MB, but only 5 MB gzipped. GAE is automatically unzipping this content and throwing an error that the response is too big:
HTTP response was too large: 46677241. The limit is: 33554432.
The script is set up to uncompress the response itself. How do I prevent GAE from getting in the way and breaking it?
Here's the response header from my partner:
HTTP/1.0 200 OK
Expires: Wed, 27 Jun 2012 05:42:07 GMT
Cache-Control: max-age=10368000
Content-Type: application/x-gzip
Accept-Ranges: bytes
Last-Modified: Wed, 22 Feb 2012 11:06:09 GMT
Content-Length: 5263323
Date: Tue, 28 Feb 2012 05:42:07 GMT
Server: lighttpd
X-Cache: MISS from static01
X-Cache-Lookup: MISS from static01:80
Via: 1.0 static01:80 (squid)
Most likely your partner's server responds with plain XML because it thinks the HTTP client sending the requests (i.e. the GAE URL Fetch service) does not support gzip. Hence the "response was too large" error.
To announce that you actually want to receive gzipped content, you need to set the Accept-Encoding: gzip header when using the URL Fetch service.
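For example, in the Java runtime, where java.net.URL is backed by the URL Fetch service, something along these lines should do it. This is a sketch with a placeholder feed URL, assuming URL Fetch passes the compressed body through once the header is set explicitly:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPInputStream;

public class FeedFetch {
    public static void main(String[] args) throws Exception {
        // Placeholder for the partner's feed URL
        URL feed = new URL("http://partner.example.com/feed.xml.gz");
        HttpURLConnection conn = (HttpURLConnection) feed.openConnection();
        // Announce gzip support so the ~5 MB compressed body is returned as-is
        conn.setRequestProperty("Accept-Encoding", "gzip");
        // Decompress ourselves, streaming, so the ~40 MB of XML never hits
        // the 32 MB URL Fetch response limit in uncompressed form
        try (InputStream xml = new GZIPInputStream(conn.getInputStream())) {
            byte[] buf = new byte[8192];
            long total = 0;
            for (int n; (n = xml.read(buf)) != -1; ) {
                total += n; // hand buf[0..n) to the XML parser here
            }
            System.out.println("Uncompressed bytes: " + total);
        }
    }
}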
I am working on a simple download application. When requesting the following file, neither Firefox nor my application gets the Content-Length field, but if I make the request using wget, the server does send Content-Length. I changed wget's user-agent string (to "test") and it still got the Content-Length field.
Any ideas why this is happening?
wget request
---request begin---
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
User-Agent: test
Accept: */*
Host: media.defcon.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.0 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 04:40:08 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Content-Length: 104223909
Connection: keep-alive
firefox request
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Host: media.defcon.org
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; en-US; rv:1.9.0.8) Gecko/2009032608 Firefox/3.0.8
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://www.defcon.org/html/links/defcon-media-archives.html
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.x 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 05:20:12 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Transfer-Encoding: chunked
Update:
Is there a header I can send that will tell lighttpd not to use chunked encoding? My original problem is that I am using URLConnection to grab the file in my Java application, which automatically sends an HTTP 1.1 request.
I would like to know the size of the file so I can update my progress percentage.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Firefox is performing an HTTP 1.1 GET request. lighttpd understands that the client supports chunked transfer encoding and returns the content in chunks, with each chunk reporting its own length.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
wget, on the other hand, performs an HTTP 1.0 GET request. lighttpd, understanding that the client doesn't support HTTP 1.1 (and thus chunked transfer encoding), returns the content in one piece, with the length reported in the response header.
Looks like it's because of the chunked transfer encoding:
Transfer-Encoding: chunked
This will send the video down in chunks, each with its own size. This is defined in HTTP 1.1, which is what Firefox is using, while wget is using HTTP 1.0, which doesn't support chunked transfer encoding, so the server has to send the whole file at once.
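For reference, a chunked body is a series of chunks, each prefixed with its size in hex and terminated by a zero-length chunk. A minimal example exchange looks like:
HTTP/1.1 200 OK
Content-Type: text/plain
Transfer-Encoding: chunked

7
Mozilla
9
Developer
7
Network
0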
I was having the same problem and found a solution that works regardless of the HTTP version:
First send a HEAD request, to which the server correctly responds with just the HTTP headers and no content. These headers correctly include the wanted Content-Length: the size in bytes of the file to download.
Then proceed with the GET request to download the file (here the headers of the GET response fail to include Content-Length).
An Objective-C example:
NSString *zipURL = @"http://1.bp.blogspot.com/_6-cw84gcURw/TRNb3PDWneI/AAAAAAAAAYM/YFCZP1foTiM/s1600/paragliding1.jpg";
NSURL *url = [NSURL URLWithString:zipURL];
// Configure the HTTP request for the HEAD header fetch
NSMutableURLRequest *urlRequest = [NSMutableURLRequest requestWithURL:url];
urlRequest.HTTPMethod = @"HEAD"; // Default is "GET"
// Response object, filled in by the connection
__autoreleasing NSHTTPURLResponse *response;
// Send the HEAD request to the server (no body is returned)
NSData *contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Response header fields
NSDictionary *headerDeserialized = response.allHeaderFields;
// The content length advertised by the server
int contents_length = [(NSString *)headerDeserialized[@"Content-Length"] intValue];
//printf("HEAD Response header: %s\n", headerDeserialized.description.UTF8String);
printf("HEAD:\ncontentsData.length: %d\n", (int)contentsData.length);
printf("contents_length = %d\n\n", contents_length);
urlRequest.HTTPMethod = @"GET";
// Send the GET request to download the file
contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Response header fields
headerDeserialized = response.allHeaderFields;
// The content length
contents_length = [(NSString *)headerDeserialized[@"Content-Length"] intValue];
printf("GET Response header: %s\n", headerDeserialized.description.UTF8String);
printf("GET:\ncontentsData.length: %d\n", (int)contentsData.length);
printf("contents_length = %d\n", contents_length);
return;
And the output:
HEAD:
contentsData.length: 0
contents_length = 146216
GET:
contentsData.length: 146216
contents_length = 146216
(Note: this example URL does correctly provide the Content-Length header in the GET response, but it shows the idea for the case where it does not.)
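For completeness, since the original question used Java's URLConnection, here is the same HEAD-first idea sketched in Java (untested; standard library calls only):

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ContentLengthProbe {
    public static void main(String[] args) throws IOException {
        // The file from the question; a HEAD request returns headers only, no body
        URL url = new URL("http://media.defcon.org/dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4");
        HttpURLConnection head = (HttpURLConnection) url.openConnection();
        head.setRequestMethod("HEAD");
        long size = head.getContentLengthLong(); // -1 if the server omits Content-Length
        head.disconnect();
        System.out.println("Content-Length from HEAD: " + size);
        // Then issue the normal GET and use size to drive the progress percentage
    }
}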