Asp.Net Sending PDF to browser

I've been trying to get this aspx page to serve up a pdf. It works correctly in Firefox, but IE gives
Internet Explorer cannot download getform.aspx from SERVER_NAME
Internet Explorer was not able to open this Internet site. The requested site is either unavailable or cannot be found.
This is the general shape of my code. It's spread across multiple functions (which is why we're not using WriteFile - sometimes we generate the PDF on the fly), but it boils down to this:
FileStream fs = File.Open(Path.Combine(PdfBasePath, "form.pdf"), FileMode.Open, FileAccess.Read);
Stream output = Response.OutputStream;
byte[] buffer = new byte[BUFFER_SIZE];
int read_count = fs.Read(buffer, 0, BUFFER_SIZE);
while (read_count > 0)
{
    output.Write(buffer, 0, read_count);
    read_count = fs.Read(buffer, 0, BUFFER_SIZE);
}
fs.Close();
Response.Clear();
Response.ContentType = System.Net.Mime.MediaTypeNames.Application.Pdf;
Response.AddHeader("Content-Disposition", "attachment; filename=form.pdf");
Response.Output.Flush();
Response.End();
Looking at Fiddler, the page is being fetched using this:
GET /getform.aspx?Failure=Y&r=someencryptedstring HTTP/1.1
It is being returned to the browser thus:
HTTP/1.1 200 OK
Date: Thu, 09 Apr 2009 22:08:33 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
Pragma: no-cache
Content-Disposition: attachment; filename=form.pdf
Cache-Control: no-cache, no-store
Pragma: no-cache
Expires: -1
Content-Type: application/pdf
Content-Length: 628548
This is really bugging me. I'm not using SSL, otherwise this KB article would seem to apply. Anyone have any ideas?

Is the Content-Length being returned in the header actually correct for the file you're sending? I'm just comparing this to some production code we use here and it looks like we explicitly set the Content-Length header. If I recall correctly, some browsers have a problem if the header and the actual file size don't match.
Edit
The question author found that changing the Content-Type to application/download instead of application/pdf seems to work around the problem.
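For completeness, here is a minimal sketch of the earlier suggestion (setting Content-Length explicitly and sending all headers before streaming the body), using PdfBasePath and BUFFER_SIZE from the question; this is only the suggestion expressed in code, not the asker's eventual workaround:
// Sketch: send all headers, including an explicit Content-Length, before writing the body
string path = Path.Combine(PdfBasePath, "form.pdf");
Response.Clear();
Response.ContentType = System.Net.Mime.MediaTypeNames.Application.Pdf;
Response.AddHeader("Content-Disposition", "attachment; filename=form.pdf");
Response.AddHeader("Content-Length", new FileInfo(path).Length.ToString());
using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[BUFFER_SIZE];
    int read_count;
    while ((read_count = fs.Read(buffer, 0, BUFFER_SIZE)) > 0)
        Response.OutputStream.Write(buffer, 0, read_count);
}
Response.Flush();
Response.End();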

Related

Include text header in Image HTTP response Jetty

I'm building a simple server program that needs to return both an image and some text in the response; however, I'm having an issue with Jetty. The text should be included in the headers of the HTTP response, but it isn't.
Here's the code to return the image:
override fun doPost(request: HttpServletRequest, response: HttpServletResponse) {
    response.contentType = "image/png"
    response.status = HttpServletResponse.SC_OK
    val diff = ImgDiff.getDifference("img1", "img2", tolerance)
    //response.writer.println(diff.toString())
    ImageIO.write(ImageIO.read(File("diffedFile.png")), "PNG", response.outputStream)
    response.addHeader("diff", diff.toString())
}
This works fine; however, the response headers don't contain diff. When I comment out the ImageIO line, uncomment the println line above it, and change the content type to text/plain, diff is included in the headers.
The headers with the image:
Date: Mon, 13 May 2019 22:03:35 GMT
Content-Type: image/png
Transfer-Encoding: chunked
Server: Jetty(9.4.18.v20190429)
The headers without the image (the text/plain case described above):
Date: Mon, 13 May 2019 22:10:32 GMT
Content-Type: text/plain;charset=iso-8859-1
diff: 62.62626262626263
Content-Length: 19
Server: Jetty(9.4.18.v20190429)
Am I doing something wrong with Jetty? Can an HTTP response that contains an image not carry custom headers? I realize I could just return a zip file containing the image and the text, but that seems like overkill. Am I missing something fundamental about HTTP responses? Please let me know.
It seems to work if I add the headers before I write the image into the stream, presumably because the response gets committed once enough of the image has been written, and headers added after that point are ignored.
override fun doPost(request: HttpServletRequest, response: HttpServletResponse) {
    response.contentType = "image/png"
    response.status = HttpServletResponse.SC_OK
    val diff = ImgDiff.getDifference("img1", "img2", tolerance)
    response.addHeader("diff", diff.toString())
    ImageIO.write(ImageIO.read(File("diffedFile.png")), "PNG", response.outputStream)
}

How to parse chunked HTTP content with Lua on nodemcu?

I have a script which communicates between the NodeMCU and my server. It works fine against my localhost and parses the response retrieved from my server when I send a GET request. The problem starts when I upload everything to my website, where the transfer encoding is chunked. I am not able to retrieve the content, although the request is legitimate and correct. The code is written in Lua and runs on my NodeMCU device.
conn = net.createConnection(net.TCP, 0)
conn:on("connection", function(conn, payload)
    conn:send("GET /mypath/node.php?id=1&update"..
              " HTTP/1.1\r\n"..
              "Host: www.mydomain.com\r\n"..
              "Accept: */*\r\n"..
              "User-Agent: Mozilla/4.0 (compatible; esp8266 Lua;)"..
              "\r\n\r\n")
end)
conn:on("receive", function(conn, payload)
    if string.find(payload, "UPDATE") ~= nil then
        node.restart()
    end
    conn:close()
    conn = nil
end)
conn:connect(80, "www.mydomain.com")
Just to repeat: this GET request works and has been tested manually and on localhost. The only problem is with the chunked content; I don't know how to parse it.
Update: I managed to get rid of the chunked encoding by changing HTTP/1.1 to HTTP/1.0, but I still have a problem.
Using this code
conn:on("receive", function(conn, payload)
print(payload)
I get this response
HTTP/1.1 200 OK
Date: Tue, 09 Jan 2018 02:34:25 GMT
Server: Apache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Set-Cookie: PHPSESSID=9m226vr20r4baa634bagk8k2k3; path=/
Connection: close
Content-Type: text/html; charset=utf-8
Update 2:
I created a file http.php containing just the text "php" and uploaded it both to localhost and to my domain. Then I accessed it from the NodeMCU, first on localhost and then on the domain. The results were different.
This is the request
conn:send("GET /"..s.path.."/http.php"..
" HTTP/1.0\r\n"..
"Host: "..s.domain.."\r\n"..
"Accept: */*\r\n"..
"User-Agent: Mozilla/4.0 (compatible; esp8266 Lua;)"..
"\r\n\r\n")
end)
s.domain and s.path correspond to the appropriate domain and path on localhost and on my domain.
Result on domain
HTTP/1.1 200 OK
Date: Tue, 09 Jan 2018 03:09:28 GMT
Server: Apache
Connection: close
Content-Type: text/html; charset=UTF-8
Result on localhost
HTTP/1.1 200 OK
Date: Tue, 09 Jan 2018 03:08:48 GMT
Server: Apache/2.4.27 (Win64) PHP/7.0.23
X-Powered-By: PHP/7.0.23
Content-Length: 3
Connection: close
Content-Type: text/html; charset=UTF-8
php
As you can see, localhost shows the content "php", while the domain shows only the headers. When I request a file which does not exist, the domain does show me HTML content.
I'm using the following code to put the packets together. In any case, I'm wondering why the response from your server is missing the Content-Length header.
conn:on("receive", function(client, payload)
-- Inspired by https://github.com/marcoskirsch/nodemcu-httpserver/blob/master/httpserver.lua
-- Collect data packets until the size of HTTP body meets the Content-Length stated in header
if payload:find("Content%-Length:") or bBodyMissing then
if fullPayload then fullPayload = fullPayload .. payload else fullPayload = payload end
if (tonumber(string.match(fullPayload, "%d+", fullPayload:find("Content%-Length:")+16)) > #fullPayload:sub(fullPayload:find("\r\n\r\n", 1, true)+4, #fullPayload)) then
bBodyMissing = true
return
else
payload = fullPayload
fullPayload, bBodyMissing = nil
end
end
if (bBodyMissing == nil) then
local _, headerEnd = payload:find("\r\n\r\n")
local body = payload:sub(headerEnd + 1)
print (body)
end
end)

Download of xls working in IE/Firefox but named as aspx page in Chrome

I'm having an issue with Chrome renaming an exported file to the name of the page the export is initiated from. I've gone through all the related forum posts I could find and have tried their suggestions; I'm not seeing any posts from within the past couple of years.
My code for the export (before any modification attempts) is as follows:
public static void ExportToSpreadsheet(object items, string name)
{
    HttpResponse response = HttpContext.Current.Response;
    response.Clear();
    response.ClearHeaders();
    response.Charset = "";
    response.ContentType = "application/vnd.ms-excel";
    response.AddHeader("Context-Disposition", "attachment;filename=\"" + name + "\"");
    using (StringWriter sw = new StringWriter())
    {
        using (System.Web.UI.HtmlTextWriter htw = new System.Web.UI.HtmlTextWriter(sw))
        {
            System.Web.UI.WebControls.DataGrid dg = new System.Web.UI.WebControls.DataGrid();
            dg.DataSource = items;
            dg.DataBind();
            dg.RenderControl(htw);
            response.Write(sw.ToString());
            response.End();
        }
    }
}
The above code works flawlessly in IE/Firefox. The resulting response headers (per Chrome net-internals) are:
Cache-Control: private
Content-Type: application/vnd.ms-excel
Server: Microsoft-IIS/8.0
Context-Disposition: attachment;filename="DataExport.xls"
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?QzpcTXlfRGF0YVxBVFNNXERldjQwXHBvcnRhbFxDR0lfQXV0b21hdGlvbl9GcmFtZXdvcmtcQXV0b21hdGlvbl9EZWNrLmFzcHg=?=
X-Powered-By: ASP.NET
Date: Fri, 05 Dec 2014 19:37:05 GMT
Content-Length: 86493
I've tried several updates, including hard-coding the filename to "test.xls", clearing the content of the response, setting Buffer to true, setting CacheControl and Pragma to no-cache, and overriding VerifyRenderingInServerForm() in the page's code-behind. The response headers after applying all these changes are as follows:
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/vnd.ms-excel
Expires: -1
Server: Microsoft-IIS/8.0
Context-Disposition: attachment; filename="test.xls"
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?QzpcTXlfRGF0YVxBVFNNXERldjQwXHBvcnRhbFxDR0lfQXV0b21hdGlvbl9GcmFtZXdvcmtcQXV0b21hdGlvbl9EZWNrLmFzcHg=?=
X-Powered-By: ASP.NET
Date: Fri, 05 Dec 2014 19:10:23 GMT
Content-Length: 86493
This still produces the same result: Chrome does not recognize the filename and falls back to the page name as the downloaded filename.
Any recommendations for how to fix this issue would be greatly appreciated.
I'm using Chrome Version 39.0.2171.71 m
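One thing worth noting for comparison: the standard header name is Content-Disposition, while the code and both header captures above show Context-Disposition. A minimal sketch of the conventional attachment headers (reusing the filename from the first capture) would be:
// Sketch: conventional export headers; note the header name is "Content-Disposition"
HttpResponse response = HttpContext.Current.Response;
response.Clear();
response.ClearHeaders();
response.ContentType = "application/vnd.ms-excel";
response.AddHeader("Content-Disposition", "attachment; filename=\"DataExport.xls\"");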

MVC 3 client caching

I am trying to make modifications to an existing CDN. What I am trying to do is create a short cache time and use conditional GETs to see if the file has been updated.
I am tearing my hair out because even though I am setting a last-modified date and seeing it in the response headers, on subsequent GET requests I am not seeing an If-Modified-Since header being sent. At first I thought it was my local development environment, or the fact that I was using Fiddler as a proxy for testing, so I deployed to a QA server. But what I see in Firebug is quite different from what my code does: the last-modified date is there, yet for some reason Cache-Control is set to private. I have cleared any output-caching headers, and the only header IIS 7.5 is configured to add is the one enabling HTTP keep-alive, so all the caching should be driven by the code.
This seemed like such a no-brainer, yet I've been adding and removing headers all day with no luck. I checked global.asax and everywhere else (I didn't write the app, so I was looking for hidden surprises) and am stumped. Below are the current code and the request and response headers. I have the expiration set to 30 seconds just for testing purposes. I have looked at several samples and don't see myself doing anything different, but it simply won't work.
Response Headers
Cache-Control: private, max-age=30
Content-Length: 597353
Content-Type: image/jpg
Date: Tue, 03 Sep 2013 21:33:55 GMT
Expires: Tue, 03 Sep 2013 21:34:25 GMT
Last-Modified: Tue, 03 Sep 2013 21:33:55 GMT
Server: Microsoft-IIS/7.5
X-AspNet-Version: 4.0.30319
X-AspNetMvc-Version: 3.0
X-Powered-By: ASP.NET
Request Headers
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.5
Connection: keep-alive
Cookie: __utma=1.759556114.1354835397.1377631052.1377732484.36; __utmz=1.1354835397.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none)
Host: hqat4app1
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:20.0) Gecko/20100101 Firefox/20.0
The caching is set with:
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetLastModified(DateTime.Now);
return new FileContentResult(fileContents, contentType);
The relevant code is:
public ActionResult Resize(int id, int size, bool grayscale)
{
    _logger.Debug(() => string.Format("Resize {0} {1} {2}", id, size, grayscale));
    string imageFileName = null;
    if (id > 0)
        using (new UnitOfWorkScope())
            imageFileName = RepositoryFactory.CreateReadOnly<Image>().Where(o => o.Id == id).Select(o => o.FileName).SingleOrDefault();
    CacheImageSize(id, size);
    if (!ImageWasModified(imageFileName))
    {
        Response.Cache.SetExpires(DateTime.Now.AddSeconds(30));
        Response.StatusCode = (int)HttpStatusCode.NotModified;
        Response.Status = "304 Not Modified";
        return new HttpStatusCodeResult((int)HttpStatusCode.NotModified, "Not-Modified");
    }
    byte[] fileContents;
    if (ShouldReturnDefaultImage(imageFileName))
        fileContents = GetDefaultImageContents(size, grayscale);
    else
    {
        bool foundImageFile;
        fileContents = GetImageContents(id, size, grayscale, imageFileName, out foundImageFile);
        if (!foundImageFile)
        {
            // No file found, clear cache, disable output cache
            //ClearOutputAndRuntimeCacheForImage(id, grayscale);
            //Response.DisableKernelCache();
        }
    }
    string contentType = GetBestContentType(imageFileName);
    Response.Cache.SetCacheability(HttpCacheability.Public);
    Response.Cache.SetLastModified(DateTime.Now);
    return new FileContentResult(fileContents, contentType);
}

private bool ImageWasModified(string fileName)
{
    bool foundImageFile;
    string filePath = GetFileOrDefaultPath(fileName, out foundImageFile);
    if (foundImageFile)
    {
        string header = Request.Headers["If-Modified-Since"];
        if (!string.IsNullOrEmpty(header))
        {
            DateTime isModifiedSince;
            if (DateTime.TryParse(header, out isModifiedSince))
            {
                return isModifiedSince < System.IO.File.GetLastWriteTime(filePath);
            }
        }
    }
    return true;
}
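For reference, a minimal sketch of the HttpCachePolicy calls that map onto the headers being aimed for here (a public 30-second cache window plus a Last-Modified validator); lastWriteTime is a placeholder for something like System.IO.File.GetLastWriteTime(filePath), and this is not the app's actual code:
// Sketch only: advertise a short public cache window and a Last-Modified validator
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetMaxAge(TimeSpan.FromSeconds(30));
Response.Cache.SetExpires(DateTime.Now.AddSeconds(30));
Response.Cache.SetLastModified(lastWriteTime);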

HttpClient request to local IIS 8.0 does not produce expected headers in the response

I'm making the following request to a local website running in IIS
var httpRequestMessage = new HttpRequestMessage();
httpRequestMessage.RequestUri = new Uri("http://localhost:8081/");
httpRequestMessage.Method = HttpMethod.Get;
var response = new HttpClient().SendAsync(httpRequestMessage).Result;
This produces the following response headers:
HTTP/1.1 200 OK
Accept-Ranges: bytes
Date: Mon, 03 Jun 2013 22:34:25 GMT
ETag: "50c7472eb342ce1:0"
Server: Microsoft-IIS/8.0
X-Powered-By: ASP.NET
An identical request made via Fiddler produces the following response headers (the differences are the added Content-Type, Last-Modified, and Content-Length):
HTTP/1.1 200 OK
Content-Type: text/html
Last-Modified: Fri, 26 Apr 2013 19:20:58 GMT
Accept-Ranges: bytes
ETag: "50c7472eb342ce1:0"
Server: Microsoft-IIS/8.0
X-Powered-By: ASP.NET
Date: Mon, 03 Jun 2013 22:29:34 GMT
Content-Length: 10
Why is there a difference in response headers?
Am I using HttpClient correctly (aside from the fact I am calling Send synchronously)?
TL;DR;
To access all response headers you need to read both HttpResponseMessage.Headers and HttpResponseMessage.Content.Headers properties.
Long(er) answer:
This, basically:
var response = new HttpClient().GetAsync("http://uri/").Result;
var allHeaders = response.Headers.Union(response.Content.Headers);
foreach (var header in allHeaders)
{
    // do stuff
}
I see two issues with this:
The Headers property is not appropriately named: it should really be SomeHeaders or AllHeadersExceptContentHeaders. (I mean, really, when you see a property named Headers, do you expect it to return all headers or some headers? I am pretty sure they are in violation of their own framework design guidelines on this one.)
The MSDN page does not mention at any point the fact this is a subset of all headers and developers should also inspect Content.Headers.
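As a concrete illustration (a sketch using the localhost URL from the question), the content-related headers from the Fiddler capture are exposed through response.Content.Headers, while headers such as Server and Date come from response.Headers:
var response = new HttpClient().GetAsync("http://localhost:8081/").Result;
// Response-level headers (Server, Date, ...) live on response.Headers
Console.WriteLine(response.Headers.Server);
Console.WriteLine(response.Headers.Date);
// Content-Type, Content-Length and Last-Modified live on response.Content.Headers
Console.WriteLine(response.Content.Headers.ContentType);
Console.WriteLine(response.Content.Headers.ContentLength);
Console.WriteLine(response.Content.Headers.LastModified);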
