Response.AddHeader - asp.net-2.0

When is Response.AddHeader used?

It's used to add additional HTTP headers to your response; read the previous link if you're unfamiliar with what an HTTP header is used for.
Most of the time, you'll end up setting headers indirectly, using other ASP.NET objects or methods like Response.Cookies or Response.Redirect. However, there are advanced, relatively unusual scenarios where it's sometimes necessary to call Response.AddHeader() directly in your code.
For example, to cause an HTTP 301 (permanent) redirect in ASP.NET 3.5, you'd need to call Response.AddHeader directly, with code like this:
<script runat="server">
    private void Page_Load(object sender, System.EventArgs e)
    {
        Response.Status = "301 Moved Permanently";
        Response.AddHeader("Location", "/newpage.aspx");
    }
</script>
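An equivalent sketch, if you prefer setting the numeric status code and description separately (same hypothetical /newpage.aspx target), would be:
private void Page_Load(object sender, System.EventArgs e)
{
    // Set the status line numerically instead of through Response.Status.
    Response.StatusCode = 301;
    Response.StatusDescription = "Moved Permanently";
    Response.AddHeader("Location", "/newpage.aspx");
    Response.End(); // stop further processing so no page body is rendered
}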

To build on Justin Grant's answer with just one example: if you want to output Excel, you can do the following:
Response.ContentType = "application/vnd.ms-excel";
Response.AppendHeader("content-disposition", "attachment;filename=test.xls");
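As a rough sketch of how those two headers fit into a complete click handler (the button name and the inline HTML table are made up for illustration; older versions of Excel will open a plain HTML table saved as .xls):
protected void btnExportExcel_Click(object sender, EventArgs e)
{
    Response.Clear();
    Response.ContentType = "application/vnd.ms-excel";
    Response.AppendHeader("content-disposition", "attachment;filename=test.xls");
    // Any tabular HTML works as a quick-and-dirty .xls payload.
    Response.Write("<table><tr><td>Product</td><td>Qty</td></tr><tr><td>Widget</td><td>3</td></tr></table>");
    Response.End();
}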

Related

ASP.NET Web Forms Incorrect Values Being Displayed/Saved

Recently, we added SSL to an application I support and it broke a "Print" button with issues similar to what is described here.
Without the SSL, the button would produce a PDF report with Open and Save prompts. Once SSL was implemented and we made changes to the MasterPage HTTP Cacheability, the Print button worked again.
Our master page had the following code in the Page_Load:
protected void Page_Load(object sender, EventArgs e)
{
    Response.Buffer = true;
    Response.ExpiresAbsolute = DateTime.Now;
    Response.Expires = 0;
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
}
and it was changed/implemented as:
protected void Page_Load(object sender, EventArgs e)
{
    Response.Buffer = true;
    Response.Expires = -1;
    Response.Cache.SetNoStore();
}
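If you need to be more aggressive about disabling caching, a sketch that combines both approaches (the exact headers emitted can vary slightly by IIS version) would be:
protected void Page_Load(object sender, EventArgs e)
{
    Response.Buffer = true;
    // Ask for Cache-Control: no-cache, no-store plus an already-expired date
    // so neither the browser nor intermediate proxies reuse the page.
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.Cache.SetNoStore();
    Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));
}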
The print button Click event has the following code:
protected void btnPrint_Click(object sender, EventArgs e)
{
    try
    {
        ClearMessage();
        BusinessObject _obj = GetBusinessObject();
        Id = _obj.BusinessObjectId;
        string strId = Id.ToString();
        Warning[] warnings;
        string[] streamIds;
        string encoding = string.Empty;
        string mimetype = string.Empty;
        string extension = string.Empty;
        ReportViewer viewer = new ReportViewer()
        {
            ProcessingMode = ProcessingMode.Remote
        };
        viewer.ServerReport.ReportPath = "/ReportPath/ReportFile";
        ReportParameterCollection objParams = new ReportParameterCollection();
        objParams.Add(new ReportParameter("ObjId", strId));
        viewer.ServerReport.SetParameters(objParams);
        byte[] bytes = viewer.ServerReport.Render("PDF", null, out mimetype, out encoding, out extension, out streamIds, out warnings);
        Response.Buffer = true;
        Response.Clear();
        Response.ContentType = mimetype;
        Response.AddHeader("content-disposition", "attachment; filename=" + GetBusinessObject().ObjNumber + " Object Summary." + extension);
        Response.BinaryWrite(bytes);
        Response.End();
    }
    catch (Exception ex)
    {
        ClearMessage();
        ErrorEmail("AppError", "btnPrint_Click", ex);
    }
}
I thought everything was great, but it looks like it may have caused even more problems: now I have reports of incorrect data being displayed between pages (information from one record being displayed for another).
The cacheability changes were the only thing that changed in the code when the SSL was implemented and we have not had any reports of this happening before (the application has been live for over a year now). Furthermore, I had my backup programmer revert the changes in our production branch code, publish to our staging server, and try to replicate the issue...they could not.
UPDATE
I made the following change to the sitewide web.config and the problem seems to have disappeared. Can anyone explain why this worked but the code did not?
<system.webServer>
    <staticContent>
        <clientCache cacheControlMode="DisableCache" />
    </staticContent>
</system.webServer>
UPDATE
Setting <clientCache cacheControlMode="DisableCache" /> did not fix the issue and we were still experiencing the problem intermittently. See the accepted answer for the solution that worked.
So it turns out that the issue was really with the SSL certs being installed improperly. The basic setup is we have a load balancer and SSL certs terminating on 2 web servers.
The incorrect values were being displayed/saved because the load balancer could not read the cookie (since it was encrypted) to determine which server to direct to, and since we use session variables, they were not being cleared properly.
Our stop-gap fix was to switch the routing decision to IP based. As you can imagine, this causes all users behind one firewall to direct to the same server and defeats the purpose of the load balancer.
Our permanent solution is to put a 3rd SSL cert on the load balancer, decrypt the cookie to determine which server to direct to, encrypt the data once more and then send it off to the server.
Hopefully this helps someone else that runs into the same problem.

Response.WriteFile & Response.Redirect

I'm publishing a page URL, e.g. mysite.com/disclaimer/someinfo.
This page shows an agreement, and if the user clicks the agree button, a PDF file is streamed as an attachment.
The purpose of this is that the user can then share the URL, but anyone else following the link must also read the agreement before downloading the file. The actual file does not have a URL of its own; it is held elsewhere on the server, so the only way to download it is to click the "[Agree]" button.
However, what I am finding difficult is that after clicking the button and starting the download, the page with the buttons stays on the screen, and I'd like it to "go away". Ideally the user would be redirected back to the page they came from. I'm keen not to require JavaScript or cookies. The referrer is stored in a hidden field on the form when the page first loads.
I've tried different approaches; attempting to write a 307 redirect in the header along with the file brought up warnings to the user that their post data was being redirected. The code I have below only works in browsers that support meta refresh (e.g. Firefox and IE7, but not Chrome/Safari).
I can see the logic of why this doesn't work: there's one response to the browser, and that is the file, as expected. What would be the correct way to send them the file and then redirect the browser to another page? From a user's point of view, it feels like this "Accept" page should disappear once they've clicked the button.
I have:
protected void Page_Load(object sender, EventArgs e)
{
    if (Request.UrlReferrer != null && Request.UrlReferrer.ToString() != "")
    {
        Uri referrer = Request.UrlReferrer;
        if (!IsPostBack)
        {
            hfReferrer.Value = Request.UrlReferrer.ToString();
        }
    }
}

protected void buttonAccept_Click(object sender, EventArgs e)
{
    // stream file
    Node nodeCurrent = Node.GetCurrent();
    string fileName = nodeCurrent.GetProperty("documentFile").Value.ToString();
    System.IO.FileInfo file = new System.IO.FileInfo(fileName);
    if (file.Exists) // set appropriate headers
    {
        Response.Clear();
        if (hfReferrer.Value != "")
        {
            Response.AddHeader("Refresh", string.Format("3; URL={0}", hfReferrer.Value));
        }
        Response.AddHeader("Content-Disposition", "attachment; filename=" + file.Name);
        Response.AddHeader("Content-Length", file.Length.ToString());
        Response.ContentType = "application/pdf";
        // write file to browser
        Response.WriteFile(file.FullName);
        Response.End();
    }
}
I'd prefer not to need JavaScript, pop-ups, etc.
Presently the user just has to click a link to go to another page, but this doesn't feel quite right.
Ideally I'd like to follow a good standard for this. But I am not sure where to look.
I would suggest trying a slightly different behavior. In the buttonAccept_Click event handler, redirect the user to a thank-you page. In the Page_Load of this thank-you page, have the code for downloading the file.
This is very easy to implement, and gives a better experience for the user.
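A minimal sketch of that suggestion (the ThankYou.aspx name and the PDF path are hypothetical):
// Accept page: record the acceptance, then navigate away from it.
protected void buttonAccept_Click(object sender, EventArgs e)
{
    Response.Redirect("~/ThankYou.aspx");
}

// ThankYou.aspx code-behind: stream the agreed document from Page_Load.
protected void Page_Load(object sender, EventArgs e)
{
    string path = Server.MapPath("~/App_Data/agreement.pdf");
    Response.Clear();
    Response.ContentType = "application/pdf";
    Response.AddHeader("Content-Disposition", "attachment; filename=agreement.pdf");
    Response.WriteFile(path);
    Response.End();
}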

What am I doing wrong with HttpResponse content and headers when downloading a file?

I want to download a PDF file that is stored in a binary column of a SQL Server database. There is a LinkButton on an aspx page. The event handler of this button looks like this:
protected void LinkButtonDownload(object sender, EventArgs e)
{
    ...
    byte[] aByteArray;
    // Read binary data from database into this ByteArray
    // aByteArray has the size: 55406 byte
    Response.ClearHeaders();
    Response.ClearContent();
    Response.BufferOutput = true;
    Response.AddHeader("Content-Disposition", "attachment; filename=" + "12345.pdf");
    Response.ContentType = "application/pdf";
    using (BinaryWriter aWriter = new BinaryWriter(Response.OutputStream))
    {
        aWriter.Write(aByteArray, 0, aByteArray.Length);
    }
}
A "File Open/Save dialog" is offered in my browser. When I store this file "12345.pdf" to disk, the file has a size of 71523 Byte. The additional 16kB at the end of the PDF file are the HTML code of my page (as I can see when I view the file in an editor). I am confused because I was believing that ClearContent and ClearHeaders would ensure that the page content is not sent together with the file content.
What am I doing wrong here?
Thanks for the help!
I think you want a Response.End at the end of this method.
At a quick glance, you're missing Response.End();
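Concretely, a sketch of the handler's tail with the missing call added (aByteArray is assumed to already hold the PDF bytes read from the database) looks like this:
Response.ClearHeaders();
Response.ClearContent();
Response.BufferOutput = true;
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "attachment; filename=12345.pdf");
Response.AddHeader("Content-Length", aByteArray.Length.ToString());
Response.BinaryWrite(aByteArray);
Response.End(); // ends the page lifecycle so the page's own HTML is not appended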

Post Back does not work after writing files to response in ASP.NET

What do I have?
I have an ASP.NET page which allows the user to download a file on a button click. Users can select the file they want from a list of available files (a RadioButtonList) and click the download button to download it. (I should not provide a link for each file that can be downloaded; this is the requirement.)
What do I want?
I want the user to download multiple files one by one by selecting the required radio button and clicking on the button.
What problem am I facing?
I can download the file for the first time properly. But, after downloading, if I select some other file and click on the button to download it, the button's click event does not cause a postback and the second file will not be downloaded.
I use the following code on the button click event:
protected void btnDownload_Click(object sender, EventArgs e)
{
    string viewXml = exporter.Export();
    Response.Clear();
    Response.AddHeader("Content-Disposition", "attachment; filename=views.cov");
    Response.AddHeader("Content-Length", viewXml.Length.ToString());
    Response.ContentType = "text/plain";
    Response.Write(viewXml);
    Response.End();
}
Am I doing something wrong here?
The same problem can be replicated in IE6, IE7 and Chrome, so I think this problem is browser-independent.
I had this same issue with SharePoint. I have a button on the page that sends a file, and after clicking the button the rest of the form was unresponsive. It turns out SharePoint sets the variable _spFormOnSubmitCalled to true to prevent any further submits. When we send a file, the page doesn't refresh, so we need to manually set this variable back to false.
On your button in the web part, set OnClientClick to a function in your page's JavaScript.
<asp:Button ID="generateExcel" runat="server" Text="Export Excel"
OnClick="generateExcel_Click" CssClass="rptSubmitButton"
OnClientClick="javascript:setFormSubmitToFalse()" />
Then in the JavaScript I have this function:
function setFormSubmitToFalse() {
    setTimeout(function () { _spFormOnSubmitCalled = false; }, 3000);
    return true;
}
I found the 3-second pause was necessary because otherwise I was setting the variable before SharePoint set it. This way I let SharePoint set it normally, then I set it back to false right after.
Offhand, what you're doing should work. I've successfully done something similar in the past, although I used a Repeater and LinkButtons.
The only thing I can see that's different is that you're using Response.Write() rather than Response.OutputStream.Write(), and that you're writing text rather than binary, but given the ContentType you specified, it shouldn't be a problem. Additionally, I call Response.ClearHeaders() before sending info, and Response.Flush() afterward (before my call to Response.End()).
If it will help, here's a sanitized version of what works well for me:
// called by click handler after obtaining the correct MyFileInfo class.
private void DownloadFile(MyFileInfo file)
{
    Response.Clear();
    Response.ClearHeaders();
    Response.ContentType = "application/file";
    Response.AddHeader("Content-Disposition", "attachment; filename=\"" + file.FileName + "\"");
    Response.AddHeader("Content-Length", file.FileSize.ToString());
    Response.OutputStream.Write(file.Bytes, 0, file.Bytes.Length);
    Response.Flush();
    Response.End();
}
You may want to consider transferring the file in a binary way, perhaps by calling System.Text.Encoding.ASCII.GetBytes(viewXml); and passing the result of that to Response.OutputStream.Write().
Modifying your code slightly:
protected void btnDownload_Click(object sender, EventArgs e)
{
    string viewXml = exporter.Export();
    byte[] bytes = System.Text.Encoding.ASCII.GetBytes(viewXml);
    // NOTE: you should use whatever encoding your XML file is set for.
    // Alternatives:
    // byte[] bytes = System.Text.Encoding.UTF7.GetBytes(viewXml);
    // byte[] bytes = System.Text.Encoding.UTF8.GetBytes(viewXml);
    Response.Clear();
    Response.ClearHeaders();
    Response.AddHeader("Content-Disposition", "attachment; filename=views.cov");
    Response.AddHeader("Content-Length", bytes.Length.ToString());
    Response.ContentType = "application/file";
    Response.OutputStream.Write(bytes, 0, bytes.Length);
    Response.Flush();
    Response.End();
}
A simple way to do this without removing Response.End is to add client-side JavaScript to do the page refresh. Add it to your button's OnClientClick property, e.g.
OnClientClick="timedRefresh(2000)"
Then in your HTML:
<script type="text/javascript">
    function timedRefresh(timeoutPeriod) {
        setTimeout("location.reload(true);", timeoutPeriod);
    }
</script>
Remove Response.End() and let the response end naturally within the ASP.NET ecosystem.
If that does not work, I would recommend putting the button in a separate <form> and posting the required data to a separate HTTP handler. Set up the HTTP handler to export the XML instead of a web page.
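A rough sketch of such a handler (ExportHandler is a hypothetical name, and BuildExportXml stands in for whatever exporter.Export() does on the page):
public class ExportHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        string viewXml = BuildExportXml(); // placeholder for the real export call
        context.Response.ContentType = "text/plain";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=views.cov");
        context.Response.Write(viewXml);
    }

    public bool IsReusable
    {
        get { return false; }
    }

    private string BuildExportXml()
    {
        return "<views />"; // placeholder content
    }
}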
I had the same problem: a function performing a simple Response.Write("") on a Button Click event on an aspx page was never firing.
Method in class:
public class test_class
{
    public test_class() { }

    public static void test_response_write(string test_string)
    {
        HttpContext context = HttpContext.Current;
        context.Response.Clear();
        context.Response.Write(test_string);
        context.Response.End();
    }
}
ASPX Page:
protected void btn_test_Click(object sender, EventArgs e)
{
    test_class.test_response_write("testing....");
}
While I was trying to find the reason, I called the same function in the Page_Load event and it worked.
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        test_class.test_response_write("testing....");
    }
}
Investigating the issue, I found out that the Master Page of that aspx page had its body wrapped in an <asp:UpdatePanel>.
I removed it, and the Button_Click event worked. I would recommend checking that too.
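If removing the UpdatePanel is not an option, another approach (a sketch, assuming a ScriptManager is on the page and the button's ID is btn_test) is to register the button for a full postback so the partial-rendering pipeline does not swallow the response:
protected void Page_Load(object sender, EventArgs e)
{
    ScriptManager scriptManager = ScriptManager.GetCurrent(Page);
    if (scriptManager != null)
    {
        // Force btn_test to trigger a full postback even inside the UpdatePanel.
        scriptManager.RegisterPostBackControl(btn_test);
    }
}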
See https://multilingualdev.wordpress.com/2014/08/19/asp-net-postback-after-response-write-work-around-solution/
When sending a file to the client using Response.Write in ASP.NET, the developer is unable to do anything else after the file is sent. There are other work-arounds, like adding a JavaScript onclick function that calls another function after the client gets the file, which is similar to adding a meta refresh to the page when the function to send the file is called (e.g. Response.AppendHeader("Refresh", "5;URL=" & HttpContext.Current.Request.Url.AbsoluteUri)).
But those weren't giving me the look that I wanted, so I used a combination of those work-arounds to come up with this solution:
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    checkSendFile() ' each time the page loads it will check to see if it needs to send the file
    If (Not IsPostBack) Then
        ' this spot is not executed after a form submit (postback), which means you can put
        ' the checkSendFile() in here if you want
    End If
End Sub

Private Sub checkSendFile()
    Dim sendFile As String = Session("SENDFILE") ' first we get the session flag to see if it's time
                                                 ' to send a file
    If Not sendFile Is Nothing Then
        If sendFile = "YES" Then
            Session("SENDFILE") = "" ' here we clear the session flag so it doesn't send again if
                                     ' refreshed
            sendClientFile() ' function to send the file to the client
        End If
    End If
End Sub

Protected Sub btnGetFile_Click(sender As Object, e As EventArgs) Handles btnGetFile.Click
    ' this is where the client clicks on a button or link or something that submits the form and
    ' requests a file to be sent to them
    Session("SENDFILE") = "YES" ' we set a session variable flag
    ' then we update the GUI, or run any other method that we want to do after the client gets the file
    Me.lblMsgToClient.Text = "Thank you for downloading file."
    RefreshPage() ' then we refresh the page instantly (this is where the postback will update values
                  ' and interface, then send the file)
End Sub

Private Sub RefreshPage()
    ' here we instantly add a Refresh header with zero seconds, pointing to the
    ' same url we are currently at
    Response.AppendHeader("Refresh", "0;URL=" & HttpContext.Current.Request.Url.AbsoluteUri)
End Sub

Private Sub sendClientFile()
    ' here you will have your file and bytes to send to the browser from either the file system or database
    ' then you can call sendToBrowser(...)
End Sub

Private Sub sendToBrowser(ByVal fileName As String, ByVal contentType As String, ByRef fileBytes As Byte())
    ' this function is just the normal send-file-to-client code
    Response.AddHeader("Content-type", contentType)
    Response.AddHeader("Content-Disposition", "attachment; filename=" & fileName)
    Response.BinaryWrite(fileBytes)
    Response.Flush()
    Response.End()
End Sub

Return XML as HTTP response

I've been given a seemingly simple task.
When a given URL is requested the response should simply be some valid XML.
How do I achieve this?
The page at that URL will contain all the necessary code-behind to get the data and construct the correct XML string. How do you then go ahead and manipulate the response to return this string only? The caller is receiving the XML string and populating a database with it; that's their responsibility. I just need to provide this part of the project.
Thanks
Take a look at this:
Response.Clear();
Response.Write(yourXml);
Response.ContentType = "text/xml";
Response.End();
I would go for an HttpHandler. This way you circumvent all the asp.net control creation etc., which is better for performance, and seeing as you will not be outputting any HTML, there's no point in using an actual aspx page.
Assuming you have your XML string created you can clear the response and just write your string out.
Response.Clear();
Response.ContentType = "text/xml";
Response.Write(myXMLString);
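A bare-bones sketch of what such a handler could look like (XmlHandler is a hypothetical name; it would be wired up as an .ashx or registered in web.config):
public class XmlHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        string myXMLString = "<data />"; // placeholder: build the real XML here
        context.Response.ContentType = "text/xml";
        context.Response.Write(myXMLString);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}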
If you don't want to use a full-blown web service, then you could do something like this:
private void Page_Load(object sender, System.EventArgs e)
{
    Response.ContentType = "text/xml";
    // get data from somewhere...
    Response.Write(data);
}
See here for something similar using images http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=325
Below is one way to send XML data to the browser as a response (context here is the current HttpContext):
StringBuilder xmlBuilder = new StringBuilder();
xmlBuilder.Append("<Books>");
xmlBuilder.Append("<Book>");
xmlBuilder.Append("Maths");
xmlBuilder.Append("</Book>");
xmlBuilder.Append("</Books>");
context.Response.ContentType = "text/xml";
context.Response.BinaryWrite(Encoding.UTF8.GetBytes(xmlBuilder.ToString()));
context.Response.End();
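An alternative sketch that builds the same Books/Book document with XmlWriter and writes it straight to the response output (avoiding the intermediate string and the manual UTF-8 encoding):
context.Response.ContentType = "text/xml";
using (System.Xml.XmlWriter writer = System.Xml.XmlWriter.Create(context.Response.Output))
{
    writer.WriteStartElement("Books");
    writer.WriteElementString("Book", "Maths");
    writer.WriteEndElement();
}
context.Response.End();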
