Compression is not working - ASP.NET

I need to compress all dynamic content of my data export site.
I've tried numerous approaches and nothing works: Chrome shows that the content is not compressed, and the "Content-Encoding" header is not present.
As a last resort, I'm trying to do it like this (before writing any response):
context.Response.Filter = new DeflateStream(context.Response.Filter, CompressionMode.Compress);
context.Response.AppendHeader("Content-Encoding", "deflate");
Logging shows that this code is executed correctly.
However, Chrome again shows that the content is not compressed.
UPDATE: When using IIS built-in compression, it seems to work, and request tracing shows DYNAMIC_COMPRESSION_SUCCESS. However, IE still shows that the response is not compressed. The same happens when I request the page from the server itself using the localhost name.
Any ideas?

If you want to do this manually, first check that compression is supported:
public static bool IsGZipSupported()
{
    // A client that sends "gzip" or "deflate" in Accept-Encoding can handle a compressed response.
    string acceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
    return !string.IsNullOrEmpty(acceptEncoding) &&
           (acceptEncoding.Contains("gzip") || acceptEncoding.Contains("deflate"));
}
And compress your response:
public static void GZipEncodePage()
{
    if (IsGZipSupported())
    {
        HttpResponse response = HttpContext.Current.Response;
        string acceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
        if (acceptEncoding.Contains("gzip"))
        {
            // Attach a GZip filter so the rendered output is compressed on its way out.
            response.Filter = new System.IO.Compression.GZipStream(response.Filter, System.IO.Compression.CompressionMode.Compress);
            response.AppendHeader("Content-Encoding", "gzip");
        }
        else
        {
            // Fall back to deflate when the client advertises deflate but not gzip.
            response.Filter = new System.IO.Compression.DeflateStream(response.Filter, System.IO.Compression.CompressionMode.Compress);
            response.AppendHeader("Content-Encoding", "deflate");
        }
    }
}
You can check that the filter is attached just before the headers are sent to the client:
protected void Application_PreSendRequestHeaders()
{
    // Last-chance check in Global.asax: make sure the Content-Encoding header
    // matches the filter that is actually attached to the response.
    HttpResponse response = HttpContext.Current.Response;
    if (response.Filter is GZipStream && response.Headers["Content-Encoding"] != "gzip")
        response.AppendHeader("Content-Encoding", "gzip");
    else if (response.Filter is DeflateStream && response.Headers["Content-Encoding"] != "deflate")
        response.AppendHeader("Content-Encoding", "deflate");
}
For more information, check these posts:
ASP.NET GZip Encoding Caveats
Built-in GZip/Deflate Compression on IIS 7.x
Benefits and Drawbacks of IIS 7 Compression

Instead of trying to do this manually I would rely on the pre-written (and tested) Microsoft code built into IIS that will do this for you:
Install Dynamic Content Compression on the machine (bullet 5 in the link) and enable it in IIS. IIS will then handle compression for both static and dynamic content. Less code to maintain (and fewer bugs to write) is always a good thing!
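For example, once the feature is installed, dynamic compression can be switched on from an elevated command prompt (a sketch using the standard appcmd tool):
%windir%\system32\inetsrv\appcmd.exe set config -section:urlCompression /doDynamicCompression:True /commit:apphost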

If you are using IIS 7+, there's a Compression option. Navigate to your site, and in the main window click "Compression", then check both checkboxes:
Enable dynamic content compression
Enable static content compression
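Those two checkboxes map to the urlCompression section in configuration, so the equivalent can be expressed in web.config (a minimal sketch):
<system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>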

Related

MVC Permanent way to use redirects for HTTP to HTTPS and turn off for Dev Environment

I got this code from here.
Notice I commented out the part that redirects to the IIS Express 44300 port, because I want to use IIS 7.5 on my dev box without HTTPS.
public class CustomRequireHttpsFilter : RequireHttpsAttribute
{
    protected override void HandleNonHttpsRequest(AuthorizationContext filterContext)
    {
        // The base only redirects GET, but we added HEAD as well. This avoids
        // exceptions for bots crawling using HEAD. The other requests will throw
        // an exception to ensure the correct verbs are used. We fall back to the
        // base method as the MVC exceptions are marked as internal.
        if (!String.Equals(filterContext.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase)
            && !String.Equals(filterContext.HttpContext.Request.HttpMethod, "HEAD", StringComparison.OrdinalIgnoreCase))
        {
            base.HandleNonHttpsRequest(filterContext);
        }

        // Redirect to the HTTPS version of the page.
        // We updated this to redirect using 301 (permanent) instead of 302 (temporary).
        string url = "https://" + filterContext.HttpContext.Request.Url.Host + filterContext.HttpContext.Request.RawUrl;
        //if (string.Equals(filterContext.HttpContext.Request.Url.Host, "localhost", StringComparison.OrdinalIgnoreCase))
        //{
        //    // For localhost requests, default to the IIS Express HTTPS default port (44300)
        //    url = "https://" + filterContext.HttpContext.Request.Url.Host + ":44300" + filterContext.HttpContext.Request.RawUrl;
        //}
        filterContext.Result = new RedirectResult(url, true);
    }
}
Then, in my FilterConfig.cs, I added this. What it does is apply the override above only when Web.config has debug="false", which is what it has in production. I don't need to run Release in my development environment, and I also don't want to configure local IIS to handle SSL. Notice I commented out the "RequireHttpsAttribute()" and used the new one above.
public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new HandleErrorAttribute());
        if (!HttpContext.Current.IsDebuggingEnabled)
        {
            // filters.Add(new RequireHttpsAttribute());
            filters.Add(new CustomRequireHttpsFilter());
        }
    }
}
Am I doing the right thing? Is this how to make sure SEO is optimized, since search bots only keep track of one website? My understanding is that "http" and "https" are considered two separate websites by search engines. Am I doing this in the right place? I'm not sure what other code might get in the way.
===============
I asked my ISP how to do permanent redirects, suggesting this solution, and they said:
Dear Customer,
We did not setup redirection. However, we corrected https bind setting in IIS to fix the problem.
I wonder if IIS can do the same thing and that is what they did. I hope I'm in the right forum :)
How about doing this at an IIS level using the URL rewrite module: http://forums.iis.net/t/1153050.aspx?URL+Rewrite+for+SSL+redirection
To turn it off in dev, just set the enabled rule to false in your dev web.config, but enable it for all servers/environments that have HTTPS set up.
I've used it in the past and it's worked really well. It saves cluttering your app with code that isn't app-related.
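For illustration, a minimal sketch of such a rule in web.config (the rule name is arbitrary; flip enabled to false in your dev config):
<system.webServer>
    <rewrite>
        <rules>
            <rule name="RedirectToHttps" enabled="true" stopProcessing="true">
                <match url="(.*)" />
                <conditions>
                    <add input="{HTTPS}" pattern="^OFF$" />
                </conditions>
                <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
            </rule>
        </rules>
    </rewrite>
</system.webServer>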

Image URL validation in ASP.NET

I have image URLs and need to check whether each URL is responding or not.
For example, of the four image URLs below, only the third is a valid image. But the second and fourth URLs respond as if they were valid images even though there is no image there:
http://media.expedia.com/hotels/1000000/90000/84900/84853/84853_744_b.jpg
http://www.iceportal.com/brochures/media/show.aspx?brochureid=ICE19044&did=3073&mtype=3073&type=pic&lang=en&publicid=4175749&resizing=X
http://images.trvl-media.com/hotels/1000000/30000/20400/20313/20313_166_b.jpg
http://www.iceportal.com/brochures/ice/ErrorPages/404.htm?aspxerrorpath=/brochures/media/show_A.aspx
Here is my code:
public static bool CheckUrlExists(string url)
{
    try
    {
        Uri u = new Uri(url);
        WebRequest w = WebRequest.Create(u);
        w.Method = WebRequestMethods.Http.Head;
        using (StreamReader s = new StreamReader(w.GetResponse().GetResponseStream()))
        {
            return (s.ReadToEnd().Length >= 0);
        }
    }
    catch
    {
        return false;
    }
}
With this code I am only catching the URLs that return a 404 error, not the URLs that return a page saying 'Sorry, requested brochure is temporarily un-published' or some other message.
You will need more complex logic to validate that the URL points to an image. If a resource is missing from the server or is otherwise unavailable, you may get an HTTP error like the infamous 404, which will trigger a WebException. However, that is only part of the story.
Your second URL returns HTTP 200, confirming that the resource is there, when in fact the resource is missing. What you really get is an HTML document explaining that the resource is not available. This is bad practice, but not unheard of.
At the very least, you should examine the MIME type (the Content-Type header; see WebResponse.ContentType) of the resource you test. A content type of image/* suggests an image-type resource. Failing to detect a known MIME type (e.g. if you receive application/octet-stream), you can actually HTTP GET the resource and run image-type detection on the downloaded content.
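For illustration, a minimal sketch of such a check (IsImageUrl is a hypothetical helper name; it assumes the server answers HEAD requests honestly):
public static bool IsImageUrl(string url)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Http.Head;
        request.AllowAutoRedirect = false; // a redirect here is suspicious too
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Only accept a direct 200 whose declared content type is image/*.
            return response.StatusCode == HttpStatusCode.OK
                && response.ContentType.StartsWith("image/", StringComparison.OrdinalIgnoreCase);
        }
    }
    catch (WebException)
    {
        return false; // 4xx/5xx responses surface as WebException
    }
}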
I would suggest using HttpWebRequest and HttpWebResponse to do this; they are subclasses of WebRequest and WebResponse and as such are more granular for what you're trying to achieve. The following code works with the example URIs provided:
public static bool CheckUrlExists(string url)
{
    try
    {
        Uri u = new Uri(url);
        HttpWebRequest w = (HttpWebRequest)WebRequest.Create(u);
        w.AllowAutoRedirect = false; // report a 302 instead of silently following it
        w.Method = WebRequestMethods.Http.Head;
        using (HttpWebResponse response = (HttpWebResponse)w.GetResponse())
        {
            return response.StatusCode == HttpStatusCode.OK; // check the HTTP status code
        }
    }
    catch (WebException)
    {
        return false;
    }
}
What's important here is that I'm checking the HTTP status code. Your catch will already handle the 404s, but the problem URIs ultimately lead to a 200 (OK). By setting AllowAutoRedirect to false, the HttpWebRequest instance returns a 302 (redirect) status code instead of following the redirect through to the "Sorry, requested brochure is temporarily un-published." page, which is returning 200 (OK). This should serve your purpose.
Also: catching a WebException will allow you to examine the status code (400+, 500+, etc.).
Be aware, however, that you may be redirected to a new location for the image you're requesting. Given that, you might want to use PeterK's MIME type check.

Is there any IIS setting required for URL rewriting?

I have implemented URL rewriting in the Global.asax file. URL rewriting works fine on my local machine, but when I put the site online it stopped working.
void Application_BeginRequest(Object sender, EventArgs e)
{
    var extension = Path.GetExtension(Request.PhysicalPath).ToLower();

    // Serve .html files that physically exist and skip rewriting for them.
    if (File.Exists(Request.PhysicalPath))
    {
        if (extension == ".html")
        {
            Response.WriteFile(Request.PhysicalPath);
            Response.End();
        }
        return;
    }

    var path = Request.Path;
    if (!Context.Items.Contains("ORIGINAL_REQUEST"))
        Context.Items.Add("ORIGINAL_REQUEST", path);

    // Rewrite .html requests that do not map to a physical file.
    if (extension == ".html")
    {
        var resource = UrlRewriting.FindRequestedResource();
        var command = UrlRewriting.GetExecutionPath(resource);
        Context.RewritePath(command, true);
    }
}
The URL is ind205.cfmdeveloper.com; when you click on the About Us, Demo, or Advertise pages, nothing is displayed.
So please let me know whether any IIS setting is required.
Please reply soon.
Thanks,
Samir
I work with URL rewriting and I don't think it needs any IIS setting; I didn't make any IIS changes for URL rewriting to work. You can check these links; hopefully you will find a solution in one of them:
http://weblogs.asp.net/scottgu/archive/2007/02/26/tip-trick-url-rewriting-with-asp-net.aspx
http://learn.iis.net/page.aspx/496/iis-url-rewriting-and-aspnet-routing/
http://learn.iis.net/page.aspx/517/url-rewriting-for-aspnet-web-forms/
http://www.asp.net/learn/Videos/video-154.aspx
http://www.15seconds.com/Issue/030522.htm
http://urlrewriter.net/
If these links give you your answer, please mark my answer and vote for it. Thanks.
I notice that you check for .html files. For .html requests to pass through ASP.NET processing, you need to declare (map) the extension either in IIS or in web.config, as sketched below.
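For illustration, a minimal sketch of the web.config mapping for the IIS 7 integrated pipeline (the handler name is arbitrary):
<system.webServer>
    <handlers>
        <!-- Route *.html requests through ASP.NET so Application_BeginRequest sees them -->
        <add name="HtmlViaAspNet" path="*.html" verb="*"
             type="System.Web.UI.PageHandlerFactory"
             resourceType="Unspecified" preCondition="integratedMode" />
    </handlers>
</system.webServer>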

GZIP Compression causing web page expiration

I have implemented GZIP compression on a few of my ASP.NET pages, using a class that inherits from System.Web.UI.Page and overriding the OnLoad method to do the compression, like so:
protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);
    if (Internet.Browser.IsGZIPSupported())
    {
        base.Response.Filter = new GZipStream(base.Response.Filter, CompressionMode.Compress, true);
        base.Response.AppendHeader("Content-encoding", "gzip");
        base.Response.AppendHeader("Vary", "Content-encoding");
    }
    else if (Internet.Browser.IsDeflateSupported())
    {
        base.Response.Filter = new DeflateStream(base.Response.Filter, CompressionMode.Compress, true);
        base.Response.AppendHeader("Content-encoding", "deflate");
        base.Response.AppendHeader("Vary", "Content-encoding");
    }
}
The IsGZIPSupported method just determines whether the browser supports GZIP, looking at the Accept-Encoding request header and the browser's user agent (IE 5-6 are excluded from GZIP compression). However, with this code I am getting the "web page has expired" message in IE when I post back from the page and then try to use the back button. Setting the cache control to private seems to fix the problem:
base.Response.Cache.SetCacheability(HttpCacheability.Private);
But I am not sure why, or whether this will cause other problems. I haven't set any caching for any other pages in the site, and the site is running on an intranet with only a dozen concurrent users, so performance isn't a big issue at the moment.
See this article on Vary header and WinInet/MSIE
It seems you should be sending Vary: Accept-Encoding instead of Vary: Content-Encoding, as the response will vary depending on the request header.
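Concretely, in the OnLoad method above, that means keeping the Content-Encoding header as-is and varying on the request header instead:
base.Response.AppendHeader("Content-encoding", "gzip");
base.Response.AppendHeader("Vary", "Accept-Encoding"); // vary on the request header, not the response header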

Why is my HttpWebRequest POST to myhandler.ashx rejected with status code 401?

I've already written an HttpHandler that gets POSTed to from a ColdFusion page, and it works successfully. Now I am trying to write a web application in ASP.NET so I can post a form to the .ashx handler from an .aspx page.
Application Trace (trace.axd) shows the following as my last 3 entries:
2 8/14/2009 1:53:56 PM /Default.aspx 200 GET View Details
3 8/14/2009 1:54:04 PM /Default.aspx 200 POST View Details
4 8/14/2009 1:54:13 PM /UploadHandler.ashx 401 POST View Details
I have a breakpoint in my .ashx file but it is never reached (I guess because of the 401 status code). Here is the snippet of code from the default.aspx trying to POST to the handler:
protected void UploadHandlerButton_Click(object sender, EventArgs e)
{
    if (FileUpload1.HasFile)
    {
        try
        {
            ASCIIEncoding encoding = new ASCIIEncoding();
            byte[] data = encoding.GetBytes(BuildFormData());
            string baseAddress = "http://" + Environment.MachineName;
            string pathInfo = Page.ResolveUrl("UploadHandler.ashx");
            string URI = baseAddress + pathInfo;
            HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create(URI);
            myRequest.Method = "POST";
            myRequest.ContentType = "application/x-www-form-urlencoded";
            myRequest.ContentLength = data.Length;
            Stream newStream = myRequest.GetRequestStream();
            newStream.Write(data, 0, data.Length);
            newStream.Close();
        }
        catch (Exception someError)
        {
            LogText("FAILURE: " + someError.Message);
        }
    }
}
Here is a snippet of code from the UploadHandler.ashx file (but this doesn't appear to be reached):
public void ProcessRequest(HttpContext context)
{
    string returnURL = context.Request.ServerVariables["HTTP_REFERER"];
    string message;
    message = UploadFile(context);
    StringBuilder msgReturn = new StringBuilder(returnURL);
    msgReturn.Append("?n=");
    msgReturn.Append(HttpUtility.UrlEncode(TRIMrecNumAssigned));
    msgReturn.Append("&m=");
    msgReturn.Append(HttpUtility.UrlEncode(message));
    context.Response.Redirect(msgReturn.ToString());
}
Both default.aspx and UploadHandler.ashx are in the root of a virtual directory on my localhost; the directory security is currently set to "Anonymous access" CHECKED and "Integrated Windows authentication" CHECKED.
When I click the "View Details" link on the trace.axd display, I see all the data in the Forms collection that I expect to see and hope to process but this 401 seems to be stopping everything. I could post the code for my little function called BuildFormData() if useful.
EDIT: Revised handler as follows (has had no effect; same error occurs):
public void ProcessRequest(HttpContext context)
{
    //-----------------------------------------------------------------------------------------
    // The remainder of this block is an alternative to the .Redirect and is useful for debugging.
    context.Response.ContentType = "text/html";
    //context.Response.Write(TRIMrecNumAssigned);
    //context.Response.Write("<p>");
    //context.Response.Write(msgReturn);
    context.Response.Write("<h1>Trim - Kerberos Prototype for ColdFusion consuming pages</h1>");
    HttpContext.Current.Trace.IsEnabled = true;
    HttpContext.Current.Trace.Write(null);
    HttpContext.Current.Trace.Write("-------");
    HttpContext.Current.Trace.Write(context.Request.Form["txtTrimRecordType"]);
    HttpContext.Current.Trace.Write(GetUserInfo());
    HttpContext.Current.Trace.Write("-------");
    HttpContext.Current.Trace.Write(null);
    using (Html32TextWriter htw = new Html32TextWriter(context.Response.Output))
    {
        // Render the trace output into the response via the non-public Render method.
        typeof(TraceContext)
            .GetMethod("Render", BindingFlags.NonPublic | BindingFlags.Instance)
            .Invoke(HttpContext.Current.Trace, new object[] { htw });
    }
}
Have you tried turning off Integrated Windows Auth and just leaving anonymous checked? Does it make a difference?
Your answer: "I think it made things worse because now I cannot even browse to default.aspx. I get this: HTTP 401.3 - Access denied by ACL on resource Internet Information Services"
My response: This is actually a good thing. This means we're getting closer to what is going on. If you're getting that error message and the only thing you have enabled is anonymous authentication via IIS, that means that the ASP.NET impersonation user doesn't have NTFS permissions on the files in question.
I'm not sure if you are on XP or Win 2k3, but now you want to check and make sure that either the ASPNET (XP) or Network Service (Win 2k3) users have at least read access on the files in question. Make sure that user has at least that level of access and then let me know how it goes.
Update: I don't know why I didn't think of this before. You may need to set credentials on your HttpWebRequest. To use the credentials of the current user, try adding this to your request.
HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create(URI);
myRequest.Credentials = CredentialCache.DefaultCredentials;
If you need to supply different credentials, you can use NetworkCredential instead.
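For illustration, a sketch with explicit credentials (the values are placeholders):
// Authenticate as a specific account instead of the current user.
myRequest.Credentials = new NetworkCredential("username", "password", "DOMAIN");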
There's a good explanation of credentials here.
Hope this helps.
Looking at your ProcessRequest(), you do the following:
string returnURL = context.Request.ServerVariables["HTTP_REFERER"];
Based on how you are calling it with HttpWebRequest, this variable will be null. Then when you create your msgReturn, it will look something like this:
?n=XXX&m=YYY
When you redirect to this URL, it will probably not be found, which is what is returning the 401.
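A defensive sketch for the handler, assuming Default.aspx is an acceptable fallback target when no Referer header is present (programmatic clients such as HttpWebRequest typically omit it):
string returnURL = context.Request.ServerVariables["HTTP_REFERER"];
if (string.IsNullOrEmpty(returnURL))
{
    // HttpWebRequest did not send a Referer header, so fall back explicitly.
    returnURL = VirtualPathUtility.ToAbsolute("~/Default.aspx");
}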
