How do I compress the output data from a web service (web method)? The output is of type XmlDocument.
Here is the code.
[WebMethod]
public XmlDocument GetPersonalInfo(int CustomerID)
{
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(new CustomersXML().GetPersonalInfo(CustomerID));
    return doc;
}
How do I gzip this response? Please remember that it is not a page (HTTP call); it is a web service being called from a Flex client.
Thanks
This answer references an ancient article explaining how to implement this programmatically using SharpZipLib.
Another answer demonstrates use of the System.IO.Compression classes which could be substituted for SharpZipLib.
Rick Strahl put together an article reviewing potential issues to be aware of when implementing compression which do not appear to be addressed in the examples provided above. He also links to an article providing more details on usage of IIS7 built-in compression.
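Following the Response.Filter approach those articles describe, here is a minimal sketch (class and method names are illustrative, not from any of the articles; note the check of the client's Accept-Encoding header before compressing, one of the gotchas Strahl calls out):

```csharp
using System;
using System.IO.Compression;
using System.Web;

public static class CompressionHelper
{
    // Wrap the outgoing response stream in a GZipStream, but only when
    // the client has advertised gzip support in its request headers.
    public static void TryCompressResponse(HttpContext context)
    {
        string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";
        if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            context.Response.Filter =
                new GZipStream(context.Response.Filter, CompressionMode.Compress);
            // Tell the client the body is compressed so it inflates it.
            context.Response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}
```

You would call this at the start of the web method (or from an HTTP module) so the filter is in place before the XML is written out.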
Go to your IIS (or whichever) config file and add the following line in the appropriate place:
<add mimeType="application/json" enabled="true" />
I've had the same problem: my IIS was able to gzip any HTTP response except for JSON responses (which needed gzip compression the most in my app).
Hope that helps.
Update:
The applicationHost.config file should be located here: %windir%\System32\inetsrv\config
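For context, the `add mimeType` line belongs inside the type collections of the `<httpCompression>` section. A sketch of the relevant fragment of applicationHost.config (the exact surrounding elements may vary by IIS version):

```xml
<!-- applicationHost.config (IIS 7+): allow gzip for JSON responses -->
<system.webServer>
  <httpCompression>
    <dynamicTypes>
      <add mimeType="application/json" enabled="true" />
      <add mimeType="application/json; charset=utf-8" enabled="true" />
    </dynamicTypes>
  </httpCompression>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```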
I don't have time to look for code, but there is a namespace, System.IO.Compression, that has several classes for both gzip- and deflate-based compression schemes. Gzip is probably the safer bet for cross-language communication, as I'm not certain how widespread deflate support is.
However, you shouldn't have any trouble with the communication as long as the SOAP response carries a header telling the client to decompress the stream.
Note: Double check your server settings before doing this though, as some hosts have Gzip turned on by default and you don't want to do it twice.
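A minimal round-trip sketch using System.IO.Compression's GZipStream, assumed here to compress an arbitrary string such as the serialized XML:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class GzipRoundTrip
{
    // Compress a UTF-8 string into a gzip byte stream.
    static byte[] Compress(string text)
    {
        byte[] raw = Encoding.UTF8.GetBytes(text);
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(raw, 0, raw.Length);
            // ToArray is valid even after the wrapping stream is closed.
            return output.ToArray();
        }
    }

    // Inflate a gzip byte stream back into the original string.
    static string Decompress(byte[] data)
    {
        using (var input = new MemoryStream(data))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var reader = new StreamReader(gzip, Encoding.UTF8))
            return reader.ReadToEnd();
    }
}
```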
This has nothing to do with .NET.
GZIP is an HTTP feature of the web server, applied provided the client supports it; the client generally notifies the server by including gzip in the Accept-Encoding header of its request.
You need to set it up in IIS. Depending on the version, the process differs. In IIS 7 it is very easy, just a flag to set. See here.
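In IIS 7 the flag in question can also be set from the site's web.config. A sketch (note that dynamic compression additionally requires the Dynamic Content Compression module to be installed on the server):

```xml
<!-- web.config: turn on IIS 7 compression for this site -->
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```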
Related
I have an ashx handler, and the response is not gzipped. The content-encoding received by the client is empty.
The IIS settings for the site have static and dynamic compression enabled.
Research into similar problems shows some people have an httpCompression node in the web server node of the IIS configuration editor. I do not have such a node. I have a URL compression node, where I have set everything to true. Perhaps that is IIS-version dependent. The operating system is Windows Server 2008 R2.
I am about to try to "force" compression using the Filter property and the GZipStream class (credit to Rick Strahl's blog). If anyone can tell me why IIS is not "auto compressing", or can point to any gotchas in my workaround, I would be grateful.
Update: attaching GzipStream to the response filter reduced the content length by half as seen by the client, which seems to indicate the "manual" compression is doing something.
I am aware this was previously asked here:
.ashx handler not getting gzip compressed despite IIS Config setting
However, the previous question did not receive any answers, so I am asking the question again.
Please check whether you are adding an "Accept-Encoding: gzip" header when making the HTTP request.
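On a .NET client, a sketch of a request that advertises gzip support (the URL is a placeholder); setting AutomaticDecompression both adds the Accept-Encoding header to the request and inflates the compressed response body transparently:

```csharp
using System.IO;
using System.Net;

class Client
{
    static string Fetch(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        // Adds "Accept-Encoding: gzip" and decompresses the reply for you.
        request.AutomaticDecompression = DecompressionMethods.GZip;
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            return reader.ReadToEnd();
    }
}
```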
I have written a WCF service to return JSON on REST requests. Works great with a browser hitting it. But when my JavaScript hits it, the first request is an OPTIONS request for the url with "Access-Control-Request-Method: GET".
I think I need to handle CORS as documented here. However, the suggested code won't compile and the suggested web.config is invalid in places.
What do I need to do so the service will respond appropriately when asked if a GET can be requested on a url?
You may have to enable it in IIS as well: http://encosia.com/using-cors-to-access-asp-net-services-across-domains/
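If the service sits behind IIS 7, one starting point is emitting the CORS response headers from web.config. A sketch (the values here are permissive placeholders; the OPTIONS preflight itself may still need to be answered separately, e.g. short-circuited in Application_BeginRequest):

```xml
<!-- web.config: emit CORS headers on every response from this site -->
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="*" />
      <add name="Access-Control-Allow-Methods" value="GET, OPTIONS" />
      <add name="Access-Control-Allow-Headers" value="Content-Type" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```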
http://myfileserver/images/car/chevrolet.gif
I have a file server holding files such as images, doc files, etc. Now I want to intercept the HTTP request and, based on the file extension, perform some action such as redirecting to some other web page.
What is the best and easiest way to accomplish this? I am using the ASP.NET framework for my applications.
Please suggest an approach.
Thanks
If you are looking to intercept requests for specific file types, then go with an HTTP handler. Here is the MSDN link explaining their usage - HTTP Handlers
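A hypothetical sketch of such a handler, redirecting requests for one extension and serving the rest as-is (the extension and target URL are illustrative; the handler must also be registered in web.config for the extensions you want to intercept):

```csharp
using System;
using System.Web;

public class FileInterceptHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string path = context.Request.Url.AbsolutePath;
        if (path.EndsWith(".gif", StringComparison.OrdinalIgnoreCase))
        {
            // Hypothetical redirect target for intercepted image requests.
            context.Response.Redirect("/NotAvailable.aspx");
        }
        else
        {
            // Serve any other file directly from disk.
            context.Response.TransmitFile(context.Server.MapPath(path));
        }
    }
}
```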
In the WCF world, if you're looking to handle an HTTP request differently based on the requested file type, you may want to look into adding an IDispatchOperationSelector, which allows the service to route the call through a different operation. The default HTTP implementation is WebHttpDispatchOperationSelector, which is explained pretty well here and here.
If you want to remain in the ASP.NET world, I'd recommend going with custom message handlers. Here's an article by Mike Wasson explaining how these work and where they fall in the ASP.NET stack.
I've got an ASP.NET web service up and running using the [ScriptService] attribute. From what I've read in this article:
http://weblogs.asp.net/scottgu/archive/2007/04/04/json-hijacking-and-how-asp-net-ajax-1-0-mitigates-these-attacks.aspx
ASP.NET by default does not allow JSONP requests (injected into the DOM via script tags) in order to deny cross-domain requests. It does so by taking two measures:
1) it only accepts POST requests (script-tag injection always does a GET);
2) it denies requests sending an HTTP Content-Type header other than "Content-Type: application/json" (which browsers will not send for script tags).
I am familiar with the cross-domain issues and I know what JSONP is and I fully understand, why ASP.NET is by default restricted in that way.
But now, I have my webservice which is a public one, and should be open to everybody. So I explicitly need to enable cross-domain requests via Javascript to my Webservice, so that external websites can retrieve data via my webservice from jquery and alike.
I've already covered point (1) to allow requests via GET by modifying the ScriptMethod attribute this way: [ScriptMethod(UseHttpGet=true)]. I've checked with jQuery; GET requests now work (on the same domain). But how do I fix point (2)?
I know about the Access-Control-Allow-Origin headers (CORS) that some browsers support, but as far as I know that is not a standard yet, and I don't want to force my users/customers to modify their HTTP headers to use my web service.
To sum it up: I need a good practice for enabling cross-domain requests to a ScriptService-based public web service via JSONP. I mean, there MUST be a way to have a public web service; that is what most web services are about?
Using legacy ASMX services for something like this seems like a lost cause. Try WCF, which, due to its extensible nature, can very easily be JSONP-enabled. So if you are asking for best practices, WCF is the technology you should be building web services on for the .NET platform.
Or if you really can't afford migrating to .NET 3.5 at the moment you could also write a custom http handler (.ashx) to do the job.
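A hedged sketch of what such a JSONP-enabling .ashx handler could look like (the payload and parameter name are placeholders; real code should validate the callback name before echoing it):

```csharp
using System.Web;

public class JsonpHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // JSONP convention: the client names its callback function
        // in the query string, e.g. ?callback=handleData
        string callback = context.Request.QueryString["callback"] ?? "callback";
        string json = "{\"hello\":\"world\"}"; // placeholder payload
        context.Response.ContentType = "application/javascript";
        // Wrap the JSON in a call to the client's callback function.
        context.Response.Write(callback + "(" + json + ");");
    }
}
```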
The jQuery ajax() function does have a 'crossDomain' property.
Pasted from jQuery.ajax():
crossDomain (added 1.5)
Default: false for same-domain requests, true for cross-domain requests
If you wish to force a cross-domain request (such as JSONP) on the same domain, set the value of crossDomain to true. This allows, for example, server-side redirection to another domain.
Does HTTP PUT have advantages over HTTP POST, particularly for File Uploads? Data transfer should be highly secure. Your ideas / guidance on this will be of great help.
PUT is designed for file uploads more so than POST, which requires doing a multipart upload; but then it comes down to what your server can do as to which is more convenient for you to implement.
Whichever HTTP method you use, you'll be transmitting data in the clear unless you secure the connection using SSL.
I think the choice of PUT vs. POST should be more based on the rule:
PUT to a URL should be used to update or create the resource that can be located at that URL.
POST to a URL should be used to update or create a resource which is located at some other ("subordinate") URL, or is not locatable via http.
Any choices regarding security should work equally with both PUT and POST. https is a good start, if you are building a REST API then keys, authorisation, authentication and message signing are worth investigating.
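To make the distinction above concrete, a sketch using WebClient (URLs and paths are placeholders): PUT sends the raw file body to the URL that names the resource itself, while POST submits the file (as multipart/form-data here) to a handler that decides where it lives.

```csharp
using System.IO;
using System.Net;

class Upload
{
    static void Main()
    {
        var client = new WebClient();

        // PUT: create/replace the resource at exactly this URL;
        // the request body is the raw file bytes.
        byte[] bytes = File.ReadAllBytes(@"C:\docs\report.doc");
        client.UploadData("https://example.com/files/report.doc", "PUT", bytes);

        // POST: submit the file (multipart/form-data) to an upload
        // handler, which chooses the resource's final location.
        client.UploadFile("https://example.com/upload", @"C:\docs\report.doc");
    }
}
```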
Does HTTP PUT have advantages over HTTP POST, particularly for File Uploads?
You can use standard tools for sending the data (i.e. ones that don't have to be aware of your custom scheme for describing where the file should be uploaded to or how to represent that file). For example, OpenOffice.org includes WebDAV support.
Data transfer should be highly secure
The method you use has nothing to do with that. For security use SSL in combination with some form of authentication and authorization.