Compiled Proxy Class (in bin) vs. Web Reference - asp.net

I have a handful of ASP.NET websites which communicate with different instances of SQL Server 2005 via a web reference to the report server's web service. However, today I toyed with the notion of using the WSDL tool to create a proxy class from one of the SQL Server instances and, in turn, using the proxy to create a dll (before doing so, I modified the proxy's constructor to accept a URL - so that I could point the proxy to any of the web service instances).
I'm fairly sure the web service will be mostly, if not completely, static in terms of updates. So, my question is: are there any drawbacks to using the compiled proxy class (in the bin directory), as opposed to using the auto-generated proxy class itself? If not, what are some motivations for going one way or the other?
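For concreteness, the constructor change described above might look like the following sketch (class and member names are hypothetical; asmx-generated proxies derive from SoapHttpClientProtocol, which already exposes a public Url property, so the constructor is mainly a convenience):

```csharp
using System.Web.Services.Protocols;

// Hypothetical wsdl.exe-generated proxy, extended so the caller can
// point it at any of the report-server instances at construction time.
public partial class ReportingService : SoapHttpClientProtocol
{
    // Added constructor: accept the endpoint URL instead of relying on
    // the one baked in at generation time.
    public ReportingService(string serviceUrl)
    {
        this.Url = serviceUrl;
    }
}

// Usage from any site referencing the compiled DLL:
//   var svc = new ReportingService("http://server2/ReportServer/ReportService2005.asmx");
```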

To my knowledge, there is no difference between the two. Adding a service reference does exactly what svcutil.exe does, just more conveniently: the Service Reference dialog is simply a wrapper around svcutil.exe and does nothing beyond what the command-line tool does.
Thanks

Related

Is using web services within a .NET project from a local namespace a valid approach?

I am a newbie to .NET web services. I am implementing a .NET project whose back end is built on web services, as I was impressed by the benefits and portability of this architecture.
But here is my confusion: I want to know whether the approach I am taking is right or wrong. I have created the services within my project and I call them from code-behind classes, while the service itself implements the database code. I am attaching a screenshot for further explanation.
Not really. When you do this you are calling the classes directly rather than making a service call, which is liable to cause you issues (if, for example, you set up one of your methods to return a particular HTTP response status).
If you don't want this to be a service call, then you would be better extracting the logic from the service method into a business layer and have both the service and your page call that code.
If you do want a service call, then you need to add a service ref to your project that points to the service and call it through the generated proxy.
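The extraction suggested above can be sketched like this (all names hypothetical): the database logic moves into a business-layer class, and both the page and the service become thin callers over it:

```csharp
using System.Web.Services;

// Business layer: holds the logic that previously lived inside the
// service method.
public class OrderLogic
{
    public decimal GetOrderTotal(int orderId)
    {
        // ...database code moved out of the web method...
        return 0m;
    }
}

// The service is now just a thin wrapper for external callers.
public class OrderService : WebService
{
    [WebMethod]
    public decimal GetOrderTotal(int orderId)
    {
        return new OrderLogic().GetOrderTotal(orderId);
    }
}

// A page code-behind calls the business layer directly:
//   decimal total = new OrderLogic().GetOrderTotal(42);
```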
As it stands, your approach is no different from using a C# class directly; for this approach you could simply use a class instead of a web service.
Use web services instead of C# classes for the following cases.
You want to expose some functionality to outside world/consumers/other applications
You want to decouple parts of your system so that they can be changed without affecting other parts of application
You want to make your application scalable, so you create web services and deploy them on different servers

Prevent generation of proxy classes in Reference.cs when adding/updating a web reference

I have a web service and a client. The classes used in parameters and return types are in a common DLL shared by both. However, whenever I update the web reference, Visual Studio generates copies of the classes with the same names, public properties, and methods. The solution then won't compile because the client code tries to use the versions in the common DLL. I can solve the problem by deleting the "duplicate" classes every time I update the web reference and adding a using statement to point at the common DLL's namespace. Is there a way to fix this permanently?
UPDATE: See my comments below. This is a "feature" of asmx web services. There is no way around it other than one of the following:
1) Use a more modern type of web service.
2) Don't use a common DLL
3) Manually fix every time you update the web reference, as in the original question above.
This is a "feature" of asmx web services. There is no way around it other than one of the
following:
Use a more modern type of web service.
Don't use a common DLL
Manually fix every time you update the web reference, as in the original question above.
Sources (other Stack Overflow questions):
"Reuse existing types" is ignored when adding a service reference
How does Visual Studio 2008 and svcutil decide which types to re-use from referenced assemblies when generating a web service proxy class?
I had the same problem, but I had neglected to add a reference to the correct assembly with the request/response types in my client. Once I added that reference and ensured that the "Reuse types" checkbox was checked in the Add Service Reference dialog, it worked properly.
There's no way to do that.
However, I think we have a design problem here. When we create a web service, we expect that our clients don't need to reference any DLL from us; only the types exposed by the web service should be enough for them to use it (web services are all about interoperability: imagine your client app were written in Java, which couldn't reference the .NET DLL).
That's why these types are created when you reference a web service. In my opinion, you should only rely on the classes generated by the web service in your client app. Remove the reference to the shared dll from the client project.
This doesn't directly answer your question, but it provides an alternative for your issue.
In the domain class, set AnonymousType = false to prevent classes from being generated with an unexpected prefix when adding the web reference:
[System.Xml.Serialization.XmlTypeAttribute(AnonymousType = false)]
but this only ensures that the auto-generated class in Reference.cs has the same structure as the domain class.
A way to work around this is to serialize/deserialize to the domain object.
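That walk-around can be sketched as follows (type names hypothetical). It relies on the duplicated proxy class having the same XML shape, and the same root element name, as the domain class:

```csharp
using System.IO;
using System.Xml.Serialization;

namespace Domain
{
    // The shared type from the common DLL.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

namespace Proxy
{
    // The duplicate generated into Reference.cs: same name, same shape.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

public static class ProxyMapper
{
    // Serialize the source type to XML, then deserialize the XML as the
    // target type; this works because both classes produce identical XML.
    public static TTarget Convert<TSource, TTarget>(TSource source)
    {
        var buffer = new StringWriter();
        new XmlSerializer(typeof(TSource)).Serialize(buffer, source);
        using (var reader = new StringReader(buffer.ToString()))
        {
            return (TTarget)new XmlSerializer(typeof(TTarget)).Deserialize(reader);
        }
    }
}
```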

Hosting static content on different domain from webservices, how to avoid cross-domain?

We've recently been working on a fairly modern web app and are ready to begin deploying it for alpha/beta and getting some real-world experience with it.
We have ASP.Net based web services (Web Api) and a JavaScript front-end which is 100% client-side MVC using backbone.
We have purchased our domain name, and for the sake of this question our deployment looks like this:
webservices.mydomain.com (Webservices)
mydomain.com (JavaScript front-end)
If the JavaScript attempts to talk to the web services on the sub-domain, we blow up with cross-domain issues. I've played around with CORS but am not satisfied with the cross-browser support, so I'm counting it out as an option.
On our development PCs we have used an IIS reverse proxy to forward all requests for mydomain.com/webservices to webservices.mydomain.com, which solves all our problems as the browser thinks everything is on the same domain.
So my question is: in a public deployment, how is this issue most commonly solved? Is a reverse proxy the right way to do it? If so, are there any hosted services that offer a reverse proxy for this situation? Are there better ways of deploying this?
I want to use the CloudFront CDN, as all our servers/services are hosted with Amazon, but I'm really struggling to find info on whether a CDN can support this type of setup.
Thanks
What you are trying to do is cross-subdomain calls, not entirely cross-domain.
There are tricks for that: http://www.tomhoppe.com/index.php/2008/03/cross-sub-domain-javascript-ajax-iframe-etc/
You asked how this issue is most commonly solved. My answer is: this issue is commonly AVOIDED. In the real world you would set up your domains so that you don't need such workarounds just to get your application running, or you would set up a proxy server to forward the calls for you. JSONP is also a hack-ish solution.
To allow this web service to be called from script using ASP.NET AJAX, add the following line to the web service code-behind:
[System.Web.Script.Services.ScriptService]
You can simply use JSONP for AJAX requests; then cross-domain is not an issue.
If AJAX requests return some HTML, it can be escaped into a JSON string.
The second one is a little bit awkward, though.
You have two or three layers.
In the web service code-behind class, add this attribute: <System.Web.Script.Services.ScriptService()> _
You may also need to add this inside the system.web node of your web.config:
<webServices>
<protocols>
<add name="AnyHttpSoap"/>
<add name="HttpPost"/>
<add name="HttpGet"/>
</protocols>
</webServices>
On the client side:
- Add a web reference to the service on the subdomain (e.g. webservices.mydomain.com/svc.asmx); Visual Studio generates the proxy class.
- Add functionality in the master page's, page's, or control's code-behind.
- Simply call these functions from the client side.
You can use AJAX functionality with the ScriptManager, or use another framework like jQuery.
If your main website is compiled against .NET 3.5 or older, you need to add a reference to the System.Web.Extensions assembly and declare it in your web.config file.
If you have the bandwidth (network I/O and CPU) to handle this, a reverse proxy is an excellent solution. A good reverse proxy will even cache static calls to help mitigate the network delay introduced by the proxy.
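For reference, the IIS reverse-proxy setup the question describes can be expressed as a web.config rewrite rule along these lines (hostnames taken from the question; this sketch assumes the IIS URL Rewrite module and Application Request Routing are installed):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Forward /webservices/* on the main site to the subdomain. -->
      <rule name="WebServicesReverseProxy" stopProcessing="true">
        <match url="^webservices/(.*)" />
        <action type="Rewrite" url="http://webservices.mydomain.com/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```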
The other option is to setup the proper cross domain policy files and/or headers. Doing this in some cloud providers can be hard or even impossible. I recently ran into issues with font files and IE not being happy with cross domain calls. We could not get the cloud storage provider we were using to set the correct headers, so we hosted them locally rather than have to deal with a reverse proxy.
easyXDM is a cross domain Javascript plugin that may be worth exploring. It makes use of standards when the browser supports them, and abstracts away the various hacks required when the browser doesn't support the standards. From easyXDM.net:
easyXDM is a Javascript library that enables you as a developer to
easily work around the limitation set in place by the Same Origin
Policy, in turn making it easy to communicate and expose javascript
API’s across domain boundaries.
At the core easyXDM provides a transport stack capable of passing
string based messages between two windows, a consumer (the main
document) and a provider (a document included using an iframe). It
does this by using one of several available techniques, always
selecting the most efficient one for the current browser. For all
implementations the transport stack offers bi-directionality,
reliability, queueing and sender-verification.
One of the goals of easyXDM is to support all browsers that are in
common use, and to provide the same features for all. One of the
strategies for reaching this is to follow defined standards, plus
using feature detection to assure the use of the most efficient one.
To quote easyXDM's author:
...sites like LinkedIn, Twitter and Disqus as well as applications run
by Nokia and others have built their applications on top of the
messaging framework provided by easyXDM.
So easyXDM is clearly not some poxy hack, but I admit it's a big dependency to take on in your project.
The current state of the web is that if you want to push the envelope, you have to use feature detection and polyfills, or simply force your users to upgrade to an HTML5 browser. If that makes you squirm, you're not alone, but the polyfills are a kind of temporary evil needed to get from where the web is to where we'd like it to be.
See also this SO question.

Web application configuration settings - which is the better place to store them?

I came across a case study a few days ago. It relates to a web application architecture.
Here is the scenario,
There is a single web service used by, say, 1000 web applications. This web service is hosted on a particular server. If the web service's hosting location changes, how do the other applications come to know about the change?
Keeping it in web.config doesn't seem to be a feasible solution, as we would need to modify the web.config files for all the applications.
Keeping these settings in a common repository and letting all the applications use it for the web service address came to mind, but then there is the question of where to store this common repository.
I am just curious to know about how this could be achieved with better performance.
Thanks in advance for any kind of suggestions.
Do you have full access to, or control over, all those web applications consuming that web service? If so, you could have a script or some custom code that updates all their web.config files at once. It seems like too much work, but in fact this way you have more control, and you could eventually point only some applications at the new URL while leaving others on the old one.
The idea of keeping the setting in a centralized database gives you faster update propagation, which could also be bad in the case of errors; all applications then refer to the same place, with no way to split them. You also have to connect to that centralized database from all of them, which probably means adding a key to each web.config with the connection string to that database; if that database is unreachable or down, the web applications will not be able to consume the web service simply because they cannot get its URL.
I would go for the web.config. You could have a settings helper class that abstracts the retrieval of that URL, so the UI or front end does not know where the URL comes from.
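A minimal sketch of such a helper (the appSettings key name "WebServiceUrl" is an assumption):

```csharp
using System.Configuration;

// Central place to read the service URL; the rest of the application
// never touches web.config directly.
//
// web.config:
//   <appSettings>
//     <add key="WebServiceUrl" value="http://host/service.asmx" />
//   </appSettings>
public static class ServiceSettings
{
    public static string WebServiceUrl
    {
        get { return ConfigurationManager.AppSettings["WebServiceUrl"]; }
    }
}
```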
Anyway, do you plan to change the URL of the web service often? Wouldn't it be better to copy it to a new URL while also keeping it available at the current URL for a while?
Another advantage of the web.config approach is that every time you update and save it, the application is restarted, while a change in a database might take a while to be detected if you have some caching mechanism.
Hope this helps.
Davide.

How can I use one service definition for testing and another for deployment with Flex Builder?

I would like to use different service definitions in a Flex app depending on whether I'm running on:
My local developer machine
The test tier
The QA tier
The production tier
My services are all AMFPHP remote objects, living on different hosts and at different locations depending on which tier I'm on. How can I have my flex app choose the 'correct' tier at runtime to connect to?
Are the definitions actually different, or are they only at different locations on the network?
If they're just different locations on the network, I'd suggest adding some sort of (XML) Configuration file to your Flex app that let you specify the URL of the service endpoint.
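Such a configuration file might look like the following sketch (the structure, tier names, and URLs are all assumptions); the app loads it at startup and picks the entry matching its tier:

```xml
<!-- endpoints.xml, loaded at runtime (e.g. via URLLoader) -->
<endpoints>
  <endpoint tier="dev"  url="http://localhost/amfphp/gateway.php" />
  <endpoint tier="test" url="http://test.example.com/amfphp/gateway.php" />
  <endpoint tier="qa"   url="http://qa.example.com/amfphp/gateway.php" />
  <endpoint tier="prod" url="http://www.example.com/amfphp/gateway.php" />
</endpoints>
```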
If they actually have different service definitions, I'd question why you'd want to develop against something that doesn't match what you'll be running in production.
UPDATE
Here's a link to a good quick reference on how to get started loading an XML document using AS3:
Pixelfumes Flash Blog: Easy XML Parsing using AS3
You can use those techniques to load an XML document containing your URL configurations.
Spring ActionScript allows you to do this by externalizing service endpoint configuration in xml and properties files. I blogged about this here.
Basically, you define your services/remote objects in an external xml file and use placeholders for the properties which you define in a properties file. You don't need to do any parsing yourself since Spring ActionScript does that for you.
