How can I create a singleton ASMX web service? (Please don't say use WCF and WWF :D )
Short Answer: You don't want to.
Long Answer: A request to an .ASMX service will non-deterministically be picked up by a worker thread from the pool, so even if you use the singleton pattern, the lifetime of the singleton will not be known.
Perhaps elaborate on what you want to do, and I can guide you towards the best pattern.
I'm not sure how a singleton solves your performance problem, unless you are caching data inside the instance. In that case, I'd agree with the above suggestion of introducing the cache between the service and the database. Just how mutable is this data?
I won't suggest WCF, but only because you asked us nicely not to.
I will mention that you've found yet another reason to use WCF over ASMX. You might want to keep a list.
You might also want to keep a list of reasons to use ASMX over WCF. You might even want to use the same list for the reasons not to upgrade to .NET 3.5 SP1. It won't be a long list.
There may come a time, when Management wonders why certain things take so long to accomplish, when you'll want to send them your list.
You could use an ashx (HttpHandler). Implement IHttpHandler and set IsReusable to false.
http://neilkilbride.blogspot.com/2008/01/ihttphandler-isreusable-property.html
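A minimal sketch of such a handler (the class name is illustrative):

```csharp
using System.Web;

// Minimal .ashx handler. IsReusable = false tells ASP.NET not to keep and
// reuse this instance, so every request gets a fresh handler.
public class LookupHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return false; }
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";
        context.Response.Write("handled");
    }
}
```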
Depending on what you want to do, maybe you can write the engine as a singleton that's accessed by whatever thread services the ASMX call.
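For example, something along these lines (a sketch; the engine and its Lookup method are illustrative, and the engine must be thread-safe because many worker threads will hit it concurrently):

```csharp
using System.Web.Services;

// Illustrative singleton "engine": created once per AppDomain and shared by
// every thread that services an ASMX request, so it must be thread-safe.
public sealed class Engine
{
    private static readonly Engine instance = new Engine();
    public static Engine Instance { get { return instance; } }

    private Engine() { /* expensive one-time initialization */ }

    public string Lookup(string key)
    {
        return "value-for-" + key; // placeholder for real, thread-safe work
    }
}

[WebService(Namespace = "http://example.com/")]
public class LookupService : WebService
{
    [WebMethod]
    public string Lookup(string key)
    {
        // The service instance itself stays stateless; only the engine is shared.
        return Engine.Instance.Lookup(key);
    }
}
```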
ASP.NET is known to exhibit what is called "thread agility". In short, it means that multiple threads may be employed to fulfill a single request, although not more than one thread at a time. This is an optimization that means a thread waiting for asynchronous I/O may be returned to the pool and used to service other requests.
However, ASP.NET does not migrate all thread-related data when moving a request. Microsoft either forgot to do so, or thought that using thread-local storage (made easy by the ThreadStatic attribute) was something only the people coding ASP.NET themselves should do.
Based on quick googling, it seems to me that the only way to avoid the issue is to rely on HttpContext instead. The context is indeed migrated if ASP.NET decides to switch threads mid-request, so this overcomes the problem. But it creates a brand new headache instead: It ties your application logic to HttpContext, and therefore to a web context. That's not acceptable in all situations (in fact, I'd say it's unacceptable in most). Besides, since HttpContext is sealed and has internal constructors, you cannot mock or stub it, and therefore your logic also becomes untestable.
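For completeness, the HttpContext-based workaround usually boils down to something like this (a sketch; the key name is arbitrary):

```csharp
using System.Web;

// Per-request storage that survives ASP.NET's mid-request thread switches,
// unlike a [ThreadStatic] field. As noted above, this ties the code to System.Web.
public static class RequestScoped
{
    private const string Key = "RequestScoped.Current"; // arbitrary key

    public static object Current
    {
        get { return HttpContext.Current.Items[Key]; }
        set { HttpContext.Current.Items[Key] = value; }
    }
}
```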
According to this (old) blog post, CallContext does NOT work, which is pretty infuriating given that a call context is conceptually precisely a logical thread!
Is there a simple way to reliably implement "per-LOGICAL-thread" isolation that will work in asp.net contexts as well as other contexts?
If not, does anyone know of a lightweight third-party framework that solves the problem? Does StructureMap behave correctly when ASP.NET migrates threads?
I would like a general answer, but in case anyone wonders, the specific use case I'm looking at is the use of Entity Framework in a SharePoint context. We're unfortunately stuck with SP-2010 and EF 3.5 for a while. EF basically requires that data is saved using the same context it was originally read from - or else you have to keep track of changes yourself. I would like to introduce a "current model" concept. The first time the model is called upon while processing an HTTP request it should be instantiated, and then that same model instance should be used for the duration of the request. But the code relying on "Model.Current" should also work if executed in the context of a timer job. I'm fine with the timer job code explicitly disposing of the model when done with it (a task I'd like to give to a handler for HttpApplication.EndRequest in the SharePoint web context).
There may be reasons not to do this, and that's interesting too, but I would still really appreciate learning of a way to achieve "logical thread isolation" in an ASP.NET context, as it'd be remarkably useful.
There is a nice post related to the problem: Implicit Async Context ("AsyncLocal").
If I understood everything correctly, the logical call context, i.e. CallContext.LogicalGetData and CallContext.LogicalSetData, makes it possible to flow immutable data correctly, provided you are on .NET 4.5 or later. The immutability limitation is a nuisance, but still... that's the way to go.
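A minimal sketch of what that looks like (assumes .NET 4.5 or later; the slot name is arbitrary):

```csharp
using System.Runtime.Remoting.Messaging;

// Logical call context flows with the logical thread of execution,
// including across async points; store only immutable data in it.
public static class LogicalAmbient
{
    private const string Slot = "LogicalAmbient.Current"; // arbitrary slot name

    public static void Set(string value)
    {
        CallContext.LogicalSetData(Slot, value);
    }

    public static string Get()
    {
        return CallContext.LogicalGetData(Slot) as string;
    }
}
```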
Currently I'm using Microsoft.Practices.Unity.HierarchicalLifetimeManager as the lifetime manager for my controllers because it calls dispose on the objects it contains. However, it seems that I'm running into cross-threading issues now (multiple request variables are getting mixed up). Reading further into the lifetime manager, it implements a Singleton pattern, which I believe is my problem.
I'm using Unity 2.1. Can anyone recommend the most appropriate lifetime manager to use with ASP.NET MVC controllers, one that will call Dispose on each of its contained objects at the end of each request?
Thanks so much.
I would think any of the lifetime managers that aren't implemented as singletons should work. You'll need to pick the best one for your needs. PerThreadLifetimeManager sounds pretty good, although it doesn't call Dispose. However, its instances will be garbage collected when the thread dies.
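Another commonly used option, if you do want Dispose called at the end of each request, is to keep the HierarchicalLifetimeManager registrations but resolve from a child container created per request and disposed at EndRequest (this is essentially what the Unity.Mvc3 package does). A rough sketch, with illustrative type names:

```csharp
using System;
using System.Web;
using Microsoft.Practices.Unity;

// Illustrative dependency that needs Dispose at the end of the request.
public interface IMyRepository : IDisposable { }
public class MyRepository : IMyRepository { public void Dispose() { } }

public class MvcApplication : HttpApplication
{
    private const string ContainerKey = "PerRequestContainer";

    private static readonly IUnityContainer Root = new UnityContainer()
        .RegisterType<IMyRepository, MyRepository>(new HierarchicalLifetimeManager());

    protected void Application_BeginRequest()
    {
        // One child container per request; hierarchical registrations resolve per child.
        HttpContext.Current.Items[ContainerKey] = Root.CreateChildContainer();
    }

    protected void Application_EndRequest()
    {
        var child = HttpContext.Current.Items[ContainerKey] as IUnityContainer;
        if (child != null)
        {
            child.Dispose(); // disposes the IDisposable instances the child created
        }
    }
}
```

Your controller factory would then resolve controllers from the per-request child container rather than from the root container.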
I need some advice here.
I need to make a webservice, that can make a simple query into my database, from any C# project.
Basically, I'm looking up an itemnumber, and returning the itemname.
It looks like WCF Data Services are really cool, but I'm not sure they make sense in my case. From what I have read, they are good for browsing data sets, whereas I just want to return a single string.
But, at the same time, I don't want to use obsolete services, or services that are dying. From what I can tell, the good old ASP.NET web service (ashx) seems to fall into that category.
So, my question is, what type of webservice should I use?
You could use a standard WCF service (not WCF Data Services).
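A minimal sketch of such a service (the contract and names are illustrative):

```csharp
using System.ServiceModel;

// Illustrative service contract: one operation, one string in, one string out.
[ServiceContract]
public interface IItemService
{
    [OperationContract]
    string GetItemName(string itemNumber);
}

public class ItemService : IItemService
{
    public string GetItemName(string itemNumber)
    {
        // Query your database here; a constant is returned for illustration.
        return "Sample item name";
    }
}
```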
I'm wondering which is the better approach from a performance point of view: is it better to use one web-service method to load data by passing the database table name and keys, or to use a separate method for each database table? Note that I'm calling the .NET ASMX services through AJAX requests.
It's obvious that one method is better from an OO perspective, since it represents a single kind of operation ('data loading'), but what about performance? Is IIS affected by that or not? Also, is it better to have multiple web services (asmx files) or just one?
I really don't think that creating separate methods for fetching data from different tables is necessary. The performance gain or loss you are likely to see by passing an additional table-name parameter to your web service call would be too small to even consider, unless your table names are really huge, which I don't think is the case.
The only reason I would even consider doing something like this is if I had nothing else left to do in terms of performance improvement, or if I were forced to ;-).
If you really want to optimize your request size, try:
serializing your input parameters as JSON, if you are not already doing so (see the sketch after this answer)
using a cookieless domain for your web service
Hope this helps.
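On the server side, marking the .asmx as a script service is what lets AJAX callers exchange JSON with it in the first place. A minimal sketch (class, method, and parameter names are illustrative):

```csharp
using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] lets ASP.NET serialize requests and responses as JSON
// for AJAX callers of this .asmx service.
[WebService(Namespace = "http://example.com/")]
[ScriptService]
public class DataService : WebService
{
    [WebMethod]
    public string LoadData(string tableName, int key)
    {
        // Look up the record here; a placeholder is returned for illustration.
        return "record-for-" + tableName + "/" + key;
    }
}
```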
I don't think the service layer should have any knowledge of database tables, just as you ideally don't want to see data access code in a controller action or an ASPX code-behind.
Personally, I prefer to organize my services to match my domain model.
If I have Customer, Order, and Item classes, for example, I would have corresponding Customer.asmx, Order.asmx, and Item.asmx services to expose selected methods within those classes.
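A minimal sketch of what one of those services might look like (the DTO and its contents are illustrative):

```csharp
using System.Web.Services;

// Illustrative DTO returned by the service.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Customer.asmx code-behind: exposes a domain operation, not raw table access.
[WebService(Namespace = "http://example.com/")]
public class Customer : WebService
{
    [WebMethod]
    public CustomerDto GetCustomer(int customerId)
    {
        // Delegate to the domain/repository layer here; stubbed for illustration.
        return new CustomerDto { Id = customerId, Name = "Sample customer" };
    }
}
```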
Services are typically responsible for exposing business functionality through a contract. I realize ASMX services never really had a concept of "contracts" in the broadest sense; however, you can think of a contract as the set of operations supported by the service. What is your goal here? Do you want to expose tabular data as a service?
Service technology on the Microsoft stack has come a long way since ASMX. Perhaps an obvious question: have you looked at WCF Data Services?
Links:
Exposing Your Data as a Service (WCF Data Services)
Getting Started with WCF Data Services
I'm currently working with web services that return objects such as a list of files, e.g. a File array.
I wanted to know whether it's best practice to bind this type of object directly to my front-end code (for example a Repeater or ListView), or whether to first map it into my own list of a "file class", e.g. customFiles[].
If the web service changes, it will break my front-end code. However, if I create my own CustomFile class, I would only need to change my code in one place to fix the issue. It just seems like a lot of extra work to recreate the same classes the web service already exposes, so I wanted to know what the best practice is for this type of work.
There is a delicate balancing act in properly encapsulating implementation details. Too little encapsulation is a maintenance nightmare as small changes in any area break the application. Too many layers is a different kind of maintenance headache altogether.
In this particular case I would create a small layer in your application to encapsulate the web service calls. This will ease your maintenance in both the application and the service as they will be loosely coupled.
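A rough sketch of such a layer (every service-side name here is an assumption: FileServiceSoapClient stands in for whatever proxy your web reference generated, and its Name/Size properties are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;

// Local model the UI binds to; proxy changes are contained in the gateway below.
public class CustomFile
{
    public string Name { get; set; }
    public long SizeInBytes { get; set; }
}

public class FileServiceGateway
{
    public IList<CustomFile> GetFiles()
    {
        var proxy = new FileServiceSoapClient();  // generated proxy (assumed name)
        return proxy.GetFiles()                   // service call returning the proxy's File[]
                    .Select(f => new CustomFile
                    {
                        Name = f.Name,            // assumed proxy property
                        SizeInBytes = f.Size      // assumed proxy property
                    })
                    .ToList();
    }
}
```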
It sounds like you have already answered your own problem. Best practice is to create your own custom class for the reasons you point out, but it is significant extra work.
If the webservice isn't likely to change then just use the existing classes, but if you need to cater for change then create your own.
Returning a class is fine as long as your client knows how to deserialize it. If it's truly a web service, where you don't have control over both ends of the conversation, it's more common to start with schemas for XML request and response streams. That decouples the client from the web service a bit more and allows any client that can send XML via HTTP and consume an XML response fair game.