We have a Flex application which relies heavily on data-driven content supplied via ASP.NET. Currently the majority of this data is provided via ASP.NET objects which are XML serialised and sent via a simple ASHX handler. This is then parsed via e4x in singleton classes to populate either itself or arrays of subclasses, which are then available to the rest of the application without making additional data calls.
This works but is it the best way? I've read quite a few articles discussing the subject but couldn't really find any consensus.
Should I look into converting these to web services? If so, how should I manage the bindings: import them automatically via Flex, or build my own? What are the pros and cons? An important factor in this decision is speed; lowest latency and highest throughput are essential.
As a separate matter, our application doesn't sit at the root of the domain, and during local development it makes data calls to our development servers. As a result we add flash vars to the application to specify the appRoot, which is then appended to the service URL as necessary.
MyService.url = GeneralData.ApplicationRootUrl + "Services/foobar.ashx";
Is this the best way? I have since discovered the rootURL property; should I be using this, and how does it work in this context? If I were to convert the services to web services, how would I go about implementing the same functionality to allow local development?
Many thanks
This works but is it the best way?
Best is very subjective and depends on your situation. If at all possible, I would recommend you use an AMF gateway. That way your objects convert directly from server-side objects (.NET classes) to client-side objects (AS3 classes). This is a big time saving because you don't have to create your XML manually on the back end, nor process it manually on the front end. Also, the binary AMF format gives much smaller packets than XML or a SOAP web service would.
For .NET AMF options, I'd look into WebORB or FluorineFx.
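As a rough illustration (all names below are made up, and each gateway has its own registration and configuration steps, so check the product's docs), the server side of an AMF service is typically just a plain .NET class that the gateway exposes and serializes to AMF for you:

using System.Collections.Generic;

// Illustrative DTO; the gateway maps it to a typed AS3 class on the client.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Illustrative service class; the AMF gateway (WebORB, FluorineFx, ...)
// handles invocation and binary serialization, so no manual XML is needed.
public class ProductService
{
    public IList<Product> GetProducts()
    {
        return new List<Product>
        {
            new Product { Id = 1, Name = "Widget" }
        };
    }
}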
A Flex application is always loaded in the browser, so you can use a relative URL and the application will connect to the same server it was loaded from.
MyService.url = "/Services/foobar.ashx";
"/" will certainly append host where it came from. And it is always good practice to connect to same host where the flash is loaded from.
Secondly, SOAP web services use XML serialization, so whether you use your own handler with e4x parsing or the SOAP web service generator in Flash Builder, the speed will be almost the same. A SOAP web service will certainly be a little slower, but the difference will be in the range of microseconds to milliseconds.
However, with web services your development speed will improve, as you will not have to create proxy classes yourself.
Related
I'm having problems because of a poorly written third-party library which our system heavily depends on. This library is not thread-safe (because of some bugs and static variables) and I need to use it in an ASP.NET web service, which handles each user request in a separate thread.
I've tried many solutions for this problem. The best solution for now is, in my opinion, to let subprocesses handle the requests. One subprocess will listen for and handle the requests of one user, so I can synchronize access to the library code on a per-user basis, which is much better than anything I can do when static variables are shared between requests.
How can I route requests received via IPC to the appropriate WebMethods without reinventing the wheel? If possible, I would like to use the .NET classes that handle this in a normal ASP.NET web service, but I'm having a hard time finding their names.
TL;DR: I have a class MyWebService (that inherits from System.Web.Services.WebService) with some methods marked with WebMethodAttribute and I want to pass a made-up HttpRequest (or HttpContext) to it and tell it "handle it like you're receiving this from a real HTTP server, despite the fact the current process is a console application".
First, you may want to consider using WCF instead of ASMX, which is a legacy technology, kept only for backwards compatibility.
Second, you have another option: ensure that only a single thread ever uses the third-party library at a time. Placing lock blocks around all access to the third-party library may solve the problem.
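As a minimal sketch of that option (LegacySync and LegacyLibrary are made-up names), you can funnel every call to the library through a single lock so that only one request thread ever touches it:

using System;

public static class LegacySync
{
    private static readonly object Gate = new object();

    public static TResult Run<TResult>(Func<TResult> call)
    {
        // Only one thread at a time may enter; other request threads block here.
        lock (Gate)
        {
            return call();
        }
    }
}

// Usage inside a WebMethod (LegacyLibrary is the hypothetical unsafe library):
// var result = LegacySync.Run(() => LegacyLibrary.DoWork(input));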
The intent is to create a set of web services that people can reuse. These services mostly interact with a backend DB, creating, retrieving and processing data.
We want to expose services that people can use to create data mashups and other applications.
End users are web pages that can be within our domain or outside it. For pages outside the domain we plan to release widgets that would be configured to retrieve and display the data.
One requirement: the application should be extremely scalable in terms of the number of users it can handle.
Our code base is .NET and we are looking at ASPX WebMethods (or ASHX handlers), ASMX WebMethods and WCF (we are just starting to read up on WCF).
In terms of security/access, I found that maintaining session IDs and membership is doable in all three. WCF seems a bit complicated to set up. I could not immediately see the value of ASMX when we can get it all done just using a WebMethod in ASPX (with a little tweaking).
Also, I am assuming that with ASP.NET MVC 2 I might be able to get clean URLs for these WebMethods as well.
Questions
Which one will be the most effective in terms of performance and scalability?
Any reason why I should choose WCF or ASMX?
Thank you for taking the time to read through this post and apologies for the naive questions since I am new to .net.
EDIT: I kind of understand that WCF is the way to go. Just to understand the evolution of the technologies, it would be good if someone could explain why an ASPX WebMethod is different from an ASMX one when similar things (apart from discovery) can be accomplished by both. ASPX WebMethods can be made to return data in other formats (plain text, JSON). Also, it seems that we can build RESTful services using ASHX. Apologies again for the naive questions.
You should use WCF for developing web services in .NET. WCF is highly configurable, with many options for security, transport protocols, serialization, extensions, etc. Raw performance is also significantly higher. WCF is being actively developed, with many new features added in versions 3.5 and 4, and there are variations such as WCF Data Services and WCF RIA Services. WCF 4.0 also has better REST and JSON support, which you can use directly from ASP.NET / jQuery.
ASMX is considered deprecated technology and replaced by WCF. So if you are going to start new development which requires exposing reusable services, WCF is the way to go.
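For a feel of what a basic WCF service looks like (the names are illustrative, and the endpoint/binding configuration that goes in web.config is omitted):

using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Order
{
    [DataMember]
    public int Id { get; set; }
}

// The contract is an interface; WCF exposes it over whatever bindings you configure.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    Order GetOrder(int id);
}

public class OrderService : IOrderService
{
    public Order GetOrder(int id)
    {
        return new Order { Id = id };
    }
}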
I am not necessarily disagreeing with the previous answer. But, from a different perspective, WCF is tricky to configure. It requires bindings, endpoints, packet sizes and a lot of other confusing parameters in your configuration files, and there are many serialization/deserialization issues reported. WCF is also a relatively new technology, and therefore still exposed to bugs and needed patches.
The client-generated Reference.cs files might have unwanted interfaces, and each public property of a client class exposed in the WSDL gets generated with the same observer pattern that LINQ to SQL or Entity Framework uses (OnChanged, OnChanging, etc.), so this adds a lot of fat to the client code compared with the traditional SOAP web client approach.
My recommendation: if you aren't using remoting over TCP and you don't need the two-way notification mechanism for remote changes (both very cool features of WCF), you don't need to use it.
I'm currently working with web services that return objects such as a list of files, e.g. a File array.
I wanted to know whether it's best practice to bind this type of object directly to my front-end code (for example a repeater/listview) or to first parse it into my own list of "file class" objects, e.g. customFiles[].
If the web service changes, it will break my front-end code. If I create my own CustomFile class, I would only need to change my code in one place to fix the issue, but it seems like a lot of extra work to recreate the same classes the web service already defines. I wanted to know what the best practice is for this type of work.
There is a delicate balancing act in properly encapsulating implementation details. Too little encapsulation is a maintenance nightmare as small changes in any area break the application. Too many layers is a different kind of maintenance headache altogether.
In this particular case I would create a small layer in your application to encapsulate the web service calls. This will ease your maintenance in both the application and the service as they will be loosely coupled.
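As a sketch of such a layer (ServiceFile, FileServiceClient and the rest are made-up stand-ins for whatever your proxy generator produces), the mapping lives in one place:

using System.Collections.Generic;
using System.Linq;

// Stand-ins for the types a web service proxy would generate (illustrative only).
public class ServiceFile
{
    public string Name;
    public long Size;
}

public class FileServiceClient
{
    public ServiceFile[] ListFiles()
    {
        return new[] { new ServiceFile { Name = "a.txt", Size = 42 } };
    }
}

// The application's own type, decoupled from the service contract.
public class CustomFile
{
    public string Name { get; set; }
    public long Size { get; set; }
}

// Thin layer: the only place in the app that touches the service types.
public class FileRepository
{
    public IList<CustomFile> GetFiles()
    {
        ServiceFile[] raw = new FileServiceClient().ListFiles();

        // If the service contract changes, only this mapping changes.
        return raw.Select(f => new CustomFile { Name = f.Name, Size = f.Size })
                  .ToList();
    }
}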
It sounds like you have already answered your own question. Best practice is to create your own custom class for the reasons you point out, but it is significant extra work.
If the web service isn't likely to change, just use the existing classes; but if you need to cater for change, create your own.
Returning a class is fine as long as your client knows how to deserialize it. If it's truly a web service, where you don't have control over both ends of the conversation, it's more common to start with schemas for the XML request and response streams. That decouples the client from the web service a bit more and makes any client that can send XML via HTTP and consume an XML response fair game.
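As a small illustration of that schema-first style (the FileListResponse shape here is invented), the client deserializes the agreed XML with XmlSerializer and never sees the server's classes:

using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot("FileListResponse")]
public class FileListResponse
{
    [XmlElement("File")]
    public FileEntry[] Files { get; set; }
}

public class FileEntry
{
    [XmlAttribute("name")]
    public string Name { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        // In practice this XML would arrive over HTTP.
        const string xml =
            "<FileListResponse><File name=\"a.txt\" /></FileListResponse>";

        var serializer = new XmlSerializer(typeof(FileListResponse));
        var response = (FileListResponse)serializer.Deserialize(new StringReader(xml));

        Console.WriteLine(response.Files[0].Name); // prints a.txt
    }
}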
I used to use soap webservices for transferring chart data to my flex app, but recently switched over to using BlazeDS because of performance, convenient typing, etc.
I'm considering switching over to using JSON (as I do in other parts of the app) for these reasons:
Proliferation of DTOs for communicating with flex.* (With JSON, I just use JsonConfig to exclude properties as desired.)
Difficult to debug (whereas JSON is good ol' plaintext).
Problems with load balancing without sticky sessions.
Anyone else run into these problems with BlazeDS? Is BlazeDS worth the hassle?
* I could use the Externalizable interface instead of distinct DTOs, but it's also a pain.
I wouldn't give up on remoting. The performance of remoting will be much better than JSON. Remember that ActionScript doesn't have a built-in method to decode JSON, so you'd need to use an AS library, which will be slower than anything built into the player. You'd be better off using XML than JSON.
You should be able to exclude specific properties as desired by marking them as transient. ActionScript has [Transient] metadata, and the idea came from Java. The C# library we use for remoting has Transient support; I'm sure BlazeDS does too.
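As a hedged sketch of that idea (the TransientAttribute below is a placeholder defined locally so the snippet compiles; a real remoting library such as FluorineFx or WebORB supplies its own, so check your library's docs for the exact name):

using System;

// Placeholder only; your remoting library ships its own transient marker.
[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field)]
public sealed class TransientAttribute : Attribute { }

public class ChartPoint
{
    public double X { get; set; }
    public double Y { get; set; }

    // The AMF serializer would skip members marked transient (library-dependent).
    [Transient]
    public string DebugLabel { get; set; }
}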
Debugging is easy with the right tools. You should get Charles. It provides very nice views of AMF request and response messages (assuming you're using HTTP and not RTMP; I don't know about RTMP debugging).
http://www.charlesproxy.com/
You also seem to be choosing between BlazeDS and anything-not-remoting. You have more options. BlazeDS is just one remoting implementation that Adobe made available. They also have a commercial one. There are also many open-source remoting projects available. We use a wonderful one for C# called Fluorine. Open-source Java options are Red5 and OpenAMF, but I think there are others as well.
http://red5.org/
http://openamf.com/
There's also a distinction between RTMP and HTTP remoting. You can get data into Flex through either of these protocols, and each has its advantages and disadvantages. I personally prefer HTTP remoting unless you absolutely need the functionality RTMP provides (push, streaming). HTTP will be easier to debug and should not have problems with a load balancer; it's just HTTP calls where the content happens to be binary.
I've been a longtime ASP.NET developer in the web forms model, and am using a new project as an opportunity to get my feet wet with ASP.NET MVC.
The application will need an API so that a group of other apps can communicate with it. Prior to this, I've always built APIs using a standard web service.
As a sidenote, I'm a little hesitant to plunge headfirst into the REST style of creating APIs, for this particular instance at least. This application will likely need a concept of API versioning, and I think the REST approach, where the API is essentially scattered across all the controllers of the site, is a little cumbersome in that regard. (But I'm not completely opposed to it if there is a good answer to the potential versioning requirement.)
So, what say ye, Stack Overflow denizens?
I'd agree with Kilhoffer. Try using a "Facade" wrapper class that implements an "IFacade" interface. In your Facade class, put the code that consumes your web service. That way your controllers simply make calls to the Facade. The plus side of this is that you can swap in a "DummyFacade" that implements the same IFacade interface but doesn't actually talk to the web service and just returns static content. This lets you do some unit testing without hitting the service. It's basically the same idea as the Repository pattern.
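A minimal sketch of that arrangement (all names invented):

using System;

public interface IFacade
{
    string GetContent(int id);
}

// Real implementation: wraps the web service call.
public class ServiceFacade : IFacade
{
    public string GetContent(int id)
    {
        // Call the actual web service proxy here (omitted).
        throw new NotImplementedException();
    }
}

// Test double: returns canned data so unit tests never hit the service.
public class DummyFacade : IFacade
{
    public string GetContent(int id)
    {
        return "static content";
    }
}

// A controller depends only on the interface and can take either one.
public class ContentController
{
    private readonly IFacade _facade;

    public ContentController(IFacade facade)
    {
        _facade = facade;
    }

    public string Index(int id)
    {
        return _facade.GetContent(id);
    }
}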
I would still recommend a service layer that can serve both client-side and server-side consumers, possibly even returning data in a variety of formats depending on the caller.
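As a sketch of that idea using WCF's web programming model (the names and URI templates are invented, and hosting/configuration is omitted), a single contract can serve the same data as JSON or XML:

using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IDataService
{
    // Same data, two representations; the consumer picks the endpoint.
    [OperationContract]
    [WebGet(UriTemplate = "files/json", ResponseFormat = WebMessageFormat.Json)]
    string[] GetFilesAsJson();

    [OperationContract]
    [WebGet(UriTemplate = "files/xml", ResponseFormat = WebMessageFormat.Xml)]
    string[] GetFilesAsXml();
}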