ASP.NET asynchronous controllers and long polling

I've been looking for IIS-based solutions for comet/push/reverse-Ajax pages, and came upon asynchronous controllers.
It seems like they allow XHR long polling without the problem of running out of threads. Am I correct? Does this allow fairly decent scaling for long-polling pages?
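For reference, the client side of XHR long polling can be sketched like this. The `fetchFn` parameter is injectable so the loop can be exercised without a network; the `/messages/poll` endpoint and the 200/204 convention are hypothetical, not from any particular framework:

```javascript
// Minimal long-poll loop. The server is expected to hold each request open
// until data arrives (200) or a timeout elapses with no data (204), after
// which the client immediately polls again.
async function longPoll(fetchFn, onMessage, { maxRounds = Infinity } = {}) {
  let rounds = 0;
  while (rounds < maxRounds) {
    const res = await fetchFn("/messages/poll");
    if (res.status === 200) {
      // Data arrived: deliver it, then re-poll.
      onMessage(await res.json());
    }
    // On 204 (held request timed out with nothing to send), just re-poll.
    rounds += 1;
  }
}
```

In a browser you would pass the real `fetch` and leave `maxRounds` unbounded; the server side is where async controllers matter, since each held request must not pin a worker thread.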

I would say take a look at SignalR. Node.js on IIS is also an option.

If you are looking for a professional solution with built-in WebSocket support, pick PokeIn.

See ASP.NET MVC 3 real time events.
To make scaling easier and less of an issue, I'd recommend using a dedicated realtime server (see this realtime technology guide); something that has been built from the ground up with realtime communication in mind.

Why do we still use HTTP instead of WebSockets for building Web Applications?

Recently I dived into the topic of WebSockets and built a small application that utilizes them.
Now I wonder why HTTP-based APIs are still being used, or rather, why they are still being proposed.
As far as I can see, there is nothing I can do via HTTP that I couldn't also do with WS, while the other way round WS gives me a lot of improvements.
What would be a real-world example of an application that benefits more from an HTTP-powered backend than from a WS one?
Julian Reschke made good points. The web is document based; if you want your application to play in the WWW, it has to comply with the rules of the game.
Still, you can create WS based SPA applications that comply with those.
Using the HTML5 history API, you can change the URL shown by the browser without causing navigation. That allows you to have a different URL in your address bar depending on the state of your app, enabling bookmarking and page history. The plugin "ui-router" for AngularJS plays very well here, changing the URL if you change the state programmatically, and vice versa.
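A minimal sketch of that history API usage; the state names and URL scheme here are made up for illustration:

```javascript
// Map an application state to the URL we want to show in the address bar.
function urlForState(state) {
  const routes = { inbox: "/inbox", message: "/inbox/message" };
  const base = routes[state.name] || "/";
  return state.id ? `${base}/${state.id}` : base;
}

// In the browser, update the address bar without triggering navigation.
// history.popstate then lets you restore app state on back/forward.
if (typeof history !== "undefined" && history.pushState) {
  history.pushState({ name: "inbox" }, "", urlForState({ name: "inbox" }));
}
```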
You can make your SPA crawlable.
But still you want to use HTTP for some other things, like getting resources or views and cache them using HTTP cache mechanisms. For example, if you have a big application you want some big views to be downloaded on demand, rather than pack everything in a big main view.
It would be a pain to implement your own caching mechanism for HTML to get views and cache them in the local storage for example. Also, by using traditional HTTP requests, those views can be cached in CDNs and other proxy caches.
Websockets are great to maintain "connected" semantics, send data with little latency and get pushed data from the server at any time. But traditional HTTP request is still better for operations that can benefit from distribution mechanisms, like caching, CDN and load balancing.
About a REST API vs a WebSocket API (I think your question was actually about this), it is more a matter of convenience than preference. If your API has a high call rate per connection, a WebSocket probably makes more sense. If your API has a low call rate, there is no point in using WebSocket. Remember that a WebSocket connection, although lightweight, means that something on the server is being held (i.e. connection state), and it may be a waste of resources if the request rate does not justify it.
Bookmarking? Page history? Caching? Visibility to search engines?
HTTP and WebSockets are two Web tools originated to accomplish different tasks.
With HTTP you typically implement the request/response paradigm.
With WebSockets you typically implement an asynchronous real-time messaging paradigm.
There are several applications where you need both the paradigms.
You can also try to use WebSockets for request/response and HTTP for the asynchronous real-time messaging paradigm. While the former makes little sense, the latter is a widespread technique, necessary in all the cases where WebSockets do not work (due to network intermediaries, lack of client support, etc.). If you are interested in this topic, check out this other answer of mine, which tries to clarify the terminology related to these techniques: Is Comet obsolete now with Server-Sent Events and WebSocket?
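As a rough illustration of that fallback idea, transport selection can be reduced to a decision like the following; the environment flags are hypothetical and injected so the logic is testable, whereas real code would probe things like `window.WebSocket` and observe whether the upgrade handshake succeeds:

```javascript
// Prefer WebSocket; fall back to HTTP long polling when the client lacks
// support or an intermediary (proxy, old load balancer) blocks the upgrade.
function chooseTransport(env) {
  if (env.webSocketSupported && !env.proxyBlocksUpgrade) {
    return "websocket";
  }
  return "long-polling";
}
```

Libraries like SignalR and Socket.IO perform essentially this negotiation for you at connection time.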

Architecture For A Real-Time Data Feed And Website

I have been given access to a real time data feed which provides location information, and I would like to build a website around this, but I am a little unsure on what architecture to use to achieve my needs.
Unfortunately the feed I have access to will only allow a single connection per IP address, therefore building a website that talks directly to the feed is out - as each user would generate a new request, which would be rejected. It would also be desirable to perform some pre-processing on the data, so I guess I will need some kind of back end which retrieves the data, processes it, then makes it available to a website.
From a front end connection perspective, web services sounds like it may work, but would this also create multiple connections to the feed for each user? I would also like the back end connection to be persistent, so that data is retrieved and processed even when the site is not being visited, I believe IIS will recycle web services and websites when they are idle?
I would like to keep the design fairly flexible - in future I will be adding some mobile clients, so the API needs to support remote connections.
The simple solution would have been to log all the processed data to a database, which could then be picked up by the website, but this loses the real-time aspect of the data. Ideally I would be looking to push the data to the website every time it changes or new data is received.
What is the best way of achieving this, and what technologies are there out there that may assist here? Comet architecture sounds close to what I need, but that would require building a back end that can handle multiple web based queries at once, which seems like quite a task.
Ideally I would be looking for a C# / ASP.NET based solution with Javascript client side, although I guess this question is more based on architecture and concepts than technological implementations of these.
Thanks in advance for all advice!
Realtime Data Consumer
The simplest solution would seem to be having one component that is dedicated to reading the realtime feed. It could then publish the received data on to a queue (or multiple queues) for consumption by other components within your architecture.
This component (A) would be a standalone process, maybe a service.
Queue consumers
The queue(s) can be read by:
a component (B) dedicated to persisting data for future retrieval or querying. If the amount of data is large you could add more components that read from the persistence queue.
a component (C) that publishes the data directly to any connected subscribers. It could also do some processing, but if you are looking at doing large amounts of processing you may need multiple components that perform this task.
Realtime web technology components (D)
If you are using a .NET stack then it seems like SignalR is getting the most traction. You could also look at XSockets (there are more options in my realtime web tech guide; just search for '.NET').
You'll want to use SignalR to manage subscriptions and then publish messages to registered clients (PubSub; this SO post seems relevant, maybe you can ask for a bit more info).
You could also look at offloading the PubSub component to a hosted service such as Pusher, who I work for. This will handle managing subscriptions and component C would just need to publish data to an appropriate channel. There are other options all listed in the realtime web tech guide.
All these components come with a JavaScript library.
Summary
Components:
A - .NET service - that publishes info to queue(s)
Queues - MSMQ, NServiceBus etc.
B - Could also be a simple .NET service that reads a queue.
C - this really depends on D since some realtime web technologies will be able to directly integrate. But it could also just be a simple .NET service that reads a queue.
D - Realtime web technology that offers a simple way of routing information to subscribers (PubSub).
If you provide any more info I'll update my answer.
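A minimal in-process sketch of the A → queue → B/C flow described above, with an array-backed queue standing in for MSMQ/NServiceBus; all names here are illustrative, not real APIs:

```javascript
// Trivial publish/subscribe queue. A real deployment would use a durable
// message queue (MSMQ, NServiceBus, etc.) between separate processes.
class Queue {
  constructor() { this.subscribers = []; }
  subscribe(fn) { this.subscribers.push(fn); }
  publish(msg) { this.subscribers.forEach((fn) => fn(msg)); }
}

const feedQueue = new Queue();

// Component B: persist the data for future retrieval/querying
// (here, just an in-memory array stands in for the database).
const store = [];
feedQueue.subscribe((msg) => store.push(msg));

// Component C: fan each update out to connected realtime subscribers,
// i.e. whatever component D (SignalR, XSockets, Pusher, ...) exposes.
const connectedClients = [];
feedQueue.subscribe((msg) => connectedClients.forEach((c) => c.send(msg)));

// Component A: the single dedicated feed reader publishes each update.
feedQueue.publish({ vehicle: 7, lat: 51.5, lng: -0.12 });
```

The key property is that only component A ever touches the feed, so the one-connection-per-IP restriction is respected no matter how many web clients are attached.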
A good solution to this would be something like http://rubyeventmachine.com/ or http://nodejs.org/ . It's not ASP.NET, but it can easily solve the issue of distributing real-time data to other users. Since user connections, subscriptions, and broadcasting to channels are built into each, coding the rest is super simple. Your clients would just connect over standard TCP.
If you needed clients to poll for updates then you would need a queue system to store info for the next request. That could be a simple array, or a more complicated queue system depending on your requirements and number of users.
There may be solutions for .NET that I am not aware of that do the same thing, but those are the two I know of.

ASP.NET Comet Approach vs WCF Callback

I want / have to implement a chat (like Facebook's) in my web app. Through my research I've found two different approaches and I don't know which one I should take (to be honest I tried one, but it showed strange behavior; let me come to this later).
Some facts about my application: it has two different clients. One is a web application, the other one is a WPF client. Both clients should have the chat implemented. They communicate with a server via WCF services.
So, as I found out there is the comet approach which means AJAX Long Polling. On the other hand I can use WCF Callback Services.
Are there any dis/advantages of the WCF callback / Comet approach?
For me the callback approach is pretty straightforward; AJAX long polling sounds much more complicated.
Thanks in advance
I would suggest trying Reverse Ajax with PokeIn and benefiting from its built-in WebSocket feature. You don't need anything else.
Since you are looking at MS technology you'll probably be interested in reading this post by Brian Raymor who is a Senior Program Manager in the Windows Networking group at Microsoft.
WebSockets: Stable and Ready for Developers
SignalR is a good solution since it will choose a transport type suitable for the web browser that is making a connection. Its WebSocket transport won't work with IIS until Windows Server 8, though.
Your other options are XSockets, SuperWebSocket, and more. See this realtime web tech guide (disclaimer: I maintain it).
You could also look at a hosted service. Again, there are options available in the guide I've provided a link to above.
I would check out SignalR for the Web application side at least http://geekswithblogs.net/jeroenb/archive/2011/12/14/signalr-starter-application.aspx

Is Silverlight more friendly to load balancing than ASP.NET?

I was discussing load balancing with a colleague at lunch. I admit that I know very little about this topic. We were discussing the various ways of maintaining session in an ASP.NET application, none of which suited the high-performance load balancing he was looking for.
What about Silverlight? says I. As far as I know it is stateless, you've got the app running in the browser and you've got services on the server that feed/process data.
Does Silverlight totally negate the need for Session state management? Is it more friendly to load-balancing? Is it something in between?
I would say that Silverlight is likely to be a little more load-balancer friendly than ASP.NET. You have much more sophisticated mechanisms for maintaining state (such as isolated local storage), and pretty much, you only need to talk to the server when (a) you initially download the application, and then (b) when you make a web service call to retrieve or update data. It's analogous in this sense to an Ajax application written entirely in C#.
To put it another way, so long as either (a) your server-side persistence layer knows who your client is, or (b) you pass in all relevant data on each WCF call, it doesn't matter which web server instance the call goes to. You don't have to muck about with firewall-level persistence to make sure your HTTP call goes back to the right web server.
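A sketch of that stateless style, where every call carries the client's identity and all the data the operation needs; the endpoint path and token field are hypothetical:

```javascript
// Build a self-contained service call. Because identity and payload travel
// with every request, any web server instance behind the load balancer can
// handle it, with no sticky sessions or server-side session state required.
function buildCall(token, operation, payload) {
  return {
    url: `/services/app/${operation}`,
    body: JSON.stringify({ token, ...payload }),
  };
}
```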
I'd say it depends on your application. If it's a banking application, then yes, I want something timing out after 5 minutes and asking for my password again. If it's Facebook, then not so much.
Silverlight depends on XMLHttpRequest like any other Ajax implementation and is therefore capable of maintaining a session, forms authentication, roles, profiles, etc.
The benefit you are getting is obviating virtually all of the page traffic. JSON requests are negligible compared to serving pages. Even the .xap can be cached on the client.
I would say you are getting the best of both worlds in regards to your question.

Web Database or SOAP?

We’ve got a back office CRM application that exposes some of its data in a public ASP.NET site. Currently the ASP.NET site sits on top of a separate, cut-down version of the back office database (we call this the web database). Daily synchronisation routines keep the databases up to date (hosted in the back office). The problem is that the synchronisation logic is very complex and time-consuming to change. I was wondering whether using a SOAP service could simplify things? The ASP.NET web pages would call the SOAP service, which in turn would do the database calls. There would be no need for a separate web database or synchronisation routines. My main concern with the SOAP approach is security, because the SOAP service would be exposed to the internet.
Should we stick with our current architecture? Or would the SOAP approach be an improvement?
The short answer is yes: web service calls would be better and would remove the need for synchronization.
The long answer is that you need to understand the web service technology available to you. I would highly recommend looking into WCF, which will allow you to do exactly what you want, and to expose your services only to the ASP.NET web server rather than to the entire internet.
There would be no security problem. Simply use one of the secure bindings, like wsHttpBinding.
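A minimal sketch of such a configuration; the service and contract names (`Crm.PublicService`, `Crm.IPublicService`) are hypothetical placeholders for your own types:

```xml
<!-- wsHttpBinding with message-level security; credentials and messages
     are protected end to end rather than relying on transport alone. -->
<system.serviceModel>
  <bindings>
    <wsHttpBinding>
      <binding name="secure">
        <security mode="Message">
          <message clientCredentialType="UserName" />
        </security>
      </binding>
    </wsHttpBinding>
  </bindings>
  <services>
    <service name="Crm.PublicService">
      <endpoint binding="wsHttpBinding" bindingConfiguration="secure"
                contract="Crm.IPublicService" />
    </service>
  </services>
</system.serviceModel>
```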
I'd look at making the web database build process more maintainable.
Since security is obviously a concern, you need to add logic to limit the types of data and requests, and that logic has to live SOMEWHERE.
