IIS 7 replication of web site - asp.net

Is there any way to replicate a website across different servers for load balancing, so that some requests are served from one server and others from another?

Did you at least Google for an answer? This subject is well documented all over the web. For a start, check out HTTP Load Balancing using Application Request Routing on iis.net (an excellent source for anything IIS-related anyway). The blog post IIS7 Load Balancing & Routing Module Now Available! on MSDN also contains a lot of useful links. Instead of ARR you can also use pretty much any other kind of load balancer (e.g. HAProxy).
To make the same content available to all servers in your farm you can simply use a Windows-based file server or any kind of NAS with SMB file sharing. IIS allows you to specify the credentials that will be used when connecting to the file share.
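If you want to script that last part, the virtual directory's physical path and connect-as credentials can also be set through the Microsoft.Web.Administration API. A minimal sketch, assuming a site named "Default Web Site"; the share path and account are placeholders:

    using Microsoft.Web.Administration;

    class ConfigureSharedContent
    {
        static void Main()
        {
            using (ServerManager serverManager = new ServerManager())
            {
                // Point the site's root virtual directory at the shared content
                // and set the account IIS uses to reach the UNC path.
                VirtualDirectory vdir = serverManager
                    .Sites["Default Web Site"]
                    .Applications["/"]
                    .VirtualDirectories["/"];
                vdir.PhysicalPath = @"\\fileserver\webcontent"; // placeholder share
                vdir.UserName = @"DOMAIN\shareuser";            // placeholder account
                vdir.Password = "secret";                       // placeholder password
                serverManager.CommitChanges();
            }
        }
    }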

Yeah, there are heaps of solutions for doing this. The one that Stack Overflow uses is called HAProxy: http://haproxy.1wt.eu/
Typically what you need is some form of reverse proxy that can pipe your requests to the various servers it knows about.
As an aside, here is an interesting link about SO's architecture: http://highscalability.com/blog/2011/3/3/stack-overflow-architecture-update-now-at-95-million-page-vi.html

Yes, there is the Web Farm Framework: http://www.iis.net/download/WebFarmFramework

Related

Hosting static content on a different domain from web services: how to avoid cross-domain issues?

We've recently been working on a fairly modern web app and are ready to begin deploying it for alpha/beta and getting some real-world experience with it.
We have ASP.NET-based web services (Web API) and a JavaScript front-end that is 100% client-side MVC using Backbone.
We have purchased our domain name, and for the sake of this question our deployment looks like this:
webservices.mydomain.com (Webservices)
mydomain.com (JavaScript front-end)
If the JavaScript attempts to talk to the web services on the sub-domain, we blow up with cross-domain issues. I've played around with CORS but am not satisfied with the cross-browser support, so I'm counting it out as an option.
On our development PCs we have used an IIS reverse proxy to forward all requests for mydomain.com/webservices to webservices.mydomain.com, which solves all our problems because the browser thinks everything is on the same domain.
So my question is: in a public deployment, how is this issue most commonly solved? Is a reverse proxy the right way to do it? If so, are there any hosted services that offer a reverse proxy for this situation? Are there better ways of deploying this?
I want to use the CloudFront CDN, as all our servers/services are hosted with Amazon, but I'm really struggling to find info on whether a CDN can support this type of setup.
Thanks
What you are trying to do is make cross-subdomain calls, which is not entirely cross-domain.
There are tricks for that (for example, setting document.domain to the shared parent domain on both pages): http://www.tomhoppe.com/index.php/2008/03/cross-sub-domain-javascript-ajax-iframe-etc/
As for how this issue is most commonly solved, my answer is: it is commonly AVOIDED. In the real world you would set up your domains so that you don't need such workarounds just to get your application running, or you would set up a proxy server to forward the calls for you. JSONP is also a hack-ish solution.
To allow this web service to be called from script using ASP.NET AJAX, add the following line to the web service's code-behind:
[System.Web.Script.Services.ScriptService]
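For context, a minimal sketch of such a script-enabled service; the class and method names are placeholders:

    using System.Web.Services;

    [System.Web.Script.Services.ScriptService]
    public class MyService : WebService
    {
        // [WebMethod] makes the method callable from the generated JavaScript proxy.
        [WebMethod]
        public string Ping()
        {
            return "pong";
        }
    }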
You can simply use JSONP for the AJAX requests; then cross-domain is not an issue.
If an AJAX request returns some HTML, it can be escaped into a JSON string.
The second option is a little awkward, though.
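On the server, JSONP just means wrapping the JSON payload in the callback name the client passed in the query string. A minimal sketch as an ASP.NET handler; the handler name and payload are hypothetical:

    using System.Web;

    public class JsonpHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // The client appends ?callback=someFunction to the request URL.
            string callback = context.Request.QueryString["callback"] ?? "callback";
            string json = "{\"status\":\"ok\"}"; // hypothetical payload
            context.Response.ContentType = "application/javascript";
            context.Response.Write(callback + "(" + json + ");");
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }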
You have two or three layers here.
In the web service's code-behind class, add this attribute (VB syntax): <System.Web.Script.Services.ScriptService()> _
You may also need to add this inside the system.web node of your web.config:
<webServices>
  <protocols>
    <add name="AnyHttpSoap"/>
    <add name="HttpPost"/>
    <add name="HttpGet"/>
  </protocols>
</webServices>
On the client side:
- Add a web reference to the service on the subdomain (e.g. webservices.mydomain.com/svc.asmx); Visual Studio generates the proxy class for you.
- Add the functionality in the master page's, page's or control's code-behind.
- Simply call these functions from the client side.
You can use AJAX functionality with a ScriptManager, or use another library such as jQuery.
If your main website is compiled against .NET 3.5 or older, you need to add a reference to the System.Web.Extensions assembly and declare it in your web.config file.
If you have the bandwidth (network I/O and CPU) to handle this, a reverse proxy is an excellent solution. A good reverse proxy will even cache static calls to help mitigate the network delay introduced by the proxy.
The other option is to set up the proper cross-domain policy files and/or headers. Doing this with some cloud providers can be hard or even impossible. I recently ran into issues with font files and IE not being happy with cross-domain calls. We could not get the cloud storage provider we were using to set the correct headers, so we hosted the fonts locally rather than deal with a reverse proxy.
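If you do control the server, the header route simply means emitting Access-Control-Allow-Origin on the web service responses. A minimal sketch in Global.asax; the allowed origin is a placeholder, and keep in mind the patchy CORS support in older browsers that the question mentions:

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Allow the front-end domain to call these web services cross-origin.
            Response.AppendHeader("Access-Control-Allow-Origin", "http://mydomain.com");
        }
    }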
easyXDM is a cross-domain JavaScript library that may be worth exploring. It makes use of standards when the browser supports them, and abstracts away the various hacks required when the browser doesn't. From easyXDM.net:
easyXDM is a Javascript library that enables you as a developer to
easily work around the limitation set in place by the Same Origin
Policy, in turn making it easy to communicate and expose javascript
API’s across domain boundaries.
At the core easyXDM provides a transport stack capable of passing
string based messages between two windows, a consumer (the main
document) and a provider (a document included using an iframe). It
does this by using one of several available techniques, always
selecting the most efficient one for the current browser. For all
implementations the transport stack offers bi-directionality,
reliability, queueing and sender-verification.
One of the goals of easyXDM is to support all browsers that are in
common use, and to provide the same features for all. One of the
strategies for reaching this is to follow defined standards, plus
using feature detection to assure the use of the most efficient one.
To quote easy XDM's author:
...sites like LinkedIn, Twitter and Disqus as well as applications run
by Nokia and others have built their applications on top of the
messaging framework provided by easyXDM.
So easyXDM is clearly not some poxy hack, but I admit it's a big dependency to take on in your project.
The current state of the web is that if you want to push the envelope, you have to use feature detection and polyfills, or simply force your users to upgrade to an HTML5 browser. If that makes you squirm, you're not alone, but the polyfills are a kind of temporary evil needed to get from where the web is to where we'd like it to be.
See also this SO question.

Load balancing in ASP.NET: what should I consider during development?

While working on an ASP.NET project hosted in a web farm with two front ends and load balancing, we ran into an issue with ASP.NET session state set to "InProc": it does not work properly with load balancing, and we found we should consider using "SQLServer" mode instead.
So I'm wondering whether there are any other points (sessions, caching, security, file uploads, SQL connections, ...) we should take into consideration during development and deployment in such an environment.
Microsoft offers some guidance on this. They have a knowledge base article with links to other resources you'll need.
http://support.microsoft.com/kb/815162
Oh, and as always, ScottGu has an excellent article and a cooler way of doing it. I just found this and it looks very promising:
The Microsoft Web Farm Framework
http://weblogs.asp.net/scottgu/archive/2010/09/08/introducing-the-microsoft-web-farm-framework.aspx
and the more recent Web Farm Framework Site has plenty of resources available. http://www.iis.net/download/webfarmframework
Although I have never used it, I found that the articles by Omar Al Zabir over at CodeProject.com seem rather helpful.
His article "99.99% available ASP.NET and SQL Server SaaS Production Architecture" covers some load balancing topics.
See my answer here regarding things to keep in mind with session state.
It references this article that has lots of good information on session state.
On my development server, I've configured IIS to use 3 worker processes (a web garden) as a poor man's test for our load-balanced environment; it worked a treat.
We don't use session/application data. Our load balancer is configured with address affinity, so requests from the same IP go to the same server, which allows us to cache some user data. Our biggest gotcha was cached data across the farm getting out of sync, which we solved by wrapping the cache with a simple network library that sends 'cached item changed' messages to the other servers.
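A minimal sketch of that idea, assuming each server runs a small listener that evicts the named key when such a message arrives; the host list and port are placeholders:

    using System.Net.Sockets;
    using System.Text;

    public static class CacheSync
    {
        // Tell every peer server that a cached item changed so they can evict it.
        public static void NotifyPeers(string cacheKey, string[] peerHosts)
        {
            byte[] payload = Encoding.UTF8.GetBytes(cacheKey);
            using (UdpClient client = new UdpClient())
            {
                foreach (string host in peerHosts)
                {
                    client.Send(payload, payload.Length, host, 9000); // placeholder port
                }
            }
        }
    }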
There are LOTS of things that you need to take into consideration. Here is an article that goes over many of the considerations involved in moving to a distributed environment:
http://eralokpandey.wordpress.com/2010/03/31/load-balancing-in-asp-net-and-web-farm/

Load Balancing in ASP.NET

I am building a project for the university. When admissions start, a sudden spike of traffic hits my site, around 50,000 to 100,000 users, and the site goes down. How can I manage this? Please provide me with details.
Thank you very much.
There are several different things you should look at:
ASP.NET Caching
Do you have a caching strategy? There are a lot of features in ASP.NET that will allow you to optimize how the server responds to requests. Take a look at:
http://msdn.microsoft.com/en-us/library/xsbfdd8c(v=VS.100).aspx
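For example, page-level output caching lets the server reuse a rendered response instead of re-executing the page on every request. A minimal sketch for an ASP.NET MVC action; the controller and the 60-second duration are placeholders:

    using System.Web.Mvc;

    public class HomeController : Controller
    {
        // Serve a cached copy of the rendered page for 60 seconds.
        [OutputCache(Duration = 60, VaryByParam = "none")]
        public ActionResult Index()
        {
            return View();
        }
    }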
Load balancing is really applied by the web server, not the server-side framework (ASP.NET, etc.). If general optimizations and caching still leave you with problems, look into load balancing and web farms for IIS. Take a look at:
http://learn.iis.net/page.aspx/213/network-load-balancing/
If you have a lot of money, there are hardware load-balancing solutions that work with IIS. That's outside the province of programming and StackOverflow, but it's helpful to know they are out there, especially if you need to have a discussion with management about the pros and cons (expense!!) of which route to take.

How to write an offline version of an AJAX/ASP.NET web application

We have a web application that uses AJAX to talk to an ASP.NET web service. We would like to write another version that can be used offline. We need to be able to re-use our existing code as much as possible. What approaches should we consider?
The app currently uses XMLHttpRequest to get dynamic data from the server. Obviously the offline version will not be able to talk to the server, but it does need to talk to something! I'm sure installing IIS or Cassini on the client would work, but I was hoping for a simpler solution. Is there no other way for JavaScript to talk to some external code?
There are plenty of offline web apps nowadays; they simply evolved from AJAX.
For example:
WoaS (wiki on a stick / stickwiki) and TiddlyWiki,
and Google Docs and Gmail are gaining offline support.
You don't need a web server to run these web apps in offline mode. Just store the required data and scripts on the client side (usually as XML).
One of the possibilities would be to use Cassini. This is a web server that acts as a host for the ASP.NET runtime. You can host Cassini in a Windows application or a Windows service. In this scenario you do not have to rewrite the web app or the web service.
Most other solutions do require a rewrite of both your web app and your web service. Depending on the way you have written the existing app you can reuse more or less of the code.
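A minimal sketch of hosting Cassini from a Windows console app, assuming the Server class from the Cassini sample sources (Cassini.dll); the port and path are placeholders:

    using Cassini;

    class OfflineHost
    {
        static void Main()
        {
            // Serve the site from a local folder on http://localhost:8080/
            Server server = new Server(8080, "/", @"C:\OfflineApp\wwwroot");
            server.Start();
            System.Console.WriteLine("Running. Press Enter to stop.");
            System.Console.ReadLine();
            server.Stop();
        }
    }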
Have you considered HTML5 with application cache and offline storage?
If you hope to create an "offline" version of your package your biggest issue by far will be the need to install your site into a local copy of IIS (registering a virtual directory, etc.). I pursued this briefly a few years ago and gave up in frustration. It can be done: a number of software vendors such as DevExpress do this so you have local copies of their demonstration projects. Indeed, I was able to do this. The problem was the classic "it works on my computer" syndrome. There was simply no way to guarantee that most of my end-users had anywhere near the technical proficiency to make this work.
Thus, I would strongly recommend that you not pursue this path unless you have very technically proficient users and a huge support staff.
But there is one more very important question: did you abstract all data access code to a DAL? If not, then you have a lot of work to do in managing data access as well.
Update: user "Rine" has recommended Cassini. I just wanted to let you know that I pursued Cassini and another third-party web server as well. I think there are licensing issues with Cassini, but I may be wrong; it has been a while. However, I do distinctly remember running into barrier after barrier with this approach, and very little documentation to help me out.
If you want a web application to run offline, you need a web server (IIS for ASP.NET) bound to the localhost (127.0.0.1) address. After this you can access your web application by typing http://127.0.0.1/ in your web browser, the same way as you do online.
If your AJAX relies on XMLHttpRequests, you can:
Make static versions of the XML documents you fetch over XMLHttpRequest and put them in a folder on disk.
Rewrite your XMLHttpRequest URLs so that they point to the files on disk.
Rewrite your XMLHttpRequests so that they don't check the status (it's always 0 for the file:// protocol).
All JavaScript works on file:// pages just as it does on http:// ones.
Of course it's not the best way to develop static pages, but it may save you some time on rewriting.
I haven't come across any framework built specifically for ASP.NET like the ones available for PHP or RoR.
Here is a good article by Steven to get you started with HTML5 and ASP.NET: Creating HTML 5 Offline application
Obviously the offline version will not be able to talk to the server, but it does need to talk to something!
Enter HTML5 localStorage. It works like a small key/value database and enables you to keep data on the client. Admittedly you have to rework parts of your code in JavaScript and move it to the client, but then it will work offline.
Local Storage works like this:
- Setter: window.localStorage.setItem(KEY, VALUE)
- Getter: window.localStorage.getItem(KEY)
- Remove: window.localStorage.removeItem(KEY)
To get the main page working offline you need to create a cache manifest, which is used to store complete sites on the client. Please refer to this for more information about manifests:
http://diveintohtml5.info/offline.html
You want to build a web application to work offline?? It can't be done.
You could split the interface code from the rest (into different DLLs) and create a Windows application that mimics the behaviour of your web application. This way you have two distinct user interfaces but the same code for business rules and data access.
I don't really see any other way...

How do I cluster an upload folder with ASP.NET?

We have a situation where users are allowed to upload content, and then separately make some changes, then submit a form based on those changes.
This works fine in a single-server, non-failover environment; however, we would like some sort of solution for sharing the files between servers that supports failover.
Has anyone run into this in the past? And what kind of solutions were you able to develop? Obviously persisting to the database is one option, but we'd prefer to avoid that.
At a former job we had a cluster of web servers behind an F5 load balancer. We had a very similar problem in that our applications allowed users to upload content, which might include photos and such. These were legacy applications, we did not want to edit them to use a database, and a SAN solution was too expensive for our situation.
We ended up using a file replication service on the two clustered servers. It ran as a service on both machines, using an account that had network access to paths on the opposite server. When a file was uploaded, this backend service synced the file-system folders, making the file available to be served from either web server.
Two of the products we reviewed were ViceVersa and PeerSync. I think we ended up using PeerSync.
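To give a feel for what such a service does at its core, here is a minimal sketch using FileSystemWatcher to mirror new uploads to a peer server's share; the paths are placeholders, and real products such as PeerSync also handle updates, deletes, conflicts and retries:

    using System.IO;

    public class UploadMirror
    {
        private readonly FileSystemWatcher watcher;

        public UploadMirror(string localFolder, string peerFolder)
        {
            watcher = new FileSystemWatcher(localFolder);
            watcher.IncludeSubdirectories = true;
            // Copy each newly uploaded file to the peer server's share.
            watcher.Created += (sender, e) =>
            {
                string target = Path.Combine(peerFolder, Path.GetFileName(e.FullPath));
                File.Copy(e.FullPath, target, true);
            };
            watcher.EnableRaisingEvents = true;
        }
    }

Running new UploadMirror(@"C:\inetpub\uploads", @"\\peer-server\uploads") on each node would mirror uploads in both directions.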
In our scenario, we have a separate file server that both of our front-end app servers write to; that way either server has access to the same set of files.
The best solution for this is usually to provide the shared area on some form of SAN, which will be accessible from all servers and contains its own failover.
This also has the benefit that you don't have to provide sticky load balancing; the upload can be handled by one server and the edit by another.
A shared SAN with failover is a great solution at a great (high) cost. Are there any similar solutions with failover at a reasonable cost? Perhaps something like DRBD for Windows?
The problem with a simple shared filesystem is the lack of redundancy (what if the file server goes down?).
