I use Visual Studio 2013's "Publish" and Web Deploy to publish my ASP.NET MVC 5 website. However, while an update is being uploaded, the website stops working. I'm looking for a way to minimize the site's downtime during the update. The website runs on a VPS and I have full access to it. One solution that came to mind is to configure Web Deploy to upload the files to a temporary folder first and, once the upload is finished, swap in the new files. That would make the update take a few seconds at most. I can do this manually, but that's not an elegant way to update one's website.
PS: Maybe there are better ways to update the website, but so far I like Web Deploy. It's much faster than FTP, for instance.
One of the most interesting approaches I've seen is to have two websites, only one of which is running at a time. After finishing the upload, you disable the active one and enable the one you uploaded to.
This works well if both are in the same Application Pool, and it even works with sessions if you want (see How to: Configure SQL Server to Store ASP.NET Session State).
I've never done it this way myself; it seems a little too complex for the minimal downtime there actually is, but it's one way.
Here's the solution that we use on a high-traffic website with 4 web servers.
1) Files are moved to the server into /site/version-xxx
2) The IIS Web Application is re-pointed to the new version (see the sketch below).
All of this is automated and synchronized across the web servers, and the end user doesn't notice any difference. (We don't rely on sessions to persist the user experience; if sessions are a must for you and you don't want to interrupt them, consider storing them in an external system that won't flush them when the websites are repointed.)
This approach also allows us to roll back to any previous version.
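The repointing step can be a one-liner in PowerShell. A minimal sketch, assuming a site named MySite and the C:\site\version-xxx layout described above (both names are hypothetical):

```powershell
# Repoint the IIS site at the newly uploaded version folder. The old
# version stays on disk, so rolling back is just another repoint.
Import-Module WebAdministration

$newPath = "C:\site\version-124"   # hypothetical new version folder
Set-ItemProperty "IIS:\Sites\MySite" -Name physicalPath -Value $newPath
```

The swap is effectively instant from the user's point of view, since no files are copied at switch time.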
I have a problem deploying a .NET Core application via FTP to a site hosted on IIS.
The main DLLs (the core application) that I want to update just won't upload; FTP just gives me a generic permission error message. I think the reason is that they are in use, because when I stop the application pool, upload, and restart it, everything works just fine.
But this isn't really a solution. Are there any other methods of publishing that would alleviate this problem?
Edit:
"open for write: failure"
is the only error I'm getting. I can't find anything about it online, and the only workaround I have is restarting the app pool.
I found an answer and I figured it should be here for future Googling.
The issue is, as I first suspected, that IIS proxies the request to Kestrel, which means the process is in use as far as Windows is concerned. There are three solutions.
The Good Solution
Have two (or more) VMs on Azure behind a load balancer, and have a script which turns the sites off one at a time, does what it needs to do, and turns them back on. Do this right and no downtime!
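A rough sketch of that rolling pattern, assuming two IIS nodes reachable over PowerShell remoting and a file-share drop (the server names, pool name, and paths are all hypothetical, and a real setup would also drain connections at the load balancer first):

```powershell
# Roll the new build across the nodes one at a time so the load
# balancer always has at least one healthy node serving traffic.
$servers = "web01", "web02"

foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        Import-Module WebAdministration
        Stop-WebAppPool -Name "MyAppPool"            # take this node out
        robocopy "\\deploy\drop\latest" "C:\inetpub\mysite" /MIR
        Start-WebAppPool -Name "MyAppPool"           # bring it back up
    }
}
```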
Intermission
Before I talk about the other solutions, a little explanation. I haven't been working with .NET for long, but apparently there's this thing where you add an app_offline.htm and it temporarily takes the site down for you.
In the context of IIS and .NET Core it also releases the process, which is really useful, as it solves my problem! Although I had to visit the web page first for it to take effect, unless I'm mistaken.
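The mechanism is just a file dropped in the site root. A minimal sketch of taking a site down and back up this way (the paths are placeholders):

```powershell
# app_offline.htm in the site root takes the site offline and, on IIS
# hosting .NET Core, releases the worker process so files can be replaced.
$site = "C:\inetpub\mysite"   # placeholder site root

Set-Content "$site\app_offline.htm" "<html><body>Back in a minute.</body></html>"
# ... upload or copy the new build here ...
Remove-Item "$site\app_offline.htm"   # site comes back online
```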
The Bad Solution
Use an automated script to rename _app_offline.htm to app_offline.htm, do the upgrade, and then revert that change. It takes your site down, which is kind of ugly, but scripting is always better than...
The Ugly Solution
You only have access to FTP, no remote admin or proper deployment process because... reasons.
Upload an app_offline.htm, upload as little as possible and hope it doesn't break anything before deleting or renaming app_offline.htm.
Also, you would have to perform any DB migrations by enabling automatic migrations (AutomaticMigrationsEnabled = true), because you have no server access or scripting methods.
We have quite a big project at work which sometimes needs to be published during the day. However, the application will (of course) crash during the publish and show errors, because the /bin folder is being overwritten.
Is there a way to avoid this? Sometimes a publish takes up to 3-4 minutes.
I know I can use the app_offline.htm file to display a message instead of them seeing the actual errors.
There isn't a way to get zero downtime on a deployment, but you have some options.
Deploy a pre-compiled site
To reduce the time it takes for a site to become active after deployment, you can use ASP.NET Web Site Pre-compilation. This process packages the site so that it will not need to be compiled on-the-fly after deployment (a sample command follows the lists below).
Benefits
Faster site startup
Fewer assets to deploy
Can be packaged via the Web Deployment Tool
Drawbacks
Debugging can be harder, as the names are scrambled; Debug mode can still be turned on for it, though
You cannot edit a pre-compiled site; you must do a full re-deployment
You should still use app_offline.htm to avoid yellow-screen-of-death errors
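For reference, pre-compilation can be done with the aspnet_compiler.exe tool that ships with the .NET Framework. A sketch, with placeholder paths:

```powershell
# Pre-compile the site so nothing has to compile on-the-fly after
# deployment. Source and target paths are placeholders.
& "$env:WINDIR\Microsoft.NET\Framework64\v4.0.30319\aspnet_compiler.exe" `
    -v / -p "C:\src\MySite" "C:\build\MySite-precompiled"
```

The output folder is what you deploy, e.g. by packaging it with the Web Deployment Tool.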
Add another web server
The best option for these scenarios is to have 2 servers with a load balancer in front of them. That way you can use the load balancer to direct live traffic to one node while the new application is being deployed to the other.
Benefits
Parallel deployments do not affect the existing site
Have double the capacity for future expansion
Load balancing allows the load to be spread evenly across servers.
Drawbacks
Session state now needs to be stored out-of-process. Therefore, you must check that you can switch to an out-of-process session store without serialization problems. StateServer is a quick one to get started with locally (see the sketch after this list).
More maintenance overhead for a deployment, as you need to include the load-balancer configuration in your deployment process. Network admins are normally familiar with this.
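One way to flip a site to StateServer sessions from PowerShell, assuming a site named MySite (hypothetical) and the ASP.NET State Service running on the box:

```powershell
# Switch the site's session state to the out-of-process StateServer
# mode; the site name is an assumption.
Import-Module WebAdministration

Set-WebConfigurationProperty -PSPath "IIS:\Sites\MySite" `
    -Filter "system.web/sessionState" -Name "mode" -Value "StateServer"
```

Any object placed in session must be serializable for this to work, which is exactly the check the drawback above refers to.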
When one updates an ASP.NET MVC app in IIS, the framework keeps the open connections alive, and all responses to those connections are sent once IIS has caught up. Unfortunately this can take some time (e.g. 15 seconds). Is there any way to update part of the app without affecting connections to another part?
An example use case: if you have a web chat app and you want to make a minor change to one section of the website, can it be done without 'pausing' the connections to the chat app?
If you can physically separate the code into its own folder (e.g. c:/inetpub/wwwroot/myapp and c:/inetpub/wwwroot/myapp/chatapp), you could define "chatapp" as its own application within the IIS website and then create a new application pool just for that application. I had to do this before because part of the IIS site needed a different recycle schedule due to performance issues; it also crashed a lot, so it was advantageous for it to have its own process so it didn't take everything else down with it :)
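Setting that up can be scripted. A sketch, assuming the site is actually named "myapp" and using a hypothetical pool name:

```powershell
# Give the chat app its own application and worker process so the rest
# of the site can be recycled or redeployed without touching it.
Import-Module WebAdministration

New-WebAppPool -Name "ChatAppPool"
New-WebApplication -Site "myapp" -Name "chatapp" `
    -PhysicalPath "C:\inetpub\wwwroot\myapp\chatapp" `
    -ApplicationPool "ChatAppPool"
```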
We have DLLs that contain hundreds of custom client processes that are kicked off from an ASP.NET application. Our clients run these processes while performing data entry, and typically there's only 1 process per client. On any given day, we might update 2 or 3 of these processes.
Currently these are all housed in a series of DLLs, which means that we are publishing our application a couple of times per day. As a result, any logged-in clients get booted out of the system, since the publish causes an app restart.
Is there a way that we can update these DLLs without requiring a full publish each time?
If your client processes have a common API, then you could host the DLLs separately in a WCF (or similar) service and call the client processes remotely. So basically, consider moving to a service-oriented architecture.
Check out the Managed Extensibility Framework (MEF) from Microsoft. It provides not only dependency management but also plug-in-style library loading. Most likely it's exactly what you're looking for.
You can switch to SQL Server or State Server sessions in order to preserve sessions and logged-in users across an app restart. Or store these DLLs in App_Data and load them dynamically; then, of course, you have to devise some refresh scheme to replace the loaded DLLs with newly uploaded ones.
There is no sensible way to avoid an application restart. Please note the emphasis on the word sensible.
I want to update an ASP.NET web application (including web.config file changes and database scripts) on multiple production environments, ideally with the click of a button. I do not have direct network connectivity to any of them, which I think means the application servers will have to "pull" the information required for the update and run a script that updates the application residing on the server.
Basically, I need a way to "publish" an update, and the servers see that update and automatically download and run it.
I've thought about setting up an SFTP server for publishing updates and developing a custom tool, installed on each production environment, that checks the SFTP server every day and downloads application files if they are available. That would at least get the required files onto the servers, and I could use xcopy/robocopy and Migrator.NET to apply the updates. I'm still not sure about config file changes, but that at least gets me somewhere.
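The apply-the-update half of such a tool can stay small. A sketch meant to run as a scheduled task, where all paths are hypothetical and the download step (SFTP or otherwise) is assumed to have already filled the drop folder:

```powershell
# Pull-style updater: if a completed drop is flagged, take the site
# offline, copy the new files in, and bring it back. Paths are
# placeholders; the SFTP download happens in a separate step.
$drop = "C:\updates\incoming"
$site = "C:\inetpub\mysite"

if (Test-Path "$drop\update-ready.flag") {
    Set-Content "$site\app_offline.htm" "<html><body>Updating...</body></html>"
    robocopy $drop $site /XF update-ready.flag
    Remove-Item "$site\app_offline.htm"
    Remove-Item "$drop\update-ready.flag"
}
```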
Is there any good solution for this scenario? Are there any tools that do this for you?
I think the pull rather than push strategy somewhat flouts conventional wisdom... but this seems like something CruiseControl.NET could easily do. Remember, the web.config file is also an XML document, so it is easily modifiable in a CruiseControl script. You could xcopy the files or use an svn export.
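To illustrate the web.config point: since it is plain XML, per-environment tweaks are easy to script. A sketch with placeholder names:

```powershell
# Load web.config as XML, swap a connection string, and save it back.
# File path, connection-string name, and value are all placeholders.
$path = "C:\inetpub\mysite\web.config"
[xml]$config = Get-Content $path

$node = $config.configuration.connectionStrings.add |
    Where-Object { $_.name -eq "Default" }
$node.connectionString = "Server=PRODDB;Database=App;Integrated Security=True"

$config.Save($path)
```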
http://varunkumargoel.blogspot.com/2010/03/how-to-make-automatic-deployment-for.html
Please see the blog post linked above, where I have posted details regarding automatic deployment of a .NET application with SVN.