How can you update an ASP.NET application from a central location? I would like to publish the final updates to a central location and have all the sites download the latest updates from there.
It's like your WordPress site fetching the latest WordPress version, or Yahoo asking users to migrate to the latest version.
You can use a tool you buy, or write your own scripts. Writing them yourself is not that hard.
Among tools, I know of Octopus Deploy and Redgate Deployment Manager (Redgate bought Octopus's sources and now sells it as its own product with some modifications/additional features). With those tools you install an agent on the target servers so that new versions of your web sites can be deployed to them. Octopus works in both push and pull models; I don't know about Deployment Manager.
At my work I publish using self-written scripts configured as Jenkins jobs. One click and the new version is installed. This is a "push" update.
Remember that in order to update the web site without users noticing (no maintenance window) you need at least two servers, a load balancer, and a shared state service. Before deployment, reconfigure the load balancer to route all requests to one server, then update the other. Next, reconfigure the load balancer to route all requests to the freshly updated server and update the remaining one. After all servers are updated, reconfigure the load balancer again so requests are routed to all of them. One way to script the "route requests away" step is a health-check endpoint, as sketched below.
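A minimal, hedged sketch of that health-check idea for classic ASP.NET (not from the answer above; the drain-file name and handler route are assumptions). The load balancer polls the endpoint; dropping a `drain.txt` file into `App_Data` makes it return 503, so the balancer stops sending new requests to that server while in-flight ones finish:

```csharp
using System.IO;
using System.Web;

// Health-check endpoint polled by the load balancer.
// Register it in web.config as an HTTP handler, e.g. at /health.
public class HealthCheckHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical drain flag: ops drop this file to pull the server
        // out of rotation before deploying to it.
        string drainFlag = context.Server.MapPath("~/App_Data/drain.txt");

        if (File.Exists(drainFlag))
        {
            context.Response.StatusCode = 503; // balancer stops routing here
        }
        else
        {
            context.Response.StatusCode = 200;
            context.Response.Write("OK");
        }
    }
}
```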
There are two main approaches:
The first is to "push" the updates: when you upload a new release, the central server contacts each website (for example, by sending a request to a trigger URI), causing each site to download and install the update you've provided. This requires your central update site to maintain a database of sites to notify.
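As a rough illustration of the push model, each site could expose a trigger endpoint that the central server POSTs to. This is a hedged ASP.NET Web API sketch; `UpdateNotice`, `UpdateService`, and its two methods are hypothetical placeholders, and verifying the sender before acting is essential:

```csharp
using System.Threading.Tasks;
using System.Web.Http;

// Trigger endpoint each consuming site exposes; the central server
// POSTs here when a new release is published.
public class UpdateController : ApiController
{
    [HttpPost]
    public IHttpActionResult Trigger(UpdateNotice notice)
    {
        // Never install a package just because someone hit the URI.
        if (!UpdateService.VerifySignature(notice))
            return Unauthorized();

        // Download and install on a background thread so the trigger
        // request returns immediately.
        Task.Run(() => UpdateService.DownloadAndInstall(notice.PackageUrl));
        return Ok();
    }
}

public class UpdateNotice
{
    public string PackageUrl { get; set; }
    public string Signature { get; set; }
}

// Hypothetical helper; the implementation depends on your packaging scheme.
public static class UpdateService
{
    public static bool VerifySignature(UpdateNotice notice)
    {
        return false; // deny by default; replace with a real HMAC/signature check
    }

    public static void DownloadAndInstall(string packageUrl)
    {
        /* fetch the package, unpack, swap files */
    }
}
```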
The second is to have the site's code include a check (on a separate thread so it doesn't block any page requests) for whether n days have passed since the last update check. If so, the site asks your central server for an update and, if one is available, downloads and installs it. This approach means that your central server does not need to be aware of the consuming sites, only vice versa.
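And a hedged sketch of the pull model: a lightweight check that runs off the request path and contacts the central server at most once per interval. The endpoint URLs and `ApplyUpdate` are assumptions:

```csharp
using System;
using System.Net;
using System.Threading.Tasks;

public static class UpdateClient
{
    private static DateTime _lastCheck = DateTime.MinValue;
    private static readonly object _gate = new object();

    // Call this from e.g. Application_BeginRequest; it returns immediately.
    public static void CheckInBackground()
    {
        lock (_gate)
        {
            // Only check once per day (the "n days" from the answer).
            if ((DateTime.UtcNow - _lastCheck).TotalDays < 1) return;
            _lastCheck = DateTime.UtcNow;
        }

        Task.Run(() =>
        {
            using (var client = new WebClient())
            {
                // Hypothetical central endpoint returning the latest version string.
                string latest = client.DownloadString("https://updates.example.com/latest").Trim();
                if (latest != CurrentVersion())
                {
                    byte[] package = client.DownloadData("https://updates.example.com/package/" + latest);
                    ApplyUpdate(package);
                }
            }
        });
    }

    private static string CurrentVersion()
    {
        return typeof(UpdateClient).Assembly.GetName().Version.ToString();
    }

    // Hypothetical: unpack the package and swap the site's files.
    private static void ApplyUpdate(byte[] package) { }
}
```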
Recently I did the same thing, but in my approach the application does not ask the user to download the latest version; it downloads it directly. What I did:
1. Create a web service in your application that receives data and directly updates the database (or any other part of your project).
2. Create a Windows application that is installed on your server and consumes that web service on each client machine.
3. Write a scheduler in the Windows application that starts every night (or whenever you want). Each time it starts, it calls all the services and sends them the updated data; each service receives the data and updates its application. A sketch of this scheduler follows.
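A rough sketch of step 3, not the author's actual code; the site list, service URL, and `SendUpdatedData` are placeholders, and in practice you might use the Windows Task Scheduler instead of a sleeping loop:

```csharp
using System;
using System.Threading;

class NightlyUpdater
{
    static void Main()
    {
        while (true)
        {
            // Sleep until 2 AM tomorrow, then push updated data to every client site.
            DateTime next = DateTime.Today.AddDays(1).AddHours(2);
            Thread.Sleep(next - DateTime.Now);

            foreach (string serviceUrl in GetRegisteredSites())
                SendUpdatedData(serviceUrl);
        }
    }

    // Hypothetical registry of client sites exposing the update web service.
    static string[] GetRegisteredSites()
    {
        return new[] { "https://client1.example.com/UpdateService.asmx" };
    }

    // Hypothetical: call the site's web service with the updated data.
    static void SendUpdatedData(string serviceUrl) { }
}
```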
Let me know whether this helps.
I am using an Azure Web App with deployment slots running an ASP.NET Core install. How often are these instances patched, and how are they patched?
Is there any way for me to see a log of exactly when they are patched?
If you are referring to the app itself: when you deploy a Web App from the Azure App Service gallery, you get an install of whatever item you selected. That item will not be automatically updated; if Microsoft updated the container, it would most likely break your application, especially if you had customized the container in any way.
If you are referring to the OS, Microsoft updates the OS and IIS version from time to time. When they do, if there is any possibility of the change affecting your app, an e-mail is sent to the account registered under the subscription notifying the owner of the maintenance. Normally you shouldn't experience any downtime.
You may also check "Operating system functionality on Azure App Service" and the Kudu Console for more details.
Hope this helps.
My question is simple and straightforward: I am new to hosting and its process. I have a web application and I have to host it, but I don't know where or how. On my list are GoDaddy.com and hostcat.com for hosting, and I want to use FileZilla for updating the application. My concern is that my application has a payment gateway, and I do not want my site to go down even for a minute while updating to a new version.
So my question is: how do I keep the site available while updating the application? Where should I host the application, and what software should I use for updating?
Any solutions or tutorials are highly appreciated.
I use Visual Studio 2013's "Publish" and Web Deploy to publish my ASP.NET MVC 5 website. However, when I update the website, it doesn't work during the upload. I'm looking for a way to minimize the site's downtime during an update. The website is running on a VPS and I have full access to it. One solution that came to mind is to configure Web Deploy to first put the uploaded files into a temporary folder and, once the upload is finished, swap in the new files. That would cut the update to a few seconds at most. I can do this manually, but that's not an elegant way to update one's website.
PS: Maybe there are better ways to update the website, but so far I like Web Deploy. It's much faster than FTP, for instance.
One of the most interesting things I've seen is to have two websites, only one of which is running at a time. After finishing the upload, you disable the active one and enable the one you uploaded to.
This works well if both are in the same application pool, and it even works with sessions if you want (see How to: Configure SQL Server to Store ASP.NET Session State).
I've never done it this way; it seems a little too complex for the minimal downtime there actually is, but it's one way. A sketch of the swap is below.
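For completeness, a hedged sketch of that swap using Microsoft.Web.Administration (ships with IIS; requires admin rights on the server). The two site names are assumptions:

```csharp
using Microsoft.Web.Administration;

class SwapSites
{
    static void Main()
    {
        using (var manager = new ServerManager())
        {
            manager.Sites["MySite-Blue"].Stop();   // the site currently serving traffic
            manager.Sites["MySite-Green"].Start(); // the site that just received the upload
        }
    }
}
```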
Here's the solution that we use on a high traffic website with 4 web servers.
1) Files are moved to the server into /site/version-xxx
2) IIS Web Application is re-pointed to the new version.
All of this is automated and synchronized across the web servers; the end user doesn't notice any difference. (We don't rely on sessions to persist the user experience. If sessions are a must for you and you don't want to interrupt them, consider storing them in an external system that won't flush them when the websites are repointed.)
This approach also allows us to roll back to any previous version.
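A hedged sketch of step 2 with Microsoft.Web.Administration: repoint the application's root virtual directory at the new version folder. The site name and path layout are assumptions, and the real setup would run this in sync across all four servers:

```csharp
using Microsoft.Web.Administration;

class RepointSite
{
    static void Main(string[] args)
    {
        // args[0] is the version being deployed, e.g. "042".
        string newPath = @"C:\site\version-" + args[0];

        using (var manager = new ServerManager())
        {
            Application app = manager.Sites["MyWebSite"].Applications["/"];
            app.VirtualDirectories["/"].PhysicalPath = newPath;
            manager.CommitChanges(); // takes effect without touching other sites
        }
    }
}
```

Rolling back is then just running the same tool with a previous version number.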
We have DLLs that contain hundreds of custom client processes that are kicked off from an ASP.NET application. Our clients run these processes while performing data entry, and typically there's only 1 process per client. On any given day, we might update 2 or 3 of these processes.
Currently these are all housed in a series of DLLs, which means that we are publishing our application a couple of times per day. As a result, any logged-in clients get booted out of the system, since the publish causes an app restart.
Is there a way that we can update these DLLs without requiring a full publish each time?
If your client processes have a common API, then you could host the DLLs separately in a WCF (or similar) service and call the client processes remotely. So basically, consider moving to a service-oriented architecture.
Check out the Managed Extensibility Framework (MEF) from Microsoft. It provides not only dependency management but also plug-in-style library loading. Most likely it's exactly what you're looking for.
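A minimal MEF sketch, assuming the client processes can be expressed behind a shared contract; the interface name and plugin folder are assumptions, not the poster's actual design:

```csharp
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Shared contract every client-process DLL implements.
public interface IClientProcess
{
    string Name { get; }
    void Run();
}

// Example export living in one of the drop-in DLLs.
[Export(typeof(IClientProcess))]
public class SampleProcess : IClientProcess
{
    public string Name { get { return "Sample"; } }
    public void Run() { /* client-specific work */ }
}

public class ProcessHost
{
    [ImportMany]
    public IClientProcess[] Processes { get; set; }

    public void Load(string pluginFolder)
    {
        // DirectoryCatalog picks up every exported part in the folder,
        // so updating a process means dropping in a new DLL.
        var catalog = new DirectoryCatalog(pluginFolder);
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }
}
```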
You can switch to SQL Server or State Server sessions in order to preserve sessions and logged-in users across an app restart. Or store these DLLs in App_Data and load them dynamically; then, of course, you have to devise some refresh mechanism to replace the loaded DLLs with newly uploaded ones.
There is no sensible way to avoid an application restart. Please note the emphasis on the word sensible.
I am developing an ASP.NET website. Users can open the web page and work with data while they are online, but I also want them to be able to work offline and submit their changes later, when they are online again. The offline application must be very simple to run, and I don't want to develop a separate Windows application for this purpose.
So I want to build an executable that when starts, first starts up a local IIS and then opens the startup page of my website (which is locally available) in the user's browser.
And of course it would be great if this executable can be installed on the user's system along with my website files, IIS and SQL Server Express all in one package.
OK, I re-read your question and see that you will have a local IIS and a local database installed on all client systems.
So then the solution is very simple.
The application (main form):
1. Create a Windows Forms application.
2. Put a WebBrowser control and a StatusStrip control on the form.
3. Add two string resources, named say LocalStartUrl and OnlineStartUrl, which hold the addresses of your local and online website home/startup pages.
4. On Form_Load, check for internet connectivity and accordingly load either LocalStartUrl or OnlineStartUrl in the WebBrowser control. You can show a message box and use the status bar to keep the user informed. (See the sketch below.)
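A hedged sketch of step 4; `webBrowser` and `statusLabel` are the designer-created controls from step 2, the resource names come from step 3, and `GetIsNetworkAvailable` is a crude adapter check rather than a true internet test:

```csharp
using System;
using System.Net.NetworkInformation;
using System.Windows.Forms;

public partial class MainForm : Form
{
    // Wired up to the form's Load event in the designer.
    private void MainForm_Load(object sender, EventArgs e)
    {
        // Crude check: a live network adapter, not necessarily internet access.
        bool online = NetworkInterface.GetIsNetworkAvailable();

        string startUrl = online
            ? Properties.Resources.OnlineStartUrl
            : Properties.Resources.LocalStartUrl;

        webBrowser.Navigate(startUrl);
        statusLabel.Text = online ? "Online mode" : "Offline mode (local IIS)";
    }
}
```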
The sync module:
The database sync module runs on a timer/separate thread and synchronizes your local database with the online database in the background. It sends any unsaved changes to the server and downloads any missing data from the server into the local database. You should throttle this module so that the user can still browse other websites and use the application smoothly; it should be slow and steady, sending/requesting only small chunks of data at a time.
In offline mode, it just periodically checks for online connectivity at regular intervals. As soon as connectivity is found, the user is informed; if they permit, it switches over to online mode. Otherwise the user continues to work offline until the application is closed and launched again.
In online mode, the sync module syncs data to and from the online database. This doesn't affect the user, because they are not connected to that database; instead they are connected to the online website and database.
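A hedged outline of the sync module; the two `Push`/`Pull` helpers, the interval, and the batch size are placeholders, with the small throttled chunks being the point:

```csharp
using System.Net.NetworkInformation;
using System.Threading;

public class SyncModule
{
    private readonly Timer _timer;

    public SyncModule()
    {
        // Fire every 5 minutes; small batches keep the impact on the
        // user's connection low.
        _timer = new Timer(_ => SyncOnce(), null, 0, 5 * 60 * 1000);
    }

    private void SyncOnce()
    {
        // Offline: skip this round and try again at the next tick.
        if (!NetworkInterface.GetIsNetworkAvailable()) return;

        PushPendingChanges(batchSize: 100); // hypothetical: upload unsaved local rows
        PullServerChanges(batchSize: 100);  // hypothetical: download missing rows
    }

    private void PushPendingChanges(int batchSize) { }
    private void PullServerChanges(int batchSize) { }
}
```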
It will take some effort to streamline this approach, but it is ultimately achievable.
This won't be just a single task; it will be a series of tasks working together in sync.
Your Windows application does the following:
Writes the offline changes to a disk file, local database, etc.
Periodically checks the online availability of your website/FTP site.
Whenever the website is found to be available and there are cached changes, submits the file to the website/FTP site.
Your server does the following:
Whenever a file is received, it checks the file's validity and integrity; if it is correct, it puts it in a specific folder.
A service on your server watches that folder and, as soon as any file appears there, processes it.
The service moves the file to another folder (backup) after processing. A sketch of such a watcher service follows.
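A hedged sketch of the server-side watcher; the folder paths and the processing step are assumptions, and a real service should also confirm the upload is complete before importing it:

```csharp
using System;
using System.IO;

class UploadWatcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\uploads\incoming");
        watcher.Created += (sender, e) =>
        {
            // Minimal integrity gate; real code should verify the
            // file's checksum/format before importing it.
            var info = new FileInfo(e.FullPath);
            if (info.Length == 0) return;

            ProcessFile(e.FullPath);

            // Move to a backup folder after processing, as described above.
            File.Move(e.FullPath, Path.Combine(@"C:\uploads\backup", e.Name));
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for uploaded files; press Enter to stop.");
        Console.ReadLine();
    }

    // Hypothetical: validate and apply the cached offline changes.
    static void ProcessFile(string path) { }
}
```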