I have an unconventional legacy ASP.NET 1.1 web app that uses the src attribute in the page directive, so pages are JIT-compiled on the server as opposed to the normal precompilation of assemblies in Visual Studio 2003. We are approaching the performance capacity limits of the 32-bit platform it's on and would like to migrate to a 64-bit capable ASP.NET version. I'm unsure of the best migration path to take: migrate to 3.5 or 4.0?
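To illustrate, here's roughly what the difference looks like (file and class names made up):

    <%-- Src-based page: the code-behind ships as source and is JIT-compiled by ASP.NET --%>
    <%@ Page Language="C#" Src="Default.aspx.cs" Inherits="MyApp.DefaultPage" %>

    <%-- Conventional VS 2003 page: MyApp.DefaultPage lives in a precompiled assembly in bin --%>
    <%@ Page Language="C#" CodeBehind="Default.aspx.cs" Inherits="MyApp.DefaultPage" %>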
What are the pros and cons of going with either version? Is 3.5 in the process of being phased out? Is 4.0 established enough for high-traffic web sites in a production environment? Does the same web app in 4.0 require more CPU and memory resources than in 3.5? (Would a new server be needed?)
Thanks!
You might want to look at this:
http://support.microsoft.com/kb/894435
You will either migrate to 2.0 first or do a complete re-write. The main jump is from 1.1 to 2.0, and you need to do this manually. You can then easily use the VS wizard to convert to 4.0. Forget 3.5; it was an interim release and the config files are horrible.
As for your dev env, note that VS will remain 32-bit for a long time to come.
Both are very production-worthy. It is your choice whether you can live without the features ASP.NET 4.0 gives you.
To address one specific point:
Is 3.5 in the process of being phased out?
.NET 3.5 is a system component of Windows 7 and Server 2008 R2, so it will be supported until those operating systems go out of support. That is currently 2018 for Server 2008 R2 (see here).
We have recently taken on support of a web application that was written many years ago and targeted v1.1 of the .NET framework. It runs in a Windows Server 2003/IIS 6 environment.
Looking at the configuration of the site in IIS, I can see the target framework is set to 2.0.
Given that extended support for .net 1.1 will cease in October of this year (http://support.microsoft.com/lifecycle/?p1=1249) I am trying to ascertain whether the site will still use any of the .net 1.1 framework assemblies given that the application is built and compiled in Visual Studio 2003.
I am assuming this is the case because, although ASP.NET 2.0 is set as the target framework in IIS (and therefore the aspnet_isapi.dll invoked is the .NET 2.0 one, etc.), the assembly is a .NET 1.1 assembly and will therefore still use the 1.1 framework. However, is this assumption actually true?
The website only has another year or so to live before being replaced by a new solution entirely so I would prefer not to upgrade it if possible and run the risks such changes bring with them.
However, we obviously can't run something on an unsupported version of the framework if any element of that framework is actually being used.
Any thoughts would be appreciated.
Update:
It would seem that .net 1.1 is a core component of WS2k3 so you can't just uninstall it. I could have attempted to remove the ASP.net component but I don't think that would fully uninstall everything and given that the dev environment is shared I can't risk causing any issues right now.
However, I had previously set everything up on my local machine (Windows 7/IIS 7), so I changed the application pool to point at .NET 2.0 (it was already running in classic pipeline mode), uninstalled .NET Frameworks 1.0 and 1.1, and cleaned up the files left behind afterwards.
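For reference, that app pool change can also be made from the command line with appcmd (my pool name here is made up):

    rem Point an IIS 7 application pool at the .NET 2.0 runtime, classic pipeline
    %windir%\system32\inetsrv\appcmd.exe set apppool "LegacyAppPool" /managedRuntimeVersion:v2.0 /managedPipelineMode:Classic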
The result was that the site ran absolutely fine, which would suggest in an IIS 7 environment at least that I don't need to worry about upgrading given we are running under .net 2 within IIS.
It's not an ideal test as it doesn't mimic our live environment. I'm going to post a question on the MSDN and ASP.NET forums to see if any Microsoft folks can add anything more definitive. I will post back here with any updates.
Just because official support will end doesn't mean Microsoft will pull the plug and force an uninstall of .NET 1.1 via Windows Update. It only means that:
- if a gaping hole in the framework's security is ever found, they won't fix it;
- there won't be redistributables for the next versions of Windows, and the next version of IIS won't run it.
So the application will still run in a year. If you leave the server alone, the application might run until the machine breaks down from old age.
So my suggestion is relax, and focus more on the new solution.
I got the answer to this question after reading this link (provided as an answer to this question on the ASP.NET forums):
http://msdn.microsoft.com/en-us/library/ms994381.aspx
Under "Application Load Mechanisms and Possible Issues" it states:
By default, an application built using the .NET Framework will run using the version of the Framework it was built against if that version is installed on the computer
It then goes on to detail (for .net 1.1 and 2.0 at least) when a particular version of the framework is used.
Essentially, because our server has both 1.1 and 2.0 installed the application will still be using version 1.1. If 1.1 was not installed then it would run by default under 2.0, which explains why the web application still worked after I uninstalled .net 1.0 and 1.1 from my local machine.
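For completeness: that default applies to standalone executables too, and for those it can be steered explicitly with a supportedRuntime element in the app's .config file (ASP.NET works differently: there the IIS script map decides). A sketch, using the standard version strings for 2.0 and 1.1:

    <!-- MyApp.exe.config: prefer the 2.0 CLR, fall back to 1.1 if 2.0 is absent -->
    <configuration>
      <startup>
        <supportedRuntime version="v2.0.50727" />
        <supportedRuntime version="v1.1.4322" />
      </startup>
    </configuration>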
Given that the live server is W2K3 and I can't remove .net 1.1, I will be rebuilding my application to target .net 4.0.
I am looking to rent a server that has IIS 7.5 and .NET 4.0 installed. Is it possible somehow to use .NET 4.5 features (async for example), perhaps by including the respective DLLs in the bin folder or some other way?
Thank you very much for your time and help,
Richard Hughes
I would advise against that. If you are renting a server then I would assume you have full access to it? In that case, simply install .NET 4.5 on the server as well.
Despite the .5 bump in the name, this is not the kind of release where you can simply swap in different DLLs and get the new feature set.
So in simple terms, no. You will need to install .net 4.5 to get those features.
You can't use .net 4.5 features on .net 4.0.
Luckily for you, async-await is mainly a C# 5.0 feature, and less a .net 4.5 feature. You can use the Async Targeting Pack to use this C# 5.0 feature on .net 4.0 (with some minor changes compared to .net 4.5). Check my related question: Using async-await on .net 4
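A minimal sketch of what that looks like once the targeting pack is referenced. On .NET 4.0 the static helper methods live on TaskEx rather than Task (one of the "minor changes" mentioned above; verify against the pack version you install):

    using System;
    using System.Threading.Tasks;

    class Program
    {
        // Compiles against .NET 4.0 with the Async Targeting Pack referenced
        static async Task<string> FetchAsync()
        {
            await TaskEx.Delay(500); // TaskEx.Delay stands in for .NET 4.5's Task.Delay
            return "done";
        }

        static void Main()
        {
            // Blocking on .Result is fine for a console demo
            Console.WriteLine(FetchAsync().Result);
        }
    }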
We have a web application that runs on IIS using .NET 2.0 developed and built with Visual Studio 2005.
We're going to upgrade to .NET 3.5 and begin using Visual Studio 2008. Here are my questions:
I note the runtime is still 2.0-based.
When I loaded the solution in Visual Studio 2008, I was asked to convert, and I did. I then checked the target framework for the default project, and it was set to 3.5. However, all of the other target frameworks for the other projects are set to 2.0.
Do I need to manually set the target frameworks from 2.0 to 3.5 for each of the projects in the solution?
Are there any "gotcha's" anyone can think of to be concerned with a web-application conversion?
As I understand it, the 1.1 to 2.0 migration was a much more difficult issue due to the massive runtime and web-page design changes. However, 2.0 to 3.5 isn't such a big change.
I was not at my current job for that upgrade, but I understand there was a problem with some textarea tags using a deprecated attribute that failed to function correctly after the upgrade.
Can anyone think of any similar issues I might encounter?
Any other issues or thoughts anyone has after having done such a conversion themselves?
Thanks, I appreciate the input.
---Dan---
Do I need to manually set the target frameworks from 2.0 to 3.5 for each of the projects in the solution?
Not necessary, but I would recommend doing so. Visual Studio actually filters the assemblies you can reference based on the target framework version.
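If you'd rather not click through every project, the target is a single MSBuild property you can edit in each .csproj; a sketch of the relevant fragment:

    <PropertyGroup>
      <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
    </PropertyGroup>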
Are there any "gotcha's" anyone can think of to be concerned with a web-application conversion?
Not any that I am aware of when migrating from 2.0 to 3.5. You don't even need to modify the CLR version of the host application pool. When you need to migrate to 4.0 there might be more issues.
If you're also upgrading your own target server, from my own experiences, be patient with the installer.
It does quite a lot, including uninstalling the existing .NET 2.0 and 3.0 frameworks and replacing them pretty much wholesale.
It can look as if the installer is stuck. On one of our production servers it ran for nearly 20 minutes; I was almost ready to pull the plug when it magically jumped into life.
I'm still fairly new to ASP.NET development so bear with me.
I'm going to start development on an updated version of an ASP.NET 1.1 website, which I will develop in ASP.NET 3.5. Currently, my development server allows me to run web sites on 1.1 and 2.0. I've had the 3.5 framework installed, but are there any other configuration steps or issues I should know about? This server will need to keep running the ASP.NET 1.1 web site alongside the 3.5 one I will be developing.
Thanks in advance.
EDIT: Although I have .NET 3.5 framework installed, when I go into IIS and create a new Virtual Directory, it only gives me the options of 1.1 or 2.0.
This is what you're looking for:
http://www.hanselman.com/blog/HowToSetAnIISApplicationOrAppPoolToUseASPNET35RatherThan20.aspx
You can set each site (indeed, each virtual directory if you want) to use a specific version of the runtime.
You should have no other configuration issues. If the server is set up to run 2.0 sites, you're good to go, since sites written against 3.5 use the 2.0 runtime, plus 3.5 bits if they are available.
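The gist of the linked post, for reference: because 3.5 runs on the 2.0 CLR, opting a site into 3.5 mostly means telling ASP.NET to use the v3.5 compiler (so C# 3.0 syntax works in dynamically compiled pages) and to reference the 3.5 assemblies. A trimmed sketch of the web.config addition; verify the type and version strings against your own machine's config:

    <configuration>
      <system.codedom>
        <compilers>
          <!-- Compile .cs files with the v3.5 compiler instead of the 2.0 default -->
          <compiler language="c#;cs;csharp" extension=".cs" warningLevel="4"
              type="Microsoft.CSharp.CSharpCodeProvider, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
            <providerOption name="CompilerVersion" value="v3.5" />
          </compiler>
        </compilers>
      </system.codedom>
    </configuration>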
I would like to upgrade my web projects on an IIS 5 server from .NET 2.0 to .NET 3.5. These web applications live on a server with other web applications that will not be upgraded to .NET 3.5. The server administrator is reluctant to install .NET 3.5 because he is afraid it will break the applications on that machine that are running 2.0 and 1.1.
As far as I know this WON'T be a problem, since .NET 3.5 is an addition to 2.0 more than it is a new framework. I would like the community's help gathering evidence to show him that his concerns are moot and it won't hurt the other applications.
Thanks in advance.
If you have .NET 2 SP1 you shouldn't have a problem.
To be exact, .NET 3.0 and 3.5 are built on top of .NET 2.0 SP1. We had a problem deploying 3.5 onto a server which only had .NET 2.0 (not SP1), and it caused the apps on there to break. The reason is that the core framework assemblies in .NET 2.0 get upgraded and have new version numbers which the apps weren't compiled against.
You won't have any problems, and you will be able to run your 2.0 and 3.5 applications on the same server. This is because both versions run on the same underlying 2.0 runtime.
Walk the server administrator through the contents of the redistributable for 3.5. It adds a lot of new DLLs; it doesn't update anything in the 2.0.x directory. You might want to show him how apps targeting 3.5 are still using System.dll etc. from the 2.0.x framework directory.
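One simple way to demonstrate that is a tiny console app compiled against 3.5 that prints where its runtime actually lives; a sketch:

    using System;

    class WhereAmIRunning
    {
        static void Main()
        {
            // An app targeting 3.5 still reports the 2.0 CLR and loads
            // mscorlib from the v2.0.50727 framework directory.
            Console.WriteLine(Environment.Version);
            Console.WriteLine(typeof(object).Assembly.Location);
        }
    }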
Both frameworks can run concurrently. In fact, that is the default behavior.
One caveat though, make sure that you don't use the same application pool for apps using different versions of the framework. Otherwise you will get "Server Application Unavailable" errors. Use a different app pool for each set of applications.
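For completeness, the per-application version mapping itself is done with aspnet_regiis; each framework version ships its own copy, and -s remaps one application's script maps without touching its siblings (the metabase path below is hypothetical):

    rem List which applications are mapped to which ASP.NET version
    "%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe" -lk

    rem Map a single application (and its children) to this framework version
    "%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe" -s W3SVC/1/ROOT/MyApp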
Installing 3.5 will modify your .NET 2.0 web.config file and a few others.
This certainly breaks at least one application I use. Uninstalling 3.5 reverts the files and fixes the issue.
I've upgraded a couple of servers from .NET 1.1 to 2.0 & 3.5; there haven't been any problems.