Slow solution loading in Visual Studio 2008 - ASP.NET

I am working on an ASP.NET 3.5 project which has 55 projects in a solution. When opening the solution in Visual Studio 2008, it takes over a minute to open - about 1 second for each project. However, if I disconnect the network cable before opening the solution, it only takes about 15 seconds! Any ideas about what could be causing the slowdown?

I had this happen to me back in the days when we were using Visual SourceSafe.
It could be your source control plugin asking for updates if you have the solution under source control.

You should do some investigation: fire up Wireshark, start a capture on the interface in question, and see what traffic is flowing over the wire.

Can I answer a question with a question? What is the secret to getting VS to not just die with that many projects, let alone load in a phenomenally quick 60 seconds?
At about 10-12 projects the compile time in Visual Studio becomes unbearable, and at about 5-8 projects ReSharper will crash. The IDE is such a memory pig that even opening more projects by using multiple instances of VS usually isn't an option.
Anyhow, it's all about memory usage, and the odd one out among your projects is probably the culprit, e.g. the one with the most files.

I had the same problem this week (5 years later!!). It was caused by a huge .suo file (almost 400 MB); deleting it fixed the problem.
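For anyone who wants to check for the same culprit: the .suo file sits next to the .sln and is hidden, so something along these lines in PowerShell (run from the solution folder, with Visual Studio closed; the file name is whatever your solution is called) will show its size and remove it:

    Get-ChildItem -Force -Filter *.suo | Select-Object Name, Length
    Remove-Item -Force .\MySolution.suo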

A few years ago I remember a colleague having some similar problem (with a lot smaller solution, and in VS2003). Can't remember the details, but I think it was related to the local ASPNET user account (or rather, that it did not exist). Not sure though...
As a side note: I usually find it more efficient to have perhaps around a handful of projects in each solution (usually one solution produces one or two assemblies used in production code), and then have a few Visual Studio instances running at the same time. 50+ projects in the same solution feels like asking for problems.
Might be that you have other dependencies though, just wanted to share my thoughts.

which has 55 projects in a solution
WOW. I can't imagine what type of solution needs that many projects. The answer is probably that your source control provider needs to refresh the status of each of the items, all of which takes time.
For edit-merge-commit style version control systems, such as Subversion, this operation doesn't take place. Try temporarily removing source control from the entire solution to see if this is the culprit.
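If you want to test that quickly without touching the server, the binding is stored in the solution file itself; on a backup copy of the .sln you can look for a block like the following (the exact section name depends on the provider - TeamFoundationVersionControl for TFS, SourceCodeControl for VSS-style providers) and remove it, or use File -> Source Control -> Change Source Control to unbind properly. A sketch of what the block looks like:

    GlobalSection(TeamFoundationVersionControl) = preSolution
        SccNumberOfProjects = ...
        SccProjectUniqueName0 = ...
        SccLocalPath0 = ...
    EndGlobalSection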

If your solution is attached to source control, then it is trying to load up the symbols and verify which items you have checked out. So, if you have a slow connection, it is oftentimes faster to take the solution offline.
http://www.tmgirvin.com/2009/03/working-offline-with-visual-studio-2008-and-tfs.html
EDIT
Another solution which I've seen used: create a
<ProjectName>_webTier.sln
<ProjectName>_database.sln
<ProjectName>_build.sln
(<ProjectName> is your project name)
Each of those solutions is a self-sufficient part of the entire project; that way, if you are working on the web tier and you don't need the database project or the mobile project to load up, you can just open the web-tier solution.
The build solution contains the entire package that needs to be built, and takes a very long time to load.

I had this problem on a development machine with no internet connection and it turned out that the problem was related to a setting in IE's internet options:
Control Panel -> Internet Options -> Advanced -> Security -> Check for publisher's certificate revocation
After making sure this was unchecked my solutions started loading quickly again.
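A related workaround for machines with no (or restricted) internet access is to stop the .NET runtime from trying to generate publisher evidence when Visual Studio starts, since that lookup is what triggers the certificate revocation check. A sketch, assuming VS 2008: add this to devenv.exe.config (next to devenv.exe; back it up first), inside the existing configuration element:

    <runtime>
        <generatePublisherEvidence enabled="false"/>
    </runtime>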

Related

Do you have any suggestions on how to speed up precompiling a Kentico site? AKA: the aspnet_compiler is VERY slow

I have a Kentico 6 site that we have squeezed better performance out of by precompiling. We have a TeamCity Continuous Integration (CI) server that runs the build automatically on check-in to the Subversion repository.
However, the build takes 40 min, and about 33 min of that is the aspnet_compiler step! This is a pretty long time for a build that SHOULD take less than 10 min. I've gone through some of the various perf improvements, such as moving the temporary ASP.NET files to a fast SSD. During the aspnet_compiler step the server is using very little CPU, disk and RAM; CPU use seems to average about 1%! In researching aspnet_compiler, I have found many pages complaining about the speed, but a dramatic silence from Microsoft about it.
https://aspnet.uservoice.com/forums/41199-general-asp-net/suggestions/4417181-speed-up-the-aspnet-compiler http://programminglife.wordpress.com/2009/04/16/aspnet_compiler-compilation-speed-part-1/
I've come to the (perhaps erroneous) conclusion that if I can't speed up the aspnet_compiler, then perhaps I can reduce its workload. Since Kentico has lots of controls in the project, are there any savings that I can glean by removing extraneous controls? I.e., if I run the aspnet_compiler for a project with a single page, it's quick.
I've also thought about maybe making CMSDesk a separate application that only needs to be compiled after a new hotfix has been applied.
To recap, my two concerns are: #1 - can I speed up the aspnet_compiler somehow? #2 - If I can't, can I reduce its workload?
In relation to #1, maybe I can do an incremental compilation, so that I only precompile files that have changed since the last build? I haven't found much info about doing this; there are a few unanswered questions on Stack Overflow about this very topic, e.g. "aspnet_compiler incremental precompile" and "Incremental Build aspnet_compiler".
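For reference, a plain invocation looks roughly like this (paths and names are made up); the -c switch is documented as forcing a full rebuild, so leaving it off seems to be the closest thing to an incremental run the tool offers:

    aspnet_compiler -v /MyKenticoSite -p C:\Sites\MyKenticoSite -f C:\Builds\Precompiled
    rem add -c to force recompilation of everything, -u to keep the output updatable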
FYI - for those of you unfamiliar with Kentico CMS, it's a Web Site Project with LOTS of controls - maybe hundreds of them.
Any ideas?
PS - I have a reply on the Kentico forums: http://devnet.kentico.com/questions/do-you-have-any-suggestions-on-how-to-speed-up-precompiling-a-kentico-site-aka-the-aspnet_compiler-is-really-slow
One thing our team did try, to improve the compilation time, was to remove the modules and sets of controls that were not necessary to the project (forum, e-commerce, etc.).
Also, which approach do you use for development, portal or ASPX? We've noticed that compilation proves to be faster with the portal approach than with ASPX.
Everything I've read says that precompiling with aspnet_compiler is super difficult to achieve with Kentico.
However, I managed to find this post on Kentico's forums where someone appears to have figured out a way to make your system work for them (search for EHUGGINS-PINNACLEOFINDIANA on the page and you'll find it).
I have heard that using MSDeploy or MSBuild is a much more efficient way to precompile than directly calling aspnet_compiler (although I'm pretty sure both of those either still call aspnet_compiler or do the same thing it does, only faster).
I've personally never tried using any of these methods (and I'm pretty new to ASP.NET myself), but I figured I could at least give you some leads:
http://odetocode.com/blogs/scott/archive/2006/10/18/what-can-aspnet_compiler-exe-do-for-me.aspx
http://therightstuff.de/2010/02/06/How-We-Practice-Continuous-Integration-And-Deployment-With-MSDeploy.aspx
http://www.troyhunt.com/2010/11/you-deploying-it-wrong-teamcity_26.html
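If it helps, MSBuild has a built-in AspNetCompiler task that wraps the same compiler, so a TeamCity build step can call a small project file like this instead of shelling out directly - just a sketch, with hypothetical paths:

    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <Target Name="Precompile">
        <AspNetCompiler VirtualPath="/MyKenticoSite"
                        PhysicalPath="C:\Sites\MyKenticoSite"
                        TargetPath="C:\Builds\Precompiled"
                        Force="true" />
      </Target>
    </Project>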

ASP.NET Visual Studio 2013 extremely slow debugging

I have an ASP.NET application I want to debug on localhost. When I run it without debugging, it runs very fast and smoothly, with about 3 seconds per page load. However, when I try to debug the app with Visual Studio and Chrome or Firefox, every page takes about 20-30 or even 40 seconds to load, which is extremely slow. I have tried everything I found on the internet about these issues, yet none of it seems to help me out:
"Load all Symbols" from Microsoft Symbol servers, then uncheck that location
Delete all breakpoints
Uncheck "Enable property evaluation"
Other options I can't recall
What is a good debugging configuration for ASP.NET apps? Any extra suggestions that may help?
Thank you very much and kind regards,
David
It is possible that your Visual Studio is using a lot of memory. You should try turning off Browser Link, which will reduce the amount of memory being allocated. It is fine to disable Browser Link; the preview still works.
Here is a guide and explanation
http://blogs.msdn.com/b/webdev/archive/2013/06/28/browser-link-feature-in-visual-studio-preview-2013.aspx
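If you'd rather switch it off for one site instead of globally in the IDE, there is also an appSettings flag commonly used for this - a sketch, assuming it goes in the site's web.config:

    <appSettings>
        <add key="vs:EnableBrowserLink" value="false" />
    </appSettings>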
It is also possible that you have a lot of data or calculations being run in your page load.
Another possibility is a slow internet connection; this only applies if you have items or scripts on your page that come from an external source, like JavaScript, CSS, etc.

What are some reasons ASP.NET startup would be so slow

I have Visual Studio 2010 and a pretty large web application project running on IIS 7. Startup for the web application is over a minute (75 seconds). I've attached ANTS to it and very little of the 75 seconds is my code. Most of it seems to be something like CreateAppDomainWithHostingEnvironment and BuildManager stuff. Now, I know that ASP.NET will compile dynamically the first time, but I certainly don't expect it to compile for that long. Why could I be experiencing this problem, and what are some ways I can try to fix it or better understand what is taking so much time? Also, the CPU utilization doesn't seem to be that high. I have an awesome machine.
The problem with the 75-second startup is that the developers working on this have to wait those 75 seconds every time they make a change.
I am using .NET 4.0
EDIT
I ran Microsoft Network Monitor on my machine to see if there was anything suspicious going on the network. There wasn't, as far as I can tell, though I wasn't sure what to look for (I am familiar with Network Monitor, so I did have an idea of what I was doing). I tried to run it in a release build, and though it may have improved the performance a little bit, it's not really significant.
EDIT
I have SQL session state. As far as I can tell, the connection string points to the local machine. For some reason though, when examining the ANTS results, I'm getting a lot of PollLockedSessionCallback calls on many threads; the function seems to be called over 70 times. Does this help at all?
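(In case it helps anyone hitting the same thing: a quick way I could rule out SQL session state would be to temporarily switch to in-process state in web.config and see whether the polling and the startup delay go away - a sketch, assuming the default sessionState element:)

    <system.web>
        <sessionState mode="InProc" timeout="20" />
    </system.web>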
Try building the application in release mode. You can set this in the Build tab of the properties window. You might also consider pre-compiling when publishing the application before deployment.
Are you trying to access anything via a network share at startup? If so, bring those resources local for startup comparison.

My under-development local Drupal site became very slow, how do I solve it?

I am developing a site locally with Drupal and suddenly it became very slow. The last thing I did was install the internationalization module.
Now when I try to reach administration panel I receive:
Fatal error: Maximum execution time of 60 seconds exceeded...
What to do now? Should I increase the maximum execution time allowed? Or could it be that I have too many modules installed?
EDIT: Forgot to tell you that I am working on a PC with 2GB RAM and CPU 2.9 GHz, Windows XP + XAMPP
Exceeding 60 seconds of execution time is quite something - it indicates that something is going quite wrong.
I'd start troubleshooting by disabling modules (physically moving them out of your modules directory) one at a time until the problem goes away. Then, add them back one at a time, until the problem returns (you'll need to re-enable them through the Modules page as you go). You should be able to quickly isolate exactly which module is causing the problem.
Since the last thing you did was to install internationalization, I'd start by disabling that module.
Once you've isolated the module, you can try to work out what's going wrong.
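If the admin pages themselves time out before you can reach the Modules page, and you have Drush available, you can disable a module from the command line instead - a sketch, assuming Drupal 6/7 and the internationalization module's i18n machine name:

    drush pm-disable i18n
    drush cache-clear all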
Some things to look into ...
Is your database running out of space?
Are you missing any indexes?
Do you need to "update statistics" (rebuild metrics on table contents and column distributions)?
The Devel module can be useful for logging performance statistics, to help you track down the bottleneck.
A PHP accelerator may help you get the time down a bit. There are also a number of caching options that your site can use (look in admin under Performance); these may make developing more difficult but can make pages load faster.
I wouldn't increase your maximum execution time; at some stage you want to put your site live, and if people don't get a page within a second or so they will think the site is down.
To have too many modules installed you would have to have a lot of modules; it is more likely that one of your modules is causing a performance bottleneck, or that something on your site, like a view, is causing things to slow down. mattv's answer helps with that.
Try also activating the cache system under site settings / performance. It could be helpful.
There is a known and documented problem about massive queries getting dynamically built by the Views module when rebuilding the dynamic menu, apparently.
Unfortunately, no simple and definitive answer has been found, yet.
You can find more information here (please be aware that some answers relate to version 5).
I would really like to know how to fix this in a definitive and efficient manner.
Use Zend Server. For detailed information check this out: http://drupal.org/node/348202#comment-3349704

How can we improve our deployment and build systems?

We have 4 different environments:
Staging
Dev
User Acceptance
Live
We use TFS, pull down the latest code and code away.
When they finish a feature, the developers individually upload their changes to Staging. If the site is stable (determined by really loose testing), we upload changes to Dev, then User Acceptance and then Live.
We are not using builds/tags in our source control at all.
What should I tell management? They don't seem to think there is an issue as far as I can tell.
If it would be good for you, you could become the Continuous Integration champion of your company. You could do some research on a good process for CI with TFS, write up a proposed solution, evangelize it to your fellow developers and direct managers, revise it with their input and pitch it to management. Or you could just sit there and do nothing.
I've been in management for a long time. I always appreciate someone who identifies an issue and proposes a well thought-out solution.
Whose management? And how far removed are they from you?
I.e. if you are just a pleb developer and your managers are the senior developers, then find another job. If you are a senior developer and your managers are the CIO types, i.e. actually running the business... then it is your job to change it.
Tell them that if you were using a key feature of very expensive software they spent a lot of money on, it would be trivial to tell what code got pushed out when. That would mean in the event of a subtle bug getting introduced that gets passed user acceptance testing, it would be a matter of diffing the two versions to figure out what changed.
One of the most important parts of using TAGS is so you can roll back to a specific point in time. Think of it as an image backup. If something bad gets deployed you can safely assume you can "roll" back to a previous working version.
Also, developers can quickly grab a TAG (dev, prod or whatever) and deploy to their development PC...a feature I use all the time to debug production problems.
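For reference, a label can be applied from the command line as well as from Source Control Explorer - a sketch with made-up names:

    tf label "UAT-2010-06-01" $/MyProject/Main /recursive /comment:"Build deployed to UAT"

Getting that exact code back later is then just a matter of doing a get against the label (tf get /version:LUAT-2010-06-01).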
So you need someone to tell the other developers that they must label their code every time a build is done and increment a version counter. Why can't you do that?
You also need to tell management that you believe the level of testing done is not sufficient. This is not a unique problem for an organisation and they'll probably say they already know. No harm in mentioning it though rather than waiting for a major problem to arrive.
As far as individuals doing builds versus an automated build process goes, whether you really need that depends on how many developers there are and how often you do builds.
What is the problem? As you said, you can't tell if management sees the problem. Perhaps they don't! Tell them what you see as the current problem and what you would recommend to fix it. The problem has to be of the nature of "our current process has failed 3 out of 10 times, and implementing this new process would reduce those failures to 1 out of 10 times".
Management needs to see improvements in terms of: reduced costs, increased profits, reduced time, reduced use of resources. "Because it's widely used best practice" isn't going to be enough. Neither is "because it makes my job easier".
Management often isn't aware of a problem because everyone is too afraid to say anything or assumes they can't possibly fail to see the problem. But your world is a different world than theirs.
I see at least two big problems:
1) Developers uploading changes themselves. All changes should come from source control. Do you encounter times where someone made a change that went to production but never got into source control and then was accidentally removed on the next deploy? How much time (money) was spent trying to figure out what went wrong there?
2) Lack of a clear promotion model. It seems like you guys are moving changes between environments rather than "builds". The key distinction is that if two changes work great in UAT because of how they interact, if only one change is promoted to production it could break there. Promoting consistent code - whether by labeling it or by just zipping up the whole web application and promoting the zip file - should cause fewer problems.
I work on the continuous integration and deployment solution, AnthillPro. How we address this with TFS is to retrieve the new code from TFS based on a date-time stamp (of when someone pressed the "Deliver to Stage" button).
This gives you most (all?) of the traceability you would get from using tags, without actually having to go around tagging things. The system just records the time stamp, and every push of the code through the testing environments is tied to a known snapshot of the code. We also have customers who lay down tags as part of the build process. As the first poster mentioned - CI is a good thing - less work, more traceability.
If you already have TFS, then you are almost there.
The place I'm at was using TFS for source control only. We have a similar setup with Dev/Stage/Prod. I took it upon myself to get a build server installed. Once that was done, I added the ability to auto-deploy to Dev for one of my projects and told a couple of the other guys about it. Initially the reception was lukewarm.
Later I added TFS Deployer to the mix and have it set to auto deploy the good dev build to stage.
During this time the main group of developers were constantly fighting the "Did you get latest before deploying to Stage or Production?" questions; my stuff was working without a hitch. Believe me, management and the other devs noticed.
Now (6 months into it), we have a written rule that you aren't even allowed to use the Publish command in Visual Studio. EVERYTHING goes through the CI build and deployments. When moving to prod, our production group pulls the appropriate copy off of the build server. I even trained our QA group on how to do web testing, and we're slowly integrating automated tests into the whole shebang.
The point of this ramble is that it took a while. But more importantly, it only happened because I was willing to just run with it and show results.
I suggest you do the same. Start using it, then show the benefits to get everyone else on board.

Resources