How to identify the program creating large Visual Studio Extension files daily - visual-studio-extensions

I am having a space issue on my C drive. I found that large (~300 [MB]) Visual Studio Extension (vsix) files are landing at C:\Windows\Temp daily (at exactly the same time), and sometimes multiple times per day. I am not creating these. I was able to delete the existing ones to buy me some time. However, I want to identify the program / script that is creating these things, such that I can turn it off. How can I go about this?

It is probably Visual Studio trying to automatically update its extensions.
The program that actually downloads the files is typically a scheduled task that you can try to locate in Windows Task Scheduler.
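To make "locate it in Windows Task Scheduler" concrete, here is a minimal PowerShell sketch that lists scheduled tasks whose actions mention Visual Studio or VSIX files. The filter pattern is an assumption; the updater task may be named differently on your machine.

```powershell
# Sketch: find scheduled tasks whose actions reference Visual Studio or .vsix files.
Get-ScheduledTask | ForEach-Object {
    $actions = ($_.Actions | ForEach-Object { "$($_.Execute) $($_.Arguments)" }) -join ' '
    if ($actions -match 'VisualStudio|vsix') {
        [pscustomobject]@{
            TaskName = $_.TaskName
            TaskPath = $_.TaskPath
            Action   = $actions
        }
    }
}
```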

Related

Updating a single .NET assembly in a production environment

Does anyone know whether it is 100% safe to replace (copy + paste) an assembly with an updated version of itself, where all the version information (AssemblyInfo.vb) is exactly the same and the only difference is that a minor code change took place in one of the aspx.vb files?
It is safe as long as you are sure you didn't break your existing code (e.g., a method that no longer exists, a changed signature, ...).
If you load assemblies manually, make sure to update versions and public key tokens.
If you are not sure, you can duplicate your website into another folder, create a test IIS instance, and test the deployment of single files there.
Keep in mind that a clean, full deployment is safer than single-file deployments, which may cause breaking changes if you are not extremely careful. That may not be a problem on test instances, but it should never be done on a live environment.
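As a small aid for the "update versions and tokens" point above, here is a hedged PowerShell sketch (the paths are placeholders) that confirms the replacement assembly carries the same identity as the one it replaces:

```powershell
# Compare the full assembly identity (name, version, culture, public key token)
# of the live assembly and its replacement before copying anything.
$live = [System.Reflection.AssemblyName]::GetAssemblyName("C:\inetpub\MyApp\bin\MyLib.dll")   # placeholder
$new  = [System.Reflection.AssemblyName]::GetAssemblyName("C:\staging\MyLib.dll")             # placeholder

"{0}`n{1}" -f $live.FullName, $new.FullName
if ($live.FullName -eq $new.FullName) {
    "Identity matches; the swap is safe if the code change itself is compatible."
} else {
    "Identity differs; update references or binding redirects before deploying."
}
```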
It is definitely possible and easily done, but it can become a habit with a negative impact once the production system grows very large, becomes unstable, and has a large number of users.
There are a couple of approaches I would suggest trying, or at least being mindful of:
Create an installer (or multiple installers, depending on the projects in your solution). These installers produce the executable(s) to install on the production server; installation is done manually. Back up the previous installers for rollback.
Create a ClickOnce application. This can be run manually and can also be scheduled using a stable scheduling app.
Use Visual Studio's publishing function. After compiling the code in Release mode, it is published via file/network/FTP to the production space. This replaces all the markup files and assemblies.
To automate the process, you can use a TFS Build server and schedule daily or weekly builds. These installations can be done manually, or you can use SMS (Microsoft Systems Management Server) to schedule timely installations.
You can use Windows PowerShell as well (see the sketch after this list).
A TFS Build server and SMS carry cost implications, of course, but that is a small price to pay when problems in a production environment can bring a company down.
There are a couple of ways to do this. See what might work for you, and get into good habits when it comes to a production environment.
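To make the PowerShell option above concrete, here is a minimal sketch of a scripted file-system deployment with a backup for rollback. The paths are assumptions for illustration, not part of the original answer.

```powershell
# Minimal sketch, assuming a plain file-system deployment; adjust the paths to your environment.
$source = "C:\builds\MyApp\Release"       # output of your build step (placeholder)
$target = "C:\inetpub\wwwroot\MyApp"      # production site folder (placeholder)
$backup = "C:\backups\MyApp_{0:yyyyMMdd_HHmmss}" -f (Get-Date)

# 1. Back up the current deployment so you can roll back.
Copy-Item -Path $target -Destination $backup -Recurse

# 2. Copy the new build over the production folder.
Copy-Item -Path "$source\*" -Destination $target -Recurse -Force

Write-Host "Deployed $source to $target (backup at $backup)"
```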

Asp.net Deployment Strategies

I have heard lots of strategies for deploying asp.net applications, but am not quite sure which strategy is best for my needs.
I have an asp.net 4 application. I have separate development/staging/production environments (different web.configs). I also need to manage SQL Server changes. It is possible that I may have more than one DB server and more than one app server to push changes to. Ideally, I would like to hit a button, say "deploy to staging" or "deploy to production", and have it deploy the code/db/config files to the correct servers. Ideally, I'd also like some process to roll back in case of a bad release.
I have heard xcopy/robocopy strategies, MSDeploy (now called Web Deploy?) strategies, and building MSI packages to deploy.
Which of these seems like the best fit for this type of need?
Method #1
If you have some time to spend, I suggest using CruiseControl.NET. For a while at least, the Stack Overflow team used it for deployments.
Method #2
As far as copy strategies go, I recommend using a combination of 7-Zip and FTP for the application and media. 7-Zip is nice, as it allows you to exclude specific files (web.config), folders, and file types, and allows you to compress different files differently. For example, there is no point in compressing a PNG. Note that this does a full deployment every time, so if you have large media folders, I'd handle them separately.
As for the database, I believe you will have the best luck using SQL Compare by Redgate. Their tools are commercial, but they are very, very good. They have been positively mentioned multiple times on the Stack Overflow podcasts.
Build a CMD file on the development/build server that generates the master 7zip file and FTPs it to a dedicated folder on the staging (or production) server. I end up with multiple calls to 7zip feeding files into a single 7zip archive, using different compression methods for each batch.
Build a CMD file for each staging or production server. This file will execute proper file backups, and extract the 7zip file to the proper location.
A deployment to staging will go like this:
Execute your 7zip-prep command file which will trigger FTP upload to a dedicated FTP folder on the staging server
Execute DB changes against the staging database server via scripts generated by SQL Compare
Execute 7zip extraction command file on the staging server
This is the method that I use. I have not invested the time to master CruiseControl.NET, but when I do I will probably use it instead, at least for larger applications. It's not a one-click deployment, but it allows for multiple efficient deployments per day (as I've been doing off and on for a few years now). The 7zip method is nice because once you have your command files, you can copy them and use them for new projects very quickly.
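A rough PowerShell sketch of the 7-Zip + FTP flow described above (the author uses CMD files; the paths, server name, and credentials here are placeholders, and 7z.exe is assumed to be installed):

```powershell
# Sketch only: package the site with 7-Zip, excluding web.config, then upload one archive via FTP.
$sevenZip = "C:\Program Files\7-Zip\7z.exe"   # assumed install location
$siteRoot = "C:\src\MySite\publish"           # assumed local publish output
$archive  = "C:\deploy\MySite.7z"

# Build the archive; -xr! excludes matching files recursively.
& $sevenZip a $archive "$siteRoot\*" "-xr!web.config"

# Upload the single archive to a dedicated folder on the staging server.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("deployUser", "secret")   # placeholder credentials
$client.UploadFile("ftp://staging.example.com/deploy/MySite.7z", $archive)
```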

Speeding up ASP.NET website publishing

Is there any way, in an ASP.NET website project, to publish it from the command prompt so that I can continue working on the project, or, if that is not easy to do, at least to speed up my publish task?
I know about the auto publishing tools like TFS or CruiseControl, so please don't tell me these ways.
I am thinking of creating a .bat file that I'll run every time I have to publish, but it should not pick up changes I make while it is running.
Related: asp.net single file publish
I really like the answer given by Ludwo; providing more information on that would be very helpful.
You can use MSBuild to publish your websites in parallel. Start with this article; it is about publishing one website using MSBuild. Define your projects inside an ItemGroup and use the MSBuild task this way:
<MSBuild Projects="@(YourProjectsToBuildInParallel)" BuildInParallel="true" ... />
The final step is to enable parallel processing when you invoke MSBuild (the /m switch).
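As a hedged illustration of that approach, here is how a command-line publish might be invoked so Visual Studio stays free for editing. The MSBuild path, project path, and the "Staging" publish profile are assumptions, not part of the original answer.

```powershell
# Sketch: publish from the command prompt with MSBuild.
# /m builds projects in parallel; DeployOnBuild + PublishProfile trigger the publish step.
$msbuild = "${env:ProgramFiles(x86)}\MSBuild\14.0\Bin\MSBuild.exe"   # adjust to your MSBuild version

& $msbuild "C:\src\MySite\MySite.csproj" /m /p:Configuration=Release /p:DeployOnBuild=true /p:PublishProfile=Staging
```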
Open another Visual Studio instance to continue :P. The publishing mechanism can detect updated files and send only the changes, so don't upload the full site every time if it really bothers you.
Use source control and a build server mechanism. The build server should be able to pull from source control when you commit a change, build the project, run any unit tests you may (or should) have, and then deploy to a test site.
Depending on which build server platform you use, you may or may not have to do varying amounts of work. In the past I have used Bamboo by Atlassian. Fantastic product, but you have to configure the deployment using MSBuild; that is fine, but it can take some time to get it perfect. I am sure there are some good examples out there for it.
How it will work for you:
When you are finished working on a file/issue, you commit your changes. The build server will then detect these changes, wait a configurable amount of time (in case you commit more), e.g. 3 minutes, check out your changes, and deploy. You can set up notifications to go to your testing team when the deployment is done, with a link in the email saying where the site is and what changes occurred (based on your SVN commit log).
So your net effort is to check a file in with a correct comment - and you are finished.

Looking for a good web application deployment strategy (ASP.NET MVC3)

I'm looking for a good deployment strategy for deploying an ASP.NET MVC3 application. What I imagine is that each deployment would be some kind of commit to a Source Management System, in the sense that a deployment tool could automatically do the following:
1) Upon generating a deployment package (a commit), the tool would remember the state of my Web.config file, the state of a folder of auto-generated scripts containing new database changes, the state of a folder of batch files that contain new tasks to be run on the server, the state of files specifying IIS settings changes, etc.
2) When I build a package the next time, the tool would know to package only the new script files, web.config changes, new batch files, and new IIS settings since my last package.
3) Apply the package onto my web application.
I started looking into MS Deploy, but it only seems to do number 3. I've been searching around for either an application that does what I imagine or a strategy to combine a source management system and MS Deploy. I'm hoping that someone has already solved the problem I feel I have here. My last resort, of course, is to build the tool myself, but again, that would be my last resort.
Are you using Team Foundation Server? If so, TFS comes with tools to automate builds (including labeling code, running unit tests, deploying, et cetera.) Take a look at http://msdn.microsoft.com/en-us/library/ms181710(v=vs.80).aspx
TFS is not exactly easy to configure and get going but it's free if you are already using TFS.
If you are not using TFS, look for continuous integration tools like NAnt or TeamCity.
Have you used Web Deploy and the "Publish" feature under Build in Visual Studio?
You can set options for things like leaving the previous files on the server.
About your web.config file: do you mean the main one, or one that already exists elsewhere on the server? Your web.config should copy from your project to the server; or are there settings that differ when running locally versus on the server? If so, look at using transforms to modify web.config.
This is only a partial answer to #1 for you, but we looked for a long time for a migration tool that we liked... We ultimately found Migrator.Net: http://code.google.com/p/migratordotnet/
Doing this, you can turn db migrations into a batch command.

Publishing my web application can take a long time because of the .suo file

I have a web application project that I publish via Visual Studio 2010 to my server. My problem is that it can take a very long time before it actually publishes, maybe 10 minutes! It doesn't happen every time, but very often.
Here's a summary of what I have in the Output -> Build console when I try to publish:
The project is compiling - OK
Connecting to C:\Users\{user}\Desktop\MyProjectTest... (this is where it can take up to 10 minutes)
The files are publishing...
Process Explorer
When I open Process Explorer, I see that devenv.exe is taking all the CPU. When I open this process, I see that the thread consuming all the CPU is clr.dll!StrongNameSignatureVerification+0x11ee1. As soon as this thread finishes, after 10 minutes, the publishing finishes quickly.
Process Monitor
With Process Monitor, I monitored the TID of clr.dll!StrongNameSignatureVerification and I got MANY redundant events. For over 5 minutes, the task tries to access a file that I don't have on my computer: it is searching for Microsoft.Build.Task.resources.dll. It's as if the publishing task was trying again and again something that doesn't exist. For your information, I'm using Windows 7 French with Visual Studio 2010 English. On the screenshot, you see about 10 of over 2000 events of the same thing!
More info on my setup
Here is some info that can help to identify the problem:
My application is built with MVC3
I have a few third party dlls. Some of them are signed.
I'm publishing with the File System method.
I tried to publish on my local computer and the problem is also there, so it's not a network problem between my computer and my server.
I have tested on Windows 7 x86 & x64 French edition
My Visual Studio 2010 SP1 is the English edition
UPDATE 2011-09-23
I now know how to solve the problem, BUT I don't know what is causing it. If I delete the .suo file (at the same level as the .sln file) and reopen Visual Studio, the publishing is really fast. So reinitializing the .suo file seems to solve the problem each time the publishing gets slow.
Just to make another test, I made a backup of the .suo file when the publishing was slow and then deleted it. Now the publishing is fast. If I copy the .suo file back to its position and reopen Visual Studio, the publishing becomes slow again. So everything seems to point to that file.
Any idea on this one?
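For reference, a minimal sketch of the workaround described in the update (close Visual Studio first; the solution path is a placeholder):

```powershell
# Delete the hidden .suo file next to the .sln so Visual Studio rebuilds it on the next open.
Get-ChildItem -Path "C:\src\MySolution" -Filter *.suo -Force | Remove-Item -Force
```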
Try it this way.
In order to deploy the release on the development or production server, follow these steps:
Install the Web Deployment MSI.
Right-click your project in Solution Explorer and add a Web Deployment Project (here I am not using Convert to Web Application or publishing).
Then compile the files. This will create a folder in your project directory containing the files to be deployed to the server.
Take a backup of your virtual directory, then remove the virtual directory as well as the files from inetpub.
Go to IIS Manager: type inetmgr in Run and hit Enter.
Under the default website, create a virtual directory, keep the deployed files in inetpub, and browse the files.
Allow appropriate access such as read, run scripts, and browse. That's all.
Flag it as your answer if you find it useful; otherwise let me know ...
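If you prefer to script the virtual directory step above, here is a rough sketch using the WebAdministration module; the site name and paths are placeholders, and the module ships with the IIS PowerShell feature rather than with Visual Studio.

```powershell
# Sketch: create the virtual directory under the default website.
Import-Module WebAdministration

New-WebVirtualDirectory -Site "Default Web Site" -Name "MyApp" -PhysicalPath "C:\inetpub\MyApp"   # placeholder path
```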
I am not sure if it's the .suo file that is causing it, but for me this solved the problem.
After compiling, the publish will call aspnet_compiler, which actually takes longer as it generates assemblies for all the code.
But check your VSPackages; there may be a package installed for some kind of interpretation that is interrupting your publish.
I just experienced the same problem publishing to a network share and discovered that copying the files in Windows Explorer was also extremely slow. When I zipped the build folder and copied it across, it took a few seconds. I conclude that the VPN, antivirus, or firewall at one end or the other is adding some overhead to every file transfer.
Windows is far faster transferring one giant file than transferring thousands of tiny files, even if the net size is the same. So try this:
Publish to a local file (not directly to the IIS file share).
Zip the local files (these files will compress well).
Use File Explorer to delete the IIS file share files.
Copy/paste the local zipped file to the network share.
Unzip the files on the server using File Explorer.
(You will not have to remote into the server to do this.)
This accomplishes a couple of things: 1) it is one giant file, not thousands of tiny files; 2) the compressed file will be 50% to 80% smaller, so the data transferred over the wire will be that much smaller.
If you need a backup, it's the same process in reverse, but without publishing. I typically use 7-Zip, but the built-in Windows zip will work. I don't know why Visual Studio cannot do this programmatically.
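A small PowerShell sketch of that zip-then-copy flow (the paths and share name are placeholders; Compress-Archive and Expand-Archive require PowerShell 5 or later, otherwise use 7-Zip as described above):

```powershell
# Sketch: transfer one compressed archive instead of thousands of small files.
$publishDir = "C:\deploy\MySite"              # local publish output (placeholder)
$zipPath    = "C:\deploy\MySite.zip"
$share      = "\\webserver\wwwroot\MySite"    # IIS file share (placeholder)

Compress-Archive -Path "$publishDir\*" -DestinationPath $zipPath -Force

# Clear the share, copy the single archive across, then unzip it in place.
Remove-Item "$share\*" -Recurse -Force
Copy-Item $zipPath $share
Expand-Archive -Path "$share\MySite.zip" -DestinationPath $share -Force
Remove-Item "$share\MySite.zip"
```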
