I use the "publish website" option to directly publish my ASP.NET - website to my ftp-server. This works quite nice.
The problem is that the biggest part of my project is the DLL files in the bin directory, which are external libraries that I update only rarely.
So I do not want them to be uploaded every time. For my local resources I can select whether these files should be uploaded every time, never, or only when changed, but I cannot find this option for files in the bin directory.
Any way to solve this?
Maybe I'm totally outdated, but for the last four years I've been using the simple FTP upload feature when deploying a new website, without even building it in Visual Studio. Just a bunch of ASPX and CS files, exactly as they are in Visual Studio.
I do understand that compiling the project gives me some security (people with access to the server won't be able to read the source in a text editor) and lets me avoid the first-time compilation, but is that so important?
I mean, someone with access to the server can do far more harm than just reading CS files instead of DLLs.
First-time compilation usually takes no more than a minute; just locating the compiled version of the site would take as much time.
Now I'm watching a video on Pluralsight that explains the new MSDeploy tool available for ASP.NET, and I can't see any good reason to use it.
So what's wrong with the old-fashioned way of just sending files via FTP, without compiling or using fancy tools?
I did a speed test, and with MSDeploy I can deploy a website twice as fast as with old-fashioned FTPing. So instead of 4 minutes it will take 2.
From another perspective, though: suppose I already have a live project on the web and I have to change Default.aspx because of a typo in some HTML tag. Deploying via MSDeploy will take 10 times longer than uploading that one file.
Maybe I'm missing something?
MSDeploy does things which FTPing to a site can't do. Need to change a machine.config? You're unlikely to have FTP write access to the folder which contains it. Want to change a server setting in a server-version-independent manner? FTP won't do that. Etc. FTP works fine for copying files to folders in which you have write access, but that's all it can do.
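For plain file copying, too, a sync with msdeploy.exe only transfers what differs between source and destination, and you can preview it first. A minimal sketch, assuming the remote agent service is running on the server (the site name, local path, and server name below are placeholders):

    rem Preview which files would change (-whatif), then run again without it to deploy
    msdeploy -verb:sync ^
        -source:contentPath="C:\Projects\MySite" ^
        -dest:contentPath="Default Web Site/MySite",computerName=webserver01 ^
        -whatif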
When you deploy a project you can do a lot of things with it.
You can set up a job in your deployment that packages all your JavaScript into one file and all your CSS into one file (see the sketch after this answer).
You can set up a job in your deployment that changes a bunch of config settings to match your production server settings (rather than development settings).
The idea of deployment is that you take your current development website and transform it into a production website without having to do any of that manually.
The most important thing is that when deploying is the only way to update your website, you will never forget to package your JS or to remove some debugging code, because you can't just sneakily update a single file.
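As a trivial illustration of the packaging idea (the file names are made up), such a job might concatenate the scripts with a plain copy step so that nobody has to remember to do it by hand:

    rem Concatenate the individual scripts into a single bundle during the deploy job
    copy /b Scripts\jquery.js + Scripts\site.js Scripts\bundle.js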
I have an ongoing situation where, when I try to upload files via FTP, I get an error that the DLL is locked and currently cannot be overwritten. This only happens to DLLs; normal files (aspx, ascx, css, etc.) can be overwritten fine.
Our Setup
We have 2 webservers that are kept in sync via DFS which is managed from a separate server.
They all belong to the same domain.
They all do internal transfers on 1GB ethernet cards on a private network.
Our Problem
We develop in VS2010 and build the site we are working on. When it gets to a level where it needs to be checked on the server, it's hit and miss whether we can overwrite the DLLs in the bin folder. I only started experiencing this issue when we migrated from our old, unreliable sync tool to the super Windows 2008 DFS tool. It's a good tool and works well, but it is the only thing I can think of that's causing this issue.
To actually overwrite the file I need to take down all the sites that are using this base level code which then releases the lock on the DLL and I can upload it.
I come here today in desperation; I am fed up with having to take sites down every so often just so I can upload a DLL.
It is my understanding that ASP.NET shadow-copies the DLLs into a temporary folder, so god knows why the lock remains on the DLL itself in the bin folder.
The weird thing is that this does not always happen; it can go for weeks without doing it. Or, like recently, it's nearly every day that I have to take the IIS sites down so I can upload.
As of writing this, I cannot upload to FTP even though I have taken the sites down.
Could anyone please shed some light on this so I can actually get on with my work rather than messing with this every ten minutes? It's bad enough that VS2010 is so unstable and Visual SourceSafe only checks in what it wants, without this being an issue as well!
Try using Unlocker to free the handles.
I'm trying desperately to move from VSS to a real source control system. Options include TFS and SVN.
My designers need to keep their ability to modify source files and instantly preview their changes in a browser without having to commit their changes. Using FPSE with VSS, this works flawlessly, since saving a file causes the copy in the working folder on the dev server to be updated, so they can just save and refresh their browser which is pointed at the dev server.
The site in question consists of 350k+ lines of classic ASP code and some new ASP.NET MVC. They only need to be able to modify views within the MVC code, not C#.
Though Expression includes a version of Cassini for local debugging, Cassini does not support classic ASP.
Surely someone has solved this problem before. It can't be necessary to install IIS on each designer's machine (this is absolutely untenable). I need a way to have a common working folder on a dev webserver updated whenever someone saves a file locally, just like using FPSE.
I'd rather not write an FPSE proxy that knows how to talk to TFS/SVN. Any suggestions?
(I know I've asked this question in the past, but I haven't yet found a solution.)
Why the need to copy the source files when they are saved? Why not simply save the files to a network share and work on them directly? If the dev server is constantly being overwritten after every save anyway, surely the effect is the same?
This probably won't be as instantaneous as you'd like, but with TFS you could set up a Continuous Integration (CI) build that builds and deploys the project to a test server on check-in. If you do this, you'll want them checking in to a QA-type branch; then, once they are happy with how things look, they can merge to the mainline branch for the real build and integration.
My friend and I are collaborating on an ASP.NET-powered website. To develop it locally, we use Visual Web Developer Express (good enough for our needs). Subversion (using TortoiseSVN) is our source control of choice, with the repository residing on Unfuddle.com.
We run into problems when we need to update the live site - since there's no version control on it. Currently we use the "Copy to Website" feature in VWD which copies the files using FTP. Here are some problems:
VWD only keeps track of files uploaded by one user, so if the other user uploads a newer version of a file to the live site, VWD on my side cannot tell whether the live version of the file is newer or mine is.
There's no way to tell whether all the latest changes are available on the live site.
We have to be careful not to party all over the shared web.config file since the other user's local DB settings are different from mine, and of course, the live DB settings are a whole other story!
What do you guys use to publish to a live site? Does anything out there tie into Subversion so that we can automate the process and always guarantee that the live site is synced to a change list number? Also, how do you manage the different web.config file settings?
Thanks!
Well...
Wait another two weeks and you will have Visual Studio 2010. Lots of nice things for you:
Different web.config settings for local and production: basically you can have one web.config, and VS 2010 will automatically merge a delta file on publication. This has been a real weak point so far. Check http://weblogs.asp.net/gunnarpeipman/archive/2009/06/16/visual-studio-2010-web-config-transforms.aspx
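For example, a Web.Release.config transform along these lines (the connection string name and values are placeholders) swaps in production settings and strips the debug flag when you publish:

    <?xml version="1.0"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <!-- Replace the matching connection string's attributes with production values -->
        <add name="MainDb"
             connectionString="Server=prod-sql;Database=MySite;Integrated Security=True"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
      <system.web>
        <!-- Remove debug="true" from the deployed web.config -->
        <compilation xdt:Transform="RemoveAttributes(debug)" />
      </system.web>
    </configuration>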
A more modern publishing mechanism, working with packages. You need a host that supports it, but seriously, the FTP publishing mechanism was retired a LONG time ago. http://blogs.msdn.com/webdevtools/archive/2009/02/09/web-packaging-creating-a-web-package-using-vs-2010.aspx
Any Subversion tie-in? Sure, any CI server can do that. The point, though, is that you don't want one. Version control != website publishing. You don't publish every check-in. In non-trivial, non-playing-around setups, changes are made in development, then moved to a test server, then possibly to an integration server, and then finally to production. You don't publish "as you go".
Does anybody have an idea how to use the "publish website" command in VS 2008 and be able to track changes so as to send only the modified files to the hosting server?
When the command is called, the files in the destination folder are wiped and replaced with the result of the new build (assembly files are created, as well as some marker files). As my website gets bigger and bigger, I have to transfer all the assemblies in my bin directory to the server and keep track of which other files I may have modified.
Is there a better way of doing this?
PS: I use FileZilla to transfer my files to the server.
Publish to a local directory, then use a diff tool (such as WinMerge) to find and copy the modified files to the server.
You can publish locally and use any mechanism of your choice to transfer the files.
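For instance, if the web server is reachable over a network share, a quick sketch with robocopy (the paths are placeholders) copies only the files that are new or have changed since the last publish:

    rem Copy only new or changed files from the local publish output to the server
    robocopy C:\Publish\MySite \\webserver\wwwroot\MySite /E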
Have you tried the Website -> Copy Website menu item? It seems to know which files have changed.