I recently published my website and uploaded it to a GoDaddy server. After that I made changes to both .aspx and .aspx.cs files. When I uploaded those files to the server, the site threw an error. Does anybody have an idea how to upload only the changes to a published ASP.NET website?
Thanks in advance!
Here is what I do:
I keep a copy of the older version locally.
Then I run a diff tool (folderpatch) to compare it with the new version
Generate a patch (a zip file)
Upload the patch
Patch the online version to make it the same as my new version
Using a tool ensures that you don't miss any updated files.
Uploading a single patch file is also a lot faster than uploading many individual files.
Using the patch tool on the server ensures that all files end up in the correct locations.
P.S.: I wrote folderpatch to make my own life easier, but it's free and open source, so feel free to use it wherever it helps you.
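If you want the general idea without the tool, a minimal C# sketch of the diff step might look like this (an illustration only, not folderpatch itself: it hash-compares the two trees and zips up anything new or changed, and ignores deleted files):

using System;
using System.IO;
using System.IO.Compression; // on .NET Framework, reference System.IO.Compression.FileSystem
using System.Security.Cryptography;

class PatchBuilder
{
    // Hash a file so we compare contents rather than timestamps.
    static string HashFile(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(md5.ComputeHash(stream));
    }

    static void Main(string[] args)
    {
        string oldDir = args[0], newDir = args[1], patchZip = args[2];

        using (var zip = ZipFile.Open(patchZip, ZipArchiveMode.Create))
        {
            foreach (var newFile in Directory.EnumerateFiles(newDir, "*", SearchOption.AllDirectories))
            {
                string relative = newFile.Substring(newDir.Length + 1);
                string oldFile = Path.Combine(oldDir, relative);

                // Include the file if it is new or its contents differ from the old copy.
                if (!File.Exists(oldFile) || HashFile(oldFile) != HashFile(newFile))
                    zip.CreateEntryFromFile(newFile, relative);
            }
        }
    }
}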
Maybe I'm totally outdated, but for the last four years I've been deploying websites with a simple FTP upload, without even building them in Visual Studio. Just a bunch of ASPX and CS files, exactly as they sit in Visual Studio.
I do understand that compiling the project gives me some security (anyone who gains access to the server won't be able to read the source in a text editor) and saves the first-hit compilation, but is that so important?
I mean, anyone who has access to the server can do far worse harm than reading CS files instead of DLLs.
The first-hit compilation usually takes no more than a minute, and just hunting down a precompiled copy of the site would take as long.
Right now I'm watching a Pluralsight video that explains the new MSDeploy tool available for ASP.NET, and I can't see any good reason to use it.
So what's wrong with the old-fashioned way of just sending files via FTP, without compiling or using fancy tools?
I did a speed test, and with MSDeploy I can deploy a website twice as fast as with old-fashioned FTPing: 2 minutes instead of 4.
Now look at it from another perspective: I already have a live project on the web and have to change Default.aspx because of a typo in some HTML tag. Deployment via MSDeploy will take ten times longer than uploading that one file.
Am I missing something?
MSDeploy does things which FTPing to a site can't do. Need to change a machine.config? You're unlikely to have FTP write access to the folder which contains it. Want to change a server setting in a server-version-independent manner? FTP won't do that. Etc. FTP works fine for copying files to folders in which you have write access, but that's all it can do.
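For illustration, a typical MSDeploy sync from a local folder to a remote IIS site looks something like this (the server name, site name, and credentials here are placeholders):

msdeploy.exe -verb:sync -source:contentPath="C:\Sites\MySite" -dest:contentPath="Default Web Site/MySite",computerName="https://myserver:8172/msdeploy.axd?site=MySite",userName="deployUser",password="secret",authType="Basic" -allowUntrusted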
When you deploy a project you can do a lot of things with it.
You can set up a job in your deploy that packages all your JavaScript into one file and all your CSS into another.
You can set up a job in your deployment that changes a bunch of config settings to match your production server settings (rather than development settings).
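For example, ASP.NET's web.config transforms are one common way to do the config part at publish time; a minimal Web.Release.config that strips the debug flag looks like this (the standard transform, shown here as a generic sketch):

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <!-- Remove debug="true" from <compilation> in the published web.config -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>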
The idea of deployment is that you take your current development website and transform it into a production website without having to do any of that manually.
The most important thing is that when deploying is the only way to update your website, you can never forget to package your JS or to remove some debugging code, because you can't just sneakily update a single file.
I use the "publish website" option to directly publish my ASP.NET - website to my ftp-server. This works quite nice.
The problem is the biggest part of my project are DLL-Files in the bin-directory which are external libraries that I only update quite rare.
So I do not want then to be uploaded every time. With my local resources I can select whether these files should be uploaded every time, never or only when changed, but I do not find this options for files in the bin-directory.
Any way to solve this?
I'm trying desperately to move from VSS to a real source control system. Options include TFS and SVN.
My designers need to keep their ability to modify source files and instantly preview their changes in a browser without having to commit their changes. Using FPSE with VSS, this works flawlessly, since saving a file causes the copy in the working folder on the dev server to be updated, so they can just save and refresh their browser which is pointed at the dev server.
The site in question consists of 350k+ lines of classic ASP code and some new ASP.NET MVC. They only need to be able to modify views within the MVC code, not C#.
Though Expression includes a version of Cassini for local debugging, Cassini does not support classic ASP.
Surely someone has solved this problem before. It can't be necessary to install IIS on each designer's machine (this is absolutely untenable). I need a way to have a common working folder on a dev webserver updated whenever someone saves a file locally, just like using FPSE.
I'd rather not write an FPSE proxy that knows how to talk to TFS/SVN. Any suggestions?
(I know I've asked this question in the past, but I haven't yet found a solution.)
Why the need to copy the source files when they are saved? Why not simply save the files to a network share and work on them directly? If the dev server is constantly being overwritten after every save anyway, surely the effect is the same?
This probably won't be as instantaneous as you like, but with TFS you could set up a Continuous Integration (CI) build that builds and deploys the project to a test server on check-in. If you do this, you'll want them checking in to a QA type branch, then, once they are happy with how they look, they can then merge to the mainline branch for the real build and integration.
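If working directly on a share isn't acceptable, a small watcher in the spirit of FPSE could mirror saves to the dev server. A rough C# sketch (paths are hypothetical; no conflict or rename handling):

using System;
using System.IO;

class SaveSync
{
    static void Main()
    {
        string local = @"C:\Work\Site";             // hypothetical local working copy
        string share = @"\\devserver\wwwroot\Site"; // hypothetical dev-server share

        var watcher = new FileSystemWatcher(local) { IncludeSubdirectories = true };
        watcher.Changed += (s, e) => Copy(e.FullPath, local, share);
        watcher.Created += (s, e) => Copy(e.FullPath, local, share);
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for saves; press Enter to stop.");
        Console.ReadLine();
    }

    static void Copy(string path, string localRoot, string shareRoot)
    {
        if (!File.Exists(path)) return; // skip directories and deletions
        string dest = Path.Combine(shareRoot, path.Substring(localRoot.Length + 1));
        Directory.CreateDirectory(Path.GetDirectoryName(dest));
        try { File.Copy(path, dest, true); }
        catch (IOException) { /* the editor may still hold a lock; the next save will retry */ }
    }
}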
I am using Visual Studio 2008 and trying to publish a Web Application Project, but it keeps failing when trying to add files to the project. Below is a sample of the message:
Publishing folder JavaScript...
Unable to add 'JavaScript/hoverIntent.js' to the Web site.
Unable to add file 'JavaScript\hoverIntent.js'. The specified file could not be encrypted.
This happens for image files too, and I am lost as to why. I should add that I am using Windows 7 build 7100, and I'm not sure whether that is causing the issue.
Any help greatly appreciated
I know this is an old topic, but I found it when I googled for the same problem.
My solution was to remove the "Encrypt" flag from Windows Explorer for the files listed (Right click -> Properties -> Advanced)
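If there are a lot of files, the cipher command should do the same thing in bulk (/d decrypts, /s: recurses into a folder tree; the path is a placeholder):

cipher /d /s:C:\Path\To\YourProject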
This blog post at BlackMarble suggests that you may have the target directory set to use encryption. It sounds like the exception you're seeing comes from the VS publish process being unable to handle that.
To get around this problem:
Use VS to publish to an intermediate directory, somewhere on your PC perhaps.
Copy the files to the server yourself, with a batch file maybe (see the sketch below).
That's a workaround, at least.
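Something like this in the batch file would copy only files newer than what is already on the server (paths are placeholders; /D copies only newer files, /E includes subdirectories, /Y suppresses overwrite prompts):

xcopy "C:\PublishOutput" "\\webserver\wwwroot\MySite" /D /E /Y /I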
Disable the Windows Encrypting File System (EFS) from cmd with the following:
fsutil behavior set disableencryption 1
Then restart your PC.
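To turn encryption back on later, the same command with 0 instead of 1 should work, followed by another restart:

fsutil behavior set disableencryption 0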
When I had this problem publishing a Visual Studio 2010 web project, either to a local folder or to a host, I was stumped. Visual Studio didn't indicate which files, or even which folders, had caused the problem. I wasn't aware there were any encrypted files in the solution, and I couldn't find any. I was unable to update my website.
I googled how to find encrypted files, but none of the solutions involving efsinfo.exe worked on Windows 7. Then I found an example using the cipher command:
https://superuser.com/questions/58878/how-to-list-encrypted-files-in-windows-7
There were a number of different answers to finding the encrypted files. I used the command prompt method.
I opened a command prompt in the root of my application and did:
D:\Data\Code2011>cipher /s:MyWeb >Encryption.txt
I then did a case-sensitive search in Encryption.txt for lines beginning with "E " (E followed by a space) or containing "the file is encrypted".
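findstr works for that search; it is case-sensitive by default, and /b anchors the match to the start of the line:

findstr /b /c:"E " Encryption.txt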
I found two .htc files in a styles subfolder that were encrypted, and I was able to decrypt them from the Advanced tab of the file properties dialog in Explorer.
The site then compiled and published OK.
I had this issue as well. I set the source files' properties to not be encrypted, but that still wasn't working. It turned out that the files were cached in the temporary deployment folder, and I had to uncheck encryption there as well. Deleting the temporary deployment directory would probably also have worked, but unchecking the flag did the job.
Does anybody have an idea how to use the "publish website" command in VS 2008 and track changes so that only the modified files are sent to the hosting server?
When the command is called, the destination folder's files are wiped and replaced with the result of the new build (assembly files are created, as well as some marker files). As my website gets bigger and bigger, I have to transfer all the assemblies in my bin directory to the server and keep in mind which other files I may have modified.
Is there a better way of doing this?
P.S.: I use FileZilla to transfer my files to the server.
Publish to a local directory, then use a diff tool (such as WinMerge) to find and copy the modified files to the server.
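If you'd rather script the copy than eyeball the diff, robocopy skips files whose size and timestamp are unchanged, which amounts to the same thing (paths are placeholders; /E includes subdirectories, /XO avoids overwriting files that are newer on the server):

robocopy "C:\PublishOutput" "\\server\wwwroot\MySite" /E /XO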
You can publish locally and use any mechanism of your choice to transfer the files.
Have you tried the Website -> Copy Website menu item? It seems to know which files have changed.