I have an ASP.NET Web Site Project that is being moved into TFS. There is a folder used for user-uploaded files (e.g., company logos, Excel spreadsheets) that needs to be kept. I'm trying to figure out a good way to manage these files without placing the folder in TFS (it's really big), while making it easy for new developers to get the folder structure onto their local machines for development.
I was thinking of doing the following and was wondering if this is a good way of doing it, or if there are better alternatives:
Create a script which, when executed, creates the folder structure of the storage folder (a sketch is below). This script would be placed in source control.
New developers could grab this file and execute it on their local machine.
To make sure the folder itself is not added to source control, have each developer exclude it from their local project.
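A minimal sketch of such a script, written as a small C# console utility (the folder names here are hypothetical; a batch or PowerShell script would do just as well):

    using System.IO;

    class CreateUploadFolders
    {
        static void Main(string[] args)
        {
            // Root of the local working copy; pass it as the first argument.
            string root = args.Length > 0 ? args[0] : ".";

            // Relative paths making up the upload folder structure (hypothetical).
            string[] folders = { @"Uploads\Logos", @"Uploads\Spreadsheets" };

            foreach (string folder in folders)
            {
                // CreateDirectory is a no-op if the folder already exists.
                Directory.CreateDirectory(Path.Combine(root, folder));
            }
        }
    }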
Store the folder on a NAS - no need for the files to be part of the source-controlled code.
I have always used the Express versions of Visual Studio for my ASP.NET projects. In the past, I would use a basic FTP synchronizer to push updated files (*.vb) to our server, and the changes would show up on the website instantly. Now, for some reason, when I make changes to our *.vb files, they are not reflected on the server after I synchronize over FTP unless I build the project first. In addition, for our .NET 4.0 project, VS 2015 14.0.23107 is adding the following directories, with tons of stuff inside them:
/.vs
/My Project
/Obj
There are loads of files within these directories, and I have no idea what they do. For some reason our project has taken on a completely different behavior: when we try to synchronize over FTP, there are a ton more files, and changing the actual underlying source doesn't take effect until we also synchronize all the other files in the above directories.
Is this a new way they are doing things, or is this because VS is now free and we are getting a better version where we have to "publish" rather than "synchronize"?
Is there a way to go back to the simple way of doing things, where we just have a plain directory with our source files and sync them over to the server? Should we not do it this way? If not, what method should we be using and what files should we be pushing to the server?
I'll just promote my comment to an answer. There are several aspects of this question:
Use Publish. This feature has long been available in Visual Studio and works well. There are plenty of ways to customize it, and it supports a lot of technologies, including FTP. It's also a more convenient, systematic, and reliable way of deploying than manually copying files to your FTP server. You can also share your publishing configurations among developers and store several of them (a command-line sketch for running a saved profile is at the end of this answer). No loss here.
I don't quite get why you would want to copy the source (.vb) files to the server. What you usually want is to get the compiled DLLs plus resources copied to your server, with the source files staying 'secure' on the developers' machines. You can compile your sources on the server if you really need to, but then plug it into source control, use MSBuild, etc. Anyway, the build/publish actions are there to prepare the deployment files for you; manual copying is just bad practice.
For the new folders:
obj is anything but new; it's created to store intermediate build output and other cruft. More here: What is obj folder generated for?
.vs stores user-specific settings, and you should ignore it, just like the obj folder (a sample ignore file is sketched after this list). More here: Should I add the Visual Studio 2015 .vs folder to source control?
My Project is the standard folder of a VB.NET project (the VB counterpart of the C# Properties folder, holding assembly info and settings), so nothing to worry about there.
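For example, a minimal ignore sketch covering the build artifacts above, if the project is in Git (TFVC users can put the same patterns in a .tfignore file):

    .vs/
    obj/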
To sum up: as long as you use ASP.NET 4/4.5, nothing changes. Only ASP.NET 5 introduces somewhat different rules for deployment. Most of the problems you describe are easily solved by using the right tool (Publish). It knows what files to ship (binaries plus resources included in the project) and what to ignore (source files, caches, cruft). It's convenient, less error-prone, and can do much more for you.
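As an aside on the Publish point above: a saved publish profile can also be run outside the IDE via MSBuild, which is handy for sharing one deployment setup across a team. A hedged sketch, assuming a Web Application project and a profile named MyFtpProfile (both names are hypothetical):

    msbuild MySite.vbproj /p:DeployOnBuild=true /p:PublishProfile=MyFtpProfile /p:Configuration=Release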
Definitely use the "Publish" option (right-click your web application in Solution Explorer, under the run/build options), so you can update your server site with the files created on publish. As Mikus mentioned, you DON'T need .vb files on your published site; you just need the DLLs and resources (images, JS, CSS, .resx, etc.).
Regards, hope it helps.
Use the Publish Option which is provided by Visual Studio.
This will compile your project, and you can then host the output in whatever manner you prefer.
I personally host on IIS, and since I have no data stored locally, I can publish directly to the site's path on the IIS server.
The Publish tool is very simple and only takes a few minutes.
I have a website with a huge number of pages, and I keep a pre-compiled version (with fixed naming) on the production server.
Every time I make any change to my code, I have to publish the whole website, even for a small change.
It takes about an hour to get the website published before I can deploy my changes to the production server.
Is there a way to publish only a batch of pages so that the Publish process is faster?
Is there any other option to save the publishing time?
NOTE: By publishing I mean pre-compiling
Any suggestions are welcome.
If you're modifying only the HTML markup (not the server-side tags) or CSS, you can deploy only the part you changed.
If it's compiled code, you have no choice.
I think you might have to ask yourself why it's taking an hour to publish your website. Is your compilation time really that long?
One method to reduce the compile time and size of a website project is to split your website into several smaller, more maintainable sites.
You can still deploy these separate publishes together in production.
References to pages from other projects work perfectly. All your pages within the same application on IIS will share the same session. So to an end user, this will still appear to be one website.
Since you reduce the work to be done when publishing any given module, publishes will be faster. Divide your modules into whatever batches seem suitable.
You may already be aware of this, but I will say it for completeness. When you publish a website, you get the option Use fixed naming and single page assemblies. Select this to get a separate DLL for each page in your bin directory. You then only need to upload the pages and corresponding DLLs where you made changes. If upload time is a concern, this will take care of it.
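If you precompile from the command line rather than through the dialog, the same behavior should come from the -fixednames switch of aspnet_compiler.exe (the paths here are hypothetical):

    aspnet_compiler -v /MySite -p C:\src\MySite -fixednames C:\deploy\MySite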
Microsoft doesn't really have a notion of partial "pre-compiling": if you notice, your pages have three components, the *.designer, the *.aspx, and the *.cs. The *.cs files all need to be compiled into a *.dll to be deployed to your website. Traditionally there are two types of executables, EXEs and DLLs; ASP.NET websites are compiled into a DLL holding all the code-behinds that run on the server. Microsoft does not have a way to "half" compile a DLL and then merge it with the other half you haven't changed.
If your website is taking that long to compile and deploy, I would suggest you have more of an architecture problem than a code problem. Where I work, our main website is 3,000,000+ lines of code to accomplish everything the user needs to do, and we don't take an hour to deploy. What we have done is break our business logic up into a number of DLLs (over 100), so our website project in and of itself is just the aspx pages and the bare-bones code-behind that drives the flow through to our business logic. This lets us alter a handful of DLLs to support a new feature; we don't have to deploy all 100 DLLs every time, just the ones that have changed - that's the nature of DLLs. If our business logic were 100% contained in our website project, our compile and deployment would be significantly longer.
You want to consider refactoring your code into DLLs. Another option, if you're not married to the ASPX solution, is to consider ASP.NET MVC. Either way, I would consider refactoring your site; if it takes that long, there are some serious issues. Even just breaking data access into a separate DLL would mean you no longer have to compile and deploy that DLL every time you change the website, only when you change the data-access layer as well.
As previous posts mention, you cannot do this in an automatic fashion, but you could manually deploy your files if you want to reduce your publishing time.
When publishing a website, all code files for your site are compiled into a single .dll file in the website's bin folder, and all .aspx files are deployed to their relevant paths.
To update the site manually, simply build the website on your local machine to create an updated .dll and overwrite the .dll in the bin folder on the production server. If the source/HTML of any of your actual pages/.aspx files has been modified, you will also need to copy those over.
Steps:
Build website locally
Overwrite production server .dll with locally built .dll
Copy any .aspx pages whose HTML/source was modified to the production server
Very simple.
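As a concrete sketch of steps 2-3 from a command prompt (the server share, site, and file names are all hypothetical):

    copy /Y C:\src\MySite\bin\MySite.dll \\prodserver\wwwroot\MySite\bin\
    copy /Y C:\src\MySite\Products.aspx \\prodserver\wwwroot\MySite\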
Have all the HTML content stored separately from the code. A database would be an excellent idea: to change some text or swap an image, all one would have to do is go into the database (or content file) and change a few tags. I recommend MySQL. :)
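A minimal sketch of that idea, assuming the MySql.Data connector and a hypothetical content table keyed by name:

    using MySql.Data.MySqlClient;

    public static class SiteContent
    {
        // Fetches a block of HTML by key; table and column names are hypothetical.
        public static string Get(string key)
        {
            using (var conn = new MySqlConnection("server=localhost;database=site;uid=web;pwd=secret"))
            using (var cmd = new MySqlCommand("SELECT html FROM content WHERE content_key = @k", conn))
            {
                cmd.Parameters.AddWithValue("@k", key);
                conn.Open();
                return (string)cmd.ExecuteScalar();
            }
        }
    }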
For example, how is this site organized?
What I do not understand is: what do they upload to the Microsoft server?
I have created a very small web page with Visual Studio, and I have to upload the whole site even after the smallest change...
The usual approach is to replace everything with xcopy or the Publish function in Visual Studio, and in some cases replacing everything is the only approach - for example, if you're using the web-application project model, everything gets packaged into a single assembly, so even to apply a small change you'll have to re-deploy the whole thing.
An alternative to this could be the Website model in Visual Studio, with which you should be able to deploy single code files to your server; they should be picked up if you restart the website from the IIS management tool. This model in fact works differently from the web-application project model: it's just a bunch of code files that are dynamically compiled by the ASP.NET runtime.
Even if possible, though, I wouldn't suggest deploying single files, as this is error-prone (you deploy the code-behind and could easily forget to deploy the aspx counterpart, or similar). Unless you're deploying gigabytes of stuff over slow networks, redeploying the whole thing is always the safest bet.
Have a look at this and this - two interesting links to find out more about the website and web-application project models in Visual Studio.
It really depends a lot on how you're building your app.
If you're in VS and you're doing an ASP.NET site, then you can either do it as a Website project or as a Web Application project.
In the former case, your files remain as .aspx and .aspx.cs files, and you xcopy (or FTP) whichever files change. If you want logic that's outside the scope of a single page, you'll either create a separate class-library project or use the App_Code directory.
In the latter case, you'll compile all the logic into one or more .dll files that get copied to your site's /bin directory, plus any number of aspx files that can either stay as such or be embedded (I recommend leaving them as aspx files). Again, if an aspx file changes, you just move the one that changed; if anything in the dll changes, you replace the whole dll.
All that said, a huge chunk of what's on the site you posted is probably being pulled out of a database. Most sites now don't have content on pages; they just have organizational (view) logic on pages, and other classes fetch the actual content out of a database to serve up. This allows greater reuse and means that the 4,000 pages (number chosen at random) on MSDN don't each have to be coded individually as an HTML page.
After Googling, I think the checkbox Use fixed naming and single page assemblies in Visual Studio's Publish Web Site dialog is the right choice.
Although it might slow things down...
I am developing an ASP.NET website that will need regular updates to source code and HTML. After I have completed the necessary updates, I use the Publish Website tool and publish the site to my local directory, then upload all files to the remote server. Is there any way to update my site with just the changed files? For example, if I have updated just 2 files out of 84 (HTML or source side), is it possible to update just those 2 files without any problem?
While Visual Studio does offer a few different compiling/build options, I think that you are probably doing everything just fine for a Visual Studio Website project. When I'm working with a website rather than a web application, I will only FTP the files that have changed. For example, if I were to change some HTML tags around within an .aspx page, then only that page would need to be uploaded to the web server. If I change the Page_Load function in the .aspx.cs page, then I will definitely need to get the updated DLL on the web server as well.
I use FileZilla for my FTP tool and there is an option to only upload the changed files. So after you use visual studio to publish your website, you can grab the entire contents of that directory and drag it over to the FTP server location and only update what has changed.
The ASP.NET website project offers an option on the project properties "MSBuild Options" tab page where you can set "Allow this precompiled site to be updatable".
Setting this option "specifies that the content of .aspx pages are not compiled into an assembly; instead, the markup is left as-is, allowing you to change HTML and client-side functionality after precompiling the Web site. Selecting this check box is equivalent to adding the -u option to the aspnet_compiler.exe command", according to the docs.
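In other words, the command-line equivalent looks something like this (the paths are hypothetical):

    aspnet_compiler -v /MySite -p C:\src\MySite -u C:\deploy\MySite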
I've been entirely underwhelmed by the tools available in Visual Studio, and am instead using Gulp.
Even in 2017 the problem remains.
With gulp you can "watch" directories for changed files, so I have it configured to watch the [bin] folder, along with assets separate from aspx/cshtml files.
That way whenever I change anything, it is instantly copied to my publish folder where I can later zip it up and deploy ONLY the changeset. The script even has a delay so I can delay the copy/upload in situations like bundles where they are generated dynamically and take some time to be fully modified.
The only thing I have to do is clean it up before I start my next milestone, so that the "publish" folder which gulp "auto deploys" to is empty and ready for the next run. Emptying a folder and running "gulp watch" in background seems like a small price to pay for such a needed feature.
I feel like this would be a good feature to develop as a Visual Studio plugin.
Recently I have been forced to move to a Windows/C#/.NET/MVC environment from Linux/Node/Angular. Sigh. I found that the following solution (next paragraph) eliminates the pain caused by using Visual Studio to "publish" the code. The VS2017 publish process copies every file in the entire application to the web server, even if just one character changed in one file. This can take over an hour for our moderately sized app.
So here's what I do. I first publish the solution locally (typically to bin/release/Publish/). That takes about 1 minute, as opposed to 1 hour to publish to the server. Then I compare the files between my local Publish directory and the server directory using FreeFileSync. FreeFileSync is amazing -- and free. I have access to the server directory via a Windows file share. The compare takes about 15 minutes, and I can then see exactly which files are different and need to be pushed. Note, the option I use compares the actual CONTENT of the files, not just the create time.
I then use the FreeFileSync sync feature (mirror option) to copy the few files needed to the server. This takes maybe two minutes. So the total operation takes about 20% of the time Visual Studio "Publish" takes. But best of all, the actual hit on the prod server is only the two minutes it takes to copy the diff files, rather than the 1-hour outage inflicted by Visual Studio Publish as it slogs along copying each and every file.
It depends on the type of project.
If it is a project created with File > New Web Site, then it is OK to just copy the changed .aspx files (make sure the corresponding code - .cs - files are also copied).
If the project was created with File > New Project > ASP.NET Web Application, then you will need to copy the .aspx files and the compiled project DLL (by default the DLL has the same name as the project, e.g. TestProject.dll) from the bin folder inside the project.
I think you need something like http://winmerge.org/, which will compare the files and show you which ones have changed so you can upload just those.
Hope this helps.
Right now I'm working with an ASP.NET website that automatically generates images and stores them in a temporary folder. When working on my local system, these go into a temporary folder that gets picked up by Visual SourceSafe, which then wants to check them in. As such, I am wondering if there is a way to just exclude that particular folder from source control?
I've done a bit of reading and found that there are ways to do this for individual files, but I haven't found anything yet about an entire folder.
I think you've found one of the main reasons MS went back to projects in VS2008 and in MVC.
It's been a long time since I've used VSS (mainly because it's really out of date now), but most source providers let you exclude files and folders as a setting of the provider, rather than the project under control.
If you can switch to a Web Project rather than a Website, then do so; otherwise I'd look at updating your source control provider, as this sort of exclusion is easy with Vault, CVS, SVN, Git, VSTS and so on (to name but a few).
Are you using an ASP.NET Website or an ASP.NET Web Project? The difference is significant enough to either cause or solve this problem.
Websites love to scan the file system and auto-check-in everything they find.
Projects check in only what you tell them to.
Also, Visual SourceSafe is pretty outdated; most recent source control systems allow you to do what you are asking. SVN and TFS 2008 SP1 do, from my experience.
You can also try to right click and pick "Exclude" on the folder, but in the case of a Website I believe this renames the folder.
I'm not sure if this is an option for you, but if you exclude your temporary folder from VSS (delete the folder inside VSS using the VSS UI), the files that go into it should not get "picked up" again.
If you perform operations on a parent project of the temporary folder, you may try cloaking the folder.
http://msdn.microsoft.com/en-us/library/x2398bf5(VS.80).aspx
I would suggest emptying/deleting the folder from your website. Have your website create/verify the folder on startup, and clean it up and remove anything in it on shutdown. This can be DEBUG-only code (wrap it in #if DEBUG) if needed. Also add a build script to your project that does the same every time it is built.
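A minimal sketch of that startup/shutdown idea in Global.asax.cs ("TempImages" is a hypothetical folder name):

    using System;
    using System.IO;
    using System.Web;
    using System.Web.Hosting;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Create (or verify) the temporary image folder on startup.
            Directory.CreateDirectory(HostingEnvironment.MapPath("~/TempImages"));
        }

        protected void Application_End(object sender, EventArgs e)
        {
    #if DEBUG
            // DEBUG-only cleanup on shutdown, as suggested above.
            string temp = HostingEnvironment.MapPath("~/TempImages");
            if (Directory.Exists(temp))
                Directory.Delete(temp, true);
    #endif
        }
    }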
Could you just make your application write to a temporary folder that is outside of your website?
e.g. in C:\tempfiles
VSS shouldn't be able to pick it up then.
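A small sketch of that approach (the path and setting name are hypothetical):

    using System.Configuration;
    using System.IO;

    public static class TempStorage
    {
        // Folder outside the website tree, configured in web.config:
        // <appSettings><add key="TempImageRoot" value="C:\tempfiles" /></appSettings>
        public static string GetPath(string fileName)
        {
            string root = ConfigurationManager.AppSettings["TempImageRoot"]
                          ?? @"C:\tempfiles";
            Directory.CreateDirectory(root); // no-op if it already exists
            return Path.Combine(root, fileName);
        }
    }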