I have a helper project that the whole system depends on. To avoid versioning problems on the production server, I need to check that all source files in the project are no older than 15 days. How can I do that without viewing the history of each .cs file in the project?
thanks
Open Source Control Explorer. Select a folder, right-click, and run the "View History" command.
You will now see a recursive list of changes.
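If checking the timestamps of the files on disk is acceptable for your purposes (an assumption on my part; the TFS check-in dates are what the history shows), a small C# sketch like the following could flag any .cs file older than 15 days:

using System;
using System.IO;
using System.Linq;

class SourceAgeCheck
{
    static void Main(string[] args)
    {
        // The project path is an assumption; pass your helper project folder as the first argument.
        string projectDir = args.Length > 0 ? args[0] : @"C:\Projects\HelperProject";
        DateTime cutoff = DateTime.Now.AddDays(-15);

        // Collect every .cs file whose last write time is older than the cutoff.
        var stale = Directory.GetFiles(projectDir, "*.cs", SearchOption.AllDirectories)
                             .Where(f => File.GetLastWriteTime(f) < cutoff)
                             .ToList();

        foreach (var file in stale)
            Console.WriteLine("{0:yyyy-MM-dd}  {1}", File.GetLastWriteTime(file), file);

        Console.WriteLine(stale.Count == 0
            ? "All source files were modified within the last 15 days."
            : stale.Count + " file(s) are older than 15 days.");
    }
}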
I have converted a Win32 application to UWP using MakeAppx and it doesn't seem to run. When I click the icon in the Start menu, literally nothing happens except that a busy icon briefly appears on the cursor.
I completed the same process with Notepad++ and all its DLLs and that worked fine (using the exact same manifest file, just changing the exe).
My questions are:
Where does a UWP app save the files it creates (temporary files, etc.)? If I run an executable that generates files next to itself, where would those files end up when it runs as a UWP app?
Can I set that location in the AppxManifest?
Is there any way to see whether it has run correctly or not?
Edit:
Could this be a file permissions issue? My application needs to write to 'C:\MyFolder' and creates a folder with a load of files next to the executable on startup, and that doesn't happen.
So, looking into this a bit more, I came across this blog which discusses preparing for conversion. I think the file accesses above probably contravene the following:
Your app writes to the install directory for your app. For example, your app writes to a log file that you put in the same directory as your exe. This isn't supported, so you'll need to find another location, like the local app data store.
This looks like a fairly show-stopping issue; am I correct in that assumption?
If your app is writing to the install directory, you will need to change that code to write to your local app data folder instead, as the preparation guide calls out.
Write operations to the install directory are not allowed, so that the app deployment stack can perform seamless, differential updates and clean uninstalls of your app.
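As a rough sketch (assuming managed code and the Windows.Storage API; for an unmodified Win32 exe the equivalent location is the package's LocalState folder under %LOCALAPPDATA%\Packages), writing a log file to the local app data store instead of the install directory looks something like this, where "app.log" is just an illustrative name:

using System;
using System.Threading.Tasks;
using Windows.Storage;

static class AppLogger
{
    // Append a line to a log file in the package's local app data store
    // (LocalState) rather than next to the executable.
    public static async Task AppendLogAsync(string message)
    {
        StorageFolder localFolder = ApplicationData.Current.LocalFolder;
        StorageFile logFile = await localFolder.CreateFileAsync(
            "app.log", CreationCollisionOption.OpenIfExists);
        await FileIO.AppendTextAsync(logFile, DateTime.Now + ": " + message + Environment.NewLine);
    }
}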
Btw, to debug your app's launch failures you can do the following in Visual Studio: Debug -> Other Debug Targets -> Debug Installed App Package -> select your app from the list of installed apps.
Maybe I'm totally outdated, but for the last four years I've been using a simple FTP upload to deploy new websites, without even building them in Visual Studio: just a bunch of ASPX and CS files, as they are in Visual Studio.
I do understand that compiling the project gives me some security (people with access to the server can't read the source in a text editor) and saves the first-time compilation, but is that really so important?
I mean, someone with access to the server can do a lot of harm anyway, whether they are reading CS files or DLLs.
First-time compilation usually takes no more than a minute; just locating the compiled version of the site takes about as long.
Now I'm watching a video on PluralSight which explains the new MSDeploy tool available for ASP.NET, and I can't see any good reason to use it.
So what's wrong with the old-fashioned way of just sending files via FTP, without compiling or using fancy tools?
I did a speed test, and with MSDeploy I can deploy a website twice as fast as with old-fashioned FTPing. So instead of 4 minutes it takes 2.
Now, from another perspective: suppose I already have a live project on the web in which I have to change Default.aspx because of a typo in some HTML tag. Deployment via MSDeploy will take 10 times longer than uploading that one file.
Am I missing something?
MSDeploy does things which FTPing to a site can't do. Need to change a machine.config? You're unlikely to have FTP write access to the folder which contains it. Want to change a server setting in a server-version-independent manner? FTP won't do that. Etc. FTP works fine for copying files to folders in which you have write access, but that's all it can do.
When you deploy a project you can do a lot of things with it.
You can set up a job in your deployment that packages all your JavaScript into one file and all your CSS into one file.
You can set up a job in your deployment that changes a bunch of config settings to match your production server settings (rather than your development settings).
The idea of deployment is that you take your current development website and transform it into a production website without having to do any of that manually.
The most important thing is that when you can only deploy your website as a whole, you will never forget to package your JS or to remove some debugging code, because you can't just sneakily update a single file.
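For the config-settings point above, web.config transforms are the usual ASP.NET mechanism for applying production settings at publish time; a minimal Web.Release.config sketch (the "ApiUrl" key is an illustrative name, not something from the post) might look like:

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Swap a development value for the production one when publishing. -->
    <add key="ApiUrl" value="https://api.example.com"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
  <system.web>
    <!-- Remove debug="true" from <compilation> in the production build. -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>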
I use the "publish website" option to directly publish my ASP.NET - website to my ftp-server. This works quite nice.
The problem is the biggest part of my project are DLL-Files in the bin-directory which are external libraries that I only update quite rare.
So I do not want then to be uploaded every time. With my local resources I can select whether these files should be uploaded every time, never or only when changed, but I do not find this options for files in the bin-directory.
Any way to solve this?
I am using Visual Studio 2008 and trying to publish a Web Application Project, but it keeps failing when trying to add files in the project. Below is a sample of the message:
Publishing folder JavaScript... Unable to add 'JavaScript/hoverIntent.js' to the Web site. Unable to add file 'JavaScript\hoverIntent.js'. The specified file could not be encrypted.
This happens for image files too. I am lost as to why it is happening. I should add that I am using Windows 7 build 7100; not sure if this is causing the issue?
Any help greatly appreciated
I know this is an old topic, but I found it when I googled for the same problem.
My solution was to remove the "Encrypt" flag from Windows Explorer for the files listed (Right click -> Properties -> Advanced)
This blog post at BlackMarble suggests that you may have the target directory set to use encryption. It sounds like the exception you're seeing is the VS publish process being unable to handle that.
To get around this problem:
Use VS to publish to an intermediate directory, somewhere on your PC perhaps.
Copy the files yourself (with a batch file, maybe) to the server.
That's a workaround, at least.
Disable the Windows Encrypting File System (EFS) in cmd with the following:
fsutil behavior set disableencryption 1
Then restart your PC.
When I had this problem on publishing a Visual Studio 2010 web project either to a local folder or to a host, I was stumped. Visual Studio didn't indicate which files or even folders had caused the problem. I wasn't aware there were any encrypted files in the solution and I couldn't find any. I was unable to update my website.
I googled how to find encrypted files, but none of the solutions involving efsinfo.exe were appropriate for Windows 7. Then I found an example using the cipher command:
https://superuser.com/questions/58878/how-to-list-encrypted-files-in-windows-7
There were a number of different answers to finding the encrypted files. I used the command prompt method.
I opened a command prompt in the root of my application and did:
D:\Data\Code2011>cipher /s:MyWeb >Encryption.txt
I then did a case-sensitive search in Encryption.txt for lines beginning with "E " (E followed by a space) or containing 'the file is encrypted'.
I found two encrypted .htc files in a styles subfolder and was able to decrypt them in the Advanced tab of the file properties dialog in Explorer.
The website then compiled and published OK.
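If scanning the cipher output is awkward, another option (not from the original answer, just an alternative) is to check the Encrypted attribute directly from C#:

using System;
using System.IO;

class FindEncryptedFiles
{
    static void Main(string[] args)
    {
        // The root folder is an assumption; pass your web project folder as the first argument.
        string root = args.Length > 0 ? args[0] : @"D:\Data\Code2011\MyWeb";

        foreach (string path in Directory.GetFiles(root, "*", SearchOption.AllDirectories))
        {
            // FileAttributes.Encrypted is set on files encrypted with EFS.
            if ((File.GetAttributes(path) & FileAttributes.Encrypted) != 0)
                Console.WriteLine(path);
        }
    }
}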
I had this issue as well. I set the source files' properties to not be encrypted, but that still wasn't working. It turned out that the files were cached in the temporary deployment folder and I had to uncheck encryption there as well. It probably would have worked to delete the temporary deployment directory, but the other way worked.
Does anybody have an idea how to use the "publish website" command in VS 2008 and track changes so that only the modified files are sent to the hosting server?
When the command is called, the files in the destination folder are wiped and replaced with the result of the new build (assembly files are created, as well as some marker files). As my website gets bigger and bigger, I have to transfer all the assemblies in my bin directory to the server and keep track of which other files I may have modified.
Is there a better way of doing this?
PS: I use FileZilla to transfer my files to the server.
Publish to a local directory, then use a diff tool (such as WinMerge) to find and copy the modified files to the server.
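If you'd rather script the comparison than eyeball it in a diff tool, a rough C# sketch (both folder paths are assumptions) that lists the files differing between the new publish output and a snapshot of what was last uploaded could look like this; you then send only the listed files with FileZilla:

using System;
using System.IO;

class ChangedFiles
{
    static void Main()
    {
        // Both paths are assumptions for this sketch.
        string newPublish = @"C:\Publish\Current";
        string lastUpload = @"C:\Publish\LastUploaded";

        foreach (string file in Directory.GetFiles(newPublish, "*", SearchOption.AllDirectories))
        {
            string relative = file.Substring(newPublish.Length + 1);
            string previous = Path.Combine(lastUpload, relative);

            // A file counts as changed if it is new, a different size, or newer than the snapshot.
            bool changed = !File.Exists(previous)
                || new FileInfo(file).Length != new FileInfo(previous).Length
                || File.GetLastWriteTimeUtc(file) > File.GetLastWriteTimeUtc(previous);

            if (changed)
                Console.WriteLine(relative);
        }
    }
}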
You can publish locally and use any mechanism of your choice to transfer the files.
Have you tried the Website -> Copy Website menu item? It seems to know which files have changed.