Is it possible to forcibly include an IDE's directory (workspace) in the caching SSD? - solid-state-drive

I have a hybrid storage drive (500 GB HDD + 32 GB SSD). Can I force the Eclipse IDE and Eclipse's working directory (workspace) onto the SSD portion (which is used for caching by default)?

No. A hybrid drive works in a completely seamless fashion. While you might know that it's a hybrid disk, your computer does not. There's nothing in the SATA spec to tell the hard drive what to store in the SSD cache. It's analogous to the CPU cache: it's there and it makes your computer faster, but the processor controls it, not the programmer.
Fortunately for you, there's no need to force the hard drive to put those things into the cache. If you use them a lot, they'll get cached automatically. That's the beauty of a hybrid drive.

Related

Hosting big files for users

We need to be able to supply big files to our users. The files can easily grow to 2 or 3 GB. These files are not movies or anything similar; they are software needed to control and develop robots in an educational capacity.
We have some conflict in our project group over how we should approach this challenge. First of all, BitTorrent is not a solution for us (despite the benefits it could bring). The files will be available through HTTP (not FTP) and via a file stream, so we can control who gets access to the files.
As a former pirate in the early days of the internet, I have often struggled with corrupt files, and I used file hashes and file sets to minimize the amount of re-downloading required. I advocate a small application that downloads and verifies a file set and extracts the big install file once it is completely downloaded and verified.
My colleagues don't think this is necessary and point to the TCP/IP protocol's inherent ability to avoid corrupt downloads. They also mention that Microsoft has moved away from a download manager for their MSDN files.
Are corrupt downloads still a widespread issue, or will the time we spend creating a solution to this problem be wasted compared to the number of people actually affected by it?
If a download manager is the way to go, what approach would you suggest we take?
Edit
Just to clarify: is downloading 3 GB of data in one chunk over HTTP a problem, or should we make our own EXE that downloads the big file in smaller chunks (and verifies them)?
You do not need to build your own download manager. A few simpler measures will get you most of the way there:
1. Split the files into smaller chunks, say 100 MB each. Then even if a download is corrupted, the user only ends up re-downloading that particular chunk.
2. Most web servers are capable of understanding and serving Range headers. You can recommend that your users use a download manager or browser add-on that takes advantage of this; on Unix/Linux systems, wget is one such utility. (A minimal sketch of a range-based, hash-verified chunk download follows below.)
3. It's true that TCP/IP has mechanisms for preventing corruption, but it basically assumes that the network stays up and reachable. Point 2 above is one possible workaround for the case where the network goes down completely in the middle of a download.
4. Finally, it is always good to provide a file hash to your users. This not only lets them verify the download, but also helps assure the integrity of the software you are distributing.
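For illustration, here is a minimal Java sketch combining points 2 and 4: it fetches a single chunk with an HTTP Range request and hashes the bytes as they are written, then compares the digest against a published SHA-256 value. The URL, chunk boundaries, and expected hash are placeholders, not part of any real service.

```java
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.security.MessageDigest;

// Hypothetical sketch: download one 100 MB chunk via an HTTP Range request
// and verify it against a SHA-256 hash published alongside the file.
public class ChunkDownloader {

    public static void main(String[] args) throws Exception {
        String fileUrl = "https://example.com/robot-sdk.part1"; // placeholder URL
        long start = 0;                                         // first byte of the chunk
        long end = 100L * 1024 * 1024 - 1;                      // last byte (inclusive)
        String expectedSha256 = "<published hash for this chunk>"; // placeholder

        HttpURLConnection conn = (HttpURLConnection) new URL(fileUrl).openConnection();
        // Ask the server for only this byte range; servers that support it answer 206.
        conn.setRequestProperty("Range", "bytes=" + start + "-" + end);

        MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
        try (InputStream in = conn.getInputStream();
             FileOutputStream out = new FileOutputStream("robot-sdk.part1")) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                sha256.update(buf, 0, n); // hash while writing, no second pass needed
                out.write(buf, 0, n);
            }
        }

        // Hex-encode the digest and compare with the published value.
        StringBuilder hex = new StringBuilder();
        for (byte b : sha256.digest()) {
            hex.append(String.format("%02x", b & 0xff));
        }
        boolean ok = hex.toString().equalsIgnoreCase(expectedSha256);
        System.out.println(ok ? "Chunk verified" : "Hash mismatch, re-download this chunk");
    }
}
```

A real tool would loop over all the chunks, resume interrupted ones via the same Range mechanism, and only assemble the big install file once every piece has verified.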
HTH

Protecting a USB drive in java

I'm going to create a Java program that allows "locking" a USB drive by making its files accessible only with a password. Similar software that does this is USB Safeguard.
Here is what I am thinking of doing:
Store all files in a single archive on the USB drive.
Encrypt the archive using AES or Blowfish.
Hide the archive.
The problem is, how can I "unlock" the USB? What approach can I take here? Here is what I have thought of:
Ramdisk: It is very hard, if not impossible, to load a RAM disk from an encrypted archive. While it may be plausible in C++, I think it would be much harder in Java and might involve messing with the system classes, which would kill the compatibility of the software and defeat the whole purpose of using Java.
Loading the unencrypted archive onto the USB drive - Nobody likes waiting 10 minutes just to view a file on a USB drive. Copying all the files might take some time. Also, what about free space on the drive?
Loading the unencrypted archive onto the hard drive - While very insecure and error-prone, this looks like the only feasible way to get it done.
Creating a custom file browser that lets the user browse the archive - Do you use WinRAR to browse your files? Would you like to? No. A custom file browser will take a lot of time to create and, again, is an error-prone and user-unfriendly approach.
I can't think of any other way of doing this. Can anyone think of a better way? Note that this is going to be free and open-source software.
TrueCrypt is free, open-source software for storing encrypted files on a storage device (e.g. a USB drive). It runs on Windows, Linux, and Mac OS X. TrueCrypt even allows hidden volumes. I would start with their source code and proceed from there.
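If you do pursue the roll-your-own route from the question (encrypting the archive with AES), the JDK's javax.crypto classes cover the basic step. The sketch below is a minimal, hypothetical example only: the file names, PBKDF2 iteration count, and on-disk layout (salt and IV written in front of the ciphertext) are illustrative assumptions, not a vetted design like TrueCrypt's.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.SecureRandom;
import java.security.spec.KeySpec;
import javax.crypto.Cipher;
import javax.crypto.CipherOutputStream;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical sketch of the "encrypt the archive" step with a password-derived AES key.
public class ArchiveEncryptor {

    public static void encrypt(char[] password, String inFile, String outFile) throws Exception {
        SecureRandom random = new SecureRandom();

        // Derive a 128-bit AES key from the password with PBKDF2 (iteration count is illustrative).
        byte[] salt = new byte[16];
        random.nextBytes(salt);
        KeySpec spec = new PBEKeySpec(password, salt, 65536, 128);
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        SecretKey key = new SecretKeySpec(factory.generateSecret(spec).getEncoded(), "AES");

        // Random IV for CBC mode.
        byte[] iv = new byte[16];
        random.nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));

        try (FileInputStream in = new FileInputStream(inFile);
             FileOutputStream out = new FileOutputStream(outFile)) {
            // Salt and IV are not secret; store them in the clear at the start of the file
            // so decryption can re-derive the key and re-initialize the cipher.
            out.write(salt);
            out.write(iv);
            try (CipherOutputStream cos = new CipherOutputStream(out, cipher)) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    cos.write(buf, 0, n);
                }
            }
        }
    }
}
```

Decryption is the mirror image: read the salt and IV back from the front of the file, re-derive the key from the password, and initialize the cipher in DECRYPT_MODE.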

Ways to make ASP.NET build faster

When I'm building my web project it takes about 20 seconds to compile. Then when I try to browse to a web page in the project, ASP.NET does its runtime compilation (another 20 seconds). I know I can't escape these steps because that's how ASP.NET works; I just want to see if anyone has some kind of optimization to make these builds faster.
Trying to improve my Edit-Compile-Test loop
My machine details:
- Intel Core i7 processor @ 2.80 GHz
- 8 GB of RAM
- HDD @ 7200 RPM
Buy a faster machine? Sounds like a smart answer. I know that the compiler can take advantage of multi-core machines. Also, during compilation there's a lot of hard drive access, so it may make sense to get a solid-state drive. Maybe not the answer you are looking for, but it's a definite solution.
The other thing you can do is configure your project to allow "Edit and Continue". This lets you change small things and continue debugging without doing a full recompile.
Here are a couple of thoughts:
Disable any "realtime" virus / malware protection, at least during this process.
Disable indexing (Windows, Google desktop, etc.) for the folders that VS uses during this process.
Disable / stop other processes that may be accessing the hard disk. The biggest issue here is latency - even if other applications are accessing / writing tiny files, it is the access time that kills speed.
As the original poster suggested, your biggest bang will come from hardware: get an SSD and a processor with at least 4 cores. If you were to buy four cheap 64 GB SSDs and put them in RAID 0, you would be shocked at the difference, and you'd even discover that your CPU and RAM suddenly become the bottlenecks.
Move your code onto a RAMDisk, or buy an SSD drive.
Suspend ReSharper - R# helps tremendously when you're just coding, but it really slows down the Edit-Compile-Test loop.

How does BitLocker affect performance? [closed]

I'm an ASP.NET / C# developer. I use VS2010 all the time. I am thinking of enabling BitLocker on my laptop to protect the contents, but I am concerned about performance degradation. Developers who use IDEs like Visual Studio are working on lots and lots of files at once. More than the usual office worker, I would think.
So I was curious if there are other developers out there who develop with BitLocker enabled. How has the performance been? Is it noticeable? If so, is it bad?
My laptop is a 2.53GHz Core 2 Duo with 4GB RAM and an Intel X25-M G2 SSD. It's pretty snappy but I want it to stay that way. If I hear some bad stories about BitLocker, I'll keep doing what I am doing now, which is keeping stuff RAR'ed with a password when I am not actively working on it, and then SDeleting it when I am done (but it's such a pain).
2015 Update: I've been using Visual Studio 2015 on my Surface Pro 3 when I travel, which has BitLocker enabled by default. It feels pretty much like my desktop, which is an i7-2600K @ 4.6 GHz. I think on modern hardware with a good SSD, you won't notice!
2021 Update: I have been enabling BitLocker on all my computers and it flies now. No worries. Get an NVMe SSD and don't look back.
With my T7300 2.0 GHz and Kingston V100 64 GB SSD the results are:
BitLocker off → on
Sequential read: 243 MB/s → 140 MB/s
Sequential write: 74.5 MB/s → 51 MB/s
Random read: 176 MB/s → 100 MB/s
Random write and the 4 KB speeds are almost identical.
Clearly the processor is the bottleneck in this case. In real-life usage, however, boot time is about the same, and a cold launch of Opera 11.5 with 79 tabs remained the same 4 seconds with all tabs loaded from cache.
A small build in VS2010 took 2 seconds in both situations. A larger build took 2 seconds versus 5 before. These are ballpark figures, since I'm timing with my watch.
I guess it all depends on the combination of processor, RAM, and SSD vs. HDD. In my case the processor has no hardware AES support, so compilation is the worst-case scenario, needing cycles for both assembly and encryption.
A newer system with Sandy Bridge would probably make better use of a BitLocker-enabled SSD in a development environment.
Personally, I'm keeping BitLocker enabled despite the performance hit because I travel often. It took less than an hour to toggle BitLocker on/off, so maybe you could just turn it on when you are traveling and disable it afterwards.
Thinkpad X61, Windows 7 SP1
Some practical tests...
Dell Latitude E7440
Intel Core i7-4600U
16 GB RAM
Windows 8.1 Professional
LiteOn IT LMT-256M6M mSATA 256 GB SSD
This test is using a system partition. Results for a non-system partition are a bit better.
Score decrease:
Read: 5%
Write: 16%
(The benchmark screenshots without and with BitLocker are not reproduced here.)
So you can see that even with a very strong configuration and a modern SSD, you get a small performance degradation in synthetic tests. I don't know how it affects typical day-to-day work, especially with Visual Studio.
Having used a laptop with BitLocker enabled for almost 2 years now, with more or less similar specs (although without the SSD, unfortunately), I can say that it really isn't that bad, or even noticeable. Although I have not used this particular machine without BitLocker enabled, it really does not feel sluggish at all when compared to my desktop machine (dual core, 16 GB, dual Raptor disks, no BitLocker). Building large projects might take a bit longer, but not enough to notice.
To back this up with more non-scientific "proof": many of my co-workers used their machines intensively without BitLocker before I joined the company (it became mandatory around the time I joined, even though I am pretty sure the two events are totally unrelated), and they have not experienced noticeable performance degradation either.
For me personally, having an "always on" solution like BitLocker beats manual encryption steps hands down. BitLocker To Go (new in Windows 7) for USB devices, on the other hand, is simply too annoying to work with, since you cannot easily exchange information with non-Windows-7 machines. Therefore I use TrueCrypt for removable media.
I am talking here from a theoretical point of view; I have not tried BitLocker.
BitLocker uses AES encryption with a 128-bit key. On a Core 2 machine clocked at 2.53 GHz, encryption speed should be about 110 MB/s using one core. The two cores could process about 220 MB/s, assuming perfect data transfer and core synchronization with no overhead, and that nothing else needs the CPU at the same time (that's one hell of an assumption, actually). The X25-M G2 is announced at 250 MB/s read bandwidth (that's what the specs say), so, in "ideal" conditions, BitLocker necessarily involves a bit of a slowdown.
However, read bandwidth is not that important. It matters when you copy huge files, which is not something you do very often. In everyday work, access time is much more important: as a developer, you create, write, read and delete many files, but they are all small (most of them are much smaller than one megabyte). This is what makes an SSD "snappy". Encryption does not impact access time. So my guess is that any performance degradation will be negligible(*).
(*) Here I assume that Microsoft's developers did their job properly.
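If you want a rough, first-hand feel for the raw single-core AES figure on your own CPU rather than relying on the back-of-the-envelope numbers above, a quick throughput check is easy to write. The sketch below uses the JDK's built-in cipher; the buffer size and data volume are arbitrary, and JIT warm-up plus the presence or absence of hardware AES acceleration will swing the result, so treat it as an order-of-magnitude estimate only.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Rough AES-128-CBC throughput check on a single core.
public class AesThroughput {

    public static void main(String[] args) throws Exception {
        byte[] key = new byte[16]; // all-zero key/IV are fine for a speed test only
        byte[] iv = new byte[16];
        Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));

        byte[] buffer = new byte[1 << 20]; // 1 MB of input per call
        byte[] out = new byte[1 << 20];    // output buffer of the same size
        int iterations = 512;              // 512 MB total

        // Warm-up pass so the JIT compiles the hot path before we time it.
        for (int i = 0; i < 64; i++) {
            cipher.update(buffer, 0, buffer.length, out);
        }

        long startNanos = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            cipher.update(buffer, 0, buffer.length, out);
        }
        double seconds = (System.nanoTime() - startNanos) / 1e9;
        System.out.printf("~%.0f MB/s on one core%n", iterations / seconds);
    }
}
```

Whatever number you get, compare it with your drive's sequential rating to see whether the CPU or the disk would be the limiting factor for software encryption on your hardware.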
The difference is substantial for many applications. If you are currently constrained by storage throughput, particularly when reading data, BitLocker will slow you down.
It would be useful to compare with other software based whole disk or whole partition encryption like TrueCrypt (which has the advantage if you dual boot with Linux since it works for both Windows and Linux).
A much better option is to use hardware encryption, which is available in many SSDs as well as in Hitachi 7200 RPM HDDs. The performance of encrypted vs. not is undetectable, and the encryption is invisible to the operating system. If you have a decent laptop, you can use its built-in security functions to generate and store the key, which your password unlocks from the laptop's encrypted key storage.
I used to use the PGP disk encryption product on a laptop (and ran NTFS compression on top of that!). It didn't seem to have much effect as long as the amount of disk to be read was small, and most software sources aren't huge by disk standards.
You have lots of RAM and a pretty fast processor. I spent most of my time thinking, typing or debugging.
I wouldn't worry very much about it.
My current work machine came with BitLocker, and being an upgrade from the prior model, it only seemed faster to me. What I have found, however, is that BitLocker is more bulletproof than TrueCrypt when it comes to accurately laying down the data. I do a lot of work in SAS, which constantly writes backup copies to disk as it moves along and shoots a variety of output types to disk at the end. SAS works fine writing output from multithreaded processes back to BitLocker and doesn't seem to know it's there.
This has not been the case for me with TrueCrypt. I'm not sure what happens or how, but I found that processes got out of sync when working with source/output data in a TrueCrypt container, which is what I installed on my second work computer since it had no BitLocker. The constant backups were going to an SSD while the TrueCrypt results were on a regular HDD; maybe that speed difference helped trip it up. Whatever the cause, I had to quit using TrueCrypt on that second computer because it made my SAS results out of sync with respect to processing order, and it screwed up some of my processes and data. Scary stuff in my world.
I work with people who have successfully used TrueCrypt on the exact same computer, but they weren't using a disk-intensive app like SAS.
BitLocker To Go, the encryption BitLocker applies to thumb drives, does slow things down quite a bit when it comes to read/write times. It's not too hard to use as long as you remember the password for the thumb drive and are willing to wait for it to format/initialize the drive, but in my experience it made access to the flash drive about 4 times as slow. I don't know why it would slow down a thumb drive more than an internal disk, but that's how it was for me and my coworker.
Based on my success with BitLocker at work, I bought Windows Pro for my home computer to get BitLocker, and I plan to encrypt some directories with it for things like financial records.

How to avoid pauses when editing code on a network drive?

I'm planning on doing more coding from home but in order to do so, I need to be able to edit files on a Samba drive on our dev server. The problem I've run into with several editors is that the network latency causes the editor to lock up for long periods of time (Eclipse, TextMate). Some editors cope with this a lot better than others, but are there any file system or other tweaks I can make to minimize the impact of lag?
A few additional points:
There's a policy against having company data on personal machines, so I'd like to avoid checking out the code locally.
The mount is over a PPTP VPN connection.
I'm mounting it on a Linux or OS X client.
Use a source control system — Subversion, Perforce, Git, Mercurial, Bazaar, etc. — so you're never editing code on a shared server. Instead you should be editing a local work area and committing changes to a repository located on the network.
Also, convince your company to amend their policy so that company code is allowed on personal machines if it's kept on an encrypted volume. Encrypted disk images that you can use for this are trivial to create using Disk Utility and can use strong cryptography. You can get even more security by not storing your encryption passphrase in your keychain and instead typing it every time you mount the encrypted volume; this means that even if your local user account is compromised, as long as you don't have the volume mounted, nobody else will be able to mount it.
I did this all the time when I was consulting and none of my clients — some of whom had similar rules about company code — ever had a problem with it once I explained how things worked. (I think some of them even started using encrypted disk images even within their offices.)
The Remate plugin simply disables TextMate's dreadful refresh-on-focus feature.
Download, unpack, double-click and choose "Disable Refresh on Regaining Focus" from the "Window" menu (you can refresh manually by right-clicking the project in the drawer). Voila!
If you are accessing the data from your personal computer, it is in your RAM, so we will assume that you just can't store it on your hard drive, floppy, USB stick, etc.
Your solution is a RAM drive. Copy the files you need to edit there using whatever method you prefer (I would suggest source control) and then you can edit them without lag. When you are done commit them back to the server.
As was pointed out, your editor may be caching changes to your temp directory, or maybe even your swap file (if it is in memory, it can get swapped out). The solution to that is to get a much larger RAM drive and run a virtual machine inside it. I'm not sure what OS you are running, but you can get a pretty slim install of most OSes if all you are doing is editing source code.
If you don't have enough RAM, then get a Gigabyte i-RAM solid state drive and remove the battery, that way it will lose everything when you power down.
Set your VMware configuration to not allow the guest OS to swap any of the virtual machine's memory. Keep a baseline VM on your hard drive and copy it to your RAM drive before booting it up. Then you can use the hard drive in the VM like a normal disk, even though it is really RAM.
It might be a good idea to run a secure erase on your RAM drive before powering down. Also keep in mind that researchers have found that if you super-cool a RAM chip before removing it from a running computer and place it in another computer quickly enough, the data may still be intact.
I guess it all comes down to how detailed that policy is, and how it is interpreted.
Good luck!
Short answer: there is no trick that will fix this. CIFS is really geared towards a LAN with reasonably calm traffic, so you have zero chance of avoiding intermittent lag when accessing a share through a VPN. At some point the editor needs to access the file with blocking I/O, because it makes no real sense to do otherwise.
You could switch editors and use Emacs + TRAMP, which is designed to work with remote files.

Resources