We have an application that can use both Postgres and SQLite as its backend, depending on what the customer requires: High load and concurrency, or easy setup.
But I fear that some customers will be too clever for their own good: to get around the (slightly) complex Postgres installation, they will use SQLite but put the file on a network disk.
Then the inevitable happens and the database gets corrupted, because SQLite is not meant for that; we have Postgres support for that very reason.
Is there a reliable way to prevent SQLite from working on a network drive? There are some questionable ideas, like looking for a \\ at the beginning, or the colon in "C:\" (it's purely a Windows app), or parsing for IPs, but none of these are foolproof and all can be circumvented by making a link to a network disk.
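For what it's worth, here is a minimal sketch of the link-resolving idea, written in Java purely for illustration (the helper name and heuristics are mine, and it is explicitly not foolproof): resolve symlinks and junctions with toRealPath(), then look for a UNC prefix. Mapped drive letters still slip through without a Win32 call such as GetDriveType(), which plain Java does not expose.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class NetworkPathCheck {

        // Heuristic only: resolve links/junctions, then check for a UNC prefix.
        // Mapped drive letters (e.g. Z:\ pointing at a share) are NOT caught here.
        static boolean looksLikeNetworkPath(Path dbFile) throws IOException {
            Path real = dbFile.toAbsolutePath();
            if (Files.exists(dbFile)) {
                // toRealPath() follows symlinks and NTFS junctions, so a link
                // that points at a share resolves to its real target first.
                real = dbFile.toRealPath();
            }
            return real.toString().startsWith("\\\\");
        }

        public static void main(String[] args) throws IOException {
            // Hypothetical default path, just for the demo.
            Path db = Paths.get(args.length > 0 ? args[0] : "C:\\data\\app.db");
            System.out.println(db + " -> looks like a network path? " + looksLikeNetworkPath(db));
        }
    }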
It is not a good idea to use a SQLite database, for write access, on a CIFS share. Understood.
I have a need to do so on a very infrequent basis. The database is written very infrequently on the Windows server (actually Windows 10, and roughly once every few weeks) and equally infrequently from the Linux server (Ubuntu 16.04.02, if it matters). The chances of simultaneous writes are near zero (which is not zero, of course).
As I understand it (and I am not sure I do) using the "nobrl" option on the mount allows this to work (and indeed it does work for me), but does so by disabling locking entirely (right? Unless there are other types?).
Is there a technique, without deploying code on the Windows side, to ensure that this is in fact safe -- for example, options for SQLite that might not be the default? Locking the entire database is perfectly acceptable during the update on the Ubuntu side, performance is not an issue, and simultaneous access is not required. The main restriction is that I cannot change the process on the Windows side.
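No guarantees here, but as a starting point for experiments on the Ubuntu side, this is a small sketch (assuming the xerial sqlite-jdbc driver and placeholder file and table names) of the non-default settings usually mentioned in this context: locking_mode=EXCLUSIVE to hold the lock for the whole session, and a rollback journal rather than WAL, since WAL is documented not to work over network filesystems. None of this by itself makes SQLite safe on a CIFS share mounted with nobrl.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CifsWriter {
        public static void main(String[] args) throws Exception {
            // Path on the CIFS mount; adjust to the real share (placeholder).
            String url = "jdbc:sqlite:/mnt/share/app.db";
            try (Connection conn = DriverManager.getConnection(url)) {
                try (Statement st = conn.createStatement()) {
                    // Hold the write lock for the whole connection, not per statement.
                    st.execute("PRAGMA locking_mode = EXCLUSIVE");
                    // Stick to a rollback journal; WAL relies on shared memory
                    // and is documented not to work on network filesystems.
                    st.execute("PRAGMA journal_mode = DELETE");
                }
                conn.setAutoCommit(false);
                try (Statement st = conn.createStatement()) {
                    // Hypothetical table, just to show a write inside one transaction.
                    st.executeUpdate("UPDATE settings SET value = 'x' WHERE key = 'y'");
                }
                conn.commit();
            }
        }
    }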
I can't seem to find an application to monitor SQLite DB performance. Currently I have a test server that uses SQLite. I'm primarily concerned with obtaining a benchmark of storage requirements and performance for scaling this server to production.
I know for MySQL there is the standard Nagios for monitoring (changing to MySQL is not an option at this point). Is there anything analogous for SQLite?
SQLite has functions like sqlite3_status() and sqlite3_db_status(), but those do not really give you the information you want, and might not even be available in all languages.
Anyway, SQLite is an embedded library, so you'd have to monitor your actual application. Tools like Nagios allow you to monitor a server's CPU load and disk usage, but you can also use any other tool your OS provides.
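If the application is written in something JDBC-friendly, a crude but workable option is to poll the database itself for its size and log that, for example from a small scheduled job whose output a Nagios check could scrape. A minimal sketch, assuming the xerial sqlite-jdbc driver and a placeholder database path:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SQLiteSizeCheck {

        private static long queryLong(Statement st, String pragma) throws Exception {
            try (ResultSet rs = st.executeQuery(pragma)) {
                rs.next();
                return rs.getLong(1);
            }
        }

        public static void main(String[] args) throws Exception {
            // Placeholder path to the test server's database file.
            try (Connection conn = DriverManager.getConnection("jdbc:sqlite:/srv/test/app.db");
                 Statement st = conn.createStatement()) {
                long pageCount = queryLong(st, "PRAGMA page_count");     // pages in the file
                long pageSize  = queryLong(st, "PRAGMA page_size");      // bytes per page
                long freePages = queryLong(st, "PRAGMA freelist_count"); // unused pages
                System.out.printf("db size: %d bytes, %d free pages%n",
                        pageCount * pageSize, freePages);
            }
        }
    }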
Is there any program that can serve as a GUI front end for an SQLite3 database?
The general idea is to connect to the database remotely and administer it in a FileMaker-like GUI, where the online scripts would have the job of just presenting stuff.
I tried FileMaker with ODBC drivers, but have not been successful. SQLite3 is not directly supported, and I couldn't find another driver or suitable software online. (Please note that I am not talking about database editor software, but something that would be used as a database "CMS", if you will.)
Thanks!
Are you using a Mac? Try the ODBC driver from Actual Technologies; it works with most of the databases out there.
There is a more direct approach that I like better, because ODBC is not fast (for small amounts of data it's OK). Besides that, communicating directly with the external database without too much middleware is, in my opinion, always better.
MBS has a very nice plugin that supports a lot of SQL connections. It works really well and is fast. Perhaps you should take a look; I use it a lot now, having used ODBC a lot in the past.
I'm going to create a Java program that allows "locking" a USB drive by making its files accessible only with a password. Similar software that does this is USB Safeguard.
Here is what I am thinking of doing:
Store all files into a single archive on the USB.
Encrypt the archive using AES or Blowfish.
Hide the archive.
The problem is, how can I "unlock" the USB? What approach can I take here? Here is what I have thought of:
RAM disk: It is very hard, if not impossible, to load a RAM disk from an encrypted archive. While it may be feasible in C++, I think it would be much harder in Java and might involve messing with the system classes, which would kill the compatibility of the software and defeat the whole purpose of using Java.
Loading the unencrypted archive onto the USB drive - Nobody likes waiting 10 minutes just to view a file on a USB drive. Copying all the files might take some time. Also, what about free space on the USB drive?
Loading the unencrypted archive onto the hard drive - While very insecure and error-prone, this looks like the only feasible way to get it done.
Creating a custom file browser allowing the user to browse the archive - Do you use WinRAR to browse your files? Would you like doing it? No. A custom file browser would take a lot of time to create and, again, is an error-prone and user-unfriendly approach.
I can't think of any other way of doing this. Can anyone think of a better way? Note that this is going to be free and open-source software.
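For the encryption step itself, the standard javax.crypto API already covers what you describe. Below is a minimal sketch, assuming a password-derived key (PBKDF2) and AES/GCM so tampering is detected; the class name, file handling, and iteration count are illustrative only, and an audited tool or library is the safer choice for anything real.

    import javax.crypto.Cipher;
    import javax.crypto.CipherOutputStream;
    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.GCMParameterSpec;
    import javax.crypto.spec.PBEKeySpec;
    import javax.crypto.spec.SecretKeySpec;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.security.SecureRandom;

    public class ArchiveEncryptor {
        public static void encrypt(String inFile, String outFile, char[] password) throws Exception {
            SecureRandom rnd = new SecureRandom();
            byte[] salt = new byte[16];
            byte[] iv = new byte[12];
            rnd.nextBytes(salt);
            rnd.nextBytes(iv);

            // Derive a 256-bit AES key from the user's password.
            SecretKeyFactory kf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
            byte[] keyBytes = kf.generateSecret(
                    new PBEKeySpec(password, salt, 100_000, 256)).getEncoded();
            SecretKeySpec key = new SecretKeySpec(keyBytes, "AES");

            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));

            try (FileInputStream in = new FileInputStream(inFile);
                 FileOutputStream out = new FileOutputStream(outFile)) {
                out.write(salt); // salt and IV are not secret, only the key is
                out.write(iv);
                try (CipherOutputStream cout = new CipherOutputStream(out, cipher)) {
                    in.transferTo(cout); // stream-encrypt the whole archive
                }
            }
        }
    }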
TrueCrypt is free, open-source software for storing encrypted files on a storage device (e.g. a USB drive). It runs on Windows, Linux, and macOS, and it even allows hidden volumes. I would start with their source code and proceed from there.
I'm planning on doing more coding from home but in order to do so, I need to be able to edit files on a Samba drive on our dev server. The problem I've run into with several editors is that the network latency causes the editor to lock up for long periods of time (Eclipse, TextMate). Some editors cope with this a lot better than others, but are there any file system or other tweaks I can make to minimize the impact of lag?
A few additional points:
There's a policy against having company data on personal machines, so I'd like to avoid checking out the code locally.
The mount is over a PPTP VPN connection.
I'm mounting from a Linux or OS X client.
Use a source control system — Subversion, Perforce, Git, Mercurial, Bazaar, etc. — so you're never editing code on a shared server. Instead you should be editing a local work area and committing changes to a repository located on the network.
Also, convince your company to adapt their policy such that company code is allowed on personal machines if it's on an encrypted volume. Encrypted disk images that you can use for this are trivial to create using Disk Utility, and can use strong cryptography. You can get even more security by not storing your encryption passphrase in your keychain, and instead typing it every time you mount the encrypted volume; this means that even if your local user account is compromised, as long as you don't have the volume mounted, nobody else will be able to mount it.
I did this all the time when I was consulting and none of my clients — some of whom had similar rules about company code — ever had a problem with it once I explained how things worked. (I think some of them even started using encrypted disk images even within their offices.)
The Remate plugin simply disables TextMate's dreadful refresh-on-focus feature.
Download, unpack, double-click, and choose "Disable Refresh on Regaining Focus" from the "Window" menu (you can refresh manually by right-clicking the project in the drawer). Voilà!
If you are accessing the data from your personal computer, it is in your RAM, so we will assume that you just can't store it on your hard drive, floppy, USB stick, etc.
Your solution is a RAM drive. Copy the files you need to edit there using whatever method you prefer (I would suggest source control), and then you can edit them without lag. When you are done, commit them back to the server.
As was pointed out, your editor may be caching changes to your temp directory, or maybe even your swap file (if it is in memory, then it can get swapped out). The solution to that is to get a much larger RAM drive and run a virtual machine in it. I'm not sure what OS you are running, but you can get a pretty slim install of most OSes if all you are doing is editing source code.
If you don't have enough RAM, then get a Gigabyte i-RAM solid-state drive and remove the battery; that way it will lose everything when you power down.
Set VMware so that the host OS cannot swap any of the virtual machine's memory. Keep a baseline VM on your hard drive and copy it to your RAM drive before booting it up. Then you can use the VM's disk like a hard drive, even though it is actually RAM.
It might be a good idea to run a secure erase on your RAM drive before powering down. Also keep in mind that researchers have found that if you super-cool a RAM chip before removing it from a running computer, and place it in a new computer quickly enough, the data may still be intact.
I guess it all comes down to how detailed that policy is, and how it is interpreted.
Good luck!
Short answer: there is no trick. CIFS is really geared towards a LAN with reasonably calm traffic, so you have zero chance of avoiding intermittent lag when accessing a share through a VPN. At some point the editor needs to access the file with blocking I/O, because it makes no real sense to do otherwise.
You could switch editors and use Emacs + TRAMP, which is designed to work on remote files.