I have set up table-level InnoDB database encryption on MariaDB.
I'd like to know if there is any way to confirm that the data is truly encrypted. I've tried searching /var/lib/mysql/ibdata1 for sample data in the tables, but I don't know if that's a reliable test or not.
I posted this question on mariadb.com, and the suggestion there was to perform a grep for some known data.
A DBA at Rackspace suggested using the strings command instead, to better handle the binary data, for example:
strings /var/lib/mysql/sample_table/user.ibd | grep "knownuser"
This approach returns no results on an encrypted table and does return results on an unencrypted table (assuming both have "knownuser" loaded into them).
You can query information_schema.INNODB_TABLESPACES_ENCRYPTION. When an InnoDB tablespace is encrypted, it is present in that table.
SELECT * FROM information_schema.INNODB_TABLESPACES_ENCRYPTION
WHERE NAME LIKE 'db_encrypt%';
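As a broader sanity check, you can also compare that view against the full list of InnoDB tablespaces to spot any that are not encrypted. A minimal sketch, assuming a MariaDB version that still exposes information_schema.INNODB_SYS_TABLESPACES (column and view names can differ between versions):
-- Tablespaces with no row in the encryption view are presumably unencrypted.
SELECT ts.NAME
FROM information_schema.INNODB_SYS_TABLESPACES ts
LEFT JOIN information_schema.INNODB_TABLESPACES_ENCRYPTION e ON ts.SPACE = e.SPACE
WHERE e.SPACE IS NULL;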
My advice for testing is to copy the full dataset to another node without the encryption keys in place and try to start MySQL and query the encrypted tables. I'm making a (big) assumption that they will not be readable since the valid encryption keys are missing.
Parsing the files on disk as they lie may prove difficult unless you have a special tool to do it. Maybe something like Jeremy Cole's innodb_ruby would be another litmus test: https://github.com/jeremycole/innodb_ruby.
[This probably doesn't work if you change the key that encrypts the log.]
Stop the database server.
Back up the keyfile.
Change a key in the keyfile. (Don't delete it - the entry still has to be a valid key, otherwise the server can't restart.)
Start MariaDB again.
Try to read the table (e.g. with phpMyAdmin).
If the table is encrypted correctly, trying to read it produces an answer along the lines of "The table is encrypted...".
Stop MariaDB.
Restore the keyfile from the backup.
Restart MariaDB.
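To illustrate the expected outcome: with the tampered key in place, a query against the encrypted table should fail rather than return rows. The exact error text depends on the MariaDB version; the database and table names below come from the strings example earlier:
-- Attempted read while the keyfile contains the wrong key (illustrative).
SELECT * FROM sample_table.user LIMIT 1;
-- Expected: an error along the lines of "Table ... is encrypted but encryption
-- service or used key_id is not available", instead of any result rows.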
I am facing an issue where I have to decrypt a DB column in Snowflake. The transformation that decrypts the column is a Unix command. How do I achieve this decryption in Snowflake?
If:
1. you have a row of normal data with one column that is encrypted, and
2. you are not prepared to decrypt the column before loading the data into Snowflake, and
3. you are also not prepared to decrypt the column after result rows are returned from Snowflake by a query,
then point 2 would imply you either cannot decrypt client-side, or you need the results for some form of JOIN/filtering, in which case it would make more sense to store the data unencrypted.
Your reference to the decryption being a command-line tool implies you are encrypting the whole file/pipe stream, which does not match your column-level reference.
But if you have to decrypt in Snowflake, you will need to implement a JavaScript UDF to do that. You might find the Using Binary Data docs helpful.
You can't run Unix commands in the Snowflake environment.
If you can't do client-side decryption on the way in or out, you have to figure out what the Unix command actually does, and hopefully you will be able to recreate it using the Cryptographic/Checksum functions.
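For illustration, the skeleton of such a JavaScript UDF might look like the following; the function name and argument are made up, and the body is only a stub where the re-implemented decryption logic would go:
CREATE OR REPLACE FUNCTION decrypt_col(ciphertext VARCHAR)
RETURNS VARCHAR
LANGUAGE JAVASCRIPT
AS
$$
  // Arguments are referenced in upper case inside a Snowflake JavaScript UDF.
  // Replace this stub with logic that reproduces what the Unix command does.
  return CIPHERTEXT;
$$;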
I want to transfer table data from SQL Server to Informix and vice versa.
The transfer should run on a schedule, and sometimes when the user performs a specific action.
I currently do this with delete and insert transactions, and it takes a long time over the web - between 15 and 30 minutes.
How can I do this operation in an easy way, taking performance into consideration?
Say I have:
a Vacation table in SQL Server, and I want to transfer all the updated data to the Vacation table in Informix,
and
a Permission table in Informix, and I want to transfer all the updated data to the Permission table in SQL Server.
DISCLAIMER: I am not an SQL Server DBA. However, I have been an Informix DBA for over ten years and can make some recommendations as to its performance.
Disclaimer aside, it sounds like you already have a functional application, but the performance is a show-stopper and that is where you are mainly looking for advice.
There are some technical pieces of information that would be helpful to know, but in their absence, I'm going to make the following assumptions about your environment and application. Please comment or edit your question if I am wrong on any of these.
Database server versions. From the tags, it appears you are using SQL Server 2012. However, I cannot determine the Informix version. I will assume you are running at least IDS 11.50.
How the data is being exchanged currently. Are you connecting directly from your .NET application to Informix? I would assume that is the case with SQL Server and will make the same assumption for your Informix connection as well.
Table structures. I assume you have proper indexing on the tables. On the Informix side, dbschema -d *dbname* -t *tablename* will give the basic schema.
If you haven't tried exporting the data to CSV, and as long as you don't have any compliance concerns with doing so, I would suggest loading the data from a delimited flat file. (Informix normally deals with pipe-delimited files, so you'll either need to change the delimiter on the SQL Server side to a pipe |, or adjust it on the Informix import side.) On the Informix end, this would be a
LOAD FROM 'source_file_from_sql_server' DELIMITER '|' INSERT INTO vacation (field1, field2, ..)
For reusability, I would recommend putting this in a stored procedure. Just wrap that load statement inside a BEGIN WORK; and COMMIT WORK; to keep your transactional integrity. Michał Niklas suggested some ways to track changes. If there is any correlation between the transfer of data to the vacation table in Informix and the permission table back in SQL Server, I would propose another option, which is adding a trigger to the vacation table so that you write all new values to a staging table.
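As a rough sketch of that staging-table trigger (the table and column names here are purely illustrative, not taken from your schema):
-- Staging table that mirrors the columns the transfer needs.
CREATE TABLE vacation_staging (
    emp_id     INTEGER,
    start_date DATE,
    end_date   DATE
);
-- Copy every newly inserted vacation row into the staging table.
CREATE TRIGGER vacation_ins INSERT ON vacation
    REFERENCING NEW AS post
    FOR EACH ROW (INSERT INTO vacation_staging VALUES (post.emp_id, post.start_date, post.end_date));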
With the import logic in a stored procedure, you can fire the import on demand:
EXECUTE PROCEDURE vacation_import();
You also mentioned the need to schedule the import, which can be accomplished with Informix's "dbcron". Using this feature, you'll create a scheduled task that executes vacation_import() periodically as well. If you haven't used this feature before, using OAT will be helpful. You will also want to do some housekeeping with the CSV files. This can be addressed with the system() call, which you can make from stored procedures in Informix.
Some ideas:
Add a was_transferred column to the source tables, setting its default value to 0 (you can use 0/1 instead of false/true).
From the source table, select the rows with was_transferred = 0.
After transferring the data, update each selected source row, setting its was_transferred to 1.
Make a table syncro_info with fields like date_start and date_stop. If you find a record with date_stop IS NULL, it means a transfer is still in progress; this protects you against synchronizing the data twice. Both ideas are sketched below.
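A minimal Informix-flavoured sketch of both ideas (table, column, and type choices are illustrative; adjust the syntax for the SQL Server side):
-- Flag column on the source table; 0 = not yet transferred, 1 = transferred.
ALTER TABLE vacation ADD (was_transferred SMALLINT DEFAULT 0);
-- Rows still waiting to be transferred.
SELECT * FROM vacation WHERE was_transferred = 0;
-- Mark them once the transfer has succeeded.
UPDATE vacation SET was_transferred = 1 WHERE was_transferred = 0;
-- Bookkeeping table; a row with date_stop IS NULL means a transfer is still running.
CREATE TABLE syncro_info (
    date_start DATETIME YEAR TO SECOND,
    date_stop  DATETIME YEAR TO SECOND
);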
Could someone please explain how to obtain a list of all existing databases on a PostgreSQL server, to which the user already has access, using Qt? PostgreSQL documentation suggests the following query:
SELECT datname FROM pg_database WHERE datistemplate = false;
What are the correct parameters to the following functions:
QSqlDatabase::setDatabaseName(const QString & name) //"postgres" or "pg_database"?
QSqlDatabase::setUserName(const QString & name) //actual user name?
QSqlDatabase::setPassword(const QString & password) //no password? or user password?
Much appreciated. Thank you in advance.
You appear to have already answered the first part of your question. Connect to the postgres or template1 database and issue the query you've quoted above to get a list of databases. I'm guessing - reading between the lines - that you don't know how to connect to PostgreSQL to send that query, and that's what the second part of your question is about. Right?
If so, the QSqlDatabase accessor functions you've mentioned are used to set connection parameters, so the "correct" values depend on your environment.
If you want to issue the query above - to list databases - then you would probably want to connect to the postgres database, as it always exists and isn't generally used for anything specific; it's there just to be connected to. That means you'd call setDatabaseName("postgres");. Passing pg_database to setDatabaseName would be nonsensical, since pg_database is the pg_catalog.pg_database table; it isn't a database you can connect to. pg_database is one of those odd tables that exists in every database, which might be what confused you.
With the other two accessors, specify the appropriate username and password for your environment, the same as you'd use for psql; there's no way I could tell you which ones to use.
Note that if you set a password but one isn't required - because authentication is done over a unix socket with ident, trust, or some other non-password scheme - the password will be ignored.
If this doesn't cover your question, consider editing it and explaining your problem in more detail. What have you tried? What didn't work as you expected? Error messages? Qt version?
I need to encrypt the data while taking a mysqldump of the database from the command prompt. My OS is Windows 7. Please help me.
Can't you just pipe the dump output directly through your encryption tool?
ie:
mysqldump mydb | some-encryption-tool.sh
By the way, the only reason I suggested piping directly through an encryption tool is so the (unsafe) plain-text version never exists on disk, which is the only interpretation of the question that makes sense. Otherwise, just save the dump to a file and encrypt it - there is nothing to "answer".
I have an SQLite database that I want to open using sqlite3.exe. Now I get an error when I try to run queries, saying "file is encrypted or is not a database". This may seem stupid, but I've been looking around on the internet and I just can't find how to supply a password (or key) to sqlite3.exe to decrypt the database. The -help option and the .help command of sqlite3.exe don't show anything for that... Is it possible, and if so, how can I do it?
It is unlikely that the database would be encrypted, unless you have a reason to believe it is. Are you able to open the database at all, or are you getting this error once you issue some SQL query? If it's the former, your file is probably either not an sqlite db to begin with, or it is corrupted; if it's the latter, please check the integrity of your db with:
pragma integrity_check;
See http://www.sqlite.org/pragma.html#pragma_integrity_check for more info about this pragma.
In any case, unless your db is really encrypted (which sqlite does not support natively), your db is probably unusable.
SQLite reports that error when you pass it a file which is either not actually a SQLite database, or alternatively has been corrupted. There are several SQLite addons to support encryption, but other than that SQLite doesn't have encryption.
It can also happen when you try and open a SQLite v3 database with SQLite v2 (and possibly for other version mismatches).
Assuming you have experienced corruption (and not just passing the wrong file, or using the wrong version of SQLite), you may want to check the PRAGMA synchronous settings you're using, and also review the list of fixed data-corrupting bugs.
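For reference, the setting mentioned above can be inspected and changed with pragmas like these (a minimal illustration; see the SQLite documentation for the exact semantics of each value):
PRAGMA synchronous;        -- show the current setting (0=OFF, 1=NORMAL, 2=FULL)
PRAGMA synchronous = FULL; -- the traditionally recommended safe setting against corruption on power loss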
Check out this forum thread; the poster had the same question as you. The thing is that the standard sqlite3 API does not offer any form of protection, but you can try System.Data.SQLite. This is the code posted on the forum:
#include <SQLite.au3> ; don't include SQLite.dll.au3 !!!
; Load System.Data.SQLite instead of the stock sqlite3.dll so the
; "pragma key" encryption support is available.
_SQLite_Startup("System.Data.SQLite.dll")
ConsoleWrite(_SQLite_LibVersion() & @LF)
_SQLite_Open("testcrypt.db")
; Set the encryption key first, then create and populate a test table.
_SQLite_Exec(-1, "pragma key = 'Radu is happy!';create table if not exists test (id integer, val text);" & _
"insert into test values (1, 'abc');")
Local $row
_SQLite_QuerySingleRow(-1, "select * from test;", $row)
ConsoleWrite($row[1] & @LF)
_SQLite_Close()
_SQLite_Shutdown()
Hope that helps.