Consuming Client-Side Encrypted Data from Snowflake

I am trying to ingest client-side encrypted data files from S3 into Snowflake, and I want to query the data in Snowflake in readable form using Snowflake SQL.
I encrypted the data file using AES-256 and placed it in S3. I also followed the prerequisites of setting up my external stage with a MASTER_KEY (AES-256, base64-encoded). However, when I query the data, it is not returned in readable form.
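For reference, I generated the base64-encoded master key along the lines of this minimal Java sketch (the printed string is what went into the stage's MASTER_KEY parameter):

    import java.util.Base64;
    import javax.crypto.KeyGenerator;

    // Minimal sketch: generate a 256-bit AES key and base64-encode it,
    // which is the form the stage's MASTER_KEY parameter expects.
    public class MasterKeyGen {
        public static void main(String[] args) throws Exception {
            KeyGenerator gen = KeyGenerator.getInstance("AES");
            gen.init(256);
            byte[] raw = gen.generateKey().getEncoded();
            System.out.println(Base64.getEncoder().encodeToString(raw));
        }
    }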
I would like to know whether client-side encrypted data can be read in the clear in Snowflake, given the right authentication and authorization (without having to unload it back to S3).
Thanks in advance.

Related

Decrypt in Snowflake using Unix command

I am facing an issue where I have to decrypt a DB column in Snowflake. The transformation to decrypt the column is a Unix command. How do I achieve this decryption in Snowflake?
Consider these three points:
1. You have a row of normal data and one column that is encrypted.
2. You are not prepared to decrypt the column prior to loading the data into Snowflake.
3. You are also not prepared to decrypt the column after result rows are returned from Snowflake via a query.
Point 2 would imply you either cannot decrypt client-side, or you need the results to do some form of JOIN/filtering on, in which case it would make sense to store the data non-encrypted.
Referring to decryption as a command-line tool implies you are encrypting the whole file/pipe-stream, which does not match your column reference.
But if you have to decrypt in Snowflake, you will need to implement a JavaScript UDF to do that. You might find the Using Binary Data docs helpful.
You can't run Unix commands in the Snowflake environment.
If you can't do client-side decryption on the way in or out, you have to figure out what the Unix command actually does and hopefully recreate it using the Cryptographic/Checksum functions.
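For completeness, if decrypting on the way out (dropping assumption 3 above) turns out to be acceptable, a minimal sketch of doing that over the Snowflake JDBC driver could look like the following. The account URL, credentials, table, column, key, IV, and the AES/CBC mode are all placeholders that must match whatever the Unix command actually does:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Base64;
    import java.util.Properties;
    import javax.crypto.Cipher;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;

    // Sketch: query Snowflake over JDBC and decrypt one column client-side.
    public class ClientSideDecrypt {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("user", "MY_USER");         // placeholder
            props.put("password", "MY_PASSWORD"); // placeholder
            byte[] key = Base64.getDecoder().decode("base64KeyHere"); // placeholder
            byte[] iv = new byte[16]; // placeholder; must match the encryptor's IV

            try (Connection conn = DriverManager.getConnection(
                     "jdbc:snowflake://myaccount.snowflakecomputing.com/", props);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT id, secret_col FROM my_table")) {
                while (rs.next()) {
                    // Transformation must mirror the unix command's cipher/mode/padding.
                    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
                    cipher.init(Cipher.DECRYPT_MODE,
                        new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
                    byte[] plain = cipher.doFinal(
                        Base64.getDecoder().decode(rs.getString("secret_col")));
                    System.out.println(rs.getInt("id") + " -> "
                        + new String(plain, "UTF-8"));
                }
            }
        }
    }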

Encryption at rest for HBase data

http://hbase.apache.org/book.html#_server_side_configuration_3
I have checked the URL above, where the data is encrypted based on the Java java.security.KeyStore. But we need to keep the .jks file containing the master key on all the HBase servers (all Master and RegionServers have to have this file).
NOTE: the password to open the file is also given in hbase-site.xml.
For HDFS alone there is an option to keep the keystore file on the KMS server, but not for HBase. For now we still need to keep it in the local store.
I don't need the KMS option. I need a way to keep the key in a common place that can be accessed, instead of having the same file on all the servers.
Is there any method/custom class available to get the master key from common storage like a DB/Redis/ZooKeeper?
UPDATE #1
Someone has asked a similar question, but there is no solution there: Encrypt HBase at-rest data in Cloud.
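The HBase book page linked above also documents a hbase.crypto.keyprovider setting (the default is KeyStoreKeyProvider, which reads the local .jks), and in principle you can point it at your own class implementing org.apache.hadoop.hbase.io.crypto.KeyProvider. Below is a minimal sketch of a provider that pulls the key bytes from ZooKeeper instead of a local file; the znode path and connection details are hypothetical:

    import java.security.Key;
    import javax.crypto.spec.SecretKeySpec;
    import org.apache.hadoop.hbase.io.crypto.KeyProvider;
    import org.apache.zookeeper.ZooKeeper;

    // Hypothetical sketch: fetch the master key from ZooKeeper rather than
    // a local .jks file. HBase instantiates this class when it is named in
    // hbase.crypto.keyprovider in hbase-site.xml.
    public class ZookeeperKeyProvider implements KeyProvider {

        private ZooKeeper zk;

        @Override
        public void init(String params) {
            try {
                // params could carry the quorum, e.g. "zk1:2181,zk2:2181"
                zk = new ZooKeeper(params, 30000, event -> { });
            } catch (Exception e) {
                throw new RuntimeException("Cannot connect to ZooKeeper", e);
            }
        }

        @Override
        public Key getKey(String alias) {
            try {
                // Assumes the raw 256-bit key bytes live at /hbase-keys/<alias>
                byte[] raw = zk.getData("/hbase-keys/" + alias, false, null);
                return new SecretKeySpec(raw, "AES");
            } catch (Exception e) {
                throw new RuntimeException("Cannot fetch key for alias " + alias, e);
            }
        }

        @Override
        public Key[] getKeys(String[] aliases) {
            Key[] keys = new Key[aliases.length];
            for (int i = 0; i < aliases.length; i++) {
                keys[i] = getKey(aliases[i]);
            }
            return keys;
        }
    }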

Transferring encrypted data using Sqoop

Is this use case possible:
To first extract data, encrypt it, transfer it over the network, then decrypt it and load it into Hive or HDFS using Sqoop?
You can achieve this by following the steps below (a sketch of step 2 follows the list):
1. Use the sqoop codegen tool to generate the mapper code that handles deserialization of table data.
2. Modify this code to encrypt the data read from the table; each instance represents one row.
3. Now run the sqoop import command, which will use this modified mapper code to generate encrypted data that is transmitted to HDFS.
4. Use decryption logic over the output files in HDFS to get back the content.
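As a sketch of step 2, a helper like the one below could be called from the generated class before each row is written out. The hard-coded key and ECB mode are only to keep the sketch short; in practice load the key from a keystore and prefer an authenticated mode such as GCM:

    import java.util.Base64;
    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    // Hypothetical helper invoked from the sqoop-codegen generated record
    // class; the generated class has one getter/setter per table column.
    public class FieldCrypto {

        // Placeholder 256-bit key; do not hard-code keys in real jobs.
        private static final byte[] KEY =
            "0123456789abcdef0123456789abcdef".getBytes();

        public static String encrypt(String plaintext) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(KEY, "AES"));
            return Base64.getEncoder()
                .encodeToString(cipher.doFinal(plaintext.getBytes("UTF-8")));
        }

        public static String decrypt(String ciphertext) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(KEY, "AES"));
            return new String(
                cipher.doFinal(Base64.getDecoder().decode(ciphertext)), "UTF-8");
        }
    }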

Decryption issue when running Presto queries in EMR for data encrypted using an AWS client-side master key

I have used your latest script, which successfully installs Presto server (version 0.99) and Java 8 on an Amazon EMR instance. My data files are located in an S3 bucket and are encrypted with a client-side, customer-managed key. When I create a Hive table that references those encrypted data files in S3, Hive can successfully decrypt the records and display them in the console. However, when viewing the same external table from the Presto command-line interface, the data is displayed in its encrypted form. I have looked at the link given in
https://prestodb.io/docs/current/release/release-0.57.html and added those properties to my hive.properties file, which now looks like the following:
hive.s3.connect-timeout=2m
hive.s3.max-backoff-time=10m
hive.s3.max-error-retries=50
hive.metastore-refresh-interval=1m
hive.s3.max-connections=500
hive.s3.max-client-retries=50
connector.name=hive-hadoop2
hive.s3.socket-timeout=2m
hive.s3.aws-access-key=***
hive.s3.aws-secret-key=**
hive.metastore.uri=thrift://localhost:9083
hive.metastore-cache-ttl=20m
hive.s3.staging-directory=/mnt/tmp/
hive.s3.use-instance-credentials=true
Any help on how to decrypt the files using the Presto CLI will be much appreciated.
We will follow up in the issue you filed: https://github.com/facebook/presto/issues/2945
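For anyone landing here later: Presto releases newer than the 0.99 build above added a hive.s3.encryption-materials-provider property, which names a class (on the Hive plugin classpath) implementing the AWS SDK's EncryptionMaterialsProvider and hands Presto the client-side master key. A minimal sketch, assuming a static AES-256 key; the key bytes are placeholders:

    import java.util.Map;
    import javax.crypto.spec.SecretKeySpec;
    import com.amazonaws.services.s3.model.EncryptionMaterials;
    import com.amazonaws.services.s3.model.EncryptionMaterialsProvider;

    // Hypothetical sketch: supply the same client-side master key the
    // files in S3 were encrypted with. Load real key bytes from a secure
    // store rather than a constant.
    public class StaticKeyMaterialsProvider implements EncryptionMaterialsProvider {

        private static final byte[] MASTER_KEY = new byte[32]; // placeholder

        private final EncryptionMaterials materials =
            new EncryptionMaterials(new SecretKeySpec(MASTER_KEY, "AES"));

        @Override
        public EncryptionMaterials getEncryptionMaterials(Map<String, String> desc) {
            return materials;
        }

        @Override
        public EncryptionMaterials getEncryptionMaterials() {
            return materials;
        }

        @Override
        public void refresh() {
            // nothing to refresh for a static key
        }
    }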

CommonCrypto in iOS [sqlite file encryption]

I'm using the Core Data API in my iOS application. I'm also using the CommonCrypto library (CCCrypt()) to encrypt/decrypt the database file (the .sqlite file) that resides in the Documents folder when the application state changes (background/foreground).
The problem I'm facing is that some of the records in the database get lost when the application is killed manually by the user from the background state, and this issue is inconsistent.
I'm just converting the .sqlite file contents into NSData and using that as the input to the CCCrypt() function to encrypt/decrypt, and I'm not decoding any input data in the crypt operation.
Can someone please help me understand what could be the reason for the data loss, and why it happens only when the application gets killed manually from the background state? For both the encryption and decryption operations, the CCCrypt function returns the status kCCSuccess.
Do I need to apply any sort of decoding to the input data (raw bytes) before the CCCrypt operation?
Probably you should use the application life-cycle methods to save your data before the app gets killed.
Try saving data in applicationWillTerminate. Go through the discussion "applicationWillTerminate: when is it called and when not" for more details about saving data before getting killed.
