Snowflake encryption scenario

I have a use case to encrypt data while loading it from an S3 bucket into Snowflake tables. The S3 bucket has SSE-S3 enabled.
The files are additionally encrypted with a KMS key before they are pushed to S3 (which I like to call double encryption). I want to understand how Snowflake handles decryption of these data files. Specifically, is the data in transit (during auto-ingest) also encrypted?
Secondly, if the external stage in Snowflake is configured with the same KMS key ID, e.g.
encryption = (type = 'AWS_SSE_KMS' kms_key_id = 'xxxx-yyyy')
will Snowflake decrypt the data files and make them readable when the table into which they are loaded is queried?
Thanks in advance

Snowflake supports either client-side encryption or server-side encryption. Either can be configured to decrypt files staged in S3 buckets.
Client-side encryption:
AWS_CSE: Requires a MASTER_KEY value. The master key must be a 128-bit or 256-bit key in Base64-encoded form.
For more information, see the AWS documentation for client-side encryption. Note that for client-side encryption, Snowflake supports using a master key stored in Snowflake; using a master key stored in AWS Key Management Service (AWS KMS) is not supported.
Server-side encryption:
AWS_SSE_S3: Requires no additional encryption settings.
AWS_SSE_KMS: Accepts an optional KMS_KEY_ID value.
For more information, see the AWS documentation for server-side encryption.
Using AWS Key Management Service (KMS) to manage keys requires configuring an IAM policy. For information, see the KMS documentation.
Details: https://docs.snowflake.com/en/user-guide/data-load-s3-encrypt.html#aws-data-file-encryption
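For illustration, a stage definition for the AWS_SSE_KMS case might look like the sketch below, issued here through the Node.js snowflake-sdk driver. The connection values, stage name, bucket URL, storage integration, and file format are placeholders for this example, and the KMS key ID reuses the question's placeholder value:

import * as snowflake from "snowflake-sdk";

// Placeholder connection details; use your own account and credentials.
const connection = snowflake.createConnection({
  account: "myaccount",
  username: "myuser",
  password: process.env.SNOWFLAKE_PASSWORD as string,
});

// External stage pointing at the SSE-KMS-encrypted bucket. The storage
// integration name and bucket path are hypothetical names for this sketch.
const createStageSql = `
  CREATE OR REPLACE STAGE my_kms_stage
    URL = 's3://my-bucket/data/'
    STORAGE_INTEGRATION = my_s3_integration
    ENCRYPTION = (TYPE = 'AWS_SSE_KMS' KMS_KEY_ID = 'xxxx-yyyy')
    FILE_FORMAT = (TYPE = CSV);
`;

connection.connect((connectErr) => {
  if (connectErr) throw connectErr;
  connection.execute({
    sqlText: createStageSql,
    complete: (err, _stmt, rows) => {
      if (err) throw err;
      console.log("Stage created:", rows);
    },
  });
});

As noted above, the IAM policy behind the stage's credentials or storage integration also has to grant access to that KMS key.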

Related

Where to store an encryption key using Node.js?

I have been doing a lot of research, but I can't figure out where I should store the encryption key in a production environment.
In my local environment I have a .env file, but it feels very risky to keep the encryption key there in plain text in a production environment. I could encrypt it, but then I would just have another key to store somewhere.
I am not using AWS or any other big cloud platform, so I can't use AWS KMS etc.
I have looked into alternatives to AWS KMS, such as Doppler (doppler.com). You can store the key there, but to fetch it through their API they use tokens to authenticate the requests, so then I have to store that token somewhere safe... it feels like I'm just running a rat race.
So I really need help here. Where should I store the encryption key? Where would you (and where can you) store it if you were not using any big cloud platform?

How to implement Firestore data encryption?

I currently have one SPA in ReactJS + one mobile application in Flutter + one REST API developed with SailsJS running on a separate server. I manage user authentication with the secure session cookie generated by Firebase Authentication, which the API sends back when we log in with valid credentials (ID/password).
Now, I want to encrypt highly sensitive data (medicines, treatments, patients) in the Firestore database so that no one can see the data in the clear if an intrusion happens or through basic admin access to the console for the production database.
Do I need to encrypt the data at the client level, considering that the connection between the clients and the API server is over HTTPS? Or can I just encrypt the received body at the API level before storing it in Firestore and decrypt the encrypted data at the GET endpoints?
My idea is to generate an AES encryption key at user registration and store it in another database hosted by a European/French company, in order to avoid any risk with the US CLOUD Act or the like (user ID from Firebase Authentication <-> encryption key). Is this a good idea? What other solutions could I choose to securely store and use my users' encryption keys?
Thanks for your help.
Do I need to encrypt the data at the client level, considering that the connection between the clients and the API server is over HTTPS? Or can I just encrypt the received body at the API level before storing it in Firestore and decrypt the encrypted data at the GET endpoints?
If you encrypt/decrypt the data in your custom API, that API will need access to the encryption keys. While the chances are small, it does mean the keys could be taken from there and then used to compromise the data.
If you encrypt/decrypt the data in the client-side code, only that code will need access to those keys. If you then exchange the keys through some out-of-band mechanism, something that doesn't get stored on your servers along the way, there is no way for anyone with access to those servers to decrypt the data.
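As a rough sketch of that client-side approach, assuming the browser's Web Crypto API and AES-GCM (the key here is generated locally purely for illustration; in practice it would be the per-user key you exchange or derive out-of-band, never sent to your servers):

// Client-side AES-GCM sketch using the Web Crypto API.
async function generateKey(): Promise<CryptoKey> {
  return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, true, [
    "encrypt",
    "decrypt",
  ]);
}

async function encryptField(key: CryptoKey, plaintext: string) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh IV per value
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  // Store both parts (e.g. base64-encoded) in Firestore; neither reveals the plaintext.
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}

async function decryptField(key: CryptoKey, iv: Uint8Array, ciphertext: Uint8Array) {
  const plain = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  return new TextDecoder().decode(plain);
}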

How to encrypt the actual storage/volume used by Kubernetes pods using client-managed keys (least/zero knowledge of keys on the provider side)?

I want to have a per-client namespace and storage in my Kubernetes environment, where a dedicated instance of the app runs per client and only that client should be able to encrypt/decrypt the storage used by its app.
I have seen hundreds of examples of secrets encryption in a Kubernetes environment, but I'm struggling to achieve actual storage encryption that is controlled by the client. Is it possible to have storage encryption in a K8s environment where only the client has knowledge of the encryption keys (and not the K8s admin)?
The only thing that comes to mind, as already suggested in the comments, is HashiCorp Vault.
Vault is a tool for securely accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, or certificates. Vault provides a unified interface to any secret, while providing tight access control and recording a detailed audit log.
Some of the features you might want to check out:
API-driven interface
You can access all of its features programmatically through its HTTP API. In addition, there are several officially supported libraries for programming languages (Go and Ruby). These libraries make interacting with Vault's API even more convenient. There is also a command-line interface available.
Data Encryption
Vault is capable of encrypting/decrypting data without storing it. The main implication is that if an intrusion occurs, the attacker will not have access to the real secrets even if the attack is successful.
Dynamic Secrets
Vault can generate secrets on-demand for some systems, such as AWS or SQL databases. For example, when an application needs to access an S3 bucket, it asks Vault for credentials, and Vault will generate an AWS keypair with valid permissions on demand. After creating these dynamic secrets, Vault will also automatically revoke them after the lease is up. This means that the secret does not exist until it is read.
Leasing and Renewal
All secrets in Vault have a lease associated with them. At the end of the lease, Vault will automatically revoke that secret. Clients are able to renew leases via built-in renew APIs.
Convenient Authentication
Vault supports authentication using tokens, which is convenient and secure.
Vault can also be customized and connected to various plugins to extend its functionality. All of this can be controlled from a web graphical interface.
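As an illustration of the data-encryption feature described above, here is a rough sketch of calling Vault's transit secrets engine over its HTTP API. It assumes the transit engine is enabled at its default path, a per-client key (here client-a) already exists, and the Vault address and token are held by the client, not by the cluster admin:

// Encrypt/decrypt through Vault's transit engine: Vault never stores the data,
// and the raw key never leaves Vault. Requires Node 18+ for built-in fetch.
const VAULT_ADDR = process.env.VAULT_ADDR ?? "http://127.0.0.1:8200";
const VAULT_TOKEN = process.env.VAULT_TOKEN ?? "";

async function vaultEncrypt(keyName: string, plaintext: string): Promise<string> {
  const res = await fetch(`${VAULT_ADDR}/v1/transit/encrypt/${keyName}`, {
    method: "POST",
    headers: { "X-Vault-Token": VAULT_TOKEN, "Content-Type": "application/json" },
    body: JSON.stringify({ plaintext: Buffer.from(plaintext).toString("base64") }),
  });
  const body = await res.json();
  return body.data.ciphertext; // e.g. "vault:v1:..."
}

async function vaultDecrypt(keyName: string, ciphertext: string): Promise<string> {
  const res = await fetch(`${VAULT_ADDR}/v1/transit/decrypt/${keyName}`, {
    method: "POST",
    headers: { "X-Vault-Token": VAULT_TOKEN, "Content-Type": "application/json" },
    body: JSON.stringify({ ciphertext }),
  });
  const body = await res.json();
  return Buffer.from(body.data.plaintext, "base64").toString();
}

// Example: const sealed = await vaultEncrypt("client-a", "sensitive record");

Note that this encrypts data before the app writes it to its volume; it does not by itself encrypt the underlying volume blocks.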

Azure Key Vault - Obtaining Encryption Passphrase

I have two methods that perform encryption/decryption. These methods accept three parameters:
Plain Text (for encryption) or Cipher Text (for decryption)
Initialization Vector
Encryption Passphrase
I was planning on using Azure Key Vault to store the Encryption Passphrase, but as I read through the documentation it appears as though Azure insists on performing the encryption/decryption itself.
Is there a way to just read the Encryption Passphrase from the Azure Key Vault and use it within my own encryption methods?
You could store it as a secret in the Key Vault.
Encryption/decryption is only done by the Key Vault itself if you're using keys, not secrets.
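For example, with the @azure/keyvault-secrets and @azure/identity packages you can read the passphrase back and feed it into your own encryption methods; the vault URL and secret name below are placeholders:

import { DefaultAzureCredential } from "@azure/identity";
import { SecretClient } from "@azure/keyvault-secrets";

// Placeholder vault URL and secret name.
const client = new SecretClient(
  "https://my-key-vault.vault.azure.net",
  new DefaultAzureCredential()
);

async function getEncryptionPassphrase(): Promise<string> {
  const secret = await client.getSecret("encryption-passphrase");
  if (!secret.value) {
    throw new Error("Secret has no value");
  }
  return secret.value; // pass this into your own encrypt/decrypt methods
}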

Can OpenTok send an encryption key to Amazon S3 now that S3 supports server-side encryption with a user-supplied key?

Currently, Tokbox's OpenTok supports archiving to Amazon S3. Amazon S3 supports AES-256 encryption at no additional charge. They recently added the ability to submit a user-generated key to encrypt the files with, but the key must come with the PUT request when adding the file to their service. Can I submit an encryption key to Tokbox/OpenTok to provide to Amazon S3 when archiving?
Unfortunately, no, OpenTok archiving doesn't currently accept an encryption key.
Tokbox now offers encrypted archiving as a Premium or Enterprise feature:
https://tokbox.com/blog/introducing-encrypted-archiving/
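For background, with SSE-C the key has to accompany every PUT (and later every GET) of the object, which is why the uploader, in this case OpenTok, would have to support it explicitly. A rough sketch of such a PUT with the AWS SDK for JavaScript v2, using placeholder bucket and object names:

import S3 from "aws-sdk/clients/s3";
import { randomBytes } from "crypto";

const s3 = new S3({ region: "us-east-1" }); // placeholder region
const customerKey = randomBytes(32); // 256-bit key that only you hold

// SSE-C: the key travels with the request; S3 encrypts the object with it
// and does not store the key itself.
s3.putObject({
  Bucket: "my-archive-bucket", // placeholder bucket
  Key: "archives/session-1234.mp4", // placeholder object key
  Body: Buffer.from("placeholder archive bytes"),
  SSECustomerAlgorithm: "AES256",
  SSECustomerKey: customerKey, // the SDK sets the SSE-C request headers
})
  .promise()
  .then(() => console.log("uploaded with SSE-C"));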
