How to use newsecret in capacitor-community/sqlite when creating connection - sqlite

In my Ionic 5 app, I am using the capacitor-community/sqlite plugin to create and encrypt a database. I am doing the following, as per the documentation:
await this.sqlite.createConnection('database1', false, "no-encryption", 1);
await this.sqlite.createConnection('database1', true, "encryption", 1);
await this.sqlite.createConnection('database1', true, "secret", 1);
The mode "encryption" is to be used when you have an existing non-encrypted database and you want to encrypt it.
The mode "secret" is to be used when you want to open an encrypted database.
The mode "newsecret" is to be used when you want to change the secret of an encrypted database to the newsecret.
A secret and a newsecret are maintained in the configuration file as encryption passwords. When I create the connection with the secret it works fine, but I am unable to use the newsecret.
await this.sqlite.createConnection('database1', true, "newsecret", 1);
The above code is supposed to change my connection secret, but it is not working. It executes with no error, but when I then run await this.db.open();, it fails with the error "Open command failed: Failed in openOrCreateDatabase Wrong Secret". I could not find the correct way to implement this method in the official documentation.

Related

Using Vault UI to get secrets

I have the following policies:
path "/kv/dev/*" {
capabilities = ["read","list", "update"]
}
path "/kv/data/dev/*" {
capabilities = ["read","list", "update"]
}
Using the CLI I am able to use the following command to get the secrets:
vault kv get -mount=kv dev/db
And it outputs the secrets correctly. The issue occurs when using the UI:
- With the input of dev/db I get Ember Data Request POST /v1/sys/capabilities-self returned a 400 Payload (application/json) [object Object]
- With the input of /data/dev/db I get undefined is not an object (evaluating 'n.data')
Any advice on how to access the secrets using the UI?
I think I got to the state you are looking for. Let me share what I did:
First, I specified in my terminal what I need in terms of my Vault:
export VAULT_TOKEN='the token I use to authenticate myself in the UI'
export VAULT_ADDR='my vault address'
Then I log in the same way I would in the UI:
vault login -method=token token=$VAULT_TOKEN
Creating the policy:
vault policy write my-policy - << EOF
path "/kv/dev/*" {
capabilities = ["read","list", "update"]
}
path "/kv/data/dev/*" {
capabilities = ["read","list", "update"]
}
EOF
Enabling the secrets engine for a specific path, as you can see in this StackOverflow question:
vault secrets enable -path=kv kv
Inserting and reading a secret:
vault kv put kv/dev/db value=yes
vault kv get -mount=kv dev/db
After all of these steps I can see the secret at:
VAULT_ADDR/ui/vault/secrets/kv/show/dev/db
So, if VAULT_ADDR were http://127.0.0.1:8200, the full path in the browser would be:
http://127.0.0.1:8200/ui/vault/secrets/kv/show/dev/db

How to use AWS SSM parameter for token in provider github?

This is the code snippet in my main.tf file:
provider "github" {
token = var.github_token_ssm
owner = var.owner
}
data "github_repository" "github" {
full_name = var.repository_name
}
The GitHub token is stored in an AWS Secrets Manager parameter.
If the value of the token is a hardcoded GitHub token, it works fine.
If the value of the token is an AWS Secrets Manager parameter (e.g. arn:aws:secretsmanager:us-east-1:xxxxxxxxxxxx:secret:xxxx-Github-t0UOOD:xxxxxx), it does not work.
I don't want to hardcode the GitHub token in the code. How can I use the Secrets Manager parameter for the token above?
As far as I know, Terraform does not resolve an AWS Secrets Manager ARN passed as a provider argument for you (though you can use Vault to store secrets).
You can also deploy it with a TF_VAR environment variable:
export TF_VAR_db_username=admin TF_VAR_db_password=adifferentpassword
You can also run a script that pulls the secret from AWS and stores it in an environment variable.
Just remember to secure your state file (the password will exist there in clear text).
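As a sketch of an alternative (not part of the original answer; it assumes the AWS provider is configured in the same configuration and reuses the ARN and variable names from the question), the AWS provider's aws_secretsmanager_secret_version data source can read the secret value at plan time:

```hcl
data "aws_secretsmanager_secret_version" "github_token" {
  secret_id = "arn:aws:secretsmanager:us-east-1:xxxxxxxxxxxx:secret:xxxx-Github-t0UOOD"
}

provider "github" {
  token = data.aws_secretsmanager_secret_version.github_token.secret_string
  owner = var.owner
}
```

Note that the secret value read this way still ends up in the Terraform state, so the state file must be secured either way.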

How to set appsettings.json sensitive data into encrypted/hashed form using PowerShell and decrypt in C# code

I am having a problem securing the sensitive information stored as plain text in appsettings.json.
Currently, my application is in .NET Core, reading those sensitive configurations from appsettings.json, where they are stored in plain text.
For example:
{
  "UserName": "ABC",
  "Password": "xyz"
}
I want to make them encrypted/secure/masked so that an unauthorized user could not read the data. Another way could be to encrypt appsettings.json at deployment time and decrypt it in memory when the configuration is used. How can I do that?
Any help would be appreciated.
With .NET Core you can use encrypted JSON by using the package EncryptedJsonConfiguration. To add the package, install it using the NuGet manager or PowerShell:
Install-Package Miqo.EncryptedJsonConfiguration
Then in your Program.cs:
var key = Convert.FromBase64String(Environment.GetEnvironmentVariable("SECRET_SAUCE"));
Host.CreateDefaultBuilder(args)
    .ConfigureAppConfiguration((hostingContext, config) =>
    {
        config.AddEncryptedJsonFile("settings.ejson", key);
    })
...
Then in your startup file:
services.AddJsonEncryptedSettings<AppSettings>(_configuration);
For more info you should check : https://github.com/miqoas/Miqo.EncryptedJsonConfiguration
The recommended way to create encrypted files is the Kizuna command line tool.
Read more : https://github.com/miqoas/Kizuna
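The SECRET_SAUCE environment variable above holds a base64-encoded key. One way to generate such a key (a sketch, assuming OpenSSL is available and that the library expects a 256-bit key; the variable name comes from the snippet above) is:

```shell
# Generate 32 random bytes (a 256-bit key) and base64-encode them
KEY=$(openssl rand -base64 32)
export SECRET_SAUCE="$KEY"
echo "$SECRET_SAUCE"
```

Set this in the deployment environment rather than committing it anywhere next to the encrypted settings file.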

How to decrypt file from Google Cloud Storage?

I have an encrypted file stored in a Google Cloud Storage bucket that was generated with the following command line:
gcloud kms encrypt --location=global --keyring=my-keyring --key=my-key --plaintext-file=my-file --ciphertext-file=my-file.enc
I am now trying to decrypt such file in a Cloud Run service with the following code:
const kms = require('@google-cloud/kms');
const client = new kms.KeyManagementServiceClient();
const file = storage.bucket("my-bucket").file('my-file.enc');
const name = client.cryptoKeyPath('projectId', 'global', 'my-keyring', 'my-key');
let encrypted = (await file.download())[0];
const [result] = await client.decrypt({name, encrypted });
I am getting the following error:
Error: Decryption failed: verify that 'name' refers to the correct CryptoKey.
Which, according to this, is a misleading message and should be understood as the ciphertext not being properly decrypted. I cannot shake the feeling that I am missing a base64 encode/decode somewhere, but I don't seem to find the solution.
If I run the decryption from the command-line it works just fine.
Any help is very appreciated.
Thanks.
EDIT:
Problem solved thanks to this awesome community. Here are the steps to make this work, in case others face the same issue:
Encrypt the file using the following command line and upload it via the web UI.
gcloud kms encrypt --location=global --keyring=my-keyring --key=my-key --plaintext-file=my-file --ciphertext-file=my-file.enc
Decrypt using the following code:
const kms = require('@google-cloud/kms');
const client = new kms.KeyManagementServiceClient();
const file = storage.bucket("my-bucket").file('my-file.enc');
const name = client.cryptoKeyPath('projectId', 'global', 'my-keyring', 'my-key');
let encrypted = (await file.download())[0];
const ciphertext = encrypted.toString('base64');
const [result] = await client.decrypt({name, ciphertext});
console.log(Buffer.from(result.plaintext, 'base64').toString('utf8'))
I spot a few things here:
Assuming your command is correct, my-file-enc should be my-file.enc instead (dot vs dash)
Verify that projectId is being set correctly. If you're populating this from an environment variable, console.log and make sure it matches the project in which you created the KMS key. gcloud defaults to a project (you can figure out which project by running gcloud config list and checking the core/project attribute). If you created the key in project foo, but your Cloud Run service is looking in project bar, it will fail.
When using --ciphertext-file to write to a file, the data is not base64 encoded. However, you are creating a binary file. How are you uploading that binary string to Cloud Storage? The most probable culprit seems to be an encoding problem (ASCII vs UTF) which could cause the decryption to fail. Make sure you are writing and reading the file as binary.
Looking at the Cloud KMS Node.js documentation, it specifies that the ciphertext should be "exactly as returned from the encrypt call". The documentation says that the KMS response is a base64-encoded string, so you could try base64-encoding your data in your Cloud Run service before sending it to Cloud KMS for decryption:
let encrypted = (await file.download())[0];
let encryptedEncoded = encrypted.toString('base64');
const [result] = await client.decrypt({name, ciphertext: encryptedEncoded});
You may want to take a look at Berglas, which automates this process. There are really good examples for Cloud Run with node.
For more patterns, check out Secrets in Serverless.

Evernote Windows SDK Csharp SampleAppAdvanced - can't authenticate

I am trying to run the Csharp sampleAppAdvanced from this code https://github.com/evernote/evernote-cloud-sdk-windows
I substituted the consumer key and secret with those that I got in the email when I requested the API key.
ENSessionAdvanced.SetSharedSessionConsumerKey("xyz","123","sandbox.evernote.com");
if (ENSession.SharedSession.IsAuthenticated == false)
{
ENSession.SharedSession.AuthenticateToEvernote();
}
But I always end up hitting an error at this point
ENNoteRef myRef = ENSession.SharedSession.UploadNote(myNoteAdv, null);
With an exception reading "Exception of type 'EvernoteSDK.ENAuthExpiredException' was thrown."
On the console the error reads "EvernoteSDK: ENSession is unauthenticating."
Am I missing something? I know the application is authorized for access.
The other sample code, called sampleApp, doesn't throw an error, but doesn't display notes either.
When you registered for the API key, did you choose Basic Access or Full Access?