How to establish a connection to DynamoDB in Python using boto3 - amazon-dynamodb

I am a bit new to AWS and DynamoDB.
My aim is to embed a small piece of code that connects to DynamoDB.
The problem I am facing is how to make the connection in Python code. I made a connection using the AWS CLI by entering the access key ID and secret key.
But how do I do it in my code, as I wish to deploy my code on other systems?
Thanks in advance!

First of all, read the documentation for boto3 DynamoDB; it's pretty simple:
http://boto3.readthedocs.io/en/latest/reference/services/dynamodb.html
If you want to provide access keys while connecting to DynamoDB, you can do the following:

import boto3

client = boto3.client(
    'dynamodb',
    aws_access_key_id='yyyy',
    aws_secret_access_key='xxxx',
    region_name='***'
)
But remember, storing such keys in code is against security best practices.
For the best security posture, use IAM roles.
boto3 will automatically pick up the IAM role if it is attached to the instance.
Link to the docs: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
Also, if IAM roles are too complicated, you can install the aws-cli and run aws configure on your server, and boto3 will use the keys from there (less secure than the previous approach).
After implementing one of these options, you can connect to DynamoDB without any keys in the code:
client = boto3.client('dynamodb', region_name='***')
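As a quick sanity check after either setup, a minimal call that lists your tables (the region name here is just an example):

import boto3

client = boto3.client('dynamodb', region_name='us-east-1')  # example region
print(client.list_tables()['TableNames'])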

Related

Unable to connect to snowflake database in airflow using SnowflakeOperator with private key

I am trying to connect to a Snowflake database in Airflow by using the SnowflakeOperator, which uses a Snowflake connection created in the Airflow UI (the linked picture below shows the top part of my Snowflake connection), where all the essential fields are populated, including Host, Login, Warehouse, Account, Database, Role and Private key (Text). However, when triggering the DAG I am facing the following error:
TypeError: Password was given but private key is not encrypted.
I have not been able to find an answer to this, so it might be a simple issue I am missing (I am very new to software engineering), such as needing to encrypt the key with some tool. Any help or tips here would be appreciated!
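For what it's worth, that error typically means a passphrase (the Password field) was supplied together with a private key that is not encrypted. One fix is to clear the Password field in the connection; the other is to encrypt the key with that passphrase so the two match. A minimal sketch of the latter, assuming the cryptography package (the file names and passphrase are placeholders):

from cryptography.hazmat.primitives import serialization

# Load the existing unencrypted PEM key.
with open("rsa_key.p8", "rb") as f:
    key = serialization.load_pem_private_key(f.read(), password=None)

# Re-serialize it encrypted with the passphrase used in the connection.
encrypted = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"my-passphrase"),
)

with open("rsa_key_encrypted.p8", "wb") as f:
    f.write(encrypted)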

AWS CloudWatch with mobile applications

I have a backend system built in AWS and I'm utilizing CloudWatch in all of the services for logging and monitoring. I really like the ability to send structured JSON logs into CloudWatch that are consistent and provide a lot of context around the log message. Querying the logs to get to the root of an issue, or just to explore the health of the environment, is simple - that makes CloudWatch a must-have for my backend.
Now I'm working on the frontend side of things: mobile applications using Xamarin.Forms. I know AWS has Amplify, but I really wanted to stick with Xamarin.Forms as that's a skill set I've already got and I'm comfortable with. Since Amplify didn't support Xamarin.Forms, I've been stuck looking at other options for logging - one of them being Microsoft's AppCenter.
If I go the AppCenter route I'll end up having to build out a mapping of the AppCenter installation identifier and my users between the AWS environment and the AppCenter environment. Before I start down that path I wanted to ask a couple of questions around best practice and the security of an alternative approach.
I'm considering using the AWS SDK for .NET, creating an IAM role with a policy that allows X-Ray and CloudWatch PUT operations on a specific log group, and then assigning it to an IAM user. I can issue access keys for the user and embed them in my app's config files. This would let me send log data right into CloudWatch from the mobile apps using something like NLog.
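For reference, a minimal sketch of such a policy, with a hypothetical account ID and log group name (the exact action list depends on which calls the SDK makes):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "arn:aws:logs:*:123456789012:log-group:/mobile/app-logs:*"
    },
    {
      "Effect": "Allow",
      "Action": ["xray:PutTraceSegments", "xray:PutTelemetryRecords"],
      "Resource": "*"
    }
  ]
}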
I noticed with AppCenter I have to provide a client secret to the app, which wouldn't be any different from providing an IAM user access key to my app for pushing into CloudWatch. I'm typically a little shy about issuing access keys from AWS, but as long as the policy is tight I can't think of any negative side-effects... other than someone flooding me with log data should they pull the key out of the app data.
An alternative route I'm exploring: instead of embedding the access keys in my config files, I could request them from my API services and hold them in memory. The only downside is that logging might be a pain when the user doesn't have internet connectivity (I will need to look at how NLog handles sinks that aren't currently available - queueing and flushing).
Is there anything else I'm not considering or is this approach a feasible solution with minimal risk?

How to read from a SQLite database in ROBLOX

I literally have no idea how to do this; I thought of using an HTTP server.
Roblox provides its own cloud-hosted database for free use via its DataStore database abstraction API. If you really wanted to, you could create your own database server hosted on something like AWS and allow only requests with specific HTTP headers to retrieve/store information.
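A minimal sketch of that second approach, assuming a small Flask app in front of the SQLite file (the database file, table, port and API key are all placeholders); the game would then call the endpoint through HttpService, sending the same header:

import sqlite3
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
API_KEY = "replace-with-a-long-random-secret"  # hypothetical shared secret

@app.route("/items/<int:item_id>")
def get_item(item_id):
    # Reject callers that don't send the expected header.
    if request.headers.get("X-Api-Key") != API_KEY:
        abort(403)
    conn = sqlite3.connect("game.db")  # hypothetical SQLite file
    row = conn.execute(
        "SELECT name, value FROM items WHERE id = ?", (item_id,)
    ).fetchone()
    conn.close()
    if row is None:
        abort(404)
    return jsonify({"name": row[0], "value": row[1]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)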

How do I pass secrets stored in AWS Secrets Manager to a Docker container in SageMaker?

My code is in R, and I need to access an external database. I am storing the database credentials in AWS Secrets Manager.
So I first tried using the paws library to get AWS secrets in R, but that would require storing an access key, secret ID and session token, and I want to avoid that.
Is there a better way to do this? I have created an IAM role for SageMaker. Is it possible to pass secrets as environment variables?
Edit: I wanted to trigger SageMaker Processing.
I found a simple solution: environment variables can be passed via the SageMaker SDK, which minimizes the dependencies.
https://sagemaker.readthedocs.io/en/stable/api/training/processing.html
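For example, a minimal sketch using the SageMaker Python SDK's ScriptProcessor; the image URI, role ARN, secret name and script name are placeholders:

import boto3
from sagemaker.processing import ScriptProcessor

# Read the secret once on the client side, then inject it as an env var.
secret = boto3.client("secretsmanager").get_secret_value(
    SecretId="my-db-credentials"  # placeholder secret name
)["SecretString"]

processor = ScriptProcessor(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-r-image:latest",
    command=["Rscript"],
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    env={"DB_CREDENTIALS": secret},  # visible inside the container
)
processor.run(code="process.R")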
As another answer suggested, paws can be used as well to get the secrets from AWS; that would be a better approach.
You should be able to use paws for this. According to the documentation, it will use the IAM role configured for your SageMaker instance:
"If you are running the package on an instance with an appropriate IAM role, Paws will use it automatically and you don't need to do anything extra."
You only have to add the relevant access permissions (e.g. Allow ssm:GetParameters) to the SageMaker IAM role.
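If the secrets live in Secrets Manager rather than Parameter Store, the permission to grant is secretsmanager:GetSecretValue; a minimal sketch of the policy statement (the secret ARN is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-db-credentials-*"
    }
  ]
}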

Post to Azure Cosmos Db from NiFi

I created an Azure Cosmos DB database and a container for my documents.
I use NiFi as the main data ingestion tool and want to feed my container with documents from NiFi flow files.
Can anybody please share a way to post flowfile content to Azure Cosmos DB from NiFi?
Thanks in advance.
UPDATE (2019-05-26):
In the end I used a Python script, called from NiFi, to post the messages, passing each message as a parameter. The reason I chose Python is that the official Microsoft site has examples with all the required connection settings and libraries, so it was easy to connect to Cosmos.
I tried the Mongo processors, but couldn't connect to Azure (the security config didn't work); I didn't go very far with them, as the Python script worked just fine.
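For reference, a minimal sketch of such a script using the current azure-cosmos Python package (a newer v4 API than the 2019-era SDK; the endpoint, key, database and container names are placeholders). NiFi's ExecuteStreamCommand processor can pipe the flowfile content to the script's stdin:

import json
import sys

from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://<account>.documents.azure.com:443/",  # placeholder endpoint
    credential="<primary-key>",                        # placeholder key
)
container = client.get_database_client("mydb").get_container_client("mycontainer")

# Read the flowfile content from stdin; the document must carry an "id" field.
doc = json.loads(sys.stdin.read())
container.upsert_item(doc)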
Azure Cosmos DB exposes a MongoDB API, so you can use the following MongoDB processors, which are available in NiFi, to read/query/write to and from Azure Cosmos DB:
DeleteMongo
GetMongo
PutMongo
PutMongoRecord
RunMongoAggregation
Useful Links
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-introduction
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-feature-support
Valeria, according to the list of Azure components supported by Apache NiFi, only Azure Blob Storage, Queue Storage, Event Hub, etc. are available - Cosmos DB is not included.
So, I suggest using PutAzureBlobStorage to feed an Azure Blob container with the documents from your NiFi flow files, then creating a copy activity pipeline in Azure Data Factory to transfer the data from Azure Blob Storage into Azure Cosmos DB.
