How to export DynamoDB and import to another aws account? - amazon-dynamodb

I am doing an exercise using a serverless model with several services (Cognito, API Gateway, Lambda, DynamoDB).
I want to back up a DynamoDB table and import it into another AWS account (my customer's account).
I don't know what to do. Please help me! Thank you.
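One common approach is to read the table with credentials for the source account and write the items with credentials for the destination account. Below is only a minimal sketch: it assumes two named AWS CLI profiles (source and customer) and that the destination table already exists with the same key schema; the profile, table, and region names are placeholders.

import boto3

# Two sessions, one per account; the profile names are assumptions.
src = boto3.Session(profile_name='source').resource('dynamodb', region_name='us-east-1')
dst = boto3.Session(profile_name='customer').resource('dynamodb', region_name='us-east-1')

src_table = src.Table('MyTable')   # hypothetical table name
dst_table = dst.Table('MyTable')

# Paginate through the source table and batch-write each page to the destination.
scan_kwargs = {}
while True:
    page = src_table.scan(**scan_kwargs)
    with dst_table.batch_writer() as batch:
        for item in page['Items']:
            batch.put_item(Item=item)
    if 'LastEvaluatedKey' not in page:
        break
    scan_kwargs['ExclusiveStartKey'] = page['LastEvaluatedKey']

Note that a full scan of a large table can be slow and consumes read capacity, so DynamoDB's managed backup and export features may be a better fit at scale.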

Related

Where can I find the Performance Insights metric `db.SQL.total_query_time.avg` in CloudWatch?

There is a useful metric from AWS RDS Performance Insights called db.SQL.total_query_time.avg (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/USER_PerfInsights_Counters.html#USER_PerfInsights_Counters.Aurora_PostgreSQL).
I would like to set up an alarm for it. However, I cannot find it anywhere in CloudWatch. Does anyone know if it exists in CloudWatch?
Amazon RDS Performance Insights metrics are not shown on the AWS CloudWatch metrics dashboard, but you can query them through the Performance Insights API. For example, you could create a custom AWS Lambda function that queries those metrics and triggers an alert using AWS SNS. Below are links to the relevant SDK and CLI APIs, followed by a sketch of such a function.
AWS SDK (boto3)
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/pi.html#PI.Client.get_resource_metrics
AWS CLI
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/pi/get-resource-metrics.html
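As a rough illustration of the Lambda-plus-SNS idea described above, here is a minimal sketch that polls the counter via the Performance Insights API and publishes an SNS message when it crosses a threshold. The DbiResourceId, topic ARN, and threshold are all assumptions you would replace with your own values.

import boto3
from datetime import datetime, timedelta

pi = boto3.client('pi')
sns = boto3.client('sns')

THRESHOLD = 5.0  # alert threshold, an assumption
TOPIC_ARN = 'arn:aws:sns:us-east-1:111111111111:pi-alerts'  # hypothetical topic

def lambda_handler(event, context):
    now = datetime.utcnow()
    resp = pi.get_resource_metrics(
        ServiceType='RDS',
        Identifier='db-ABCDEFGHIJKL',  # DbiResourceId of your instance (assumption)
        MetricQueries=[{'Metric': 'db.SQL.total_query_time.avg'}],
        StartTime=now - timedelta(minutes=5),
        EndTime=now,
        PeriodInSeconds=60,
    )
    # Walk the returned datapoints and alert on any value above the threshold.
    for result in resp['MetricList']:
        for point in result['DataPoints']:
            if point.get('Value', 0) > THRESHOLD:
                sns.publish(
                    TopicArn=TOPIC_ARN,
                    Subject='Performance Insights alert',
                    Message='db.SQL.total_query_time.avg = {}'.format(point['Value']),
                )

Scheduling this function with an EventBridge rule every few minutes gives you an alarm-like behavior without the metric ever appearing in CloudWatch.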

How to export all kinds (tables) to a local system?

I have a project whose database is in Cloud Datastore. Now I want to take a backup of all kinds, including all their entities, on a local system. How can this be done? I have checked the Cloud documentation, i.e.
1 - https://cloud.google.com/datastore/docs/export-import-entities#exporting_entities
2 - https://cloud.google.com/sdk/gcloud/reference/datastore/export
but it describes how to export data from Cloud Datastore to Cloud Storage, not to a local system. Please let me know if anyone knows how this can be done.
Thanks!
It is not possible to get the Managed Export service to export directly to your local filesystem, so you'll need to export your entities to GCS. To use the exports on your local machine, copy them down, then import them into the Datastore emulator; a sketch of the import step follows.
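The emulator exposes an import endpoint on its local port. As a rough sketch, assuming the emulator is running on its default localhost:8081 and the export has already been copied to the local machine (the project ID and file path are placeholders):

import requests

PROJECT_ID = 'my-project'  # hypothetical project ID
# Local path to the overall metadata file from the copied export.
EXPORT_METADATA = '/path/to/export/export.overall_export_metadata'

# POST to the emulator's import endpoint; 8081 is the default emulator port.
resp = requests.post(
    'http://localhost:8081/v1/projects/{}:import'.format(PROJECT_ID),
    json={'input_url': EXPORT_METADATA},
)
resp.raise_for_status()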
I do something like this, but I had to create my own exporter and importer; see my answer to this question: https://stackoverflow.com/a/52767415/4458510
To do this I wrote a Google Dataflow job that exports selected models and saves them in Google Cloud Storage in JSONL format. Then on my local host I have an endpoint called /init/ which launches a task queue job to download these exports and import them.
To do this I reuse my JSON REST handler code, which is able to convert any model to JSON and vice versa.

How to establish a connection to DynamoDB in Python using boto3

I am a bit new to AWS and DynamoDB.
My aim is to embed a small piece of code.
The problem I am facing is how to make a connection in Python code. I made a connection using the AWS CLI by entering an access ID and key.
But how do I do it in my code, since I wish to deploy my code on other systems?
Thanks in advance!
First of all, read the documentation for boto3 DynamoDB; it's pretty simple:
http://boto3.readthedocs.io/en/latest/reference/services/dynamodb.html
If you want to provide access keys while connecting to dynamo, you can do the following:
client = boto3.client('dynamodb', aws_access_key_id='yyyy', aws_secret_access_key='xxxx', region_name='***')
But remember, it is against security best practices to store such keys within the code.
For best security, use IAM roles.
The boto3 driver will automatically use the IAM role if it is attached to the instance.
Link to the docs: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
Also, if IAM roles are too complicated, you can install the AWS CLI and run aws configure on your server, and boto3 will use the keys from there (less secure than the previous approach).
After implementing one of the options, you can connect to DynamoDB without the keys from code:
client = boto3.client('dynamodb', region_name='***')
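As a minimal sketch of what this looks like end to end, once credentials are resolved by boto3's default chain (the table name, key schema, and region below are placeholders):

import boto3

# Credentials are resolved automatically: environment variables, the shared
# credentials file (~/.aws/credentials), or an attached IAM role.
dynamodb = boto3.resource('dynamodb', region_name='us-east-1')  # example region
table = dynamodb.Table('MyTable')  # hypothetical table name

table.put_item(Item={'pk': 'user#1', 'name': 'Alice'})
response = table.get_item(Key={'pk': 'user#1'})
print(response.get('Item'))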

AWS IoT to third-party DynamoDB service

Using the AWS IoT rule functionality, I can define a rule that maps MQTT data to DynamoDB. Is it possible, instead of using a DynamoDB table in the same account, to use a DynamoDB resource from a different account to achieve the same result? If so, how can it be achieved?
You can use cross-account credentials to access resources belonging to another AWS account in an IoT rule. This process is described in detail in the blog here; a rough sketch of one pattern follows.
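One common pattern is to have the rule invoke a Lambda function that assumes a role in the other account and writes the item itself. This is only a sketch: the role ARN, region, table name, and attribute names are all assumptions, and the role in the target account must trust the Lambda's account.

import boto3

def lambda_handler(event, context):
    # Assume a role in the customer's account (the ARN is hypothetical).
    sts = boto3.client('sts')
    creds = sts.assume_role(
        RoleArn='arn:aws:iam::222222222222:role/IotDynamoWriter',
        RoleSessionName='iot-rule-write',
    )['Credentials']

    # DynamoDB client built from the temporary cross-account credentials.
    dynamodb = boto3.client(
        'dynamodb',
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'],
        region_name='us-east-1',  # example region
    )

    # Write the incoming MQTT payload; the attribute names are assumptions.
    dynamodb.put_item(
        TableName='TelemetryTable',  # hypothetical table in the other account
        Item={
            'deviceId': {'S': str(event.get('deviceId', 'unknown'))},
            'payload': {'S': str(event)},
        },
    )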

Firebase Hosting on own server

I am looking for a solution for developing iOS and Android chat to replace our current (unreliable, maybe poorly written by previous devs) XMPP/OpenFire chat. I came across Firebase which looks good. However, I don't quite get the setup for it.
Can I host Firebase on my own server and not have to subscribe to any of Firebase's plans?
Firebase offers a few products:
the Firebase realtime database
Firebase hosting (for hosting static resources)
Firebase authentication
I think you are looking for the Firebase realtime database.
There is no way currently to host the Firebase realtime database on your own servers.
Probably too late to be of any help, but an alternative is RethinkDB. It is an open source realtime database and can be installed on your own machines.
Never used it myself; just researching my options like you.
One more tool to add to the list is Appwrite. It is a self-hosted solution that seems to be inspired by Firebase. It has much if not all of Firebase's functionality. PS: I am not in any way associated with the project, just a (happy) user.
You should check if RESTHeart fits your needs. It's mainly a REST, GraphQL, and WebSocket API on top of MongoDB, but it has many additional features.
Another open source Firebase alternative is Supabase.
Currently it supports a Postgres realtime database, authentication, and storage only.
And it can be installed on your own server; see the Supabase self-hosting docs.
