How to delete all the existing tables in DynamoDB? - amazon-dynamodb

I want to delete all the existing tables in my DynamoDB database.
Is there any way to do it?

You can use the AWS CLI to delete all tables, optionally skipping one table you want to keep.
aws dynamodb list-tables --profile your_profile | jq -r '.TableNames[]' | grep -v table_you_dont_want_to_delete | xargs -ITABLE -n 1 aws dynamodb delete-table --table-name TABLE --profile your_profile

You can delete a table using the code below.
var params = {
  TableName: 'table-name'
};
// ppJson is the pretty-print helper from the DynamoDB JavaScript shell;
// use console.log instead when running under plain Node.js.
dynamodb.deleteTable(params, function(err, data) {
  if (err) ppJson(err);  // an error occurred
  else ppJson(data);     // successful response
});

You have 2 options:
Manually go into the AWS console and delete every table
Programmatically: call ListTables (multiple calls if pagination is needed), iterate through the resulting TableNames, and call DeleteTable for each one, as sketched below.
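A minimal sketch of the programmatic option with the AWS SDK for JavaScript (v2), paginating through LastEvaluatedTableName; the region is an assumption and credentials are expected to come from your environment:
var AWS = require("aws-sdk");
var dynamodb = new AWS.DynamoDB({ region: "us-west-2" }); // region is an assumption

async function deleteAllTables() {
  var lastTable;
  do {
    // ListTables returns at most 100 names per call; LastEvaluatedTableName drives pagination
    var page = await dynamodb.listTables({ ExclusiveStartTableName: lastTable }).promise();
    for (const tableName of page.TableNames) {
      // If you hit LimitExceededException (limit on simultaneous table operations),
      // add a short delay between deletes
      await dynamodb.deleteTable({ TableName: tableName }).promise();
      console.log("Deleted", tableName);
    }
    lastTable = page.LastEvaluatedTableName;
  } while (lastTable);
}

deleteAllTables().catch(console.error);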

JavaScript to delete all tables:
var AWS = require("aws-sdk")
AWS.config.update({
  region: "us-west-2",
  endpoint: process.env.DYNAMODB_URL
})
var dynamodb = new AWS.DynamoDB();
dynamodb.listTables({}, function(err, data) {
  if (err) return console.error(err, err.stack)
  // use a block-scoped binding so each callback sees its own table name
  for (const tableName of data.TableNames) {
    dynamodb.deleteTable({TableName: tableName}, function(err, data) {
      if (err) console.error(err, err.stack)
      else console.log('Deleted', tableName)
    })
  }
})

The new console UI (2020) now allows you to select multiple tables and delete them.

require 'json'

tables = JSON.parse(`aws dynamodb list-tables`)["TableNames"]
tables.each do |table|
  `aws dynamodb delete-table --table-name #{table}`
  sleep 2 # only 10 tables can be created, updated, or deleted simultaneously
  puts "#{table} deleted"
end

To delete all the DynamoDB tables having a specific keyword in their names, simply run the following CLI commands:
tableList=$(aws dynamodb list-tables | jq -r '.TableNames[]' | grep tableNameKeyword)
for table in $tableList; do aws dynamodb delete-table --table-name "$table"; done

Related

DynamoDB: ReturnConsumedCapacity does not return RCU or WCUs

When writing to DynamoDB or reading from DynamoDB you can specify: ReturnConsumedCapacity.
When you do this, the API does return total CapacityUnits, but I am not able to get it to return ReadCapacityUnits or WriteCapacityUnits. The documentation indicates we should indeed get data on RCUs and WCUs: https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_ConsumedCapacity.html
This is true whether you set ReturnConsumedCapacity to TOTAL or INDEXES.
It is also true when simply doing a read query.
Is there any way to get RCUs and WCUs returned?
Here is a sample query:
aws.exe dynamodb query \
    --table-name tableName \
    --index-name GSI1 \
    --key-condition-expression "GSI1PK = :pk" \
    --expression-attribute-values '{":pk": {"S": "blah"}}' \
    --return-consumed-capacity TOTAL
which returns something like this:
"ConsumedCapacity": {
"TableName": "tableName",
"CapacityUnits": 128.5
}
If I change the request from TOTAL to INDEXES I get:
"ConsumedCapacity": {
"TableName": "oaas-performance-table-dev",
"CapacityUnits": 128.5,
"Table": {
"CapacityUnits": 128.5
}
}
}
Pretty much the same, in other words: no RCU or WCU breakdown.
Any idea how to get this additional data?
This is the intended behavior, per the Query docs:
TOTAL - The response includes only the aggregate ConsumedCapacity for the operation.
A query can only consume read capacity, so CapacityUnits is effectively the same thing as ReadCapacityUnits, and WriteCapacityUnits is always 0.
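As a hedged illustration with the AWS SDK for JavaScript (table, index, and key values mirror the CLI example above): since a Query only reads, the aggregate CapacityUnits it returns can be treated as the read capacity it consumed.
var AWS = require("aws-sdk");
var dynamodb = new AWS.DynamoDB({ region: "us-east-1" }); // region is an assumption

dynamodb.query({
  TableName: "tableName",
  IndexName: "GSI1",
  KeyConditionExpression: "GSI1PK = :pk",
  ExpressionAttributeValues: { ":pk": { S: "blah" } },
  ReturnConsumedCapacity: "TOTAL"
}, function (err, data) {
  if (err) return console.error(err);
  // a Query never writes, so this aggregate value is effectively the RCUs consumed
  console.log("Approximate RCUs consumed:", data.ConsumedCapacity.CapacityUnits);
});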

How do I delete multiple records from an AWS Amplify GraphQL API?

I've made a GraphQL API in an AWS Amplify React Native app. The API contains the model Transaction. AWS Amplify provides CRUD operations out of the box, and I can delete a single transaction no problem.
However, I would like to delete all transactions that meet certain criteria. How do I delete multiple transactions using this stack (AWS Amplify + GraphQL API, React Native)?
You can send one request that contains a batch of delete mutations. Use the generated listTransactions query to fetch the filtered transactions first, then build one mutation per transaction id:
import { API, graphqlOperation } from "aws-amplify";

// transactions: the filtered items you want to delete
const txnMutations: any = transactions.map((txn, i) => {
  return `mutation${i}: deleteTransaction(input: {id: "${txn.id}"}) { id }`;
});

await API.graphql(
  graphqlOperation(`
    mutation batchMutation {
      ${txnMutations.join("\n")}
    }
  `)
);
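For completeness, a hedged sketch of the fetch step; listTransactions and the filter fields follow Amplify's default codegen conventions and are assumptions, so adjust them to your schema:
import { API, graphqlOperation } from "aws-amplify";
// generated query; the import path depends on your project's codegen
import { listTransactions } from "./graphql/queries";

// fetch only the transactions that meet your criteria, e.g. a (hypothetical) category field
const result: any = await API.graphql(
  graphqlOperation(listTransactions, {
    filter: { category: { eq: "groceries" } },
    limit: 1000,
  })
);
const transactions = result.data.listTransactions.items;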

alexa skill local could not write to dynamodb

I am writing a Node.js skill using ask-sdk and using alexa-skill-local to test the endpoint. I need to persist data to DynamoDB in one of the handlers, but I keep getting a "missing region" error. Please find my code below:
'use strict';

// use 'ask-sdk' if standard SDK module is installed
const Alexa = require('ask-sdk');
const { launchRequestHandler, HelpIntentHandler, CancelAndStopIntentHandler, SessionEndedRequestHandler } = require('./commonHandlers');

const ErrorHandler = {
  canHandle() {
    return true;
  },
  handle(handlerInput, error) {
    return handlerInput.responseBuilder
      .speak('Sorry, I can\'t understand the command. Please say again.')
      .reprompt('Sorry, I can\'t understand the command. Please say again.')
      .getResponse();
  },
};

////////////////////////////////
// Code for the handlers here //
////////////////////////////////

exports.handler = Alexa.SkillBuilders
  .standard()
  .addRequestHandlers(
    launchRequestHandler,
    HelpIntentHandler,
    CancelAndStopIntentHandler,
    SessionEndedRequestHandler,
    ErrorHandler
  )
  .withTableName('devtable')
  .withDynamoDbClient()
  .lambda();
And in one of the handlers I am trying to get the persisted attributes like below:
handlerInput.attributesManager.getPersistentAttributes().then((data) => {
  console.log('--- the attributes are ----', data)
})
But I keep getting the following error:
(node:12528) UnhandledPromiseRejectionWarning: AskSdk.DynamoDbPersistenceAdapter Error: Could not read item (amzn1.ask.account.AHJECJ7DTOPSTT25R36BZKKET4TKTCGZ7HJWEJEBWTX6YYTLG5SJVLZH5QH257NFKHXLIG7KREDKWO4D4N36IT6GUHT3PNJ4QPOUE4FHT2OYNXHO6Z77FUGHH3EVAH3I2KG6OAFLV2HSO3VMDQTKNX4OVWBWUGJ7NP3F6JHRLWKF2F6BTWND7GSF7OVQM25YBH5H723VO123ABC) from table (EucerinSkinCareDev): Missing region in config
at Object.createAskSdkError (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\utils\AskSdkUtils.js:22:17)
at DynamoDbPersistenceAdapter.<anonymous> (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\attributes\persistence\DynamoDbPersistenceAdapter.js:121:45)
Can we read and write attributes from DynamoDB using alexa-skill-local? Do we need some different setup to achieve this?
Thanks
I know this is a really old topic, but I had the same problem a few days ago, and I'm going to explain how I got it to work.
You have to download DynamoDB Local and follow the instructions from here.
Once you have configured your local DynamoDB and checked that it is working, you have to pass the client through your code to the DynamoDbPersistenceAdapter constructor.
Your code should look similar to:
var awsSdk = require('aws-sdk');
var myDynamoDB = new awsSdk.DynamoDB({
  endpoint: 'http://localhost:8000', // if you change the default url, change it here
  accessKeyId: <your-access-key-id>,
  secretAccessKey: <your-secret-access-key>,
  region: <your-region>,
  apiVersion: 'latest'
});

const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');

return new DynamoDbPersistenceAdapter({
  tableName: tableName || 'my-table-name',
  createTable: true,
  dynamoDBClient: myDynamoDB
});
Where <your-access-key-id>, <your-secret-access-key> and <your-region> are defined in your AWS config and credentials files.
The next step is to launch your server with the alexa-skill-local command as always.
Hope this will be helpful! =)
Presumably you have an AWS config profile that your skill is using when running locally.
You need to edit the .config file and set the default region (e.g. us-east-1) there. The region should match the region where your table exists.
Alternatively, if you want to be able to run completely isolated, you may need to write some conditional logic and swap the DynamoDB client for one targeting an instance of DynamoDB Local running on your machine, as sketched below.
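A hedged sketch of that conditional swap, reusing the adapter approach from the answer above (the environment variable name and the local endpoint are assumptions):
const awsSdk = require('aws-sdk');
const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');

// when running under alexa-skill-local, point the client at DynamoDB Local;
// otherwise let the SDK pick up region/credentials from the environment
const useLocal = process.env.USE_DYNAMODB_LOCAL === 'true'; // hypothetical flag
const dynamoDBClient = useLocal
  ? new awsSdk.DynamoDB({ endpoint: 'http://localhost:8000', region: 'us-east-1' })
  : new awsSdk.DynamoDB();

const persistenceAdapter = new DynamoDbPersistenceAdapter({
  tableName: 'devtable',
  createTable: true,
  dynamoDBClient: dynamoDBClient
});
You could then either pass dynamoDBClient to .withDynamoDbClient(...) in the standard skill builder from the question, or use the adapter with a custom skill builder.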

Export json from Firestore

As we can download a JSON file from the Firebase RTDB console, is there any way to export Firestore collection/document data as a JSON file?
One of my main objectives is to compare the data before/after updating a document.
I just wrote a backup and restore tool for Firestore. You can try it out on my GitHub:
https://github.com/dalenguyen/firestore-backup-restore
Thanks,
There is not, you'd need to come up with your own process such as querying a collection and looping over everything.
Update
As of August 7th, 2018, we do have a managed export system that allows you to dump your data into a GCS bucket. While this isn't JSON, it is the same format that Cloud Datastore uses, so BigQuery understands it. That means you can then import it into BigQuery.
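If you want to trigger that managed export programmatically rather than from the console or gcloud, a hedged Node.js sketch with the Firestore Admin client could look like this (project id and bucket name are placeholders):
// npm install @google-cloud/firestore
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

async function exportToBucket() {
  const databaseName = client.databasePath('your-project-id', '(default)');
  const [operation] = await client.exportDocuments({
    name: databaseName,
    outputUriPrefix: 'gs://your-backup-bucket',
    collectionIds: [] // empty array = export all collections
  });
  console.log('Export operation started:', operation.name);
}

exportToBucket().catch(console.error);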
Google made it harder than it needed to be, so the community found a workaround. If you have npm installed, you can do this:
Export
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Import
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
Source
I've written a tool that traverses the collections/documents of the database and exports everything into a single json file. Plus, it will import the same structure as well (helpful for cloning/moving Firestore databases). Since I've had a few colleagues use the code, I figured I would publish it as an NPM package. Feel free to try it and give some feedback.
https://www.npmjs.com/package/node-firestore-import-export
If someone wants a solution using Python 2 or 3:
Edit: note that this does not back up the rules.
Fork it on https://github.com/RobinManoli/python-firebase-admin-firestore-backup
First install and set up the Firebase Admin Python SDK: https://firebase.google.com/docs/admin/setup
Then install it in your Python environment:
pip install firebase-admin
Install the Firestore module:
pip install google-cloud-core
pip install google-cloud-firestore
(from ImportError: Failed to import the Cloud Firestore library for Python)
Python Code
# -*- coding: UTF-8 -*-

import firebase_admin
from firebase_admin import credentials, firestore
import json

cred = credentials.Certificate('xxxxx-adminsdk-xxxxx-xxxxxxx.json')  # from firebase project settings
default_app = firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://xxxxx.firebaseio.com'
})

db = firebase_admin.firestore.client()

# add your collections manually
collection_names = ['myFirstCollection', 'mySecondCollection']
collections = dict()
dict4json = dict()
n_documents = 0

for collection in collection_names:
    collections[collection] = db.collection(collection).get()
    dict4json[collection] = {}
    for document in collections[collection]:
        docdict = document.to_dict()
        dict4json[collection][document.id] = docdict
        n_documents += 1

jsonfromdict = json.dumps(dict4json)

path_filename = "/mypath/databases/firestore.json"
# print() with % formatting works in both Python 2 and 3
print("Downloaded %d collections, %d documents and now writing %d json characters to %s" % (len(collection_names), n_documents, len(jsonfromdict), path_filename))

with open(path_filename, 'w') as the_file:
    the_file.write(jsonfromdict)
There is an npm package for Firestore export/import.
Project to export:
Go to Project settings -> Service accounts -> Generate new private key -> save it as exportedDB.json
Project to import:
Go to Project settings -> Service accounts -> Generate new private key -> save it as importedDB.json
Run these 2 commands from the folder where you saved the files:
Export:
npx -p node-firestore-import-export firestore-export -a exportedDB.json -b backup.json
Import:
npx -p node-firestore-import-export firestore-import -a importedDB.json -b backup.json
Firestore is still early in its development so please check the docs on backups for any information pertaining to Firestore.
I found this npm package, node-firestore-backup, to be easy and useful.
Note that the --accountCredentials path/to/credentials/file.json is referring to a service account key json file that you can get by following instructions from https://developers.google.com/identity/protocols/application-default-credentials.
Go to the API Console Credentials page.
From the project drop-down, select your project.
On the Credentials page, select the Create credentials drop-down, then select Service account key.
From the Service account drop-down, select an existing service account or create a new one.
For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
It works for me.
I used Cloud Functions to export all data in Firestore to JSON format. The function that I used:
// assumes firebase-admin has been initialized and db = admin.firestore()
exports.exportFirestore2Json = functions.https.onRequest((request, response) => {
  db.collection("data").get().then(function(querySnapshot) {
    const orders = [];
    var order = null
    querySnapshot.forEach(doc => {
      order = doc.data();
      orders.push(order);
    });
    response.send(JSON.stringify(orders))
    return true
  })
  .catch(function(error) {
    console.error("Error adding document: ", error);
    return false
  });
})
Then go to https://your-project-id.cloudfunctions.net/exportFirestore2Json and you will see something like this:
Yes you can, and you do not need to enable billing in your Firebase console. There is a great npm package, https://www.npmjs.com/package/firestore-export-import, with which you can export and import Firestore collections and documents easily. Just follow these steps:
- Get your service account key: open Firebase console > Project settings > Service accounts > Generate new private key
- Rename the downloaded file to serviceAccountKey.json
- Now create a new folder and an index.js file
- Place your serviceAccountKey.json in this folder
- Now install this package:
npm install firestore-export-import
OR
yarn add firestore-export-import
Exporting data from Firebase
const { initializeApp, backup } = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')
const fs = require('fs')

const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)

// backup('collection name')
backup('users').then((data) => {
  const json = JSON.stringify(data)
  // where collection.json is your output file name
  fs.writeFile('collection.json', json, 'utf8', () => {
    console.log('done')
  })
})
Execute node index.js and you should see a new collection.json file with your collection and documents in it. If it looks a little messy, pretty-print it online with
https://codebeautify.org/jsonviewer
This index.js is just a very basic configuration that exports the whole collection with everything in it; read the documentation, as you can do queries and much more!
Importing data to Firebase
const { initializeApp, restore } = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')

const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)

restore('collection.json', {
  // where refs is an array of key items
  refs: ['users'],
  // autoParseDates parses dates if documents have timestamps
  autoParseDates: true,
}, () => {
  console.log('done')
})
After execution you should see your Firestore populated with the users collection!
For uploading JSON from your local machine to Firestore:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
For downloading data from Firestore to your local machine:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
To generate credentials.json, go to Project settings -> Service accounts -> Generate a private key.
Create a blank folder (call it firebaseImportExport) and run npm init.
Go to the source Firebase project -> Settings -> Service Accounts.
Click on the Generate new private key button, rename the file to source.json and put it in the firebaseImportExport folder.
Do the same (steps 2 & 3) for the destination project and rename the file to destination.json.
Install the firebase-admin npm package: npm i firebase-admin.
Write the following code in index.js:
const firebase = require('firebase-admin');

var serviceAccountSource = require("./source.json");
var serviceAccountDestination = require("./destination.json");

const sourceAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountSource),
  databaseURL: "https://**********.firebaseio.com" // replace with source
});

const destinationAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountDestination),
  databaseURL: "https://$$$$$.firebaseio.com" // replace with destination
}, "destination");

const collections = ["books", "authors", ...]; // replace with your collections

var source = sourceAdmin.firestore();
var destination = destinationAdmin.firestore();

collections.forEach(colName => {
  source.collection(colName).get().then(function(querySnapshot) {
    querySnapshot.forEach(function(doc) {
      destination.collection(colName).doc(doc.id).set({ ...doc.data() });
    });
  });
});
Open any of your client-side Firebase apps (React, Angular, etc.). Use this code anywhere to log the JSON to the console and copy it:
// db is your initialized firebase.firestore() instance
const products = await db
  .collection("collectionName")
  .where("time", ">", new Date("2020-09-01"))
  .get()
const json = JSON.stringify(products.docs.map((doc) => ({ ...doc.data() })))
console.log(json)
Documents can also be downloaded as JSON via the REST API.
This is an example using curl in conjunction with the Cloud SDK to obtain an access token:
curl -H "Authorization: Bearer "$(gcloud auth print-access-token) \
"https://firestore.googleapis.com/v1/projects/$PROJECT/databases/(default)/documents/$COLLECTION/$DOCUMENT"
I found an easier solution. There is a tool called Firefoo. It lists all the collection documents along with the users created with multiple providers (email & password, phone number, Google, Facebook, etc.). You can export data as JSON or CSV, and you can also view the data in simplified formats like Table, Tree, and JSON.
Note: you don't have to go through the whole process of importing or exporting data from your Firebase console.

Using mongofiles with GridFS in a meteor app

I am starting to use GridFS within a Meteor app. I have set up the file collection "assetFiles" with a GridFS storage adapter like this:
AssetCollection = new Mongo.Collection( "assets" );
AssetFileStore = new FS.Store.GridFS( "assetFiles" );
AssetFilesCollection = new FS.Collection( "assetFiles", {
  stores: [AssetFileStore]
});

AssetFilesCollection.allow({
  insert: function(){
    return true;
  },
  update: function(){
    return true;
  },
  remove: function(){
    return true;
  },
  download: function(){
    return true;
  }
});
I have inserted some files into it and, using the meteor mongo client, checked that they actually exist in the db.
Now I would like to extract a file from this db to my file system using the mongofiles utility.
Using the Meteor MongoDB database, here is the list of collections:
meteor:PRIMARY> show collections
assets
cfs._tempstore.chunks
cfs.assetFiles.filerecord
cfs_gridfs._tempstore.chunks
cfs_gridfs._tempstore.files
cfs_gridfs.assetFiles.chunks
cfs_gridfs.assetFiles.files
meteor_accounts_loginServiceConfiguration
system.indexes
users
And I don't understand how, with the mongofiles utility, I can target my assetFiles GridFS file collection to get a particular file, or even a list of files.
Here is my attempt:
./mongofiles -h 127.0.0.1:3001 -d meteor list
2015-05-11T17:34:40.701+0200 connected to: 127.0.0.1:3001
It just returns nothing while successfully connecting to db.
My db is on my own file system. I wanted to specify a collection name, but apparently this parameter does not exist anymore.
Thank you for your help!
You need to change the prefix to attach to the right collection.
$ mongofiles --help
....
--prefix=    GridFS prefix to use (default is 'fs')
eg.
mongofiles --port 3001 -d meteor --prefix 'cfs_gridfs.assetFiles' list
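If you'd rather script it than use mongofiles, here is an alternative sketch using the Node.js MongoDB driver's GridFSBucket; the bucket name mirrors the cfs_gridfs.assetFiles prefix above, and the connection URL assumes Meteor's default local MongoDB port:
const { MongoClient, GridFSBucket } = require('mongodb');
const fs = require('fs');

async function listAndDownload() {
  const client = await MongoClient.connect('mongodb://127.0.0.1:3001/meteor');
  const db = client.db('meteor');
  // bucketName must match the GridFS prefix, i.e. cfs_gridfs.assetFiles.{files,chunks}
  const bucket = new GridFSBucket(db, { bucketName: 'cfs_gridfs.assetFiles' });

  // list the stored files
  const files = await bucket.find({}).toArray();
  files.forEach(f => console.log(f._id.toString(), f.filename, f.length));

  // download the first file to the local file system
  if (files.length > 0) {
    const name = files[0].filename || files[0]._id.toString();
    bucket.openDownloadStream(files[0]._id)
      .pipe(fs.createWriteStream('./' + name))
      .on('finish', () => client.close());
  } else {
    await client.close();
  }
}

listAndDownload().catch(console.error);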
hope this helps!
