How Do I Change An Existing DynamoDB Table's Pricing Model in AWS Amplify? - amazon-dynamodb

Problem: AWS Amplify has built all my tables with the "On-Demand" pricing model.
How can I change "On-Demand" to provisioned and set the read and write capacity units?
Requirements:
Cannot lose data in the table
Has to be done following infrastructure-as-code principles, where I run amplify push apiName to push the new changes

You cannot set the billing mode of your DynamoDB table directly using the Amplify CLI; however, you can override the base functionality by extending the Amplify CloudFormation stack with the CDK (Cloud Development Kit).
With the Storage module you can run amplify override storage to create an override.ts file, where you add custom CDK TypeScript code that overrides the base functionality of the Storage resources created via the Amplify CLI.
These changes are appended to your existing Amplify CloudFormation stack and applied when you run amplify push to provision your stack.
For your DynamoDB tables you can customize the properties of the dynamoDBTable resource and override only the attributes you need. For example, to switch to provisioned capacity you would set the billingMode and override the provisionedThroughput property with your desired read and write capacity units. Below is an example of TypeScript code which shows this.
import { AmplifyDDBResourceTemplate } from '@aws-amplify/cli-extensibility-helper';

export function override(resources: AmplifyDDBResourceTemplate) {
  // Switch from on-demand to provisioned billing and set example capacity values.
  resources.dynamoDBTable.billingMode = 'PROVISIONED';
  resources.dynamoDBTable.provisionedThroughput = { readCapacityUnits: 5, writeCapacityUnits: 5 };
}
https://docs.amplify.aws/cli/storage/override/#customize-amplify-generated-s3-resources

Related

Is there a way to read environment variables inside override.ts with AWS Amplify Auth

I recently started using AWS Amplify Auth for social login.
For the social provider settings, I'm trying to use amplify override auth.
The docs are here: https://docs.amplify.aws/cli/auth/override/
For security reasons, I don't want to write secrets such as the client ID and client secret inside override.ts.
Is it possible to read environment variables in override.ts, or is there another approach?
The Amplify CLI keeps project information, such as the environment and details of other resources, in amplify/backend/amplify-meta.json.
I imported amplify-meta.json as a module.
It contains a StackName with the value amplify-[PROJECT_NAME]-[ENVIRONMENT_NAME]-[PROJECT_NUMBER], so we can get the environment name by deconstructing that string.
override.ts
import { AmplifyAuthCognitoStackTemplate } from '@aws-amplify/cli-extensibility-helper';

export function override(resources: AmplifyAuthCognitoStackTemplate) {
  const amplifyMetaJson = require('../../../amplify-meta.json');
  // StackName looks like amplify-[PROJECT_NAME]-[ENVIRONMENT_NAME]-[PROJECT_NUMBER];
  // the second-to-last segment is the environment name.
  const envName = amplifyMetaJson.providers.awscloudformation.StackName.split('-').slice(-2, -1).pop();
  console.log('Environment for CloudFormation => ', envName);
}
Note: this is a temporary, somewhat hacky workaround; it would be better for the underlying issue to be fixed.
https://github.com/aws-amplify/amplify-cli/issues/9063

Is there a way to add new field to all of the documents in a firestore collection?

I have a collection that needs to be updated: I need to add a new field and populate it based on an existing field.
Let's say I have a collection called documents:
documents/{documentId}: {
  existingField: ['foo', 'bar'],
  myNewField: ['foo', 'bar']
}
documents/{anotherDocumentId}: {
  existingField: ['baz'],
  myNewField: ['baz']
}
// ... and so on
I already tried firing up a local Cloud Function from the emulator that loops over each document and writes to production data based on the logic I need. The problem is that the function can only live for a maximum of 30 seconds. What I need is some kind of console tool that I can run as an admin (using a service account) to quickly handle such updates.
How do you handle such cases?
Firebase does not provide a console or tool to do migrations.
You can write a program that runs on your development machine and uses one of the backend SDKs (like the Firebase Admin SDK) to query, iterate, and update the documents, and let it run as long as you want.
There is nothing specific built into the API for this type of data migration. You'll have to update each document in turn, which typically involves also reading all documents (or at least their IDs).
While it is possible to do this on Cloud Functions, I find it easier to do it with a local Node.js script, as that doesn't have the runtime limits Cloud Functions imposes.
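As an illustration, here is a minimal sketch of such a local migration script, written in TypeScript against the Firebase Admin SDK. The documents collection and the existingField/myNewField names come from the question; the service-account path and the batching details are assumptions, not something prescribed by Firebase beyond the 500-operation batch limit.
import * as admin from 'firebase-admin';

// Initialize with a service-account key so the script runs with admin privileges.
admin.initializeApp({
  credential: admin.credential.cert('./service-account.json'), // path is an assumption
});

const db = admin.firestore();

async function migrate(): Promise<void> {
  // Read every document in the collection (this also fetches their data).
  const snapshot = await db.collection('documents').get();

  let batch = db.batch();
  let pending = 0;

  for (const doc of snapshot.docs) {
    // Derive the new field from the existing one, as in the example above.
    batch.update(doc.ref, { myNewField: doc.get('existingField') });
    pending++;

    // Firestore batches are limited to 500 operations; flush and start a new one.
    if (pending === 500) {
      await batch.commit();
      batch = db.batch();
      pending = 0;
    }
  }

  if (pending > 0) {
    await batch.commit();
  }
  console.log(`Updated ${snapshot.size} documents`);
}

migrate().catch(console.error);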

How to set up a database on startup in AWS amplify?

I have an Amplify app and I understand how to add an API function, use a Lambda layer, etc. What I don't see is how to create the database on startup: it appears from the documentation that this is done from a CloudFormation stack, but I still can't see how to ensure that the database is set up on startup of the app (or that the tables are built if they don't exist) using something like SQLAlchemy.
What's the intended flow here?

Does aws appsync have scan operations to scan dynamoDB

I am building a serverless web app with AWS Amplify, GraphQL, and DynamoDB. I want to know what exactly a scan operation is in this context. For example, I have a User table, and the queries listUsers and getUser were generated from the Amplify schema. Are they scan operations or queries?
Thank you in advance for your answers; I could only find the definition of a scan operation, but there are no examples that help me identify one when it comes to GraphQL.
Amplify uses filter expressions, which are a type of query.
You can see this yourself by looking at the .vtl files that Amplify generates and uploads to AppSync.
They are located here: amplify/#current-cloud-backend/api/[API NAME]/build/resolvers
In that folder you can open one of the Query.list[Model].req.vtl files. Even if you are not familiar with the Velocity Template Language, you can still get the idea: you can see that it uses the expression $util.transform.toDynamoDBFilterExpression.
More information on that util is in the AppSync resolver mapping template utility reference, under the docs for toDynamoDBFilterExpression.
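For context, here is a hedged sketch (in TypeScript, using the Amplify JS client) of calling the generated listUsers query with a filter. The User model comes from the question; the field selection and filter values are purely illustrative. The filter argument is what gets passed through $util.transform.toDynamoDBFilterExpression in Query.listUsers.req.vtl.
import { API, graphqlOperation } from 'aws-amplify';

// listUsers is the query Amplify generates for a model named User.
// The selected fields below are illustrative; use the fields from your own schema.
const listUsersQuery = /* GraphQL */ `
  query ListUsers($filter: ModelUserFilterInput, $limit: Int, $nextToken: String) {
    listUsers(filter: $filter, limit: $limit, nextToken: $nextToken) {
      items { id }
      nextToken
    }
  }
`;

async function fetchUsers() {
  // The filter object is turned into a DynamoDB filter expression by the generated resolver.
  const result = await API.graphql(
    graphqlOperation(listUsersQuery, { filter: { id: { beginsWith: 'user-' } }, limit: 20 })
  );
  console.log(result);
}

fetchUsers().catch(console.error);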

AWS Java SDK - How to Connect a dynamodb through AppSync

We have a table in DynamoDB and we need to fetch data from the table through AppSync using the AWS Java SDK.
AWSAppSync awsAppSyncClient;

public UserManagementAppSyncService(AWSCredentialsProvider credentialsProvider) {
    this.awsAppSyncClient = AWSAppSyncClientBuilder.standard()
            .withRegion("eu-central-1")
            .withCredentials(credentialsProvider)
            .build();
}
How do I proceed from here? I cannot find any further example code for this. Any leads?
The resolvers were generated from a GraphQL schema using a schema-first approach. It is a Maven project, and the generated resolvers are in the target folder.
You do not actually need to run your own Java code to connect AppSync to DynamoDB.
Instead, you attach resolvers to your schema and configure your mappings using the VTL language; what it can do is covered in the AppSync resolver mapping template reference.
When you attach a resolver to a GraphQL field in the AppSync console, you can select a sample template that fetches a DynamoDB item by its id.
If you really have to run your own code with custom business logic, you can make use of Lambda resolvers, or, if you want to skip the mapping template work, you can use Direct Lambda resolvers.
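If you do go the Lambda resolver route, here is a minimal sketch of a handler (written in TypeScript rather than Java for brevity; the table name and key shape are assumptions, and the region is taken from the question) that AppSync could invoke to fetch an item from DynamoDB:
import { DynamoDBClient, GetItemCommand } from '@aws-sdk/client-dynamodb';
import { unmarshall } from '@aws-sdk/util-dynamodb';

// Region taken from the question; the table and key names below are illustrative.
const client = new DynamoDBClient({ region: 'eu-central-1' });

// With a (Direct) Lambda resolver, AppSync passes the GraphQL arguments in event.arguments.
export const handler = async (event: { arguments: { id: string } }) => {
  const result = await client.send(
    new GetItemCommand({
      TableName: 'Users',
      Key: { id: { S: event.arguments.id } },
    })
  );
  // Return a plain object; AppSync maps it onto the GraphQL type.
  return result.Item ? unmarshall(result.Item) : null;
};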