We are using LocalStack and having issues interacting with DynamoDB using the AWS js SDK. We are getting an "UnknownError" when trying to use the putItem method of the DynamoDB instance. Here is the error output:
{
  "name": "DynamoDbError",
  "data": {
    "dynamoDbPutObjectParameters": {
      "TableName": "fooTableName",
      "Item": {
        "hello": {
          "S": "world"
        }
      }
    }
  },
  "baseError": {
    "message": null,
    "code": "UnknownError",
    "time": "2020-11-23T11:40:26.382Z",
    "statusCode": 500,
    "retryable": true
  }
}
We have set the endpoint in the DynamoDB instance options to be:
http://localhost:4566
4566 is the Edge port we have set for the DynamoDB service in LocalStack.
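For reference, here is a minimal sketch of how we point the SDK client at that edge port. The region and dummy credentials are assumptions for illustration; LocalStack does not validate either:

```javascript
// Hypothetical client options, assuming the AWS SDK for JavaScript v2 ("aws-sdk").
// LocalStack routes all services through the single edge port, 4566 by default.
const localStackOptions = {
  endpoint: 'http://localhost:4566',
  region: 'us-east-1',       // any region string; LocalStack does not check it
  accessKeyId: 'test',       // dummy credentials, accepted by LocalStack
  secretAccessKey: 'test'
};

// The options would then be passed to the client constructor:
// const AWS = require('aws-sdk');
// const dynamoDbInstance = new AWS.DynamoDB(localStackOptions);
console.log(localStackOptions.endpoint); // → http://localhost:4566
```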
We get validation errors from the SDK if our put object parameters are incorrect BUT these parameters seem to be okay. They DO work on AWS proper when deployed and the table does update.
We have deployed the DynamoDB tables locally using the AWS CDK and aws-cdk-local and WE CAN query, put and update the table in LocalStack using the AWS CLI successfully.
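For comparison, this is the shape of CLI call that does succeed against the LocalStack edge endpoint (table name and item are the ones from the error output above; the region and dummy credentials are arbitrary):

```shell
# Put the same item via the AWS CLI against the LocalStack edge port.
# Dummy credentials are fine; LocalStack does not validate them.
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test

aws dynamodb put-item \
  --endpoint-url http://localhost:4566 \
  --region us-east-1 \
  --table-name fooTableName \
  --item '{"hello": {"S": "world"}}'
```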
The issue seems to be using the AWS js SDK to interact with the DynamoDB in LocalStack.
Here is the call to putItem which works on AWS but not with LocalStack via the AWS SDK:
putItem: async (dynamoDbPutObjectParameters) => {
  try {
    const data = await dynamoDbInstance
      .putItem(dynamoDbPutObjectParameters)
      .promise();
    return data;
  } catch (error) {
    throw new DynamoDbError('Failed to put data to DynamoDB', {
      data: { dynamoDbPutObjectParameters },
      baseError: error
    });
  }
},
Does anyone have any ideas what might be going on or how to progress?
Thanks
FWIW, the issue causing the UnknownError was the putItem command parameters: we were not correctly supplying the sort key value. The LocalStack implementation of DynamoDB does not provide any detailed error output in the response, BUT when we tried a similar call on AWS proper we got detailed error output which led us to understand what was wrong with the implementation.
It's also worth noting that the detailed error information did appear in the output of the LocalStack Docker container, just not in the error response when calling the AWS JS SDK putItem command.
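For anyone hitting the same wall, the shape of the fix looked like the sketch below. The table, key names, and values here are hypothetical; the point is that when a table is defined with both a partition key and a sort key, Item must contain both attributes:

```javascript
// Hypothetical params for a table defined with partition key "pk" AND sort key "sk".
// Omitting "sk" is the kind of mistake that surfaced as a bare UnknownError from
// LocalStack but as a descriptive validation error on AWS proper.
const dynamoDbPutObjectParameters = {
  TableName: 'fooTableName',
  Item: {
    pk: { S: 'partition-value' },
    sk: { S: 'sort-value' },   // this was the attribute missing from our call
    hello: { S: 'world' }
  }
};

// The params are then passed to putItem as before:
// await dynamoDbInstance.putItem(dynamoDbPutObjectParameters).promise();
console.log('sk' in dynamoDbPutObjectParameters.Item); // → true
```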
Step 1: Automatically create a new Next.js project using the new beta app directory:
npx create-next-app@latest --experimental-app
pages/api/hello.ts
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next'
type Data = {
  name: string
}

export default function handler(
  req: NextApiRequest,
  res: NextApiResponse<Data>
) {
  res.status(200).json({ name: 'John Doe' })
}
This file is identical to the one created automatically by npx - there are no modifications.
I am trying to build a simple home page which fetches data from the API, which in turn gets data from my database. Either way, async/await will be required. I am following the instructions from here.
In this example I will demonstrate that even awaiting the supplied hello api can't seem to run in production, and I can't work out why.
app/page.tsx
async function getHelloAsync() {
  const res = await fetch('http://localhost:3000/api/hello', { cache: 'no-store' });
  // The return value is *not* serialized
  // You can return Date, Map, Set, etc.

  // Recommendation: handle errors
  if (!res.ok) {
    // This will activate the closest `error.js` Error Boundary
    throw new Error('Failed to fetch data');
  }

  return res.json();
}

export default async function Page() {
  const hello = await getHelloAsync();

  return (
    <main>
      <h1>Hello: {hello.name}</h1>
    </main>
  )
}
To test that the hello API works, I run pn run dev and then curl http://localhost:3000/api/hello, and the following successful response is received:
{"name":"John Doe"}
Next up we exit the dev server and run:
pn run build
The first headache is that the build will fail outright unless one adds { cache: 'no-store' } to the fetch command:
const res = await fetch('http://localhost:3000/api/hello', { cache: 'no-store' });
or adds this to the top of app/page.tsx:
export const fetchCache = 'force-no-store';
I am actually not sure how one would even build this if you wanted to cache the response or use revalidate instead and provide an initial optimistic response, because without cache: 'no-store' it refuses to build outright. Ideally it should just cache the result from /api/hello and not fail. Running the dev server at the same time as doing the build does allow the build to work, but as soon as you exit the dev server and run pn run start then all the API calls fail anyway. So that is not a good approach.
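For what it's worth, the cached alternative would look something like this sketch, using the app directory's extended fetch options instead of opting out of caching. The 60-second window is an arbitrary choice, and I have not verified that this avoids the build failure in the beta:

```javascript
// Sketch: cache the /api/hello response and revalidate it in the background,
// instead of disabling caching entirely with { cache: 'no-store' }.
async function getHelloAsync() {
  const res = await fetch('http://localhost:3000/api/hello', {
    next: { revalidate: 60 } // re-fetch at most once every 60 seconds
  });
  if (!res.ok) {
    throw new Error('Failed to fetch data');
  }
  return res.json();
}
```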
This leads us to the next problem - why are the api calls not working in production (i.e. when calling pn run start).
Step 2:
pn run build
pn run start
Confirm that the following still works and yes it does:
curl http://localhost:3000/api/hello
Result:
{"name":"John Doe"}
Now we visit http://localhost:3000 in a browser but, surprise! We get the following error:
> next start
ready - started server on 0.0.0.0:3000, url: http://localhost:3000
warn - You have enabled experimental feature (appDir) in next.config.js.
warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
info - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
(node:787) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
TypeError: fetch failed
at Object.fetch (node:internal/deps/undici/undici:11118:11)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async getHelloAsync (/Users/username/nextjstest/.next/server/app/page.js:229:17)
at async Page (/Users/username/nextjstest/.next/server/app/page.js:242:19) {
cause: Error: connect ECONNREFUSED ::1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1300:16)
at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
errno: -61,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '::1',
port: 3000
}
}
[Error: An error occurred in the Server Components render. The specific message is omitted in production builds to avoid leaking sensitive details. A digest property is included on this error instance which may provide additional details about the nature of the error.] {
digest: '3567993178'
}
Why is it saying that the connection is refused when we know the API is available? I can't get this to run at all. I know this is beta, but surely the code should actually run, right? How do I make this code work?
Also, if anyone knows where the logs are that I'm supposed to be accessing to see digest '3567993178', please let me know.
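One guess based on the stack trace itself, not a verified fix: the error is connect ECONNREFUSED ::1:3000, while next start reports listening on 0.0.0.0 (IPv4). Recent Node versions can resolve localhost to the IPv6 loopback ::1 first, so pinning the IPv4 loopback address explicitly might sidestep the refusal:

```javascript
// Guess: fetch via the IPv4 loopback address so the request cannot resolve
// to ::1, which the trace shows being refused.
async function getHelloAsync() {
  const res = await fetch('http://127.0.0.1:3000/api/hello', { cache: 'no-store' });
  if (!res.ok) {
    throw new Error('Failed to fetch data');
  }
  return res.json();
}
```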
I'm using localStorage on the server and it works fine locally. But when I deploy my code to Deno Deploy, localStorage is not defined.
Do I need to import localStorage? On deno.com I couldn't find any docs about localStorage, so maybe that feature is not supported yet. In that case, where can I deploy my code to use it? Thanks
import { Handlers, PageProps } from "$fresh/server.ts";

interface Data {
  email: string[]
}

export const handler: Handlers<Data> = {
  GET(_req, ctx) {
    const emailsStorage = localStorage.getItem("email");
    const email = emailsStorage ? JSON.parse(emailsStorage) : [];
    console.log(email);
    return ctx.render({ email });
  },
};
export default function EmailPage({ data }: PageProps<Data>) {
  const { email } = data;
  return (
    <main>
      <h1>Emails</h1>
      <ul>
        {email.map((item) => (
          <li key={item}>{item}</li>
        ))}
      </ul>
    </main>
  );
}
The full list of available APIs is here (note that localStorage is not listed).
Deploy does not offer any persistent data storage mechanism. After your deployed code finishes executing in response to a request, all of the JS memory is destroyed, so if you want to work with mutable data that persists between requests, then you'll have to store that data yourself elsewhere — e.g. by sending the data in a network request to another server / hosted database / etc. and then requesting it when you need it.
The docs include several "persist data" tutorials that you can use as a guide/reference in order to learn.
You can persist data in localStorage by creating a virtual localStorage using this code:
import { installGlobals } from "https://deno.land/x/virtualstorage@0.1.0/mod.ts";
installGlobals();
localStorage.getItem("email") will work on Deno Deploy also.
My DataStore is not syncing with DynamoDB for some reason.
I've read every issue on Stack Overflow to see if I can find a resolution, but no dice.
There are no errors.
Hub is showing the events firing.
Here is an example of the issue:
try {
  const result = await DataStore.save(
    new Employer({
      name: 'Test Employer',
      rate: 123.45,
    }),
  )
  console.log('Employer saved successfully!')
  console.log(result)
  // const employer = await DataStore.query(Employer)
  // console.log('EMPLOYER = ')
  // console.log(employer)
} catch (err) {
  console.log('ERROR: An error occurred during getEmployer')
  console.log('Error message was ' + JSON.stringify(err))
}
DataStore never seems to sync with DynamoDB.
Other than that everything is fine. No issues. DataStore contains the correct data.
The only difference between my project and the code examples is that I used Amplify Studio to build the data model and performed an amplify pull to update the project.
When I do an "amplify status" the API looks correct.
The aws-exports.js file seems to be correct.
It contains entries for Auth, API, Storage etc.
Auth is working correctly.
What am I missing?
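One thing worth checking, offered as a guess rather than a confirmed fix: DataStore only syncs to DynamoDB through an AppSync GraphQL API, so aws-exports.js needs the aws_appsync_* entries, not just Auth and Storage. A quick sanity check might look like this sketch, where sampleExports stands in for the real import from ./aws-exports:

```javascript
// Hypothetical stand-in for `import awsconfig from './aws-exports'`.
// The aws_appsync_* keys are what DataStore's sync engine relies on.
const sampleExports = {
  aws_appsync_graphqlEndpoint: 'https://example.appsync-api.us-east-1.amazonaws.com/graphql',
  aws_appsync_region: 'us-east-1',
  aws_appsync_authenticationType: 'API_KEY'
};

// Returns true only when the AppSync endpoint and region are both present.
function hasDataStoreSyncConfig(cfg) {
  return Boolean(cfg.aws_appsync_graphqlEndpoint && cfg.aws_appsync_region);
}

console.log(hasDataStoreSyncConfig(sampleExports)); // → true
```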
I am trying to create a Firebase project with a Firestore instance programmatically. I've been using the firebase-tools CLI and have managed to create a new project, a web app, and get the app config, but I still need to manually enter the console and click the "Create database" button. Is it possible to automate this process?
Given that Firestore depends on an implementation of Google App Engine, I was able to programmatically create a Firestore database using the Google App Engine API:
package main

import (
	"context"
	"log"

	"google.golang.org/api/appengine/v1"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// CreateFirestoreDatabase uses the Google App Engine API
// to create a Firestore database.
func CreateFirestoreDatabase(ctx context.Context) error {
	// instantiate the service with the caller's context
	service, err := appengine.NewService(ctx)
	if err != nil {
		log.Fatalf("appengine.NewService: %v", err)
	}

	// create the App Engine application with a Firestore database
	op, err := service.Apps.
		Create(&appengine.Application{
			DatabaseType: "CLOUD_FIRESTORE",
			Id:           "your-google-project-id",
			LocationId:   "europe-west",
		}).
		Context(ctx).Do()
	if err != nil {
		return status.Errorf(
			codes.Internal,
			"service.Apps.Create: %s", err,
		)
	}

	// check the status of the long-running operation
	// TODO: loop until op.Done == true
	_ = op

	return nil
}
The same process is used by the gcloud SDK:
gcloud firestore databases create [--region=REGION]
It is useful to have a look at the underlying Python code backing the gcloud bin folder; the implementation of the gcloud firestore databases create command confirms this.
The process can be automated using terraform and the work around to configure an app engine application resource.
resource "google_app_engine_application" "app" {
  project       = "your-project-id"
  location_id   = "us-central"
  database_type = "CLOUD_FIRESTORE"
}
As stated in the answers before, Firestore seems to rely on App Engine even in the gcloud SDK.
Ugly, but it works.
How do I fetch Amazon DynamoDB data using a RESTful API?
Is there a way to get Amazon DynamoDB data using a REST url, and if so what are the required parameters to pass in the url?
We have considered using the DynamoDB endpoint as the URL and appending the access key and the secret access key; is anything more required to append to the URL?
If any one has tried this with DynamoDB RESTful API, can you give me an example of how to get table data?
A sample url would also be good, something showing how to connect to DynamoDB through a RESTful API.
Ideally, a sample url with all the parameters required.
There is AWS documentation that includes an example:
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/MakingHTTPRequests.html
You can use API gateway which points to a Lambda function to fetch the data from DynamoDB. End point of your API gateway URL will be your rest end point.
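A rough sketch of what the Lambda behind such an API Gateway route might look like. The table name, key attribute, and route shape are all made up for illustration, and the DynamoDB client is injected so the handler can be exercised without AWS credentials:

```javascript
// Hypothetical Lambda handler factory for a GET /items/{id} route.
// `dynamoDb` is expected to look like an AWS SDK v2 DynamoDB client.
function makeGetItemHandler(dynamoDb) {
  return async (event) => {
    // Look the item up by the path parameter from API Gateway.
    const result = await dynamoDb.getItem({
      TableName: 'someTableName',                        // assumed table name
      Key: { someTableKey: { S: event.pathParameters.id } }
    }).promise();

    if (!result.Item) {
      return { statusCode: 404, body: JSON.stringify({ message: 'Not found' }) };
    }
    return { statusCode: 200, body: JSON.stringify(result.Item) };
  };
}
```

API Gateway then maps the HTTP route to this handler, giving you a REST URL in front of DynamoDB without exposing any AWS credentials to the caller.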
That is not how DynamoDB works; you will have to build your own RESTful API (e.g. using the AWS SDK for PHP) that hits DynamoDB, reformats the data to however you want it, then returns it. Quite easy to do :-)
Here is a minimal JavaScript example of performing a GetItem operation on a DynamoDB table via the AWS API:
const DYNAMODB_ENDPOINT = 'https://dynamodb.us-east-1.amazonaws.com';

let aws = new AwsClient({ accessKeyId: AWS_ACCESS_KEY, secretAccessKey: AWS_SECRET_KEY });

async function dynamoDbOp(op, opBody) {
  let result = await aws.fetch(DYNAMODB_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-amz-json-1.0',
      'X-Amz-Target': 'DynamoDB_20120810.' + op
    },
    body: JSON.stringify(opBody)
  });
  return result.json();
}

let dbResponse = await dynamoDbOp('GetItem', {
  TableName: "someTableName",
  Key: {
    someTableKey: {
      S: "someKeyValue"
    }
  }
});
console.log(dbResponse); // dynamoDbOp has already parsed the JSON body
Note that AwsClient is a class defined by aws4fetch which takes care of signing AWS requests for you. See the AWS GetItem API doc for more info.