Why does dynamoose store the data only for a very short time? - amazon-dynamodb

I use the simple setup from the Dynamoose page:
const dynalite = require('dynalite');
const dynamoose = require('dynamoose');

const startUpAndReturnDynamo = async () => {
  const dynaliteServer = dynalite();
  await dynaliteServer.listen(8000);
  return dynaliteServer;
};

const createDynamooseInstance = () => {
  dynamoose.AWS.config.update({
    accessKeyId: 'AKID',
    secretAccessKey: 'SECRET',
    region: 'us-east-1'
  });
  dynamoose.local(); // This defaults to "http://localhost:8000"
};

const bootStrap = async () => {
  await startUpAndReturnDynamo();
  createDynamooseInstance();
};

bootStrap();
I can save data and read it back with Model.get(hashKey), but the data only seems to persist for less than a minute; after that the same query returns undefined.
There is also a TTL (time to live) setting, but I didn't use it, so my data should stay in DynamoDB permanently, right?

I found the problem.
I was supposed to be using the remote DynamoDB, not the local one.
dynamoose.local() should be changed to dynamoose.ddb():
dynamoose.local() configures Dynamoose to use a local DynamoDB.
dynamoose.ddb() configures and returns the AWS.DynamoDB object.
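For reference, a minimal sketch of the corrected configuration, assuming the same dynamoose v1 API and the placeholder credentials from the question:
const dynamoose = require('dynamoose');

dynamoose.AWS.config.update({
  accessKeyId: 'AKID',
  secretAccessKey: 'SECRET',
  region: 'us-east-1'
});

// Use the real AWS DynamoDB endpoint instead of the local dynalite server
const ddb = dynamoose.ddb();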
The dynamoosejs documentation is very detailed, but somehow not easy for me to follow.
I'm posting this answer in case other Dynamoose newcomers run into the same problem.

Related

NextJS generate page getServerSideProps to getStaticProps [duplicate]

This question already has answers here: Internal API fetch with getServerSideProps? (Next.js) (3 answers). Closed last year.
I'm using getServerSideProps to fetch internal API data, and the TTFB is really high, so my page runs slowly.
So I'm looking for other fetching strategies. My MongoDB data is not large (database size: 33.84 KB) and doesn't change often, so I think static generation is the best fit; only about 25 pages would be generated in total. The problem is that getStaticProps() can't fetch the internal API (it works in development, but not in production).
I tried:
useEffect: slower than getServerSideProps.
Exporting the MongoDB data to data.js and bundling it into the project as a fake API: it works with getStaticProps, but I still want to store the data in the database.
Hosting the API on another domain as an external API: getStaticProps works, but the approach feels weird.
Hard-coding all 25 pages (not an option).
Questions:
How can I improve the code and the TTFB?
Why can't getStaticProps fetch the internal API, and why is it designed that way?
I found an article on MongoDB (here is the link): don't use the internal API at all, and instead fetch the data directly from MongoDB inside getStaticProps. Here is my code.
BEFORE
export async function getServerSideProps() {
  const response = await fetch(`${server}/api/gallery`);
  const data = await response.json();
  if (!data) {
    return {
      notFound: true,
    };
  }
  return {
    props: { data },
  };
}
AFTER
export async function getStaticProps() {
  await dbConnect(); // connect to MongoDB
  const gallery = await art.find(); // I use a Mongoose model to fetch the data
  return {
    props: {
      data: JSON.parse(JSON.stringify(gallery)),
    },
  };
}
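Since the data still changes occasionally, one optional addition (a sketch, not part of the original answer): getStaticProps can also return a revalidate interval (Incremental Static Regeneration), so the 25 pages are rebuilt in the background without a full redeploy. The 60-second value below is only an example.
export async function getStaticProps() {
  await dbConnect();
  const gallery = await art.find();
  return {
    props: {
      data: JSON.parse(JSON.stringify(gallery)),
    },
    // regenerate this page in the background at most once every 60 seconds (example value)
    revalidate: 60,
  };
}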

How to clear RTK Query cache in tests between requests when using MSW and Jest?

I'm using Redux Toolkit and RTK Query with MSW for mocking, but I seem to get back the same data when trying to return an error in tests. I suspect this is an issue with RTK Query's caching behavior, and I have tried to disable it with these options to the createApi method, but they don't seem to address the issue:
keepUnusedDataFor: 0,
refetchOnMountOrArgChange: true,
refetchOnFocus: true,
refetchOnReconnect: true,
The MSW documentation gives examples of how to solve this when using other libraries: https://mswjs.io/docs/faq#why-do-i-get-stale-responses-when-using-react-queryswretc
// react-query example
import { QueryCache } from 'react-query'
const queryCache = new QueryCache()
afterEach(() => {
  queryCache.clear()
})

// swr example
import { cache } from 'swr'
beforeEach(() => {
  cache.clear()
})
How could I achieve the same when using Redux Toolkit and RTK Query?
I can recommend giving the RTK Query tests a read: https://github.com/reduxjs/redux-toolkit/blob/18368afe9bd948dabbfdd9e99b9e334d9a7beedf/src/query/tests/helpers.tsx#L153-L166
This is what we do:
const refObj = {
  api,
  store: initialStore,
  wrapper: withProvider(initialStore),
}
let cleanupListeners: () => void

beforeEach(() => {
  const store = getStore() as StoreType
  refObj.store = store
  refObj.wrapper = withProvider(store)
  if (!withoutListeners) {
    cleanupListeners = setupListeners(store.dispatch)
  }
})

afterEach(() => {
  if (!withoutListeners) {
    cleanupListeners()
  }
  refObj.store.dispatch(api.util.resetApiState())
})
So you are looking for dispatch(api.util.resetApiState())
Building on the above answer, this is what I did in my app:
beforeEach(() => {
  const { result } = renderHook(() => useAppDispatch(), { wrapper });
  const dispatch = result.current;
  dispatch(myApi.util.resetApiState());
});
wrapper here is the providers for Redux and other context.
SOLUTION: This immediately removes all existing cache entries, and all queries are considered 'uninitialized'. Put the code below inside an onClick handler (or wherever fits your scenario), so that when you trigger a new request the cache is cleared as well. Here, api is the name of the API slice you set up for RTK Query in your store.
dispatch(api.util.resetApiState());
For more info, please have a look at the documentation: https://redux-toolkit.js.org/rtk-query/api/created-api/api-slice-utils
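Putting the pieces together for a Jest + MSW setup, a minimal sketch (handlers, myApi, and makeStore stand in for your own MSW handlers, RTK Query API slice, and store factory; they are assumptions, not names from the question):
import { setupServer } from 'msw/node';
import { handlers } from './mocks/handlers';   // hypothetical MSW handlers
import { myApi } from './services/myApi';      // hypothetical RTK Query API slice
import { makeStore } from './store';           // hypothetical store factory

const server = setupServer(...handlers);
let store = makeStore();

beforeAll(() => server.listen());

beforeEach(() => {
  // fresh store per test so cached query data never leaks between tests
  store = makeStore();
});

afterEach(() => {
  // reset per-test MSW handler overrides and clear the RTK Query cache
  server.resetHandlers();
  store.dispatch(myApi.util.resetApiState());
});

afterAll(() => server.close());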

Using Google Cloud Speech to Text in Firebase Cloud Functions

The Google Cloud Speech-to-Text documentation says you can access it like this:
const client = new speech.SpeechClient();
const [operation] = await client.longRunningRecognize({
  config: {
    encoding: 'LINEAR16',
    sampleRateHertz: 16000,
    languageCode: 'en-US'
  },
  audio: {
    uri: `gs://${bucket}/${name}`
  }
});
const [response] = await operation.promise();
response.results.forEach(result => {
  console.log(`Transcription: ${result.alternatives[0].transcript}`);
});
Now, I want to run this code in a Firebase Cloud Function. Unfortunately, Cloud Functions run on a version of Node that does not yet support async/await.
Some things I've tried:
Trying TypeScript, which supports async/await: I ran into a bunch of problems with some of the other APIs I'm using.
Upgrading all my functions to Node 8 (beta), which supports async/await: again, I ran into quite a few bugs on the Firebase side.
"Translating" the code manually (is this even a thing?): I tried to rewrite the code to work with plain promises.
That didn't work too well either; this is how it looks:
exports.onStorageObjectFinalize = functions.storage.object()
  .onFinalize((object) => {
    const client = new speech.SpeechClient();
    return client.longRunningRecognize({
      config: {
        encoding: 'LINEAR16',
        sampleRateHertz: 16000,
        languageCode: 'en-US'
      },
      audio: {
        uri: `gs://${object.bucket}/${object.name}`
      }
    })
    .then(r1 => {
      const [operations] = r1;
      return operations.promise();
    })
    .then(r2 => {
      const [response] = r2;
      // response.results...
      return true;
    });
  });
Edit: When the above function runs, it says there is no operations.promise(). In fact, after taking a look at the whole operations object, it doesn't look like the same kind of object at all. I did find a promise property in operations._callOptions, so I tried returning operations._callOptions.promise(), but I got a strange error: TypeError: #<CallSettings> is not a promise at client.longRunningRecognize.then.r1.
Did I mess up the translation, or would this never have worked anyway?
Are there any other things I can try, or are TypeScript and Node 8 my only two options here?
Thanks, much appreciated.
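For reference, if the Node 8 runtime route works out, the handler could look roughly like the sketch below. It is just the documented async/await snippet from the top of the question wrapped in the same onFinalize trigger, not a tested implementation:
exports.onStorageObjectFinalize = functions.storage.object()
  .onFinalize(async (object) => {
    const client = new speech.SpeechClient();
    const [operation] = await client.longRunningRecognize({
      config: {
        encoding: 'LINEAR16',
        sampleRateHertz: 16000,
        languageCode: 'en-US'
      },
      audio: {
        uri: `gs://${object.bucket}/${object.name}`
      }
    });
    // wait for the long-running operation to finish before the function exits
    const [response] = await operation.promise();
    response.results.forEach(result => {
      console.log(`Transcription: ${result.alternatives[0].transcript}`);
    });
    return null;
  });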

Add timestamp in Firestore documents

I'm a newbie to Firestore. The Firestore docs say:
Important: Unlike "push IDs" in the Firebase Realtime Database, Cloud Firestore auto-generated IDs do not provide any automatic ordering. If you want to be able to order your documents by creation date, you should store a timestamp as a field in the documents.
Reference: https://firebase.google.com/docs/firestore/manage-data/add-data
So do I have to name the field timestamp in the document, or is created enough to satisfy the statement above from the Firestore documentation?
{
  "created": 1534183990,
  "modified": 1534183990,
  "timestamp": 1534183990
}
firebase.firestore.FieldValue.serverTimestamp()
Whatever you want to call it is fine, as far as I know. Then you can use orderByChild('created').
I also mostly use firebase.database.ServerValue.TIMESTAMP when setting the time:
ref.child(key).set({
  id: itemId,
  content: itemContent,
  user: uid,
  created: firebase.database.ServerValue.TIMESTAMP
})
Use the Firestore Timestamp class, firebase.firestore.Timestamp.now(), since firebase.firestore.FieldValue.serverTimestamp() does not work with Firestore's add method (Reference).
For Firestore:
ref.doc(key).set({
  created: firebase.firestore.FieldValue.serverTimestamp()
})
REALTIME SERVER TIMESTAMP USING FIRESTORE
import firebase from "firebase/app";

const someFunctionToUploadProduct = () => {
  firebase.firestore().collection("products").add({
    name: name,
    price: price,
    color: color,
    weight: weight,
    size: size,
    createdAt: firebase.firestore.FieldValue.serverTimestamp()
  })
  .then(function(docRef) {
    console.log("Document written with ID: ", docRef.id);
  })
  .catch(function(error) {
    console.error("Error adding document: ", error);
  });
}
All you need is to import firebase and then call firebase.firestore.FieldValue.serverTimestamp() wherever you need it. Be careful with the spelling, though: it's serverTimestamp(). In this example it provides the timestamp value for createdAt when uploading to the Firestore products collection.
That's correct: like most databases, Firestore doesn't store creation times for you. In order to sort objects by time:
Option 1: Create the timestamp on the client (correctness not guaranteed):
db.collection("messages").doc().set({
  ....
  createdAt: firebase.firestore.Timestamp.now()
})
The big caveat here is that Timestamp.now() uses the local machine time. Therefore, if this runs on a client machine, you have no guarantee the timestamp is accurate. If you're setting this on a server, or if guaranteed order isn't so important, it might be fine.
Option 2: Use a timestamp sentinel:
db.collection("messages").doc().set({
  ....
  createdAt: firebase.firestore.FieldValue.serverTimestamp()
})
A timestamp sentinel is a token that tells the Firestore server to set the time server-side when the document is first written.
If you read the sentinel before it is written (e.g., in a listener), it will be null unless you read the document like this:
doc.data({ serverTimestamps: 'estimate' })
Set up your query with something like this:
// quick and dirty way, but uses local machine time
const midnight = new Date(firebase.firestore.Timestamp.now().toDate().setHours(0, 0, 0, 0));
const todaysMessages = firebase
  .firestore()
  .collection(`users/${user.id}/messages`)
  .orderBy('createdAt', 'desc')
  .where('createdAt', '>=', midnight);
Note that this query uses the local machine time (Timestamp.now()). If it's really important that your app uses the correct time on the clients, you could utilize this feature of Firebase's Realtime Database:
const serverTimeOffset = (await firebase.database().ref('/.info/serverTimeOffset').once('value')).val();
const midnightServerMilliseconds = new Date(serverTimeOffset + Date.now()).setHours(0, 0, 0, 0);
const midnightServer = new Date(midnightServerMilliseconds);
The documentation isn't suggesting the names of any of your fields. The part you're quoting is just saying two things:
The automatically generated document IDs for Firestore don't have a natural time-based ordering like they did in Realtime Database.
If you want time-based ordering, store a timestamp in the document, and use that to order your queries. (You can call it whatever you want.)
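For example, a minimal sketch of such a query with the web v8 SDK (the collection name posts and the field name created are just examples, not names from the question):
db.collection("posts")
  .orderBy("created", "desc")
  .get()
  .then(snapshot => {
    snapshot.forEach(doc => console.log(doc.id, doc.data().created));
  });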
This solution worked for me:
Firestore.instance.collection("collectionName").add({'created': Timestamp.now()});
The result in Cloud Firestore is shown in the attached screenshot (Cloud Firestore Result).
Try this one for Swift 4: Timestamp(date: Date())
let docData: [String: Any] = [
  "stringExample": "Hello world!",
  "booleanExample": true,
  "numberExample": 3.14159265,
  "dateExample": Timestamp(date: Date()),
  "arrayExample": [5, true, "hello"],
  "nullExample": NSNull(),
  "objectExample": [
    "a": 5,
    "b": [
      "nested": "foo"
    ]
  ]
]
db.collection("data").document("one").setData(docData) { err in
  if let err = err {
    print("Error writing document: \(err)")
  } else {
    print("Document successfully written!")
  }
}
The way it worked for me is just taking the timestamp from the snapshot parameter, snapshot.updateTime:
exports.newUserCreated = functions.firestore.document('users/{userId}').onCreate(async (snapshot, context) => {
  console.log('started! v1.7');
  const userID = context.params['userId'];
  firestore.collection(`users/${userID}/lists`).add({
    'created_time': snapshot.updateTime,
    'name': 'Products I ♥',
  }).then(documentReference => {
    console.log("initial public list created");
    return null;
  }).catch(error => {
    console.error('Error creating initial list', error);
    process.exit(1);
  });
});
I am using Firestore to store data that comes from a Raspberry PI with Python. The pipeline is like this:
Raspberry PI (Python using paho-mqtt) -> Google Cloud IoT -> Google Cloud Pub/Sub -> Firebase Functions -> Firestore.
Data in the device is a Python Dictionary. I convert that to JSON.
The problem I had was that paho-mqtt will only send (publish) data as a String, and one of the fields of my data is a timestamp. This timestamp is saved on the device because it accurately records when the measurement was taken, regardless of when the data is ultimately stored in the database.
When I send my JSON structure, Firestore stores my 'timestamp' field as a String. This is not convenient. So here is the solution.
I do the conversion in the Cloud Function that is triggered by Pub/Sub to write into Firestore, using the Moment library.
Note: I am getting the timestamp in python with:
currenttime = datetime.datetime.utcnow()
var moment = require('moment'); // require Moment

function toTimestamp(strDate) {
  return moment(strDate, "YYYY-MM-DD HH:mm:ss:SS");
}

exports.myFunctionPubSub = functions.pubsub.topic('my-topic-name').onPublish((message, context) => {
  let parsedMessage = null;
  try {
    parsedMessage = message.json;
    // Convert timestamp string to timestamp object
    parsedMessage.date = toTimestamp(parsedMessage.date);
    // Get the Device ID from the message. Useful when you have multiple IoT devices
    deviceID = parsedMessage._deviceID;
    let addDoc = db.collection('MyDevices')
      .doc(deviceID)
      .collection('DeviceData')
      .add(parsedMessage)
      .then((ref) => {
        console.log('Added document ID: ', ref.id);
        return null;
      }).catch((error) => {
        console.error('Failed to write database', error);
        return null;
      });
  } catch (e) {
    console.error('PubSub message was not JSON', e);
  }
  // Expected return, or a warning will be triggered in the Firebase Function logs.
  return null;
});
The Firestore method did not work for me. Use Timestamp from java.sql.Timestamp and don't cast it to a string; then Firestore formats it properly. For example, to mark now():
val timestamp = Timestamp(System.currentTimeMillis())
There are multiple ways to store time in Firestore:
The firebaseAdmin.firestore.FieldValue.serverTimestamp() method. The actual timestamp is computed when the doc is written to Firestore.
The firebaseAdmin.firestore.Timestamp.now() method.
For both methods, the next time you fetch the data it will be returned as a Firestore Timestamp object, so you first need to convert it to a native JS Date object before you can call methods on it like toISOString():
export function FStimestampToDate(
  timestamp:
    | FirebaseFirestore.Timestamp
    | FirebaseFirestore.FieldValue
): Date {
  return (timestamp as FirebaseFirestore.Timestamp).toDate();
}
Store a Unix timestamp with Date.now(); it'll be stored as a number, e.g. 1627235565028, but you won't be able to read it as a human-readable date in the Firestore console. To query on this field, you need to convert your date to a Unix timestamp first and then query.
Store new Date().toISOString(), e.g. "2021-07-25T17:56:40.373Z", but you won't be able to perform date range queries on it.
I prefer the 2nd or 3rd way.
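A quick sketch of the Unix-timestamp approach, including a range query on the numeric field (the collection name events, the field name createdAt, and the db handle are assumptions for illustration):
async function logRecentEvents() {
  // store a numeric timestamp
  await db.collection("events").add({ name: "example", createdAt: Date.now() });

  // range query: convert the boundary date to a number as well
  const dayAgo = Date.now() - 24 * 60 * 60 * 1000;
  const snapshot = await db.collection("events")
    .where("createdAt", ">=", dayAgo)
    .orderBy("createdAt", "asc")
    .get();
  snapshot.forEach(doc => console.log(doc.id, new Date(doc.data().createdAt)));
}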
According to the docs, you can "set a field in your document to a server timestamp which tracks when the server receives the update".
Example:
import { doc, updateDoc, serverTimestamp } from "firebase/firestore";

const docRef = doc(db, 'objects', 'some-id');

// Update the timestamp field with the value from the server
const updateTimestamp = await updateDoc(docRef, {
  timestamp: serverTimestamp() // this does the trick!
});
Sharing what worked for me after googling for two hours, for Firebase 9+:
import { addDoc, collection, serverTimestamp } from "firebase/firestore";

export const postData = ({ name, points }: any) => {
  const scoresRef = collection(db, "scores");
  return addDoc(scoresRef, {
    name,
    points,
    date: serverTimestamp(),
  });
};
Swift 5.1
...
"dateExample": Timestamp(date: Date()),
...
With the newest version of Firestore, you should use it as follows:
import { doc, setDoc, Timestamp } from "firebase/firestore";

const docData = {
  ...
  dateExample: Timestamp.fromDate(new Date("December 10, 1815"))
};
await setDoc(doc(db, "data", "one"), docData);
or, for a server timestamp:
import { doc, updateDoc, serverTimestamp } from "firebase/firestore";

const docRef = doc(db, 'objects', 'some-id');
const updateTimestamp = await updateDoc(docRef, {
  timestamp: serverTimestamp()
});

Calling getItem() on DynamoDB object from AWS Lambda, why doesn't my callback execute?

I'm trying to get an item from my DynamoDB database. The way my code is presently written, I fail to retrieve any data from DynamoDB. I must be doing something wrong, because as far as I can tell from my test, my callback is not being called.
I spent all day on this yesterday and have been tinkering with it unsuccessfully since I woke up this morning.
If anyone can provide insight into what I'm doing wrong here, I would be very grateful. Thanks to everyone in advance!
Final note: The timeout on the Lambda function itself is set to 5 minutes. So I don't think the Lambda function is timing out before the db query can return. When I run the function, it exits after only a moment.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB();

var response = null;
var test = false;

function getFromDB(callback) {
  const params = {
    TableName: process.env['DB_TABLE_NAME'], // evaluates to 'test-table'
    Key: {
      "id": {
        S: postId // evaluates to a big string, pulled in from an SNS message. Verified it with console.log(); it stores the expected value.
      }
    }
  };
  dynamodb.getItem(params, function(err, data) {
    if (err) callback(data, true); // an error occurred
    else callback(data, true); // successful response
  });
}

getFromDB((data, isCalled) => {
  response = data;
  test = isCalled;
});

console.log(response); // evaluates to null
console.log(test); // evaluates to false
I had faced a similar issue. I resolved it by removing async from the statement below:
exports.handler = async (event, context)
I think what's going on is that Lambda calls the function but doesn't wait for the callback, so it thinks it is done and exits.
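Building on that diagnosis, a minimal sketch of one common fix: return a Promise from the handler (here via the AWS SDK v2 .promise() helper) so the invocation stays alive until the DynamoDB call resolves. The table name mirrors the question; where postId comes from is an assumption.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB();

exports.handler = (event) => {
  const params = {
    TableName: process.env['DB_TABLE_NAME'],
    Key: {
      "id": { S: event.postId } // assumes the id arrives on the event
    }
  };
  // returning the promise keeps the Lambda invocation alive until getItem finishes
  return dynamodb.getItem(params).promise()
    .then(data => {
      console.log(data);
      return data;
    });
};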
I think I had a similar problem and resolved it by using Bluebird and async/await. I can provide a snippet from my code if you need it.
Have you loaded the SDK? I can't see it in your code snippet:
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});
EDIT: Included region
