I have a simple use case.
I am using S3 and DynamoDB, and now I want to send a message to a device using SNS.
This message is triggered by a DynamoDB update.
How do I do that?
Are there any examples available for this kind of problem?
We use a DynamoDB stream to trigger a Lambda function (written in JavaScript), which then uses SNS to send a message.
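A minimal sketch of such a handler, shown here in Java rather than JavaScript; the TOPIC_ARN environment variable and the message format are assumptions for illustration:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;

public class StreamToSnsHandler implements RequestHandler<DynamodbEvent, Void> {
    private static final AmazonSNS SNS = AmazonSNSClientBuilder.defaultClient();
    // Assumed: the topic ARN is supplied through an environment variable.
    private static final String TOPIC_ARN = System.getenv("TOPIC_ARN");

    @Override
    public Void handleRequest(DynamodbEvent event, Context context) {
        // One stream record arrives per item-level change on the table.
        event.getRecords().forEach(record ->
                SNS.publish(TOPIC_ARN, "Table updated: " + record.getDynamodb().getKeys()));
        return null;
    }
}

Wire the function to the table's stream as an event source, and subscribe the device endpoint to the topic.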
There is no direct SNS notification that can be generated by a DynamoDB update.
Refer to DynamoDB Streams. It is still in preview mode but will soon be available for all.
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
I need to implement publish/subscribe with DynamoDB. Each of my cloud nodes should send events to all other nodes of my application that are connected to the same DynamoDB database. A typical use case is clearing caches when data in the database has changed.
Could DynamoDB Streams be a solution for this? It looks to me as if the stream can be consumed only once, not by every node. Is that right?
Is there something like tailable cursor support in DynamoDB?
Are there any other features I could use?
DynamoDB doesn't support Pub/Sub as a feature like you might see in Redis.
If you need cache functionality, you can look at DynamoDB Accelerator (DAX). In particular, see the consistency best-practices documentation.
You can also use a dedicated Pub/Sub service such as Simple Notification Service (SNS).
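As a rough sketch of the SNS route for the cache-clearing use case (the topic ARN is a placeholder, and this assumes every node subscribes to the topic, for example through its own SQS queue):

import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;

public class CacheInvalidator {
    private static final AmazonSNS SNS = AmazonSNSClientBuilder.defaultClient();
    // Placeholder ARN for a shared invalidation topic.
    private static final String TOPIC_ARN =
            "arn:aws:sns:us-east-1:123456789012:cache-invalidation";

    // Call this after writing to DynamoDB; every subscribed node receives
    // the event and can evict the matching cache entry.
    public static void publishInvalidation(String tableName, String key) {
        SNS.publish(TOPIC_ARN,
                "{\"table\":\"" + tableName + "\",\"key\":\"" + key + "\"}");
    }
}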
On Flutter Web, I have a stream from Firestore which displays the current messages in a collection, but when the internet is not available I can't access those messages from that stream.
I decided to use a Hive box to cache the messages and return them when no internet is available.
I cache the messages using a Hive box, but it's really challenging for me to leverage providers to return the cached data first and link that up with the stream from Firestore.
I tried converting the cached data from the Hive box into a broadcast stream and combining it with the Firestore stream using ZipStreams, with a mapping, but it still doesn't work.
I would appreciate help on the best way to go about this.
Firestore has its own data persistence (read/write data when offline) and handles all the hard work, so you don't need Hive.
It works on web apps, Android, and iOS.
See:
https://firebase.google.com/docs/firestore/manage-data/enable-offline
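For illustration only: the question is about Flutter Web, where the cloud_firestore plugin exposes FirebaseFirestore.instance.enablePersistence() for this; the equivalent setting in the native Android SDK (Java) looks like the sketch below (persistence is already the default on Android and iOS):

import com.google.firebase.firestore.FirebaseFirestore;
import com.google.firebase.firestore.FirebaseFirestoreSettings;

// Enable offline persistence so cached reads keep working without a connection.
FirebaseFirestoreSettings settings = new FirebaseFirestoreSettings.Builder()
        .setPersistenceEnabled(true)
        .build();
FirebaseFirestore.getInstance().setFirestoreSettings(settings);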
I configured AWS EventBridge to post events to an SNS target. When an event is received, the SNS target is never triggered and none of the SNS subscribers get the event.
If I add other targets to EventBridge it works (e.g., Lambda), but the SNS target does not. Adding an SQS dead-letter queue shows there's a permissions issue.
However, changing the access policy of the SNS topic does not work.
If encryption is enabled for the SNS topic, disabling it is not the solution. Instead, give the events service the required access to the encrypted SNS topic.
More details at the following link:
https://aws.amazon.com/premiumsupport/knowledge-center/cloudwatch-receive-sns-for-alarm-trigger/#:~:text=If%20the%20SNS%20topic%20must,messages%20to%20encrypted%20SNS%20topics.
Replace cloudwatch.amazonaws.com with events.amazonaws.com.
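A sketch of the key-policy statement that article describes, with the principal swapped to EventBridge; the Sid is arbitrary, and the statement goes in the policy of the customer-managed KMS key that encrypts the topic:

{
  "Sid": "AllowEventBridgeToUseKey",
  "Effect": "Allow",
  "Principal": { "Service": "events.amazonaws.com" },
  "Action": ["kms:Decrypt", "kms:GenerateDataKey*"],
  "Resource": "*"
}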
Update:
It seems encryption is not supported when using the default AWS-managed KMS key. It can be configured using a customer-managed key, as explained in this other answer.
Old answer:
The problem seems to be a configuration issue, or even a bug in AWS. If the SNS topic has encryption at rest enabled, then delivery fails.
The solution is to disable encryption under the SNS topic settings.
This issue occurs when using the AWS-managed key (aws/sns).
Changing to a customer-managed key worked for me.
You can also add multiple targets to check whether events are being delivered; the easiest is a CloudWatch Logs log group.
I have a CloudWatch alarm for my S3 bucket: if there are no changes to the bucket in a day, the alarm is triggered and a message is published to an SNS topic.
I have set a CloudWatch Events rule to trigger the target SNS topic on a daily schedule when the conditions are met.
However, I am having trouble customizing the SNS message needed to provide detail in the notifications.
I have attempted to use the input transformer, but I cannot wrap my head around the keys I need to map for this service.
How can I map the required details? How can I find the key-value details to send to my input transformer to formulate a message?
The easiest method would be to trigger an AWS Lambda function that can read the incoming information, customize the content, and then send it as a message through Amazon SNS.
I don't think Amazon CloudWatch can directly trigger an AWS Lambda function, so you'll probably need two SNS topics:
One SNS topic used by CloudWatch that triggers the Lambda function
One SNS topic where the Lambda function sends the customized message and to which people can subscribe
The Lambda function can also do additional work, such as reporting on the size of the bucket and retrieving additional stats you would like mentioned.
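A minimal sketch of that middle Lambda function, in Java; the OUTPUT_TOPIC_ARN environment variable and the message text are assumptions for illustration:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;
import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;

public class AlarmFormatter implements RequestHandler<SNSEvent, Void> {
    private static final AmazonSNS SNS = AmazonSNSClientBuilder.defaultClient();
    // Assumed: the human-facing topic's ARN comes from an environment variable.
    private static final String OUTPUT_TOPIC_ARN = System.getenv("OUTPUT_TOPIC_ARN");

    @Override
    public Void handleRequest(SNSEvent event, Context context) {
        event.getRecords().forEach(record -> {
            String raw = record.getSNS().getMessage(); // the incoming JSON payload
            // Build whatever readable text you need from the raw payload here.
            SNS.publish(OUTPUT_TOPIC_ARN,
                    "No changes to the S3 bucket today.\n\nDetails: " + raw);
        });
        return null;
    }
}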
Has anyone tried to consume DynamoDB streams in Apache Flink?
Flink has a Kinesis consumer, but I am looking for a way to consume the DynamoDB stream directly.
DataStream<String> kinesis = env.addSource(new FlinkKinesisConsumer<>(
"kinesis_stream_name", new SimpleStringSchema(), consumerConfig));
I searched a lot but did not find anything. However, I found an open request pending on the Flink JIRA board, so I guess this option is not available yet? What alternatives do I have?
Allow FlinkKinesisConsumer to adapt for AWS DynamoDB Streams
UPDATED ANSWER - 2019
The FlinkKinesisConsumer connector can now process a DynamoDB stream, since this JIRA ticket was implemented.
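A sketch of what that looks like with the FlinkDynamoDBStreamsConsumer added by that ticket; the region and stream ARN are placeholders:

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kinesis.FlinkDynamoDBStreamsConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;

Properties consumerConfig = new Properties();
consumerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1");

// The source is addressed by the table's stream ARN, not a Kinesis stream name.
DataStream<String> dynamoDbStream = env.addSource(
        new FlinkDynamoDBStreamsConsumer<>(
                "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/stream/LABEL",
                new SimpleStringSchema(),
                consumerConfig));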
UPDATED ANSWER
It seems that Apache Flink does not use the DynamoDB Streams connector adapter, so it can read data from Kinesis but not from DynamoDB.
I think one option could be to implement an app that writes data from DynamoDB Streams to Kinesis, and then read the data from Kinesis in Apache Flink and process it.
Another option would be to implement a custom DynamoDB connector for Apache Flink. You can use the existing connector as a starting point.
You can also take a look at the Apache Spark Kinesis connector, but it seems to have the same issue.
ORIGINAL ANSWER
DynamoDB has a Kinesis adapter that allows you to consume a stream of DynamoDB updates using the Kinesis Client Library. Using the Kinesis adapter is the recommended way (according to AWS) of consuming updates from DynamoDB. It gives you the same data as consuming the DynamoDB stream directly (the low-level API).
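A rough sketch of the adapter approach, assuming the dynamodb-streams-kinesis-adapter library with KCL 1.x; the application name, worker id, stream ARN, and MyRecordProcessorFactory are placeholders:

import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.cloudwatch.AmazonCloudWatchClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBStreamsClientBuilder;
import com.amazonaws.services.dynamodbv2.streamsadapter.AmazonDynamoDBStreamsAdapterClient;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisClientLibConfiguration;
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.Worker;

// The adapter exposes the DynamoDB Streams API through the Kinesis interface.
AmazonDynamoDBStreamsAdapterClient adapterClient =
        new AmazonDynamoDBStreamsAdapterClient(
                AmazonDynamoDBStreamsClientBuilder.defaultClient());

KinesisClientLibConfiguration workerConfig = new KinesisClientLibConfiguration(
        "my-ddb-streams-app",                                // application name
        "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/stream/LABEL",
        new DefaultAWSCredentialsProviderChain(),
        "worker-1");                                         // worker id

// The KCL Worker then consumes the DynamoDB stream as if it were Kinesis.
Worker worker = new Worker(
        new MyRecordProcessorFactory(),  // your IRecordProcessorFactory
        workerConfig,
        adapterClient,
        AmazonDynamoDBClientBuilder.defaultClient(),    // for the KCL lease table
        AmazonCloudWatchClientBuilder.defaultClient()); // for KCL metrics
worker.run();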