How can I run DynamoDB on my local machine with a serverless API structure (Node.js)? - amazon-dynamodb

I just want to run my serverless APIs locally. How can I achieve that? Do I need to run DynamoDB locally, or can this be done without configuring DynamoDB on my machine?
I was trying the DynamoDB Local plugin, but I want DynamoDB to run on AWS (managed through the console) while my serverless APIs run on my local machine.

You can use environment variables to specify the AWS access key and secret key that your locally running APIs should use to connect to the DynamoDB service on AWS, as in the sketch below.
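One common way to do this is the serverless-offline plugin: the API Gateway/Lambda part runs locally, while the handler code talks to the real DynamoDB table in your AWS account because the AWS SDK resolves credentials from the environment. A minimal sketch, assuming the AWS SDK v2 for Node.js; the table name MyTable and the id key are placeholders:

// handler.js - runs locally under "serverless offline" but talks to DynamoDB on AWS.
// The SDK picks up AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_REGION from the environment.
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient({ region: process.env.AWS_REGION });

module.exports.getItem = async (event) => {
  const result = await dynamo
    .get({ TableName: 'MyTable', Key: { id: event.pathParameters.id } }) // placeholder table/key
    .promise();
  return { statusCode: 200, body: JSON.stringify(result.Item) };
};

Add serverless-offline to the plugins section of serverless.yml and start the API with "serverless offline"; no local DynamoDB is needed because the credentials point at the real service.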

Related

Use Firebase-functions-emulator with remote firebase services

Can I connect the firebase-functions emulator to the remote Authentication and Firestore services? Currently it connects only to the other local emulators. I don't want to set up my DB locally.
It's not possible to have the local Cloud Functions emulator respond to (or "trigger" on) events in a production database. You have to deploy your functions in order for them to execute in response to changes in the cloud-hosted database. The local functions emulator only responds to changes in the locally emulated database.
Refer to the following to understand how to connect your app to the Realtime Database Emulator.
Check a similar example here.
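For reference, pointing a client app at the Realtime Database emulator is a one-liner. A minimal sketch with the Firebase v9 modular SDK, assuming the emulator is running on its default port 9000:

// Connect the client SDK to the locally emulated Realtime Database
import { initializeApp } from 'firebase/app';
import { getDatabase, connectDatabaseEmulator } from 'firebase/database';

const app = initializeApp({ /* your firebaseConfig */ });
const db = getDatabase(app);
connectDatabaseEmulator(db, 'localhost', 9000); // default Realtime Database emulator port

Services you don't explicitly point at an emulator this way are still reached in the cloud as usual.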

How to drop and recreate local dynamodb/appsync/amplify database created by mock api?

I am using AWS Amplify to build a Web Application. I am using Appsync and DynamoDb and I've defined my GraphQL schema. Now, Amplify offers the ability to test local GraphQL endpoints by running "amplify mock api" from the command line. I did this and it successfully created some local GraphQL endpoints for me and I was able to insert some data and do some local queries. (When I ran "amplify mock api" the first time I got some messages on the console that my tables were created.)
I have since made quite significant changes to my GraphQL schema, including keys, sorting keys, etc. I don't think all of my changes successfully got applied to my local api and database tables. So I just basically want to completely delete my local "database" so that "amplify mock api" can regenerate a new local database for me based on my new schema. How do I do this? I don't know where this amplify local database resides or what underlying technology it uses. (Otherwise I would just connect directly to the database and drop all tables to force a recreation.) I have tried "amplify remove api" which removed the local endpoints. I even pushed this to AWS (I am in development mode currently, so I didn't mind destroying my AWS environment.) I then did "amplify add api" again from scratch and I typed out my schema again. But if I run "amplify mock api" then it doesn't recreate the tables. The endpoint starts up and if I perform a GraphQL query I get the data back that I originally added. Which means those tables persist.
How can I completely drop my local "mock" Amplify Appsync GraphQL endpoints and database to force a recreate? (I am using a Mac, if it's relevant).
It ended up being very simple. Amplify creates the mock data in ./amplify/mock-data. So to delete the database and recreate it I just deleted this directory in my project. This question was helpful in working out how the mock API and database setup works.
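On macOS/Linux that comes down to (path relative to the project root):

rm -rf ./amplify/mock-data

The next "amplify mock api" run then recreates the local tables from the current schema.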

Using kubernetes-secrets with Google Composer

Is it possible to use kubernetes-secrets together with Google Composer in order to access secrets from Airflow workers?
We are using k8s secrets with our existing standalone k8s Airflow cluster and were hoping we can achieve the same with Google Composer.
By default, Kubernetes secrets are not exposed to the Airflow workers deployed by Cloud Composer. You can patch the deployments to add them (airflow-worker and airflow-scheduler), but there will be no guarantee that they won't be reverted if you perform an update on the environment (such as configuration update or in-place upgrade).
It's probably easiest to use an Airflow connection (connections are encrypted in the metadata database using Fernet), or to launch new pods using KubernetesPodOperator/GKEPodOperator and mount the relevant secrets into the pod at launch.
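As a sketch of the deployment-patching route mentioned above (cluster name, zone, namespace and secret name are all placeholders, and as noted, an environment update may revert the change):

# Point kubectl at the Composer environment's GKE cluster
gcloud container clusters get-credentials <composer-gke-cluster> --zone <zone>
# Expose the keys of an existing Kubernetes secret as environment variables on the workers
kubectl set env deployment/airflow-worker --namespace=<composer-namespace> --from=secret/<your-secret>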
Kubernetes secrets are available to the Airflow workers. You can contribute the components for whatever API you wish to call to work natively in Airflow, so that the credentials can be stored as a Connection in Airflow's metadata database, which is encrypted at rest. Using an Airflow connection involves storing the secret key in GCS with an appropriate ACL and setting up Composer to secure the connection.
You can also write your own custom operator to access the secret in Kubernetes and use it. Take a look at SimpleHttpOperator - this pattern can be applied to any arbitrary secret-management scheme. This is for scenarios that access external services that aren't explicitly supported by Airflow Connections, Hooks, and Operators.
I hope it helps.

Post to Azure Cosmos Db from NiFi

I created Azure CosmosDb database and container for my documents.
I use NiFi as a main data ingestion tool and want to feed my container with documents from NiFi flow files.
Can anybody please share a way to post flowfile content to Azure Cosmos Db from NiFi?
Thanks in advance
UPDATE (2019.05.26):
In the end I used a Python script and called it from NiFi to post messages, passing the message as a parameter. The reason I chose Python is that the official Microsoft site has examples with all the required connection settings and libraries, so it was easy to connect to Cosmos.
I tried the Mongo processors, but couldn't connect to Azure (the security configuration didn't work). I didn't go much further with them, since the Python script worked just fine.
Azure Cosmos DB exposes a MongoDB API, so you can use the following MongoDB processors, which are available in NiFi, to read, query, and write to and from Azure Cosmos DB:
DeleteMongo
GetMongo
PutMongo
PutMongoRecord
RunMongoAggregation
Useful Links
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-introduction
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-feature-support
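If anyone else hits the "security config didn't work" issue from the update above: the Mongo processors need the Cosmos DB Mongo connection string with SSL enabled (account name and key below are placeholders), roughly of the form

mongodb://<account-name>:<account-key>@<account-name>.mongo.cosmos.azure.com:10255/?ssl=true&replicaSet=globaldb

which goes into the processors' Mongo URI property.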
Valeria, according to the list of Azure-related components supported by Apache NiFi, only Azure Blob Storage, Queue Storage, Event Hubs, etc. are covered, not Cosmos DB.
So I suggest using PutAzureBlobStorage to feed an Azure Blob container with documents from NiFi flow files, and then creating a copy-activity pipeline in Azure Data Factory to transfer the data from Azure Blob Storage into Azure Cosmos DB.

How to connect a database server running on local machine as a service to web application hosted on pivotal cloud foundry?

I am trying to test-run a basic .NET web application on Pivotal Cloud Foundry. This web application uses a MongoDB server hosted on my local machine as its database. At the moment I am limited to using the cloud infrastructure through the Apps Manager only.
I have read the Pivotal Cloud Foundry docs about user-provided services, but cannot figure out how the connection is actually made. I have already come across other approaches, like using MongoDB as a service (beta version), but at the moment I am not allowed access to the Operations Manager. I'm looking for an explanation of user-provided services or how to implement the Service Broker API, specifically.
I am new to MongoDB as well, so any suggestion on making the connection work by tweaking MongoDB would also help. Thanks
The use case you describe (web app in PCF connecting to a resource in your local machine) is not recommended.
You can create a MongoDB instance for development purposes in PCF.
$ cf marketplace
...
mlab sandbox Fully managed MongoDB-as-a-Service
...
You can create a mlab service and bind it to your application. You will then have a MongoDB instance in PCF that you can use for development purposes.
Edit:
In that case a user-provided service might help: you pass in your remote MongoDB instance's configuration and read it in your application, e.g.:
cf.exe cups my-mongodb -p '{"key1":"value1","key2":"value2"}'
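Once the CUPS service is bound to the app, Cloud Foundry injects the credentials through the VCAP_SERVICES environment variable. The app in the question is .NET, but as a language-neutral illustration here is a Node.js sketch; the service name my-mongodb and the uri credential key are assumptions that must match whatever was passed to cups:

// Read the user-provided service credentials injected by Cloud Foundry
const { MongoClient } = require('mongodb');

async function getDb() {
  const vcap = JSON.parse(process.env.VCAP_SERVICES || '{}');
  // CUPS services appear under the 'user-provided' label; 'my-mongodb' matches the cups name above
  const creds = (vcap['user-provided'] || []).find(s => s.name === 'my-mongodb').credentials;
  const client = new MongoClient(creds.uri); // assumes a "uri" key such as mongodb://user:pass@host:27017/db
  await client.connect();
  return client.db();
}

In the .NET app the same JSON can be parsed directly from VCAP_SERVICES, or read via Steeltoe's configuration/connector packages.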
You can add your local MongoDB as a CUPS service to your PCF Dev.
Check out the following post.
How to create a CUPS service for mongoDB?
