Using a DynamoDB client in my custom ask-sdk webhook

I have built my own custom webhook using ask-sdk and deployed it on my EC2 instance. Now I want to use DynamoDB through DynamoDbPersistenceAdapter,
but I can't find any reference on how to do that.
DynamoDbPersistenceAdapter will need AWS keys, a table name, and some other DynamoDB details, but where do I initialize them? I found some code, but it doesn't cover that:
persistenceAdapter = new DynamoDbPersistenceAdapter({
    tableName: 'global_attr_table',
    createTable: true,
    partitionKeyGenerator: keyGenerator
});

This can probably be solved by adding environment variables and setting up an AWS CLI profile.
Here's how you set up an AWS CLI profile:
https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
Once you have a profile set up with your AWS access information, you can export environment variables in your command line or in a shell script:
$> export AWS_PROFILE=YourNewAWSCLIProfileName
$> export AWS_REGION=us-east-1
$> export AWS_DEFAULT_REGION=us-east-1
You can check that these variables are set by typing:
$> echo $AWS_PROFILE
$> echo $AWS_REGION
$> echo $AWS_DEFAULT_REGION
This is what I use. If for some reason that doesn't work, here is some research into how you might add a DynamoDB client.
I was trying to solve a different problem, so let me solve yours as I walk through mine.
In node_modules/ask-sdk/dist/skill/factory/StandardSkillFactory.js
there is a reference to something similar to what you have above:
new ask_sdk_dynamodb_persistence_adapter_1.DynamoDbPersistenceAdapter({
    tableName: thisTableName,
    createTable: thisAutoCreateTable,
    partitionKeyGenerator: thisPartitionKeyGenerator,
    dynamoDBClient: thisDynamoDbClient,
})
I believe you need to create a DynamoDB client instance, which I found referenced here in the AWS SDK docs:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/dynamodb-example-document-client.html
You'd have to set your own service. In node_modules/aws-sdk/lib/dynamodb/document_client.js:
/**
 * Creates a DynamoDB document client with a set of configuration options.
 *
 * @option options params [map] An optional map of parameters to bind to every
 *   request sent by this service object.
 * @option options service [AWS.DynamoDB] An optional pre-configured instance
 *   of the AWS.DynamoDB service object to use for requests. The object may
 *   bound parameters used by the document client.
 * @option options convertEmptyValues [Boolean] set to true if you would like
 *   the document client to convert empty values (0-length strings, binary
 *   buffers, and sets) to be converted to NULL types when persisting to
 *   DynamoDB.
 * @see AWS.DynamoDB.constructor
 */
constructor: function DocumentClient(options) {
    var self = this;
    self.options = options || {};
    self.configure(self.options);
},

/**
 * @api private
 */
configure: function configure(options) {
    var self = this;
    self.service = options.service;
    self.bindServiceObject(options);
    self.attrValue = options.attrValue =
        self.service.api.operations.putItem.input.members.Item.value.shape;
},

/**
 * @api private
 */
bindServiceObject: function bindServiceObject(options) {
    var self = this;
    options = options || {};
    if (!self.service) {
        self.service = new AWS.DynamoDB(options);
    } else {
        var config = AWS.util.copy(self.service.config);
        self.service = new self.service.constructor.__super__(config);
        self.service.config.params =
            AWS.util.merge(self.service.config.params || {}, options.params);
    }
},
I'm not sure what those options might look like.
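Putting the pieces together, here is a minimal sketch of what passing a pre-configured client into the adapter might look like. The region, the table name, and the assumption that credentials come from your exported AWS_PROFILE are all placeholders to adjust:

const AWS = require('aws-sdk');
const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');

// Assumption: credentials are resolved from the AWS_PROFILE exported above,
// so only the region is set explicitly here.
const dynamoDbClient = new AWS.DynamoDB({
    apiVersion: 'latest',
    region: process.env.AWS_REGION || 'us-east-1'
});

const persistenceAdapter = new DynamoDbPersistenceAdapter({
    tableName: 'global_attr_table',
    createTable: true,
    dynamoDBClient: dynamoDbClient
});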

Related

Symfony 4: Connect to database manually / Create a dynamic entity manager

I'm working on a Symfony 4 web project, and I have a separate database for every client, so on every request I have to connect to a database based on the client id.
How can I use Doctrine to connect to a database manually?
MyController:
/**
 * @Route("/api/log", name="log", methods={"GET"})
 */
public function log(Request $request)
{
    $this->denyAccessUnlessGranted(['ROLE_CLIENT','ROLE_ADMIN']);
    $clientId = $request->query->get('client_id');
    $dbName = 'project_'.$clientId;
    // I have the database credentials: $host, $port, $username, $password & $dbName:
    $this->getDoctrine()->........
}
You can work with multiple connections and managers. Here is the official Symfony documentation.
So, if you can change config/packages/doctrine.yaml manually, that will solve your problem.
Another way is to work with the entity manager directly:
use Doctrine\ORM\Tools\Setup;
use Doctrine\ORM\EntityManager;
$paths = array('/path/to/entity/mapping/files');
$config = Setup::createAnnotationMetadataConfiguration($paths);
$dbParams = array('driver' => 'pdo_sqlite', 'memory' => true);
$entityManager = EntityManager::create($dbParams, $config);
Entity Manager API documentation
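Building on that, a minimal sketch of a per-client connection might look like the following. The pdo_mysql driver and the credential variables are assumptions taken from the question, not a verified setup:

use Doctrine\ORM\Tools\Setup;
use Doctrine\ORM\EntityManager;

// Hypothetical per-client connection: $host, $port, $username, $password
// and $clientId come from your own credential lookup.
$paths = array('/path/to/entity/mapping/files');
$config = Setup::createAnnotationMetadataConfiguration($paths);
$dbParams = array(
    'driver'   => 'pdo_mysql',
    'host'     => $host,
    'port'     => $port,
    'user'     => $username,
    'password' => $password,
    'dbname'   => 'project_'.$clientId,
);
$entityManager = EntityManager::create($dbParams, $config);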

How to pass a parameter from the Jupyter backend to a frontend extension

I currently have a value that is stored as an environment variable in the environment where a Jupyter server is running. I would like to somehow pass that value to a frontend extension. It does not have to read the environment variable in real time; I am fine with just using the value of the variable at startup. Is there a canonical way to pass parameters to a frontend extension on startup? I would appreciate examples of both setting the parameter from the backend and accessing it from the frontend.
[update]
I have posted a solution that works for nbextensions, but I can't seem to find the equivalent pattern for labextensions (TypeScript); any help there would be much appreciated.
I was able to do this by adding the following code to my jupyter_notebook_config.py:
from notebook.services.config import ConfigManager
cm = ConfigManager()
cm.update('notebook', {'variable_being_set': value})
Then I defined the parameters in my extension's main.js:
// define default values for config parameters
var params = {
    variable_being_set: 'default'
};

// to be called once config is loaded; this updates the default config values
// with the ones specified by the server's config file
var update_params = function() {
    var config = Jupyter.notebook.config;
    for (var key in params) {
        if (config.data.hasOwnProperty(key)) {
            params[key] = config.data[key];
        }
    }
};
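The update function only takes effect once the frontend has actually fetched the server-side config, so it has to be chained off the config's loaded promise. A minimal sketch of the wiring, assuming the standard nbextension entry point:

// Hypothetical entry point: run update_params only after the config
// section has been loaded from the server.
function load_ipython_extension() {
    return Jupyter.notebook.config.loaded
        .then(update_params)
        .then(function() {
            console.log('variable_being_set =', params.variable_being_set);
        });
}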
I also have the parameters declared in my main.yaml
Parameters:
- name: variable_being_set
  description: ...
  input_type: text
  default: `default_value`
This took some trial and error to find out because there is very little documentation on the ConfigManager class and none of it has an end-to-end example.

alexa-skill-local could not write to DynamoDB

I am writing a Node.js skill using ask-sdk and using alexa-skill-local to test the endpoint. I need to persist data to DynamoDB in one of the handlers, but I keep getting a "missing region" error. Please find my code below:
'use strict';

// use 'ask-sdk' if standard SDK module is installed
const Alexa = require('ask-sdk');
const { launchRequestHandler, HelpIntentHandler, CancelAndStopIntentHandler, SessionEndedRequestHandler } = require('./commonHandlers');

const ErrorHandler = {
    canHandle() {
        return true;
    },
    handle(handlerInput, error) {
        return handlerInput.responseBuilder
            .speak('Sorry, I can\'t understand the command. Please say again.')
            .reprompt('Sorry, I can\'t understand the command. Please say again.')
            .getResponse();
    },
};
////////////////////////////////
// Code for the handlers here //
////////////////////////////////
exports.handler = Alexa.SkillBuilders
    .standard()
    .addRequestHandlers(
        launchRequestHandler,
        HelpIntentHandler,
        CancelAndStopIntentHandler,
        SessionEndedRequestHandler,
        ErrorHandler
    )
    .withTableName('devtable')
    .withDynamoDbClient()
    .lambda();
And in one of the handlers I am trying to get the persisted attributes like below:
handlerInput.attributesManager.getPersistentAttributes().then((data) => {
    console.log('--- the attributes are ----', data);
});
But I keep getting the following error:
(node:12528) UnhandledPromiseRejectionWarning: AskSdk.DynamoDbPersistenceAdapter Error: Could not read item (amzn1.ask.account.AHJECJ7DTOPSTT25R36BZKKET4TKTCGZ7HJWEJEBWTX6YYTLG5SJVLZH5QH257NFKHXLIG7KREDKWO4D4N36IT6GUHT3PNJ4QPOUE4FHT2OYNXHO6Z77FUGHH3EVAH3I2KG6OAFLV2HSO3VMDQTKNX4OVWBWUGJ7NP3F6JHRLWKF2F6BTWND7GSF7OVQM25YBH5H723VO123ABC) from table (EucerinSkinCareDev): Missing region in config
at Object.createAskSdkError (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\utils\AskSdkUtils.js:22:17)
at DynamoDbPersistenceAdapter.<anonymous> (E:\projects\nodejs-alexa-sdk-v2-eucerin-skincare-dev\node_modules\ask-sdk-dynamodb-persistence-adapter\dist\attributes\persistence\DynamoDbPersistenceAdapter.js:121:45)
Can we read and write attributes from DynamoDB using alexa-skill-local? Do we need a different setup to achieve this?
Thanks
I know that this is a really old topic, but I had the same problem a few days ago, and I'm going to explain how I made it work.
You have to download DynamoDB Local and follow the instructions from here.
Once you have configured your local DynamoDB and checked that it is working, you have to pass it through your code to the DynamoDbPersistenceAdapter constructor.
Your code should look similar to:
var awsSdk = require('aws-sdk');
var myDynamoDB = new awsSdk.DynamoDB({
    endpoint: 'http://localhost:8000', // If you change the default url, change it here
    accessKeyId: <your-access-key-id>,
    secretAccessKey: <your-secret-access-key>,
    region: <your-region>,
    apiVersion: 'latest'
});

const { DynamoDbPersistenceAdapter } = require('ask-sdk-dynamodb-persistence-adapter');
return new DynamoDbPersistenceAdapter({
    tableName: tableName || 'my-table-name',
    createTable: true,
    dynamoDBClient: myDynamoDB
});
Where <your-access-key-id>, <your-secret-access-key> and <your-region> are defined in your AWS config and credentials files.
The next step is to launch your server with the alexa-skill-local command as always.
Hope this will be helpful! =)
Presumably you have an AWS config profile that your skill is using when running locally.
You need to edit your ~/.aws/config file and set the default region (e.g. us-east-1) there. The region should match the region where your table exists.
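For reference, the region entry in ~/.aws/config looks like this (us-east-1 is just an example):

[default]
region = us-east-1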
Alternatively, if you want to be able to run completely isolated, you may need to write some conditional logic and swap the Dynamo client with one targeting an instance of DynamoDB Local running on your machine.

symfony translation by using keys

If I want to translate content in Symfony, I use the translator as described in the book:
$translated = $this->get('translator')->trans('Symfony2 is great');
But if these translations already exist in a database, how can I access them?
The db looks like:
ID | locale | type    | field        | content
1  | en     | message | staff.delete | delete this user?
I will have to tell the translator where it can get the translation information. Can you help me with a good tutorial or some tips and tricks?
According to the docs you need to register a service in order to load translations from another source, such as a database:

You can also store translations in a database, or any other storage, by providing a custom class implementing the LoaderInterface interface. See the translation.loader tag for more information. (Reference)

What I have done: I have a translation bundle where my translation entity resides, so I registered a service in config.yml and passed the Doctrine entity manager (@doctrine.orm.entity_manager) in order to get data from the entity:
services:
    translation.loader.db:
        class: Namespace\TranslationBundle\Loader\DBLoader
        arguments: ['@doctrine.orm.entity_manager']
        tags:
            - { name: translation.loader, alias: db }
In the DBLoader class I fetch the translations from the database and set them on the catalogue, as described in the translation.loader docs.
My loader class:
namespace YourNamespace\TranslationBundle\Loader;

use Symfony\Component\Translation\Loader\LoaderInterface;
use Symfony\Component\Translation\MessageCatalogue;
use Doctrine\ORM\EntityManager;

class DBLoader implements LoaderInterface
{
    private $translationRepository;
    private $languageRepository;

    /**
     * @param EntityManager $entityManager
     */
    public function __construct(EntityManager $entityManager)
    {
        $this->translationRepository = $entityManager->getRepository("YourNamespaceTranslationBundle:LanguageTranslation");
        $this->languageRepository = $entityManager->getRepository("YourNamespaceTranslationBundle:Language");
    }

    public function load($resource, $locale, $domain = 'messages')
    {
        // Load from the db for the specified locale
        $language = $this->languageRepository->findOneBy(array('locale' => $locale));
        $translations = $this->translationRepository->getTranslations($language, $domain);

        $catalogue = new MessageCatalogue($locale);
        /** @var $translation YourNamespace\TranslationBundle\Entity\LanguageTranslation */
        foreach ($translations as $translation) {
            $catalogue->set($translation->getLanguageToken(), $translation->getTranslation(), $domain);
        }
        return $catalogue;
    }
}
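getTranslations() is a custom repository method not shown here; a minimal sketch of what it might look like, assuming the LanguageTranslation entity has language and domain fields (both assumptions about your schema):

// Hypothetical repository method assumed by the loader above
public function getTranslations($language, $domain)
{
    return $this->createQueryBuilder('t')
        ->where('t.language = :language')
        ->andWhere('t.domain = :domain')
        ->setParameter('language', $language)
        ->setParameter('domain', $domain)
        ->getQuery()
        ->getResult();
}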
Note: Each time you create a new translation resource (or install a bundle
that includes a translation resource), be sure to clear your cache so
that Symfony can discover the new translation resources:
php app/console cache:clear

Unit testing Symfony application using FOSElasticaBundle without an ES Server?

I have an application with an existing set of unit tests which are using SQLite as the DB. I have recently added search capabilities via ES which have replaced many of the endpoint actions that used to query the DB directly. I want to test all of the business logic involved with these endpoints without testing ES itself, which means no ES server available. I plan to test ES itself in a set of integration tests to be run less frequently.
My problem is trying to track down exactly what is going on with the execution flow.
My first inclination was to simply create a mock object of the ES Finder that FOSElasticaBundle creates for my index. Because I'm using pagination, it turned out to be more complex than I thought:
// code context: test method in a unit test extending Symfony's WebTestCase
$client = $this->getClient();
$expectedHitCount = 10;

// Set up real objects which (as far as I can tell) don't act upon the ES client
// and instead only hold / manipulate the data.
$responseString = file_get_contents(static::SEARCH_RESULT_FILE_RESOURCE);
$query = SearchRepository::getProximitySearchQuery($lat, $lng, $radius, $offset, $limit);
$response = new Response($responseString, 200);
$resultSet = new RawPartialResults(new ResultSet($response, $query));

// Create a mock pagination adapter, which is what my service expects to be
// returned from the search repository.
$adapter = $this->getMockBuilder('FOS\ElasticaBundle\Paginator\RawPaginatorAdapter')
    ->disableOriginalConstructor()
    ->getMock();
$adapter->method('getTotalHits')->will($this->returnValue($expectedHitCount));
$adapter->method('getResults')->will($this->returnValue($resultSet));
$adapter->method('getQuery')->will($this->returnValue($query));

$es = $this->getMockBuilder(get_class($client->getContainer()->get(static::ES_FINDER_SERVICE)))
    ->disableOriginalConstructor()
    ->getMock();
$es->method('createPaginatorAdapter')->will($this->returnValue($adapter));

// Replace the client container's service definition with our mock object
$client->getContainer()->set(static::ES_FINDER_SERVICE, $es);
This actually works all the way until I return the view from my controller. My service gets back the mock paginator adapter with the pre-populated result set from the JSON search response I have stored in a file (and subsequently passed into my ResultSet object). However, once I return the view, there seems to be a listener involved that tries to query ES again with the Query instead of using the ResultSet I already passed in.
I can't seem to find this listener. I also don't understand why it would try to query when a ResultSet already exists.
I am using FOSRestBundle as well, making use of its ViewListener to auto-serialize whatever I return. I don't see any suspects in that flow, either. I think it may have something to do with the serialization of the result set, but so far I haven't been able to track the offending code down.
Has anyone tried to do something similar to this before and have any suggestions on either how to debug my current setup or an alternative, better setup for mocking ES for this type of test?
After digging around I found an alternative solution that does not involve using mock objects. I am going to leave this open for the time being in case someone has a better approach, but the approach I decided to take in the meantime is to override the Client in my test environment.
FOSElasticaBundle has an example for overriding the client here: https://github.com/FriendsOfSymfony/FOSElasticaBundle/blob/master/Resources/doc/cookbook/suppress-server-errors.md
I was able to override the client in such a way that I could create a unique key from the request and then provide responses based on that key, essentially stubbing the server for all known requests. For requests that don't match I return a default empty response. This works well enough for me.
Client Code
<?php
namespace Acme\DemoBundle\Tests\Elastica;

use Elastica\Request;
use Elastica\Response;
use FOS\ElasticaBundle\Client as BaseClient;

class Client extends BaseClient
{
    /**
     * This array translates a key, which is the md5 hash of the Request::toString(),
     * into a human friendly name so that we can load the proper response from a file
     * in the file system.
     *
     * @var array
     */
    protected $responseLookup = array(
        '7fea3dda860a424aa974b44f508b6678' => 'proximity-search-response.json'
    );

    /**
     * {@inheritdoc}
     */
    public function request($path, $method = Request::GET, $data = array(), array $query = array())
    {
        $request = new Request($path, $method, $data, $query);
        $requestKey = md5($request->toString());
        $this->_log($request);
        $this->_log("Test request lookup key: $requestKey");

        if (!isset($this->responseLookup[$requestKey])
            || !$response = file_get_contents(__DIR__ . "/../DataFixtures/Resources/search/{$this->responseLookup[$requestKey]}")) {
            return $this->getNullResponse();
        }
        return new Response($response);
    }

    public function getNullResponse()
    {
        $this->_log("Returning NULL response");
        return new Response('{"took":0,"timed_out":false,"hits":{"total":0,"max_score":0,"hits":[]}}');
    }
}
Configuration Change
# file: config_test.yml
parameters:
    fos_elastica.client.class: Acme\DemoBundle\Tests\Elastica\Client
Sample Response File (proximity-search-response.json)
{
    "took": 7,
    "timed_out": false,
    "_shards": {
        "total": 5,
        "successful": 5,
        "failed": 0
    },
    "hits": {
        "total": 1,
        "max_score": null,
        "hits": [
            {
                "_index": "search",
                "_type": "place",
                "_id": "1",
                "_score": null,
                "_source": {
                    "location": "40.849100,-73.644800",
                    "id": 1,
                    "name": "My Place"
                },
                "sort": [
                    322.52855474383045
                ]
            }
        ]
    }
}
This solution works well and is fast, but the maintenance is a pain. If anything about the request changes, you need to retrieve the new request key from the log, update it in the array, and update the file with the new response data for the new request. I generally just curl the server directly and modify it from there.
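For example, to regenerate a response fixture you can run the real query against the server and save the output; the host, index, type, and file names below are assumptions based on the sample response:

curl -s 'http://localhost:9200/search/place/_search' \
    -d @proximity-search-query.json \
    > proximity-search-response.json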
I would love to see any other solutions that may be simpler, but I hope this helps someone else in the meantime!
You can try to disable the event listeners in your config_test.yml (or whatever your test environment name is):
fos_elastica:
    indexes:
        your_index_name:
            types:
                your_type_name:
                    persistence:
                        listener:
                            insert: false
                            update: false
                            delete: false
