I'm seeing .project().by() traversals returning {} under Gremlin JS 3.4.0. When I downgrade to 3.2.10 they work correctly.
gremlin> g.addV("trip").property(single, "trackName", "Ohio")
==>v[1]
In Gremlin JS `3.4.0`:
const result = await g.V("1").project("trackName").by("trackName").next();
result:
{
"value": {},
"done": false
}
but when I downgrade to Gremlin 3.2.10 the result is correct:
{
"value": {
"trackName": "Ohio"
},
"done": false
}
Do I need to change how I use project in 3.4.0?
EDIT: Results from testing against different versions. I ran each test for a gremlin version, captured results, then bumped up the version and ran the tests again. I am only running a single Neptune instance so we can be sure this is the same data.
A failing test means it returned data in the form of:
"results": {
"value": {},
"done": false
}
For the console testing I removed the final .next().
The environment I am testing in is:
AWS Lambda Node 8.10
AWS Neptune 1.0.1.0
EDIT 2: Adding the JS files used during the Neptune test.
index.js
const gremlin = require("gremlin");
const { DriverRemoteConnection } = gremlin.driver;
const { Graph } = gremlin.structure;
const initGremlinClient = () => {
try {
const dc = new DriverRemoteConnection(
`ws://my-cluster.XXXXXXX.us-east-1.neptune.amazonaws.com:8182/gremlin`,
{}
);
const graph = new Graph();
return {
g: graph.traversal().withRemote(dc),
closeGremlinConnection: () => dc.close()
};
} catch (error) {
console.log("[GREMLIN INIT ERROR]", error);
throw new Error(error);
}
};
exports.handler = async event => {
const { g, closeGremlinConnection } = initGremlinClient();
const result = await g
.addV("test")
.property("myProp", "myValue")
.project("myProp")
.by("myProp")
.next();
closeGremlinConnection();
return result;
};
package.json
{
"name": "gremlinTest",
"version": "1.0.0",
"main": "index.js",
"license": "MIT",
"dependencies": {
"gremlin": "3.4.0"
}
}
I spoke with someone on the AWS team. There is a bug affecting interoperability between Gremlin ^3.3.5 and Lambda. Specifically, the issue is with the underlying GraphSON v3 engine and how Lambda parses JSON.
The temporary workaround is to fall back to GraphSON v2 when instantiating DriverRemoteConnection:
const dc = new DriverRemoteConnection(
`ws://my-neptune-cluster.us-east-1.neptune.amazonaws.com:8182/gremlin`,
{ mimeType: "application/vnd.gremlin-v2.0+json" } // Fall back to GraphSON v2
);
Edit: This issue still exists as of gremlin#3.4.6.
Related
I use DynamoDB with Node.js in a Lambda function deployed with Serverless.
When I scan items from my local computer it works, but when I deploy my function the scan does not respond. No errors.
const docClient = new AWS.DynamoDB.DocumentClient({
apiVersion: "2012-08-10",
});
const checkApiKey = async (apiKey, ) => {
try {
log.debug("before scan");
let result = await docClient
.scan({
"MY_TABLE",
FilterExpression: "#apiKey = :apiKey",
ExpressionAttributeNames: {
"#apiKey": "apiKey",
},
ExpressionAttributeValues: { ":apiKey": apiKey },
})
.promise();
log.debug("after scan");
} catch (error) {
log.error("Can not get dynamo object", { message: error.message });
throwError(error);
}
};
When I call this function on AWS, I can see "before scan" in my log, but I don't see "after scan" nor the error message from the catch.
DynamoDB operations like create work fine.
I have been looking for a solution for several days, without success.
I'm not sure if it's causing your problem, but the first thing that stuck out to me is how you are defining the table name in the call to scan:
.scan({
"MY_TABLE",
...
According to the docs, that should be a key/value pair
.scan({
TableName: "MY_TABLE",
...
If you are using the Serverless Framework, do you get different results if you run the function local vs remote?
For example, running the function locally from the command line:
sls invoke local --function <FUNCTION NAME from serverless.yml>
vs running the function remotely (in AWS) from the command line
sls invoke --function <FUNCTION NAME from serverless.yml>
Please try adding callbacks to the promise returned by docClient.scan().promise().
I mean:
let result = await docClient
.scan({
TableName: "MY_TABLE",
FilterExpression: "#apiKey = :apiKey",
ExpressionAttributeNames: {
"#apiKey": "apiKey",
},
ExpressionAttributeValues: { ":apiKey": apiKey },
})
.promise()
.then(data => console.log(data))
.catch(error => console.error(error))
and check the result; maybe then the picture becomes clearer.
I seem to be having asynchronous problems. I'm using React, Express, Sequelize, and MariaDB for the entire app, and axios on the front end to make the GET request. The GET request always returns an empty value, even though I know from my backend logs that the request is calling the database findAll().
Front-end (React/Axios)
componentDidMount() {
this.getPets();
}
getPets = async () => {
try {
const resp = await axios.get('/getdogs');
this.setState({ pets: resp.body });
console.log(resp.data);
} catch (err) {
// Handle Error Here
console.error(err);
}
}
server.js
app.get('/getdogs', (req, res) => {
console.log("IN THE FUNCTION");
const pets = db.getPets();
console.log("All pets:", JSON.stringify(pets, null, 2));
res.send(pets);
});
database.js
async function getPets() {
const pets = await Pets.findAll();
console.log("All pets:", JSON.stringify(pets, null, 2));
return JSON.stringify(pets, null, 2);
}
output from server.js
[nodemon] starting `node server.js`
Listening on port 5000
IN THE FUNCTION
All pets: {}
warning: please use IANA standard timezone format ('Etc/GMT0')
warning: please use IANA standard timezone format ('Etc/GMT0')
Executing (default): SELECT `id`, `name`, `createdAt`, `updatedAt` FROM `Pets` AS `Pets`;
All pets: [
{
"id": 1,
"name": "HULK",
"createdAt": "2020-09-15T23:09:43.000Z",
"updatedAt": "2020-09-15T23:09:43.000Z"
},
{
"id": 2,
"name": "Martha",
"createdAt": "2020-09-15T23:09:43.000Z",
"updatedAt": "2020-09-15T23:09:43.000Z"
},
{
"id": 3,
"name": "Bernie",
"createdAt": "2020-09-15T23:09:43.000Z",
"updatedAt": "2020-09-15T23:09:43.000Z"
}
]
An axios response does not have a body property. The response payload is in the data property; see the Response Schema docs.
this.setState({ pets: resp.data });
console.log(resp.data);
You haven't awaited a result from DB:
db.getPets().then(pets => {
console.log("All pets:", JSON.stringify(pets, null, 2));
res.send(pets);
});
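Equivalently, the handler can await the result with async/await. A self-contained sketch, where `getPets` is a stub standing in for the asker's Sequelize-backed `db.getPets()` and the response object is faked so the snippet runs outside Express:

```javascript
// Stub standing in for the Sequelize-backed db.getPets()
async function getPets() {
  return [{ id: 1, name: "HULK" }]; // stand-in for Pets.findAll()
}

async function handler(req, res) {
  // Without await, `pets` would be a pending Promise and serialize to {}
  const pets = await getPets();
  res.send(JSON.stringify(pets));
}

// Minimal fake response object to run the sketch outside Express
handler({}, { send: body => console.log(body) });
```

The key point is the same as the `.then()` version: the response must not be sent until the database promise has resolved.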
Presently I am using a janusgraph-0.2.0-hadoop2 server and the gremlin#2.6.0 library for querying.
const Gremlin = require("gremlin");
const client = Gremlin.createClient(8182, "192.168.0.103");
function test(q){
client.execute(q, {}, (err, results) => {
if (err) {
console.error(err);
client.closeConnection();
}
else {
console.log(results);
client.closeConnection();
}
});
}
for query g.V().count() result is [ 12 ]
for query g.V().has('name', 'saturn').valueMap() result is [ { name: [ 'saturn' ], age: [ 10000 ] } ]
I am ok with that
But after updating my JanusGraph server to janusgraph-0.5.0-hadoop2, still using the same gremlin#2.6.0 library, I get the data in a different format:
for query g.V().count() result is [ { '@type': 'g:Int64', '@value': 12 } ]
for query g.V().has('name', 'saturn').valueMap() result is
[
{ '@type': 'g:Map', '@value': [ 'name', [Object], 'age', [Object] ] }
]
After updating the library to gremlin#3.4.6:
const gremlin = require('gremlin');
const client = new gremlin.driver.Client('ws://192.168.0.106:8182/gremlin', { traversalSource: 'g' });
async function test(q){
const res = await client.submit(q, {});
console.log('res',res)
client.close();
}
test("g.V().count()");
for query g.V().count() result is [ 12 ]
for query g.V().has('name', 'saturn').valueMap() result is [ Map { 'name' => [ 'saturn' ], 'age' => [ 10000 ] } ]
The data now comes back as a Map.
I want to know:
1. Is it necessary to update the gremlin library to 3.4.6 to get correct results?
2. After updating to 3.4.6 I get the data as a Map; I want to know whether I am getting the correct data or not.
3. I want the data as a plain object but get a Map. I know I can convert it, but if the data contains nested Maps I don't want to recurse through and convert each one by hand.
Please give me suggestions.
I would say it is a very good idea to be on the current version of JanusGraph. Note that you should use the Gremlin libraries that ship with JanusGraph rather than updating them independently. The most recent JavaScript/Node Gremlin clients do return Map types, as you are seeing.
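To address point 3, a small recursive helper can flatten nested Maps without hand-converting each level. This is a sketch (not part of the gremlin library); `mapToObject` is a hypothetical name:

```javascript
// Recursively convert ES2015 Map results (as returned by gremlin 3.4.x
// for valueMap() etc.) into plain objects, including nested Maps and
// Maps inside arrays.
function mapToObject(value) {
  if (value instanceof Map) {
    const obj = {};
    for (const [k, v] of value.entries()) {
      obj[k] = mapToObject(v); // recurse into nested Maps
    }
    return obj;
  }
  if (Array.isArray(value)) {
    return value.map(mapToObject);
  }
  return value; // primitives pass through unchanged
}

// Example shaped like the valueMap() result in the question
const result = new Map([["name", ["saturn"]], ["age", [10000]]]);
console.log(mapToObject(result)); // { name: [ 'saturn' ], age: [ 10000 ] }
```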
I am having an issue getting results back from my AppSync API via AWSAppSyncClient. I can run the query in the AWS AppSync console and get the complete results, however when I run the query from my client the portion of the results I am looking for returns an empty array.
I have tried slimming down the query to return less results, as I read at one point that dynamo will run a filter on the results being returned if you do not provide your own. I have also read this could have something to do with the partition keys used in the dynamoDB table, however AppSync provisioned that resource for me and handled the initial config. I am new to working with AppSync so I am sort of drawing a blank on where to even start looking for the issue because there is not even an error message.
The Query I am running
export const getUserConversations = `query getUser($id: ID!) {
getUser(id: $id) {
id
conversations {
items {
conversation{
id
associated{
items{
convoLinkUserId
}
}
}
}
}
}
}
`;
Call being made in a redux actions file
export const getUserConvos = (id) => async dispatch => {
AppSyncClient.query({
query: gql(getUserConversations),
variables: {
id: id
}
}).then(res => {
console.log("RES FROM CONVO QUERY", res)
})
}
This is the response I am getting in the browser
Notice conversations.items returns an empty array.
getUser:
conversations:
items: []
__typename: "ModelConvoLinkConnection"
__proto__: Object
id: "HIDDEN_ID"
__typename: "User"
__proto__: Object
__proto__: Object
However, if I run the exact same query in the playground on the AppSync console I get this...
{
"data": {
"getUser": {
"id": "HIDDEN_ID",
"conversations": {
"items": [
{
"conversation": {
"id": "HIDDEN_ID",
"associated": {
"items": [
{
"convoLinkUserId": "HIDDEN_ID"
},
{
"convoLinkUserId": "HIDDEN_ID"
}
]
}
}
},
{
"conversation": {
"id": "HIDDEN_ID",
"associated": {
"items": [
{
"convoLinkUserId": "HIDDEN_ID"
},
{
"convoLinkUserId": "HIDDEN_ID"
}
]
}
}
}
]
}
}
}
}
*HIDDEN_ID is a placeholder
I know the objects are in my DB, yet running the query from my React application returns nothing, while running it in the AWS console returns the full result. I need to be able to access these conversations from the client. What could be causing this?
There are multiple offer documents, and each offer contains a bidderId that references a user in the users collection by ID.
I want to fetch the offer list together with the referenced user documents.
I am using AngularFire, and here is my code.
this.liveOffers=this.db.collection("offers",ref => ref.where('offerExpired', '==', 0).where('isStart', '==', 1)).snapshotChanges().pipe(
map(actions => actions.map(a => {
const data = a.payload.doc.data() as offer;
const id = a.payload.doc.id;
var bidder=this.db.doc(data.bidderId).snapshotChanges().subscribe(key=>{
console.log(key.payload.data());
});
return { id, ...data,bidder };
})) );
Here console.log(key.payload.data()); logs the user data, but it cannot be bound to the bidder variable, so I cannot use the user object on the front end.
Please let me know how I can fetch the offer records with user details.
You need to use a combination of switchMap and combineLatest to get it done.
This is a pseudo-code approach
const temp = []
this.offers$ = this.db.collection("offers").snapshotChanges().pipe(
map(auctions=>{
//we save all auctions in temp and return just the bidderId
return auctions.map(auction=>{
const data = auction.payload.doc.data() as offer;
const id = auction.payload.doc.id;
temp.push({id, ...data})
return data.bidderId
})
}),
switchMap(bidderIds=>{
// here you'll have all bidderIds and you need to return the array to query
// them to firebase
return combineLatest(bidderIds.map(bidderId => this.db.doc(bidderId).valueChanges()))
}),
map(bidders=>{
// here you'll get all bidders; you'll have to set the bidder on each
// temp obj you saved previously
})
)
Make sure you import { combineLatest } from 'rxjs' (the creation function), not the deprecated operator from 'rxjs/operators', since it is called standalone here.
I found a way. It is working, but I think it is a bit big and there might be a way to optimize it. Also, it is for the Node.js server API, not for the web (JS); again, there might be a similar solution for the web.
Once you get the data from the snapshot object, there is a _path key in the object returned by data(), which in turn has segments, an array containing the collection and the document ID.
const gg = await firestore.collection('scrape').doc('x6F4nctCD').get();
console.log(JSON.stringify(gg.data(), null, 4));
console.log(gg.data().r._path.segments[0]);
console.log(gg.data().r._path.segments[1]);
const gg2 = await firestore
.collection(gg.data().r._path.segments[0])
.doc(gg.data().r._path.segments[1])
.get();
console.log(gg2.data());
{
"fv": {
"_seconds": 1578489994,
"_nanoseconds": 497000000
},
"tsnowpo": 1578489992,
"createdAt": {
"_seconds": 1578489992,
"_nanoseconds": 328000000
},
"r": {
"_firestore": {
"_settings": {
"libName": "gccl",
"libVersion": "3.1.0",
"servicePath": "firestore.googleapis.com",
"port": 443,
"clientConfig": {},
"scopes": [
"https://www.googleapis.com/auth/cloud-platform",
"https://www.googleapis.com/auth/datastore"
]
},
"_settingsFrozen": true,
"_serializer": {},
"_projectId": "sss",
"_lastSuccessfulRequest": 1578511337407,
"_preferTransactions": false,
"_clientPool": {
"concurrentOperationLimit": 100,
"activeClients": {},
"terminated": false
}
},
"_path": {
"segments": [
"egpo",
"TJTHMkxOx1C"
]
}
}
}
egpo
TJTHMkxOx1C
{ name: 'Homeware' }
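One likely optimization: in the Node.js Admin SDK, a reference field such as `r` above is deserialized as a DocumentReference, which already exposes `get()`, so the collection/doc lookup via `_path.segments` can usually be replaced by `const gg2 = await gg.data().r.get();`. A self-contained sketch of that pattern with a stubbed reference (no Firestore connection; `FakeDocumentReference` and `resolveReferenceField` are illustrative names):

```javascript
// Stub standing in for Firestore's DocumentReference: the real class
// already exposes get(), so the path never needs rebuilding by hand.
class FakeDocumentReference {
  constructor(data) { this._data = data; }
  async get() {
    // mimics DocumentReference.get() resolving to a DocumentSnapshot
    return { data: () => this._data };
  }
}

async function resolveReferenceField(snapshotData) {
  // Follow the reference field directly instead of re-deriving the
  // collection and document ID from _path.segments
  const referenced = await snapshotData.r.get();
  return referenced.data();
}

const fakeData = { r: new FakeDocumentReference({ name: "Homeware" }) };
resolveReferenceField(fakeData).then(d => console.log(d)); // { name: 'Homeware' }
```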