Error 400 on Google Cloud Datastore query with sort and filter - google-cloud-datastore

I am writing a query for Google Cloud Datastore, wanting to find the entity with a certain "moduleID" and the highest "sessionID_tot" value:
$session_query = $datastore->query()
    ->kind('Session')
    ->filter('moduleID', '=', $module)
    ->order('sessionID_tot', Query::ORDER_DESCENDING)
    ->limit(1);
$session_result = $datastore->runQuery($session_query);
I get the following error.
{ "error": { "code": 400, "message": "no matching index found.
recommended index is:\n- kind: Session\n properties:\n - name:
moduleID\n - name: sessionID_tot\n direction: desc\n", "status":
"FAILED_PRECONDITION" } }
I have read all the limitations on queries here, but can't seem to find a solution all the same. It works if I remove either the ordering or the filter on "moduleID". The properties I'm sorting and filtering on are integers. Any ideas what I'm doing wrong?

As mentioned in the error, and linked to from the page you referenced, you need to create a composite index for that query.
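The error message itself spells out the index definition. As a sketch in index.yaml form (the file name and exact deployment step depend on your setup), it would look roughly like:
indexes:
- kind: Session
  properties:
  - name: moduleID
  - name: sessionID_tot
    direction: desc
Depending on your SDK version, you would then deploy it with something like gcloud datastore indexes create index.yaml and wait for the index to show as Serving before re-running the query.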

Related

on conflict mutation gives unexpected result

on_conflict returns unknown argument
I'm new to Hasura and have looked at multiple how-tos for on_conflict. I ran the mutation from the API explorer and from the frontend, and tried upsert_users (it suggests changing it to insert):
mutation upsert_users {
  insert_users(
    objects: [{
      auth0_id: "iexistindb",
      name: "somename"
    }],
    on_conflict: {
      constraint: users_pkey,
      update_columns: [last_seen, name]
    }
  ) {
    affected_rows
  }
}
I expected it to update the users table if the auth0_id already exists.
I just ran into this myself: I had the on_conflict / update_columns set, but hadn't given update permission to the role, only insert.
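For reference, a rough sketch of what the missing update permission might look like in Hasura's metadata (table and role names are placeholders, the exact file layout depends on your Hasura version, and the same thing can be configured from the console's Permissions tab):
- table:
    schema: public
    name: users
  update_permissions:
  - role: user
    permission:
      columns:
      - last_seen
      - name
      filter: {}
The columns listed here need to cover whatever you put in update_columns, otherwise the upsert cannot apply the update part.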

Simple GetItem with ctx.identity.username returns null

I'm using AppSync with IAM auth, a DynamoDB resolver, and Cognito. I'm trying to do the following:
{
  "version": "2017-02-28",
  "operation": "GetItem",
  "key": {
    "userId": $util.dynamodb.toDynamoDBJson($ctx.identity.username)
  }
}
$ctx.identity.username is supposed to contain the userId generated by Cognito, and I'm trying to use it to fetch the current user's data.
Client side, I'm using AWS Amplify, which tells me I'm currently logged in:
this.amplifyService.authStateChange$.subscribe(authState => {
  if (authState.state === 'signedIn') {
    this.getUserLogged().toPromise();
    this._isAuthenticated.next(true);
  }
});
getUserLogged is the Apollo query that is supposed to return the user data.
What I've tried:
If I leave it like this, getUserLogged returns null.
If I replace $util.dynamodb.toDynamoDBJson($ctx.identity.username) in the resolver with a known userId, like $util.dynamodb.toDynamoDBJson("b1ad0902-2b70-4abd-9acf-e85b62d06fa8"), it works! I get that user's data.
I tried the test tool on the resolver page, but it only gives fake data, so I can't rely on it.
Did I make a mistake? To me everything looks good but I guess I'm missing something?
Can I clearly see what $ctx.identity contains?
You'll want to use $ctx.identity.cognitoIdentityId to identify Cognito IAM users:
https://docs.aws.amazon.com/appsync/latest/devguide/resolver-context-reference.html#aws-appsync-resolver-context-reference-identity
You could see the contents of $ctx.identity by creating a Lambda resolver and logging the event or by creating a local resolver and returning the input that the mapping template receives:
https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-local-resolvers.html
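For example, a throwaway field backed by a None (local) data source can just echo the identity back. A sketch of the two mapping templates (the field and data source names are up to you):
Request mapping template:
{
  "version": "2017-02-28",
  "payload": $util.toJson($ctx.identity)
}
Response mapping template:
$util.toJson($ctx.result)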
My cognitoIdentityId looks like this: eu-west-1:27ca1e79-a238-4085-9099-9f1570cd5fcf
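Applied to the resolver from the question, the request mapping template would then look like this (assuming the userId attribute in DynamoDB actually stores the Cognito identity ID rather than the user pool username):
{
  "version": "2017-02-28",
  "operation": "GetItem",
  "key": {
    "userId": $util.dynamodb.toDynamoDBJson($ctx.identity.cognitoIdentityId)
  }
}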

How to filter complex object in Firebase?

Could you please help me filter the JSON below using eventId?
{
  "-Kb2dYPV0yUXpD_1moc9": {
    "eventId": 1,
    "message": "sdfsdfsf",
    "sentOn": 1485004202943
  },
  "-Kb2etFm1xHd8sSsESeK": {
    "eventId": 1,
    "message": "shdfhsf",
    "sentOn": 1485004553847
  },
  "-Kb2etKON8nWVKS2R0sj": {
    "eventId": 2,
    "message": "shdfhsf",
    "sentOn": 1485004553853
  }
}
I'm using the URL below:
https://xxxxxxxx.firebaseio.com/chats.json?orderBy="eventId"&equalTo=1
but I'm getting a "constraint index field must be a json primitive" error.
I want to retrieve the chat objects for a particular event ID. I referred to the REST API docs, but there the filtering was only explained with simple JSON. Please help.
I've run into the same error before using the Firebase REST API, and it was because I didn't have quotes around the value in the equalTo clause (e.g. equalTo="1").
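Applied to the URL from the question, that suggestion would look like the line below. This is just the answer's fix restated; whether the value actually needs quotes depends on whether eventId is stored as a number or a string in your data, and the quotes should be URL-encoded when the request is sent from code.
https://xxxxxxxx.firebaseio.com/chats.json?orderBy="eventId"&equalTo="1"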

How to delete a large node in firebase

I have a Firebase child node with about 15,000,000 child objects with a total size of about 8 GB of data.
Example data structure:
firebase.com/childNode/$pushKey
each $pushKey contains a small flat dictionary:
{a: 1.0, b: 2.0, c: 3.0}
I would like to delete this data as efficiently and easily as possible. How?
What I tried:
My first try was a PUT request:
PUT firebase.com/childNode.json?auth=FIRE_SECRET
data-raw: null
response: {
"error": "Data requested exceeds the maximum size that can be accessed with a single request. Contact support#firebase.com for help."
}
So that didn't work, let's do a limit request:
PUT firebase.com/childNode.json?auth=FIRE_SECRET&orderBy="$key"&limitToFirst=100
data-raw: null
response: {
"error": "Querying related parameters not supported on this request type"
}
No luck so far :( What about writing a script that will get the first X number of keys and then create a patch request with each value set to null?
GET firebase.com/childNode.json?auth=FIRE_SECRET&shallow=true&orderBy="$key"&limitToLast=100
{
"error" : "Mixing 'shallow' and querying parameters is not supported"
}
This really isn't going to be easy, is it? I could drop the shallow requirement, get the keys, and finish the script that way. I was just hoping there would be an easier/more efficient way.
Another thing I tried was to create a Node script that listens for child_added and then directly tries to remove those children:
ref.authWithCustomToken(AUTH_TOKEN, function(error, authData) {
  if (error) { console.log("Login Failed!", error) }
  if (!error) { console.log("Login Succeeded!", authData) }
  ref.child("childNode").on("child_added", function(snap) {
    console.log(`found: ${snap.key()}`)
    ref.child("childNode").child(snap.key()).remove(function(err) {
      if (!err) { console.log(`deleted: ${snap.key()}`) }
    })
  })
})
This script actually hangs right now, but earlier I did receive something like a max stack limit warning from Firebase. I know this is not a Firebase problem, but I don't see a particularly easy way to solve it.
Downloading a shallow tree will download only the keys. So instead of asking the server to order and limit, you can download all the keys.
Then you can order and limit them client-side, and send delete requests to Firebase in batches.
You can use this script for inspiration: https://gist.github.com/wilhuff/b78e7391396e09f6c614
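A rough sketch of that approach against the REST API (assuming Node 18+ for the built-in fetch; DB_URL, AUTH and the batch size are placeholders to adjust):
// Fetch all keys shallowly, then delete them in batches by PATCHing null values.
const DB_URL = 'https://your-db.firebaseio.com/childNode'; // placeholder
const AUTH = 'FIRE_SECRET';                                // placeholder
const BATCH_SIZE = 1000;

async function deleteChildNode() {
  // shallow=true returns only the keys, not the data underneath them
  const res = await fetch(`${DB_URL}.json?auth=${AUTH}&shallow=true`);
  const keys = Object.keys((await res.json()) || {});

  for (let i = 0; i < keys.length; i += BATCH_SIZE) {
    const updates = {};
    for (const key of keys.slice(i, i + BATCH_SIZE)) {
      updates[key] = null; // writing null deletes the key
    }
    await fetch(`${DB_URL}.json?auth=${AUTH}`, {
      method: 'PATCH',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(updates),
    });
  }
}

deleteChildNode().catch(console.error);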
Use the Firebase CLI tool for this: firebase database:remove --project .
In the browser console, this is the fastest way:
database.ref('data').limitToFirst(10000).once('value', snap => {
  var updates = {};
  snap.forEach(child => {
    updates[child.key] = null;
  });
  database.ref('data').update(updates);
});

Google Cloud Datastore runQuery returning 412 "no matching index found"

** UPDATE **
Thanks to Alfred Fuller for pointing out that I need to create a manual index for this query.
Unfortunately, using the JSON API, from a .NET application, there does not appear to be an officially supported way of doing so. In fact, there does not officially appear to be a way to do this at all from an app outside of App Engine, which is strange since the Cloud Datastore API was designed to allow access to the Datastore outside of App Engine.
The closest hack I could find was to POST the index definition using RPC to http://appengine.google.com/api/datastore/index/add. Can someone give me the raw spec for how to do this exactly (i.e. URL parameters, what exactly should the body look like, etc), perhaps using Fiddler to inspect the call made by appcfg.cmd?
** ORIGINAL QUESTION **
According to the docs, "a query can combine equality (EQUAL) filters for different properties, along with one or more inequality filters on a single property".
However, this query fails:
{
  "query": {
    "kinds": [
      {
        "name": "CodeProse.Pogo.Tests.TestPerson"
      }
    ],
    "filter": {
      "compositeFilter": {
        "operator": "and",
        "filters": [
          {
            "propertyFilter": {
              "operator": "equal",
              "property": {
                "name": "DepartmentCode"
              },
              "value": {
                "integerValue": "123"
              }
            }
          },
          {
            "propertyFilter": {
              "operator": "greaterThan",
              "property": {
                "name": "HourlyRate"
              },
              "value": {
                "doubleValue": 50
              }
            }
          },
          {
            "propertyFilter": {
              "operator": "lessThan",
              "property": {
                "name": "HourlyRate"
              },
              "value": {
                "doubleValue": 100
              }
            }
          }
        ]
      }
    }
  }
}
with the following response:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "FAILED_PRECONDITION",
        "message": "no matching index found.",
        "locationType": "header",
        "location": "If-Match"
      }
    ],
    "code": 412,
    "message": "no matching index found."
  }
}
The JSON API does not yet support local index generation, but we've documented a process that you can follow to generate the xml definition of the index at https://developers.google.com/datastore/docs/tools/indexconfig#Datastore_Manual_index_configuration
Please give this a shot and let us know if it doesn't work.
This is a temporary solution that we hope to replace with automatic local index generation as soon as we can.
The error "no matching index found." indicates that an index needs to be added for the query to work. See the auto index generation documentation.
In this case you need an index with the properties DepartmentCode and HourlyRate (in that order).
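In index.yaml form (a sketch based on the kind and property names from the query above), that index would look something like:
indexes:
- kind: CodeProse.Pogo.Tests.TestPerson
  properties:
  - name: DepartmentCode
  - name: HourlyRate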
For gcloud-node, I fixed it with these three links:
https://github.com/GoogleCloudPlatform/gcloud-node/issues/369
https://github.com/GoogleCloudPlatform/gcloud-node/blob/master/system-test/data/index.yaml
and, most importantly, this link for writing your index.yaml file:
https://cloud.google.com/appengine/docs/python/config/indexconfig#Python_About_index_yaml
As explained in the last link, an index is what allows complex queries to run faster, by storing the query's result set in the index. When you get "no matching index found", it means you tried to run a complex query involving ordering or filtering. So to make your query work, you need to create the index in Google Datastore by manually writing a config file that defines the indexes representing the query you are trying to run. Here is how to fix it:
Create an index.yaml file in a folder (named, for example, indexes) in your app directory, following the directives for the Python config file: https://cloud.google.com/appengine/docs/python/config/indexconfig#Python_About_index_yaml or get inspiration from the gcloud-node tests at https://github.com/GoogleCloudPlatform/gcloud-node/blob/master/system-test/data/index.yaml
Create the indexes from the config file with this command:
gcloud preview datastore create-indexes indexes/index.yaml
(see https://cloud.google.com/sdk/gcloud/reference/preview/datastore/create-indexes)
Wait for the indexes to serve in your developer console under Cloud Datastore/Indexes; the interface should display "Serving" once the index is built.
Once it is serving, your query should work.
For example, for this query:
var q = ds.createQuery('project')
  .filter('tags =', category)
  .order('-date');
index.yaml looks like:
indexes:
- kind: project
  ancestor: no
  properties:
  - name: tags
  - name: date
    direction: desc
Try not ordering the result; after removing the ordering, it worked for me.
