Converting a Painless script into a visualisation in Kibana (logs from AWS Connect)

I have logs being shipped from AWS Connect to Kibana through AWS OpenSearch, and I have written the following search with a Painless script field to return the latest status of each agent:
GET agent-logs-*/_search
{
  "script_fields": {
    "data": {
      "script": {
        "lang": "painless",
        "source": "params._source.CurrentAgentSnapshot.Configuration.Username + ', ' + params._source.CurrentAgentSnapshot.AgentStatus.Name + ', ' + params._source.EventTimestamp"
      }
    }
  },
  "collapse": {
    "field": "CurrentAgentSnapshot.Configuration.Username.keyword"
  },
  "sort": [
    {
      "EventTimestamp": {
        "order": "desc"
      }
    }
  ]
}
This returns the following response:
{
  "took" : 29,
  "timed_out" : false,
  "_shards" : {
    "total" : 65,
    "successful" : 65,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 10000,
      "relation" : "gte"
    },
    "max_score" : null,
    "hits" : [
      {
        "_index" : "agent-logs-2022-06-28",
        "_type" : "_doc",
        "_id" : "",
        "_score" : null,
        "fields" : {
          "data" : [
            "al.pacino@email.com, Available, 2022-06-28T10:52:01.238Z"
          ],
          "CurrentAgentSnapshot.Configuration.Username.keyword" : [
            "al.pacino@email.com"
          ]
        },
        "sort" : [
          1656413521238
        ]
      },
      {
        "_index" : "agent-logs-2022-06-28",
        "_type" : "_doc",
        "_id" : "",
        "_score" : null,
        "fields" : {
          "data" : [
            "robert.deniro@email.com, Available, 2022-06-28T10:50:45.622Z"
          ],
          "CurrentAgentSnapshot.Configuration.Username.keyword" : [
            "robert.deniro@email.com"
          ]
        },
        "sort" : [
          1656413445622
        ]
      },
      {
        "_index" : "agent-logs-2022-06-26",
        "_type" : "_doc",
        "_id" : "",
        "_score" : null,
        "fields" : {
          "data" : [
            "marlon.brando@email.com, Offline, 2022-06-26T14:51:55.203Z"
          ],
          "CurrentAgentSnapshot.Configuration.Username.keyword" : [
            "marlon.brando@email.com"
          ]
        },
        "sort" : [
          1656255115203
        ]
      }
    ]
  }
}
I want to take the data lines from this JSON, e.g. "al.pacino@email.com, Available, 2022-06-28T10:52:01.238Z", and represent them in a visualisation such as a Data Table, so that I get a list of agents with their corresponding latest status.
If I build the visualisation directly on the raw agent-logs, there is a delay during which status changes and heartbeat events overlap, causing an inaccurate count per status; hence the need for this script.
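One possible direction (a sketch only, not from the original post): scripted fields on the agent-logs-* index pattern cannot read params._source, so an equivalent per-document value would have to come from doc values. Assuming keyword sub-fields exist for the username and status (as the collapse field suggests), a Painless scripted field along these lines could back a Data Table:

// Hypothetical scripted field on the agent-logs-* index pattern; field names are assumed from the question's mapping.
// Scripted fields only see doc values, hence the .keyword sub-fields instead of params._source.
def user = doc['CurrentAgentSnapshot.Configuration.Username.keyword'];
def status = doc['CurrentAgentSnapshot.AgentStatus.Name.keyword'];
if (user.size() == 0 || status.size() == 0) {
  return "";
}
return user.value + ', ' + status.value;

A Data Table split by the username keyword field, with a "Top Hit" metric on this scripted field ordered by EventTimestamp descending, would approximate the collapse behaviour of the query above.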

Related

Elasticsearch low indexing speed

I have a blog that contains 14k posts and tried to add these posts to the Elasticsearch index.
I indexed some of the posts, but it is extremely slow; I estimate it will take about 6 hours. I have applied all the performance optimization tips from the official site and, in my opinion, removed redundant data such as post meta. How can I increase the indexing speed? The index configuration is below:
{
"test-post-1" : {
"aliases" : { },
"mappings" : {
"date_detection" : false,
"properties" : {
"ID" : {
"type" : "long"
},
"guid" : {
"type" : "keyword"
},
"menu_order" : {
"type" : "long"
},
"permalink" : {
"type" : "keyword"
},
"post_content" : {
"type" : "text"
},
"post_date" : {
"type" : "date",
"format" : "yyyy-MM-dd HH:mm:ss"
},
"post_excerpt" : {
"type" : "text"
},
"post_id" : {
"type" : "long"
},
"post_mime_type" : {
"type" : "keyword"
},
"post_modified" : {
"type" : "date",
"format" : "yyyy-MM-dd HH:mm:ss"
},
"post_name" : {
"type" : "text",
"fields" : {
"post_name" : {
"type" : "text"
},
"raw" : {
"type" : "keyword",
"ignore_above" : 10922
}
}
},
"post_parent" : {
"type" : "long"
},
"post_status" : {
"type" : "keyword"
},
"post_title" : {
"type" : "text",
"fields" : {
"post_title" : {
"type" : "text",
"analyzer" : "standard"
},
"raw" : {
"type" : "keyword",
"ignore_above" : 10922
},
"sortable" : {
"type" : "keyword",
"ignore_above" : 10922,
"normalizer" : "lowerasciinormalizer"
}
}
},
"post_type" : {
"type" : "text",
"fields" : {
"post_type" : {
"type" : "text"
},
"raw" : {
"type" : "keyword"
}
}
}
}
},
"settings" : {
"index" : {
"mapping" : {
"total_fields" : {
"limit" : "5000"
},
"ignore_malformed" : "true"
},
"number_of_shards" : "1",
"provided_name" : "test-post-1",
"max_shingle_diff" : "8",
"max_result_window" : "1000000",
"creation_date" : "1582745447768",
"analysis" : {
"filter" : {
"shingle_filter" : {
"max_shingle_size" : "5",
"min_shingle_size" : "2",
"type" : "shingle"
},
"edge_ngram" : {
"min_gram" : "3",
"side" : "front",
"type" : "edgeNGram",
"max_gram" : "10"
},
"ewp_word_delimiter" : {
"type" : "word_delimiter",
"preserve_original" : "true"
},
"ewp_snowball" : {
"type" : "snowball",
"language" : "russian"
}
},
"normalizer" : {
"lowerasciinormalizer" : {
"filter" : [
"lowercase",
"asciifolding"
],
"type" : "custom"
}
},
"analyzer" : {
"ewp_lowercase" : {
"filter" : [
"lowercase"
],
"type" : "custom",
"tokenizer" : "keyword"
},
"shingle_analyzer" : {
"filter" : [
"lowercase",
"shingle_filter"
],
"type" : "custom",
"tokenizer" : "standard"
},
"default" : {
"filter" : [
"ewp_word_delimiter",
"lowercase",
"stop",
"ewp_snowball"
],
"char_filter" : [
"html_strip"
],
"language" : "russian",
"tokenizer" : "standard"
}
}
},
"number_of_replicas" : "1",
"uuid" : "cWGjSF4FQ1Or0A_0oSlA2g",
"version" : {
"created" : "7050299"
}
}
}
}
}
WordPress version: 5.3.2
Elasticsearch version: 7.5.2
Enabled plugins: ElasticPress
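Not part of the original question, but a commonly recommended tweak for a one-off bulk load like this is to disable refresh and replicas while indexing and restore them afterwards, for example:

PUT test-post-1/_settings
{
  "index": {
    "refresh_interval": "-1",
    "number_of_replicas": "0"
  }
}

(run the indexing here, ideally via batched _bulk requests rather than one document per request)

PUT test-post-1/_settings
{
  "index": {
    "refresh_interval": "1s",
    "number_of_replicas": "1"
  }
}

Whether ElasticPress exposes batching options for its sync depends on its configuration, so treat this as a sketch of the Elasticsearch side only.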

Drupal 8 Jsonapi Get request throws bad request error

I am setting up a Drupal website API using JSON:API. When accessing the link for the GET resource http://example.com/jsonapi/node/article, I get this error:
"title": "Bad Request",
"status": "400",
"detail": "The following query parameters violate the JSON:API spec: 'q'.",
Your web server seems to rewrite the request URL; in particular, it adds a q query parameter. The full error message returned by Drupal includes the full URL the application actually received, under the links.via path of the first errors object: http://207.148.125.64/jsonapi/node/article?q=%2Fjsonapi%2Fnode%2Farticle (see the full response below).
This is likely caused by a wrong configuration of the web server used to serve Drupal; in particular, the rewrite rule seems to be wrong. A similar issue has been reported in this bug.
I would recommend comparing your web server configuration against the default .htaccess provided. If you are using nginx, you might want to look at an example configuration for nginx.
$ curl http://207.148.125.64/jsonapi/node/article | json_pp
{
"errors" : [
{
"title" : "Bad Request",
"status" : "400",
"meta" : {
"exception" : "Drupal\\Core\\Http\\Exception\\CacheableBadRequestHttpException: The following query parameters violate the JSON:API spec: 'q'. in /var/www/html/modules/contrib/jsonapi/src/EventSubscriber/JsonApiRequestValidator.php:78\nStack trace:\n#0 /var/www/html/modules/contrib/jsonapi/src/EventSubscriber/JsonApiRequestValidator.php(36): Drupal\\jsonapi\\EventSubscriber\\JsonApiRequestValidator->validateQueryParams(Object(Symfony\\Component\\HttpFoundation\\Request))\n#1 [internal function]: Drupal\\jsonapi\\EventSubscriber\\JsonApiRequestValidator->onRequest(Object(Symfony\\Component\\HttpKernel\\Event\\GetResponseEvent), 'kernel.request', Object(Drupal\\Component\\EventDispatcher\\ContainerAwareEventDispatcher))\n#2 /var/www/html/core/lib/Drupal/Component/EventDispatcher/ContainerAwareEventDispatcher.php(111): call_user_func(Array, Object(Symfony\\Component\\HttpKernel\\Event\\GetResponseEvent), 'kernel.request', Object(Drupal\\Component\\EventDispatcher\\ContainerAwareEventDispatcher))\n#3 /var/www/html/vendor/symfony/http-kernel/HttpKernel.php(127): Drupal\\Component\\EventDispatcher\\ContainerAwareEventDispatcher->dispatch('kernel.request', Object(Symfony\\Component\\HttpKernel\\Event\\GetResponseEvent))\n#4 /var/www/html/vendor/symfony/http-kernel/HttpKernel.php(68): Symfony\\Component\\HttpKernel\\HttpKernel->handleRaw(Object(Symfony\\Component\\HttpFoundation\\Request), 1)\n#5 /var/www/html/core/lib/Drupal/Core/StackMiddleware/Session.php(57): Symfony\\Component\\HttpKernel\\HttpKernel->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#6 /var/www/html/core/lib/Drupal/Core/StackMiddleware/KernelPreHandle.php(47): Drupal\\Core\\StackMiddleware\\Session->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#7 /var/www/html/core/modules/page_cache/src/StackMiddleware/PageCache.php(106): Drupal\\Core\\StackMiddleware\\KernelPreHandle->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#8 /var/www/html/core/modules/page_cache/src/StackMiddleware/PageCache.php(85): Drupal\\page_cache\\StackMiddleware\\PageCache->pass(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#9 /var/www/html/modules/contrib/jsonapi/src/StackMiddleware/FormatSetter.php(45): Drupal\\page_cache\\StackMiddleware\\PageCache->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#10 /var/www/html/vendor/asm89/stack-cors/src/Asm89/Stack/Cors.php(49): Drupal\\jsonapi\\StackMiddleware\\FormatSetter->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#11 /var/www/html/core/lib/Drupal/Core/StackMiddleware/ReverseProxyMiddleware.php(47): Asm89\\Stack\\Cors->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#12 /var/www/html/core/lib/Drupal/Core/StackMiddleware/NegotiationMiddleware.php(52): Drupal\\Core\\StackMiddleware\\ReverseProxyMiddleware->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#13 /var/www/html/vendor/stack/builder/src/Stack/StackedHttpKernel.php(23): Drupal\\Core\\StackMiddleware\\NegotiationMiddleware->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#14 /var/www/html/core/lib/Drupal/Core/DrupalKernel.php(693): Stack\\StackedHttpKernel->handle(Object(Symfony\\Component\\HttpFoundation\\Request), 1, true)\n#15 /var/www/html/index.php(19): Drupal\\Core\\DrupalKernel->handle(Object(Symfony\\Component\\HttpFoundation\\Request))\n#16 {main}",
"trace" : [
{
"class" : "Drupal\\jsonapi\\EventSubscriber\\JsonApiRequestValidator",
"type" : "->",
"line" : 36,
"file" : "/var/www/html/modules/contrib/jsonapi/src/EventSubscriber/JsonApiRequestValidator.php",
"args" : [
{
"server" : {},
"headers" : {},
"cookies" : {},
"query" : {},
"request" : {},
"attributes" : {},
"files" : {}
}
],
"function" : "validateQueryParams"
},
{
"class" : "Drupal\\jsonapi\\EventSubscriber\\JsonApiRequestValidator",
"type" : "->",
"args" : [
{},
"kernel.request",
{
"_serviceId" : "event_dispatcher"
}
],
"function" : "onRequest"
},
{
"file" : "/var/www/html/core/lib/Drupal/Component/EventDispatcher/ContainerAwareEventDispatcher.php",
"args" : [
[
{
"_serviceId" : "jsonapi.custom_query_parameter_names_validator.subscriber"
},
"onRequest"
],
{},
"kernel.request",
{
"_serviceId" : "event_dispatcher"
}
],
"function" : "call_user_func",
"line" : 111
},
{
"class" : "Drupal\\Component\\EventDispatcher\\ContainerAwareEventDispatcher",
"type" : "->",
"line" : 127,
"args" : [
"kernel.request",
{}
],
"function" : "dispatch",
"file" : "/var/www/html/vendor/symfony/http-kernel/HttpKernel.php"
},
{
"file" : "/var/www/html/vendor/symfony/http-kernel/HttpKernel.php",
"args" : [
{
"attributes" : {},
"files" : {},
"request" : {},
"server" : {},
"headers" : {},
"query" : {},
"cookies" : {}
},
1
],
"function" : "handleRaw",
"class" : "Symfony\\Component\\HttpKernel\\HttpKernel",
"type" : "->",
"line" : 68
},
{
"args" : [
{
"request" : {},
"query" : {},
"headers" : {},
"cookies" : {},
"server" : {},
"files" : {},
"attributes" : {}
},
1,
true
],
"function" : "handle",
"file" : "/var/www/html/core/lib/Drupal/Core/StackMiddleware/Session.php",
"class" : "Symfony\\Component\\HttpKernel\\HttpKernel",
"line" : 57,
"type" : "->"
},
{
"line" : 47,
"type" : "->",
"class" : "Drupal\\Core\\StackMiddleware\\Session",
"function" : "handle",
"args" : [
{
"server" : {},
"query" : {},
"headers" : {},
"cookies" : {},
"request" : {},
"attributes" : {},
"files" : {}
},
1,
true
],
"file" : "/var/www/html/core/lib/Drupal/Core/StackMiddleware/KernelPreHandle.php"
},
{
"args" : [
{
"files" : {},
"attributes" : {},
"request" : {},
"headers" : {},
"query" : {},
"cookies" : {},
"server" : {}
},
1,
true
],
"function" : "handle",
"file" : "/var/www/html/core/modules/page_cache/src/StackMiddleware/PageCache.php",
"class" : "Drupal\\Core\\StackMiddleware\\KernelPreHandle",
"type" : "->",
"line" : 106
},
{
"class" : "Drupal\\page_cache\\StackMiddleware\\PageCache",
"type" : "->",
"line" : 85,
"file" : "/var/www/html/core/modules/page_cache/src/StackMiddleware/PageCache.php",
"args" : [
{
"files" : {},
"attributes" : {},
"query" : {},
"headers" : {},
"cookies" : {},
"server" : {},
"request" : {}
},
1,
true
],
"function" : "pass"
},
{
"file" : "/var/www/html/modules/contrib/jsonapi/src/StackMiddleware/FormatSetter.php",
"args" : [
{
"server" : {},
"headers" : {},
"cookies" : {},
"query" : {},
"request" : {},
"attributes" : {},
"files" : {}
},
1,
true
],
"function" : "handle",
"class" : "Drupal\\page_cache\\StackMiddleware\\PageCache",
"type" : "->",
"line" : 45
},
{
"file" : "/var/www/html/vendor/asm89/stack-cors/src/Asm89/Stack/Cors.php",
"function" : "handle",
"args" : [
{
"files" : {},
"attributes" : {},
"request" : {},
"query" : {},
"headers" : {},
"cookies" : {},
"server" : {}
},
1,
true
],
"type" : "->",
"line" : 49,
"class" : "Drupal\\jsonapi\\StackMiddleware\\FormatSetter"
},
{
"class" : "Asm89\\Stack\\Cors",
"line" : 47,
"type" : "->",
"file" : "/var/www/html/core/lib/Drupal/Core/StackMiddleware/ReverseProxyMiddleware.php",
"args" : [
{
"request" : {},
"headers" : {},
"cookies" : {},
"query" : {},
"server" : {},
"files" : {},
"attributes" : {}
},
1,
true
],
"function" : "handle"
},
{
"file" : "/var/www/html/core/lib/Drupal/Core/StackMiddleware/NegotiationMiddleware.php",
"function" : "handle",
"args" : [
{
"query" : {},
"headers" : {},
"cookies" : {},
"server" : {},
"request" : {},
"files" : {},
"attributes" : {}
},
1,
true
],
"line" : 52,
"type" : "->",
"class" : "Drupal\\Core\\StackMiddleware\\ReverseProxyMiddleware"
},
{
"file" : "/var/www/html/vendor/stack/builder/src/Stack/StackedHttpKernel.php",
"function" : "handle",
"args" : [
{
"request" : {},
"headers" : {},
"query" : {},
"cookies" : {},
"server" : {},
"files" : {},
"attributes" : {}
},
1,
true
],
"type" : "->",
"line" : 23,
"class" : "Drupal\\Core\\StackMiddleware\\NegotiationMiddleware"
},
{
"file" : "/var/www/html/core/lib/Drupal/Core/DrupalKernel.php",
"args" : [
{
"attributes" : {},
"files" : {},
"server" : {},
"headers" : {},
"query" : {},
"cookies" : {},
"request" : {}
},
1,
true
],
"function" : "handle",
"class" : "Stack\\StackedHttpKernel",
"line" : 693,
"type" : "->"
},
{
"line" : 19,
"type" : "->",
"class" : "Drupal\\Core\\DrupalKernel",
"function" : "handle",
"args" : [
{
"attributes" : {},
"files" : {},
"request" : {},
"server" : {},
"cookies" : {},
"headers" : {},
"query" : {}
}
],
"file" : "/var/www/html/index.php"
}
]
},
"detail" : "The following query parameters violate the JSON:API spec: 'q'.",
"source" : {
"file" : "/var/www/html/modules/contrib/jsonapi/src/EventSubscriber/JsonApiRequestValidator.php",
"line" : 78
},
"links" : {
"info" : {
"href" : "http://jsonapi.org/format/#query-parameters"
},
"via" : {
"href" : "http://207.148.125.64/jsonapi/node/article?q=%2Fjsonapi%2Fnode%2Farticle"
}
}
}
],
"jsonapi" : {
"version" : "1.0",
"meta" : {
"links" : {
"self" : {
"href" : "http://jsonapi.org/format/1.0/"
}
}
}
}
}
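For illustration only (not part of the original answer): a typical Drupal 8 nginx configuration passes clean URLs to index.php via $query_string, whereas an old Drupal-style rewrite injects exactly the q parameter that JSON:API rejects. A minimal sketch:

# Recommended Drupal 8 style: no 'q' parameter is added
location / {
    try_files $uri /index.php?$query_string;
}

# Legacy style that produces ?q=/jsonapi/node/article and triggers the 400 above
# rewrite ^/(.*)$ /index.php?q=$1;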

Error using elastic on a Linux server but no error on Windows

When I execute
elastic::Search(index=index,body=body,size=1000,scroll="3m")
on a Linux server, I receive the following error:
invalid char in json text. <!DOCTYPE HTML PUBLIC "-//W3C//
On Windows everything is fine. However, if I execute elastic::Search with a different body, it works. Here is the body that fails:
'{
"_source":["DOC_ID", "DELIVERY_ID",
"CONTRIB_TS", "LANG", "SYS_NOT", "SURVEIL"],
"query": {
"bool": {
"must": [
{"match_phrase": { "CONTENT" : "XXX" }}
],
"filter": [{ "term" : { "DELIVERY_ID" : "100" } },{ "term" : { "SYS_NOT" : "0" } }]
}
},
"highlight": {
"pre_tags" : [""],
"post_tags" : [""],
"fields" : {
"CONTENT": {"fragment_size" : 200}
}
}
}'
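Not from the original post, but since the response begins with an HTML doctype rather than JSON, one hedged diagnostic is to send the same body with curl from the Linux server and inspect the raw reply (proxy or gateway error pages are a common cause of this message):

curl -s -H 'Content-Type: application/json' \
     'http://<your-es-host>:9200/<index>/_search?size=1000&scroll=3m' \
     -d @body.json

Here <your-es-host>, <index> and body.json are placeholders for the connection details and the query body shown above.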

Can't write to Firebase database using rules

I'm trying to create a rule that allows some users, but not all, to write.
I need all users to be able to read the 'menu' items, but only the users listed under the store's data should be able to write.
My data structure:
{
"category" : [ null, "Burger", "Drinks" ],
"menu" : [ null, {
"available" : true,
"category" : "1",
"description" : "item1 description",
"image" : "chicken_maharaja",
"name" : "New Chicken Maharaja",
"price" : 1300,
"store" : 1
}, {
"available" : true,
"category" : "1",
"description" : "item2 description",
"image" : "big_spicy_chicken_wrap",
"name" : "Big Spicy Chicken Wrap",
"price" : 120,
"store" : 1
}, {
"available" : true,
"category" : "2",
"description" : "item3 description",
"image" : "thumsup",
"name" : "Thumsup 100ml",
"price" : 40,
"store" : 1
}, {
"available" : true,
"category" : "2",
"description" : "item4 description",
"image" : "mccafe_ice_coffee",
"name" : "Ice Coffee",
"price" : 140,
"store" : 1
}, {
"available" : true,
"category" : "1",
"description" : "item5 description",
"image" : "mc_chicken",
"name" : "MC Chicken",
"price" : 190,
"store" : 1
}, {
"available" : true,
"category" : "2",
"description" : "item6 description",
"image" : "Smoothie",
"name" : "Smoothie",
"price" : 70,
"store" : 2
}, {
"available" : true,
"category" : "1",
"description" : "item8 description",
"image" : "salad_wrap",
"name" : "Salad Wrap",
"price" : 150,
"store" : 2
} ],
"stores" : [ null, {
"location" : "Campinas - Taquaral",
"name" : "Store 1",
"user" : {
"pyixsRTw9qdiuESt62YnmEYXQt13" : true
}
}, {
"location" : "São Paulo - Perdises",
"name" : "Store 2",
"user" : {
"LBNZ8Dwp2rdJtlSh0NC1ApdtbAl2" : true,
"TLomOgrd3gbjDdpDAqGiwl0lBhn2" : true
}
} ],
"userProfile" : {
"LBNZ8Dwp2rdJtlSh0NC1ApdtbAl2" : {
"birthDate" : "1974-02-10",
"email" : "asd#asd.com",
"firstName" : "João",
"lastName" : "Silva"
},
"pyixsRTw9qdiuESt62YnmEYXQt13" : {
"birthDate" : "1974-02-10",
"email" : "leandro.garcias#gmail.com",
"firstName" : "Leandro",
"lastName" : "Garcia"
}
}
}
My rule:
{
"rules": {
"menu": {
"$items": {
".read": "true",
".write": "root.child('stores').child('1').child(data.child('user').val()).hasChild(auth.uid)"
}
},
"stores": {
"$store": {
".read": "true",
".write": "root.child('stores').child('$store').child(data.child('user').val()).hasChild(auth.uid)"
}
}
}
}
The read is ok. :-) But I can't write.
Your newData doesn't have a child named user, so that check always fails. You probably meant:
"43268522": {
"menu": {
"$items": {
".read": "true",
".write": "root.child('stores').child('1').child('user').hasChild(auth.uid)"
}
}
You're probably looking for this rule:
".write": "
root.child('stores')
.child(newData.child('store').val())
.child('user')
.hasChild(auth.uid)"
So this uses the store property from the new data to look up if the current user is in the store they're trying to modify.
Unfortunately this rule won't work with your current data structure, since the value of store is a number, while the key of a store is a string: "1" !== 1.
The simplest solution is to store the store as a string, e.g.:
"store": "1"
You might want to consider that anyway, since you're now getting Firebase's array coercion, which is not helpful. For more on this see our blog post on Best Practices: Arrays in Firebase. I'd recommend storing stores using either push IDs, or simply prefixing them, e.g.
"stores": {
"store1": {
...
}
}
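Putting the two suggestions together, a rough sketch of the adjusted structure (illustrative keys and values only) keeps the store reference as a string that matches the prefixed store key, so newData.child('store').val() in the .write rule resolves directly to it:

"menu": {
  "item1": {
    "name": "New Chicken Maharaja",
    "store": "store1"
  }
},
"stores": {
  "store1": {
    "name": "Store 1",
    "user": {
      "pyixsRTw9qdiuESt62YnmEYXQt13": true
    }
  }
}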

Querying Elasticsearch parent child documents

We work with two types of documents in Elasticsearch (ES): items and slots, where items are parents of slot documents.
We define the index with the following command:
curl -XPOST 'localhost:9200/items' -d @itemsdef.json
where itemsdef.json has the following definition
{
"mappings" : {
"item" : {
"properties" : {
"id" : {"type" : "long" },
"name" : {
"type" : "string",
"_analyzer" : "textIndexAnalyzer"
},
"location" : {"type" : "geo_point" },
}
}
},
"settings" : {
"analysis" : {
"analyzer" : {
"activityIndexAnalyzer" : {
"alias" : ["activityQueryAnalyzer"],
"type" : "custom",
"tokenizer" : "whitespace",
"filter" : ["trim", "lowercase", "asciifolding", "spanish_stop", "spanish_synonym"]
},
"textIndexAnalyzer" : {
"type" : "custom",
"tokenizer" : "whitespace",
"filter" : ["word_delimiter_impl", "trim", "lowercase", "asciifolding", "spanish_stop", "spanish_synonym"]
},
"textQueryAnalyzer" : {
"type" : "custom",
"tokenizer" : "whitespace",
"filter" : ["trim", "lowercase", "asciifolding", "spanish_stop"]
}
},
"filter" : {
"spanish_stop" : {
"type" : "stop",
"ignore_case" : true,
"enable_position_increments" : true,
"stopwords_path" : "analysis/spanish-stopwords.txt"
},
"spanish_synonym" : {
"type" : "synonym",
"synonyms_path" : "analysis/spanish-synonyms.txt"
},
"word_delimiter_impl" : {
"type" : "word_delimiter",
"generate_word_parts" : true,
"generate_number_parts" : true,
"catenate_words" : true,
"catenate_numbers" : true,
"split_on_case_change" : false
}
}
}
}
}
Then we add the child document definition using the following command:
curl -XPOST 'localhost:9200/items/slot/_mapping' -d @slotsdef.json
Where slotsdef.json has the following definition:
{
"slot" : {
"_parent" : {"type" : "item"},
"_routing" : {
"required" : true,
"path" : "parent_id"
},
"properties": {
"id" : { "type" : "long" },
"parent_id" : { "type" : "long" },
"activity" : {
"type" : "string",
"_analyzer" : "activityIndexAnalyzer"
},
"day" : { "type" : "integer" },
"start" : { "type" : "integer" },
"end" : { "type" : "integer" }
}
}
}
Finally we perform a bulk index with the following command:
curl -XPOST 'localhost:9200/items/_bulk' --data-binary @testbulk.json
Where testbulk.json holds the following data:
{"index":{"_type": "item", "_id":35}}
{"location":[40.4,-3.6],"id":35,"name":"A Name"}
{"index":{"_type":"slot","_id":126,"_parent":35}}
{"id":126,"start":1330,"day":1,"end":1730,"activity":"An Activity","parent_id":35}
I'm trying to make the following query: search for all items within a certain distance to a location that have children (slots) in the specified days and within certain start and end ranges.
An item with more slots fulfilling the condition should score higher.
I tried starting with existing samples, but the docs are really scarce and it's hard to move forward.
Clues?
I don't think there is a way to write an efficient query that would do something like this without moving location to the slots. You can do something like this, but it can be quite inefficient for some data:
{
"query": {
"top_children" : {
"type": "blog_tag",
"query" : {
"constant_score" : {
"query" : {
... your query for children goes here ...
}
}
},
"score" : "sum",
"factor" : 5,
"incremental_factor" : 2
}
},
"filter": {
"geo_distance" : {
"distance" : "200km",
"location" : {
"lat" : 40,
"lon" : -70
}
}
}
}
Basically, this query takes your range query or filter for the children, plus whatever other conditions you need, and wraps it in a constant_score query to make sure that all children have a score of 1.0. The top_children query collects all these children and accumulates their scores onto the parents. The filter then removes parents that are too far away.
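As a hedged sketch (not part of the original answer), the elided children query above could be a bool of term/range clauses over the slot fields defined in the mapping, for example:

{
  "bool": {
    "must": [
      { "term" : { "day" : 1 } },
      { "range" : { "start" : { "gte" : 1300 } } },
      { "range" : { "end" : { "lte" : 1800 } } }
    ]
  }
}

The day, start and end values here are illustrative; they would come from the search criteria.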
