elastic call and loops/apply in R

How do I run a loop/apply that will run an elastic search call with a different body each time?
I need to query (filtered query) an index for the word "local" for the years 2000 to 2010, and I want to store the results in R. Here I made an example with the years 2000 and 2001. I tried paste, but it mangles the JSON in the body. I can't share the es_base, unfortunately, but I would appreciate any help.
Thanks!
library(elastic)
connect(es_base = "http://...", es_port = "")

# One JSON body per year; sprintf() substitutes the year without
# disturbing the rest of the JSON, which paste() tends to do
years <- lapply(2000:2001, function(y) sprintf('{
  "query": {
    "filtered": {
      "query": {"term": {"signature_year": %d}},
      "filter": {
        "term": {"text": "local"}
      }
    }
  }
}', y))

# for() returns NULL, so collect the totals with sapply() instead
a <- sapply(years, function(b) Search(index = "index", body = b)$hits$total)

Related

Repeat values in PAW, POST request

Is it possible to repeat elements in a POST request body in PAW, and if so, how?
Basically, given the POST request BODY:
[
{ "zip":"DYNAMIC_VALUE" }
]
Can I then repeat this entry multiple times? Say I want to repeat it in the array 10 times; I would get, for example:
[
{ "zip":"1234" },
{ "zip":"2543" },
{ "zip":"6543" },
{ "zip":"7645" },
{ "zip":"2654" },
{ "zip":"7568" },
{ "zip":"5364" },
{ "zip":"1313" },
{ "zip":"5432" },
{ "zip":"5634" }
]
And maybe I want to send an array with 1000 or more objects with a dynamic zip.
How do I do that?
Thank you :)
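I don't know of a built-in repeat feature for this, but one generic workaround is to generate the body with a small script and paste (or pipe) it into the request. A minimal Python sketch; make_body and the random zip values are made up for illustration, not part of PAW:

```python
import json
import random

def make_body(n):
    """Build a list of n objects, each with a random 4-digit zip code."""
    return [{"zip": str(random.randint(1000, 9999))} for _ in range(n)]

# 10 entries for a quick test; bump n to 1000+ for the large payload
body = make_body(10)
print(json.dumps(body, indent=2))
```

The same approach scales to an array of any size, since the body is just regenerated before each send.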

Struggling with filtering Drupal data in Gatsby GraphQL query

I am using Drupal 8 JSON:API to expose data to my Gatsby site. I have built a GraphQL query to expose a list of "officers", which contains a field with a relationship to a set of "service periods". The data returned by the API is correct, but I would like to filter for only one specific child record (service period) and cannot figure out how to do that. My query is:
officerList: allGroupContentLodgeGroupNodeOfficer(filter: {relationships: {entity_id: {relationships: {field_position: {elemMatch: {relationships: {field_service_period: {drupal_internal__tid: {eq: 203}}}}}}}}}) {
edges {
node {
r: relationships {
entity_id {
r: relationships {
field_position {
r: relationships {
field_position {
name
}
field_service_period {
name
drupal_internal__tid
}
}
}
}
title
}
}
}
}
}
The resulting JSON set is:
{
"data": {
"officerList": {
"edges": [
{
"node": {
"r": {
"entity_id": {
"r": {
"field_position": [
{
"r": {
"field_position": {
"name": "Governor"
},
"field_service_period": {
"name": "2018 - 2019",
"drupal_internal__tid": 203
}
}
},
{
"r": {
"field_position": {
"name": "Junior Past Governor"
},
"field_service_period": {
"name": "2019 - 2020",
"drupal_internal__tid": 204
}
}
}
]
},
"title": "Tom Jones"
}
}
}
}
]
}
}
}
I understand the resulting set is correct, because the matching child is within the returned parent. However, I cannot see how to restrict the full query to only certain child records. Is this even possible? I have seen some implementations of GraphQL that allow filters to be placed on children, but I don't think this is possible in Gatsby.
I have searched everywhere for possible solutions and have been banging my head against the wall for a few days. Any insight is GREATLY appreciated!
TIA!
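For what it's worth, Gatsby's filter decides which parent nodes come back but does not remove the non-matching children, so one workaround is to filter the child records after the query runs. A sketch of that post-filtering in Python over the result shape above (in a Gatsby component you would do the same in JavaScript); keep_period is a made-up helper name:

```python
# Trimmed copy of the query result from the question
data = {"officerList": {"edges": [
    {"node": {"r": {"entity_id": {
        "r": {"field_position": [
            {"r": {"field_position": {"name": "Governor"},
                   "field_service_period": {"name": "2018 - 2019",
                                            "drupal_internal__tid": 203}}},
            {"r": {"field_position": {"name": "Junior Past Governor"},
                   "field_service_period": {"name": "2019 - 2020",
                                            "drupal_internal__tid": 204}}},
        ]},
        "title": "Tom Jones",
    }}}},
]}}

def keep_period(edges, tid):
    """Drop field_position entries whose service period tid doesn't match."""
    for edge in edges:
        rel = edge["node"]["r"]["entity_id"]["r"]
        rel["field_position"] = [
            p for p in rel["field_position"]
            if p["r"]["field_service_period"]["drupal_internal__tid"] == tid
        ]
    return edges

edges = keep_period(data["officerList"]["edges"], 203)
```

This keeps the GraphQL query simple (filter on the parent only) and trims the children in one pass over the result.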

Querying Cosmos Nested JSON documents

I would like to turn this resultset
[
{
"Document": {
"JsonData": "{\"key\":\"value1\"}"
}
},
{
"Document": {
"JsonData": "{\"key\":\"value2\"}"
}
}
]
into this
[
{
"key": "value1"
},
{
"key": "value2"
}
]
I can get close by using a query like
select value c.Document.JsonData from c
however, I end up with
[
"{\"key\":\"value1\"}",
"{\"key\":\"value2\"}"
]
How can I cast each value to an individual JSON fragment using the SQL API?
As David Makogon said above, we need to transform such data within our app. We can do as below:
// Requires Json.NET (Newtonsoft.Json) for JsonConvert
string data = "[{\"key\":\"value1\"},{\"key\":\"value2\"}]";
List<Object> t = JsonConvert.DeserializeObject<List<Object>>(data);
string jsonData = JsonConvert.SerializeObject(t);
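Starting from the output of the SELECT VALUE query (an array of JSON-encoded strings), the app-side transform is just a parse of each element. The same idea as the C# snippet, sketched in Python:

```python
import json

# What `select value c.Document.JsonData from c` returns:
rows = ['{"key":"value1"}', '{"key":"value2"}']

# Each element is itself a JSON string, so decode it into an object
parsed = [json.loads(r) for r in rows]

print(json.dumps(parsed))  # [{"key": "value1"}, {"key": "value2"}]
```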

Google Analytics: Filter by custom dimension

I'm using the enhanced ecommerce tracking from Google Analytics to send data like this in JS to GA:
ga("ec:addImpression", {
brand: null,
dimension2: "shop123",
id: 1,
list: "Search",
name: "Product 123",
position: 1
});
ga("send", "pageview");
Then, I use the Reporting API to generate some charts. Here, I want to filter by my custom dimension dimension2. The request looks like this:
{
"reportRequests":[
{
"dateRanges":[
{
"startDate":"2016-10-17",
"endDate":"2016-11-16"
}
],
"viewId":"132093148",
"metrics":[
{
"expression":"ga:productListViews"
}
],
"dimensions":[
{
"name":"ga:date"
},
{
"name":"ga:dimension2"
}
],
"dimensionFilterClauses":[
{
"filters":[
{
"dimension_name":"ga:dimension2",
"operator":"EXACT",
"expressions":[
"shop123"
]
}
]
}
]
}
]
}
However, this returns no results:
{
"reports":[
{
"columnHeader":{
"dimensions":[
"ga:date",
"ga:dimension2"
],
"metricHeader":{
"metricHeaderEntries":[
{
"name":"ga:productListViews",
"type":"INTEGER"
}
]
}
},
"data":{
"totals":[
{
"values":[
"0"
]
}
]
}
}
]
}
But when I remove the dimensionFilterClauses I get all the results, of course not filtered by dimension2.
Did I do anything wrong when filtering on that dimension?
Change the string dimension_name to dimensionName and try again.
As you can see in the examples: https://developers.google.com/analytics/devguides/reporting/core/v4/samples
"dimensionFilter":
{
"dimensionName":"ga:browser",
"operator":"EXACT",
"expressions":["Safari"]
}
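Applied to the request in the question, the corrected clause would look like this (view ID, dates, metric, and dimensions copied from above); a Python sketch that just assembles and prints the body:

```python
import json

report_request = {
    "reportRequests": [{
        "viewId": "132093148",
        "dateRanges": [{"startDate": "2016-10-17", "endDate": "2016-11-16"}],
        "metrics": [{"expression": "ga:productListViews"}],
        "dimensions": [{"name": "ga:date"}, {"name": "ga:dimension2"}],
        "dimensionFilterClauses": [{
            "filters": [{
                "dimensionName": "ga:dimension2",  # camelCase, not dimension_name
                "operator": "EXACT",
                "expressions": ["shop123"],
            }],
        }],
    }],
}

print(json.dumps(report_request, indent=2))
```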

getting binary data when using POST request in httr package

I am using the POST function in httr library to get some data and the code is shown below.
library(httr)
url = "https://xxxx:xxx#api.xxx/_search" #omitted for privacy
a = POST(url,body = query,encode = "json")
The query is shown below in the appendix. a$content gives me a raw vector (a long run of hexadecimal bytes) that I have to push through another function before I get anything useful.
Ultimately I wish to get a data frame by using b = fromJSON(a$content). So far in order to get any data I have to use:
chr<-function(n){rawToChar(as.raw(n))}
b = jsonlite::fromJSON(chr(a$content))
data = b$hits$hits$`_source`
This seems inefficient, considering that I am pushing the data through a local helper function to get the final result. So my questions are as follows:
Am I using the POST function correctly to get the query?
Is there a more efficient (faster) way of getting my data into a data frame?
Appendix:
query = '
{
"_source": [
"start","source.country_codes",
"dest.country_codes"
],
"size": 100,
"query": {
"bool": {
"must": [
{
"bool": {
"must_not": [
{
"range": {
"start": {
"lte": "2013-01-01T00:00:00"
}
}
},
{
"range": {
"start": {
"gt": "2016-05-19T00:00:00"
}
}
}
]
}
}
]
}
}
}'
The POST call looks fine. To skip the helper function, ask httr for the response body as text and parse that directly:
js <- fromJSON(content(a, as = "text"))
