Kibana integrated with Firehose

I am pushing data from two different tools into Elasticsearch using Kinesis Firehose. I need to display the data in Kibana, and I can do it for each index separately, but I want the two indexes to be displayed as a single index, i.e. merged.
The two indexes have different names. Example:
1st index: animal
2nd index: Human
I want to display both under a single index, Species.

You can create an alias in Elasticsearch for your indices, something like:
POST to
your.elasticsearch.domain/_aliases
Body:
{
"actions" : [
{ "add" : { "indices" : ["animal", "human"], "alias" : "species" } }
]
}
Then you should be able to use species as an index pattern in Kibana. (Note that Elasticsearch index and alias names must be lowercase, hence the lowercased names above.)
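For instance, here is a minimal sketch of making that call from Node.js (Node 18+ ships a global fetch; the hostname is a placeholder for your Elasticsearch endpoint):
// Sketch: POST the alias actions to the cluster's _aliases endpoint.
// The URL is a placeholder; adjust auth/headers to your setup.
const res = await fetch("https://your.elasticsearch.domain/_aliases", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    actions: [{ add: { indices: ["animal", "human"], alias: "species" } }],
  }),
});
console.log(await res.json()); // { acknowledged: true } on success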

DynamoDB data structure / architecture to support set of particular queries

I currently have a Lambda function pushing property data into a DynamoDB table with streams enabled.
When a new item is added, the stream triggers another Lambda function which should query against a second DynamoDB table to see if there is a 'user query' in the table matching the new object in the stream.
The items in the first table which are pushed into the stream look like this...
{
Item: {
partitionKey: 'myTableId',
bedrooms: 3,
latitude: 52.4,
longitude: -2.6,
price: 200000,
toRent: false,
},
}
The second table contains active user queries. For example, one user is looking for a house within a 30-mile radius of his location, priced between £150000 and £300000.
An example of this query object in the second table looks like this...
{
Item: {
partitionKey: 'myTableId',
bedrooms: 3,
minPrice: 150000,
maxPrice: 300000,
minLatitude: 52.3,
maxLatitude: 52.5,
minLongitude: -2.7,
maxLongitude: -2.5,
toRent: false,
userId: 'userId',
},
}
When a new property enters the stream, I want to trigger a lambda which queries against the second table. I want to write something along the lines of...
get me all user queries where bedrooms == streamItem.bedrooms AND minPrice < streamItem.price AND maxPrice > streamItem.price AND minLatitude < streamItem.latitude AND maxLatitude > streamItem.latitude.
Ideally I want to achieve this via queries and filters, without scanning.
I'm happy to completely restructure the tables to suit the above requirements.
Been reading and reading and haven't found a suitable answer, so hoping an expert can point me in the right direction!
Thank you in advance
There's no silver bullet with DynamoDB here. Your only tools are the PK/SK lookup by value and range, filters to brute force things after that, and GSIs to give an alternate point of view. You're going to have to get creative. The details depend on your circumstances.
Like if you know you're getting all those specific values every time, you can construct a PK like bed#<bedrooms>#rent#<toRent> (e.g. bed#3#rent#false) and an SK of price. Then for those three attributes you can do exact index-based resolution and filter for the geo attributes.
If you wanted, you could quantize the price range values (store pre-determined price ranges as singular values) and put that into the PK as well. Like divide prices into 50k chunks, each of which gets a name of the leading value. If someone wanted 150,000 to 250,000 then you'd look up using two PKs, the "150" and "200" blocks.
You get PK/SK + GSI + filter. That's it. So it's up to you to invent a solution using them, aiming for efficiency.
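For illustration, here is a rough sketch of that first idea in a Node.js Lambda. The table and attribute names are hypothetical; it assumes user queries are stored with pk = bed#<bedrooms>#rent#<toRent> and minPrice as the sort key:
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

// Sketch only: exact-match attributes are packed into the partition key,
// one price bound rides on the sort key, and the geo bounds are filtered after.
const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const streamItem = { bedrooms: 3, toRent: false, price: 200000, latitude: 52.4, longitude: -2.6 };

const { Items } = await doc.send(new QueryCommand({
  TableName: "UserQueries", // hypothetical table of user queries
  KeyConditionExpression: "pk = :pk AND minPrice <= :price",
  FilterExpression: "maxPrice >= :price AND minLatitude < :lat AND maxLatitude > :lat AND minLongitude < :lng AND maxLongitude > :lng",
  ExpressionAttributeValues: {
    ":pk": `bed#${streamItem.bedrooms}#rent#${streamItem.toRent}`,
    ":price": streamItem.price,
    ":lat": streamItem.latitude,
    ":lng": streamItem.longitude,
  },
}));
console.log(Items); // the user queries this property matches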

Store a manipulated collection in Power Apps

(1) In my Canvas App I create a collection like this:
Collect(colShoppingBasket; {Category: varCategoryTitle ; Supplier: lblSupplier.Text ; ProductNumber: lblProductNumber.Text });;
It works - I get a collection. And whenever I push my "Add to shopping basket" button, an item is added to my collection.
(2) Now I want to sort the collection and then use the sorted output for other things.
This function sorts it by supplier. No problems here:
Sort(colShoppingBasket; Supplier)
(3) Then I want to display the SORTED version of the collection in various scenarios. And this is where the issue is. Because all I can do is manipulate a DISPLAY of the collection "colShoppingBasket" - (unsorted).
(4) What would be really nice would be the option to create and store a manipulated copy of the original collection, and then display that wherever I need it. Sort of:
Collect(colShoppingBasketSORTED; { Sort(colShoppingBasket; supplier) });; <--- p.s. I know this is not a working function
You could use the following:
ClearCollect(colShoppingBasketSorted, Sort(colShoppingBasket, Supplier))
Note that it is without the { }
This will Clear and Collect the whole colShoppingBasket sorted.
If you want to store the table in a single row in a collection, you can use
ClearCollect(colShoppingBasketSortedAlternative, {SingleRow: Sort(colShoppingBasket, Supplier)})
I wouldn't recommend this though because if you want to use the Sorted values you'd have to do something like this:
First(colShoppingBasketSortedAlternative).SingleRow -> this returns the first record of colShoppingBasketSortedAlternative, then gets the content of the SingleRow column, which in this case is a collection
Note: you will need to replace each , with ; for this to work in your locale.

Export unique JSON values from two files

I am trying to extract unique values between two JSON files. I see many jq posts on how to filter unique values within the same file, but not on how to compare two files.
Both of my files are in the same format:
{
"time":"2021-10-01T04:00:38.161Z",
"Number":2,
"signature":"e03756fa67a30d52837d3743d4d87e9a810c5e2ddf11061a976c386a742fa"
}
{
"time":"2021-10-01T04:01:38.164Z",
"Number":2,
"signature":"3b4d746ac2da2543047d8cc981db2464d4993065993449b321fc15d7f0aa6"
}
I would like to create a 3rd file which contains only unique values. If I must choose a single value to declare as unique, then I would select 'signature.'
Choose a field to compare on (e.g. .signature) and filter by it using unique_by on the combined array obtained with the --slurp or -s option:
jq -s 'unique_by(.signature)[]' file1.txt file2.txt > file3.txt
I'm not sure I totally understand what you are trying to do here, but if you want to extract values from your files and export them somewhere else, you need to specify which files to read and where to write the output.
You can extract data from any file this way. For example, if you were using SQLite:
db.fetch(`data_specified_here`)
Note: this would fetch the data from the database - or in your case the db file - and then you would log or print out the data.
Since you have fields like "time" and "Number", you'd need to specify which string (e.g. "time":"2021-10-01", and so on) you want to take out of your file.
If this didn't help, please re-ask your question with a little more detail and I can help more. I just gave a general rundown on how to fetch something from a DB, or in your case a JSON file.

Riak: searchable list of maps (with CRDTs)

I have a use-case which consists in storing several maps under an attribute. Here is the equivalent JSON list:
[
{"tel": "+33123456789", "type": "work"},
{"tel": "+33600000000", "type": "mobile"},
{"tel": "+79001234567", "type": "mobile"}
]
Also, I'd like to have it searchable at the object level. For instance:
- search entries that have a mobile number
- search entries that have a mobile number and whose number starts with the string "+7"
Since Riak doesn't support sets of maps (but only sets of registers), I was wondering if there is a trick to achieve it. So far I've had two ideas.
Map of maps
The idea is to generate keys (randomly?) for the objects, and store each object of the list in a parent map whose keys exist for the sole purpose of giving each object a key.
It seems to me that this doesn't allow searching the objects (the maps inside the parent map), because Riak's Solr search requires the full path to an attribute. One cannot simply write the following Solr query: phone_numbers.*.tel:+7*. Also, composite searches (e.g. search entries that have a mobile number and whose number starts with the string "+7") seem hard to achieve.
Sets with simulated multi-valued attributes
This solution consists in using a set and inserting all the values of the object as a single string, with separators between them. Yes, it's a hack. An attribute value would look like: $tel:+79001234567$type:mobile$ with : as the attribute name-value separator and $ as the attribute separator.
Search could be feasible using the * wildcard (ugly, but still), except for the problem of escaping the separators: what if the type attribute includes a :? Are there some separators that are accepted by Riak but would not be acceptable in a string (I'm thinking of control characters)?
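For what it's worth, here is a sketch of that encoding idea in TypeScript (the control-character separators are my assumption, not anything Riak-specific):
// Sketch: encode one object as a single set-member string, using ASCII
// control characters as separators so ordinary user data cannot collide.
// \u001F (unit separator) splits name from value, \u001E (record separator)
// splits one name/value pair from the next.
const KV_SEP = "\u001F";
const FIELD_SEP = "\u001E";

function encodeEntry(obj: Record<string, string>): string {
  return Object.entries(obj)
    .map(([k, v]) => `${k}${KV_SEP}${v}`)
    .join(FIELD_SEP);
}

// { tel: "+79001234567", type: "mobile" } -> "tel\u001F+79001234567\u001Etype\u001Fmobile"
console.log(encodeEntry({ tel: "+79001234567", type: "mobile" }));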
In a nutshell
I'm looking for a solution whatever the hackiness of it, as long as it works. Any idea is welcome.

Set query to search all fields of a dojo datagrid

I have a Dojo DataGrid with several fields. I'm currently setting the query to search one field at a time, like so:
grid.setQuery( {name:"Bob"}, {ignoreCase:true} );
However, I would like the query to search all the fields at once. For example, say I have three fields titled "name", "friend", and "family". Let's say I only want the rows that contain "Bob" in any of the three fields to show in the grid. How would I go about doing that without three separate queries?
Any help is appreciated.
Is your store an ItemFileReadStore or a QueryReadStore?
If it's an ItemFileReadStore, you may be able to utilize the AndOrReadStore:
http://dojotoolkit.org/reference-guide/dojox/data/AndOrReadStore.html
Otherwise, my best suggestion for a limited-fetch store would be to adjust your back-end code to support filtering options, such that when the store makes a POST (or GET), you parse out an array of fields to search against and return the result set accordingly.
You'd see something like
start: 0
count: 25
columnsToQuery: ["name","friend","family"] // or perhaps a CSV string will do
columnOperator: "AND"
columnValue: "Bob"
You'd have to adjust the paradigm as per your business needs, but as long as the server can properly return the result set based on the filtering inputs this approach will work.
The call to generate such a request would be
grid.setQuery({
columnsToQuery : ["name","friend","family"],
columnOperator : "AND",
columnValue : "Bob"
});
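On the server side, a hypothetical Express handler for that request shape might look like this (the route, sample data, and matching rule are invented for illustration):
import express from "express";

// Sketch: the grid POSTs columnsToQuery/columnOperator/columnValue and the
// server returns only the matching rows, paged by start/count.
const app = express();
app.use(express.json());

const rows: Record<string, string>[] = [
  { name: "Bob", friend: "Alice", family: "Carol" },
  { name: "Dave", friend: "Bob", family: "Erin" },
];

app.post("/grid-data", (req, res) => {
  const { columnsToQuery, columnOperator, columnValue, start = 0, count = 25 } = req.body;
  const matches = rows.filter((row) =>
    columnOperator === "AND"
      ? columnsToQuery.every((col: string) => row[col] === columnValue)
      : columnsToQuery.some((col: string) => row[col] === columnValue)
  );
  res.json({ numRows: matches.length, items: matches.slice(start, start + count) });
});

app.listen(3000);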
