Omitting Weaviate Objects Containing a Specific Reference - weaviate

Suppose I have a database of movies, each tagged with some genres. My Weaviate schema looks like this:
"classes": [{
"class": "Movie",
"properties": [{
"name": "name",
"dataType": ["string"],
}, {
"name": "inGenres",
"dataType": ["Genre"],
}],
}, {
"class": "Genre",
"properties": [{
"name": "name",
"dataType": ["string"],
}],
}]
I would like to exclude movies tagged with a specific genre from the search results. Specifically, for a database containing the following Movie objects:
{"name":"foo", "inGenres":[{"name":"drama"}]}
{"name":"bar", "inGenres":[{"name":"horror"},{"name":"thriller"}]}
{"name":"baz", "inGenres":[{"name":"horror"},{"name":"sci-fi"}]}
If I exclude the horror genre, the search results should only return the movie foo. Is there any way to perform such a query with GraphQL or the Python client?

You can use the where filter to achieve this.
In your specific case:
{
  Get {
    Movie(
      where: {
        path: ["inGenres", "Genre", "name"],
        operator: NotEqual,
        valueString: "horror"
      }
    ) {
      name
      inGenres {
        ... on Genre {
          name
        }
      }
    }
  }
}
In Python:
import weaviate

client = weaviate.Client("http://localhost:8080")

# Exclude any movie whose inGenres reference points at a Genre named "horror"
where_filter = {
    "path": ["inGenres", "Genre", "name"],
    "operator": "NotEqual",
    "valueString": "horror"
}

query_result = client.query.get("Movie", ["name"]).with_where(where_filter).do()
print(query_result)
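If the filter behaves as the question requires for the sample data, printing query_result should yield Weaviate's standard GraphQL response envelope, roughly like this (a sketch of the expected shape, not verbatim output):
{'data': {'Get': {'Movie': [{'name': 'foo'}]}}}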

Related

Data Factory: JSON data is interpreted as expression - ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression

I want to copy items from
CosmosDB databaseA/productCollection
to
CosmosDB databaseB/productCollection.
Therefore I decided to use Azure Data Factory.
I also activated "Export as-is to JSON files or Cosmos DB collection".
The read operation works as expected.
Unfortunately, the write operation stops because of an error related to the data:
ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'Currency'
{
    "ProductName": "Sample",
    "Price": {
        "#Currency": "GBP",
        "$": "2624.83"
    }
}
I'm not able to change the input data itself.
The output data has to equal the input data.
Is there a possibility that #Currency will not be interpreted as an expression?
In ARM, this part is failing:
Price.{#Currency}
I had the same problem and was able to resolve it as follows.
I am using a Pipeline with a Source that is a Dataset referencing JSON data.
Editing the dataset's underlying JSON (via the code view in the Data Factory UI), I had to change it from:
{
    "name": "SourceDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "StorageAccountLink",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Json",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "test-data"
            }
        },
        "schema": {
            "type": "object",
            "properties": {
                "#context": {
                    "type": "string"
                },
                "value": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "id": {
                                "type": "string"
                            }
                        }
                    }
                }
            }
        }
    }
}
to (escaping the # with ##):
{
    "name": "SourceDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "StorageAccountLink",
            "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Json",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "test-data"
            }
        },
        "schema": {
            "type": "object",
            "properties": {
                "##context": {
                    "type": "string"
                },
                "value": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "id": {
                                "type": "string"
                            }
                        }
                    }
                }
            }
        }
    }
}
I tried to reproduce your issue, but it worked for me: I used a Copy activity to transfer data from account A to account B.
Additionally, if this operation only needs to be executed once, please consider using the Azure Cosmos DB Data Migration Tool. It's free to use. You can export the data from Cosmos DB A as a JSON file and then import it into Cosmos DB B very simply. It can also be executed from the cmd, so it can be set up as a scheduled job on Windows.
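For reference, a hedged sketch of what that might look like from the cmd (the connection strings are placeholders, and the exact flag names may vary between versions of the migration tool):
REM export from Cosmos DB A to a local JSON file
dt.exe /s:DocumentDB /s.ConnectionString:"<databaseA-connection-string>" /s.Collection:productCollection /t:JsonFile /t.File:products.json
REM import the JSON file into Cosmos DB B
dt.exe /s:JsonFile /s.Files:products.json /t:DocumentDBBulk /t.ConnectionString:"<databaseB-connection-string>" /t.Collection:productCollection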

Search for available items for booking

I am working on an online booking system for items.
I am using MongoDB to store booking and item details.
Item
{
    "id": "3",
    "name": "",
    "description": "",
    "extra": [{}]
}
Booking
{
    "id": "",
    "itemId": "",
    "startDate": millis,
    "endDate": millis,
    "status": "",
    "userId": ""
}
I have to implement search between dates. The search should return only the items available for the specified period. How can I build a scalable search for this? I am planning to use Elasticsearch for the search as well. Suggestions involving other technologies are also welcome.
I'd suggest making the booking the base object and putting the item info inside it. That is to say:
Set up mapping:
PUT bookings
{
  "mappings": {
    "properties": {
      "id": {
        "type": "keyword"
      },
      "item": {
        "properties": {
          "id": {
            "type": "keyword"
          },
          "name": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "description": {
            "type": "text"
          },
          "extra": {
            "type": "nested"
          }
        }
      },
      "startDate": {
        "type": "date",
        "format": "epoch_millis"
      },
      "endDate": {
        "type": "date",
        "format": "epoch_millis"
      },
      "status": {
        "type": "keyword"
      },
      "userId": {
        "type": "keyword"
      }
    }
  }
}
Ingest the simplest of bookings:
POST bookings/_doc
{
  "item": {
    "id": "987"
  },
  "startDate": 1587110540025,
  "endDate": 1587220730025
}
Restricting the *Date fields and only returning the corresponding item:
GET bookings/_search
{
  "_source": "item",
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "startDate": {
              "gte": "17/04/2020",
              "format": "dd/MM/yyyy"
            }
          }
        },
        {
          "range": {
            "endDate": {
              "lte": "18/04/2020",
              "format": "dd/MM/yyyy"
            }
          }
        }
      ]
    }
  }
}
Note that although our date fields are defined as epoch_millis, we can still query using human-readable date strings, provided we specify the format. You can of course use milliseconds if you prefer.
While indexing the items into Elasticsearch, you can check the bookings. Say you are indexing items and you fetch each item from Mongo: you can also fetch the bookings for that item and add a field like bookingCount to the item document in Elasticsearch. While searching, you can then use the bookingCount field to find items without bookings.
Generally, indexing is an asynchronous operation, so you can use a queue. This reduces latency for user operations, and it gives you a place to do whatever enrichment you want: you can compute a summary of the bookings and put it inside the item.
{
    "id": "3",
    "name": "",
    "description": "",
    "extra": [{}],
    "bookingCount": "",
    "bookingsByStatus": {
        "status_1": 1233,
        "status_2": 1233,
        ...
    }
}
But this is a business decision. And after any update to items or bookings, you need to update the item in the Elasticsearch index. Also, you can use other solutions like the one mentioned by @jzzfs.
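As a minimal sketch of that idea (assuming a hypothetical items index where bookingCount is stored as an integer rather than the empty string shown above), finding items with no bookings could look like:
GET items/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "bookingCount": 0 } }
      ]
    }
  }
}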

Create product variations with python woocommerce api

I am attempting to create a product with variations in WooCommerce but I am getting this error:
{u'message': u'No route was found matching the URL and request method', u'code': u'rest_no_route', u'data': {u'status': 404}}
when I run the create_variation function from the API.
I ran a GET on the attributes for the product I created and it found no attributes even though the printed response when I created the product had the attributes listed.
Here is my code to create the variable product:
data = {
    "name": row[3],
    "type": "variable",
    "description": row[4],
    "images": [
        {
            "src": row[15],
            "position": 0
        }
    ],
    "in_stock": True,
    "sku": row[2],
    "attributes": [
        {
            "name": "Size",
            "variation": True,
            "visible": True,
            "options": sizeList,
        },
        {
            "name": "Color",
            "variation": True,
            "visible": True,
            "options": colorList,
        }
    ],
}
print(wcapiNew.post("products", data).json())
Here is my code to create the variations:
result = wcapi.get("products/sku/" + row[2]).json()
product_id = result['product']['id']
variationData = {
    "regular_price": row[17],
    "image": {
        "src": row[13]
    },
    "sku": row[19],
    "attributes": [
        {
            "name": "Color",
            "option": row[6]
        },
        {
            "name": "Size",
            "option": row[10]
        }
    ]
}
print(wcapiNew.post("products/" + str(product_id) + "/variations", variationData).json())
I've been tearing my hair out trying to figure out what I'm doing wrong but I'm clueless right now.
Any help is appreciated. Thanks.
This is my variation data, and it works:
data_1 = {
    "regular_price": "9.00",
    "sku": "premium-quality-101-red",
    "attributes": [
        {
            "id": 1,
            "option": "Red"
        }
    ]
}
I figured out that you need to use the attribute id, and to update one variation at a time.
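A minimal sketch of that approach, assuming the standard woocommerce Python client with placeholder credentials and ids (look up your real attribute ids via GET products/attributes):
from woocommerce import API

# Placeholder credentials -- replace with your own store's values.
wcapi = API(
    url="https://example.com",
    consumer_key="ck_xxx",
    consumer_secret="cs_xxx",
    version="wc/v3"
)

product_id = 123  # id returned when the variable product was created
data_1 = {
    "regular_price": "9.00",
    "sku": "premium-quality-101-red",
    "attributes": [
        {"id": 1, "option": "Red"}  # id of the global "Color" attribute
    ]
}

# Create one variation at a time.
print(wcapi.post("products/" + str(product_id) + "/variations", data_1).json())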

Exclude Path in Azure Cosmos DB

What is the correct JSON to exclude certain keys of an input JSON document from being indexed by Azure Cosmos DB? We are using Cosmos DB in MongoDB mode. I was planning to change the index configuration on the Azure Portal after creating the collection.
A sample input JSON being:
{
    "name": "test",
    "age": 1,
    "location": "l1",
    "height": 5.7
}
If I were to include name and age in the index and remove location and height from it, what would the includedPaths and excludedPaths look like?
I finally got it to work with the spec below:
{
    "indexingMode": "consistent",
    "automatic": true,
    "includedPaths": [{
        "path": "/*",
        "indexes": [{
            "kind": "Range",
            "dataType": "Number",
            "precision": -1
        }, {
            "kind": "Hash",
            "dataType": "String",
            "precision": 3
        }]
    }],
    "excludedPaths": [{
        "path": "/\"location\"/?"
    }, {
        "path": "/\"height\"/?"
    }]
}
It looks like the underlying implementation has changed, and at the time of writing the documentation does not cover changing the indexingPolicy in MongoDB-flavoured Cosmos DB. The documents are actually stored in a weird way, where all the keys hang off a root $v and every scalar field is stored as a sub-document containing value and type information. So your document will be stored as something like:
{
    '_etag': '"2f00T0da-0000-0d00-0000-4cd987940000"',
    'id': 'SDSDFASDFASFAFASDFASDF',
    '_self': 'dbs/sMsxAA==/colls/dVsxAI8MBXI=/docs/sMsxAI8MBXIKAAAAAAAAAA==/',
    '_rid': 'sMsxAI8MBXIKAAAAAAAAAA==',
    '$t': 3,
    '_attachments': 'attachments/',
    '$v': {
        '_id': {
            '$t': 7,
            '$v': "\\Ù\x87\x14\x01\x15['\x18m\x01ú"
        },
        'name': {
            '$t': 2,
            '$v': 'test'
        },
        'age': {
            '$t': 1,
            '$v': 1
        },
        ...
    },
    '_ts': 1557759892
}
Therefore the indexingPolicy paths need to include the root $v and use /* (objects) instead of /? (scalars).
{
    "indexingMode": "consistent",
    "automatic": true,
    "includedPaths": [
        {
            "path": "/*",
            "indexes": [
                {
                    "kind": "Range",
                    "dataType": "Number"
                },
                {
                    "kind": "Hash",
                    "dataType": "String"
                }
            ]
        }
    ],
    "excludedPaths": [
        {"path": "/\"$v\"/\"location\"/*"},
        {"path": "/\"$v\"/\"height\"/*"}
    ]
}
PS:
The Mongo API can also be used to drop all the default indexes and create only the specific indexes required:
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-indexing
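A minimal sketch of that route, assuming pymongo and placeholder connection/database/collection names:
from pymongo import MongoClient

# Placeholder -- use your Cosmos DB Mongo API connection string.
client = MongoClient("<cosmos-mongodb-connection-string>")
coll = client["mydb"]["mycollection"]

# Drop all non-_id indexes, then index only the fields we query on.
coll.drop_indexes()
coll.create_index([("name", 1)])
coll.create_index([("age", 1)])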

How to Query Google Cloud Datastore for array

I have written a query to get the list of all Event entities. The result coming back from Google Cloud Datastore looks like this:
[{
    "key": {
        "id": 5678669024460800,
        "kind": "Event",
        "path": [
            "Event",
            5678669024460800
        ]
    },
    "data": {
        "createdAt": "2017-03-27T06:28:58.000Z",
        "users": ["test1@xxx.com", "test2@xxx.com", "test3@xxx.com"]
    }
},
{
    "key": {
        "id": 5678669024460800,
        "kind": "Event",
        "path": [
            "Event",
            5678669024460800
        ]
    },
    "data": {
        "createdAt": "2017-03-27T06:28:58.000Z",
        "users": ["test1@xxx.com"]
    }
},
{
    "key": {
        "id": 5678669024460800,
        "kind": "Event",
        "path": [
            "Event",
            5678669024460800
        ]
    },
    "data": {
        "createdAt": "2017-03-27T06:28:58.000Z",
        "users": ["test2@xxx.com", "test3@xxx.com"]
    }
}]
But I need to write a query that filters by email id, i.e. fetches only the entities that match the given email id. For example, if I pass the email id "test1@xxx.com", I should get a final result like the one below. Can anybody help me with this?
[{
    "key": {
        "id": 5678669024460800,
        "kind": "Event",
        "path": [
            "Event",
            5678669024460800
        ]
    },
    "data": {
        "createdAt": "2017-03-27T06:28:58.000Z",
        "users": ["test1@xxx.com", "test2@xxx.com", "test3@xxx.com"]
    }
},
{
    "key": {
        "id": 5678669024460800,
        "kind": "Event",
        "path": [
            "Event",
            5678669024460800
        ]
    },
    "data": {
        "createdAt": "2017-03-27T06:28:58.000Z",
        "users": ["test1@xxx.com"]
    }
}]
The GQL query would be something like:
SELECT * FROM Event WHERE users = 'test1@xxx.com'
You need to make sure the users property is indexed in order for the search to work, otherwise you may not get any results back.
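A minimal sketch of the same filter using the google-cloud-datastore Python client (the kind and property names follow the question; the email is the example value):
from google.cloud import datastore

client = datastore.Client()

# Array properties match if ANY element equals the filter value,
# so this returns every Event whose users list contains the address.
query = client.query(kind="Event")
query.add_filter("users", "=", "test1@xxx.com")

for entity in query.fetch():
    print(entity.key, entity.get("users"))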
