The documentation says that Cosmos DB supports MultiPolygons, but when I query using one I don't get the expected result.
If I change the MultiPolygon to a Polygon, the query works as expected.
This is the result of ST_ISVALIDDETAILED with the MultiPolygon:
Invalid position. A position must be represented by an array of
numbers. There must be at least two elements in the array.
This shows that the MultiPolygon is not working.
Has anyone been able to work with multipolygons?
Note:
I used the MultiPolygon example from the documentation.
I have created the spatial index for the property:
{
"path": "/Region/Area/?",
"types": [
"Point",
"LineString",
"Polygon",
"MultiPolygon"
]
}
After investigating further, I found that the example in the documentation is badly formed.
This is the example =>
{
"type":"MultiPolygon",
"coordinates":[ [
[52.0, 12.0],
[53.0, 12.0],
[53.0, 13.0],
[52.0, 13.0],
[52.0, 12.0]
],
[
[50.0, 0.0],
[51.0, 0.0],
[51.0, 5.0],
[50.0, 5.0],
[50.0, 0.0]
] ]
}
and it is invalid.
This is the correct GeoJSON =>
{
"type":"MultiPolygon",
"coordinates":[ [[
[52.0, 12.0],
[53.0, 12.0],
[53.0, 13.0],
[52.0, 13.0],
[52.0, 12.0]
]],
[[
[50.0, 0.0],
[51.0, 0.0],
[51.0, 5.0],
[50.0, 5.0],
[50.0, 0.0]
]]]
}
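The difference between the two snippets is nesting depth: in a MultiPolygon, `coordinates` is an array of polygons, each polygon is an array of rings, and each ring is an array of positions. A minimal Python sketch (the helper name is my own, for illustration) that checks this nesting against both examples:

```python
# Check that GeoJSON MultiPolygon coordinates have the correct nesting:
# coordinates -> polygons -> rings -> positions -> numbers.
def is_valid_multipolygon_coords(coords):
    for polygon in coords:                 # level 1: polygons
        for ring in polygon:               # level 2: linear rings
            if len(ring) < 4:              # a closed ring needs >= 4 positions
                return False
            for position in ring:          # level 3: positions
                if len(position) < 2:      # at least [lon, lat]
                    return False
                if not all(isinstance(n, (int, float)) for n in position):
                    return False
    return True

# The documentation's broken form: one nesting level is missing,
# so each "ring" is actually a single [lon, lat] position.
bad = [[[52.0, 12.0], [53.0, 12.0], [53.0, 13.0], [52.0, 13.0], [52.0, 12.0]],
       [[50.0, 0.0], [51.0, 0.0], [51.0, 5.0], [50.0, 5.0], [50.0, 0.0]]]

# The corrected form: each polygon wraps its ring in one more array.
good = [[[[52.0, 12.0], [53.0, 12.0], [53.0, 13.0], [52.0, 13.0], [52.0, 12.0]]],
        [[[50.0, 0.0], [51.0, 0.0], [51.0, 5.0], [50.0, 5.0], [50.0, 0.0]]]]

print(is_valid_multipolygon_coords(bad))   # False
print(is_valid_multipolygon_coords(good))  # True
```

This mirrors the "Invalid position" error above: at the wrong depth, Cosmos finds a number where it expects an array of at least two numbers.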
The following is my JSON data. I need to get only the values of the id key:
{apps:[ {
"id": "/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
"cmd": null,
"args": null,
"user": null,
"env": {},
"constraints": [
[
"hostname",
"GROUP_BY",
"5"
]
},
{
"id": "/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
"cmd": null,
"args": null,
"user": null,
"env": {},
"constraints": [
[
"hostname",
"GROUP_BY",
"5"
]
]},
The expected output is:
/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
Thanks in advance
After fixing the errors in your JSON, we can use the following jq filter to get the desired output:
.apps[] | .id
JqPlay Demo
Result of jq -r '.apps[] | .id':
/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
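For comparison, the same extraction in plain Python (assuming the JSON has been repaired so that it parses; the sample below is trimmed to the relevant key):

```python
import json

# The repaired document: "apps" is a list of objects, each with an "id" key.
doc = json.loads("""
{"apps": [
  {"id": "/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd"},
  {"id": "/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd"}
]}
""")

# Equivalent of the jq filter .apps[] | .id
for app in doc["apps"]:
    print(app["id"])
```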
You can use map() to create an array from the properties of the objects. Try this:
let data = {apps:[{"id":"/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd","cmd":null,"args":null,"user":null,"env":{},"constraints":["hostname","GROUP_BY","5"]},{"id":"/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd","cmd":null,"args":null,"user":null,"env":{},"constraints":["hostname","GROUP_BY","5"]}]}
let ids = data.apps.map(o => o.id);
console.log(ids);
Note that I corrected the invalid brace/bracket combinations in the data structure you posted in the question. I assume this is just a typo in that example, otherwise there would be parsing errors in the console.
I am trying to fetch a report of line items; it works fine from the UI but halts with an error via the API. The following is the reportQuery:
{'reportQuery': {
'dimensions': [
'DATE',
'LINE_ITEM_NAME',
'LINE_ITEM_TYPE',
'CREATIVE_SIZE_DELIVERED'
],
'adUnitView': 'TOP_LEVEL',
'columns': [
'TOTAL_LINE_ITEM_LEVEL_IMPRESSIONS',
'TOTAL_LINE_ITEM_LEVEL_CLICKS',
'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE'
],
'dimensionAttributes': [
'LINE_ITEM_FREQUENCY_CAP',
'LINE_ITEM_START_DATE_TIME',
'LINE_ITEM_END_DATE_TIME',
'LINE_ITEM_COST_TYPE',
'LINE_ITEM_COST_PER_UNIT',
'LINE_ITEM_SPONSORSHIP_GOAL_PERCENTAGE',
'LINE_ITEM_LIFETIME_IMPRESSIONS'
],
'customFieldIds': [],
'contentMetadataKeyHierarchyCustomTargetingKeyIds': [],
'startDate': {
'year': 2018,
'month': 1,
'day': 1
},
'endDate': {
'year': 2018,
'month': 1,
'day': 2
},
'dateRangeType': 'CUSTOM_DATE',
'statement': None,
'includeZeroSalesRows': False,
'adxReportCurrency': None,
'timeZoneType': 'PUBLISHER'
}}
The above query throws the following error when run via the API.
Error summary: {'faultMessage': "[ReportError.COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS # columns; trigger:'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE']", 'requestId': 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', 'responseTime': '98', 'serviceName': 'ReportService', 'methodName': 'runReportJob'}
[ReportError.COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS # columns; trigger:'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE']
400 Syntax error: Expected ")" or "," but got identifier "TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE" at [1:354]
Did I miss anything? Any ideas about this issue?
Thanks!
This issue was solved by adding the dimension "Native ad format name".
I am using Riak 2.2.3 and trying to search in a map bucket type, but nothing is ever returned.
I've configured a bucket type "dist_cache" on the memory backend:
# riak-admin bucket-type status dist_cache
dist_cache is active
active: true
allow_mult: true
backend: <<"memory_mult">>
basic_quorum: false
big_vclock: 50
chash_keyfun: {riak_core_util,chash_std_keyfun}
claimant: 'riak#127.0.0.1'
datatype: map
dvv_enabled: true
dw: quorum
last_write_wins: false
linkfun: {modfun,riak_kv_wm_link_walker,mapreduce_linkfun}
n_val: 3
notfound_ok: true
old_vclock: 86400
postcommit: []
pr: 0
precommit: []
pw: 0
r: quorum
rw: quorum
search_index: <<"expirable_token">>
small_vclock: 50
w: quorum
young_vclock: 20
I then enabled search in /etc/riak/:
search = on
Then I have configured an index, with the default schema and associated it with the bucket type (see above).
I can successfully store and retrieve values by key in that bucket. I have stored 3 values in registers: binary data, an integer (timestamp), and a string:
[
{{"attrs", :register}, <<131, 97, 111>>},
{{"iat_i", :register}, "1540923453"},
{{"test_s", :register}, "paul"}
]
(displayed after formatting in an Elixir shell, using Elixir's Riak library.)
However, nothing is found when I try searching these values:
iex(74)> :riakc_pb_socket.search(pid, "expirable_token", "iat_i:[0 TO *]")
{:ok, {:search_results, [], 0.0, 0}}
iex(75)> :riakc_pb_socket.search(pid, "expirable_token", "iat_i:1540923453")
{:ok, {:search_results, [], 0.0, 0}}
iex(76)> :riakc_pb_socket.search(pid, "expirable_token", "test_s:paul")
{:ok, {:search_results, [], 0.0, 0}}
iex(77)> :riakc_pb_socket.search(pid, "expirable_token", "test_s:*")
{:ok, {:search_results, [], 0.0, 0}}
In addition, /var/log/riak/solr.log doesn't show any error message for these requests.
Am I missing something?
I needed to remove a few options from the Java startup options; now Java seems to be up and running, and solr.log does show an error message when I try a malformed request.
EDIT:
After trying #vempo's solution:
I have suffixed the field with _register, but it still does not work. Here is the field:
iex(12)> APISexAuthBearerCacheRiak.get("ddd", opts)
[
{{"attrs", :register}, <<131, 98, 0, 0, 1, 188>>},
{{"iat_i", :register}, "1542217847"},
{{"test_flag", :flag}, true},
{{"test_register", :register}, "pierre"}
]
but the search request still returns no result:
iex(15)> :riakc_pb_socket.search(pid, "expirable_token", "test_register:*")
{:ok, {:search_results, [], 0.0, 0}}
iex(16)> :riakc_pb_socket.search(pid, "expirable_token", "test_register:pierre")
{:ok, {:search_results, [], 0.0, 0}}
iex(17)> :riakc_pb_socket.search(pid, "expirable_token", "test_register:*")
{:ok, {:search_results, [], 0.0, 0}}
iex(18)> :riakc_pb_socket.search(pid, "expirable_token", "test_flag:true")
{:ok, {:search_results, [], 0.0, 0}}
iex(19)> :riakc_pb_socket.search(pid, "expirable_token", "test_flag:*")
Still no output in /var/log/riak/solr.log, and the index seems correctly set up:
iex(14)> :riakc_pb_socket.list_search_indexes(pid)
{:ok,
[
[index: "expirable_token", schema: "_yz_default", n_val: 3],
[index: "famous", schema: "_yz_default", n_val: 3]
]}
For searching within maps, the rules are different. According to Searching with Data Types, there are four schemas for maps, one for each embedded type:
*_flag
*_counter
*_register
*_set
So in your case you should be searching attrs_register, iat_i_register, and test_s_register.
As a side note, the suffixes _s and _i are probably redundant. They are used by the default schema to decide the type of a regular field, but are useless with embedded datatypes.
UPDATE
And to sum up the rules:
a flag field named test will be indexed as test_flag (query test_flag:*)
a register field named test will be indexed as test_register (query test_register:*)
a counter field named test will be indexed as test_counter (query test_counter:*)
a set field named test will be indexed as test_set (query test_set:*)
This is nicely shown in the table in Searching with Data Types: Embedded Schemas.
See also the definition of dynamic fields for embedded datatypes in the default schema, section <!-- Riak datatypes embedded fields -->.
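The suffix rule is mechanical, so it can be expressed as a tiny helper (a sketch for illustration, in Python rather than Riak's own Erlang, with a hypothetical function name):

```python
# Riak search (Yokozuna) indexes an embedded datatype field under
# "<name>_<datatype>"; this helper mirrors that naming rule.
def yz_field_name(name, datatype):
    assert datatype in ("flag", "counter", "register", "set")
    return f"{name}_{datatype}"

print(yz_field_name("test", "register"))   # test_register
print(yz_field_name("test", "flag"))       # test_flag
print(yz_field_name("iat_i", "register"))  # iat_i_register -- the _i suffix is redundant here
```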
Below is what I'd expect to be a way to create a GeoJSON MultiPolygon object containing one polygon that has two "holes".
When I use the service http://geojson.io/ to validate this object, it returns the error "each element in a position must be a number" and does not render; however, if I remove the "holes" nesting, keeping only one of them, then it works.
I'm looking for a way to describe a MultiPolygon where the polygons can have multiple holes.
I'm not looking for a way to create a polygon with holes in code.
I'm looking for a way to use the GeoJSON spec to represent MultiPolygons with multiple holes.
{
"type": "MultiPolygon",
"coordinates": [
[
[
[
-73.98114904754641,
40.7470284264813
],
[
-73.98314135177611,
40.73416844413217
],
[
-74.00538969848634,
40.734314779027144
],
[
-74.00479214294432,
40.75027851544338
],
[
-73.98114904754641,
40.7470284264813
]
],
[
[
[
-73.99818643920906,
40.74550031602355
],
[
-74.00298643920905,
40.74550031602355
],
[
-74.00058643920897,
40.74810024102966
],
[
-73.99818643920906,
40.74550031602355
]
],
[
[
-73.98917421691903,
40.73646098717515
],
[
-73.99397421691901,
40.73646098717515
],
[
-73.99157421691893,
40.739061265535696
],
[
-73.98917421691903,
40.73646098717515
]
]
]
]
]
}
This is how it works:
{
"type": "MultiPolygon",
"coordinates": [
[
{polygon},
{hole},
{hole},
{hole}
]
]
}
Not like this:
{
"type": "MultiPolygon",
"coordinates": [
[
{polygon},
[
{hole},
{hole},
{hole}
]
]
]
}
Here's an example!
{
"type": "MultiPolygon",
"coordinates": [
[
[
[
-47.900390625,
-14.944784875088372
],
[
-51.591796875,
-19.91138351415555
],
[
-41.11083984375,
-21.309846141087192
],
[
-43.39599609375,
-15.390135715305204
],
[
-47.900390625,
-14.944784875088372
]
],
[
[
-46.6259765625,
-17.14079039331664
],
[
-47.548828125,
-16.804541076383455
],
[
-46.23046874999999,
-16.699340234594537
],
[
-45.3515625,
-19.31114335506464
],
[
-46.6259765625,
-17.14079039331664
]
],
[
[
-44.40673828125,
-18.375379094031825
],
[
-44.4287109375,
-20.097206227083888
],
[
-42.9345703125,
-18.979025953255267
],
[
-43.52783203125,
-17.602139123350838
],
[
-44.40673828125,
-18.375379094031825
]
]
]
]
}
Your example is in fact not really a MultiPolygon (in the GeoJSON sense) but a simple Polygon (a single outer ring plus multiple inner rings for the holes).
Note the difference from multipolygons in OSM, which represents them as a relation containing ways, whose first and last nodes should be merged into the same "node" element. That concept does not exist in GeoJSON, where the endpoints are unified only by having the same coordinates; for the GeoJSON "Polygon" and "MultiPolygon" types, rings will in practice be closed automatically by an additional segment.
Note that when you import GeoJSON into OSM editors (such as JOSM), the first and last nodes will be imported as separate nodes even if they have the same coordinates; you need to use the JOSM validator to detect the superposed nodes and merge them after the import into JOSM but before submission to OSM.
But in scripts and general use of GeoJSON, the rings (arrays of coordinate pairs) of a "type":"Polygon", or the members of a "type":"MultiPolygon", are often accepted without repeating the first position as the last one, because the closure is implied (though it is still recommended to add the duplicate position for compatibility). Such closure of rings applies to "Polygon" and "MultiPolygon" (as they represent surfaces), but not to "LineString" and "MultiLineString" (as they represent curves), where you must include the same coordinates twice, as the first and last positions, to get a closed curve.
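For scripts that generate GeoJSON, it is safest to close each ring explicitly, since RFC 7946 actually requires the first and last positions of a ring to be identical. A minimal sketch (the helper name is my own):

```python
def close_ring(ring):
    """Return a copy of the ring whose last position repeats the first,
    as GeoJSON (RFC 7946) requires for Polygon linear rings."""
    if ring and ring[0] != ring[-1]:
        return ring + [ring[0]]
    return list(ring)

open_ring = [[52.0, 12.0], [53.0, 12.0], [53.0, 13.0], [52.0, 13.0]]
closed = close_ring(open_ring)
print(closed[-1])  # [52.0, 12.0] -- first position repeated at the end
```

Calling it on an already-closed ring leaves the ring unchanged, so it is safe to apply unconditionally.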
To represent an OSM "multipolygon" with multiple "outer" rings, you have to include several "[ {outer}, {inner*} ]" members in the top-level coordinates array of the GeoJSON "MultiPolygon" type, i.e.
{"type":"MultiPolygon", "coordinates":[
[
[[x0,y0], [x1,y1], ... [x0,y0]], /*outer1*/
[[x0,y0], [x1,y1], ... [x0,y0]], /*inner1, optional*/
[[x0,y0], [x1,y1], ... [x0,y0]], /*inner2, optional*/
],[
[[x0,y0], [x1,y1], ... [x0,y0]], /*outer2*/
],...,[
[[x0,y0], [x1,y1], ... [x0,y0]], /*outer3*/
],[
[[x0,y0], [x1,y1], ... [x0,y0]], /*outer4*/
]
]}
So for your example, the solution is:
{"type":"Polygon", "coordinates":[
[[x0,y0], [x1,y1], [x2,y2], [x3,y3], [x0,y0]], /*outer1*/
[[x4,y4], [x5,y5], [x6,y6], [x4,y4]], /*inner1*/
[[x7,y7], [x8,y8], [x9,y9], [x7,y7]] /*inner2*/
]}
If you had several outer rings only (possibly overlapping to create a union of surfaces, though this is not recommended), it would need to be a MultiPolygon, and here you would get no "holes":
{"type":"MultiPolygon", "coordinates":[
[[[x0,y0], [x1,y1], [x2,y2], [x3,y3], [x0,y0]]], /*outer1*/
[[[x4,y4], [x5,y5], [x6,y6], [x4,y4]]], /*outer2*/
[[[x7,y7], [x8,y8], [x9,y9], [x7,y7]]] /*outer3*/
]}
Note there's one less level of [square brackets] because we can use "Polygon" here instead of a MultiPolygon that would contain only one member in your example.
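That "one less level of brackets" remark can be mechanized: a MultiPolygon with exactly one member can be flattened to a Polygon by dropping the outer array. A small Python sketch (hypothetical helper name, assuming valid input):

```python
def multipolygon_to_polygon(geom):
    """Flatten a single-member GeoJSON MultiPolygon into a Polygon."""
    assert geom["type"] == "MultiPolygon" and len(geom["coordinates"]) == 1
    # A MultiPolygon's coordinates are [polygon, ...]; a Polygon's
    # coordinates are just [ring, ...], so take the single polygon.
    return {"type": "Polygon", "coordinates": geom["coordinates"][0]}

mp = {"type": "MultiPolygon",
      "coordinates": [[[[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]]]}
print(multipolygon_to_polygon(mp)["type"])  # Polygon
```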
As far as I know, you can use SUBSTR(JSON_EXTRACT(ST_ASGEOJSON(WKT))) when converting from WKT to geography; that way you can represent it on a map. What I found in BigQuery is that a multipolygon with holes seems to switch the position of the hole coordinates when you use ST_ASGEOJSON().
Also check out this link:
https://dev.socrata.com/docs/datatypes/multipolygon.html
I downloaded packr from https://github.com/libgdx/packr and
my packr JSON is:
{
"platform": "windows64",
"jdk": "C:/Program Files/Java/jdk1.8.0_72",
"executable": "myapp",
"classpath": [
"input/test-hello.jar"
],
"mainclass": "Main",
"vmargs": [
"Xmx1G"
],
"minimizejre": "soft",
"output": "out-windows64",
"verbose": true
}
test-hello.jar has Main.class, which simply writes "Hello" to System.out.
No error is reported while packaging the exe. However, when I run the exe, there is no output on the console. Is there anything missing in the JSON? Does anyone have a simple working example?
You need to include the fully qualified name of the main class (with its package), for example
"mainclass": "com.example.Main",
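Applied to the config above, only the mainclass entry changes (com.example is a placeholder; use whatever package your Main class is actually declared in):

```json
{
  "platform": "windows64",
  "jdk": "C:/Program Files/Java/jdk1.8.0_72",
  "executable": "myapp",
  "classpath": ["input/test-hello.jar"],
  "mainclass": "com.example.Main",
  "vmargs": ["Xmx1G"],
  "minimizejre": "soft",
  "output": "out-windows64",
  "verbose": true
}
```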