I'm drawing a polygon from GeoJSON data (Leaflet library).
Code:
var myPlic = {
    "type": "Polygon",
    "coordinates": [
        [47.98, 55.52],
        [50.36, 56.55],
        [51.76, 55.92],
        [53.17, 56.31],
        [54.31, 55.77],
        [53.34, 54.97],
        [53.52, 54.16],
        [51.59, 54.57],
        [50.71, 54.31],
        [48.86, 54.87],
        [47.81, 54.67],
        [47.98, 55.52]
    ]
};
try {
    L.geoJson(myPlic, {
        style: {
            color: '#AAAAFF',
            weight: 4
        }
    }).addTo(map);
} catch (e) {
    console.log(e);
}
Problem: the console output is:
Error: Invalid LatLng object: (NaN, NaN)
throw new Error('Invalid LatLng object: (' + lat + ', ' + lng + ')');
Please help. Thanks.
P.S. If I use 5 coordinates it's OK. A LineString from these coordinates is also no problem, but the Polygon doesn't work.
If anyone is looking for the answer: the [ ] around the coordinate ring were missing in the previous code, and the geometry needs to be wrapped as a GeoJSON Feature:
{
    "type": "Feature",
    "geometry": {
        "coordinates": [[
            [47.98, 55.52],
            [50.36, 56.55],
            [51.76, 55.92],
            [53.17, 56.31],
            [54.31, 55.77],
            [53.34, 54.97],
            [53.52, 54.16],
            [51.59, 54.57],
            [50.71, 54.31],
            [48.86, 54.87],
            [47.81, 54.67],
            [47.98, 55.52]
        ]],
        "type": "Polygon"
    }
}
You're not passing a valid GeoJSON Feature/FeatureCollection object. A valid Feature object would look like this:
{
    type: "Feature",
    geometry: {
        "type": "Polygon",
        "coordinates": [[
            [47.98, 55.52],
            [50.36, 56.55],
            [51.76, 55.92],
            [53.17, 56.31],
            [54.31, 55.77],
            [53.34, 54.97],
            [53.52, 54.16],
            [51.59, 54.57],
            [50.71, 54.31],
            [48.86, 54.87],
            [47.81, 54.67],
            [47.98, 55.52]
        ]]
    },
    properties: null
}
See the GeoJSON specification at http://geojson.org/geojson-spec.html
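For completeness, here is a minimal sketch that combines the corrected Feature with the styling from the original question (it assumes an existing Leaflet map instance named map, as in the question's code):

var myPolygon = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [47.98, 55.52], [50.36, 56.55], [51.76, 55.92], [53.17, 56.31],
            [54.31, 55.77], [53.34, 54.97], [53.52, 54.16], [51.59, 54.57],
            [50.71, 54.31], [48.86, 54.87], [47.81, 54.67], [47.98, 55.52]
        ]]
    },
    "properties": {}
};

// add the feature with the original style options; assumes `map` already exists
L.geoJson(myPolygon, {
    style: {
        color: '#AAAAFF',
        weight: 4
    }
}).addTo(map);

Note that GeoJSON coordinate pairs are [longitude, latitude]; Leaflet converts them to its own (lat, lng) order when it reads GeoJSON, so no manual swapping is needed.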
I have a route which is stored as a set of points.
{
"id": "9fc9b1e9-6062-4c65-820d-992569618883",
"shape": [
16.373056,
48.208333,
16.478611,
48.141111,
17.112778,
48.144722
]
}
I want to find the nearest route to a given point. For example: give me a route which is less than 25 km from point XY.
To be able to use the built-in geospatial query functions in Azure Cosmos DB, I need to make some changes to the document structure. My first attempt was to use the LineString type.
{
"id": "9fc9b1e9-6062-4c65-820d-992569618883",
"shape": {
"type": "LineString",
"coordinates": [
[
16.373056,
48.208333
],
[
16.478611,
48.141111
],
[
17.112778,
48.144722
]
]
}
}
Then I query SELECT tf.id, ST_DISTANCE(tf.shape, {type: "Point", "coordinates": [16.6475, 48.319444]}) FROM tf WHERE ST_DISTANCE(tf.shape, {type: "Point", "coordinates": [16.6475, 48.319444]}) < 25000, with the following result.
[
{
"id": "9fc9b1e9-6062-4c65-820d-992569618883",
"$1": 19683.798772898
}
]
Based on my research, it seems plausible that ST_DISTANCE found a point on the route which is less than 25 km away.
When I have a large document with many points (around 15,000), the result is always []. It is another dataset, so the numbers are different.
SELECT tf.id, ST_DISTANCE(tf.shape, {type: "Point", "coordinates": [10.09, 52.831667]}) FROM tf WHERE ST_DISTANCE(tf.shape, {type: "Point", "coordinates": [10.09, 52.831667]}) < 10000 returns [].
What I tried next was to wrap every point in its own type and put them all in an array.
{
"id": "265de514-8995-4976-aeca-1f5d0ab0931d",
"shape": [
{
"type": "Point",
"coordinates": [
9.38626,
51.01587
]
},
{
"type": "Point",
"coordinates": [
9.38829,
51.01533
]
},
{
"type": "Point",
"coordinates": [
9.38853,
51.01554
]
}
...another set of 15000 points
]
}
When I execute a query like SELECT tf.id, locations.coordinates, ST_DISTANCE(locations, {type: "Point", "coordinates": [10.09, 52.831667]}) FROM tf JOIN locations IN tf.shape WHERE ST_DISTANCE(locations, {type: "Point", "coordinates": [10.09, 52.831667]}) < 10000, it returns all points on the route under 10 km.
[
{
"id": "265de514-8995-4976-aeca-1f5d0ab0931d",
"coordinates": [
9.97907,
52.77248
],
"$1": 9967.70776520528
},
{
"id": "265de514-8995-4976-aeca-1f5d0ab0931d",
"coordinates": [
9.97908,
52.77274
],
"$1": 9948.088917723748
}
...another set of points under 10 km
]
Am I using ST_DISTANCE correctly, and if so, why don't I get any results? Are there any service limitations? If not, what is the correct way to implement this functionality? I see the possibility of using the array of points, but it seems somewhat clunky.
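One diagnostic that may be worth running first (a hedged suggestion, not verified against this dataset): as far as I know, Cosmos DB treats spatial functions over GeoJSON it considers invalid as undefined, so an invalid LineString simply drops out of the result set instead of raising an error. ST_ISVALIDDETAILED reports whether a stored geometry is valid and, if not, why:

SELECT tf.id, ST_ISVALIDDETAILED(tf.shape) AS validity
FROM tf

If the large 15,000-point LineString comes back as invalid, that would explain the empty result while the per-point query still works.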
Example JSON data:
{
"data": [
{
"place": "FM346",
"id": [
"7_day_A",
"7_day_B",
"7_day_C",
"7_day_D"
],
"values": [
0,
30,
23,
43
]
},
{
"place": "LH210",
"id": [
"1_day_A",
"1_day_B",
"1_day_C",
"1_day_D"
],
"values": [
4,
45,
100,
9
]
}
]
}
What I need to transform it into:
{
"data": [
{
"place": "FM346",
"7_day_A": {
"value": 0
},
"7_day_B": {
"value": 30
},
"7_day_C": {
"value": 23
},
"7_day_D": {
"value": 43
}
},
{
"place": "LH210",
"1_day_A": {
"value": 4
},
"1_day_B": {
"value": 45
},
"1_day_C": {
"value": 100
},
"1_day_D": {
"value": 9
}
}
]
}
I have tried this:
{
data:[.data |.[]|
{
place: (.place),
(.id[]):
{
value: (.values[])
}
}]
}
(in jqplay: https://jqplay.org/s/f4BBtN9gwmp)
and this:
{
data:[.data |.[]|
{
place: (.place),
test:
[{
(.id[]):
{
value: (.values[])
}
}]
}]
}
(in jqplay: https://jqplay.org/s/pKIvQe1CzgX)
but they aren't grouped in the way I wanted, and each value gets assigned to every id rather than only to the corresponding one.
I have been trying for some time now, but I'm new to jq and have no idea how to transform it this way. Thanks in advance for any answers.
You can use transpose here, which plays a key role in converting the arrays to key/value pairs:
.data[] |= {place} +
([ .id, .values ] | transpose | map({(.[0]): { value: .[1] } }) | add)
The solution works by transposing the array-of-arrays [ .id, .values ], i.e. converting
[["7_day_A","7_day_B","7_day_C","7_day_D"],[0,30,23,43]]
[["1_day_A","1_day_B","1_day_C","1_day_D"],[4,45,100,9]]
to
[["7_day_A",0],["7_day_B",30],["7_day_C",23],["7_day_D",43]]
[["1_day_A",4],["1_day_B",45],["1_day_C",100],["1_day_D",9]]
With that transformation done, we construct an object for each pair whose key is the zeroth element and whose value is an object wrapping the first element, then combine the results together with add.
Demo - jqplay
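To run the same filter outside jqplay, a command-line invocation could look like this (assuming the input above is saved as input.json):

jq '.data[] |= {place} +
  ([ .id, .values ] | transpose | map({(.[0]): { value: .[1] } }) | add)' input.json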
I have a JSON file that has GeoJSON feature collections nested inside of it.
Is it possible to read in the JSON file using jsonlite::read_json(), extract the GeoJSON bits, and then convert the resulting list to an sf object? The alternative is to write the list back to JSON (text) and read the GeoJSON using a package like geojsonio.
This is what my JSON code looks like:
{
"all": [
{
"type": "Feature",
"geometry": {
"type": "GeometryCollection",
"geometries": [
{
"type": "Point",
"coordinates": [
-75.155727,
39.956318
]
},{
"type": "LineString",
"coordinates": [
[
-75.15567895337301,
39.95653558798881
],[
-75.15575995337292,
39.95616931624319
]
]
},{
"type": "Point",
"coordinates": [
-75.15566,
39.956432
]
}
]
},
"properties": {
# properties
}
},{
# more features of mixed type
}
]
}
perhaps
x <- '{
"all": [
{
"type": "Feature",
"geometry": {
"type": "GeometryCollection",
"geometries": [
{
"type": "Point",
"coordinates": [
-75.155727,
39.956318
]
},{
"type": "LineString",
"coordinates": [
[
-75.15567895337301,
39.95653558798881
],[
-75.15575995337292,
39.95616931624319
]
]
},{
"type": "Point",
"coordinates": [
-75.15566,
39.956432
]
}
]
},
"properties": null
}
]
}'
sf::st_read(jqr::jq(x, ".all[]"))
(string edited to be valid JSON)
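If you would rather go through jsonlite, as the question suggests, a rough sketch of that round-trip might look like the following (the file name data.json and the wrapping into a FeatureCollection are assumptions, not part of the original post):

library(jsonlite)
library(sf)

# read the whole file as a nested list
lst <- read_json("data.json")

# rebuild a FeatureCollection from the "all" element and serialise it back to
# JSON text; digits = NA keeps full coordinate precision
fc <- toJSON(
  list(type = "FeatureCollection", features = lst$all),
  auto_unbox = TRUE,
  digits = NA
)

# sf/GDAL can read GeoJSON handed over as a plain string
sf_obj <- st_read(as.character(fc), quiet = TRUE)

Whether this is preferable to the jqr one-liner above mostly depends on whether you need the rest of the JSON file as an R list anyway.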
I want to work on GeoJSON data in the below-mentioned format:
{ "id": 1,
"geometry":
{ "type": "Point",
"coordinates": [
-3.706,
40.3],
"properties": {"appuserid": "5b46-7d3c-48a6-9c08-cc894",
"eventtype": "location",
"devicedate": "2016-06-08T07:25:21",
"date": "2016-06-08T07:25:06.507",
"location": {
"building": "2",
"floor": "0",
"elevation": ""
}}}
The problem is I want to use a "where" clause on "appuserid" and select the matching records for processing. I don't know how to do it. I have already saved the data from MongoDB in a data frame.
Right now I am trying to do it as follows:
library(sqldf)
sqldf("SELECT * FROM d WHERE d$properties$appuserid = '0000-0000-0000-0000'")
But it gives an error.
Error: Only lists of raw vectors are currently supported
The code is below:
library(jsonlite);
con <- mongo(collection = "geodata", db = "MongoDb", url = "mongodb://192.168.26.18:27017", verbose = FALSE, options = ssl_options());
d <- con$find();
library(jqr)
jq(d, '.features[] | select(d$properties$appuserid == "5b46-7d3c-48a6-9c08-cc894")')
Error : Error in jq.default(d, ".features[] | select(d$properties$appuserid == \"5b46-7d3c-48a6-9c08-cc894\")") :
jq method not implemented for data.frame.
jqr is one option; it's an R client for jq (https://stedolan.github.io/jq/).
x <- '{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"properties": {
"population": 200
},
"geometry": {
"type": "Point",
"coordinates": [
10.724029,
59.926807
],
"properties": {
"appuserid": "5b46-7d3c-48a6-9c08-cc894"
}
}
},
{
"type": "Feature",
"properties": {
"population": 600
},
"geometry": {
"type": "Point",
"coordinates": [
10.715789,
59.904778
],
"properties": {
"appuserid": "c7e866a7-e32d-4dc2-adfd-c2ca065b25ce"
}
}
}
]
}'
library(jqr)
jq(x, '.features[] | select(.geometry.properties.appuserid == "5b46-7d3c-48a6-9c08-cc894")')
returns
{
"type": "Feature",
"properties": {
"population": 200
},
"geometry": {
"type": "Point",
"coordinates": [
10.724029,
59.926807
],
"properties": {
"appuserid": "5b46-7d3c-48a6-9c08-cc894"
}
}
}
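If you need the matching feature back as an R object rather than as a JSON string, the jq() output can be handed to jsonlite (a small sketch building on the x defined above; as written it assumes a single matching feature):

library(jqr)
library(jsonlite)

matched <- jq(x, '.features[] | select(.geometry.properties.appuserid == "5b46-7d3c-48a6-9c08-cc894")')

# jq() returns JSON text; parse it back into an R list
feature <- fromJSON(as.character(matched))
feature$geometry$coordinates
#> [1] 10.724029 59.926807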
I am trying to create a .json file from a string of coordinates to display. I can get to the point of creating the file, but the JSON is not correct. The code follows:
json="10,10;10,5;5,5;5,10"
List<Coords> eList = new List<Coords>();
Coords d = new Coords();
d.type = "Polygon";
d.coordinates = Newtonsoft.Json.JsonConvert.DeserializeObject(json);
List<def> deflist = new List<def>();
def f = new def();
f.type = "GeometryCollection";
f.geometries = d;
The results are:
{
"type": "GeometryCollection",
"geometries": {
"type": "Polygon",
"coordinates": [
[
[
10,
10
],
[
10,
5
],
[
5,
5
],
[
5,
10
]
]
]
}
}
It should look like this:
{
"type": "GeometryCollection",
"geometries": {
"type": "Polygon",
"coordinates": [
[[10,10],[10,5],[5,5],[5,10]]
]
}
}
The coordinates are indented and formatted in a way I can't understand. Any suggestions would be greatly appreciated.
The file is being generated to be used with the Telerik RadMap control.
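As far as I can tell, the two JSON snippets above are structurally identical and differ only in whitespace: the one-number-per-line layout is simply what Newtonsoft's Formatting.Indented produces for nested arrays. Below is a rough sketch of one way to build and serialize the same structure; the Coords and def class definitions are guesses (they are not shown in the question), and Formatting.None is used to keep the output on one line. The mixed style in the desired output, with indented objects but compact coordinate arrays, would need a custom JsonConverter.

using System;
using System.Linq;
using Newtonsoft.Json;

// Assumed shapes of the question's classes (not shown in the original post).
class Coords
{
    public string type { get; set; }
    public object coordinates { get; set; }
}

class def
{
    public string type { get; set; }
    public Coords geometries { get; set; }
}

class Program
{
    static void Main()
    {
        // Parse the "x,y;x,y;..." string into one ring of [x, y] pairs.
        // (Whole numbers will serialize as 10.0 here, since they are parsed as doubles.)
        string json = "10,10;10,5;5,5;5,10";
        double[][] ring = json.Split(';')
                              .Select(p => p.Split(',').Select(double.Parse).ToArray())
                              .ToArray();

        var polygon = new Coords
        {
            type = "Polygon",
            // A GeoJSON Polygon is an array of rings, hence the extra wrapping array.
            coordinates = new[] { ring }
        };

        var collection = new def { type = "GeometryCollection", geometries = polygon };

        // Formatting.None keeps everything on one line;
        // Formatting.Indented gives the one-number-per-line layout shown above.
        Console.WriteLine(JsonConvert.SerializeObject(collection, Formatting.None));
    }
}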