This is a beginner question, but I have a set of coordinate points formatted like this:
[ [ -75.526844, 39.655713 ], [ -75.526344, 39.656413 ], [ -75.522343, 39.660813 ], [ -75.518343, 39.663913 ], [ -75.514643, 39.668613 ], [ -75.511743, 39.674313 ], [ -75.509342, 39.685313 ], [ -75.509742, 39.686113 ], [ -75.509042, 39.694513 ], [ -75.507162, 39.696961 ], [ -75.504042, 39.698313 ], [ -75.496241, 39.701413 ], [ -75.491341, 39.711113 ], [ -75.488553, 39.714833 ], [ -75.485241, 39.715813 ], [ -75.483141, 39.715513 ], [ -75.481741, 39.714546 ], [ -75.478940, 39.713813 ], [ -75.477640, 39.715013 ], [ -75.476888, 39.718337 ], [ -75.477432, 39.720561 ], [ -75.477240, 39.724713 ], [ -75.475440, 39.728713 ], [ -75.475384, 39.731057 ], [ -75.474168, 39.735473 ], [ -75.469239, 39.743613 ], [ -75.466263, 39.750737 ], [ -75.466249, 39.750769 ], [ -75.463039, 39.758313 ], [ -75.463339, 39.761213 ]]
I want to make a dataframe that has one column for longitude and one for latitude for this data. How should I go about doing this?
Well, using the basics of R, you can use the following implementation. I read the coordinates in as a single string; if they are in a file, you could read them into a string with R's readChar().
text = "[
[ -75.526844, 39.655713 ], [ -75.526344, 39.656413 ],
[ -75.522343, 39.660813 ], [ -75.518343, 39.663913 ],
[ -75.514643, 39.668613 ], [ -75.511743, 39.674313 ],
[ -75.509342, 39.685313 ], [ -75.509742, 39.686113 ],
[ -75.509042, 39.694513 ], [ -75.507162, 39.696961 ],
[ -75.504042, 39.698313 ], [ -75.496241, 39.701413 ],
[ -75.491341, 39.711113 ], [ -75.488553, 39.714833 ],
[ -75.485241, 39.715813 ], [ -75.483141, 39.715513 ],
[ -75.481741, 39.714546 ], [ -75.478940, 39.713813 ],
[ -75.477640, 39.715013 ], [ -75.476888, 39.718337 ],
[ -75.477432, 39.720561 ], [ -75.477240, 39.724713 ],
[ -75.475440, 39.728713 ], [ -75.475384, 39.731057 ],
[ -75.474168, 39.735473 ], [ -75.469239, 39.743613 ],
[ -75.466263, 39.750737 ], [ -75.466249, 39.750769 ],
[ -75.463039, 39.758313 ], [ -75.463339, 39.761213 ]]"
library(stringr)
# collapse newlines to spaces, then split on ", "
s = str_split(gsub('\n', ' ', text), ', ')[[1]]
# strip the square brackets and any remaining whitespace
s = gsub('\\[|\\]', '', s)
s = str_trim(s)
# fill a two-column matrix row by row and turn it into a data frame
df = data.frame(matrix(s, ncol = 2, byrow = TRUE))
colnames(df) = c('longitude', 'latitude')
head(df)
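Note (my addition, not part of the original approach): matrix() keeps the split values as character strings, so both columns of df come out as character (or factor on older R versions). If you need numeric coordinates, you can convert them afterwards, continuing from the code above:
# convert both columns from character to numeric
df[] <- lapply(df, as.numeric)
str(df)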
Here's another possibility with the tidyverse. I read the data in as a string, extract only the digits, ., and - characters, convert the values to numeric, and put them into a single data frame column. Next, I create an index, ind, that alternates between 1 and 2 (these become the two output columns), then add a row number within each group, pivot the data wide into two columns, and rename them.
text <- "[
[ -75.526844, 39.655713 ], [ -75.526344, 39.656413 ],
[ -75.522343, 39.660813 ], [ -75.518343, 39.663913 ],
[ -75.514643, 39.668613 ], [ -75.511743, 39.674313 ],
[ -75.509342, 39.685313 ], [ -75.509742, 39.686113 ],
[ -75.509042, 39.694513 ], [ -75.507162, 39.696961 ],
[ -75.504042, 39.698313 ], [ -75.496241, 39.701413 ],
[ -75.491341, 39.711113 ], [ -75.488553, 39.714833 ],
[ -75.485241, 39.715813 ], [ -75.483141, 39.715513 ],
[ -75.481741, 39.714546 ], [ -75.478940, 39.713813 ],
[ -75.477640, 39.715013 ], [ -75.476888, 39.718337 ],
[ -75.477432, 39.720561 ], [ -75.477240, 39.724713 ],
[ -75.475440, 39.728713 ], [ -75.475384, 39.731057 ],
[ -75.474168, 39.735473 ], [ -75.469239, 39.743613 ],
[ -75.466263, 39.750737 ], [ -75.466249, 39.750769 ],
[ -75.463039, 39.758313 ], [ -75.463339, 39.761213 ]]"
library(tidyverse)
data.frame(Column = as.numeric(str_extract_all(text, "[0-9.-]+")[[1]])) %>%
group_by(ind = rep(1:2, length.out = n())) %>%
mutate(rn = row_number()) %>%
ungroup %>%
pivot_wider(names_from = ind, values_from = Column) %>%
select(-rn) %>%
rename("longitude" = 1, "latitude" = 2)
Output
longitude latitude
1 -75.52684 39.65571
2 -75.52634 39.65641
3 -75.52234 39.66081
4 -75.51834 39.66391
5 -75.51464 39.66861
6 -75.51174 39.67431
7 -75.50934 39.68531
8 -75.50974 39.68611
9 -75.50904 39.69451
10 -75.50716 39.69696
11 -75.50404 39.69831
12 -75.49624 39.70141
13 -75.49134 39.71111
14 -75.48855 39.71483
15 -75.48524 39.71581
16 -75.48314 39.71551
17 -75.48174 39.71455
18 -75.47894 39.71381
19 -75.47764 39.71501
20 -75.47689 39.71834
21 -75.47743 39.72056
22 -75.47724 39.72471
23 -75.47544 39.72871
24 -75.47538 39.73106
25 -75.47417 39.73547
26 -75.46924 39.74361
27 -75.46626 39.75074
28 -75.46625 39.75077
29 -75.46304 39.75831
30 -75.46334 39.76121
If you have access to the GeoJSON itself, then it is a little easier to convert. For example, if the data is hosted at a URL, then you could do something like the code below. You can also save the Excel file with the data as a .txt, then use that to bring in the data (e.g., geojsonsf::geojson_sf("~/Downloads/Alaska.txt")).
library(geojsonsf)
library(sf)
sf <- geojsonsf::geojson_sf("https://raw.githubusercontent.com/glynnbird/usstatesgeojson/master/california.geojson")
# Or if you have a local file, then you could put that here instead, e.g., geojsonsf::geojson_sf("~/Downloads/Alaska.geojson")
as.data.frame( sf::st_coordinates( sf ) ) %>%
select(1:2) %>%
rename("longitude" = 1, "latitude" = 2) %>%
head()
longitude latitude
1 -120.2485 33.99933
2 -120.2474 34.00191
3 -120.2387 34.00759
4 -120.2300 34.01014
5 -120.2213 34.01037
6 -120.2085 34.00565
Related
I want to convert a feature file to JSON so that I can pass it to a JavaScript function in an RMD file.
However, the toJSON function seems to flatten it and remove many of the fields and structures, as shown below. How can I convert it and keep it intact, as happens when I write it to a file using sf::st_write?
url <- 'https://opendata.arcgis.com/api/v3/datasets/bf9d32b1aa9941af84e6c2bf0c54b1bb_0/downloads/data?format=geojson&spatialRefId=4326'
ukWardShapes <- sf::st_read(url) %>%
head(2)
# Looks OK when written out
sf::st_write(ukWardShapes, "wardShapes.geojson")
# Converting to JSON with toJSON seems to drop the other top-level fields (type, name, crs), lists the objects from the
# features array without their type, and puts all of the properties fields at the top level of each object.
json_data <- jsonlite::toJSON(ukWardShapes)
# I want to do this as I need to pass it to javascript within an RMD like this
htmltools::tags$script(paste0("var ukWardShapes = ", json_data, ";"))
# Output from st_write - with all the objects and fields listed properly
{
"type": "FeatureCollection",
"name": "wardShapes",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "OBJECTID": 1, "WD21CD": "E05000026", "WD21NM": "Abbey", "WD21NMW": " ", "BNG_E": 544433, "BNG_N": 184376, "LONG": 0.081276, "LAT": 51.53981, "SHAPE_Length": 0.071473941285613768, "SHAPE_Area": 0.00015225110241064838 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 0.093628520000038, 51.53767283600007 ], [ 0.08163128800004, 51.539165094000055 ], [ 0.085507102000065, 51.537043160000053 ], [ 0.075954208000041, 51.533595714000057 ], [ 0.07333983500007, 51.537621201000036 ], [ 0.068771363000053, 51.536206993000064 ], [ 0.068303699000069, 51.544253423000043 ], [ 0.068361695000021, 51.544390390000046 ], [ 0.08006389600007, 51.544772356000067 ], [ 0.093628520000038, 51.53767283600007 ] ] ] ] } },
{ "type": "Feature", "properties": { "OBJECTID": 2, "WD21CD": "E05000027", "WD21NM": "Alibon", "WD21NMW": " ", "BNG_E": 549247, "BNG_N": 185196, "LONG": 0.150987, "LAT": 51.545921, "SHAPE_Length": 0.074652046036690151, "SHAPE_Area": 0.00017418950412786572 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 0.161601914000073, 51.543327754000074 ], [ 0.147931795000034, 51.541598449000048 ], [ 0.140256898000075, 51.54111542000004 ], [ 0.13420572800004, 51.540716652000071 ], [ 0.131925236000029, 51.543763455000033 ], [ 0.14633003900002, 51.546332889000041 ], [ 0.142816723000067, 51.550973604000035 ], [ 0.156378253000071, 51.551020271000027 ], [ 0.161601914000073, 51.543327754000074 ] ] ] ] } }
]
}
# Output from toJSON, which seems to have a lot of the structure removed. Note, I'm not
# concerned about it being pretty and separated into lines
[{
"OBJECTID":1, "WD21CD":"E05000026", "WD21NM":"Abbey", "WD21NMW":" ", "BNG_E":544433, "BNG_N":184376, "LONG":0.0813, "LAT":51.5398, "SHAPE_Length":0.0715, "SHAPE_Area":0.0002, "geometry":{
"type":"MultiPolygon", "coordinates":[[[[0.0936, 51.5377], [0.0816, 51.5392], [0.0855, 51.537], [0.076, 51.5336], [0.0733, 51.5376], [0.0688, 51.5362], [0.0683, 51.5443], [0.0684, 51.5444], [0.0801, 51.5448], [0.0936, 51.5377]]]]
}
}, {
"OBJECTID":2, "WD21CD":"E05000027", "WD21NM":"Alibon", "WD21NMW":" ", "BNG_E":549247, "BNG_N":185196, "LONG":0.151, "LAT":51.5459, "SHAPE_Length":0.0747, "SHAPE_Area":0.0002, "geometry":{
"type":"MultiPolygon", "coordinates":[[[[0.1616, 51.5433], [0.1479, 51.5416], [0.1403, 51.5411], [0.1342, 51.5407], [0.1319, 51.5438], [0.1463, 51.5463], [0.1428, 51.551], [0.1564, 51.551], [0.1616, 51.5433]]]]
}
}]
As per #SymbolixAU's comment above, the answer is to use geojsonsf::sf_geojson() instead of jsonlite::toJSON(), because GeoJSON is a specific JSON structure for spatial data and it needs a dedicated serializer.
So my line of code should be:
json_data <- geojsonsf::sf_geojson(ukWardShapes)
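Putting it together, a minimal sketch (just the pieces from the question with that one line swapped in) would be:
library(sf)
library(geojsonsf)
library(htmltools)
url <- 'https://opendata.arcgis.com/api/v3/datasets/bf9d32b1aa9941af84e6c2bf0c54b1bb_0/downloads/data?format=geojson&spatialRefId=4326'
ukWardShapes <- head(sf::st_read(url), 2)
# sf_geojson() preserves the FeatureCollection structure that toJSON() flattens
json_data <- geojsonsf::sf_geojson(ukWardShapes)
# embed in the RMD output for the JavaScript side to consume
htmltools::tags$script(paste0("var ukWardShapes = ", json_data, ";"))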
Reporting API v4
I am a developer and I manage my clients' Google AdWords and Analytics accounts. I have been using the AdWords and Analytics reporting APIs for almost a year now.
I am also using the Query Explorer, https://ga-dev-tools.appspot.com/query-explorer/, to compare whether I have retrieved the right amount of data.
I don't know if it's an error or not, but it's acting strangely.
Try number 1, using https://ga-dev-tools.appspot.com/query-explorer/
I tried to add 2 metrics and 7 dimensions. This account ID contains 1 million rows in only 1 month; I know this because I retrieved 1 million rows for the range July 25, 2018 - August 16, 2018.
Then, here's the interesting part: I ran the query again with the same parameters and it retrieved 5999 results. I ran it again and it returned 1 million. The results keep changing. I thought it was an error in my code, but it also happens in the query builder.
What do you think, is it a bug or not?
You can try this yourself if you have more than a million rows of data.
I know this isn't strictly related to coding, but Google Analytics doesn't have its own forums the way AdWords does.
Try number 2, using this link: https://developers.google.com/analytics/devguides/reporting/core/v4/rest/v4/reports/batchGet
This is my request:
{
"reportRequests": [
{
"dateRanges": [
{
"endDate": "2018-08-16",
"startDate": "2018-07-16"
}
],
"dimensions": [
{
"name": "ga:dimension2"
},
{
"name": "ga:dimension3"
},
{
"name": "ga:dimension1"
},
{
"name": "ga:adPlacementDomain"
}
],
"pageSize": 5,
"viewId": "********",
"samplingLevel": "LARGE",
"metrics": [
{
"expression": "ga:entrances"
},
{
"expression": "ga:newUsers"
}
],
"includeEmptyRows": true
}
]
}
The returned rowCount is sometimes 2111 and sometimes 1000000.
This is my response JSON with the 1 million row count:
{
"reports": [
{
"columnHeader": {
"dimensions": [
"ga:dimension2",
"ga:dimension3",
"ga:dimension1",
"ga:adPlacementDomain"
],
"metricHeader": {
"metricHeaderEntries": [
{
"name": "ga:entrances",
"type": "INTEGER"
},
{
"name": "ga:newUsers",
"type": "INTEGER"
}
]
}
},
"data": {
"rows": [
{
"dimensions": [
"(other)",
"(other)",
"(other)",
"(other)"
],
"metrics": [
{
"values": [
"120834",
"68730"
]
}
]
},
{
"dimensions": [
"1000025873.1532426892",
"1532426891790.o9z84x",
"2018-07-24T11:08:15.449+01:00",
"unknown"
],
"metrics": [
{
"values": [
"0",
"0"
]
}
]
},
{
"dimensions": [
"1000025873.1532426892",
"1532426891790.o9z84x",
"2018-07-24T11:08:17.589+01:00",
"unknown"
],
"metrics": [
{
"values": [
"0",
"0"
]
}
]
},
{
"dimensions": [
"1000025873.1532426892",
"1532426891790.o9z84x",
"2018-07-24T11:08:31.809+01:00",
"unknown"
],
"metrics": [
{
"values": [
"0",
"0"
]
}
]
},
{
"dimensions": [
"1000025873.1532426892",
"1532427045552.p38pk78",
"2018-07-24T11:09:06.43+01:00",
"unknown"
],
"metrics": [
{
"values": [
"0",
"0"
]
}
]
}
],
"totals": [
{
"values": [
"158626",
"90225"
]
}
],
"rowCount": 1000000,
"minimums": [
{
"values": [
"0",
"0"
]
}
],
"maximums": [
{
"values": [
"120834",
"68730"
]
}
],
"isDataGolden": true
},
"nextPageToken": "5"
}
]
}
Another example response, when I have fewer than 1 million results:
{
"reports": [
{
"columnHeader": {
"dimensions": [
"ga:dimension2",
"ga:dimension3",
"ga:dimension1",
"ga:adPlacementDomain"
],
"metricHeader": {
"metricHeaderEntries": [
{
"name": "ga:entrances",
"type": "INTEGER"
},
{
"name": "ga:newUsers",
"type": "INTEGER"
}
]
}
},
"data": {
"rows": [
{
"dimensions": [
"1002211166.1531434756",
"1531762918308.fjnj7pa6",
"2018-07-16T18:41:58.307+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
},
{
"dimensions": [
"1002211166.1531434756",
"1531771001486.jawfrpz8",
"2018-07-16T20:56:41.482+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
},
{
"dimensions": [
"1002211166.1531434756",
"1531772475507.7n4w2qzb",
"2018-07-16T21:21:15.503+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
},
{
"dimensions": [
"1002211166.1531434756",
"1531859165986.zl7we6a5",
"2018-07-17T21:26:05.977+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
},
{
"dimensions": [
"1002211166.1531434756",
"1531859632678.dz7hccsa",
"2018-07-17T21:33:52.673+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
},
{
"dimensions": [
"1002211166.1531434756",
"1531861026792.kw71ngx9",
"2018-07-17T21:42:31.667+01:00",
"mobileapp::2-com.forsbit.spider"
],
"metrics": [
{
"values": [
"1",
"0"
]
}
]
}
],
"totals": [
{
"values": [
"2111",
"233"
]
}
],
"rowCount": 2112,
"minimums": [
{
"values": [
"0",
"0"
]
}
],
"maximums": [
{
"values": [
"1",
"1"
]
}
],
"isDataGolden": true
},
"nextPageToken": "6"
}
]
}
I am assuming that you have kept all the queries intact; double-check just to make sure.
The second step would be to check for sampling: look at the samplingSpaceSizes and samplesReadCounts fields in the response. If these fields are not present, no sampling was introduced.
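If you are reading the response into R, a quick way to check is a small helper like the one below (a hypothetical sketch, assuming response_json holds the raw JSON string returned by reports.batchGet):
library(jsonlite)
# returns TRUE for each report in the response that was built on sampled data
is_sampled <- function(response_json) {
  resp <- jsonlite::fromJSON(response_json, simplifyVector = FALSE)
  vapply(resp$reports, function(report) {
    !is.null(report$data$samplesReadCounts) ||
      !is.null(report$data$samplingSpaceSizes)
  }, logical(1))
}
# is_sampled(response_json)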
I am having a problem with GeoJSON and the Google Maps API.
How do I plot a MultiLineString with ...
1) each line having its own color,
2) each line having its own properties, and
3) each line being clickable, showing an info window with its properties?
Sample JavaScript:
var dataGEOJSON=[];
function LoadMyGEOJSON(key)
{
dataGEOJSON[key] = new google.maps.Data();
dataGEOJSON[key].loadGeoJson('GetLatLngGEOJSON.aspx?key=' + key);
dataGEOJSON[key].setMap(map);
}
The reason for the array is so that I can keep track of which keys have been loaded, as the user can load or unload keys from the map in their user interface.
Sample GeoJSON:
{
"type":"FeatureCollection",
"features":[
{
"type":"Feature",
"properties":{
"Key":"007",
"Line1":"<this is line 1 desc>",
"Line2":"<this is line 2 desc>",
"Line3":"<this is line 3 desc>",
"Line4":"<this is line 4 desc>",
"Line5":"<this is line 5 desc>",
"Line6":"<this is line 6 desc>",
"Line7":"<this is line 7 desc>"
},
"geometry":{
"type":"MultiLineString",
"coordinates":[
[
[
-79.7066775992172,
43.6462189758028
],
[
-79.7066939830514,
43.6461985074393
],
[
-79.7066378408013,
43.6461605607267
],
[
-79.7066097743239,
43.6461403201406
],
[
-79.7064548987452,
43.6460566901385
],
[
-79.7063956692058,
43.6460219372408
]
],
[
[
-79.7063956692058,
43.6460219372408
],
[
-79.7063852230813,
43.646033518772
],
[
-79.7063166536656,
43.6461172032157
],
[
-79.7064079964431,
43.6461815121163
],
[
-79.7060589374119,
43.646518038823
],
[
-79.7060054211382,
43.6465600820263
]
],
[
[
-79.7060054211382,
43.6465600820263
],
[
-79.7052588394648,
43.6471387374653
],
[
-79.7048261689477,
43.6474817773536
],
[
-79.7043239742464,
43.6474733374216
],
[
-79.7041128202014,
43.6476608859429
],
[
-79.703901284037,
43.6478509811517
],
[
-79.7030237720306,
43.6486568653637
],
[
-79.7029267563095,
43.6486965678914
]
],
[
[
-79.7029267563091,
43.6486965678909
],
[
-79.7028702942784,
43.6487267896104
],
[
-79.7028019515802,
43.6487884267869
]
],
[
[
-79.6949803205847,
43.6554816862022
],
[
-79.6946328513629,
43.6552226727517
],
[
-79.6945439505269,
43.6551559442016
]
],
[
[
-79.6945439505269,
43.6551559442016
],
[
-79.694066895687,
43.654797865403
],
[
-79.6934193769725,
43.6543136334174
],
[
-79.6924271403494,
43.6535711706703
],
[
-79.6920107752268,
43.6532605761111
],
[
-79.6919943721596,
43.6532604342567
]
],
[
[
-79.6919943721591,
43.6532604342562
],
[
-79.6914713751595,
43.6536247980162
],
[
-79.6911279733848,
43.6533992300817
]
],
[
[
-79.6959960003114,
43.6400049378117
],
[
-79.6960571265341,
43.6400850012767
],
[
-79.6961629127738,
43.640012603549
],
[
-79.6962380127401,
43.6399612066507
],
[
-79.6964991971409,
43.6401581219518
],
[
-79.6965504313169,
43.6403222661559
]
],
[
[
-79.6965504313164,
43.6403222661554
],
[
-79.6963411179014,
43.6405181683405
]
],
[
[
-79.6973635087052,
43.6393434514529
],
[
-79.6975152035274,
43.6394534198075
],
[
-79.6974394413309,
43.6393984974797
],
[
-79.6977214702725,
43.6396029481089
]
],
[
[
-79.7037279098659,
43.6441816734685
],
[
-79.7038116627627,
43.6442425378655
],
[
-79.7043663390943,
43.6446488071586
],
[
-79.7048680167224,
43.645024186195
],
[
-79.7053904212546,
43.6454260322038
],
[
-79.7059251921243,
43.6458354437457
],
[
-79.7065612964782,
43.6461582999466
],
[
-79.7065835802603,
43.6461492141531
],
[
-79.7066775992172,
43.6462189758028
]
],
[
[
-79.6973112420145,
43.6393143090171
],
[
-79.6972862146857,
43.6393319663604
],
[
-79.6971872210635,
43.6392588377729
],
[
-79.6968127129063,
43.6395272639245
],
[
-79.6966669835105,
43.6395263250713
],
[
-79.6960609039152,
43.6399594999986
]
],
[
[
-79.6960609039152,
43.6399594999986
],
[
-79.6960181044663,
43.639990047741
]
],
[
[
-79.7032573776668,
43.6438035217788
],
[
-79.7032773647046,
43.6438182749637
],
[
-79.7032968727787,
43.6438287137889
]
],
[
[
-79.6982280590368,
43.6399996458065
],
[
-79.6983123840689,
43.6400655275078
],
[
-79.6988209765837,
43.6404574862051
],
[
-79.6997681798983,
43.6411574341786
],
[
-79.699722618622,
43.6411857721075
],
[
-79.7003863588722,
43.641677233397
],
[
-79.7008842734269,
43.6420484764154
],
[
-79.7014070294285,
43.6424381587765
],
[
-79.7024395226368,
43.643210754341
],
[
-79.7029823771679,
43.643602976107
],
[
-79.7032573776668,
43.6438035217788
]
]
]
}
}
]
}
This is the first time I am using GeoJSON, so I will need assistance with the GeoJSON format needed to add the properties and styles, plus the JavaScript and Google Maps API code.
If anyone can provide some code or a link to resources with detailed examples, that would be greatly appreciated.
One option would be to process the GeoJSON as it is loaded, creating google.maps.Polyline objects from each section of the line. Use a function closure for the click listener (the same technique commonly used for markers) and set the position of the infowindow to the first point of the polyline segment.
proof of concept fiddle
code snippet:
var colors = ["#FF0000", "#800000", "#00FF00", "#008000", "#0000FF", "#8A2BE2", "#A52A2A", "#DEB887", "#5F9EA0", "#000080", "#FFFF00", "#808000", "#FF00FF", "#800080", "#00FFFF", "#7FFFD4", "#008080", "#000000"];
var infowindow = new google.maps.InfoWindow();
function initialize() {
// Create a simple map.
features = [];
map = new google.maps.Map(document.getElementById('map-canvas'), {
zoom: 14,
center: {
lat: 43.65,
lng: -79.7
}
});
// process the loaded GeoJSON data.
var bounds = new google.maps.LatLngBounds();
google.maps.event.addListener(map.data, 'addfeature', function(e) {
if (e.feature.getGeometry().getType() === 'MultiLineString') {
var polys = e.feature.getGeometry().getArray();
for (var i = 0; i < polys.length; i++) {
  // one polyline per MultiLineString segment, each with its own color
  var poly = new google.maps.Polyline({
    map: map,
    path: polys[i].getArray(),
    strokeColor: colors[i % colors.length]
  });
  // function closure so each click handler keeps its own poly/index/feature
  google.maps.event.addListener(poly, 'click', (function(poly, i, feature) {
    return function() {
      // the sample data names its properties Line1..Line7, so offset the 0-based index
      infowindow.setContent("polyline " + i + "<br>" + feature.getProperty("Line" + (i + 1)));
      infowindow.setPosition(polys[i].getAt(0));
      infowindow.open(map);
    }
  })(poly, i, e.feature));
  for (var j = 0; j < polys[i].getLength(); j++) {
    bounds.extend(polys[i].getAt(j));
  }
}
map.fitBounds(bounds);
map.data.setMap(null);
} else if (e.feature.getGeometry().getType() === 'GeometryCollection') {
var polys = e.feature.getGeometry().getArray();
for (var i = 0; i < polys.length; i++) {
for (var j = 0; j < polys[i].getLength(); j++) {
bounds.extend(polys[i].getAt(j));
}
}
map.fitBounds(bounds);
}
});
map.data.addGeoJson(data);
}
google.maps.event.addDomListener(window, 'load', initialize);
var data = {
"type": "FeatureCollection",
"features": [{
"type": "Feature",
"properties": {
"Key": "007",
"Line1": "<this is line 1 desc>",
"Line2": "<this is line 2 desc>",
"Line3": "<this is line 3 desc>",
"Line4": "<this is line 4 desc>",
"Line5": "<this is line 5 desc>",
"Line6": "<this is line 6 desc>",
"Line7": "<this is line 7 desc>"
},
"geometry": {
"type": "MultiLineString",
"coordinates": [
[
[-79.7066775992172,
43.6462189758028
],
[-79.7066939830514,
43.6461985074393
],
[-79.7066378408013,
43.6461605607267
],
[-79.7066097743239,
43.6461403201406
],
[-79.7064548987452,
43.6460566901385
],
[-79.7063956692058,
43.6460219372408
]
],
[
[-79.7063956692058,
43.6460219372408
],
[-79.7063852230813,
43.646033518772
],
[-79.7063166536656,
43.6461172032157
],
[-79.7064079964431,
43.6461815121163
],
[-79.7060589374119,
43.646518038823
],
[-79.7060054211382,
43.6465600820263
]
],
[
[-79.7060054211382,
43.6465600820263
],
[-79.7052588394648,
43.6471387374653
],
[-79.7048261689477,
43.6474817773536
],
[-79.7043239742464,
43.6474733374216
],
[-79.7041128202014,
43.6476608859429
],
[-79.703901284037,
43.6478509811517
],
[-79.7030237720306,
43.6486568653637
],
[-79.7029267563095,
43.6486965678914
]
],
[
[-79.7029267563091,
43.6486965678909
],
[-79.7028702942784,
43.6487267896104
],
[-79.7028019515802,
43.6487884267869
]
],
[
[-79.6949803205847,
43.6554816862022
],
[-79.6946328513629,
43.6552226727517
],
[-79.6945439505269,
43.6551559442016
]
],
[
[-79.6945439505269,
43.6551559442016
],
[-79.694066895687,
43.654797865403
],
[-79.6934193769725,
43.6543136334174
],
[-79.6924271403494,
43.6535711706703
],
[-79.6920107752268,
43.6532605761111
],
[-79.6919943721596,
43.6532604342567
]
],
[
[-79.6919943721591,
43.6532604342562
],
[-79.6914713751595,
43.6536247980162
],
[-79.6911279733848,
43.6533992300817
]
],
[
[-79.6959960003114,
43.6400049378117
],
[-79.6960571265341,
43.6400850012767
],
[-79.6961629127738,
43.640012603549
],
[-79.6962380127401,
43.6399612066507
],
[-79.6964991971409,
43.6401581219518
],
[-79.6965504313169,
43.6403222661559
]
],
[
[-79.6965504313164,
43.6403222661554
],
[-79.6963411179014,
43.6405181683405
]
],
[
[-79.6973635087052,
43.6393434514529
],
[-79.6975152035274,
43.6394534198075
],
[-79.6974394413309,
43.6393984974797
],
[-79.6977214702725,
43.6396029481089
]
],
[
[-79.7037279098659,
43.6441816734685
],
[-79.7038116627627,
43.6442425378655
],
[-79.7043663390943,
43.6446488071586
],
[-79.7048680167224,
43.645024186195
],
[-79.7053904212546,
43.6454260322038
],
[-79.7059251921243,
43.6458354437457
],
[-79.7065612964782,
43.6461582999466
],
[-79.7065835802603,
43.6461492141531
],
[-79.7066775992172,
43.6462189758028
]
],
[
[-79.6973112420145,
43.6393143090171
],
[-79.6972862146857,
43.6393319663604
],
[-79.6971872210635,
43.6392588377729
],
[-79.6968127129063,
43.6395272639245
],
[-79.6966669835105,
43.6395263250713
],
[-79.6960609039152,
43.6399594999986
]
],
[
[-79.6960609039152,
43.6399594999986
],
[-79.6960181044663,
43.639990047741
]
],
[
[-79.7032573776668,
43.6438035217788
],
[-79.7032773647046,
43.6438182749637
],
[-79.7032968727787,
43.6438287137889
]
],
[
[-79.6982280590368,
43.6399996458065
],
[-79.6983123840689,
43.6400655275078
],
[-79.6988209765837,
43.6404574862051
],
[-79.6997681798983,
43.6411574341786
],
[-79.699722618622,
43.6411857721075
],
[-79.7003863588722,
43.641677233397
],
[-79.7008842734269,
43.6420484764154
],
[-79.7014070294285,
43.6424381587765
],
[-79.7024395226368,
43.643210754341
],
[-79.7029823771679,
43.643602976107
],
[-79.7032573776668,
43.6438035217788
]
]
]
}
}]
}
html,
body,
#map-canvas {
height: 100%;
margin: 0px;
padding: 0px;
width: 100%;
}
<script src="https://maps.googleapis.com/maps/api/js"></script>
<div id="map-canvas"></div>
I have two data frames, one that contains lat/long points and another that contains GeoJSON data used to draw multiple polygons.
The first dataframe (countyDF) is imported from a CSV and the second dataframe (basinData) is imported from GeoJSON using readLines() (should I be using getJSON() instead for the JSON data?).
i.e. (oversimplified lat and long values; let me know if a more realistic example would help)
countyDF
pointNum Lat Long
1 100 251
2 150 175
3 50 -330
4 -150 100
and GeoJSON formatted like this (basinData):
{
"type": "FeatureCollection",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "Basin_ID": "9-19", "Basin_Subb": "9-19", "Basin_Name": "TIA JUANA", "Subbasin_N": null }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -117.0679768595155, 32.574291366336219 ], [ -117.06716593683379, 32.573400317874729 ], [ -117.06341997467541, 32.569091397019029 ], [ -117.06194315333148, 32.566356289257612 ], [ -117.0590009947062, 32.565022113434004 ], [ -117.05426277324793, 32.561991889800737 ], [ -117.04796901017907, 32.559156827796407 ], [ -117.03949416454982, 32.555722526353854 ], [ -117.03658670964059, 32.55260623337746 ], [ -117.03547505795123, 32.551186682532702 ], [ -117.03106791328587, 32.547633284977714 ], [ -117.02453341519558, 32.542766715943806 ], [ -117.03096733611001, 32.542203418941845 ], [ -117.0324558562139, 32.542073586201752 ], [ -117.03444148063983, 32.541897693349327 ], [ -117.03508530679723, 32.541840746762325 ], [ -117.03893675722108, 32.541503138385245 ], [ -117.04563848478635, 32.540916090254136 ], [ -117.04690666072744, 32.540805072734159 ], [ -117.04975321345368, 32.540555662782488 ], [ -117.05487638996065, 32.540105955834157 ], [ -117.05619151994331, 32.539990441256123 ], [ -117.05848531771566, 32.539788923486533 ], [ -117.0631749250457, 32.54215137136817 ], [ -117.06680187749691, 32.543877383980934 ], [ -117.06927539222501, 32.545562859775892 ], [ -117.07152258989198, 32.546046045900979 ], [ -117.07489884859852, 32.54530496670057 ], [ -117.07773834685484, 32.54485919550919 ], [ -117.0794889459513, 32.544200153274701 ], [ -117.08117134135811, 32.543542769155188 ], [ -117.08203266401881, 32.542495002174064 ], [ -117.08180018990477, 32.541005184972832 ], [ -117.07878527133475, 32.53800290854668 ], [ -117.08430738067622, 32.537515913009408 ], [ -117.08445861115598, 32.541710736294391 ], [ -117.08701141119104, 32.543969332906386 ], [ -117.08926545491882, 32.544796327561301 ], [ -117.09253292944108, 32.545435592907872 ], [ -117.09470680391962, 32.545631754871643 ], [ -117.09647622272129, 32.545948915009404 ], [ -117.09856416256312, 32.545227266049267 ], [ -117.10023514667088, 32.543994836689315 ], [ -117.10068933038696, 32.542953304281674 ], [ -117.09977208759979, 32.541129535758976 ], [ -117.09846043769052, 32.539943208794561 ], [ -117.09742346564566, 32.538925042182839 ], [ -117.0947242456092, 32.536597809766022 ], [ -117.09829488087858, 32.536282964307077 ], [ -117.09840179566876, 32.536275072973709 ], [ -117.10107619525394, 32.538466493837575 ], [ -117.10364734517533, 32.538196700466642 ], [ -117.10522233351519, 32.538976382033773 ], [ -117.10697118628674, 32.538259847181273 ], [ -117.10895422266631, 32.535344375464277 ], [ -117.11407862537365, 32.537631294453931 ], [ -117.11625581912718, 32.537999378719483 ], [ -117.11827996690573, 32.537508348187259 ], [ -117.11981739848021, 32.536392491701257 ], [ -117.11950653557783, 32.534417856408112 ], [ -117.12372820778141, 32.534103225208959 ], [ -117.12463878866241, 32.534032917908206 ], [ -117.12483027265007, 32.538135448497222 ], [ -117.12502048372495, 32.539350678974635 ], [ -117.12507499529305, 32.540448930323947 ], [ -117.12528896501055, 32.54199741245435 ], [ -117.12534718931569, 32.542421432291746 ], [ -117.12599067680821, 32.544551370001955 ], [ -117.12618141788914, 32.545696646353761 ], [ -117.12645239828784, 32.546429360119511 ], [ -117.12685554422104, 32.548102449008979 ], [ -117.12737138919654, 32.549703348943893 ], [ -117.12763885987745, 32.550533107582829 ], [ -117.12774693079939, 32.550626279391835 ], [ -117.12774760184091, 32.550738637890753 ], [ 
-117.12880070942489, 32.553488055429675 ], [ -117.13066553380129, 32.556444612620524 ], [ -117.13079138058607, 32.556859005175724 ], [ -117.13090941562115, 32.557248309210252 ], [ -117.13144943066766, 32.558277136765511 ], [ -117.1318548691015, 32.55947068448608 ], [ -117.13255675068075, 32.56215226422006 ], [ -117.13271798929109, 32.563505590313213 ], [ -117.13279809835319, 32.563664669666316 ], [ -117.13291972036119, 32.564513801799826 ], [ -117.13304092465259, 32.565358954253092 ], [ -117.13317600342353, 32.566483563557291 ], [ -117.13322866985202, 32.568729809322086 ], [ -117.13322781196094, 32.569368502964551 ], [ -117.13308936989753, 32.57249315647227 ], [ -117.13300847391125, 32.57431809617708 ], [ -117.13308839881873, 32.575694177896445 ], [ -117.13327826763549, 32.576450195841588 ], [ -117.13316931636456, 32.57809929670556 ], [ -117.12945373702573, 32.577489321400627 ], [ -117.12803166530341, 32.577626667375966 ], [ -117.12613400930002, 32.577713895283864 ], [ -117.1215685168029, 32.576694287394176 ], [ -117.11623399545566, 32.574537576560076 ], [ -117.11386955416937, 32.575034406011433 ], [ -117.11190978595226, 32.575409504886181 ], [ -117.10647756024149, 32.575149662771501 ], [ -117.10010384262075, 32.575306669257621 ], [ -117.09510331040457, 32.576303242644421 ], [ -117.09115860317613, 32.575789876157522 ], [ -117.08924842286103, 32.575244771913034 ], [ -117.08413987452015, 32.574174428676862 ], [ -117.0780854909149, 32.573290881813598 ], [ -117.07518693202192, 32.574254743186088 ], [ -117.07241851206417, 32.574928753914364 ], [ -117.0679768595155, 32.574291366336219 ] ] ] ] } },
{ "type": "Feature", "properties": { "Basin_ID": "9-18", "Basin_Subb": "9-18", "Basin_Name": "OTAY VALLEY", "Subbasin_N": null }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -117.09094993890868, 32.619563585315568 ], [ -117.09024988989258, 32.616130450534968 ], [ -117.08912052342582, 32.606956836308925 ], [ -117.08936013555311, 32.601840287378998 ], [ -117.08875364427725, 32.598575478052986 ], [ -117.08680278644947, 32.595962898606246 ], [ -117.08441502669503, 32.595310423365085 ], [ -117.08156822333883, 32.595469231360795 ], [ -117.07566674287644, 32.595559902113109 ], [ -117.06972154230749, 32.596914588847568 ], [ -117.06432122834885, 32.598375496484095 ], [ -117.05964154380946, 32.598503627441907 ], [ -117.05616095566411, 32.597465378923445 ], [ -117.05457753753359, 32.596282762027229 ], [ -117.05073717232032, 32.594157791168556 ], [ -117.04589484061222, 32.592852213251177 ], [ -117.04312260910274, 32.593353146236865 ], [ -117.03899112955028, 32.593759832413419 ], [ -117.03371516847811, 32.59464309052418 ], [ -117.02862864592969, 32.594776592758308 ], [ -117.02272717085718, 32.594863862255089 ], [ -117.0174198657681, 32.594081171207122 ], [ -117.01427902225983, 32.593035965218846 ], [ -117.01229182668048, 32.591973496787411 ], [ -117.00991124266857, 32.59166411636609 ], [ -117.00874224342975, 32.590819514563513 ], [ -117.00439778679819, 32.59071141973822 ], [ -116.99862263551589, 32.590278958905216 ], [ -116.99501811309899, 32.589814373506385 ], [ -116.99227778615513, 32.588360796763084 ], [ -116.9878300187743, 32.586300314735269 ], [ -116.98294017247588, 32.586026512778012 ], [ -116.97812890271364, 32.586325564277786 ], [ -116.97467386624824, 32.586605260066648 ], [ -116.97337558327038, 32.586106965024619 ], [ -116.97141803299374, 32.586652564095417 ], [ -116.96871423244761, 32.587208317388864 ], [ -116.96622438005944, 32.588335212493227 ], [ -116.96042390526929, 32.590256368604443 ], [ -116.95557273144384, 32.592164131117251 ], [ -116.95221499420181, 32.594107400176057 ], [ -116.94872755550411, 32.596397127940342 ], [ -116.9459745479417, 32.59804505057086 ], [ -116.94404991193704, 32.60048471476123 ], [ -116.94239254000078, 32.602691044509697 ], [ -116.94129817998231, 32.602189536366552 ], [ -116.93999093720781, 32.601173820492207 ], [ -116.9387687523143, 32.601133484218465 ], [ -116.93659488322565, 32.600992021811464 ], [ -116.93460813321069, 32.599928207546924 ], [ -116.93113424649738, 32.599173132290474 ], [ -116.9290142367999, 32.598225880469577 ], [ -116.9297423527787, 32.597181836096283 ], [ -116.93156090942639, 32.596409563977744 ], [ -116.93419840172811, 32.59591322458288 ], [ -116.93526867881214, 32.59503617591411 ], [ -116.93605726039273, 32.593589412189353 ], [ -116.93802757033967, 32.593733854613859 ], [ -116.94108823292062, 32.59415039913921 ], [ -116.94387352503604, 32.59434074502488 ], [ -116.94622507549801, 32.593043461784966 ], [ -116.94952443572129, 32.591618310534955 ], [ -116.95503847459827, 32.588839924862008 ], [ -116.96069206495801, 32.586289312384324 ], [ -116.96744074207328, 32.584296717135018 ], [ -116.97352519139275, 32.58311772322412 ], [ -116.981310419243, 32.58220119915768 ], [ -116.98674868342913, 32.582811874361674 ], [ -116.98884496772166, 32.58243634623561 ], [ -116.99120624064994, 32.581712707774301 ], [ -116.99562328744466, 32.582107923888408 ], [ -117.00194664640887, 32.582877158297073 ], [ -117.0089538927989, 32.583923150362544 ], [ -117.01141157202515, 32.584748556651512 ], [ -117.01434837489472, 32.585796911898029 ], [ 
-117.01815538476318, 32.586200321663206 ], [ -117.02285874709682, 32.587394380373176 ], [ -117.02575839115465, 32.586431974986255 ], [ -117.02839270252196, 32.585818237773253 ], [ -117.03196930799476, 32.584788393166896 ], [ -117.0353541393241, 32.584393233527791 ], [ -117.03987787273692, 32.583234126182219 ], [ -117.04456223839728, 32.583393332527173 ], [ -117.05183697612944, 32.584203049689997 ], [ -117.05593610914512, 32.585692117192636 ], [ -117.06281865949224, 32.587253753081214 ], [ -117.06796448881572, 32.58671587048164 ], [ -117.07223334682625, 32.58642076677404 ], [ -117.07703883952287, 32.585830015255262 ], [ -117.08162775469272, 32.584553520964334 ], [ -117.08146648913291, 32.583234519028593 ], [ -117.08023020967258, 32.582449325419546 ], [ -117.07885455120608, 32.581493901980991 ], [ -117.07727361062912, 32.580426941222797 ], [ -117.07425083808856, 32.578462550566684 ], [ -117.06938355028353, 32.575836820265927 ], [ -117.0679768595155, 32.574291366336219 ], [ -117.07241851206417, 32.574928753914364 ], [ -117.07518693202192, 32.574254743186088 ], [ -117.0780854909149, 32.573290881813598 ], [ -117.08413987452015, 32.574174428676862 ], [ -117.08924842286103, 32.575244771913034 ], [ -117.09115860317613, 32.575789876157522 ], [ -117.09510331040457, 32.576303242644421 ], [ -117.10010384262075, 32.575306669257621 ], [ -117.10647756024149, 32.575149662771501 ], [ -117.11190978595226, 32.575409504886181 ], [ -117.11386955416937, 32.575034406011433 ], [ -117.11623399545566, 32.574537576560076 ], [ -117.1215685168029, 32.576694287394176 ], [ -117.12613400930002, 32.577713895283864 ], [ -117.12803166530341, 32.577626667375966 ], [ -117.12945373702573, 32.577489321400627 ], [ -117.13316931636456, 32.57809929670556 ], [ -117.1331563833194, 32.580461353063633 ], [ -117.13313895151131, 32.583667037314477 ], [ -117.13305735034756, 32.583895362177643 ], [ -117.13302977296665, 32.585640269688504 ], [ -117.13295680012118, 32.588445797309987 ], [ -117.13289081976185, 32.590978124650832 ], [ -117.13297029012178, 32.593220472898004 ], [ -117.13327298374499, 32.596429216273556 ], [ -117.13369223287799, 32.600867834211961 ], [ -117.12357324390332, 32.599849353180694 ], [ -117.11589749104179, 32.600462415367865 ], [ -117.1158318107198, 32.599777199361775 ], [ -117.11569656396389, 32.599570329888948 ], [ -117.11357898780251, 32.600647416312967 ], [ -117.11177228173005, 32.601566355813603 ], [ -117.11086691646757, 32.6021189358108 ], [ -117.109148177021, 32.60316887609109 ], [ -117.10657781402381, 32.605005474819528 ], [ -117.10440205308318, 32.606810459053861 ], [ -117.10381630211057, 32.607296700104691 ], [ -117.10324926304537, 32.607915653098537 ], [ -117.1011662702258, 32.612340439524345 ], [ -117.10075972866082, 32.612638859150657 ], [ -117.09977514995659, 32.61283800452685 ], [ -117.09894626905405, 32.613005656528017 ], [ -117.09840512482522, 32.613004595665437 ], [ -117.09740410874794, 32.613049694419168 ], [ -117.09705139513309, 32.613600174933183 ], [ -117.09705183800251, 32.613714871347049 ], [ -117.09843269412127, 32.614767877138462 ], [ -117.09843275610504, 32.615089758769592 ], [ -117.09764747762975, 32.616740386051163 ], [ -117.09756685742629, 32.617197966049851 ], [ -117.09758669043111, 32.617689185373024 ], [ -117.09759495313509, 32.617885380739892 ], [ -117.09783823577867, 32.618045356304386 ], [ -117.09794643605798, 32.618320326264303 ], [ -117.09786566853079, 32.619785920093079 ], [ -117.09094993890868, 32.619563585315568 ] ] ] ] } }
]
}
I want to test point numbers 1, 2, 3, 4 against all of the "Basin_ID" polygons and, if a point falls within a basin, add that basin as a column to countyDF.
For example, if point 1 were in basin 9-18 and none of the other points fell within the polygons contained in basinData, the returned data frame would look like the following...
returnedDF:
pointNum Lat Long Basin
1 100 251 9-18
2 150 175 n/a
3 50 -330 n/a
4 -150 100 n/a
Might anybody suggest a specific library / method / solution for accomplishing this? I imagine that if there's a library that tests whether a point is in a polygon, I can loop over the 2nd dataframe for each point?
Here is one way you could do it, using rgdal to read the GeoJSON (see this answer for more details on this) and sp and/or rgeos to test whether a point lies within a polygon or not.
Note, I adjusted your coordinates, since none of them was located within a polygon.
First, read the data:
countyDF <- read.table(textConnection("
pointNum Lat Long
1 32.6 -117.1
2 90 175
4 -90 100"), header = TRUE)
basinDF <- rgdal::readOGR("basin.json", "OGRGeoJSON")
Make sure points and polygons have the same projection:
sp::coordinates(countyDF) <- ~Long+Lat
sp::proj4string(countyDF) <- sp::proj4string(basinDF)
Here we use sp::over to extract the attributes of basinDF at each point. If a point is not located within a polygon of basinDF, NA is returned.
sp::over(countyDF, basinDF)
# Basin_ID Basin_Subb Basin_Name Subbasin_N
# 1 9-18 9-18 OTAY VALLEY <NA>
# 2 <NA> <NA> <NA> <NA>
# 3 <NA> <NA> <NA> <NA>
Alternatively, you could also use rgeos, which tells you that point 1 is located in polygon 1.
rgeos::gWithin(countyDF, basinDF, byid = TRUE)
# 1 2 3
# 0 FALSE FALSE FALSE
# 1 TRUE FALSE FALSE
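As an aside (my addition, not part of the original answer): if you prefer the sf package used in the earlier answers, a roughly equivalent sketch, assuming countyDF is still a plain data frame and the GeoJSON has been saved as basin.json, would be:
library(sf)
basin_sf  <- sf::read_sf("basin.json")
county_sf <- sf::st_as_sf(countyDF, coords = c("Long", "Lat"),
                          crs = sf::st_crs(basin_sf))
# left spatial join: basin attributes where a point falls inside a polygon, NA otherwise
sf::st_join(county_sf, basin_sf, join = sf::st_within)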
I want to import data from a text file which contains lakhs (hundreds of thousands) of records.
I am using BULK INSERT to do it, like this:
BULK
INSERT vw_bulk_insert_test
FROM '\\server\c$\csvtext.txt'--\\server\SQLEXPRESS\csvtest.txt'
WITH
(FIRSTROW=2,
check_CONSTRAINTS,
FIELDTERMINATOR = '~',
ROWTERMINATOR = '\n'
)
GO
But before the insert I want to validate the values of each column without using a cursor. For example, if the second row has values for all fields except the unit_number column, then it should create an error log entry specifying that the unit_number value is missing.
Personally, I would bulk-insert into a temp table, and then do the validations/conversions from the temp table into the table where the data will ultimately reside, using either ad hoc T-SQL or stored procedures created for this purpose.
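For instance, a minimal sketch of that approach (the staging table, the error_log table, and the second column name here are assumptions made for illustration, not a definitive implementation):
-- 1) bulk-load the raw file into a staging temp table (all columns as text)
CREATE TABLE #stage_test (unit_number VARCHAR(50), some_value VARCHAR(50));
BULK INSERT #stage_test
FROM '\\server\c$\csvtext.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '~', ROWTERMINATOR = '\n');
-- 2) log rows that fail validation, e.g. a missing unit_number
INSERT INTO dbo.error_log (error_message)
SELECT 'unit_number value is missing'
FROM #stage_test
WHERE unit_number IS NULL OR LTRIM(RTRIM(unit_number)) = '';
-- 3) move only the valid rows into the destination table
INSERT INTO vw_bulk_insert_test (unit_number, some_value)
SELECT unit_number, some_value
FROM #stage_test
WHERE unit_number IS NOT NULL AND LTRIM(RTRIM(unit_number)) <> '';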
You have this syntax:
BULK INSERT
[ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ]
FROM 'data_file'
[ WITH
(
[ [ , ] BATCHSIZE = batch_size ]
[ [ , ] CHECK_CONSTRAINTS ]
[ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
[ [ , ] DATAFILETYPE =
{ 'char' | 'native'| 'widechar' | 'widenative' } ]
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] FIRSTROW = first_row ]
[ [ , ] FIRE_TRIGGERS ]
[ [ , ] FORMATFILE = 'format_file_path' ]
[ [ , ] KEEPIDENTITY ]
[ [ , ] KEEPNULLS ]
[ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
[ [ , ] LASTROW = last_row ]
[ [ , ] MAXERRORS = max_errors ]
[ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
[ [ , ] ROWS_PER_BATCH = rows_per_batch ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
[ [ , ] TABLOCK ]
[ [ , ] ERRORFILE = 'file_name' ]
)]