I frequently find myself doing some analysis in R and then wanting to make a quick map. The standard plot() function does a reasonable job of quick maps, but I quickly find that I need to go to ggplot2 when I want to make something that looks nice or has more complex symbology requirements. ggplot2 is great, but it is sometimes cumbersome to convert a SpatialPolygonsDataFrame into the format required by ggplot2. ggplot2 can also be a tad slow when dealing with large maps that require specific projections.
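For context, the conversion step I mean looks roughly like this; just a sketch, assuming a SpatialPolygonsDataFrame called spdf with an attribute column value (both names are made up):

```r
library(ggplot2)

# spdf and its attribute column "value" are hypothetical stand-ins
spdf@data$id <- rownames(spdf@data)

# flatten the polygons into the long/lat data frame ggplot2 expects
# (fortify() with region = needs rgeos or maptools installed)
df <- fortify(spdf, region = "id")

# re-attach the attribute data that fortify() drops
df <- merge(df, spdf@data, by = "id")

ggplot(df, aes(x = long, y = lat, group = group, fill = value)) +
  geom_polygon() +
  coord_equal()
```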
It seems like I should be able to use Mapnik to plot spatial objects directly from R, but after exhausting my Google-fu, I cannot find any evidence of bindings. Rather than assume that such a thing doesn't exist, I thought I'd check here to see if anyone knows of an R - Mapnik binding.
The Mapnik FAQ explicitly mentions Python bindings -- as does the wiki -- with no mention of R, so I think you are correct that no (Mapnik-sponsored, at least) R bindings currently exist for Mapnik.
You might get a more satisfying (or at least more detailed) answer by asking on the Mapnik users list. They will know for certain if any projects exist to make R bindings for Mapnik, and if not, your interest may incite someone to investigate the possibility of generating bindings for R.
I would write the SpatialWotsitDataFrames to Shapefiles and then launch a Python Mapnik script. You could even use R to generate the Python script (package 'brew' is handy for making files from templates and inserting values from R).
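A minimal sketch of that hand-off, assuming rgdal is available; the dsn/layer and script file names (out/regions, render.brew, render.py) are placeholders:

```r
library(rgdal)   # for writeOGR()
library(brew)    # optional: generate the Python script from a template

# dump the Spatial*DataFrame to a shapefile that Mapnik can read
# ("out" and "regions" are placeholder dsn/layer names)
writeOGR(spdf, dsn = "out", layer = "regions", driver = "ESRI Shapefile")

# optionally fill a Python/Mapnik template with values from R
# (render.brew / render.py are hypothetical file names)
brew("render.brew", "render.py")

# hand rendering off to Python + Mapnik
system("python render.py")
```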
I'm wondering if there are any packages for R which help to visualize workflows/code in the way Alteryx does. I find the visualization of the workflows within Alteryx quite helpful, but manually dragging and dropping the tools onto the canvas and setting the parameters just takes so much longer than just writing the code in R. Also, some functionality within Alteryx is not yet sufficient and has to be implemented via the R/Python tool anyway.
During my search I found this post, which goes in the same direction, but the suggested packages don't really match what I am looking for.
Best regards
I am currently storing the output (a Julia DataFrame) of my Julia simulation in a Parquet file using Parquet.jl. I would also like to save some of the simulation parameters (e.g. a list of (byte-)strings) to that same output file.
Preferably, these parameters would be different for each column, as each column is the result of different starting conditions in my code. However, I could also work with a global parameter list and then untangle it afterwards by indexing.
I have found a solution for Python using pyarrow:
https://mungingdata.com/pyarrow/arbitrary-metadata-parquet-table/.
Do you know a way to do this in Julia?
It's not quite done yet, and it's not registered, but my rewrite of the Julia parquet package, Parquet2.jl, does support both custom file metadata and individual column metadata (the keyword arguments metadata and column_metadata in Parquet2.writefile).
I haven't gotten to documentation for writing yet, but if you are feeling adventurous you can give it a shot. I do expect to finish up this package and register it within the next couple of weeks. I don't have unit tests in place for writing yet, so of course, if you try it and have problems, please open an issue.
It's probably also worth mentioning that the main use case I recommend for parquet is when you must have parquet for compatibility reasons. Most of the time, Julia users are probably better off with Arrow.jl, as that format has a number of advantages over parquet for most use cases; please see my FAQ answer on this. Of course, the reason I undertook writing the package is that parquet is arguably the only ubiquitous binary format in the "big data world", so a robust writer is desperately needed.
I am in the process of automating, through R, a number of graphs produced where I work that currently live in Excel.
Note that for now, I am not able to convince people that doing the graphs directly in R is the best solution, so the solution cannot be "use ggplot2", although I will push for it.
So in the meantime, my path is to download, update and tidy data in R, then export it to an existing Excel file where the graph is already constructed.
The way I have been trying to do that is through openxlsx, which seems to be the most frequent recommendation (for instance here).
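To show the shape of that approach, here is roughly what I am doing (the workbook, sheet name and data frame are placeholders):

```r
library(openxlsx)

# open the existing workbook that already contains the chart
wb <- loadWorkbook("report.xlsx")            # placeholder file name

# overwrite the data range that the Excel chart points at
writeData(wb, sheet = "Data", x = tidy_df, startCol = 1, startRow = 1)

saveWorkbook(wb, "report.xlsx", overwrite = TRUE)
```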
However, I am encountering an issue that I cannot solve this way (I asked a question there that did not inspire a lot of answers!).
Therefore, I am going to try other ways, but I seem to mainly be directed to the aforementioned solution. What are the existing alternatives?
Does one need to have SAS/IML installed to use the SAS/R interface? Or should/could one use the SAS X command to run R and feed data to it?
If you want to actually use the SAS/R interface, then yes, you must license and have SAS/IML installed as it is specifically a feature of SAS/IML (which makes sense, as SAS/IML is SAS's matrix programming language, and R is a matrix programming language).
However, you're welcome to use R the way you describe (by submitting R programs via xcmd); you will, however, need to use a CSV file or similar to exchange data between the two programs. There are several ways to do it, so look at the different options available to see what's easiest for you.
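The R end of that exchange can be as simple as reading the file SAS exported and writing the results back out for SAS to import again; a sketch with placeholder paths and a hypothetical column:

```r
# read the data SAS exported (e.g. via PROC EXPORT) as CSV
dat <- read.csv("C:/temp/from_sas.csv")       # placeholder path

# ...whatever R analysis you need; "score" is a hypothetical column
dat$score_z <- (dat$score - mean(dat$score)) / sd(dat$score)

# write the result back for SAS to read (e.g. via PROC IMPORT)
write.csv(dat, "C:/temp/to_sas.csv", row.names = FALSE)
```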
If you're choosing between the different ways to do this, here is a list of the advantages of using IML that serves as a nice comparison between the two (perhaps a biased one, since Rick is the lead developer of SAS/IML, but it is sufficiently detailed about what you won't have available when running R as a separate program that it should be helpful in making the decision).
As the question says, I am satisfied with what R and ggplot2 can do for static graphs, but what about interactive graphs? How can I combine R and Protovis to make such graphs?
There is something called rwebvis, but it seems to be no longer active.
Any suggestions? Thanks.
Well, first you need a web server. Ooh, R has one of those now. Then you need some way of generating output on the web from R code - ooh, R has one of those too:
http://jeffreybreen.wordpress.com/2011/04/25/4-lines-of-r-to-get-you-started-using-the-rook-web-server-interface/
So you can then write R server pages that return JSON-encoded data that you can feed to Protovis - or if you want to get right up to date, to D3, which is Protovis++ and made of win.
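Something like the following is the rough idea; a sketch only, with a made-up app name and data, using RJSONIO for the encoding (any JSON package would do):

```r
library(Rook)
library(RJSONIO)

s <- Rhttpd$new()

# a tiny Rook app that returns JSON suitable for feeding to Protovis/D3
s$add(name = "points", app = function(env) {
  res <- Response$new()
  res$header("Content-Type", "application/json")
  res$write(toJSON(list(x = rnorm(50), y = rnorm(50))))   # made-up data
  res$finish()
})

s$start()
# the app is then served by R's internal web server, e.g. under .../custom/points
```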
iplots is a fairly useful package that allows interactive graphing (by this I mean selection linking between graphs, color linking, etc.). It has some limitations and is not really made for producing plots as much as for exploring data trends.
Acinonyx, which is supposed to be an updated version of iplots, was also recently updated, but from what I can tell it still has some work to do.
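A quick illustration of the linking, using the built-in mtcars data (iplots needs Java/rJava set up):

```r
library(iplots)

# two interactive plots; selecting cases in one highlights them in the other
ihist(mtcars$mpg)
iplot(mtcars$wt, mtcars$mpg)
```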
Not familiar with protovis or rwebvis.
There is a package called googleVis, an R interface to the Google Charts tools, which enables some interactivity. This produces plots that are embeddable online. If you like Protovis, the same author has another library called D3.
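For example, a minimal googleVis chart (using the built-in cars data; plot() opens it in the browser):

```r
library(googleVis)

# build an interactive scatter chart and view it in the browser
chart <- gvisScatterChart(cars, options = list(width = 600, height = 400))
plot(chart)
```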
For running R on a web server, I have been experimenting with RApache, which enables you to link your R installation to an Apache server.
If the interactivity does not need to be online, RStudio has a package called manipulate which may also be of interest.
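For example (manipulate only works inside the RStudio IDE):

```r
library(manipulate)

# interactively vary the bandwidth of a density estimate with a slider
manipulate(
  plot(density(faithful$eruptions, bw = bw), main = "Eruption lengths"),
  bw = slider(0.05, 1, initial = 0.3, step = 0.05)
)
```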