Access PostGIS from R on Windows XP

Is there an easy way to access PostGIS data in R on Windows XP?
I thought I could try rgdal, but there is no easy way to add the PostgreSQL driver (see file.show(system.file("README.windows", package = "rgdal"))). Therefore this simple piece of code does not work:
library(rgdal)
mylayer <- readOGR(dsn = "PG:host=localhost user=MyUser dbname=MyDb password=Secret port=5432",
                   layer = "MyLayer", verbose = TRUE)
Then I found a QGIS plugin called manageR. Unfortunately, it depends on the rpy2 plugin, which is no longer available.
I have no problems using shapefiles, but my intention was to integrate PostGIS data with our current data warehouse and then use R for analytics.
Is there any known simple way to use PostGIS with R, or do I have to change operating systems?

This question has been previously dealt with on the r-sig-geo mailing lists.
Searching for 'postgis rgdal windows' on rseek.org finds previous discussions.
In short: not really, because the Windows build of rgdal is compiled with a minimal set of drivers; compiling in lots of others, all of which have different versions, would just transpose the problem into one of a wrong version or missing PostGIS (in this case).
Going via FWTools may be an option.
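If rebuilding rgdal is out of reach, another workaround (a sketch, not from the thread; it assumes the RPostgreSQL and rgeos packages and reuses the table and column names from the question) is to fetch the geometries as WKT over a plain DBI connection and rebuild them client-side:
library(RPostgreSQL)  # DBI driver for PostgreSQL
library(rgeos)        # readWKT() parses WKT into sp geometries

con <- dbConnect(PostgreSQL(), host = "localhost", dbname = "MyDb",
                 user = "MyUser", password = "Secret", port = 5432)

# ST_AsText() serialises each PostGIS geometry to well-known text
res <- dbGetQuery(con, "SELECT id, ST_AsText(geom) AS wkt FROM MyLayer;")
dbDisconnect(con)

geoms <- lapply(res$wkt, readWKT)  # rebuild geometry objects in R
This avoids the OGR PostgreSQL driver entirely, at the cost of doing the geometry parsing in R.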

Related

Link Project and R Version

I have two different versions of R installed: one that is up to date, which I use for all my regular R coding (it needs to be up to date so that I can use various updated and new packages), and one that I use to access OLAP cubes (it needs to be Microsoft R Client, because that is the only distribution that supports the olapR package, and it currently uses R version 3.4.3).
Since, in theory, I only have to access the OLAP cube once a month, I "outsourced" this task to a different RStudio project, in which I download and save the required data for all other projects. Hence, all other projects never require the olapR package to be installed and can and will be run in the up to date R version.
Now, ideally I would like to link my R version to my projects, so that I do not have to change my global R version and restart RStudio every time I access the OLAP cube or work on this data retrieval project (and then switch it back). However, I could not find any options in RStudio to achieve this result.
There are a few threads out there describing the same problem, but with no satisfactory answer in my opinion:
https://support.rstudio.com/hc/en-us/community/posts/200657296-Link-Project-and-R-Version
Rstudio project using different version of R
I also looked for a different package with functionality similar to olapR, but could not find anything except X4R, which seems outdated and does not work for me (https://github.com/overcoil/X4R). Sadly, I am also unable to directly access the databases that the OLAP cube uses for its results, so I cannot go "around" it.
I am happy for any help or suggestions you can offer, whether it is a general workaround to link a project to a specific R version or the (less helpful for the community) solution of accessing the OLAP cube in a different way.
Thanks in advance!
Using the answer from MrGumble, I created a .bat file that executes my .R file using the desired R installation. Even though it is not the answer I thought I would get, I think it is an even better solution to the problem.
For all facing a similar issue, here is the .bat file (I had never created one before, so I also had to google how to do it, and I guess some might be in the same position):
@echo off
title Getting data for further processing in R
echo Retrieving OLAP data
echo.
"C:\Program Files\Microsoft\R Client\R_SERVER\bin\Rscript.exe" "C:\Users\me\Documents\Projects\!Data\script.R"
echo.
echo Saved data
echo.
pause
Thanks again to MrGumble for his help.
Skip RStudio.
RStudio is really just an editor (albeit a powerful and useful one) that starts an R console for you (and sets up the surrounding PATH variables, library locations, etc.).
If your monthly task only requires you to run the R-script (or a bit of interactive work), you can simply execute your preferred version of R from the command line and have it run your R script. E.g.
C:\Users\me>"C:\Program Files\Microsoft\R Client\R_SERVER\bin\Rscript.exe" myscript.R
You might have to define some PATH variables so that the older R doesn't look for packages in the newer R's libraries, but that depends entirely on your current setup.
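If PATH tweaks are not enough, here is a minimal sketch (the library path is illustrative, matching the R Client install above) that you could put at the top of the script so that the older R only searches its own library:
# Pin the library search path to the old R's own library, so packages
# built for the newer R are never picked up
.libPaths("C:/Program Files/Microsoft/R Client/R_SERVER/library")
print(.libPaths())  # verify the search path before loading olapR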

R: Is there a way to get the sessionInfo/packages of another R session?

Imagine that I open two R sessions.
In the first (R1) I load the package dplyr.
Now, my question is: is there a way to get the sessionInfo/packages loaded in R1 from R2?
UPDATE:
I am writing an R help system for the Atom editor. Atom currently does not support R function help, so I am creating one. To find the help page for a function you need to search the packages that contain it, and the best way is to know which packages are loaded in the current R session. That is my difficulty. One workaround is to forget about the loaded packages and search all installed packages instead, but that is too slow if you have a lot of packages installed.
So in my R script I have a line with this code:
pkg <- .packages() # all packages loaded in the current session
But when I run this script (R1) from the other session (R2), it does not get the packages loaded in R2, but those of the script R1's own session.
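One crude workaround (a sketch, not from the thread) is to have R1 write its loaded package list to a file at a known path, which R2 then reads:
# In session R1: record the loaded packages on disk
# (the file path is an arbitrary choice for this sketch)
writeLines(.packages(), "~/r1-packages.txt")

# In session R2: read back what R1 recorded
pkgs <- readLines("~/r1-packages.txt")
This only works if R1 re-runs the writeLines() call whenever its set of loaded packages changes.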
Use the Services API to interact with Hydrogen
The following page details interacting with other packages in Atom: http://flight-manual.atom.io/behind-atom/sections/interacting-with-other-packages-via-services/
Hydrogen is an interface to a Jupyter kernel. It maintains the session with the kernel, and it currently has a plugin API which you could use to get the connection information for the backing kernel: https://nteract.gitbooks.io/hydrogen/docs/PluginAPI.html. Using that you could send your call to .packages().
There is also r-exec, but I believe that's Mac only. In that case, you could get the

Google Prediction using R

Has anyone successfully used the Google Prediction API from within R? My goal is to perform the following tasks:
Upload and manage the data in Google Storage
Use this data to train a model from Google Prediction
I have followed the install instructions located here and here (when using Windows). I have not been able to successfully connect on either Mac OS X or Windows.
I suspect that the core issue is authentication. The documentation is scattered, and I feel like I have tried everything (even the overview of the R package designed for this purpose).
I am not the greatest programmer, but I can typically follow along with code and piece together what I need from worked examples. At this point, though, I simply do not know what else to try.
Many thanks in advance.
Marc Cohen seems to be right; I think something is broken. However, I managed to get past authentication. Here is how:
Download googlepredictionapi_0.12.tar.gz and extract it inside a temporary folder. Then open googlepredictionapi/R/prediction_api_init.R in an editor and remove the lines
myEmail <- ""
myPassword <- ""
myAPIkey <- ""
Afterwards repackage the source files and load them in R:
tar czf googlepredictionapi.mod.tar.gz googlepredictionapi  # repackage the sources
R                                                           # start R
remove.packages("googlepredictionapi")                      # drop the old installation
Now you should be able to follow the steps in [1] http://code.google.com/p/r-google-prediction-api-v12/. However instead of calling
install.packages("googlepredictionapi_0.12.tar.gz", repos=NULL, type="source")
you need to call
install.packages("googlepredictionapi.mod.tar.gz", repos=NULL, type="source")
Following the steps, at some point a file $HOME/.auth-token should be generated.
(You can even trigger this explicitly by calling PredictionApiUtilGetAuth(verbose=TRUE); myEmail and myPassword must be set beforehand.)
For some reason, the global variables that are manually set in [1] were shadowed by the lines removed above. The same is actually true for the verbose option, which you can pass to most API function calls as an extra option (..., verbose=TRUE).
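Put together, the manual trigger looks like this (a sketch; the credential values are placeholders):
# Set the globals the package reads (placeholder values), then
# request the auth token explicitly
myEmail <- "you@example.com"
myPassword <- "your-password"
myAPIkey <- "your-api-key"
PredictionApiUtilGetAuth(verbose = TRUE)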
Hope this helps.
Last I heard (April of this year), R support for the Google Prediction API had not yet been upgraded to use OAuth, so when the Prediction API switched from the older client login scheme to OAuth 2.0, it effectively broke the R functionality.

How do I revert to an earlier version of a package?

I'm trying to write some SPARQL queries in R using the rrdf package. However, I get this error every time I try to load the library.
Error: package 'rrdflibs' 1.1.2 was found, but == 1.1.0 is required by 'rrdf'
Not sure why they didn't write it as >= 1.1.0. Is what they did a good programming practice?
Go to http://cran.r-project.org/src/contrib/Archive/rrdflibs/ to retrieve an older version. This is a source archive, so you will have to be able to build from source (typically easy on Linux, pretty easy on MacOS, and hard on Windows; you can use the http://win-builder.r-project.org/ service to build a Windows binary if necessary).
Actually, based on a quick look at the package, I think you should be able to install in this case (even on Windows without Rtools) via
download.file("http://cran.r-project.org/src/contrib/Archive/rrdflibs/rrdflibs_1.1.0.tar.gz",
              destfile = "rrdflibs_1.1.0.tar.gz")
install.packages("rrdflibs_1.1.0.tar.gz", repos = NULL, type = "source")
because the package doesn't actually contain anything that needs to be compiled.
Don't know about programming practice; you'd have to ask the authors whether they had some particular reason to do it that way (see maintainer("rrdf")). Maybe they knew the versions would not be backward/forward compatible?
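If you want to check what the package actually declares (and whom to ask), here is a quick look (a sketch; it assumes rrdf is already installed):
packageDescription("rrdf")  # full DESCRIPTION, including the version constraint
maintainer("rrdf")          # contact address of the package authors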

How do you use multiple versions of the same R package?

In order to be able to compare two versions of a package, I need to be able to choose which version of the package I load. R's package system is set by default to overwrite existing packages, so that you always have the latest version. How do I override this behaviour?
My thoughts so far are:
I could get the package sources, edit the descriptions to give different names and build, in effect, two different packages. I'd rather be able to work directly with the binaries though, as it is much less hassle.
I don't necessarily need to have both versions of the packages loaded at the same time (just installed somewhere at the same time). I could perhaps mess about with Sys.getenv('R_HOME') to change the place where R installs the packages, and then .libPaths() to change the place where R looks for them. This seems hacky though, so does anyone have any better ideas?
You could selectively alter the library path. For complete transparency, keep both out of your usual path and then do
library(foo, lib.loc="~/dev/foo/v1") ## loads v1
and
library(foo, lib.loc="~/dev/foo/v2") ## loads v2
The same works for install.packages(), of course. All these commands have a number of arguments, so the hooks you aim for may already be present. So don't look at changing R_HOME, rather look at help(install.packages) (assuming you install from source).
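For completeness, a sketch of the matching installs (paths and tarball names are illustrative):
# Give each version its own library tree, outside the normal path
dir.create("~/dev/foo/v1", recursive = TRUE)
dir.create("~/dev/foo/v2", recursive = TRUE)
install.packages("foo_1.0.tar.gz", lib = "~/dev/foo/v1", repos = NULL, type = "source")
install.packages("foo_2.0.tar.gz", lib = "~/dev/foo/v2", repos = NULL, type = "source")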
But AFAIK you cannot load the same package twice under the same name.
Many years have passed since the accepted answer, which is of course still valid. It might, however, be worthwhile to mention a few new options that have arisen in the meantime:
Managing multiple versions of packages
For managing multiple versions of packages on a project (directory) level, the packrat tool can be useful: https://rstudio.github.io/packrat/. In short
Packrat enhances your project directory by storing your package dependencies inside it, rather than relying on your personal R library that is shared across all of your other R sessions.
This basically means that each of your projects can have its own "private library", isolated from the user and system libraries. If you are using RStudio, packrat is very neatly integrated and easy to use.
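Getting started is a one-liner (a sketch; packrat::init() is the standard entry point):
# Create and activate a private, project-local library
packrat::init("~/projects/my-analysis")  # path is illustrative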
Installing custom package versions
In terms of installing a custom version of a package, there are many ways, perhaps the most convenient may be using the devtools package, example:
devtools::install_version("ggplot2", version = "0.9.1")
Alternatively, as suggested by Richie, there is now a more lightweight package called remotes that is a result of the decomposition of devtools into smaller packages, with very similar usage:
remotes::install_version("ggplot2", version = "0.9.1")
More info on the topic can be found at:
https://support.rstudio.com/hc/en-us/articles/219949047-Installing-older-versions-of-packages
I have worked with R for a long time now, and it is only today that I thought about this. The idea came from the fact that I started dabbling with Python, and the first step I had to take was to manage what they (Pythonistas) call "virtual environments". They even have dedicated tools for this seemingly important task. I informed myself about this aspect and why they take it so seriously, and finally realized that it is a neat and important way to manage different projects with conflicting dependencies. I wanted to know why R doesn't have this feature, and found that the concept of "environments" actually exists in R but is not introduced to newcomers the way it is in Python. So check the documentation about this and it will solve your issue.
Sorry for rambling but I thought it would help.
