I have a user-defined function in SQLite (an aggregate that calculates the product) and it works fine outside R. But I'm on a Mac some of the time, which requires the MacPorts version of SQLite3 if you want to add your own functions/extensions.
Can I pick which SQLite3 RSQLite loads? I don't see anything about this in the RSQLite documentation.
Also, MacPorts appears to change my sqlite3 link to the MacPorts-installed SQLite3:
mbp:~ richard$ which sqlite3
/opt/local/bin/sqlite3
But if I want to load the extension in SQLite3, I have to explicitly call the MacPorts version, like this:
mbp:~ richard$ /opt/local/bin/sqlite3 temp.sqlite
Is writing my own SQLite functions and combining them with R a lost cause? Thanks!
Have you installed and loaded the RSQLite.extfuns package? It has a single function that loads the available extension functions:
library(RSQLite)
library(RSQLite.extfuns)

db <- dbConnect(SQLite(), dbname = ":memory:")
init_extensions(db)
By default these are the extension functions written by Liam Healy.
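A quick way to confirm the functions are available is to call one of them in a query; sqrt() below is assumed to be among the loaded extension functions (it is not a SQLite built-in):
dbGetQuery(db, "SELECT sqrt(25) AS root")  # should return 5 once the extensions are loaded
dbDisconnect(db)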
I am trying to create a SQLite database, working within Jupyter Notebook.
However, when I run the command to create the database, I'm presented with the error 'No module named 'sqlite''. (See image for full command and response).
Checking the 'Environments' page of Anaconda Navigator, I can see that 'sqlite' is actually listed as an installed package.
What am I missing?
I use sqlalchemy, which takes away a lot of the pain. You can install it like this in a Jupyter notebook cell (note the !):
!pip install sqlalchemy
Then your code would be as follows.
from sqlalchemy import create_engine
engine = create_engine('sqlite:///first_db.sqlite')
connection = engine.connect()
I am working on a machine without admin rights. I use SQL Developer to connect to an internal database, and I would like to connect from R as well.
Is there any way I can do this without admin rights? Some solutions require me to set up a system DSN, which I cannot do.
Others require me to install jvm.dll.
My environment: Windows 7, SQL Developer, connection method is via a TNS file.
Connecting to an Oracle database (what SQL Developer connects to) from R is far more difficult than with other databases I've encountered. It's important that you have ojdbc6.jar installed on your machine, and that you know the file path to where it was installed. Installing the jar file does not require admin rights; you can download it from Oracle's website.
I use the RJDBC package to connect like so:
library(RJDBC)
jdbcDriver <- JDBC("oracle.jdbc.OracleDriver", classPath = "file path to where ojdbc6.jar is installed on your computer")
jdbcConnection <- dbConnect(jdbcDriver, "jdbc:oracle:thin:@YOUR_SERVER", "YOUR_USERNAME", "YOUR_PASSWORD")
You can then test the connection with a number of commands; I typically use:
dbListTables(jdbcConnection)
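From there you can run plain SQL; a small sketch, where the table name is a placeholder and ROWNUM just limits the result on Oracle:
result <- dbGetQuery(jdbcConnection,
                     "SELECT * FROM SAMPLE_TABLE_NAME WHERE ROWNUM <= 10")
head(result)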
Another favorite of mine is to use dbplyr for dplyr-like functions when working with databases:
library(dplyr)
library(dbplyr)

tbl(jdbcConnection, "SAMPLE_TABLE_NAME")
The resulting output will be the data from the queried table in tibble form.
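Here is a small sketch of how that combines with dplyr verbs (the table and column names are placeholders; the filter is translated to SQL and run in the database):
sample_tbl <- tbl(jdbcConnection, "SAMPLE_TABLE_NAME")

sample_tbl %>%
  filter(SOME_COLUMN > 100) %>%   # SOME_COLUMN is a placeholder column
  collect()                       # bring the filtered rows back as a local tibble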
You can set the Oracle Instant Client environment variables in your R session:
Sys.setenv(OCI_LIB64 = "/Path/to/instantclient",
           OCI_INC = "/Path/to/instantclient/sdk/include")
You can put this in the file .Rprofile in your home directory, and RStudio will run it each time you begin a new session. Once you have this in .Rprofile you should be able to install ROracle.
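Once the variables are set, a minimal install-and-connect sketch looks like this (the credentials and connect string are placeholders):
install.packages("ROracle")

library(ROracle)
drv <- dbDriver("Oracle")
con <- dbConnect(drv,
                 username = "YOUR_USERNAME",
                 password = "YOUR_PASSWORD",
                 dbname   = "//your-host:1521/your-service")  # placeholder connect string
dbListTables(con)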
I want to use sqlite with the JSON extension, so I've installed it with Homebrew. When I run which sqlite, though, the one that is being used is the Anaconda install. If I try to use Python's sqlite library I have the same issue: it's linked to the Anaconda version and the JSON functions aren't available. How do I replace this with the brew version? Brew printed some values when I installed sqlite, but I don't know whether I need them or how they are used.
LDFLAGS: -L/usr/local/opt/sqlite/lib
CPPFLAGS: -I/usr/local/opt/sqlite/include
PKG_CONFIG_PATH: /usr/local/opt/sqlite/lib/pkgconfig
SQLite installed by Homebrew is keg-only, which means it is not symlinked into /usr/local.
This is because the system already ships an older version of sqlite3.
If you really want to invoke Homebrew's sqlite binary, specify the full path, as below:
$ /usr/local/opt/sqlite/bin/sqlite3
(All Homebrew packages are symlinked under /usr/local/opt.)
I'm not so familiar with Python, but AFAIK sqlite is statically linked into the Python executable.
In other words, you may have to build Python from source to use it with Homebrew's sqlite.
The answer by equal-l2 is correct, as is the comment under it by Keith John Hutchison.
But since they are from quite a few years ago and there is still no officially accepted answer, here you go, as this can still catch you off guard in 2022.
To fix, add this to your ~/.zshrc file and you should be good:
export PATH=/usr/local/opt/sqlite/bin:$PATH
Remember to put $PATH at the end, as above, and not at the beginning, like so:
export PATH=$PATH:/usr/local/opt/sqlite/bin
because the shell searches the directories in $PATH from left to right and uses the first match it finds, so you want your desired path to be considered first.
Also, you might need to run source ~/.zshrc and rehash if you want it to just start working in the same terminal session.
I'm trying to use the SOUNDEX function with SQLite. Can I install this with homebrew, or do I need to compile from source?
I've tried
brew install --fresh sqlite --with-functions
which seems to install extension functions, but I still get Error: no such function: SOUNDEX messages on my queries.
I also tried to modify the sqlite formula, adding the following compile option
ENV.append 'CPPFLAGS', "-DSQLITE_SOUNDEX"
based on http://www.sqlite.org/lang_corefunc.html, but this still fails.
Ideally I'd like to avoid compiling SQLite manually from source, even if that means I need to write a custom homebrew formula.
That's the right flag, but you might not be running the right sqlite.
Your approach is correct: adding ENV.append 'CPPFLAGS', "-DSQLITE_SOUNDEX" will compile it with soundex. I just tested it with SQLite 3.7.16.2 and Homebrew on my OS X 10.8.3 system. Or, for more control, gate it behind an option, like this:
option 'with-soundex', 'Enable the SOUNDEX function'

def install
  [ ... ]
  ENV.append 'CPPFLAGS', "-DSQLITE_SOUNDEX" if build.include? "with-soundex"
Are you sure you're calling the right sqlite3 program once it's installed? SQLite is a "keg-only" formula; that is, unlike most homebrew formulas, it does not get linked in to /usr/local/bin, to avoid conflicting with the sqlite supplied with OS X. You need to call the homebrew one with the full path, like /usr/local/Cellar/sqlite/3.7.16.2/bin/sqlite3.
$ /usr/local/Cellar/sqlite/3.7.16.2/bin/sqlite3
SQLite version 3.7.16.2 2013-04-12 11:52:43
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> select soundex('Hello, world!');
H464
It would be easy to add a --with-soundex option to the main homebrew sqlite formula, so you don't have to maintain a separate formula. If you think enough people would use it, head on over to the Homebrew issue tracker on GitHub and put in a request for it.
Installing the RODBC package on Ubuntu is a bit of a kludge. First I learned that I had to install the following:
$ sudo apt-get install r-cran-rodbc
That wasn't good enough, as the package was still looking for header files. I solved that with:
$ sudo apt-get install unixodbc-dev
Good, RODBC installed properly on the Ubuntu machine. But when I try to run the following script:
## import excel file from Dropbox
require("RODBC")
channel <- odbcConnectExcel("~/Dropbox/DATA/SAMPLE/petro.xls")
petro <- sqlFetch (channel, "weekly")
odbcClose(channel)
str(petro)
head(petro)
I get an error saying that the function odbcConnectExcel was not found. I checked the case of each letter, making sure it was not a simple typo. Nope. Then I ran this same script on a Windows R installation (with a different file path, of course) and the script works.
Any idea why the Ubuntu R installation cannot find the odbcConnectExcel function, and how I can get this to work?
That functionality is available where Excel is available. In other words: not on Ubuntu.
For reference, from the R Data Import / Export manual (with my highlighting):
4.3.2 Package RODBC

Package RODBC on CRAN provides an interface to database sources supporting an ODBC interface. This is very widely available, and allows the same R code to access different database systems. RODBC runs on Unix/Linux, Windows and Mac OS X, and almost all database systems provide support for ODBC. We have tested Microsoft SQL Server, Access, MySQL, PostgreSQL, Oracle and IBM DB2 on Windows and MySQL, Oracle, PostgreSQL and SQLite on Linux.

ODBC is a client-server system, and we have happily connected to a DBMS running on a Unix server from a Windows client, and vice versa.

On Windows ODBC support is normally installed, and current versions are available from http://www.microsoft.com/data/odbc/ as part of MDAC. On Unix/Linux you will need an ODBC Driver Manager such as unixODBC (http://www.unixODBC.org) or iODBC (http://www.iODBC.org: this is pre-installed in Mac OS X) and an installed driver for your database system.

Windows provides drivers not just for DBMSs but also for Excel (.xls) spreadsheets, DBase (.dbf) files and even text files. (The named applications do not need to be installed. Which file formats are supported depends on the versions of the drivers.) There are versions for Excel 2007 and Access 2007 (go to http://download.microsoft.com, and search for Office ODBC, which will lead to AccessDatabaseEngine.exe), the `2007 Office System Driver'.
I've found RODBC to be a real pain in the Ubuntu. Maybe it's because I don't know the right incantations, but I switched to RJDBC and have had much better luck with it. As discussed here.
As Dirk says, that won't solve your Excel problem. For writing Excel files I've had very good luck with the WriteXLS package. On Ubuntu I found it quite easy to set up: I had Perl and many of the needed modules already installed, and only had to add Text::CSV_XS, which I installed with the GUI package manager. The reason I like WriteXLS is the ability to write data frames to different sheets in the Excel file. And now that I look at your question I see that you want to READ Excel files, not WRITE them. Hell. WriteXLS doesn't do that. Stick with gdata, like Dirk said in his comments:
gdata is on CRAN, and you are going to want the read.xls() function:
read.xls("//path//to/excelfile.xls", sheet = 1, verbose=FALSE, pattern, ...,
method=c("csv","tsv","tab"), perl="perl")
You may need to run installXLSXsupport(), which installs the needed Perl modules.
read.xls expects sheet numbers, not names. The method parameter is simply the intermediate file format: if your data contains tabs, don't use tab as the intermediate format, and likewise for commas and csv.
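Applied to the file from the question, a sketch looks like this (sheet = 1 assumes the "weekly" sheet is the first one in the workbook, since read.xls is given a sheet number):
library(gdata)
# installXLSXsupport()   # run once if the needed Perl modules are missing

petro <- read.xls("~/Dropbox/DATA/SAMPLE/petro.xls", sheet = 1,
                  method = "csv", perl = "perl")
str(petro)
head(petro)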