I created a database with SQL for a school project. Currently I'm stuck on importing this data into RStudio. I put my DB file in this directory: /Users/milanpatty/Documents/Business/Semester_2/R/Proftaak. When I tried to make a connection with this code:
db <- dbConnect(SQLite(), dbname = 'Festivate.db')
I got this error:
Warning message:
Couldn't set synchronous mode: file is not a database
Use `synchronous` = NULL to turn off this warning.
BTW, I use DB Browser as an interface for SQLite. Could anybody help me with this problem?
Edit: I have installed the RSQLite library in R.
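One thing worth checking first (a minimal sketch, not a confirmed fix): that dbname really points at the SQLite file created in DB Browser, using the full path, and that the file exists where R is looking.
library(DBI)
library(RSQLite)

# Full path taken from the question; adjust if the file lives elsewhere.
db_path <- "/Users/milanpatty/Documents/Business/Semester_2/R/Proftaak/Festivate.db"

file.exists(db_path)                     # should be TRUE before connecting
db <- dbConnect(SQLite(), dbname = db_path)
dbListTables(db)                         # the tables created in DB Browser should show up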
I successfully configured connection to Azure DataBricks cluster and can query tables with
conn <- odbcConnect("AzureDatabricks")
sqlQuery(conn, "SELECT * FROM my_table")
but I need to access parquet files.
In Databricks I can do it with this code:
%sql
Select * FROM parquet.`/path/to/folder`
If I try this from R as
sqlQuery(conn, "Select * FROM parquet.`/path/to/folder`")
I receive error:
[Simba][SQLEngine] Table or view not found: SPARK.parquet./path/to/folder"
[RODBC] ERROR: Could not SQLExecDirect 'Select * FROM parquet.`/path/to/folder`
Is there way to access parquet files via RODBC?
You are experiencing this issue because of an error in the SQL query itself. When you run Select * FROM parquet.`/path/to/folder` with incorrect syntax, Spark reports "table or view not found" rather than a syntax error.
Example: running SELECT * FROM parquer.'somepath' (misspelled keyword, wrong quoting) produces the same kind of error, which shows the message is about the query, not a missing table.
Note: after creating a DataFrame from a Parquet file, you have to register it as a temp table to run SQL queries on it.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlContext.read.parquet("src/main/resources/peopleTwo.parquet")
df.printSchema
// after registering as a table you will be able to run sql queries
df.registerTempTable("people")
sqlContext.sql("select * from people").collect.foreach(println)
Reference: Spark SQL guide - Parquet files
You need to add the UseNativeQuery=1 parameter to the ODBC connection string.
Example: Driver={Simba Spark ODBC Driver};Host=[serverHost];Port=443;HTTPPath=[httpPath];ThriftTransport=2;SSL=1;AuthMech=3;UID=token;PWD=[pwd];UseNativeQuery=1;
https://docs.databricks.com/integrations/jdbc-odbc-bi.html#ansi-sql-92-query-support-in-odbc
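For reference, a minimal sketch of how that connection string might be used from RODBC (the bracketed host, HTTP path, and token values are placeholders to fill in):
library(RODBC)

conn_str <- paste0(
  "Driver={Simba Spark ODBC Driver};",
  "Host=[serverHost];Port=443;HTTPPath=[httpPath];",
  "ThriftTransport=2;SSL=1;AuthMech=3;UID=token;PWD=[pwd];",
  "UseNativeQuery=1;"
)
conn <- odbcDriverConnect(conn_str)

# With UseNativeQuery=1 the query is passed to Spark unmodified, so the
# parquet.`/path/to/folder` syntax is accepted.
df <- sqlQuery(conn, "Select * FROM parquet.`/path/to/folder`")
odbcClose(conn)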
I'm having trouble connecting to a DB2 database via ODBC. I'm on a Windows system, and have configured a Data Source Name within the ODBC Administrator. When I test the connection there I get Connection tested successfully.. I can also successfully test the connection within IBM's DB2 Configuration Assistant, using both CLI and ODBC.
I'm not able to connect within R. I've tried both the RODBC and odbc packages; the result is the same. My intent is to execute a simple query to verify the connection. When I run the following R script I get an error. Here's my pseudocode.
library('RODBC')
myQuery <- 'SELECT COLUMN1, COLUMN2 FROM DATABASE.TABLE FETCH FIRST 10 ROWS ONLY;'
cnxn <- odbcConnect('myDSN')
data <- sqlQuery(channel=cnxn, query=myQuery)
odbcCloseAll()
Here's the error that I get.
Error in sqlQuery(channel = cnxn, query = myQuery) :
first argument is not an open RODBC channel
In addition: Warning messages:
1: In RODBC::odbcDriverConnect("DSN=myDSN") :
[RODBC] ERROR: state 58031, code -1031, message [IBM][CLI Driver] SQL1031N The database directory cannot be found on the indicated file system. SQLSTATE=58031
2: In RODBC::odbcDriverConnect("DSN=myDSN") : ODBC connection failed
I've learned through experimentation that my script never gets to the point of sending the query. This error is generated at the odbcConnect command.
I don't have access to the server itself, only the database. Is there anything that I can do or try to resolve this on my own, without having to go through support?
EDIT:
I've now cataloged my database, and test connection is successful in 3 places, ODBC Data Source Administrator, Db2 Command Line & Db2 Configuration Assistant. I know that there's no issue with permissions, as I can execute queries via IBM Query Management Facility. I believe this is an issue with either my driver or my system's PATH statements, but I'm not sure how to trace that down.
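One check that can be done entirely from R (a small sketch, assuming the RODBC package is installed): list the data source names this R session can actually see, to confirm that myDSN is visible to the same driver stack R is using.
library(RODBC)
# Lists the user and system DSNs visible to this R session.
odbcDataSources(type = "all")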
Taking a non-RODBC approach, the method below works for connecting R and DB2. Assuming you know all the connection details below, you'll need to download an IBM DB2 jar file and note where it lives; in this case it is in a folder on my machine called "IBM".
Note: there are two types of available jar files, db2jcc.jar and db2jcc4.jar. The below example is using db2jcc.jar.
library(rJava)
library(RJDBC)
library(DBI)
#Enter the values for your database connection
dsn_driver = "com.ibm.db2.jcc.DB2Driver"
dsn_database = "" # e.g. "BLUDB"
dsn_hostname = "" # e.g.: "awh-yp-small03.services.dal.bluemix.net"
dsn_port = "" # e.g. "50000"
dsn_protocol = "TCPIP" # i.e. "TCPIP"
dsn_uid = "" # e.g. "dash104434"
dsn_pwd = "" # e.g. "7dBZ39xN6$o0JiX!m"
jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "C:/Program Files/IBM/SQLLIB/java/db2jcc.jar");
jdbc_path = paste("jdbc:db2://", dsn_hostname, ":", dsn_port, "/", dsn_database, sep="");
conn = dbConnect(jcc, jdbc_path, user=dsn_uid, password=dsn_pwd)
query = "SELECT *
FROM Table
FETCH FIRST 10 ROWS ONLY";
rs = dbSendQuery(conn, query);
df = fetch(rs, -1);
df
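After fetching, it's worth releasing the result set and closing the connection explicitly (a small addition to the example above, using the standard DBI calls):
dbClearResult(rs)
dbDisconnect(conn)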
According to the DB2 Manual here
SQL1031N The database directory cannot be found on the indicated file system.
Explanation
The system database directory or local database directory could not be found. A database has not been created or it was not cataloged correctly.
The command cannot be processed.
User response
Verify that the database is created with the correct path specification. The Catalog Database command has a path parameter which specifies the directory where the database resides.
sqlcode: -1031
sqlstate: 58031
So I'm trying to access my DB file without success. Here is my script:
library(DBI)
library(sqldf)
drv <- dbDriver("SQLite")
con <- dbConnect(drv, dbname = "database.sqlite")
and here is the error:
drv <- dbDriver("SQLite")
con <- dbConnect(drv, dbname = "database.sqlite")
Error in rsqlite_connect(dbname, loadable.extensions, flags, vfs) :
Could not connect to database:
unable to open database file
I've checked, of course, and made sure that I've installed the packages correctly and that my working directory is set.
I have solved my problem, and it's a bit embarrassing:
I had saved my file on my desktop. Since my OS is installed in my native language (Hebrew), the file path had one Hebrew word in it, and while that doesn't pose a problem for reading tables into R, it does pose a problem for the SQLite connection.
Solving it was easy: I saved the file in a new folder on my hard drive (c:\database), set it as the working directory, and everything worked fine.
I can duplicate this error in two ways:
The file exists but you don't have permission to open it
This could be because of operating system permissions. Check your permissions.
The file doesn't exist and you don't have permission to create it.
If SQLite is asked to open a database file that doesn't exist, it tries to create it. If this fails, you get that error message. This will fail if the path to the DB file (in this case the current working directory) does not let you create files. Check your permissions.
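A small sketch of checking both conditions from R before connecting (database.sqlite is a placeholder file name; file.access() returns 0 when the requested access is allowed):
db_path <- "database.sqlite"                   # placeholder path

file.exists(db_path)                           # does the file exist at all?
file.access(db_path, mode = 4) == 0            # if it exists, is it readable?
file.access(dirname(db_path), mode = 2) == 0   # is the folder writable, so SQLite can create the file?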
Note that if the file does exist but is corrupted, I get a different error:
> con <- dbConnect(drv, dbname = "database.sqlite")
Error in rsqlite_send_query(conn@ptr, statement) :
file is encrypted or is not a database
>
so that's probably not your problem.
I also had this issue. I double-checked the path and spotted the mistake there. Once I provided the correct path, the connection worked.
Thanks for this post. It drew my attention to double-check whether the path was correct.
I ran into a similar problem and the solution was to use forward slashes when issuing dbConnect. I had a working directory defined as "C:\ ..." which worked fine in the rest of my code but not when I tried to open the file with dbConnect.
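A minimal sketch of the difference (the path is hypothetical):
library(DBI)
library(RSQLite)

# Backslashes are escape characters in R strings, so a path copied straight
# from Windows Explorer either errors or mis-parses:
# con <- dbConnect(SQLite(), dbname = "C:\my_folder\database.sqlite")

# Forward slashes (or doubled backslashes) work on Windows:
con <- dbConnect(SQLite(), dbname = "C:/my_folder/database.sqlite")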
This is my first time connecting to Vertica. I have already connected to a MySQL database successfully by using the RODBC library.
I have the database setup in vertica and I installed the windows 64-bit ODBC driver from https://my.vertica.com/download-community-edition/
When I tried to connect to vertica using R, I get the below error:
channel = odbcDriverConnect(connection = "Server=myserver.edu;Database=mydb;User=mydb;Password=password")
Warning messages:
1: In odbcDriverConnect(connection = "Server=myserver.edu;Database=mydb;User=mydb;Password=password") :
[RODBC] ERROR: state IM002, code 0, message [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
2: In odbcDriverConnect(connection = "Server=myserver.edu;Database=mydb;User=mydb;Password=password") :
ODBC connection failed
Can someone tell me how to fix this? Or is there any other ways to connect to vertica using R?
It may not be the fastest, but I prefer to use the Vertica JDBC driver from R. Getting the ODBC drivers working is a little messy across different operating systems. If you already have a Java Runtime Environment (JRE) installed for other applications then this is fairly straightforward.
Download the Vertica JDBC drivers for your Vertica server version from the MyVertica portal. Place the driver (a .jar file) in a reasonable location for your operating system.
Install RJDBC into your workspace:
install.packages("RJDBC",dep=TRUE)
In your R script, load the RJDBC module and create an instance of the Vertica driver, adjusting the classPath argument to point to the location and filename of the driver you downloaded:
library(RJDBC)
vDriver <- JDBC(driverClass="com.vertica.jdbc.Driver", classPath="full\path\to\driver\vertica_jdbc_VERSION.jar")
Make a new connection using the driver object, substituting your connection details for the host, username and password:
vertica <- dbConnect(vDriver, "jdbc:vertica://host:5433/db", "username", "password")
Then run your SQL queries:
myframe = dbGetQuery(vertica, "select Address,City,State,ZipCode from MyTable")
You have to use double backslashes in the classPath argument of the JDBC function.
For example,
vDriver <- JDBC(driverClass="com.vertica.jdbc.Driver",
classPath="C:\\Program Files\\Vertica Systems\\JDBC\\vertica-jdk5-6.1.2-0.jar")
worked for me, while just copying and pasting the path failed.
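Forward slashes also work in R paths on Windows, which sidesteps the escaping issue; the same call could be written as:
library(RJDBC)
vDriver <- JDBC(driverClass = "com.vertica.jdbc.Driver",
                classPath = "C:/Program Files/Vertica Systems/JDBC/vertica-jdk5-6.1.2-0.jar")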
The following code connects to my PostgreSQL database successfully (or appears to, at any rate), but attempts to issue queries were met with "relation does not exist" errors, so I tried dbListTables, which doesn't return any tables at all. The database name passed to dbConnect is correct, and the tables do exist. I think the code I'm using is exactly the same as what I was using recently, which worked successfully. Any ideas?
> library(RPostgreSQL)
Loading required package: DBI
> drv <- dbDriver("PostgreSQL")
> con <- dbConnect(drv, dbname="mydb", user="user", password=password)
> dbListTables(con)
character(0)
I'm new to both R and DBI, so I'm sure I could be missing something extremely simple...any help would be appreciated.
Solved: I was right; it was something incredibly simple (and very, very stupid) on my part. I was running the script from the wrong server. The server I was running it from has an empty copy of the database I was attempting to connect to, so everything succeeded, and the empty result from dbListTables was correct. Once I switched servers (or simply specified the host on the other server), everything worked.
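For what it's worth, a sketch of what specifying the host explicitly looks like (db.example.com and the credentials are placeholders):
library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")
# Pointing the connection at the intended server avoids silently hitting an
# empty local copy of the database.
con <- dbConnect(drv, host = "db.example.com", dbname = "mydb",
                 user = "user", password = "password")
dbListTables(con)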
1. Connect to MySQL
a) Make sure MySQL is installed on your system; if not, install it.
b) Install the RMySQL package in R.
library(RMySQL)
drv = dbDriver("MySQL")
Make sure your MySQL version is supported by RMySQL.
con = dbConnect(drv, host="localhost", dbname="test", user="root", password="root")
Use localhost, or the server's IP address if the database is remote.
Use the required database name, user name, and password.
album = dbGetQuery(con, statement = "select * from table")
Run the required query.
dbDisconnect(con)
2. Another way to connect to a database
a) First install any database, such as MySQL, Oracle, or SQL Server.
b) Install the ODBC connector for that database.
library(RODBC)
channel <- odbcConnect("test", uid="ripley", pwd="secret")
"test" is the data source name (DSN) of the ODBC connector, which the user has to set up manually.
You can find this in the ODBC Data Source Administrator tool.
res <- sqlFetch(channel, "table name")
A whole table can be retrieved as a data frame.
res <- sqlQuery(channel, "select query")
With a WHERE condition, part of a table can be retrieved as a data frame.
sqlSave(channel, dataframe)
Saves a data frame to the database (don't assign the result with "res <-" here).
Similarly, you can use
sqlCopy()
sqlDrop()
sqlTables()
close(channel)
Always close the connection.