I have a SQL Server database to which I can connect with my user and this password: "testspecialcharacters_¤" through MS Studio without problems. We are using windows authentication to connect to the server.
However I cannot connect to it through my R Shiny app. I cannot give the exact server, database and user names but this is very close to the code I am using:
library(RODBCext)
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD=testspecialcharacters_¤")
data <- sqlQuery(ch,"select * from test_db.general.test_tbl")
I receive the following error:
[RODBC] ERROR: state 42000, code 18452, message [unixODBC][FreeTDS][SQL Server]Login failed. The login is from an untrusted domain and cannot be used with Windows authentication.
[RODBC] ERROR: state 08001, code 0, message [unixODBC][FreeTDS][SQL Server]Unable to connect to data source
ODBC connection failed
I have to mention that the same code works for passwords without that special character. From experience we have found out that "-" and "!" seem to work. However "#" and "¤" do not work. We have not tried all the potential special characters, only these few.
Based on different other posts online I have tried the following, with no success:
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD=testspecialcharacters_¤", DBMSencoding = "UTF-8")
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD=testspecialcharacters_¤", DBMSencoding = "UTF-8-BOM")
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD=testspecialcharacters_¤", DBMSencoding = "latin1")
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD={testspecialcharacters_¤}")
ch <- odbcDriverConnect("DSN=test;database=test_db;UID=test_user;PWD='testspecialcharacters_¤'")
I am using RStudio Pro with UTF-8 default encoding.
Does anybody know how to escape the special characters used in the password string, or whether a different way of connecting to the database is needed?
The simplest workaround is to write the character as a Unicode escape, like this. You still have to test it on your machine, because getting the encoding right between your machine and the database can be tricky.
x <- "¤"
y <- "\u00A4"
identical(x, y) # TRUE
And a link to a code point table: https://www.utf8-chartable.de/
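The Unicode escape can be folded straight into the connection string. A minimal sketch using the question's placeholder names, with braces around the password (ODBC's delimiter for values containing special characters):

```r
# Write the special character as a Unicode escape so the source file's
# encoding cannot mangle it, then wrap the password in braces.
pwd <- "testspecialcharacters_\u00A4"
conn_str <- sprintf("DSN=test;database=test_db;UID=test_user;PWD={%s}", pwd)
# ch <- odbcDriverConnect(conn_str)
```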
Related
I'm trying to automate a data download from a database using RJDBC in a for loop. The database I'm using automatically closes the connection every 10 minutes, so what I want to do is somehow catch the error, remake the connection, and continue the loop. To do this I need to capture the error somehow; the problem is that it is not an R error, so tryCatch and similar commands don't work. I just get a message on the console telling me:
Error in .jcheck() : No running JVM detected. Maybe .jinit() would help.
How do I handle this in terms of:
if (output == ERROR) {remake connection and run dbQuery} else {run dbQuery}
Thanks for any help.
You could use the pool package to abstract away the logic of connection management.
This does exactly what you expect regarding connection management with DBI.
It should work with RJDBC, which is an implementation of DBI, but I didn't test it with this driver.
library(pool)
library(RJDBC)
conn <- dbPool(
  drv = RJDBC::JDBC(...),
  dbname = "mydb",
  host = "hostadress",
  username = "test",
  password = "test"
)
on.exit(poolClose(conn))
dbGetQuery(conn, "select... ")
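If adding a dependency is not an option: rJava errors are raised as ordinary R conditions, so tryCatch can catch them when it wraps the failing call directly. A hand-rolled retry along the lines of the question's pseudocode might look like this (query_with_retry, make_conn and run are illustrative names, not part of the RJDBC API):

```r
# Sketch of a retry wrapper: `make_conn` creates a fresh connection and
# `run` executes the query on it,
# e.g. run = function(conn) DBI::dbGetQuery(conn, sql).
query_with_retry <- function(make_conn, run, retries = 1) {
  conn <- make_conn()
  for (attempt in seq_len(retries + 1)) {
    result <- tryCatch(run(conn), error = identity)
    if (!inherits(result, "error")) return(result)
    # The connection was probably dropped: rebuild it and try again.
    conn <- make_conn()
  }
  stop(result)
}
```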
I receive the following error...
Error in b_wincred_i_get(target) :
Windows credential store error in 'get': Element not found.
...when running the following script in R v3.5.1 (RStudio v1.1.456) with keyring v1.1.0. I'm attempting this on a new setup, so I'm not sure whether this could be something as simple as a firewall issue. The script errors out when attempting to get the password (key_get(json_data$service, json_data$user)). I've tried manually plugging the service and username into the key_get call (instead of using the variables from the config file) but get the same error. The config file is a JSON file that holds all of the connection details except the password, which gets retrieved from the Windows Credential Vault. Any help in figuring out a fix for this is greatly appreciated.
library(RJDBC)
library(keyring)
library(jsonlite)
postgres.connection <- function(json_data){
  print("Creating Postgres driver...")
  pDriver <- JDBC(driverClass=json_data$driver, classPath="C:/Users/Drivers/postgresql-42.2.4.jar")
  print("Connecting to Postgres...")
  server <- paste("jdbc:postgresql://", json_data$host, ":", json_data$port, "/", json_data$dbname, sep="")
  pConn <- dbConnect(pDriver, server, json_data$user, key_get(json_data$service, json_data$user))
  return(pConn)
}
json_data <- fromJSON("C:/Users/Configs/Config.json", simplifyVector = TRUE, simplifyDataFrame = TRUE)
json_data_connection <- json_data$postgres$local_read
pc <- postgres.connection(json_data_connection)
You mentioned it's a new setup. Is the password already stored in the credential manager? To check, go to Control Panel > User Accounts > Credential Manager > Windows Credentials; is it listed under 'Generic Credentials'?
I can get the same error by running:
key_get('not-valid-service', 'some-user')
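If the entry is missing, storing it first should make the lookup succeed. A sketch with placeholder service and user names (key_set() prompts for the password and writes it to the Windows Credential Store):

```r
library(keyring)
key_set("my-service", "my-user")   # prompts for the password and stores it
key_get("my-service", "my-user")   # now returns the stored password
```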
I have issues with wrong encoding when connecting to an Oracle database from RStudio. I have tried both the RODBC and ODBC packages.
When I query the database, either with sqlQuery() or through dplyr, Norwegian letters (Æ, Ø and Å) are not displayed correctly.
I have tried setting the default text encoding in RStudio to UTF-8, and saving the R scripts with UTF-8 encoding, but the problem persists.
When querying the database with SELECT * FROM NLS_DATABASE_PARAMETERS I find that the database has NLS_CHARACTERSET WE8MSWIN1252, so I have tried to specify this when connecting.
con <- dbConnect(odbc::odbc(), "RStudio", uid = "user",
pwd = "password", encoding = "WE8MSWIN1252")
But this only gives me this error message:
Error in new_result(connection@ptr, statement) :
Can't convert from WE8MSWIN1252 to UTF-8
Any ideas on what I'm doing wrong?
The encoding argument in dbConnect() should be set to the encoding used by the system on which you run R (in accordance with the NLS_LANG environment variable), rather than the encoding used in the database. Therefore, if you run R/RStudio on a Windows machine, encoding = "cp1252" or encoding = "latin1" should do it. If you're on Linux or Mac, encoding = "UTF-8" should work. If it doesn't, make sure that NLS_LANG is set correctly.
(If you use RStudio Server, you have to set NLS_LANG in your .Renviron file, as RStudio Server does not pick up OS-wide environment variables.)
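For example, on a Windows client the connection from the question would become something like this (DSN and credentials as in the question; encoding = "cp1252" is the client-side assumption):

```r
library(odbc)
# encoding describes the client machine, not the database's NLS_CHARACTERSET
con <- dbConnect(odbc::odbc(), "RStudio", uid = "user",
                 pwd = "password", encoding = "cp1252")
```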
I'm having trouble connecting to a DB2 database via ODBC. I'm on a Windows system, and have configured a Data Source Name within the ODBC Administrator. When I test the connection there I get "Connection tested successfully." I can also successfully test the connection within IBM's DB2 Configuration Assistant, using both CLI and ODBC.
I'm not able to connect within R. I've tried both the RODBC and odbc packages; the result is the same. My intent is to execute a simple query to verify the connection, but when I run the following R script I get an error.
library('RODBC')
myQuery <- 'SELECT COLUMN1, COLUMN2 FROM DATABASE.TABLE FETCH FIRST 10 ROWS ONLY;'
cnxn <- odbcConnect('myDSN')
data <- sqlQuery(channel=cnxn, query=myQuery)
odbcCloseAll()
Here's the error that I get.
Error in sqlQuery(channel = cnxn, query = myQuery) :
first argument is not an open RODBC channel
In addition: Warning messages:
1: In RODBC::odbcDriverConnect("DSN=myDSN") :
[RODBC] ERROR: state 58031, code -1031, message [IBM][CLI Driver] SQL1031N The database directory cannot be found on the indicated file system. SQLSTATE=58031
2: In RODBC::odbcDriverConnect("DSN=myDSN") : ODBC connection failed
I've learned through experimentation that my script never gets to the point of sending the query. This error is generated at the odbcConnect command.
I don't have access to the server itself, only the database. Is there anything that I can do or try to resolve this on my own, without having to go through support?
EDIT:
I've now cataloged my database, and test connection is successful in 3 places, ODBC Data Source Administrator, Db2 Command Line & Db2 Configuration Assistant. I know that there's no issue with permissions, as I can execute queries via IBM Query Management Facility. I believe this is an issue with either my driver or my system's PATH statements, but I'm not sure how to trace that down.
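One way to narrow this down from inside R is to compare what R's ODBC layer can see against the ODBC Administrator; a sketch using the odbc package:

```r
library(odbc)
# List the drivers and DSNs visible to the R session. If "myDSN" or the
# DB2 driver is present in the ODBC Administrator but missing here, the
# session is resolving a different driver manager or PATH.
odbc::odbcListDrivers()
odbc::odbcListDataSources()
```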
Taking a non-RODBC approach, the below method works for connecting R and DB2. Assuming you know all the information below, you'll need to download an IBM DB2 jar file and locate it, in this case, in a folder on my machine called "IBM".
Note: there are two types of available jar files, db2jcc.jar and db2jcc4.jar. The below example is using db2jcc.jar.
library(rJava)
library(RJDBC)
library(DBI)
# Enter the values for your database connection
dsn_driver = "com.ibm.db2.jcc.DB2Driver"
dsn_database = "" # e.g. "BLUDB"
dsn_hostname = "" # e.g.: "awh-yp-small03.services.dal.bluemix.net"
dsn_port = "" # e.g. "50000"
dsn_protocol = "TCPIP" # i.e. "TCPIP"
dsn_uid = "" # e.g. "dash104434"
dsn_pwd = "" # e.g. "7dBZ39xN6$o0JiX!m"
jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "C:/Program Files/IBM/SQLLIB/java/db2jcc.jar")
jdbc_path = paste("jdbc:db2://", dsn_hostname, ":", dsn_port, "/", dsn_database, sep="")
conn = dbConnect(jcc, jdbc_path, user=dsn_uid, password=dsn_pwd)
query = "SELECT *
         FROM Table
         FETCH FIRST 10 ROWS ONLY"
rs = dbSendQuery(conn, query)
df = fetch(rs, -1)
df
According to the DB2 Manual here
SQL1031N The database directory cannot be found on the indicated file system.
Explanation
The system database directory or local database directory could not be found. A database has not been created or it was not cataloged correctly.
The command cannot be processed.
User response
Verify that the database is created with the correct path specification. The Catalog Database command has a path parameter which specifies the directory where the database resides.
sqlcode: -1031
sqlstate: 58031
I am using dplyr package to connect to PostgreSQL, and my code is as below:
# Connect to local PostgreSQL via dplyr
library('dplyr')
localdb <- src_postgres(dbname = 'postgres',
                        host = 'localhost',
                        port = 5432,
                        user = 'postgres',
                        password = '236236')
#Write table from R to PostgreSQL
iris <- as.data.frame(iris)
dbWriteTable(localdb$con,'iris',iris, row.names=FALSE)
The connection is successful, but after about 5 minutes a message pops up saying "Auto-disconnecting postgres connection (4308, 1)". I am not sure where this issue comes from, and I need to deal with large data that takes more than 5 minutes to write to PostgreSQL, so I want to know how to solve this auto-disconnect issue.
I've had similar problems with src_sqlite(). The error that I got was Auto-disconnecting SQLiteConnection. Apparently, the usage of the src_* functions is now discouraged (from the documentation of tbl()).
However, modern best practice is to use tbl() directly on an DBIConnection.
Before I was using the code below. The code in itself didn't return any errors. However, after using the same db again, I would get the error Auto-disconnecting SQLiteConnection.
path <- 'C:/sql.db'
sql_data <- src_sqlite(path)
# work on the sql_data variable, e.g. (replace "my_table" with a table in your db)
tbl(sql_data, "my_table")
DBI::dbDisconnect(sql_data$con)
As already said, the usage of src_sqlite is discouraged. Using the preferred code below solved my issue.
sql_data <- DBI::dbConnect(RSQLite::SQLite(), dbname = path)
# work on the sql_data variable, e.g. (replace "my_table" with a table in your db)
tbl(sql_data, "my_table")
DBI::dbDisconnect(sql_data)
Note that the disconnect statement is slightly different and that the SQLite() arguments are in fact dbConnect() arguments.
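The same switch away from the src_* functions applies to the PostgreSQL question above. A sketch using the RPostgres backend with the question's connection details (this assumes the RPostgres package rather than the legacy RPostgreSQL driver):

```r
library(DBI)
# Connect through DBI directly instead of src_postgres()
localdb <- DBI::dbConnect(RPostgres::Postgres(),
                          dbname = 'postgres',
                          host = 'localhost',
                          port = 5432,
                          user = 'postgres',
                          password = '236236')
DBI::dbWriteTable(localdb, 'iris', as.data.frame(iris), row.names = FALSE)
DBI::dbDisconnect(localdb)
```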