I created a connection:
library(RODBC)
pswd <- readline("Input Password: ")
channel <- odbcConnect(dsn = "dsn", uid = "uid", pwd = pswd, believeNRows = FALSE)
And I am able to get a list of tables
tables <- sqlTables(channel, schema="SYSADM")
But when I try to query one of the tables
query <- "select * from SYSADM.TABLE1"
dataframe <- sqlQuery(channel,query)
I get:
"[RODBC] ERROR: Could not SQLExecDirect 'SELECT * FROM \"TABLE1\"'"
I do have access to this table and am able to query it using Toad.
What could be the issue?
First confirm the table is visible to your session:
tables <- sqlTables(channel, schema = "SYSADM")
Then qualify the table with its schema in the query itself:
dataframe <- sqlQuery(channel, "select * from SYSADM.TABLE1")
Or build the query string first and pass the variable:
query <- "select * from SYSADM.TABLE1"
dataframe <- sqlQuery(channel, query)
Hope it helps!
I have read-only access to a Postgres database; I cannot write to it.
Q. Is there a way to construct and run a SQL query where I join a data frame (or other R object) to a table in a read-only Postgres database?
This is for accessing data from WRDS, https://wrds-www.wharton.upenn.edu/
Here's an attempt at pseudocode:
library(DBI)
library(RPostgres)

# establish a connection to the database
con <- dbConnect(Postgres(),
                 host = 'host.org',
                 port = 1234,
                 dbname = 'db_name',
                 sslmode = 'require',
                 user = 'username', password = 'password')

# create an R data frame (or other object)
df <- data.frame(customer_id = c('a123', 'a-345', 'b0'))

# write the SQL query we want to run: join the local data frame
# to the remote table (this is the part that doesn't actually work)
sql_query <- "
  SELECT df.customer_id, t.*
  FROM df
  LEFT JOIN table_name t
    ON df.customer_id = t.customer_id
"

res <- dbSendQuery(con, sql_query)
my_query_results <- dbFetch(res, n = -1)
dbClearResult(res)
my_query_results
Note and edit: The example query I provided is intentionally very simple, for illustration. In my actual queries there might be 3 or more columns I want to join on, and millions of rows to join.
Use the copy_inline function from the dbplyr package, which was added following an issue filed on this topic. See also the question here.
An example of its use is found here.
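A minimal sketch of how that might look (assuming con is a live DBI connection, dbplyr >= 2.2.0, and that the remote table_name really has a customer_id column):
library(dplyr)
library(dbplyr)

# inline the local data frame into the generated SQL as a VALUES list,
# so nothing is written to the read-only database
df_remote <- copy_inline(con, df)

df_remote %>%
  left_join(tbl(con, "table_name"), by = "customer_id") %>%
  collect()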
If your join is on a single condition, it can be rewritten using an IN clause:
In SQL:
SELECT customer_id
FROM table_name
WHERE customer_id in ('a123', 'a-345', 'b0')
Programmatically from R:
sql_query <- sprintf(
  "SELECT customer_id
   FROM table_name
   WHERE customer_id IN (%s)",
  paste(sQuote(df$customer_id, q = FALSE), collapse = ", ")
)
I'm running a PostgreSQL query based on an automated list of IDs stored in an R vector. I'm trying to determine how to include that vector in my query so I don't have to hard-code the IDs each time I run it.
For example, I have a script that produces the list
id <- c("001","002","003")
and my query looks something like this:
SELECT *
FROM my_query
WHERE my_query.id_col IN ('001', '002', '003')
which I run using RPostgres:
library(RPostgres)
snappConnection <- DBI::dbConnect(RPostgres::Postgres(),
host = "host",
dbname = "dbname",
user = "user",
password = "pword",
port = 0000)
core.data <- dbGetQuery(conn = snappConnection,
                        statement = "SELECT * FROM my_query WHERE my_query.id_col IN ('001', '002', '003')")
Is there a way to reference my "id" list from R in my query so that when "id" updates to new values, the query also updates to those new values?
glue_sql() from the glue package should work:
query <- glue::glue_sql("
SELECT *
FROM my_query
WHERE my_query.id_col IN ({id*})
", .con = snappConnection)
core.data <- dbGetQuery(conn = snappConnection, statement = query)
@dave-edison's answer solved my problem. While trying his solution, I got this to work.
I saved the query below as "my_query.sql"
SELECT *
FROM my_query
WHERE my_query.id_col IN ('string_to_replace')
then created a string and used gsub on the string.
library(tidyverse)
temp.script <- read_file("my_query.sql")
core.data.script <- gsub('string_to_replace', paste0(id, collapse = "', '"), temp.script)
From there I just ran my RPostgres script like above.
I am trying to retrieve data from a database table based on a given condition. I want to select everything from a table and, inside the while loop, apply a condition so that only the rows I want are returned, the way it is done in PHP:
mytable <- dbSendQuery(con, "select date from member")
while(!dbHasCompleted(mytable)){
if(name = 'myname'){
new_date <- dbFetch(mytable, n=-1)
print(mytable)
}
}
How do I get the if statement to work properly here?
I later found the solution: the condition belongs in the SQL itself, not in R (dbFetch() has no filtering arguments):
mytable <- dbSendQuery(con, "select * from member where name = 'john'")
new_date <- dbFetch(mytable, n = -1)
print(new_date)
dbClearResult(mytable)
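A sketch of a safer variant that lets the driver interpolate the value through a placeholder instead of pasting it into the SQL (assuming an RPostgres connection, where $1 is the placeholder syntax; some other backends use ?):
library(DBI)

res <- dbSendQuery(con, "select * from member where name = $1")
dbBind(res, list("john"))   # bind the parameter value
new_date <- dbFetch(res, n = -1)
dbClearResult(res)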
I'm trying to connect to a database in Redshift from my Mac.
I managed to connect to Redshift with both dplyr and RPostgreSQL, but even though I can see all the available tables regardless of schema, I'm unable to access any of them, as they all live under different schemas.
I've tried all sorts of syntax to specify the schema but I'm not getting anywhere.
Here's my RPostgreSQL code:
library(RPostgreSQL)
drv <- dbDriver("PostgreSQL")
postgre.conn <- dbConnect(drv,
                          host = "localhost", port = "XXXX", dbname = "redshiftdb",
                          user = "XXXX", password = "XXXX")
dbListTables(postgre.conn)
This lists all the tables regardless of schema.
I can see all the tables under a specific schema so this works:
dbGetQuery(postgre.conn,
"SELECT table_name FROM information_schema.tables
WHERE table_schema='my_schema'")
but I'm then unable to access data from my_schema.my_table with any of these commands:
dbSendQuery(postgre.conn,"SELECT * FROM my_table LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM my_schema.my_table LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM my_table WHERE table_schema='my_schema' LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM c("my_schema", "my_table") LIMIT 10")
Similarly here's my dplyr code:
library(dplyr)
dplyr.conn <- src_postgres(host="localhost", port="XXXX",
dbname = "redshiftdb", user = "XXXX", password = "XXXX")
head(src_tbls(dplyr.conn)) # lists all the tables, regardless of schema
But then, none of these work:
tbl(dplyr.conn, sql("SELECT * FROM my_table LIMIT 10"))
tbl(dplyr.conn, sql("SELECT * FROM my_schema.my_table LIMIT 10"))
and I also tried specifying the search path in both cases, as such:
dplyr.conn <- src_postgres(host="localhost", port="XXXX",
dbname = "redshiftdb", user = "XXXX", password = "XXXX",
options="-c search_path=my_schema")
postgre.conn <- dbConnect(drv,
host="localhost",
port="XXXX",
dbname="redshiftdb",
user="XXXX",
password="XXXX",
options="-c search_path=my_schema")
but these still didn't work:
tbl(dplyr.conn, sql("SELECT * FROM my_table LIMIT 10"))
dbSendQuery(postgre.conn,"SELECT * FROM my_table LIMIT 10")
Any ideas?
Use the in_schema() function (from dbplyr). The code would be something like:
t <- tbl(dplyr.conn, in_schema("schema_name", "table_name"))
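A quick usage sketch with the names from the question (assuming a current dplyr/dbplyr setup, where in_schema() lives in dbplyr):
library(dplyr)
library(dbplyr)

my_tbl <- tbl(dplyr.conn, in_schema("my_schema", "my_table"))
my_tbl %>% head(10) %>% collect()   # translated to ... LIMIT 10 on the server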
Alternatively, glue_sql() can quote the schema and table as identifiers:
library(glue)

schema <- "your_schema"
tbl <- "your_table"
var <- "your_var"
conn <- your_connection_to_database  # a live DBI connection object, not a string

select_query <- glue_sql("
  SELECT {`var`}
  FROM {`schema`}.{`tbl`}
", .con = conn)

DBI::dbGetQuery(conn, select_query)
I have a variable x which contains 20,000 IDs. I want to write a SQL query like:
select * from tablename where ID in x;
I am trying to implement this in R so that I get only the rows whose IDs are in x. Here is my attempt:
dbSendQuery(mydb, "select * from tablename where ID in ('$x') ")
This runs without error, but it returns 0 rows.
Next I tried
sprintf("select * from tablename where ID in %s", x)
but since sprintf() is vectorised, this creates 20,000 individual queries, which could prove costly on the database.
Can anybody suggest a way to match all the IDs in x and save the result to a data frame in R, in a single query?
You need to have the codes in the actual string. Here is how I would do it with gsub:
x <- LETTERS[1:3]
sql <- "select * from tablename where ID in X_ID_CODES "
x_codes <- paste0("('", paste(x, collapse="','"), "')")
sql <- gsub("X_ID_CODES", x_codes, sql)
# see new output
cat(sql)
select * from tablename where ID in ('A','B','C')
# then submit the query
#dbSendQuery(mydb, sql)
How about pasting it:
dbSendQuery(mydb, paste("select * from tablename where ID in (", paste(x, collapse = ","), ")"))
(This works as-is for numeric IDs; character IDs need to be quoted, as in the answer above.)
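A sketch of a variant that lets DBI do the quoting for you, so it works for numeric and character IDs alike (assuming mydb is a DBI connection):
library(DBI)

# dbQuoteLiteral() escapes each value correctly for the backend
ids_sql <- paste(dbQuoteLiteral(mydb, x), collapse = ", ")
result <- dbGetQuery(mydb, paste0("select * from tablename where ID in (", ids_sql, ")"))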