I'm trying to connect to a Redshift database from my Mac.
I managed to connect to Redshift with both dplyr and RPostgreSQL, and I can see all the available tables regardless of schema, but I'm unable to access any of them because they all live under different schemas.
I've tried all sorts of syntax to specify the schema but I'm not getting anywhere.
Here's my RPostgreSQL code:
library(RPostgreSQL)
drv <- dbDriver("PostgreSQL")
postgre.conn <- dbConnect(drv,
host="localhost", port="XXXX", dbname="redshiftdb",
user="XXXX", password="XXXX")
dbListTables(postgre.conn)
This lists all the tables regardless of schema.
I can see all the tables under a specific schema so this works:
dbGetQuery(postgre.conn,
"SELECT table_name FROM information_schema.tables
WHERE table_schema='my_schema'")
but I'm then unable to access data from my_schema.my_table with any of these commands:
dbSendQuery(postgre.conn,"SELECT * FROM my_table LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM my_schema.my_table LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM my_table WHERE table_schema='my_schema' LIMIT 10")
dbSendQuery(postgre.conn,"SELECT * FROM c("my_schema", "my_table") LIMIT 10")
Similarly, here's my dplyr code:
library(dplyr)
dplyr.conn <- src_postgres(host="localhost", port="XXXX",
dbname = "redshiftdb", user = "XXXX", password = "XXXX")
head(src_tbls(dplyr.conn)) # lists all the tables, regardless of schema
But then, none of these work:
tbl(dplyr.conn, sql("SELECT * FROM my_table LIMIT 10"))
tbl(dplyr.conn, sql("SELECT * FROM my_schema.my_table LIMIT 10"))
I also tried specifying the search path in both cases, like so:
dplyr.conn <- src_postgres(host="localhost", port="XXXX",
dbname = "redshiftdb", user = "XXXX", password = "XXXX",
options="-c search_path=my_schema")
postgre.conn <- dbConnect(drv,
host="localhost",
port="XXXX",
dbname="redshiftdb",
user="XXXX",
password="XXXX",
options="-c search_path=my_schema")
but these still didn't work:
tbl(dplyr.conn, sql("SELECT * FROM my_table LIMIT 10"))
dbSendQuery(postgre.conn,"SELECT * FROM my_table LIMIT 10")
Any ideas?
Use the in_schema() function. The code would be something like:
t <- tbl(dplyr.conn, in_schema("schema_name", "table_name"))
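For instance, a minimal sketch using the connection and names from the question (my_schema and my_table are placeholders):

library(dplyr)
library(dbplyr)

# point a lazy tbl at a table inside a specific schema
my_tbl <- tbl(dplyr.conn, in_schema("my_schema", "my_table"))

# head() is translated into a LIMIT clause; collect() pulls the rows into R
my_tbl %>% head(10) %>% collect()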
Another option is to build the query with glue_sql() from the glue package:

library(glue)
library(DBI)

schema <- "your_schema"
table <- "your_table"
var <- "your_var"
conn <- your_connection_to_database  # a live DBI connection object, not a string

select_query <- glue_sql("
  SELECT {`var`}
  FROM {`schema`}.{`table`}
", .con = conn)

DBI::dbGetQuery(conn, select_query)
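The backticks in {`var`}, {`schema`}, and {`table`} tell glue_sql() to quote the interpolated values as SQL identifiers (via DBI::dbQuoteIdentifier()) rather than as string literals, which is what makes the schema-qualified name work.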
I have read-only access to a Postgres database. I can not write to the database.
Q. Is there a way to construct and run a SQL query where I join a data frame (or other R object) to a table in a read-only Postgres database?
This is for accessing data from WRDS, https://wrds-www.wharton.upenn.edu/
Here's an attempt in pseudocode:
# establish a connection to the database
con <- dbConnect(Postgres(),
                 host = 'host.org',
                 port = 1234,
                 dbname = 'db_name',
                 sslmode = 'require',
                 user = 'username', password = 'password')
# create an R data frame (or other object)
df <- data.frame(customer_id = c('a123', 'a-345', 'b0'))

# write the SQL query we would like to run
# (pseudocode: df is an R data frame, not a table the database knows about)
sql_query <- "
  SELECT t.customer_id, t.*
  FROM df t
  LEFT JOIN table_name x
    ON t.customer_id = x.customer_id
"

res <- dbSendQuery(con, sql_query)
my_query_results <- dbFetch(res)
dbClearResult(res)
my_query_results
Note and edit: The example query I provided is intentionally super simple for example purposes.
In my actual queries, there might be 3 or more columns I want to join on, and millions of rows I want to join on.
Use the copy_inline() function from the dbplyr package, which was added following an issue filed on exactly this topic.
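A sketch of its use, assuming a live connection con and a remote table table_name as in the question:

library(dplyr)
library(dbplyr)

df <- data.frame(customer_id = c('a123', 'a-345', 'b0'))

# copy_inline() renders df as an inline VALUES clause in the generated SQL,
# so no write access to the database is needed
local_tbl  <- copy_inline(con, df)
remote_tbl <- tbl(con, "table_name")

local_tbl %>%
  left_join(remote_tbl, by = "customer_id") %>%
  collect()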
If your join is on a single condition, it can be rewritten using an IN clause:
In SQL:
SELECT customer_id
FROM table_name
WHERE customer_id in ('a123', 'a-345', 'b0')
Programmatically from R:
sql_query = sprintf(
"SELECT customer_id
FROM table_name
WHERE customer_id in (%s)",
paste(sQuote(df$customer_id, q = FALSE), collapse = ", ")
)
I'm running a PostgreSQL query based on an automated list of IDs stored in an R list. I'm trying to determine how to include that R list in my query so I don't have to hard-code the IDs each time I run the query.
For example, I have a script that produces the list
id <- c("001","002","003")
and my query looks something like this:
SELECT *
FROM my_query
WHERE my_query.id_col IN ('001', '002', '003')
which I run using RPostgres:
library(RPostgres)
snappConnection <- DBI::dbConnect(RPostgres::Postgres(),
host = "host",
dbname = "dbname",
user = "user",
password = "pword",
port = 0000)
core.data <- dbGetQuery(conn = snappConnection,
                        statement = "SELECT * FROM my_query WHERE my_query.id_col IN ('001', '002', '003')")
Is there a way to reference my "id" list from R in my query so that when "id" updates to new values, the query also updates to those new values?
glue_sql() from the glue package should work:
query <- glue::glue_sql("
SELECT *
FROM my_query
WHERE my_query.id_col IN ({id*})
", .con = snappConnection)
core.data <- dbGetQuery(conn = snappConnection, statement = query)
@dave-edison's answer solved my problem. While trying his approach, I also got the following to work.
I saved the query below as "my_query.sql"
SELECT *
FROM my_query
WHERE my_query.id_col IN ('string_to_replace')
Then I created a string and used gsub() to substitute the IDs into it:
library(tidyverse)

temp.script <- read_file("my_query.sql")
core.data.script <- gsub('string_to_replace',
                         paste0(id, collapse = "', '"),
                         temp.script)
From there I just ran my RPostgres script like above.
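Concretely, the last step is just running the substituted script through the same connection as before:

core.data <- dbGetQuery(conn = snappConnection, statement = core.data.script)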
I have connected Teradata to my R session with RODBC.
Typically I use data <- sqlQuery(conn, "SELECT statement"); however, when I put the following WITH statement in place of the SELECT statement, there is an error.
data <- sqlQuery(conn,
"WITH zzz as (SELECT statement1),
yyy as (SELECT statement2)
SELECT statement3"
Try correcting the mismatched " and ) as below:
data <- sqlQuery(conn,
"WITH zzz as (SELECT statement1),
yyy as (SELECT statement2)
SELECT statement3")
Question: How do I pass a variable into an RPostgreSQL query?
Example: In the example below, I try to pass the date '2018-01-03' to the query:
library(RPostgreSQL)
dt <- '2018-01-03'
connect <- dbConnect(PostgreSQL(),
dbname="test",
host="localhost",
port=5432,
user="user",
password="...")
result <- dbGetQuery(connect,
"SELECT * FROM sales_tbl WHERE date = #{dt}")
You can use paste0() to generate your query and pass it to dbGetQuery():
library(RPostgreSQL)
dt <- '2018-01-03'
connect <- dbConnect(PostgreSQL(),
dbname="test",
host="localhost",
port=5432,
user="user",
password="...")
query <- paste0("SELECT * FROM sales_tbl WHERE date='", dt, "'")
result <- dbGetQuery(connect, query)
The safest way is to parameterize the query.
Example:
library(RPostgreSQL)
dt <- '2018-01-03'
connect <- dbConnect(drv = PostgreSQL(),
dbname ="test",
host = "localhost",
port = 5432,
user = "user",
password = "...")
query <- "SELECT * FROM sales_tbl WHERE date= ?"
sanitized_query <- dbSendQuery(connect, query)
dbBind(sanitized_query, list(dt))
result <- dbFetch(sanitized_query)
Here, by passing ? as a placeholder and binding the value with dbBind(), you keep the data out of the query string, which protects against SQL injection attacks.
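One small follow-up on the sketch above: once the rows are fetched, the result set should be released, e.g.:

dbClearResult(sanitized_query)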
Another thing I like to do is create a .Renviron file to store my credentials. For example, for the connection above, the .Renviron file would look like this:
dbname = test
dbuser = me
dbpass = mypass
dbport = 5432
dbhost = localhost
Save the file and restart RStudio (to load the .Renviron file at startup), then access the credentials with Sys.getenv():
# Example:
connect <- dbConnect(drv = PostgreSQL(),
dbname = Sys.getenv("dbname"),
host = Sys.getenv("dbhost"),
port = Sys.getenv("dbport"),
user = Sys.getenv("dbuser"),
password = Sys.getenv("dbpass"))
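To confirm the variables were picked up after the restart, Sys.getenv() also accepts a vector of names (the values are whatever you put in your .Renviron):

Sys.getenv(c("dbname", "dbhost", "dbport", "dbuser"))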
I created a connection:
library(RODBC)
pswd <- readline("Input Password: ")
channel <- odbcConnect(dsn = "dsn", uid = "uid", pwd = pswd, believeNRows = FALSE)
And I am able to get a list of tables:
tables <- sqlTables(channel, schema="SYSADM")
But when I try to query one of the tables
query <- "select * from SYSADM.TABLE1"
dataframe <- sqlQuery(channel,query)
I get:
"[RODBC] ERROR: Could not SQLExecDirect 'SELECT * FROM \"TABLE1\"'"
I do have access to this table and am able to query it using Toad.
What could be the issue?
Make the following change, then try the queries below:
tables <- sqlTables(channel, schema='SYSADM')
Queries:
dataframe <- sqlQuery(channel,"select * from SYSADM.TABLE1")
or:
query <- "select * from SYSADM.TABLE1"
dataframe <- sqlQuery(channel, query)
Hope it helps!