I am not able to write data to a table owned by a different schema (Oracle database) using dbWriteTable(), even though I have INSERT grants.
I have tried the option below but get an error, even though dbExistsTable() returns TRUE for the same table. Execution steps followed:
Connecting to DB:
library(RJDBC)
drv <- JDBC("oracle.jdbc.OracleDriver",
classPath = "C:/oracle_64/product/11.2.0/client_2/jdbc/lib/ojdbc6.jar", " ")
con <- dbConnect(drv, "jdbc:oracle:thin:@//hostname:1521/oracle_sid",
"MASTER_OWNER", "PASSWORD" )
Write Dataframe into Oracle Table:
dbWriteTable(con, "R_STG_INSERT", DF_NAME, row.names = FALSE,
             overwrite = FALSE, append = TRUE, schema = "TEST_OWNER")
Note: the same dbWriteTable() call works as expected when the table belongs to the same database owner I connect as.
Expected: Load the data frame into Oracle table
Actual: Error Message
Error in .local(conn, statement, ...) : execute JDBC update query
failed in dbSendUpdate (ORA-00942: table or view does not exist )
Calls: dbWriteTable ... dbWriteTable -> .local -> dbSendUpdate ->
dbSendUpdate -> .local
Execution halted
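A likely cause, assuming RJDBC behaves as in current releases: dbWriteTable() does not honor the schema argument and emits an unqualified INSERT INTO R_STG_INSERT, which Oracle resolves in the connected user's schema (MASTER_OWNER), hence ORA-00942. A minimal sketch of a workaround that qualifies the table name explicitly via dbSendUpdate(); the column names COL1 and COL2 are hypothetical placeholders:

```r
# Append each row with an INSERT that names the owning schema explicitly,
# since RJDBC's dbWriteTable() ignores a schema argument.
# COL1/COL2 are placeholder column names for illustration.
insert_sql <- "INSERT INTO TEST_OWNER.R_STG_INSERT (COL1, COL2) VALUES (?, ?)"
for (i in seq_len(nrow(DF_NAME))) {
  dbSendUpdate(con, insert_sql, DF_NAME$COL1[i], DF_NAME$COL2[i])
}
```

Alternatively, a synonym created in the MASTER_OWNER schema (CREATE SYNONYM R_STG_INSERT FOR TEST_OWNER.R_STG_INSERT) would let the unqualified dbWriteTable() call resolve correctly.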
On Airflow 2.2.4 (Postgres and Celery).
I have a connection created for Microsoft SQL Server. When I test the connection, the UI asks me for an API Auth user/password (Basic Auth) and shows a green flash: "Connection successfully tested".
But,
When I use the same connection ID in an operator definition to run some SQL queries, I get the error below.
pymssql._mssql.MSSQLDatabaseException: (20009, b'DB-Lib error message 20009, severity 9:\nUnable to connect: Adaptive Server is unavailable or does not exist (<IP>)\nNet-Lib error during Connection timed out (110)\n
DB-Lib error message 20009, severity 9:\nUnable to connect: Adaptive Server is unavailable or does not exist (<IP>)\n
Net-Lib error during Connection timed out (110)\n')
Here is what the code in my custom operator looks like:
def get_update_num(self):
    conn = None
    try:
        query = """
            UPDATE SOME_TABLE
            SET COL_A = COL_A + 1
            OUTPUT INSERTED.COL_A
            WHERE COL_ID = '12345'
        """
        self.log.info(f"SQL = {query}")
        conn_id = self.conn_id  # This is equal to MSSQL_CONNECTION
        hook = MsSqlHook(mssql_conn_id=conn_id)
        conn = hook.get_conn()
        hook.set_autocommit(conn, True)
        cursor = conn.cursor()
        cursor.execute(query)
        row = cursor.fetchone()
        self.log.info(f"row = {row}")
        return row[0]
    except Exception as e:
        message = "Error: Could not run SQL"
        raise AirflowException(message) from e
    finally:
        if conn:
            conn.close()
Any help would be much appreciated.
I am trying to use R to access a PostgreSQL db on Heroku, and I found that I can use src_dbi from the dplyr package.
I have dplyr properly installed, but when I try to call src_dbi I get an error message:
Error in src_dbi(db_con) : could not find function "src_dbi"
This happens right when I run:
db <- src_dbi(db_con)
after providing the credentials:
config <- run("heroku", c("config:get", "postgres://xxxxxxxxxxxxxx", "-a", "prjectAlpha"))
pg <- httr::parse_url(config$stdout)
dbConnect(RPostgres::Postgres(),
dbname = "xxxxxxxxxx",
host = "xxxxxxxxx.amazonaws.com",
port = 5432,
user = "xxxxxx",
password = "xxxxxxxxxxxxxxxxx",
sslmode = "require"
) -> db_con
The idea is to be able to download a table and re-upload it after making a few changes with R.
My solution to access Heroku Postgresql from R:
library(dbplyr) # if this errors, run system("defaults write org.R-project.R force.LANG en_US.UTF-8") from the R console, then restart R.
library(processx)
library(RPostgres)
library(httr)
library(tidyverse)
library(dplyr)
config <- run("heroku", c("config:get", "postgres://xxxxxxxxxxxxxx", "-a", "prjectAlpha"))
pg <- httr::parse_url(config$stdout)
dbConnect(RPostgres::Postgres(),
dbname = "xxxxxxxxxx",
host = "xxxxxxxxx.amazonaws.com",
port = 5432,
user = "xxxxxx",
password = "xxxxxxxxxxxxxxxxx",
sslmode = "require"
) -> db_con
Once the connection is set up:
db <- src_dbi(db_con)
Once the connection is established, check the available tables:
db
Now it is time to retrieve the data.
A lot of examples only show manipulating data directly through the remote collection, but you might simply want to read a table. I have a table called "weather_records":
weather_records_local_df <- tbl(db_con, "weather_records")
df <- collect(weather_records_local_df)
Then do what you want with the data. Hope it helps.
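The question mentions re-uploading the table after making changes; a minimal sketch of the write-back step, assuming DBI's dbWriteTable() is acceptable (the staging table name weather_records_tmp is hypothetical):

```r
# Write the edited data frame back to Postgres under a staging name,
# so the original weather_records table is left untouched until verified.
DBI::dbWriteTable(db_con, "weather_records_tmp", df, overwrite = TRUE)
```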
I'm trying to connect R to a postgresql database that requires SSL.
Here's the connection string that works for me when using PSQL:
postgresql://USERNAME:PASSWORD@HOSTNAME:PORT/DBNAME?ssl=true, replacing all the uppercase strings with appropriate values.
In R, I don't know how to handle the SSL parameter. When I try
conn = dbConnect(RPostgreSQL::PostgreSQL(), host="HOST", dbname="DBNAME", user="USERNAME", password="PASSWORD", port=PORT)
I get
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect USERNAME@HOST on dbname "DBNAME"
)
When I try conn = dbConnect(RPostgreSQL::PostgreSQL(), host="HOST", dbname="DBNAME", user="USERNAME", password="PASSWORD", port=PORT, ssl=TRUE)
I get
Error in postgresqlNewConnection(drv, ...) : unused argument (ssl = TRUE)
Connect to Postgres via SSL using R suggests adding extra info to the dbname parameter. I've tried dbname="DBNAME ssl=TRUE", which results in RS-DBI driver: (could not connect (null)@HOST on dbname "(null)"). I get the same result with sslmode=allow and sslmode=require (as suggested by the above post).
The documentation for the PostgreSQL driver says, under "User Authentication", "The passed string can be empty to use all default parameters, or it can contain one or more parameter settings separated by comma. Each parameter setting is in the form parameter = "value". Spaces around the equal sign are optional." But I haven't been able to get it to accept any parameters other than the three shown in the function prototype.
I'm out of ideas; help appreciated.
You can try this:
Create a configuration file like "configuration.yml" and add your setup:
db:
  host: "your_localhost"
  dbname: "your_database_name?ssl=true"
  user: "your_user_name"
  port: 5432
  password: "your_password"
Install these packages:
install.packages("yaml", dependencies = TRUE)
install.packages("RPostgreSQL", dependencies = TRUE)
install.packages("DBI", dependencies = TRUE)
Run :
driver <- DBI::dbDriver("PostgreSQL")
con <- do.call(DBI::dbConnect,
               c(drv = driver, yaml::yaml.load_file("configuration.yml")$db))
Note: don't forget to append ?ssl=true to the dbname field in configuration.yml.
Hope this helps!
I'm having trouble connecting my R client to redshift through the RPostgreSQL package, despite it working very easily through psql. I've tried downloading and sourcing the redshift-ssl-ca-cert.pem file, but this still doesn't seem to work. Any ideas what could be causing this? Here's my R code:
library("RPostgreSQL")
drv <- dbDriver("PostgreSQL")
host = 'host.com'
dbname = 'dbname'
port = 1234
password = 'password'
username = 'user'
redshift_cert = paste0(FILE_PATH, 'redshift-ssl-ca-cert.pem')
pg_dsn = paste0(
'dbname=', dbname, ' ',
'sslrootcert=', redshift_cert, ' ',
'sslmode=verify-full'
)
con <- dbConnect(drv, dbname=pg_dsn, host=host, port=port, password=password, user=username)
and I always get this error message:
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect user@host.com on dbname "dbname"
)
Meanwhile this command using psql works perfectly
psql 'host=host.com dbname=dbname sslmode=require port=1234 user=user password=password'
I've also tried other variations of sslmode including require and allow for the R code, but nothing works. Would appreciate any ideas thanks!
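One avenue worth trying, assuming the RPostgres package (used elsewhere on this page) is acceptable instead of RPostgreSQL: it forwards extra arguments to libpq, so sslmode and sslrootcert can be passed as plain parameters rather than packed into the dbname string. A sketch, reusing the values from the question:

```r
# Sketch (untested against Redshift): RPostgres hands sslmode/sslrootcert
# straight to libpq, mirroring the psql command line that already works.
library(RPostgres)
con <- dbConnect(Postgres(),
                 host = "host.com",
                 dbname = "dbname",
                 port = 1234,
                 user = "user",
                 password = "password",
                 sslmode = "verify-full",
                 sslrootcert = paste0(FILE_PATH, "redshift-ssl-ca-cert.pem"))
```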
I have a connection to my db in Redshift using dplyr (function src_postgres), but because the table is under a schema I can't select it.
Trying to access it without the schema, I got an error:
requests <- tbl(my_db, "sessions")
Error in postgresqlExecStatement(conn, statement, ...) : RS-DBI driver: (could not Retrieve the result : ERROR: relation "sessions" does not exist
Trying to access it with the schema in one string, I got an error:
requests <- tbl(my_db, "analytics.sessions")
Error: Table analytics.sessions not found in database
Trying to access it with the schema by combining strings, I got an error:
requests <- tbl(my_db, c("analytics", "sessions"))
Error: length(from) not equal to 1
But in RPostgreSQL it works:
dbExistsTable(my_db$con, c("analytics", "sessions"))
[1] TRUE
As suggested by @mike_db here, I can set the schema in the search_path.
For that I need to set the options parameter when opening a connection with src_postgres:
my_db <- src_postgres(host="host", port="5439",
dbname = "dbname", user = "user", password = "XXXX",
options="-c search_path=analytics")
Or you can specify the schema with raw SQL when selecting a table:
requests <- tbl(my_db, sql("SELECT * from analytics.sessions WHERE 0=1"))
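For completeness, a sketch of a third option, assuming a dplyr/dbplyr version recent enough to export in_schema():

```r
# in_schema() builds a schema-qualified identifier, so neither the
# search_path option nor raw SQL is needed.
library(dbplyr)
requests <- tbl(my_db, in_schema("analytics", "sessions"))
```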