PL/SQL: PLS-00103: Encountered the symbol "/" or "CREATE"

I'm trying to use this package https://technology.amis.nl/wp-content/uploads/2021/03/as_xlsx20.txt
to create an xlsx file from a query.
If I paste the code into a package in SQL Developer, I get the error:
PLS-00103: Encountered the symbol "/" at line 490, before the CREATE OR REPLACE PACKAGE BODY AS_XLSX
Where is the real problem here?
Thank you in advance.

Related

R DBI issue with accessing list fields of a remote table

I am trying to get the fields of a table in an Arctic database. I've been successful at creating a jdbcConnection object of class "JDBCConnection", but when I run the following code to get the fields of the AR_LOT table (dbListFields comes from the DBI package),
dbListFields(jdbcConnection, name = Id(schema = "ARCTIC", table = "AR_LOT"))
I get the following error message.
Error in dbSendQuery(conn, paste("SELECT * FROM ", dbQuoteIdentifier(conn, :
Unable to retrieve JDBC result set
JDBC ERROR: ORA-00933: SQL command not properly ended
Statement: SELECT * FROM "ARCTIC"."AR_LOT" LIMIT 0
I also tried RJDBC's dbGetFields function, but I run into an error there as well.
Error in .jcall(md, "Ljava/sql/ResultSet;", "getColumns", .jnull("java/lang/String"), :
method getColumns with signature (Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)Ljava/sql/ResultSet; not found
The weird thing is that dbReadTable from the DBI package works just fine.
Can anyone please help me understand these error messages a little more clearly? Thanks in advance.
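A possible workaround (not from the question; the connection object, schema and table names are copied from it): the statement shown in the error ends in LIMIT 0, which Oracle does not support, hence the ORA-00933. You can get the field names over the same connection by fetching zero rows yourself, for example:
# Rough sketch of a workaround: avoid the generated LIMIT 0 clause by
# selecting an empty result set and reading its column names.
# jdbcConnection, schema and table names are taken from the question.
flds <- names(dbGetQuery(
  jdbcConnection,
  'SELECT * FROM "ARCTIC"."AR_LOT" WHERE 1 = 0'
))
flds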

Invalid value at 'start_index' (TYPE_UINT64), "1e+05" [invalid] issue while downloading data to R from BigQuery

I successfully connected Google BigQuery with the R environment using the bigrquery package.
I have defined a SQL statement which extracts a report. When I use the bq_table_download function, I get the following error.
Invalid value at 'start_index' (TYPE_UINT64), "1e+05" [invalid]
Code:
sql <- "SELECT * FROM ABC"
df <- bq_project_query(billing, sql)
data <- bq_table_download(df)
There is very little help on this issue. Thank you in advance.
The issue is caused by the download offset being formatted in scientific notation: bq_table_download pages through the result, and the 100,000-row page offset is sent as "1e+05", which the BigQuery API rejects as an invalid UINT64. Adding
options(scipen = 20)
to the start of your code will solve the issue.
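A minimal sketch of that workaround in context; the query and the billing object are the placeholders from the question:
library(bigrquery)

options(scipen = 20)  # keep the large page offset out of scientific notation

sql  <- "SELECT * FROM ABC"
df   <- bq_project_query(billing, sql)
data <- bq_table_download(df)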
This was just fixed in PR#400 by #gjuggler;
updating the bigrquery package to the latest development version will fix your issue:
remotes::install_github("r-dbi/bigrquery")

How do I create a log file in R that includes the R code?

How do I create a log file in R that has the R code and execution times for each line of R code?
Here is a sample of the R code:
library(RODBC)
myconn <- odbcConnect("TD", uid = "TD7949", pwd = "")
Employee <- sqlQuery(myconn, "select * from TD7949.employee")
odbcClose(myconn)
I would like the log file to contain the R code above, any messages produced by each line, and the execution time for the SQL query.
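A rough sketch of one way to do this (the file names below are placeholders): sink() mirrors console output to a file, system.time() times individual steps, and source(..., echo = TRUE) echoes the code itself into that output.
# Rough sketch: copy console output to a log file and time the query.
sink("run_log.txt", split = TRUE)   # mirror console output to run_log.txt

library(RODBC)
myconn <- odbcConnect("TD", uid = "TD7949", pwd = "")

query_time <- system.time(
  Employee <- sqlQuery(myconn, "select * from TD7949.employee")
)
print(query_time)                   # execution time of the SQL query

odbcClose(myconn)
sink()                              # stop logging

# To also capture the code itself in the log, put the lines above in a
# script and run it with echoing turned on:
# source("my_script.R", echo = TRUE, max.deparse.length = Inf)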

RImpala: Query Fails with Larger Data

check1 <- rimpala.query("select * from sum2")
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.sql.SQLException: Method not supported
dim(sum2) is 49501 rows and 18 columns.
check1 <- rimpala.query("select * from sum3")
dim(sum3) is 102 rows and 6 columns.
It works with the smaller data set.
Sorry that I can't provide a reproducible example for this. Has anyone encountered the same problem with a larger data size? Any ideas on how to solve it? Thanks.
As noted elsewhere on StackOverflow, RImpala does not implement executeUpdate and so cannot run any query that modifies state. I suspect you hit your error not by running a larger SELECT query but rather because you tried to insert, update, or delete some data.
If you'd like to use Impala from R, I'd recommend using dplyrimpaladb.
The RImpala (v0.1.6) build has been updated with support for executing DDL queries using executeUpdate.
The latest build contains the following fixes / additions:
Support for DDL query execution.
A fetchSize parameter in the query function to specify the number of records retrieved in one round-trip read from Impala.
A fix for queries failing when NULL values are returned.
Compatibility with CDH 5.x.x.
You can run DDL queries using the query function as illustrated below:
rimpala.query(Q="drop table sample_table",isDDL="true")
You can also specify the fetchSize in the query function to aid reading large data efficiently.
rimpala.query(Q="select * from sample_table",fetchSize="10000")
Please find the latest build on CRAN: http://cran.r-project.org/web/packages/RImpala/index.html
Source Code : https://github.com/Mu-Sigma/RImpala
I had the same problem with the RImpala package and recommend using the RJDBC package instead:
library(RJDBC)
drv <- JDBC(driverClass = "org.apache.hive.jdbc.HiveDriver",
            classPath = list.files("path_to_jars", pattern = "jar$", full.names = TRUE),
            identifier.quote = "`")
conn <- dbConnect(drv, "jdbc:hive2://localhost:21050/;auth=noSasl")
check1 <- dbGetQuery(conn, "select * from sum3")
I used these jar files and everything works as expected:
https://downloads.cloudera.com/impala-jdbc/impala-jdbc-0.5-2.zip
For more information and a speed comparison look at this blog post:
http://datascience.la/r-and-impala-its-better-to-kiss-than-using-java/
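If the result set is very large, another (untested) option along the same lines is to pull it in chunks with RJDBC instead of a single dbGetQuery() call; the chunk size and table name below are illustrative:
res <- dbSendQuery(conn, "select * from sum2")
chunks <- list()
repeat {
  part <- fetch(res, n = 10000)            # up to 10,000 rows per round trip
  if (nrow(part) == 0) break
  chunks[[length(chunks) + 1]] <- part
}
check1 <- do.call(rbind, chunks)
dbClearResult(res)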

sqlFetch Table not found error

After I use
cn <- odbcConnect(...)
to connect to MS SQL Server, I can successfully get data using:
tmp <- sqlQuery(cn, "select * from MyTable")
But if I use
tmp <- sqlFetch(cn, "MyTable")
R would complain about "Error in odbcTableExists(channel, sqtable) : table not found on channel". Did I miss anything here?
Assuming you work on a Windows OS: when you define your DSN in Control Panel > System and Security > Administrative Tools > Data Sources (ODBC), you have to select a default database as well. If you do that, your code should work as expected.
So the problem is not in your R code but in your DSN definition, which in my opinion does not contain the reference to a database that is needed.
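A rough sketch of the two usual fixes, with placeholder server and database names: either name the database explicitly in the connection string instead of relying on the DSN, or keep the DSN and qualify the table with its schema.
library(RODBC)

# Option 1: name the database in the connection string
cn <- odbcDriverConnect(
  "driver={SQL Server};server=MYSERVER;database=MyDatabase;trusted_connection=yes"
)
tmp <- sqlFetch(cn, "MyTable")

# Option 2: keep the DSN connection and schema-qualify the table name
# tmp <- sqlFetch(cn, "dbo.MyTable")

odbcClose(cn)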
