Trouble accessing recently created ODBC table in R

I have started using the DBI and RODBC packages to talk to an ODBC connection and write a data frame as a database table that I can query later.
My problem is that, while I can write a table using either dbWriteTable or sqlSave, I can't read it back.
When I explore the available tables on my ODBC connection, my test table appears in my personal schema, but when I try to access it via SELECT or even DESC, I get a "table or view does not exist" error.
The problem is limited to reading: I can update or remove the table using either R package, or even SQL Developer.
PS: If I create the table using the import function in SQL Developer, I can access it without trouble; my goal is to be able to do the same after writing it from R.
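One possible cause (an assumption — the question does not confirm it) is identifier case: Oracle folds unquoted names to upper case, while dbWriteTable and sqlSave may create the table under a quoted, case-sensitive lower-case name. A minimal sketch of the quoting behavior, using DBI's ANSI() stand-in connection so no database is needed:

```r
library(DBI)

# DBI quotes identifiers before sending DDL; ANSI() is a dummy connection
# that applies standard double-quote quoting.
quoted <- dbQuoteIdentifier(ANSI(), "testtable")
as.character(quoted)
# [1] "\"testtable\""

# Against Oracle this would mean:
#   SELECT * FROM testtable     -- looked up as TESTTABLE (unquoted names are folded)
#   SELECT * FROM "testtable"   -- matches the quoted lower-case name
```

If that is the cause, either quote the table name in your queries, or create the table with an upper-case name so that unquoted lookups match.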

Related

How to connect R to SQL Azure?

I have two databases in Azure and each has 5 tables. I can perform data wrangling inside Azure using Kusto, but I would prefer to use RStudio. I wish to connect R to Azure so that I can run a script in R and get results back without importing the actual datasets.
Please help, where do I start? I have zero knowledge of such connections.
Thank you in advance.
Assuming you have already installed R and RStudio, follow these steps:
Open the ODBC Data Source Administrator from the Start menu and add a User Data Source under the 'User DSN' tab. Click through the wizard until you finish, then test the connection.
In RStudio, create a new connection; you should see the Data Source added in the step above, and the connection and tables listed under the Azure SQL Database you connected to.
Run a query like the one below in the console:
dbGetQuery(con, "SELECT * from dbo.xxxx")
You should see the result set. From there you can run whatever queries you want.
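The steps above can be sketched in code. The DSN name, server, database, and credentials below are all placeholders, not values from the question:

```r
library(DBI)

# Option 1: connect via the DSN created in the ODBC Data Source Administrator
# ("MyAzureDSN" is whatever name you gave it in step 1).
# con <- dbConnect(odbc::odbc(), dsn = "MyAzureDSN",
#                  uid = "username", pwd = "password")

# Option 2: a full connection string avoids the DSN step entirely.
conn_str <- paste0(
  "Driver={ODBC Driver 17 for SQL Server};",
  "Server=tcp:myserver.database.windows.net,1433;",  # placeholder server
  "Database=mydb;Uid=username;Pwd=password"
)
# con <- dbConnect(odbc::odbc(), .connection_string = conn_str)
# dbGetQuery(con, "SELECT TOP 5 * FROM dbo.xxxx")
```

Either form hands you a DBI connection object, so the same dbGetQuery() calls work regardless of which one you choose.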

Connecting to Oracle database from R using PL/SQL settings

I currently use PL/SQL to connect to a database, and I want to be able to do the same thing from RStudio.
I tried installing ROracle, but I got the following error:
ERROR: cannot find Oracle Client.
Please set OCI_LIB64 to specify its location.
I don't know if I have the client installed or not, but I don't have admin privileges anyway and I'm not comfortable editing registries.
I then tried RODBC based on several other posts, but I either don't have or I don't know where to find the right information to enter. I know the database name, the username and password.
If I can use PL/SQL, does that mean that I have the Oracle client installed? If it does, would I be able to find its location in PL/SQL and then tell ROracle where to find it?
If not, is all the information I would need to connect with RODBC (or another package) available inside PL/SQL?
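If an Oracle client is already installed (which is likely if a PL/SQL tool connects successfully), ROracle only needs to be told where it lives; this can be done from within R without editing the registry. A sketch, where every path, host, and credential is a placeholder:

```r
# Point ROracle at the client's library directory before loading it
# (the path below is an example install location, not a known one).
Sys.setenv(OCI_LIB64 = "C:/oracle/product/19.0.0/client_1/bin")
# library(ROracle)
# con <- dbConnect(dbDriver("Oracle"), username = "user",
#                  password = "pass", dbname = "mydb")

# RODBC needs the same pieces of information: an installed Oracle ODBC
# driver (or a DSN), plus host, port, and service name.
# library(RODBC)
# ch <- odbcDriverConnect(
#   "Driver={Oracle in OraClient19Home1};Dbq=//dbhost:1521/MYSERVICE;Uid=user;Pwd=pass")
```

The host, port, and service name are usually visible in the connection settings of the PL/SQL tool, or in the client's tnsnames.ora file.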

Connecting to BigQuery using ODBC in Qlikview

I have the latest BigQuery ODBC driver installed and set up according to the instructions here.
I was able to follow the tutorial and access the data in MS Excel.
However in Qlikview I was unable to see any tables when using the same ODBC connection.
The ODBC driver was functional. What I hadn't noticed was that I didn't have any dataset created under the BigQuery test project, hence no tables were available.
It is still possible to use QlikView to access the public datasets by adding query strings to the script after the ODBC CONNECT line.
(Screenshot: QlikView Edit Script screen)

Error when inserting large CSV into empty tables via Teradata SQL assistant

I have a 6 GB CSV that I am trying to load into Teradata.
So I fire up Teradata SQL Assistant, create an empty table, turn on Import Data mode, and try to insert the records into the empty table using
insert into some_lib.some_table
values (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?);
But I always get a failure message around the 600k row mark:
Error reading import file at record 614770: Exception of type
'System.OutOfMemoryException' was thrown.
I think this is because Teradata SQL Assistant tries to load the whole file into memory on my 4 GB laptop before sending the data to the Teradata server. Is my theory correct? How do I tell it to upload the data in chunks instead of holding everything in local memory?
I believe you are pushing the capabilities of SQL Assistant as a means to load data.
Have you considered installing the Teradata load utilities, such as FastLoad or MultiLoad, on your system?
Another option, if you don't want to write scripts for the load utilities, is to install Teradata Studio Express, which provides a mechanism to load your data via JDBC FastLoad: the Smart Loader feature. You may find this to be more scalable than SQL Assistant, which uses .NET or ODBC.

Trying to attach a database to a currently open database, but I'm getting an error saying ATTACH is not allowed from SQL

I'm trying to attach a database (db2.sqlite) to a currently open database (db1.sqlite) and copy the contents of one of the tables in db2 into the corresponding table in db1. The logical way to do this, I thought, was to use the ATTACH command and then SELECT everything from db2 and INSERT it into db1:
attach 'C:\db2.sqlite' as newData;
insert into main.table1 select * from newData.table1
Both databases have an identically named table (table1) with exactly the same schema. To make sure my syntax was correct, I tried this out in the Firefox SQLite Manager and everything worked perfectly.
Unfortunately, when I tried the same method in my AIR application, I got the following error:
"ATTACH is not allowed from SQL.', operation:'execute', detailID:'2053'"
Can anyone please tell me why this isn't working?
Many Thanks
Adam
From the Adobe LiveDocs:

The following SQL elements and SQLite features are supported in some SQLite implementations, but are not supported in Adobe AIR. Most of this functionality is available through methods of the SQLConnection class:

* ATTACH: This functionality is available through the SQLConnection.attach() method.
