I want to use the SQLite ODBC driver from http://www.ch-werner.de/sqliteodbc/ and start with an ATTACH statement, as I need joined data from two databases. However, the driver returns no data when given the following SQL:
attach 'my.db' as mydb; select 1
However, it correctly complains with "only one SQL statement allowed" when the first statement is itself a SELECT:
select 2;attach 'my.db' as mydb; select 1
Looking at the source, a checkddl() function analyzes whether the provided request contains a DDL (Data Definition Language) statement. Before digging into the complete code, the question is:
did anyone manage to issue a SELECT after an ATTACH with this driver?
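For what it's worth, here is a minimal sketch of the workaround I would try, assuming the driver keeps the attached database for the lifetime of the connection: send the ATTACH and the SELECT as two separate statements rather than one batch (the table and column names below are made up):
-- first call: attach on its own
ATTACH 'my.db' AS mydb;
-- second call, same connection: the actual join across both databases
SELECT a.id, b.value
FROM main.local_table a
JOIN mydb.remote_table b ON b.id = a.id;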
I would like to load text from a field in a SQLite table and run it as a SQLite query, all done within a SQLite query. No external string operations or command-line operations are possible. Pure SQLite only.
Let's say I create a table command_table with these rows:
COMMAND_NAME   COMMAND
command1       SELECT * FROM table1
command2       SELECT * FROM table1 WHERE table1.row1 = '1'
The desired SQLite command would be able to load the COMMAND and interpret it.
The commands would be as complex as it gets, so using some generic comparison like WHERE table1.row1 = command_table.command1 is not an option.
SQLite is designed as an embedded database, i.e., to be used together with a 'real' programming language. Therefore, it does not have any mechanism to execute dynamic SQL statements from within SQL itself.
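So the usual pattern is two round trips driven by the host program, sketched below against the command_table from the question (the prepare/execute of the fetched text happens in the host language, not in SQL):
-- round trip 1: the host program fetches the stored command text
SELECT COMMAND FROM command_table WHERE COMMAND_NAME = 'command1';
-- round trip 2: the host sends the fetched text back verbatim, i.e. it executes
SELECT * FROM table1;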
With the release of dplyr 0.7.0, it is now supposedly easy to connect to Oracle using the odbc package. However, I am running into a problem accessing tables that are not inside the default schema (for me, my username). For example, suppose there is the table TEST_TABLE in schema TEST_SCHEMA. Then example SQL syntax to get data would be: select * from TEST_SCHEMA.TEST_TABLE.
To do the same in dplyr, I am trying the following:
# make database connection using odbc
oracle_con <- DBI::dbConnect(odbc::odbc(), "DB")
# attempt to get table data
tbl(oracle_con, 'TEST_SCHEMA.TEST_TABLE')
Now, this leads to an error message:
Error: <SQL> 'SELECT *
FROM ("TEST_SCHEMA.TEST_TABLE") "zzz12"
WHERE (0 = 1)'
nanodbc/nanodbc.cpp:1587: 42S02: [Oracle][ODBC][Ora]ORA-00942: table or view does not exist
I think the problem here is the double quotation marks, as:
DBI::dbGetQuery(oracle_con, "select * from (TEST_SCHEMA.TEST_TABLE) where rownum < 100;")
works fine.
I struggled with this for a while until I found the solution at the bottom of the introduction to dbplyr. The correct syntax to specify the schema and table combo is:
tbl(oracle_con, in_schema('TEST_SCHEMA', 'TEST_TABLE'))
As an aside, I think the issue with quotation marks is lodged here: https://github.com/tidyverse/dplyr/issues/3080
There are also the following alternative workarounds that may be suitable depending on what you wish to do. Since the connection uses DBI, one can change the current schema via:
DBI::dbSendQuery(oracle_con, "alter session set current_schema = TEST_SCHEMA")
after which tbl(oracle_con, 'TEST_TABLE') will work.
Or, if you have create view privileges, you can create a "shortcut" in your default schema to any table you are interested in:
DBI::dbSendQuery(oracle_con, "CREATE VIEW TEST_TABLE AS SELECT *
FROM TEST_SCHEMA.TEST_TABLE")
Note that the latter may be more suitable for applications where you wish to copy local data to the database for a join, but do not have write access to the table's original schema.
I have to copy some objects from one schema to another in the same database, among them Java sources. dbms_metadata.get_ddl(object_type, object_name, schema_name) returns the schema name in the DDL. Because I want to execute this DDL in the new schema, the old schema name in the DDL doesn't help me. To avoid this problem I call the following beforehand:
execute dbms_metadata.set_transform_param(dbms_metadata.session_transform,'EMIT_SCHEMA', false);
In the case of a table it works (that is, it omits the schema name in the DDL):
select dbms_metadata.get_ddl('TABLE', object_name, schema_name) from dual;
but in the case of a Java source:
select dbms_metadata.get_ddl('JAVA_SOURCE', object_name, schema_name) from dual;
it doesn't!
I've also tested these calls on a VM with Oracle Database 12.2. Same behavior.
Is it a bug? Any workaround?
Regards,
Jacek
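One possible workaround, offered only as an untested sketch: post-process the CLOB returned by get_ddl and strip the quoted schema prefix yourself, for example:
-- remove every occurrence of "SCHEMA_NAME". from the generated DDL
select replace(
         dbms_metadata.get_ddl('JAVA_SOURCE', object_name, schema_name),
         '"' || schema_name || '".'
       ) as ddl_without_schema
from dual;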
The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it, so I'm not going to spend money on it. I'm just wondering whether there is anything out there that can generate INSERT INTO scripts from an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but it errors on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert
This will give you a file with your insert statements. You can also export the data in SQL Loader format.
You can do that in PL/SQL Developer v10.
1. Click on the table that you want to generate a script for.
2. Click Export data.
3. Check that the table you want to export data for is selected.
4. Click on the SQL inserts tab.
5. Add a WHERE clause if you don't need the whole table.
6. Select the file where the SQL script will be saved.
7. Click Export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – dynamic query which will be used to export metadata rows
p_new_owner_name – owner name which will be used for generated INSERT
p_new_table_name – table name which will be used for generated INSERT
p_sql in this sample is 'select * from tablename'
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution for this and found it today. Here is how you can do it.
1. Open Oracle SQL Developer Query Builder.
2. Run the query.
3. Right-click on the result set and choose Export.
http://i.stack.imgur.com/lJp9P.png
You might execute something like this in the database:
select "insert into targettable(field1, field2, ...) values(" || field1 || ", " || field2 || ... || ");"
from targettable;
Something more sophisticated is here.
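A slightly more defensive variant of that idea, again only a sketch (it assumes field1 is a character column and field2 is numeric): character values have to be wrapped in quotes and any embedded quotes doubled:
select 'insert into targettable (field1, field2) values ('''
       || replace(field1, '''', '''''') || ''', '
       || to_char(field2) || ');' as insert_stmt
from targettable;
This simple form still does not handle NULLs or dates; a real export would need to deal with both.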
If you have an empty table, the Export method won't work. As a workaround, I used the Table view of Oracle SQL Developer, clicked on Columns, sorted by Nullable so that NO was on top, and then selected those non-nullable columns using Shift+click for the range.
This allowed me to do one base insert, so that Export could prepare a proper all-columns insert.
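The "one base insert" can be as small as a row that supplies just the mandatory columns, for example (table and column names here are invented):
-- seed row touching only the NOT NULL columns so that Export has something to script
insert into test_table (id, created_date) values (1, sysdate);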
If you have to load a lot of data into tables on a regular basis, check out SQL Loader or external tables. Should be much faster than individual Inserts.
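For illustration, a rough sketch of an external table definition (the directory object, file name and columns are all assumptions; SQL Loader would use an equivalent control file):
-- data_dir must be an existing DIRECTORY object the database user can read from
create table ext_customers (
  id   number,
  name varchar2(100)
)
organization external (
  type oracle_loader
  default directory data_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
  )
  location ('customers.csv')
);
You can then fill the real table with a plain INSERT ... SELECT from the external table.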
You can also use MyGeneration (a free tool) to write your own SQL generation scripts. There is an "insert into" script for SQL Server included with MyGeneration, which can easily be changed to run against Oracle.
I am running a SQLite database in memory and I am attempting to drop a table with the following command.
DROP TABLE 'testing' ;
But when I execute the SQL statement, I get this error
SQL logic error or missing database
Before I run the "Drop Table" query I check to make sure that the table exists in the database with this query. So I am pretty sure that the table exists and I have a connection to the database.
SELECT count(*) FROM sqlite_master WHERE type='table' and name='testing';
This database is loaded into memory from a file database, and after I attempt to drop this table the database is saved from memory back to the file system. I can then use a third-party SQLite utility to view the SQLite file and check whether the "testing" table exists; it does. Using the same third-party SQLite utility I am able to run the "DROP TABLE" SQL statement without error.
I am able to create/update tables without any problems.
My questions:
Is there a difference between a memory database and a file database in SQLite when dropping a table?
Is there a way to disable the ability to drop a table in SQLite that I may have accidentally turned on somehow?
Edit: It appears to have something to do with a locked table. Still investigating.
You should not have quotes in your DROP TABLE command. Use this instead:
DROP TABLE testing
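If the name really does need quoting (reserved word, special characters), SQLite expects identifier quotes (double quotes), not string quotes:
DROP TABLE "testing";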
I had the same problem when using SQLite with the Xerial JDBC driver, version 3.7.2, and JRE 7.
I first listed all the tables with the select command as follows:
SELECT name FROM sqlite_master WHERE type='table'
and then tried to drop a table like this:
DROP TABLE IF EXISTS TableName
I was working on a database stored on the file system, so the storage (memory vs. file) does not seem to affect the outcome.
I used IF EXISTS to avoid having to list all the tables from the master table first, but I needed the complete table list anyway.
For me the solution was simply to change the order of the SELECT and DROP.
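In other words, roughly this order (sketch, using the same statements as above):
-- drop first ...
DROP TABLE IF EXISTS TableName;
-- ... then list the remaining tables
SELECT name FROM sqlite_master WHERE type='table';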