Sqlite3 vs web2py DAL

I have been working with a SQLite DB for some time, but I want to integrate my code with web2py, especially the DAL. How do I rewrite code like this as web2py DAL code?
import sqlite3

name = input('Please Type your Question: ').lower().split()
name2 = name[:]

for item in name2:  # break
    conn = sqlite3.connect("foods.db")
    cursor = conn.cursor()
    # bind the current item (binding the whole list would fail)
    cursor.execute("INSERT INTO INPUT33 (NAME) VALUES (?);", (item,))
    cursor.execute("select MAX(rowid) from [input33];")
    conn.commit()
    for rowid in cursor:
        break
    for elem in rowid:
        m = elem
    print(m)
    cursor.execute("DELETE FROM INPUT33 WHERE NAME = ?", (item,))

I do not quite understand the question, so I apologize in advance for any misunderstanding.
Web2py is a web MVC framework, and you should follow that pattern when designing your application. With that in mind, using a console function like input makes no sense. Also, you shouldn't use the same component both to collect user input and to handle database connections and data access/manipulation.
If your intent is simply to convert your sqlite3-based snippet to pyDAL, you just need to install it (pip install pydal) and change your code to something like:
# do your imports
from pydal import DAL, Field

# connect to your database
db = DAL('sqlite://foods.db')

# define your table model and table fields
db.define_table('input33', Field('NAME'))

# perform an insert into the input33 table
db.input33.insert(NAME=name2)

# every insert/delete/update requires you to commit your changes
db.commit()
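The rest of the original snippet (the MAX(rowid) lookup and the DELETE) also has pyDAL equivalents. A minimal sketch, reusing the db and name2 from above; in pyDAL, insert() returns the id of the new record, which replaces the MAX(rowid) query:

# insert() returns the new record's id, replacing the MAX(rowid) query
new_id = db.input33.insert(NAME=name2)
print(new_id)

# pyDAL equivalent of the DELETE in the original snippet
db(db.input33.NAME == name2).delete()
db.commit()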
The full documentation can be found here.

Related

Save result of Teradata SQL statement

Does anyone know how I can save the output of the “HELP VOLATILE TABLE” statement as a table that I can use in a query later on?
My goal is to save a list of all Volatile Tables that are currently present.
I tried to use the “HELP VOLATILE TABLE” in a CTE, but it doesn’t do the trick. It refuses to run. Any help is useful.
Update: It seems HELP/SHOW statements can only return data to the client. They can’t be used in a query.
It seems it is possible to write an external stored procedure, e.g. in Java, that FETCHes this data and INSERTs it into a Global Temporary Table.
My question is whether someone knows how to write such an external stored procedure in Java, and how to import and use it?
For those using SAS, this is easy to do.
PROC SQL NOPRINT;
CONNECT TO TERADATA ( &connect_string. );
/* store the pass-through result in a SAS table for later use */
CREATE TABLE volatile_tables AS
SELECT *
FROM CONNECTION TO TERADATA
(HELP VOLATILE TABLE);
DISCONNECT FROM TERADATA;
QUIT;
For those using Python, the same can be done easily too.
import teradatasql
import pandas as pd

# read the volatile-table list straight into a DataFrame
with teradatasql.connect('{"host":"your_host_name"}', user="", password="") as connect:
    df = pd.read_sql("HELP VOLATILE TABLE", connect)
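Since volatile tables are visible only within the session that created them, any follow-up insert has to run on the same connection. A hedged sketch of persisting the list for later queries; vt_inventory is a hypothetical pre-created permanent table, and the column order of the HELP output is an assumption:

import teradatasql

with teradatasql.connect('{"host":"your_host_name"}', user="", password="") as connect:
    cur = connect.cursor()
    cur.execute("HELP VOLATILE TABLE")
    rows = cur.fetchall()
    # assumes the first column of the HELP output is the table name
    cur.executemany(
        "INSERT INTO vt_inventory (table_name) VALUES (?)",
        [[row[0]] for row in rows],
    )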

SQLite Importer will overwrite my database when I load my application?

I have an Ionic App using SQLite. I don't have any problems with implementation.
The issue is that I need to import an SQL file using SQLitePorter to populate the database with configuration info.
But also, on the same database I have user info, so my question is:
Every time I start the app, will it import the SQL file, fill the database, and probably overwrite my user data too, since it is all in the same database?
I assume that you can always initialize your tables using string queries inside your code; the problem is not that you are importing a .sql file, right?
According to https://www.sqlitetutorial.net/sqlite-create-table/, you should always create a table with the [IF NOT EXISTS] switch. Writing a query like:
CREATE TABLE [IF NOT EXISTS] [schema_name].table_name (
column_1 data_type PRIMARY KEY);
lets SQLite decide whether the table needs to be created, without the risk of overwriting an existing one. You can trust that SQLite is smart enough not to overwrite any information, especially if you wrap the script in 'BEGIN TRANSACTION' - 'COMMIT'.
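To illustrate the point with plain Python, here is a quick demo using sqlite3; the in-memory database and the user_data table are made up for the demonstration:

import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway db for the demonstration
conn.execute("CREATE TABLE IF NOT EXISTS user_data (name TEXT PRIMARY KEY)")
conn.execute("INSERT INTO user_data VALUES ('alice')")
conn.commit()

# running the same CREATE again is a no-op: existing rows survive
conn.execute("CREATE TABLE IF NOT EXISTS user_data (name TEXT PRIMARY KEY)")
print(conn.execute("SELECT * FROM user_data").fetchall())  # [('alice',)]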
I give my answer assuming that you keep imported configuration data and user data in distinct tables, so you can control what you repopulate and what you don't. Is that right?
What I usually do, is to have a sql file like this:
DROP TABLE configuration_a;
DROP TABLE configuration_b;
CREATE TABLE configuration_a (...);
INSERT INTO configuration_a (...);
CREATE TABLE configuration_b (...);
INSERT INTO configuration_b (...);
CREATE TABLE IF NOT EXISTS user_data (...);
This means that every time the app starts, I update the configuration tables with the configuration data I have at that time (that is why we use http.get to fetch the latest configuration file from a remote repo), and create the user data table only if user_data is not there (hopefully only on the initial start).
Conclusion: In my opinion, it is always good practice to trust the database product and let it handle any operation that would be risky to implement yourself in your code, since it provides tools for exactly that. For example, the [IF NOT EXISTS] keyword is always safer than implementing a table checker yourself.
I hope that helps.
PS: In case you are referring to the create-database step: SQLite connects to a database file and, if the file doesn't exist, creates it. For someone comfortable with the sqlite command line, typing
sqlite3 /home/user/db/configuration.db
will connect you to this db and, if the file is not there, will create it.

Need to get data from a table using database link where database name is dynamic

I am working on a system where I need to create a view. I have two databases:
1. CDR_DB
2. EMS_DB
I want to create the view on EMS_DB using a table from CDR_DB. I am trying to do this via a dblink.
The dblink is created at runtime, i.e. the DB name is decided when the user installs the database, and the dblink name is derived from that DB name.
My issue is that I am trying to create a view over a table whose name is only decided at run time. Please see the query below:
select count(*)
  from (SELECT CONCAT('cdr_log#', alias) db_name
          FROM ems_dbs a,
               cdr_manager b
         WHERE a.db_type = 'CDR'
           and a.ems_db_id = b.cdr_db_id
           and b.op_state = 4) db_name;
In this query, cdr_log#<db_name> is the runtime table name (db_name gets created at runtime).
When I run the above query, I do not get the desired result; the result is '1'.
When running only the sub-query from the above query:
SELECT CONCAT('cdr_log#', alias) db_name
  FROM ems_dbs a,
       cdr_manager b
 WHERE a.db_type = 'CDR'
   and a.ems_db_id = b.cdr_db_id
   and b.op_state = 4;
I get the desired result, i.e. cdr_log#cdrdb01,
but when I run the full query, I get '1'.
Also, when I run:
select count(*) from cdr_log#cdrdb01;
I get the result '24', which is correct.
The expected result is the same output as the query:
select count(*) from cdr_log#cdrdb01;
---24
But the actual result of the full query mentioned initially is '1'.
Please let me know a way to solve the above problem. I found a way to do it via a procedure, but I'm not sure how I can invoke that procedure.
Can this be done as part of a sub-query, as I have used above?
The full query returns '1' because count(*) counts the rows returned by the sub-query, and the sub-query returns exactly one row whose value happens to be the string 'cdr_log#cdrdb01'; concatenating a name produces a string value, not a reference to a table. You're not going to be able to create a view that dynamically references an object over a database link unless you do something like create a pipelined table function that builds the SQL dynamically.
If the database link is created and named dynamically at installation time, it would probably make the most sense to create any objects that depend on the database link (such as the view) at installation time too. Dynamic SQL tends to be much harder to write, maintain, and debug than static SQL, so it makes sense to minimize the amount of dynamic SQL you need. If you can dynamically create the view at installation time, that's likely the easiest option.
Even better than directly referencing the remote object in the view, particularly if there are multiple objects that need to reference the remote object, would probably be to have the view reference a synonym and create the synonym at install time. Something like:
create synonym cdr_log_remote
  for cdr_log#<<dblink name>>;

create or replace view view_name
as
select *
  from cdr_log_remote;
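If the install process is scripted in Python, a minimal sketch of that installation step might look like this; it assumes cx_Oracle and reuses the lookup query from the question, while the credentials and the cdr_log_remote/view_name identifiers are placeholders:

import cx_Oracle

# connect as the schema owner on EMS_DB (hypothetical credentials)
conn = cx_Oracle.connect("ems_owner/password@ems_db")
cur = conn.cursor()

# discover the dblink name, as in the question's sub-query
cur.execute("""
    SELECT alias
      FROM ems_dbs a, cdr_manager b
     WHERE a.db_type = 'CDR'
       AND a.ems_db_id = b.cdr_db_id
       AND b.op_state = 4""")
dblink_name = cur.fetchone()[0]

# create the synonym and the static view once, at install time
cur.execute("CREATE OR REPLACE SYNONYM cdr_log_remote FOR cdr_log#" + dblink_name)
cur.execute("CREATE OR REPLACE VIEW view_name AS SELECT * FROM cdr_log_remote")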
If you don't want to create the synonym/view at installation time, you'd need to use dynamic SQL to reference the remote object. You can't use dynamic SQL as the SELECT statement in a view, so you'd need to do something like have the view reference a pipelined table function that uses dynamic SQL to query the remote object. That's a fair amount of work, but it would look something like this:
-- Define an object type that has the same set of columns as the remote object
create type typ_cdr_log as object (
  col1 number,
  col2 varchar2(100)
);

create type tbl_cdr_log as table of typ_cdr_log;

create or replace function getAllCDRLog
  return tbl_cdr_log
  pipelined
is
  l_rows        tbl_cdr_log;      -- the collection type, not the scalar object type
  l_sql         varchar2(1000);
  l_dblink_name varchar2(100);
begin
  SELECT alias
    INTO l_dblink_name
    FROM ems_dbs a,
         cdr_manager b
   WHERE a.db_type = 'CDR'
     and a.ems_db_id = b.cdr_db_id
     and b.op_state = 4;

  -- wrap the columns in the object constructor so we can bulk collect
  -- into the collection of objects
  l_sql := 'SELECT typ_cdr_log(col1, col2) FROM cdr_log#' || l_dblink_name;

  execute immediate l_sql
    bulk collect into l_rows;

  for i in 1 .. l_rows.count
  loop
    pipe row( l_rows(i) );
  end loop;

  return;
end;

create or replace view view_name
as
select *
  from table( getAllCDRLog );
Note that this will not be a particularly efficient way to structure things if there are a large number of rows in the remote table since it reads all the rows into memory before starting to return them back to the caller. There are plenty of ways to make the pipelined table function more efficient but they'll tend to make the code more complicated.

Export Multiple Tables' Data as Insert Statements into a Single File in Oracle DB [duplicate]

The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it so I'm not going to spend money on it. I'm just wondering if there is anything out there that can be used to generate INSERT INTO scripts given an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but it errors on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert.
This will give you a file with your insert statements. You can also export the data in SQL*Loader format.
You can do that in PL/SQL Developer v10:
1. Click on the table that you want to generate a script for.
2. Click Export data.
3. Check that the table you want to export data for is selected.
4. Click on the SQL inserts tab.
5. Add a where clause if you don't need the whole table.
6. Select the file where you want to save your SQL script.
7. Click Export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – dynamic query which will be used to export metadata rows
p_new_owner_name – owner name which will be used for generated INSERT
p_new_table_name – table name which will be used for generated INSERT
p_sql in this sample is 'select * from tablename'
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution for this and found it today. Here is how you can do it:
1. Open Oracle SQL Developer Query Builder
2. Run the query
3. Right-click on the result set and export
(screenshot: http://i.stack.imgur.com/lJp9P.png)
You might execute something like this in the database:
select 'insert into targettable(field1, field2, ...) values (' || field1 || ', ' || field2 || ... || ');'
from targettable;
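The same concatenation can be done client-side, which makes it easier to handle quoting. A hedged Python sketch, assuming cx_Oracle; it escapes single quotes in strings but deliberately ignores dates, BLOBs, and other types, and the connection string is a placeholder:

import cx_Oracle

def gen_inserts(conn, query, target_table):
    """Yield one INSERT statement per row returned by the query."""
    cur = conn.cursor()
    cur.execute(query)
    cols = ", ".join(d[0] for d in cur.description)
    for row in cur:
        vals = ", ".join(
            "NULL" if v is None
            else "'" + v.replace("'", "''") + "'" if isinstance(v, str)
            else str(v)
            for v in row
        )
        yield "INSERT INTO %s (%s) VALUES (%s);" % (target_table, cols, vals)

# hypothetical usage: dump a table to a script file
conn = cx_Oracle.connect("scott/tiger@orcl")
with open("inserts.sql", "w") as f:
    for stmt in gen_inserts(conn, "select * from targettable", "targettable"):
        f.write(stmt + "\n")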
Something more sophisticated is here.
If you have an empty table, the Export method won't work. As a workaround, I used the Table View of Oracle SQL Developer, clicked on Columns, sorted by Nullable so NO was on top, and then selected the non-nullable columns using Shift+select for the range.
This allowed me to do one base insert, so that Export could then prepare a proper all-columns insert.
If you have to load a lot of data into tables on a regular basis, check out SQL*Loader or external tables. They should be much faster than individual INSERTs.
You can also use MyGeneration (a free tool) to write your own SQL generation scripts. There is an "insert into" script for SQL Server included with MyGeneration, which can easily be changed to run under Oracle.

Create Sqlite database patches while updating

Context:
A Python 3.6 script updates a SQLite database several times a day using the sqlite3 module.
The database is ~500 MB, and each update adds ~250 KB.
Issue:
I deliver every updated version of the database and would like to reduce the size of the data transferred. In other words, I would like to transfer only the updated content (through a kind of patch).
The sqldiff.exe utility program could be used for that; nevertheless, it requires creating a local copy of the database every time I update it.
Question:
Is there a way, using Python (through the DB-API 2.0 interface or by other means), to generate this kind of patch while updating the database?
First thoughts:
Wouldn't it be possible to write a patch (e.g. a list of actions to be done to update the database) based on the cursor before/while performing the commit?
import sqlite3
# Open database
conn = sqlite3.connect('mydb.db')
cur = conn.cursor()
# Insert/Update data
new_data = 3.14
cur.execute('INSERT INTO mytable VALUES (?)', (new_data,))
# KEEP TRACK & Save (commit) the changes
conn.dump_planned_actions() # ?????
conn.commit()
conn.close()
The following snippet shows the workaround I found.
It relies on the sqlite3 method set_trace_callback to log all the SQL statements sent, and on executescript to apply those statements.
import sqlite3


class DBTraceCallbackHandler(object):
    """Class handling callbacks in order to log the history of SQL statements."""

    def __init__(self):
        self.sql_statements = []

    def instance_handler(self, event):
        self.sql_statements.append(str(event))


def database_modification(cursor):
    # user-defined
    pass


def create_patch(db_path):
    # Opening connection
    conn = sqlite3.connect(db_path)
    c = conn.cursor()

    # Start tracing sql
    callback_handler = DBTraceCallbackHandler()
    conn.set_trace_callback(callback_handler.instance_handler)

    # Modification of database
    database_modification(c)

    # End of modification of database
    conn.commit()
    c.close()

    # Generating the patch - keeping only sql statements that modify the db
    idx_rm = []
    for idx, sql_statement in enumerate(callback_handler.sql_statements):
        if not any(sql_statement.startswith(kw) for kw in ('UPDATE', 'INSERT', 'CREATE', 'DELETE')):
            idx_rm.append(idx)
    for idx in sorted(idx_rm, reverse=True):
        del callback_handler.sql_statements[idx]
    return ';\n'.join(callback_handler.sql_statements) + ';\n'


def apply_patch(db_path, sql_script):
    # Opening connection
    conn = sqlite3.connect(db_path)
    c = conn.cursor()

    # Modification of database - apply the sql script
    c.executescript(sql_script)

    # End of modification of database
    conn.commit()
    c.close()
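For completeness, a hypothetical round trip using these two functions (the file and database names are made up): record the statements while updating the master copy, then replay them on a replica:

# generate the patch while updating the master database
patch = create_patch('mydb.db')
with open('update_patch.sql', 'w') as f:
    f.write(patch)

# later, on the delivery side, apply the patch to the replica
apply_patch('mydb_replica.db', patch)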
