According to this trigger example, I can use a trigger to write to other files for audit trails. How exactly do I do that?
I have tried this, to no avail:
CREATE TRIGGER log AFTER INSERT ON my_table
BEGIN
ATTACH DATABASE /location/otherfile AS logDb
Insert ..
I tried the above code in the sqlite3 console and got a syntax error near ATTACH. How else can I do this?
Thanks!
Inside a trigger body, only INSERT/UPDATE/DELETE/SELECT statements are allowed, and you cannot access other database schemas.
To access other files, you have to create a user-defined function, which requires support from any application that modifies the database.
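For illustration, here is a minimal sketch of that approach using Python's built-in sqlite3 module; the table name my_table, the function name audit_log, and the log file path are made up for this example. The trigger calls the registered function, and the function can write to any file it likes:

import sqlite3

def audit_log(table, action, rowid):
    # Append one line per change to a separate file; this file is not a
    # SQLite database, just a plain-text audit trail.
    with open("/location/otherfile.log", "a") as f:
        f.write(f"{action} on {table}, rowid={rowid}\n")

conn = sqlite3.connect("main.db")
conn.create_function("audit_log", 3, audit_log)   # must be done by every writer

conn.executescript("""
CREATE TABLE IF NOT EXISTS my_table(id INTEGER PRIMARY KEY, value TEXT);
CREATE TRIGGER IF NOT EXISTS log AFTER INSERT ON my_table
BEGIN
    SELECT audit_log('my_table', 'INSERT', NEW.id);
END;
""")

conn.execute("INSERT INTO my_table(value) VALUES ('hello')")
conn.commit()

Any other program that modifies my_table without registering audit_log will get a "no such function" error, which is exactly the limitation mentioned above.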
Related
I have an Ionic App using SQLite. I don't have any problems with implementation.
The issue is that I need to import an SQL file using SQLitePorter to populate the database with configuration info.
But also, on the same database I have user info, so my question is:
Every time I start the app, will it import the SQL file, fill the database, and probably overwrite my user data too, since it is all in the same database?
I assume that you could always initialize your tables using string queries inside your code, so the problem is not that you are importing a .sql file, right?
According to https://www.sqlitetutorial.net/sqlite-create-table/, you would normally create a table with the [IF NOT EXISTS] clause. With a query like:
CREATE TABLE [IF NOT EXISTS] [schema_name].table_name (
column_1 data_type PRIMARY KEY);
you let SQLite decide whether it is going to create the table, without the risk of overwriting an existing one. You can trust that SQLite is smart enough not to overwrite any information, especially if you use a 'BEGIN TRANSACTION' ... 'COMMIT' block.
I give my answer assuming that you keep imported data and user data in distinct tables, so you can control what you populate and what you don't. Is that right?
What I usually do is to have an SQL file like this:
DROP TABLE IF EXISTS configuration_a;
DROP TABLE IF EXISTS configuration_b;
CREATE TABLE configuration_a (...);
INSERT INTO configuration_a (...);
CREATE TABLE configuration_b (...);
INSERT INTO configuration_b (...);
CREATE TABLE IF NOT EXISTS user_data (...);
This means that every time the app starts, I update the configuration data I have at that time (that is why we could use http.get to fetch the configuration file from a remote repo in the future), and I create the user data table only if it is not already there (hopefully only on the initial start).
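Setting Ionic and SQLitePorter aside, the effect is easy to verify with a plain sqlite3 sketch in Python; the column names and values below are made up for the example:

import sqlite3

# Hypothetical, concrete version of the configuration script above.
CONFIG_SQL = """
DROP TABLE IF EXISTS configuration_a;
CREATE TABLE configuration_a (key TEXT, value TEXT);
INSERT INTO configuration_a VALUES ('api_url', 'https://example.com');
CREATE TABLE IF NOT EXISTS user_data (id INTEGER PRIMARY KEY, name TEXT);
"""

conn = sqlite3.connect("app.db")
conn.executescript(CONFIG_SQL)                 # first start: creates everything
conn.execute("INSERT INTO user_data (name) VALUES ('alice')")
conn.commit()

conn.executescript(CONFIG_SQL)                 # simulated second start
# configuration_a was rebuilt, user_data is untouched:
print(conn.execute("SELECT count(*) FROM user_data").fetchone())   # (1,)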
Conclusion: In my opinion, it is good practice to trust the database product and let it handle any operation that would be risky to implement yourself in your code, since it provides tools for exactly that. For example, the [IF NOT EXISTS] clause is always safer than implementing a table checker yourself.
I hope that helps.
PS: In case you are referring to the create-database procedure: SQLite connects to a database file and, if it doesn't exist, creates it. For someone comfortable with the sqlite command line, typing
sqlite3 /home/user/db/configuration.db
will connect you to this db and, if the file is not there, will create it.
The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it so I'm not going to spend money on it. I'm just wondering if there is anything out there that can be used to generate INSERT INTO scripts given an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but it errors out on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert
This will give you a file with your insert statements. You can also export the data in SQL Loader format as well.
You can do that in PL/SQL Developer v10.
1. Click on the table that you want to generate a script for.
2. Click Export data.
3. Check that the table you want to export data for is selected.
4. Click on the SQL Inserts tab.
5. Add a WHERE clause if you don't need the whole table.
6. Choose the file where your SQL script will be saved.
7. Click Export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – dynamic query which will be used to export metadata rows
p_new_owner_name – owner name which will be used for generated INSERT
p_new_table_name – table name which will be used for generated INSERT
p_sql in this sample is 'select * from tablename'
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution for this and found it today. Here is how you can do it.
Open Oracle SQL Developer Query Builder
Run the query
Right-click on the result set and choose Export.
Screenshot: http://i.stack.imgur.com/lJp9P.png
You might execute something like this in the database:
select "insert into targettable(field1, field2, ...) values(" || field1 || ", " || field2 || ... || ");"
from targettable;
Something more sophisticated is here.
If you have an empty table, the Export method won't work. As a workaround, I used the Table View of Oracle SQL Developer, clicked on Columns, sorted by Nullable so that NO was on top, and then selected the non-nullable columns using Shift+select for the range.
This allowed me to do one base insert, so that Export could then prepare a proper all-columns insert.
If you have to load a lot of data into tables on a regular basis, check out SQL*Loader or external tables. That should be much faster than individual INSERTs.
You can also use MyGeneration (a free tool) to write your own SQL-generating scripts. There is an "insert into" script for SQL Server included in MyGeneration, which can easily be changed to run under Oracle.
Is there a way to get system/environment variables with SQLite?
I know that by using sqlite3's command line I can do something like:
sqlite> .shell echo $USER
But I want to know if I can implement this in a SQL trigger.
We have multiple users sharing an SQLite database, and I want to create an automatic log that records who made changes to specific tables.
SQLite does not know anything about environment variables. (Many systems on which it runs do not even have them.)
The easiest way to get a user name into a trigger would be to create a user-defined function, but then the trigger would fail if used outside a program that has registered this function.
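A minimal sketch of that idea in Python, using getpass.getuser() for the OS user name; the function name current_user, the table names, and the audit schema are invented for this example. A TEMP trigger is used so that other clients, which have not registered the function, are not affected (the trade-off being that only this connection's changes are logged):

import getpass
import sqlite3

conn = sqlite3.connect("shared.db")
conn.create_function("current_user", 0, getpass.getuser)

conn.executescript("""
CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY, data TEXT);
CREATE TABLE IF NOT EXISTS audit_log (tbl TEXT, row_id INTEGER, user TEXT,
                                      at TEXT DEFAULT CURRENT_TIMESTAMP);
CREATE TEMP TRIGGER t_audit AFTER INSERT ON t
BEGIN
    INSERT INTO audit_log (tbl, row_id, user) VALUES ('t', NEW.id, current_user());
END;
""")

conn.execute("INSERT INTO t (data) VALUES ('x')")
conn.commit()
print(conn.execute("SELECT * FROM audit_log").fetchall())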
SQLite allows you to define custom functions that can be called from SQL statements. I use this to get notified of trigger activity in my application:
CREATE TRIGGER AfterInsert AFTER INSERT ON T
BEGIN
-- main trigger code involves some extra db modifications
-- ...
-- invoke application callback
SELECT ChangeNotify('T', 'INSERT', NEW.id);
END;
However, user-defined functions are added only to the current database connection. There may be other clients that haven't defined ChangeNotify, and I don't want them to get a "no such function" error.
Is it possible to call a function only if it's defined? Any alternative solution is also appreciated.
SQLite is designed as an embedded database, so it is assumed that your application controls what is done with the database.
SQLite has no SQL function to check for user-defined functions.
If you want to detect changes made only from your own program, use sqlite3_update_hook.
Prior to calling your user-defined function, you can check whether the function exists by selecting from pragma_function_list:
select exists(select 1 from pragma_function_list where name='ChangeNotify');
1
It would be possible by combining a query against pragma_function_list with a WHEN clause on the trigger:
CREATE TRIGGER AfterInsert AFTER INSERT ON T
WHEN EXISTS (SELECT 1 FROM pragma_function_list WHERE name = 'ChangeNotify')
BEGIN
SELECT ChangeNotify('T', 'INSERT', NEW.id);
END;
except that query preparation attempts to resolve function names before execution. So, as far as I know, this isn't possible to do in a trigger.
I need to do the same thing and asked here: https://sqlite.org/forum/forumpost/a997b1a01d
Hopefully they come back with a solution.
Update
The SQLite forum's suggestion is to create a TEMP trigger when your extension loads: https://sqlite.org/forum/forumpost/96160a6536e33f71
This is actually a great solution (a minimal sketch follows the list), as temp triggers are:
not visible to other connections
cleaned up when the connection that created them ends
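Here is what that looks like from application code, using Python's sqlite3 module (the notification logic in change_notify is just a placeholder):

import sqlite3

def change_notify(table, op, rowid):
    # Application callback: replace this with your real notification logic.
    print(f"{op} on {table}, rowid {rowid}")

conn = sqlite3.connect("app.db")
conn.create_function("ChangeNotify", 3, change_notify)

conn.executescript("""
CREATE TABLE IF NOT EXISTS T (id INTEGER PRIMARY KEY, payload TEXT);
-- TEMP trigger: visible only to this connection, so other clients never
-- see a reference to ChangeNotify and cannot hit 'no such function'.
CREATE TEMP TRIGGER AfterInsert AFTER INSERT ON T
BEGIN
    SELECT ChangeNotify('T', 'INSERT', NEW.id);
END;
""")

conn.execute("INSERT INTO T (payload) VALUES ('hello')")
conn.commit()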
I want a debug function to do this, but I'm unaware of whether one already exists. Going through and using 'drop table' for each of my tables will be a pain.
Help appreciated.
Since the database is just one file, you can indeed just erase it. If you want something more automatic, you can use the following to do it all programmatically:
Recover your schema:
SELECT group_concat(sql,';') FROM sqlite_master;
Disconnect from the database
Delete the database file
Create your schema again with what was returned from the above query
If you used any particular options for your original database (page_size, etc), they will have to be declared manually as well.
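As a rough sketch of those steps in Python (the database path is a placeholder; internal sqlite_* tables are skipped because their CREATE statements cannot be replayed):

import os
import sqlite3

DB = "app.db"

# 1. Recover the schema from the existing database.
conn = sqlite3.connect(DB)
schema = conn.execute(
    "SELECT group_concat(sql, ';') FROM sqlite_master "
    "WHERE sql IS NOT NULL AND name NOT LIKE 'sqlite_%'"
).fetchone()[0]

# 2. Disconnect and delete the database file.
conn.close()
os.remove(DB)

# 3. Recreate an empty database with the same schema.
conn = sqlite3.connect(DB)
if schema:
    conn.executescript(schema)
conn.close()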
to "drop database" for sqlite, simply delete the database file (and recreate if needed)