Save result of Teradata SQL statement - teradata

Does anyone know how I can save the output of the “HELP VOLATILE TABLE” statement as a table that I can use in a query later on?
My goal is to save a list of all Volatile Tables that are currently present.
I tried using “HELP VOLATILE TABLE” inside a CTE, but that doesn’t do the trick: the statement refuses to run. Any help is appreciated.
Update: It seems HELP/SHOW statements can only return data to the client. They can’t be used in a query.
It does seem possible to write an external stored procedure, e.g. in Java, that fetches this data and inserts it into a Global Temporary Table.
My question is whether someone knows how to write such an external stored procedure in Java, and how to install and call it?

For those using SAS, this is easy to do (the work table name below is just an example):
PROC SQL NOPRINT;
  CONNECT TO TERADATA ( &connect_string. );
  /* land the HELP output in a SAS work table so it can be queried later */
  CREATE TABLE work.volatile_tables AS
  SELECT *
  FROM CONNECTION TO TERADATA
  (HELP VOLATILE TABLE);
  DISCONNECT FROM TERADATA;
QUIT;
For those using Python, the same can be done easily too.
import teradatasql
import pandas as pd

# note: HELP VOLATILE TABLE lists only the volatile tables of the current session
with teradatasql.connect('{"host":"your_host_name"}', user="", password="") as connect:
    df = pd.read_sql("HELP VOLATILE TABLE", connect)
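If the list then needs to be usable in a later query, one client-side workaround (not a pure-SQL solution) is to push the dataframe back into a pre-created table, such as a global temporary table, on the same session. A rough sketch; the table name my_volatile_table_list, its column, and the assumption that the table name sits in the first HELP column are mine, not from the original post:
import pandas as pd
import teradatasql

with teradatasql.connect('{"host":"your_host_name"}', user="", password="") as connect:
    # read the HELP output on the session that owns the volatile tables
    df = pd.read_sql("HELP VOLATILE TABLE", connect)

    # assumes a table created beforehand, e.g.
    #   CREATE GLOBAL TEMPORARY TABLE my_volatile_table_list (table_name VARCHAR(128))
    #   ON COMMIT PRESERVE ROWS;
    cur = connect.cursor()
    rows = [[str(r[0])] for r in df.itertuples(index=False)]  # first column assumed to hold the table name
    cur.executemany("INSERT INTO my_volatile_table_list (table_name) VALUES (?)", rows)
Because volatile tables are session-scoped, the HELP, the insert and any later query all have to run on that same connection.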

Related

SQLite Importer will overwrite my database when I load my application?

I have an Ionic App using SQLite. I don't have any problems with the implementation.
The issue is that I need to import an SQL file using SQLitePorter to populate the database with configuration info.
But also, on the same database I have user info, so my question is:
Every time I start the app, will it import the SQL file, fill the database and probably overwrite my user data too, since it is all in the same database?
I assume you could always initialize your tables using string queries inside your code, so the problem is not that you are importing a .sql file as such. Right?
According to https://www.sqlitetutorial.net/sqlite-create-table/ you can always create a table with the [IF NOT EXISTS] switch. Writing a query like:
CREATE TABLE [IF NOT EXISTS] [schema_name].table_name (
    column_1 data_type PRIMARY KEY
);
lets SQLite decide whether to create the table, without the risk of overwriting an existing one. You can trust that SQLite is smart enough not to overwrite any information, especially if you use a 'BEGIN TRANSACTION' - 'COMMIT' procedure.
I give my answer assuming that the imported data and the user data live in distinct tables, so you can control what you repopulate and what you don't. Is that right?
What I usually do is have an SQL file like this:
DROP TABLE IF EXISTS configuration_a;
DROP TABLE IF EXISTS configuration_b;
CREATE TABLE configuration_a (...);
INSERT INTO configuration_a (...);
CREATE TABLE configuration_b (...);
INSERT INTO configuration_b (...);
CREATE TABLE IF NOT EXISTS user_data (...);
This means that every time the app starts, I update the configuration tables with whatever configuration data I have at that time (which is also why you might use http.get to fetch a configuration file from a remote repo in the future), and create the user data table only if it is not already there (hopefully only on the initial start).
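Outside the Ionic/SQLitePorter context, a minimal sketch of the same pattern with Python's built-in sqlite3 module (file, table and column names are made up for illustration) shows why re-running the script leaves existing user rows alone:
import sqlite3

# hypothetical configuration script following the pattern above
CONFIG_SCRIPT = """
BEGIN TRANSACTION;
DROP TABLE IF EXISTS configuration_a;
CREATE TABLE configuration_a (key TEXT, value TEXT);
INSERT INTO configuration_a VALUES ('theme', 'dark');
CREATE TABLE IF NOT EXISTS user_data (id INTEGER PRIMARY KEY, name TEXT);
COMMIT;
"""

conn = sqlite3.connect("app.db")
conn.executescript(CONFIG_SCRIPT)  # configuration is rebuilt, user_data rows are untouched
conn.close()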
Conclusion: in my opinion it is always good practice to trust the database product and let it handle any operation that would be risky to implement yourself in your code, since it gives you the tools for that. For example, the keyword [if not exists] is always safer than implementing a table checker yourself.
I hope that helps.
PS: In case you are referring to the database creation step: SQLite connects to a database file, and if the file doesn't exist, it creates it. For someone comfortable with the sqlite command line, typing
sqlite3 /home/user/db/configuration.db
will connect you to this db, and if the file is not there, it will create it.

Is it possible to query SAS tables from SSMS?

I want to know if it's possible to query SAS EG data from the SSMS interface in a new query and store the results (if any) in a temp table.
I've seen a lot of material on connecting SAS EG to SSMS, but not a lot on connecting SSMS to SAS EG. I primarily use SSMS and want to pass SAS code (or similar) from SSMS and retrieve data, something like this pseudocode:
BEGIN
    SELECT * INTO #GetSasData FROM (
        -- hypothetical function/procedure that pulls SAS data
        SomeFunctionOrProcedureInSSMSThatPullsSASData('Proc SQL Create table A as Select ... ; end;')
    ) [A]

    SELECT * FROM #GetSasData
END

Export Multiple Tables Data as Insert Statements into single file Oracle DB [duplicate]

The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it so I'm not going to spend money on it. I'm just wondering if there is anything out there that can be used to generate INSERT INTO scripts given an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but it errors out on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert
This will give you a file with your insert statements. You can also export the data in SQL Loader format.
You can do that in PL/SQL Developer v10.
1. Click on Table that you want to generate script for.
2. Click Export data.
3. Check that the table you want to export data for is selected.
4. Click on SQL inserts tab.
5. Add where clause if you don't need the whole table.
6. Select the file where your SQL script will be saved.
7. Click export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – dynamic query which will be used to export metadata rows
p_new_owner_name – owner name which will be used for generated INSERT
p_new_table_name – table name which will be used for generated INSERT
p_sql in this sample is 'select * from tablename'
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution for this and found it today. Here is how you can do it.
Open Oracle SQL Developer Query Builder
Run the query
Right click on result set and export
Screenshot: http://i.stack.imgur.com/lJp9P.png
You might execute something like this in the database:
select "insert into targettable(field1, field2, ...) values(" || field1 || ", " || field2 || ... || ");"
from targettable;
Something more sophisticated is here.
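That linked script aside, the same row-to-INSERT concatenation idea can be sketched client-side in Python with the python-oracledb driver; everything here (connection details, the gen_inserts helper, the quoting rules) is illustrative and only covers NULLs, numbers and plain strings:
import oracledb

def gen_inserts(conn, query, target_table):
    # run the query and turn each row into an INSERT statement
    with conn.cursor() as cur:
        cur.execute(query)
        cols = ", ".join(d[0] for d in cur.description)
        for row in cur:
            vals = []
            for v in row:
                if v is None:
                    vals.append("NULL")
                elif isinstance(v, (int, float)):
                    vals.append(str(v))
                else:
                    vals.append("'" + str(v).replace("'", "''") + "'")  # escape single quotes
            yield "insert into %s (%s) values (%s);" % (target_table, cols, ", ".join(vals))

conn = oracledb.connect(user="scott", password="tiger", dsn="localhost/orclpdb1")
for stmt in gen_inserts(conn, "select * from targettable", "targettable"):
    print(stmt)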
If you have an empty table, the Export method won't work. As a workaround, I used the Table view of Oracle SQL Developer, clicked on Columns, sorted by Nullable so that NO was on top, and then selected these non-nullable columns using shift + select for the range.
This allowed me to do one base insert, so that Export could then prepare a proper all-columns insert.
If you have to load a lot of data into tables on a regular basis, check out SQL Loader or external tables. Should be much faster than individual Inserts.
You can also use MyGeneration (a free tool) to write your own SQL-generating scripts. There is an "insert into" script for SQL Server included in MyGeneration, which can be easily changed to run under Oracle.

Sqlite3 Vs web2py DAL

I have been working with an SQLite DB for some time but want to integrate my code with web2py, especially the DAL. How do I rewrite code like this as web2py DAL code?
name = input('Please Type your Question: ').lower().split()
name2 = name[:]
import sqlite3
for item in name2:  # break
    conn = sqlite3.connect("foods.db")
    cursor = conn.cursor()
    cursor.execute("INSERT INTO INPUT33 (NAME) VALUES (?);", (name2,))
    cursor.execute("select MAX(rowid) from [input33];")
    conn.commit()
    for rowid in cursor:
        break
    for elem in rowid:
        m = elem
    print(m)
    cursor.execute("DELETE FROM INPUT33 (NAME) WHERE NAME = name")
I do not quite understand the question so I would like to apologize in advance for any misunderstanding.
Web2py is a web MVC framework and you should follow that pattern while designing your application. With that in mind, using a console-related function like input makes no sense. Also, you shouldn't use the same component both to collect user input and to deal with the database connection and data access/manipulation.
If your intent is simply to convert your code snippet from the sqlite3 module to pyDAL, you just need to install it (pip install pydal) and change your code to something like:
# Do your imports
from pydal import DAL, Field

# connect to your database
db = DAL('sqlite://foods.db')

# define your table model and its fields
db.define_table('input33', Field('NAME'))

# insert a row into the input33 table (name2 comes from your input handling)
db.input33.insert(NAME=name2)

# every insert/delete/update needs you to commit your changes
db.commit()
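If you also want pyDAL equivalents for the remaining sqlite3 calls in the question (getting the last rowid and deleting rows), a rough sketch along the same lines might look like this (illustrative, using the same table definition as above):
# insert() already returns the id of the new row
last_id = db.input33.insert(NAME=name2)
db.commit()
print(last_id)

# or query the maximum id explicitly
max_id = db.input33.id.max()
row = db().select(max_id).first()
print(row[max_id])

# delete the rows whose NAME matches, then commit
db(db.input33.NAME == name2).delete()
db.commit()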
Full documentation can be found here.

How do I speed up the import of data from a CSV file into a SQLite table (in Windows)?

When I was searching for a tool to create and update SQLite databases for use in an Android application, I was recommended SQLite Database Browser. This has a Windows GUI and is reasonably powerful, offering in particular a menu option to import data into a new table from a CSV file.
This has proved perfectly capable for initial creation of the database and I have been using the CSV Import option to update the database whenever I have new data to be added.
When there were only a few records to import this worked well, however as the volume of data has grown the process has become painfully slow. A data file of 11,000 records (800 kilobytes) takes about 10 minutes to import on my averagely slow laptop. Using SQLite Database Browser the whole process of deleting the old table, running the import command, then correcting the data types of the new table created by the import command takes the best part of 15 minutes.
How can the import be sped up?
You could use the built-in csv import (using the sqlite3 command line utility):
create table test (id integer, value text);
.separator ","
.import no_yes.csv test
Importing 10,000 records took less than 1 second on my Laptop.
By googling I have found several people asking this question; however, I have not found the answer set out in one place in simple terms that I could understand. So, I hope the following will help.
The command line utility sqlite3.exe offers a very simple solution. The reason the "import CSV" option in SQLite Database Browser is so slow is that it executes and commits to the database a separate SQL 'insert' statement for each line in the CSV file. However, sqlite3.exe includes an "import" command which processes the whole file in one go. What's more, this is done virtually instantaneously: my 11,000 records are imported in well under a second.
There is a slight drawback in that the import command does not deal with commas in the same way as other programs such as Excel. For example,
if cell A1 in Excel contains Joe Bloggs
and cell B1 contains 123 Main Street, Anytown
the row is exported into a CSV file as:
Joe Bloggs,"123 Main Street, Anytown"
However, if you tried to import this using sqlite3 into a 2-column table, sqlite3 would report an error because it would treat each of the commas as a field separator and so would try to import Joe Bloggs, "123 Main Street and Anytown" as 3 separate fields.
Because it is unusual for text fields (especially in Excel) to include tabs, this problem can usually be avoided by using a file where the fields are delimited by tabs rather than by commas.
Since sqlite3.exe can execute any SQL statement and a number of additional commands (like 'import') it is very flexible. However, a routine job like my need to import a delimited data file into a database table can be automated by:
listing the SQL statements and sqlite3.exe commands in a small text file, and feeding this file into sqlite3.exe as a command line parameter
writing a short Windows (MS-DOS) batch file to run sqlite3.exe with the specified list of commands.
These are the steps I followed:
Download and unzip sqlite3.exe
Convert the raw data from comma separated values to tab separated values (a small Python sketch of this conversion step is shown after the batch file below).
Create a script file listing commands to be executed by sqlite3.exe as follows:
drop table tblTableName;
create table tblTableName(_id INTEGER PRIMARY KEY, fldField1 TEXT, fldField2 NUMERIC, .... );
.mode tabs
.import SubfolderName/DataToBeImported.tsv tblTableName
(Note: SQL statements are followed by a semi-colon; sqlite3.exe commands are preceded by a full stop (period))
Create a .bat file as follows:
cd "c:\users\UserName\FolderWhereSqlite3DatabaseFileAndScriptFileAreStored"
sqlite3 DatabaseName < textimportscript.txt
Having set this up, all I need to do whenever I have new data to add is run the batch file and the data is imported in an instant.
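For the conversion step above, a small Python sketch could look like this; the file names follow the example above (the .csv name is assumed), and it presumes the data itself contains no tab characters:
import csv

# rewrite comma separated input as tab separated output, so that embedded
# commas inside quoted fields ("123 Main Street, Anytown") survive the import
with open("DataToBeImported.csv", newline="", encoding="utf-8") as src, \
     open("DataToBeImported.tsv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst, delimiter="\t")
    for row in csv.reader(src):
        writer.writerow(row)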
If you are generating INSERT statements, enclose them in a single transaction as stated in the official SQLite FAQ:
BEGIN; -- or BEGIN TRANSACTION;
INSERT ...;
INSERT ...;
END; -- can be COMMIT TRANSACTION; also
Have you tried wrapping all of your updates into a transaction? I had a similar problem and doing that sped it up no end.
Assuming Android Device:
db.beginTransaction();
try {
    // YOUR CODE
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Try that :)
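The same single-transaction idea applied to the original Windows/CSV scenario, sketched with Python's sqlite3 and csv modules (database name, CSV name and the number of columns are placeholders):
import csv
import sqlite3

conn = sqlite3.connect("mydatabase.db")
with open("DataToBeImported.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

with conn:  # one transaction for the whole import instead of one commit per row
    # adjust the number of ? placeholders to match your column count
    conn.executemany("INSERT INTO tblTableName VALUES (?, ?, ?)", rows)
conn.close()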
sqlite> PRAGMA journal_mode=WAL;
sqlite> PRAGMA synchronous = 0;
sqlite> PRAGMA journal_mode=MEMORY;
memory
sqlite> BEGIN IMMEDIATE;
.import --csv blah.csv <tablename>
sqlite> COMMIT;
This turns off fsync() on write and keeps the journal in memory, so it's not "safe", but as long as you are doing this "offline", as it were, and are OK with re-creating the DB if the power goes out, the disk fills up, etc., then this will definitely speed up the import.
