Teradata SQL export to CSV

Is there a way to run a Teradata SQL query and then export the data to an external file?
For example, if I ran:
SELECT TOP 10 *
FROM mydb.mytable;
Could I write this in such a way that it will export to a CSV? Do I need to store my data in a temp table first and then do CREATE EXTERNAL TABLE? Any ideas would be appreciated - I've been returning data to R and then exporting, but there's no need for the intermediate step in some jobs.

There's no CREATE EXTERNAL TABLE in Teradata.
The only tool capable of exporting CSV directly is TPT (Teradata Parallel Transporter).
Otherwise you have to do the concatenation in your query, e.g. SELECT TRIM(col1) || ',' || TRIM(col2)..., and then export based on your client's capabilities; a BTEQ sketch is shown below.
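For example, a minimal BTEQ sketch of the concatenation route (the logon string, output width, and the three column names are assumptions to adapt to your system; TITLE '' suppresses the column heading in REPORT format):
.LOGON mytdpid/myuser,mypassword;
.EXPORT REPORT FILE = mydata.csv;
.SET WIDTH 65531;
SELECT TRIM(col1) || ',' || TRIM(col2) || ',' || TRIM(col3) (TITLE '')
FROM mydb.mytable;
.EXPORT RESET;
.LOGOFF;
Save it as export.bteq and run it with bteq < export.bteq.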

Related

specifying Azure blob virtual folder instead of file for ingesting into Kusto

Referring to the .ingest into table <tablename> feature: as per the documentation, we need to specify a direct file name (blob). But it is more common to have a bunch of text files in a given blob path, all of which need to be imported. Is there a way we can specify a path? I have tried, but Kusto won't accept a folder path.
Kusto does not iterate over folders or containers.
Zip all your files into a single archive and place it on blob storage. This .ingest into command worked for me:
.ingest into table Blah (
h#'https://YOURACCOUNT.blob.core.windows.net/somefolder/FileFullofCsvs.zip;YOURKEY'
)
with (
format = "csv",
ignoreFirstRecord = true,
zipPattern="*.csv"
)
You can probably achieve this by creating an external table referencing your blob storage folder.
Generate SAS token
Generate a SAS token for your blob storage folder (be sure to select read and list permissions, plus any others as appropriate).
Create external table
Here is the Kusto query
.create external table myExternalTable (ProductID:string, Name:string, Description:string, ExpiryDate:datetime)
kind=blob
dataformat=csv
(
h#'https://{storageaccount}.blob.core.windows.net/{file system}/{folder name}?{SAS token url generated from step 1}'
)
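Before copying the data over, you can sanity-check the external table definition by querying it directly (external_table() is the standard KQL accessor for tables created this way):
external_table("myExternalTable")
| take 10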
Create Table in Azure Data Explorer DB
Set or append the data into the Azure Data Explorer database table:
.set-or-append myProductTable with (extend_schema=true) <| external_table("myExternalTable")
Query the table
This will list all the data rows in the table
myProductTable

Read a csv file and insert the values in mysql database using R

I am able to read a CSV file using the read function. I now want to insert the values into a table in a MySQL database. I have to make it dynamic, so that if the content of the CSV changes it can still insert.
Your post is very broad. I advise you to go step by step; start by reading the dplyr documentation.
I use dplyr for persistence to a MySQL database. It is a powerful package; a minimal DBI-based sketch is shown below the link.
https://shiny.rstudio.com/articles/pool-dplyr.html
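For instance, a minimal sketch with DBI (the connection details and the table name mytable are placeholders to adjust; RMariaDB speaks the MySQL protocol as well):
library(DBI)

# Read the CSV; column names and types are inferred from the file,
# so a changed CSV layout still maps onto the data frame
df <- read.csv("data.csv", stringsAsFactors = FALSE)

# Connect to MySQL (placeholder credentials - adjust for your server)
con <- dbConnect(RMariaDB::MariaDB(),
                 dbname = "mydb", host = "localhost",
                 user = "user", password = "password")

# dbWriteTable derives the table schema from the data frame,
# so no hand-written DDL is needed when the CSV changes
dbWriteTable(con, "mytable", df, overwrite = TRUE)

dbDisconnect(con)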

Export Multiple Tables' Data as Insert Statements into a Single File, Oracle DB [duplicate]

The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it so I'm not going to spend money on it. I'm just wondering if there is anything out there that can be used to generate INSERT INTO scripts given an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but it errors on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert
This will give you a file with your insert statements. You can also export the data in SQL Loader format as well.
You can do that in PL/SQL Developer v10.
1. Click on Table that you want to generate script for.
2. Click Export data.
3. Check if table is selected that you want to export data for.
4. Click on SQL inserts tab.
5. Add where clause if you don't need the whole table.
6. Select file where you will find your SQL script.
7. Click export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – dynamic query which will be used to export metadata rows
p_new_owner_name – owner name which will be used for generated INSERT
p_new_table_name – table name which will be used for generated INSERT
p_sql in this sample is 'select * from tablename'
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution for this and found it today. Here is how you can do it.
Open Oracle SQL Developer Query Builder
Run the query
Right click on result set and export
(screenshot: http://i.stack.imgur.com/lJp9P.png)
You might execute something like this in the database (note the single quotes; double quotes denote identifiers in Oracle):
select 'insert into targettable(field1, field2, ...) values (' || field1 || ', ' || field2 || ... || ');'
from targettable;
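Character columns additionally need the quotes embedded in the generated text; a hedged variant assuming field2 is a VARCHAR2 (a doubled '' inside a literal produces one quote):
select 'insert into targettable(field1, field2) values (' || field1 || ', ''' || field2 || ''');'
from targettable;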
Something more sophisticated is here.
If you have an empty table, the Export method won't work. As a workaround, I used the Table View of Oracle SQL Developer, clicked on Columns, and sorted by Nullable so NO was on top. Then I selected the non-nullable columns using shift+select for the range.
This allowed me to do one base insert, so that Export could prepare a proper all-columns insert.
If you have to load a lot of data into tables on a regular basis, check out SQL Loader or external tables. Should be much faster than individual Inserts.
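As a rough sketch of the external-table route (the directory object data_dir, the file data.csv, the column layout, and the target table mytable are all assumptions; the directory object must already point at the folder holding the file):
create table ext_data (
  name  varchar2(50),
  units number,
  price number
)
organization external (
  type oracle_loader
  default directory data_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
  )
  location ('data.csv')
);

-- the CSV is now queryable, so the bulk load is a single statement
insert into mytable select * from ext_data;
Since an external table re-reads the file on every query, copying it into a regular table once is usually worthwhile for repeated access.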
You can also use MyGeneration (a free tool) to write your own SQL generation scripts. There is an "insert into" script for SQL Server included in MyGeneration, which can easily be changed to run under Oracle.

SQLite on CSV file

I have a small statistics program which you can point at a CSV file. It tries to determine certain properties (e.g. which columns might be a date). Lately I have been reading a lot about SQLite and would like to port my application to make use of it, as this would make it easier to create new statistics: only a new select would have to be written.
Now what I would like to know is this: I know that SQLite can operate in memory, but of course I don't want to always load the whole file into memory, as it can become rather big. So I would like to point SQLite at the CSV file and provide the column information, so that I can run queries on it. It would also be nice if I could create an index in memory (or a temporary directory) so that the statistics run faster. This would not need to modify the CSV, only do selects.
Can this be done out of the box? If not, can I write my own file manager and connect it to SQLite to achieve this? Writing my own file manager would only be an option if the effort is not too big, as I don't want to write full-blown database code.
SQLite's command-line shell can import a CSV file directly:
$ cat data.csv
Cheese,7,12.3
Bacon,8,19.4
Eggs,3,20.3
# With no filename SQLite creates the database in memory.
$ sqlite3
sqlite> create table data (name text, units integer, price double);
sqlite> .separator ','
sqlite> .import data.csv data
sqlite> select * from data;
Cheese,7,12.3
Bacon,8,19.4
Eggs,3,20.3
You can add constraints and indexes on this table to help you with your analysis.
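For example, an index on the name column (a one-liner against the sample table above):
sqlite> create index data_name_idx on data (name);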

Bulk load data into sqlite?

Does anybody have any tips on utilities that can be used to bulk load data that is stored in delimited text files into an SQLite database?
Ideally something that can be called as a stand-alone program from a script etc.
A group I work with has an Oracle Database that's going to dump a bunch of data out to file and then load that data into an SQLite database for use on a mobile device and are looking for the easiest way to implement that sort of scenario.
Check out the sqlite3 .import command - it does exactly this.
You can set the separator with the .separator command:
$ sqlite3 myDatabase
sqlite> create table myTable (a, b, c);
sqlite> .separator ','
sqlite> .import myFile myTable
Why do you want a text file?
Just use Java, which has easily available libraries for Oracle and SQLite access. Connect to both databases, then select from one and insert into the other, with none of the additional complexity of CSV, which is not a very well defined format and will give you problems with character encoding, quotes, commas/tabs/semicolons, newlines etc. in your data. A JDBC sketch is shown below.
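A minimal JDBC sketch of that approach (the table layout mytable(a, b, c), connection strings, and credentials are assumptions; it needs the Oracle ojdbc and sqlite-jdbc drivers on the classpath, and the target table must already exist in the SQLite file):
import java.sql.*;

public class OracleToSqlite {
    public static void main(String[] args) throws SQLException {
        // Source: Oracle via the JDBC thin driver
        try (Connection src = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             // Target: a SQLite file, created on first connect
             Connection dst = DriverManager.getConnection("jdbc:sqlite:mobile.db")) {

            dst.setAutoCommit(false); // one commit at the end is much faster

            try (Statement sel = src.createStatement();
                 ResultSet rs = sel.executeQuery("select a, b, c from mytable");
                 PreparedStatement ins = dst.prepareStatement(
                     "insert into mytable (a, b, c) values (?, ?, ?)")) {

                while (rs.next()) {
                    // getObject/setObject keep types generic, sidestepping
                    // CSV quoting and encoding issues entirely
                    ins.setObject(1, rs.getObject(1));
                    ins.setObject(2, rs.getObject(2));
                    ins.setObject(3, rs.getObject(3));
                    ins.addBatch();
                }
                ins.executeBatch();
            }
            dst.commit();
        }
    }
}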
