Cannot export a table with a CLOB column from PL/SQL Developer - plsql

I need to export a table that has a CLOB column.
I need INSERT scripts for that table, but I get the following error when I try to export the table from PL/SQL Developer:
Table XYZ contains one or more CLOB columns. Cannot export in SQL format, use PL/SQL Developer format instead
Is there a way I can export tables with CLOB columns as plain SQL scripts?

I know this topic is old, but maybe someone will need this in the future:
When you export a CLOB field, use
dbms_lob.substr(blob_field, 3999, 1) blob_field
where blob_field is your CLOB column. Note that this truncates the value to its first 3,999 characters.
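dbms_lob.substr only gets you the first few thousand characters, though. For a full export, the value has to be split into chunks small enough for Oracle's 4000-character literal limit and concatenated back together. A rough Python sketch of that chunking logic (clob_to_insert_expr, the 3,999-character chunk size, and the TO_CLOB(...) || ... output format are illustrative assumptions, not anything PL/SQL Developer produces):

```python
def clob_to_insert_expr(value: str, chunk_size: int = 3999) -> str:
    """Split a long string into chunks that fit Oracle's 4000-char
    literal limit and join them with TO_CLOB(...) || ... so the
    whole value survives in an INSERT script."""
    chunks = [value[i:i + chunk_size] for i in range(0, len(value), chunk_size)]
    # Double any single quotes so each chunk is a valid SQL literal.
    parts = ["TO_CLOB('%s')" % c.replace("'", "''") for c in chunks]
    return " || ".join(parts) if parts else "TO_CLOB('')"

# An 8,000-character value becomes three concatenated TO_CLOB literals.
expr = clob_to_insert_expr("x" * 8000)
```

The generated expression can then be pasted into the VALUES list of a hand-built INSERT statement.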

As of the latest version of SQL Developer you can try the XML export; that at least worked for me for CLOBs.

Related

Export Multiple Tables' Data as Insert Statements into a single file Oracle DB [duplicate]

The only thing I don't have an automated tool for when working with Oracle is a program that can create INSERT INTO scripts.
I don't desperately need it so I'm not going to spend money on it. I'm just wondering if there is anything out there that can be used to generate INSERT INTO scripts given an existing database without spending lots of money.
I've searched through Oracle with no luck in finding such a feature.
It exists in PL/SQL Developer, but errors out on BLOB fields.
Oracle's free SQL Developer will do this:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
You just find your table, right-click on it and choose Export Data->Insert
This will give you a file with your insert statements. You can also export the data in SQL Loader format as well.
You can do that in PL/SQL Developer v10.
1. Click on Table that you want to generate script for.
2. Click Export data.
3. Check if table is selected that you want to export data for.
4. Click on SQL inserts tab.
5. Add where clause if you don't need the whole table.
6. Select the file where your SQL script will be saved.
7. Click export.
Use a SQL function (I'm the author):
https://github.com/teopost/oracle-scripts/blob/master/fn_gen_inserts.sql
Usage:
select fn_gen_inserts('select * from tablename', 'p_new_owner_name', 'p_new_table_name')
from dual;
where:
p_sql – the dynamic query used to select the rows to export
p_new_owner_name – the owner name used in the generated INSERTs
p_new_table_name – the table name used in the generated INSERTs
In this sample, p_sql is 'select * from tablename'.
You can find original source code here:
http://dbaora.com/oracle-generate-rows-as-insert-statements-from-table-view-using-plsql/
Ashish Kumar's script generates individually usable insert statements instead of a SQL block, but supports fewer datatypes.
I have been searching for a solution to this and found it today. Here is how you can do it:
Open Oracle SQL Developer's Query Builder
Run the query
Right-click on the result set and choose Export
http://i.stack.imgur.com/lJp9P.png
You might execute something like this in the database (note that Oracle string literals use single quotes, doubled to escape them):
select 'insert into targettable(field1, field2, ...) values (''' || field1 || ''', ''' || field2 || ''' ...);'
from targettable;
Something more sophisticated is here.
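The same select-and-concatenate idea is often easier to do client-side, where quoting can be handled per data type. A minimal Python sketch using sqlite3 as a stand-in database (generate_inserts and the targettable schema are made up for illustration):

```python
import sqlite3

def generate_inserts(conn, table):
    """Yield one INSERT statement per row of `table`, quoting strings
    and passing numbers and NULLs through unquoted."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = ", ".join(d[0] for d in cur.description)
    for row in cur:
        vals = []
        for v in row:
            if v is None:
                vals.append("NULL")
            elif isinstance(v, (int, float)):
                vals.append(str(v))
            else:
                # Double embedded quotes so the literal stays valid.
                vals.append("'" + str(v).replace("'", "''") + "'")
        yield f"insert into {table} ({cols}) values ({', '.join(vals)});"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE targettable (field1 TEXT, field2 INTEGER)")
conn.execute("INSERT INTO targettable VALUES (?, ?)", ("O'Brien", 42))
stmts = list(generate_inserts(conn, "targettable"))
```

Dates, BLOBs and CLOBs would need extra handling, which is exactly where tools like the fn_gen_inserts function above earn their keep.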
If you have an empty table, the Export method won't work. As a workaround, I used the Table view of Oracle SQL Developer, clicked on Columns, sorted by Nullable so NO was on top, and then selected those non-nullable columns using Shift+click for the range.
That allowed me to do one base insert, so that Export could then prepare a proper all-columns insert.
If you have to load a lot of data into tables on a regular basis, check out SQL*Loader or external tables. They should be much faster than individual INSERTs.
You can also use MyGeneration (free tool) to write your own sql generated scripts. There is a "insert into" script for SQL Server included in MyGeneration, which can be easily changed to run under Oracle.

Stored Procedure not returning Unicode characters even when the tables and the variables are set as nvarchar

I have been working on an ASP.NET website that includes various features such as adding data, modifying data, and downloading the data to Excel.
When we download the data, a stored procedure runs that generates some queries, which are then used to insert data into a temp table, from which the data is downloaded to Excel. Previously this system worked without any Unicode encoding and was fine. Now we are enhancing its capability to handle special characters. We have changed all the table data types to nvarchar and also prepended N to the string literals in the queries sent from the ASP.NET code. But when the data type of the variables in the stored procedure is changed to nvarchar, it returns an error stating 'Invalid syntax near ')''. So I kept the variables as varchar, while the table columns are nvarchar. The data gets downloaded, but I get '?' where there were special characters in the table.
Please help, as I am new to this domain and have been stuck on this issue for a long time.
It's working like this:
if object_id('sample', 'U') is not null
drop table sample;
create table sample (
text nvarchar(50) null
)
insert into sample(text) values (N'fööbäär#?<*');
insert into sample(text) values (N'~©‚它©¨Ω†∑');
select * from sample;
drop table sample;
Output:
text
-----------
fööbäär#?<*
~©‚它©¨Ω†∑
If you post some of your code, it will be easier to help.
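The '?' symptom in the question is what a round-trip through a non-Unicode type does to characters the code page cannot represent. A small Python sketch of the effect (ASCII stands in for the server's varchar code page here; the actual behavior depends on the collation):

```python
text = "fööbäär#?<*"

# A varchar round-trip through a non-Unicode code page degrades
# characters it cannot represent to '?'. This mirrors declaring the
# stored-proc variables as varchar while the columns are nvarchar.
degraded = text.encode("ascii", errors="replace").decode("ascii")

# Kept as Unicode end to end (nvarchar columns, nvarchar variables,
# N'...' literals), the value survives unchanged.
preserved = text
```

This is why the variables inside the stored procedure have to be nvarchar too, not just the table columns; the syntax error from that change is a separate bug worth fixing rather than working around.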

Need to insert huge data into sqlite table from insert statement

I have a statement like this:
INSERT INTO 'tablename' ('column1', 'column2') VALUES
('data1', 'data2'),
('data1', 'data2'),
('data1', 'data2'),
('data1', 'data2');
But it contains a huge amount of data (about 100,000 rows); it was exported like this from another database.
Does anybody know a way to insert this data into the table? Any way will help. This is a one-off operation, not for a program, etc.
All the SQLite managers I have tried just hang.
I read here that SQLite does not support such statements, so maybe I need a way to convert the statement into something usable, perhaps through another DB type.
Try the .import command:
.import FILE TABLE     Import data from FILE into TABLE
Note that .import expects delimited (e.g. CSV) data; for a file of SQL statements like the one above, use the .read command instead.
There is a HUGE speed difference between running individual SQL statements and using .import in SQLite.
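The same speed difference exists when loading programmatically: batch the rows and wrap them in a single transaction instead of committing each INSERT separately. A sketch with Python's standard sqlite3 module (the table and data are the placeholders from the question):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tablename (column1 TEXT, column2 TEXT)")

rows = [("data1", "data2")] * 100_000

# One transaction around a single executemany call, instead of
# 100,000 separately committed INSERT statements.
with conn:
    conn.executemany(
        "INSERT INTO tablename (column1, column2) VALUES (?, ?)", rows
    )

count = conn.execute("SELECT COUNT(*) FROM tablename").fetchone()[0]
```

On a real file, each per-statement commit costs a disk sync, which is why un-batched loads of this size appear to hang.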
Use pragma encoding to check the encoding of text in SQLite.

Teradata SQL export to csv

Is there a way to run a Teradata SQL query and then export the data to an external file?
For example, if I ran:
SELECT TOP 10
*
FROM
mydb.mytable
Could I write this in such a way that it will export to a CSV? Do I need to store my data in a temp table first and then do CREATE EXTERNAL TABLE? Any ideas would be appreciated - I've been returning data to R and then exporting, but there's no need for the intermediate step in some jobs.
There's no CREATE EXTERNAL TABLE in Teradata.
The only tool capable of exporting CSV directly is TPT (Teradata Parallel Transporter).
Otherwise you have to do the concat in your query using SELECT TRIM(col1) || ',' || TRIM(col2)... and then export based on your client's capabilities.
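If the client is a scripting language, the concatenation and quoting can be left to a CSV library instead of TRIM(col1) || ',' || ... in the query. A Python sketch (sqlite3 stands in here for a Teradata connection, e.g. via an ODBC or teradatasql driver; mytable is illustrative):

```python
import csv
import io
import sqlite3  # stand-in for a real Teradata client connection

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (col1 TEXT, col2 INTEGER)")
conn.execute("INSERT INTO mytable VALUES ('a,b', 1)")

cur = conn.execute("SELECT col1, col2 FROM mytable")
buf = io.StringIO()  # in practice, open('out.csv', 'w', newline='')
writer = csv.writer(buf)
writer.writerow([d[0] for d in cur.description])  # header row
writer.writerows(cur)                             # quoting handled for us
```

Unlike manual concatenation, the csv module correctly quotes values that contain commas or quotes, as the 'a,b' value above shows.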

Syntax for using collate nocase in a SQLite replace function

I have an existing database where they created their own Unicode collation sequence. I'm trying to use the following code and get a "no such collation sequence" exception. Can anybody help with the syntax to use "collate nocase" with this code?
update Songs set
SongPath = replace (SongPath, 'Owner.Funkytown', 'Jim');
Dump the database (via the shell), edit the output SQL (find the column definitions and change them to COLLATE NOCASE), then recreate the database.
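If the error comes up from application code rather than the shell, another option is to register the missing collation on the connection before running the UPDATE. A Python sketch with the standard sqlite3 module (the collation name UNICODE is an assumption; use whatever name the "no such collation sequence" error reports):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Register a case-insensitive comparison under the name the schema expects.
def nocase_unicode(a, b):
    a, b = a.lower(), b.lower()
    return (a > b) - (a < b)

conn.create_collation("UNICODE", nocase_unicode)

# With the collation registered, statements touching the column work again.
conn.execute("CREATE TABLE Songs (SongPath TEXT COLLATE UNICODE)")
conn.execute("INSERT INTO Songs VALUES ('C:/Owner.Funkytown/song.mp3')")
conn.execute(
    "UPDATE Songs SET SongPath = replace(SongPath, 'Owner.Funkytown', 'Jim')"
)
result = conn.execute("SELECT SongPath FROM Songs").fetchone()[0]
```

This avoids editing the dump at all; the custom collation only has to behave consistently with whatever the original application used.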
