Unable to import Oracle dump in Oracle 11g

I am trying to import an Oracle dump into Oracle 11g XE using the command below:
imp system/manager@localhost file=/home/madhu/test_data/oracle/schema_only.sql full=y
I get the following errors:
IMP-00037: Character set marker unknown
IMP-00000: Import terminated unsuccessfully
Can anyone please help me?

You received the IMP-00037 error because imp could not read the file as an export dump. Either your dump file is corrupted, or it was not created by the exp utility at all; the .sql extension in your command suggests it may be a SQL script rather than an exp dump.
If the dump file really is corrupted, there is no choice other than obtaining an uncorrupted copy. If the file was created with the expdp utility, use impdp to import it instead of imp.
The following links may help you explore other options:
https://community.oracle.com/thread/870104?start=0&tstart=0
https://community.oracle.com/message/734478
If you are not sure which command (exp or expdp) was used, check the log file that was created during the export. It contains the exact command that was executed to create the dump file.
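If the export turns out to have been made with expdp, the matching Data Pump import would look roughly like this. This is only a sketch and cannot be run outside an Oracle environment: dump_dir and schema_only.dmp are placeholder names (Data Pump reads through a directory object, not an OS path), and nothing here is taken from the asker's setup.

```shell
# one-time setup, run in SQL*Plus as a privileged user:
#   CREATE DIRECTORY dump_dir AS '/home/madhu/test_data/oracle';
#   GRANT READ, WRITE ON DIRECTORY dump_dir TO system;

# then import with impdp instead of imp:
impdp system@localhost DIRECTORY=dump_dir DUMPFILE=schema_only.dmp FULL=Y
```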

Related

How to load Northwind into SQLite3 -- Error 'File is not a database'

I am trying to load the Northwind.Sqlite3.create.sql hosted on https://github.com/jpwhite3/northwind-SQLite3 into SQLite3 on Ubuntu.
I have tried using: sqlite3 Northwind.Sqlite3.create.sql to import the database.
However when I try to view the data using SELECT * FROM CUSTOMERS; I get an error saying Error: file is not a database
Any suggestions as to how to open the database file and use it?
That's just a text file full of DDL statements, not a SQLite3 database. You'd have to import it into a database file with something like:
sqlite3 mydatabase.db < Northwind.Sqlite3.create.sql
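As a self-contained illustration, with a one-line script standing in for the real Northwind DDL (the table and file names below are made up for the demo):

```shell
# a tiny SQL script standing in for Northwind.Sqlite3.create.sql
printf 'CREATE TABLE Customers(Id INTEGER PRIMARY KEY, Name TEXT);\n' > create.sql

# feed the DDL into a new database file -- this creates a real SQLite3 database
sqlite3 mydatabase.db < create.sql

# the database file (not the .sql script) can now be opened and queried
sqlite3 mydatabase.db "SELECT name FROM sqlite_master WHERE type='table';"
```

Once the script has been imported this way, opening mydatabase.db with sqlite3 lets you run SELECT statements normally.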

SQLite import csv error CREATE TABLE data;(...) failed: near ";": syntax error

Brand new to SQLite, running on a Mac. I'm trying to import a csv file from the SQLite tutorial:
http://www.sqlitetutorial.net/sqlite-import-csv/
The 'cities' data I'm trying to import for the tutorial is here:
http://www.sqlitetutorial.net/wp-content/uploads/2016/05/city.csv
I try and run the following code from Terminal to import the data into a database named 'data' and get the following error:
sqlite3
.mode csv
.import cities.csv data;
CREATE TABLE data;(...) failed: near ";": syntax error
A possible explanation may be the way I'm downloading the data: I copied the data from the webpage into TextWrangler, saved it as a .txt file, and then manually changed the extension to .csv. This doesn't seem very elegant, but that was the advice I found online for creating the .csv file: https://discussions.apple.com/thread/7857007
If this is the issue then how can I resolve it? If not then where am I going wrong?
Another potentially useful point - when I executed the code yesterday there was no problem, it created a database with the data. However, running the same code today produces the error.
sqlite3 dot commands such as .import are not SQL and don't need a semicolon at the end. Replace
.import cities.csv data;
with
.import cities.csv data
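The whole session, with the semicolon dropped, can be sketched like this; the two csv rows below are made up, standing in for the tutorial's city.csv:

```shell
# a small csv standing in for city.csv
printf 'name,population\nParis,2140526\nTokyo,13929286\n' > cities.csv

# no semicolon after the .import dot command; since the target table does
# not exist yet, the first csv row supplies the column names
sqlite3 data.db <<'EOF'
.mode csv
.import cities.csv cities
EOF

# two data rows should now be in the table
sqlite3 data.db 'SELECT COUNT(*) FROM cities;'
```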

Importing database to 000webhost and receiving the following error: #1064

Error: - You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '#', 1) LIMIT 1' at line 2
Thanks in advance.
I had the same problem. When you import the whole DB, it's quite difficult to find the error.
Check your 'Option' column; the problem could be in dumping the data there.
But I solved it my own way: I didn't import the whole DB, I uploaded it in pieces. What you have to do is:
- Import each column manually.
- Then dump it manually.
There you go.

fast export unexplained failure

I have roughly 14 million records that I am attempting to export from a Teradata table to file using a fast export connection object.
There is no size limit for fast export files on our Linux system, and there is 1.2 TB of available space in the target directory.
The session fails, and gives the following errors:
READER_2_1_1 FEXP_87011 Process [16022] exited with status [12]
SDKS_38200 Partition-level [SOURCE_TABLE_NAME]: Plug-in #305400 failed in deinit()
I googled the error message, and found this post:
Here
I followed the recommendations in the post: delete the .out file in the temp directory, delete the partially written files in the target directory, drop the error table, and delete the log file. This did not fix the issue, and the session still fails with the same error messages.
Try using the TPT Export plug-in instead. You can also try executing this FastExport using bteq scripts directly in your Unix environment.

Creating a Dump File for Oracle table

I intend to export one table from my database as a .dmp file. This is what I am doing:
expdp SYSTEM/manager@UATDB FILE=F:\LLT.dmp log=F:\llt.log tables=TBAADM.LLT
The error I am getting is:
LRM-00101: unknown parameter name 'FILE'
What is my mistake? Please help.
OK guys, I found the answer. You need to first create a directory object where the dmp file is going to be written, then export like this:
expdp USERNAME@server_ip/SERVICE_NAME DIRECTORY=DIR_NAME DUMPFILE=FILE.dmp TABLES=SCHEMA.TABLE_NAME
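The directory object mentioned above can be created with SQL along these lines; this is a sketch to be run as a privileged user in an Oracle session, and the directory name, path, and grantee are illustrative, not taken from the question:

```sql
-- map a directory object to an existing OS folder where the dump will land
CREATE OR REPLACE DIRECTORY dir_name AS 'F:\dumps';
-- allow the exporting user to read and write through it
GRANT READ, WRITE ON DIRECTORY dir_name TO username;
```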
You can also use the following. It is a shell command, so execute it in the shell. Once the export completes, look for the dpdump folder in the Oracle installation directory; that is where you will find the exported file along with its log:
expdp userid/pwd schemas=dbschema dumpfile=file.dmp logfile=file.log
