error during sqoop data migration - oracle11g

I am trying to pull data from Oracle and push it into HDFS using Sqoop 1.4.6. The table I want to migrate contains a column named "COMMENT" (which is a reserved keyword in Oracle), but when I tried to push the table into HDFS with Sqoop, I got the following error:
15/09/30 14:52:49 ERROR db.DBRecordReader: Top level exception:
java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
I have tried escaping the column name with \ and " when listing the columns for the query, e.g.:
"\"\"COMMENT\"\""
but the error remains. How can I get this fixed?

Please try the --query option for the Sqoop import.
eg: sqoop import --query "select COMMENT from Table_Name ....."
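A fuller sketch of that approach (the connection string, credentials, table, columns, and target directory below are placeholders, not values from the question): quote the reserved column name inside the free-form query, and remember that --query requires the $CONDITIONS token and a --target-dir.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username myuser --password mypass \
  --query 'SELECT "COMMENT", OTHER_COL FROM MY_TABLE WHERE $CONDITIONS' \
  --target-dir /user/myuser/my_table \
  -m 1
Wrapping the query in single quotes keeps the shell from expanding $CONDITIONS and preserves the double quotes around "COMMENT", so Oracle sees it as a quoted identifier.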

Related

How can I quit sqlite from a command batch file?

I am trying to create a self-contained command for my build pipeline which inserts data and quits.
So far I have created my data files
things-to-import-001.sql and 002 etc., which contain all the INSERT statements I'd like to run, with one file per table.
I have created a command file to run them
-- import-all.sql
.read ./things-to-import-001.sql
.read ./things-to-import-002.sql
.quit
However when I run my command
sqlite3 -init ./import-all.sql ./database.sqlite
...the data is inserted, but the program stays running and shows the sqlite> prompt, despite the .quit command. I have also tried using .exit 0.
From the sqlite3 --help
-init FILENAME read/process named file
Docs: https://www.sqlite.org/cli.html#reading_sql_from_a_file
How can I tell sqlite to exit once my inserts have finished?
I have managed to find a dirty workaround for this issue.
I have updated my import file to include a bad command, and executed using -bail to quit on first error.
-- import-all.sql
.read ./things-to-import-001.sql
.read ./things-to-import-002.sql
.fakeErrorToQuitWithBail
Then you can execute with
sqlite3 -init import-all.sql -bail
and it should quit with
Error: unknown command or invalid arguments: "fakeErrorToQuitWithBail". Enter ".help" for help
Try using ".exit" in place of ".quit". For some reason these dot-commands are easy to miss in the documentation.
https://www.tutorialspoint.com/sqlite/sqlite_commands.htm
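Alternatively - just a sketch built on the files already shown above - you can pass the command after the database name instead of using -init; with -init the shell processes the file and then still drops into the interactive prompt, whereas a command given as an argument is run non-interactively and the shell exits on its own:
sqlite3 ./database.sqlite ".read ./import-all.sql"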

Push/Export large dataframe from R to Vertica database

I have a dataframe of 10M rows which needs to be uploaded back from R to a Vertica database.
The dbWriteTable() function from DBI is running into memory issues, and I have tried increasing the JVM memory to 16 GB with
options(java.parameters = c("-XX:+UseConcMarkSweepGC", "-Xmx16g"))
The process still runs into memory issues. I am planning to use Vertica's bulk COPY option to load the CSV file into the table.
I have created an empty table on Vertica.
When I execute the query
dbSendQuery(vertica, "COPY hpcom_usr.VM_test FROM LOCAL \'/opt/mount1/musoumit/MarketBasketAnalysis/Code/test.csv\' enclosed by \'\"\' DELIMITER \',\' direct REJECTED DATA \'./code/temp/rejected.txt\' EXCEPTIONS \'./code/temp/exceptions.txt\'")
I am running into this error.
Error in .verify.JDBC.result(r, "Unable to retrieve JDBC result set", :
Unable to retrieve JDBC result set
JDBC ERROR: [Vertica]JDBC A ResultSet was expected but not generated from query "COPY hpcom_usr.VM_test FROM LOCAL '/opt/mount1/musoumit/MarketBasketAnalysis/Code/test.csv' enclosed by '"' DELIMITER ',' direct REJECTED DATA './code/temp/rejected.txt' EXCEPTIONS './code/temp/exceptions.txt'". Query not executed.
Please help me with what I'm doing wrong here.
Vertica also provides a STDIN option (Link).
How can I execute this?
My environment:
CentOS 7
R 3.6.3 (no RStudio here - I have to execute this from the CLI)
Tidyverse 1.0.x
Vertica driver 9.x
128 GB memory, 28-core system
Your problem is that you fire dbSendQuery(), which goes together with a following dbFetch() and a final dbClearResult() - but only for SQL query statements, those that actually return a result set.
Vertica's COPY <table> FROM [LOCAL] 'file.ext' ... command is treated as a DML command. And for those - as this documentation says ...
https://www.rdocumentation.org/packages/DBI/versions/0.5-1/topics/dbSendQuery
... you need to use dbSendStatement() for data manipulation statements.
Have a go at it that way - good luck ...
dbSendUpdate(vertica, "COPY hpcom_usr.VM_test FROM LOCAL \'/opt/mount1/musoumit/MarketBasketAnalysis/Code/test.csv\' enclosed by \'\"\' DELIMITER \',\' direct REJECTED DATA \'./code/temp/rejected.txt\' EXCEPTIONS \'./code/temp/exceptions.txt\'")
instead of dbSendQuery did the trick for me.
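For reference, a minimal sketch of the dbSendStatement() route from the DBI documentation cited above (the connection object and paths are taken from the question; whether it works as-is depends on the driver, which is why dbSendUpdate() from RJDBC can be the practical answer):
library(DBI)
# vertica <- dbConnect(...)  # existing Vertica JDBC connection, as in the question
res <- dbSendStatement(vertica, "COPY hpcom_usr.VM_test FROM LOCAL '/opt/mount1/musoumit/MarketBasketAnalysis/Code/test.csv' ENCLOSED BY '\"' DELIMITER ',' DIRECT REJECTED DATA './code/temp/rejected.txt' EXCEPTIONS './code/temp/exceptions.txt'")
dbGetRowsAffected(res)  # number of rows loaded
dbClearResult(res)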

SQLite import CSV error: CREATE TABLE data;(...) failed: near ";": syntax error

Brand new to SQLite, running on a Mac. I'm trying to import a CSV file from the SQLite tutorial:
http://www.sqlitetutorial.net/sqlite-import-csv/
The 'cities' data I'm trying to import for the tutorial is here:
http://www.sqlitetutorial.net/wp-content/uploads/2016/05/city.csv
I try to run the following code from Terminal to import the data into a database named 'data' and get the following error:
sqlite3
.mode csv
.import cities.csv data;
CREATE TABLE data;(...) failed: near ";": syntax error
A possible explanation may be the way I downloaded the data - I copied the data from the webpage into TextWrangler and saved it as a .txt file, then manually changed the extension to .csv. This doesn't seem very elegant, but that was the advice I found online for creating the .csv file: https://discussions.apple.com/thread/7857007
If this is the issue then how can I resolve it? If not then where am I going wrong?
Another potentially useful point - when I executed the code yesterday there was no problem, it created a database with the data. However, running the same code today produces the error.
sqlite3 dot-commands such as .import are not SQL and don't take a semicolon at the end. Replace
.import cities.csv data;
with
.import cities.csv data
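Put together, a minimal session along those lines (a sketch; the database file name data.db and table name cities are only examples, and with .mode csv the .import command will create the table from the CSV's first row if it doesn't already exist):
sqlite3 data.db
.mode csv
.import cities.csv cities
.quit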

Why am I getting: database is locked, in an SQLite3 script?

I'm getting an error when running an SQLite script.
--drop use table before replacing it
DROP TABLE IF EXISTS db.use;
--Create the use table in the saved database
CREATE TABLE db.use AS SELECT * FROM use2; -- this is the line that generates the error: Error: near line 145: database is locked
Are these two statements run asynchronously or something? I don't understand what's causing the error, but I'm wondering if it has to do with that.
Might there be a way to run the script in a lock-step manner, i.e. non-asynchronously?
This is how I was running the command: sqlite3 --init script_name.sql dbname.db, while elsewhere in the script I had an ATTACH statement opening the same database dbname.db - essentially opening the same file twice.
The way I solved this was by executing the script in the sqlite3 shell:
sqlite3> .read script_name.sql
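In other words, a sketch of the two invocations (file names as in the question):
# before: dbname.db is opened twice - once on the command line and again by the ATTACH inside the script
sqlite3 --init script_name.sql dbname.db
# after: only the script's ATTACH opens dbname.db
sqlite3
sqlite> .read script_name.sql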
Have you tried adding a COMMIT statement after the DROP statement?
I think that would make sure the CREATE TABLE statement runs only after the DROP statement has fully completed.

Importing database to 000webhost and receiving the following error: #1064

Error: - You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '#', 1) LIMIT 1' at line 2
Thanks in advance.
I had the same problem. When you are importing the whole DB it's quite difficult to find the error.
Check your 'Option' column - the problem could be in the data dumped there.
I solved it my own way: I didn't import the whole DB, I uploaded it in pieces. What you have to do is:
- Import each column manually.
- Then dump it manually.
There you go.
