We have CSV files being loaded automatically onto a Unix machine.
Requirement: we need to load the CSV file from that remote server into my Oracle DB. We have ODI as our ETL tool. Can someone advise on how to proceed? What is the way to load the CSV from the Unix server into the Oracle DB? Please point us to some documentation if this is possible.
Thanks,
Gowtham Raja S
Oracle provides some tutorials (Oracle By Example); one of them explains how to load a flat file into an Oracle table with ODI 12c: https://apexapps.oracle.com/pls/apex/f?p=44785:112:::::P112_CONTENT_ID:7947
You will just need to change the field delimiter to a comma instead of a tab.
The other tutorials can be found on the product page: http://www.oracle.com/technetwork/middleware/data-integrator/learnmore/index.html
If you know how to create a data store for a file, ODI has two LKMs for this: LKM File to Oracle (SQLLDR) and LKM File to Oracle (External Table). Both can load data quickly (a rough sketch of the external-table route is shown below). If that feels a bit difficult, and since you already have SQL*Loader to load data manually from the file to the DB, take whatever command you use to start sqlldr and place it in an ODI procedure with the technology set to OS Command, which then loads the data automatically.
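For reference, here is a minimal sketch of what the external-table approach boils down to, assuming the CSV file is visible to the database server; the directory path, table, and column names below are placeholders, not anything from your environment:

-- Hypothetical sketch: expose the CSV as an external table, then copy it in.
CREATE OR REPLACE DIRECTORY csv_dir AS '/data/incoming';

CREATE TABLE stg_customers_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  load_date     DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                                  -- skip the header row
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    (customer_id, customer_name,
     load_date CHAR(20) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('customers.csv')
)
REJECT LIMIT UNLIMITED;

-- A plain INSERT ... SELECT then moves the rows into the real table.
INSERT INTO customers_target
SELECT customer_id, customer_name, load_date FROM stg_customers_ext;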
Let me know if any other suggestions are required.
I have a remote MonetDB server running and I want to bulk-upload a CSV file, since that is much faster.
Based on the params in MonetDB.R, there is a csvdump=TRUE option but I don't think it works when you are trying to do this against a remote server. The server has to be local.
https://rdrr.io/github/MonetDB/monetdb-r/man/dbWriteTable.html
First, am I correct that I can't do this, and if so, is there a workaround? I have a data frame with 5M+ rows, so it takes a long time with INSERT statements rather than using COPY INTO.
When I try using csvdump=TRUE against the remote server, it can't find the CSV file, because the file is local to the computer that called the dbWriteTable command.
I think you are right. As a workaround, either use explicit COPY INTO ON CLIENT SQL statements (a rough example follows the documentation quote below), or first use some file-transfer tool to copy the file to the remote server before calling dbWriteTable.
MonetDB's documentation on COPY INTO reads:
FROM files ON SERVER
With ON SERVER, which is the default, the file name must be an
absolute path on the system on which the database server (mserver5) is
running. ...
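For illustration, an explicit statement along those lines might look roughly like the one below; the table name, column layout, file path, and delimiters are invented, and it assumes the client library supports the ON CLIENT file-transfer protocol:

-- Hypothetical sketch of an explicit bulk load from a file on the client machine.
COPY INTO my_table
FROM '/home/me/data/my_table.csv' ON CLIENT
USING DELIMITERS ',', E'\n', '"'
NULL AS '';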
Interestingly enough, pymonetdb, the Python driver for MonetDB, uses ON CLIENT for bulk loads. From pymonetdb's docs:
File Uploads and Downloads
Classes related to file transfer requests as used by COPY INTO ON
CLIENT.
You might want to file an issue for the MonetDB R-driver project to have similar behavior as pymonetdb.
I have looked at others who have been trying to get data out of an OpenEdge Progress database.
I have the same problem, but there is a backup routine on the Windows file server that dumps the data every night. I have the *.pbk file and a 1 KB *.st file. How can I get the data out of the dump file in a form I can use?
Or is it not possible?
Thanks.
A *.pbk file is probably a backup (ProBacKup). You can restore it on another system with compatible characteristics (same byte order, same release of Progress OpenEdge). Sometimes that is helpful if the other system has better connectivity or licensing.
To extract the data from a database, either the original or a restored backup, you have some possibilities:
1) A pre-written extract program. Possibly provided by whoever created the application. Such a program might create simple text files.
2) A development license that permits you to write your own extract program. The output of the "showcfg" command will reveal whether or not you have a development license.
3) Regardless of license type you can use "proutil dbName -C dump tableName" to export the data but this will result in binary output that you probably will not be able to read or convert. (It is usually used in conjunction with "proutil load").
4) Depending again on the license that you have you might be able to dump data with the data administration tool. If you have a runtime only license you may need to specify the -rx startup parameter.
5) If your database has been configured to allow SQL access via ODBC or JDBC you could connect with a SQL tool and extract data that way.
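For option 5, once you are connected through the ODBC or JDBC driver, ordinary SQL works; application tables are normally exposed under the PUB schema, so an extract is just a query along these lines (the table name here is a made-up example):

-- Hypothetical example: pull one table's data out via the SQL engine.
SELECT * FROM PUB.customer;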
I am writing a PL/SQL procedure that takes an Excel file as input through the front end, and using that Excel input the procedure inserts, updates, or deletes the records present in an existing table. Can anyone show me the approach for this?
If that "Excel" file has to be really in native XLS(X) format, a simple option - if you want to stay within Oracle boundaries - is an Apex application which offers a data loading wizard. Takes 4 pages to create it (don't worry, Apex Wizard creates almost everything for you). Once the loading is over, a (stored) procedure can do the rest of processing (you'd call it by pushing a button).
Alternatively, if you save the contents of that file as a CSV file, you can load it with SQL*Loader, a utility run at the operating system command prompt. You'd have to create a control file (no wizard for that, I'm afraid). This approach probably isn't convenient for end users (who's going to type anything at a command prompt?), so you'd have to build some kind of application around it.
Or, CSV again, but this time used as an external table. This approach requires the file to be located in a directory accessible to the database server (most frequently the directory is on that machine, and you usually don't want to allow anyone else access to it). Its advantage is that you can access the CSV file directly from (PL/)SQL, fetch data from it, perform various adjustments, and so on; a rough sketch follows.
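To illustrate the insert/update part of that route, a MERGE over such an external table might look roughly like this (the external table csv_input_ext, the target table, and the columns are all invented names):

-- Hypothetical sketch: upsert target rows from the external table.
MERGE INTO target_table t
USING (SELECT id, name, amount FROM csv_input_ext) s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.name = s.name, t.amount = s.amount
WHEN NOT MATCHED THEN
  INSERT (id, name, amount) VALUES (s.id, s.name, s.amount);

-- Rows present in the table but missing from the file could then be deleted.
DELETE FROM target_table t
WHERE NOT EXISTS (SELECT 1 FROM csv_input_ext s WHERE s.id = t.id);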
If you're capable of writing programs that aren't part of the Oracle niche (I'm not), go for it (but I can't suggest anything; someone else might).
I have a 6 GB CSV and I am trying to load it into Teradata.
So I fired up Teradata SQL Assistant, created an empty table, turned on Import Data mode, and tried to insert the records into the empty table using
insert into some_lib.some_table values
(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?);
But I always get a failure message around the 600k-row mark that says:
Error reading import file at record 614770: Exception of type
'System.OutOfMemoryException' was thrown.
I think it's because Teradata SQL Assistant is trying to load everything into memory on my 4 GB laptop before sending the data to the Teradata server. Is my theory correct? How do I tell Teradata to upload the data in chunks and not try to store everything in local memory?
I believe you are pushing the capabilities of SQL Assistant as a means to load data.
Have you considered installing the Teradata load utilities, such as FastLoad or MultiLoad, on your system?
Another option, if you don't want to write scripts for the load utilities, would be to install Teradata Studio Express, which provides a mechanism to use JDBC FastLoad to load your data - the Smart Loader feature of Studio Express. You may find this to be more extensible than SQL Assistant using .NET or ODBC.
I have an Excel file which gets updated every 10 seconds through an automated process. I need the Excel data to be updated in a MySQL database located on a remote server.
How do I do that?
I have thought of the following approach:
1) Every 11 seconds, an Excel macro will run and "Save As" the workbook to a CSV file (not sure whether this can be done by a macro... just thinking).
2) We will FTP this CSV file to the remote server using a Windows service.
3) On the remote server, we will parse the CSV file and update the MySQL database.
Is this approach fine? Or do you have a better approach which requires less time to update the database?
Thanks!
I found the following links to be the most useful:
http://www.heritage-tech.net/908/inserting-data-into-mysql-from-excel-using-vba/
http://vbaexcel.eu/vba-macro-code/update-mysql-database-php
I hope this helps someone with a problem similar to mine.
You can connect to the Excel spreadsheet using an ODBC connection, read the data, and post it to the MySQL database, perhaps through some sort of web service, or via a saved CSV file?
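If you go the saved-CSV route, MySQL can handle the parse-and-update step from item 3 in a single statement. A minimal sketch, assuming the target table has a primary or unique key, and using made-up table, column, and path names:

-- Hypothetical example: bulk-load the transferred CSV into MySQL.
-- REPLACE overwrites rows whose key already exists in the table.
LOAD DATA LOCAL INFILE '/var/data/feed.csv'
REPLACE INTO TABLE my_db.readings
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(sensor_id, reading_time, reading_value);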