Filehelpers Excel to Oracle db - asp.net

I want to import Excel data into an Oracle DB. I got enough help for the Excel part; can you guys help me with the Oracle side?
Is it possible to import into Oracle DB with FileHelpers? Please provide some sample code for reference.
Thanks
Dee

If you save the spreadsheet data as .csv files it is relatively straightforward to import it into Oracle using SQL*Loader or external tables. Because we can use SQL with external tables they are generally easier to work with, and so are preferable to SQL*Loader in almost all cases. SQL*Loader is the better choice when dealing with huge amounts of data and when ultra-fast loading is paramount.
Find out more about external tables in the Oracle documentation; you'll find the equivalent reference for SQL*Loader there too.
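For reference, here is a minimal external table sketch, assuming the CSV has already been copied to a directory on the database server; the directory, table and column names are made up for illustration:

-- One-time setup: point an Oracle directory object at the folder holding the CSV
CREATE DIRECTORY data_dir AS '/u01/app/loads';

-- External table that reads the CSV in place
CREATE TABLE foobar_ext (
  col1 VARCHAR2(100),
  col2 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;

-- Because it behaves like a regular (read-only) table, loading is plain SQL
INSERT INTO foobar SELECT col1, col2 FROM foobar_ext;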

A simple and relatively fool-proof way to do that for one-off tasks is to create a new column in the Excel sheet, containing a formula like this:
="insert into foobar values('"&A1&"','"&A2&"');"
Copy that formula down to all rows, then copy the whole column into an editor and run it in SQL*Plus or SQL Developer.
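Purely for illustration, with made-up values in the referenced cells the generated column ends up containing ready-to-run statements like:

insert into foobar values('Alice','42');
insert into foobar values('Bob','17');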

I am using SQL*Loader to upload data from CSV to Oracle DB.
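For reference, a minimal SQL*Loader control file sketch for that kind of load (file, table and column names are made up for illustration):

LOAD DATA
INFILE 'data.csv'
APPEND
INTO TABLE foobar
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(col1, col2)

Run it from the command line with something like: sqlldr userid=scott/tiger control=load_foobar.ctl log=load_foobar.log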

Related

Importing very large sqlite table to BigQuery

I have a relatively large SQLite table (5 million rows, 2GB) which I'm trying to move to Google BigQuery. The easy solution, which I've used for other tables in the db, was to use something like SQLite Manager (the Firefox extension) to export to CSV, but this fails with what I'd imagine is an out of memory error when trying to export the table in question. I'm trying to think of the best way to approach this, and have come up with the following:
1) Write something that will manually write a single, gigantic CSV. This seems like a bad idea for many reasons, but the big ones are that one of the fields is text data which will inevitably screw things up with any of the delimiters supported by BQ's import tools, and I'm not sure that BQ could even support a single CSV that big.
2) Write a script to manually export everything to a series of CSVs, like ~100k rows each or something--the main problem being that this will then require importing 50 files.
3) Write everything to a series of JSONs and try to figure out a way to deal with it from there, same as above.
4) Try to import it to MySQL and then do a mysqldump, which apparently can be read by BQ.
5) Use Avro, which seems like the same as #2 except it's going to be in binary so it'll be harder to debug when it inevitably fails.
I also have some of this data on a local ElasticSearch node, but I couldn't find any way of migrating that to BQ either. Does anyone have any suggestions? Most of what I've found online has been trying to get things out of BQ, not put things in.
(2) is not a problem. BQ can import up to 10k files per import job.
BQ can also import very large CSV/JSON/AVRO files, as long as the input can be sharded (for text-based formats that means uncompressed files, and for CSV no quoted newlines).
See https://cloud.google.com/bigquery/quota-policy#import for more.
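If you do go with option 2, here is a minimal sketch of the chunked export in Java, assuming the xerial sqlite-jdbc driver; the table and column names are made up, and you would still need BigQuery's allow-quoted-newlines option if the text column contains line breaks:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqliteToCsvChunks {
    public static void main(String[] args) throws Exception {
        int chunkSize = 100_000;  // roughly 100k rows per CSV file, as in option 2
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:mydata.sqlite");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, body FROM big_table")) {
            BufferedWriter out = null;
            int row = 0, fileNo = 0;
            while (rs.next()) {
                if (row % chunkSize == 0) {  // roll over to a new chunk file
                    if (out != null) out.close();
                    out = new BufferedWriter(new FileWriter(String.format("chunk_%04d.csv", fileNo++)));
                }
                String body = rs.getString("body");
                if (body == null) body = "";
                // Quote the text field and double embedded quotes so free text survives CSV parsing
                out.write(rs.getLong("id") + ",\"" + body.replace("\"", "\"\"") + "\"");
                out.newLine();
                row++;
            }
            if (out != null) out.close();
        }
    }
}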

Core Data SQLite File

I have an app that saves your data and retrieves it under an ID number of your choice! The only thing is, I have people asking for an Excel document of all their saved data! Does anyone know how I would go about this? I am using Swift and Xcode 6.1, with a SQLite file that Core Data has made for me.
Thanks,
AppSwiftGB
Creating native Excel is likely a PITA. You should export to CSV, though even that format is another PITA, since one program likes commas, another semicolons, and a third tabs :-(

Parse Excel to BizTalk and save it in database

How do I use BizTalk to disassemble an Excel file and then save the data in a database?
Can anyone provide detailed steps on how to achieve this, or a link to an existing guide?
Wow - this is pretty open ended!
The steps you would generally take are:
1) Generate a Flat File schema that represents your Excel file structure. As it's Excel, I'm assuming that your file is actually a CSV?
2) Create a custom pipeline that implements a flat file disassembler to convert the CSV to XML.
3) Using the WCF-LOB adapter, generate schemas for the table you want to insert into. You might want to front this with a stored proc. I'm assuming a SQL Server or Oracle database, as you don't say what DB you are using!
4) Map your input XML file to your table/SP schema.
5) Send your insert request to your DB (I'd advise using composite operations or a user-defined table parameter to avoid looping through your XML and sending it line by line!).
This is pretty high-level but frankly you are asking quite a few questions in one go!
HTH
In case it isn't a CSV flat file as teepeeboy assumed, and it is actually an XLS file you want to parse, you will need something like FarPoint Spread for BizTalk. We've successfully used this to parse incoming XLS files attached to e-mails. The other option would be to write your own pipeline component to do it, but that would be a lot of work. Other than that, the steps teepeeboy outlines are what I would do as well.

Export data to Excel Part By Part

I have a huge dataset in a JRDataSource object and am not able to export it to Excel, as it gives an out-of-memory error. So I am planning to split the JRDataSource object and export the data part by part. Any idea or suggestion on how to implement this? Any other approach would also be fine for me. Thanks in advance.
I don't know much about JRDataSource, but I'll offer another solution.
Take a look at the Apache POI library, which enables you to create Excel files on the fly.
You can read from the data source element by element and write the rows out to an Excel file as you go.
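For example, here is a minimal sketch using POI's streaming SXSSFWorkbook, which keeps only a small window of rows in memory and flushes the rest to temporary files; the dummy loop stands in for iterating over your JRDataSource:

import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class StreamingExcelExport {
    public static void main(String[] args) throws Exception {
        SXSSFWorkbook wb = new SXSSFWorkbook(100);   // keep ~100 rows in memory, flush the rest to disk
        Sheet sheet = wb.createSheet("data");
        for (int i = 0; i < 1_000_000; i++) {        // replace with iteration over your data source
            Row row = sheet.createRow(i);
            row.createCell(0).setCellValue(i);
            row.createCell(1).setCellValue("value-" + i);
        }
        try (FileOutputStream out = new FileOutputStream("export.xlsx")) {
            wb.write(out);
        }
        wb.dispose();                                // delete the temporary backing files
    }
}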

Sample code to create Pivot table in open XML SDK

I am trying to create a pivot table in Excel using the Open XML SDK from my .NET web application. I have an Excel file for which I have to generate a pivot table.
Please provide any sample code.
I had a similar task.
Take a look at Power Tools for OpenXML.
It really helped me, although now I need to figure out how to initialize the pivot table with some structure.
PS: you might need to expose this from a web service.
