How can I import data from Excel? - openedge

I am new to Progress and I would like to know if there is a way to import data from Excel. So far I have only done it for CSV. Please share your experience here; I would appreciate it.

By far the easiest way for you is to save the Excel sheet as CSV and then simply import that. If that's not possible, you could look at automating Excel to trigger a Save As CSV and then import the result.
Here's a quite good example in the KnowledgeBase. Note that there might be differences between versions of Excel (and possibly of Progress).
There might also be .NET APIs that could help; look for those in Microsoft's documentation in that case.
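If you go the automation route, the essential call sequence is just: open the workbook, SaveAs with the CSV file format, close, quit. Purely as a sketch of that sequence (shown here in Python via the pywin32 COM bindings, since the same Excel COM calls apply from whichever language drives the automation; the paths and the assumption that Excel is installed are mine, not from the original answer):

```python
# Sketch only: drive Excel over COM to save a workbook as CSV.
# Assumes Excel and the pywin32 package are installed; paths are placeholders.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
excel.DisplayAlerts = False              # suppress the "overwrite existing file?" prompt

wb = excel.Workbooks.Open(r"C:\data\input.xlsx")
wb.SaveAs(r"C:\data\input.csv", 6)       # second argument is FileFormat; 6 = xlCSV
wb.Close(False)                          # close without saving changes to the workbook
excel.Quit()
```

From ABL the equivalent calls are typically issued through a COM-HANDLE created with CREATE "Excel.Application", and the resulting CSV can then be read with the usual IMPORT approach you already use.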

Related

Can I import data to Cloud Firestore without having exported first?

I want to populate my database with a lot of data, and I have found that there is an import command, but it expects you to have exported first. I don't know whether I can just prepare several JSON files of my collections and use the import command to make the population faster, or whether the most efficient way to do it is simply batched writes.
I have reviewed the documentation and have not found anything similar, but I wanted to be completely sure that I am not ruling out a very good option. Thank you for your time.
In a nutshell, I wonder whether I can adapt my data to a specific format so I can use the import command and load my data quickly without having to use batched writes. From the documentation it seems I can use the import command only if I previously exported, but that is impossible for me because I don't have any data in my database yet.
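For concreteness, the batched-writes alternative mentioned in the question might look roughly like the sketch below; the google-cloud-firestore Python client, the collection name, and the chunk size are assumptions for illustration, not something from the question.

```python
# Sketch of the "batched writes" approach: commit documents in chunks.
# Assumes the google-cloud-firestore package and default credentials.
from google.cloud import firestore

db = firestore.Client()

def load_documents(collection_name, docs, chunk=500):
    """Write an iterable of dicts using batched writes, committing every `chunk` documents."""
    batch = db.batch()
    pending = 0
    for doc in docs:
        ref = db.collection(collection_name).document()  # auto-generated document ID
        batch.set(ref, doc)
        pending += 1
        if pending == chunk:          # keep batches small; 500 writes is the classic batch limit
            batch.commit()
            batch = db.batch()
            pending = 0
    if pending:
        batch.commit()

# Example usage (hypothetical data):
# load_documents("cities", [{"name": "Tokyo"}, {"name": "Lima"}])
```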

Importing a very large SQLite table to BigQuery

I have a relatively large SQLite table (5 million rows, 2GB) which I'm trying to move to Google BigQuery. The easy solution, which I've used for other tables in the db, was to use something like SQLite Manager (the Firefox extension) to export to CSV, but this fails with what I'd imagine is an out of memory error when trying to export the table in question. I'm trying to think of the best way to approach this, and have come up with the following:
1. Write something that will manually write a single, gigantic CSV. This seems like a bad idea for many reasons, but the big ones are that one of the fields is text data which will inevitably screw things up with any of the delimiters supported by BQ's import tools, and I'm not sure that BQ could even support a single CSV that big.
2. Write a script to manually export everything to a series of CSVs, around 100k rows each; the main problem being that this will then require importing 50 files.
3. Write everything to a series of JSONs and try to figure out a way to deal with it from there, same as above.
4. Try to import it into MySQL and then do a mysqldump, which apparently can be read by BQ.
5. Use Avro, which seems like the same as #2 except it's going to be binary, so it'll be harder to debug when it inevitably fails.
I also have some of this data on a local ElasticSearch node, but I couldn't find any way of migrating that to BQ either. Does anyone have any suggestions? Most of what I've found online has been trying to get things out of BQ, not put things in.
(2) is not a problem. BQ can import up to 10k files per import job.
BQ can also import very large CSV/JSON/Avro files, as long as the input can be sharded (uncompressed text-based formats, and for CSV, files without quoted newlines).
See https://cloud.google.com/bigquery/quota-policy#import for more.
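A minimal sketch of option (2) in Python, using only the standard library; the database path, table name, and shard size are placeholders:

```python
# Export a large SQLite table as a series of CSV shards for BigQuery loading.
import csv
import sqlite3

CHUNK = 100_000                             # rows per CSV shard (assumed size)

conn = sqlite3.connect("data.db")           # placeholder database file
cur = conn.cursor()
cur.execute("SELECT * FROM big_table")      # placeholder table name
headers = [d[0] for d in cur.description]   # column names from the cursor metadata

shard = 0
while True:
    rows = cur.fetchmany(CHUNK)             # stream rows instead of loading everything
    if not rows:
        break
    with open(f"export_{shard:04d}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, quoting=csv.QUOTE_ALL)  # quote every field so embedded commas in the text column don't break the CSV
        writer.writerow(headers)
        writer.writerows(rows)
    shard += 1

conn.close()
```

If the text column can contain newlines, the shards will contain quoted newlines, so load them with BigQuery's allow-quoted-newlines option; loading many smaller files (as above) still works fine in that case.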

Export data to Excel Part By Part

I have a huge dataset in a JRDataSource object and am not able to export it to Excel, as it gives me an out-of-memory error. So I am planning to split the JRDataSource object and export the data part by part. Any idea or suggestion on how to implement this? Any other approach would also be fine for me. Thanks in advance.
I don't know much about JRDataSource, but I'll offer another solution.
Take a look at the Apache POI library, which enables you to create Excel files on the fly.
You can read from the data source element by element and persist the rows to an Excel file.
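Apache POI is a Java library; purely to illustrate the same element-by-element streaming idea in a compact way, here is a sketch in Python using openpyxl's write-only mode. The row generator, column names, and file name are assumptions for the example, not part of the original answer.

```python
# Illustration of writing rows one at a time so the whole dataset is never held in memory.
# Assumes openpyxl is installed; fetch_rows() stands in for whatever yields rows from the real data source.
from openpyxl import Workbook

def fetch_rows():
    # Placeholder: yield one row (as a list/tuple) at a time from the data source.
    yield from ([i, f"item-{i}", i * 1.5] for i in range(100_000))

wb = Workbook(write_only=True)        # write-only mode streams rows out instead of building the sheet in memory
ws = wb.create_sheet(title="export")
ws.append(["id", "name", "amount"])   # made-up header row
for row in fetch_rows():
    ws.append(row)
wb.save("report.xlsx")
```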

Neo4j Configuration with Gephi

I want to use Neo4j to store a number of graphs I created in python. I was using Gephi for visualization, and I thought the export to Neo4j plugin would be a very simple way to get the data across. The problem is that the server is seemingly not recognizing the neostore...db files that Gephi generated.
I'm guessing I configured things incorrectly, but is there a way to fix that?
Alternatively, I'm also open to importing the files directly. I have two files: one with node titles and attributes and another with an edge list of title to title.
I'm guessing that I would need to convert the titles to ids, right? What would be the fastest way to do that?
Thank you in advance!
If you have the files as tab-separated CSV files, feel free to import them directly. There are some options; check out this page: http://www.neo4j.org/develop/import
The CSV batch importer in particular can help you: http://maxdemarzi.com/2012/02/28/batch-importer-part-1/
Or if it is just a little bit of data, use the spreadsheet approach: http://blog.neo4j.org/2013/03/importing-data-into-neo4j-spreadsheet.html
Please report back if you were successful.
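To the title-to-id question: since the data is already in Python, a plain dictionary lookup is usually the fastest practical way. Below is a rough sketch that assigns sequential integer ids and rewrites the two files as tab-separated output; the file names and column layout are assumptions, so adjust them to whatever the importer you pick expects.

```python
# Convert title-keyed node/edge files into id-keyed tab-separated files.
import csv

NODES_IN, EDGES_IN = "nodes_by_title.csv", "edges_by_title.csv"   # placeholder input files

title_to_id = {}

with open(NODES_IN, newline="", encoding="utf-8") as fin, \
     open("nodes.tsv", "w", newline="", encoding="utf-8") as fout:
    reader = csv.reader(fin)
    writer = csv.writer(fout, delimiter="\t")
    header = next(reader)
    writer.writerow(["id"] + header)
    for node_id, row in enumerate(reader, start=1):
        title_to_id[row[0]] = node_id          # assumes the first column is the title
        writer.writerow([node_id] + row)

with open(EDGES_IN, newline="", encoding="utf-8") as fin, \
     open("rels.tsv", "w", newline="", encoding="utf-8") as fout:
    reader = csv.reader(fin)
    writer = csv.writer(fout, delimiter="\t")
    writer.writerow(["start", "end"])
    for src_title, dst_title in reader:        # assumes two columns: source title, target title
        writer.writerow([title_to_id[src_title], title_to_id[dst_title]])
```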
I used Gephi to generate a Neo4j store file directory in the past - it worked like a charm...
I assume you deleted the default graph.db directory and renamed your Gephi-generated directory to graph.db? That worked for me...

ASP.NET - Export formatted data in excel file

I want to export a DataTable to an Excel file. The data should be formatted and in a readable format.
Can anyone suggest ways to do this?
Also, please suggest a ready-made, free library which we can use in ASP.NET for formatted exporting.
Thanks
Try http://epplus.codeplex.com/
These days I work on exporting to Excel, and this is the best library I have tried so far.
It's easy to use.
