Performing DML on a table by taking an Excel file as input - plsql

I am writing a PL/SQL procedure that takes an Excel file as input through the front end and, using that input, inserts, updates, or deletes records in an existing table. Can anyone show me an approach for this?

If that "Excel" file really has to be in native XLS(X) format, a simple option - if you want to stay within Oracle boundaries - is an Apex application, which offers a data loading wizard. It takes 4 pages to create (don't worry, the Apex wizard creates almost everything for you). Once the loading is over, a (stored) procedure can do the rest of the processing (you'd call it by pushing a button).
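For the processing step, here is a minimal sketch of such a procedure (STG_EMPLOYEES, EMPLOYEES and the OP flag column are made-up names for illustration; the idea is that Apex loads the spreadsheet into the staging table and the procedure then applies it to the target table):

    create or replace procedure process_staged_rows is
    begin
      -- insert new rows and update existing ones in a single pass
      merge into employees e
      using (select emp_id, emp_name, salary
               from stg_employees
              where op in ('I', 'U')) s
      on (e.emp_id = s.emp_id)
      when matched then
        update set e.emp_name = s.emp_name,
                   e.salary   = s.salary
      when not matched then
        insert (emp_id, emp_name, salary)
        values (s.emp_id, s.emp_name, s.salary);

      -- remove target rows flagged for deletion in the staging table
      delete from employees
       where emp_id in (select emp_id from stg_employees where op = 'D');

      commit;
    end process_staged_rows;
    /

The button on the Apex page would then simply call process_staged_rows.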
Alternatively, if you save the contents of that file as a CSV file, you can load it with SQL*Loader, a utility run at the operating system command prompt. You'd have to create a control file (no wizard for that, I'm afraid). This approach probably isn't convenient for end users (who's going to type anything at the command prompt?), so you'd have to create some kind of application around it.
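Just as an illustration, a minimal control file might look like this (stg_employees and its columns are assumptions; adjust them to your CSV's layout):

    -- load_employees.ctl (hypothetical file name)
    load data
    infile 'employees.csv'
    append
    into table stg_employees
    fields terminated by ',' optionally enclosed by '"'
    trailing nullcols
    ( emp_id,
      emp_name,
      salary )

You'd then run something like sqlldr scott/tiger control=load_employees.ctl log=load_employees.log at the command prompt.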
Or, CSV again, but this time used as an external table. This approach requires the file to be located in a directory accessible to the database server (most frequently, the directory is on the database server itself, and you usually don't want to grant everyone access to it). Its advantage is that you can access the CSV file directly from (PL/)SQL, fetch data from it, perform various adjustments, etc.
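A sketch, just to show the idea (data_dir, the file name and the column list are all assumptions, and skip 1 merely skips a header row):

    create or replace directory data_dir as '/u01/app/data';

    create table stg_employees_ext (
      emp_id   number,
      emp_name varchar2(100),
      salary   number
    )
    organization external (
      type oracle_loader
      default directory data_dir
      access parameters (
        records delimited by newline
        skip 1
        fields terminated by ',' optionally enclosed by '"'
        missing field values are null
      )
      location ('employees.csv')
    )
    reject limit unlimited;

Once it exists, select * from stg_employees_ext reads the CSV directly, so the same kind of MERGE/DELETE logic shown above can run straight against it.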
If you're capable of writing programs that aren't part of the Oracle niche (I'm not), go for it (but I can't suggest anything; someone else might).

Related

R shiny concurrent file access

I am using the R shiny package to build a web interface for my executable program. The web interface provides user input and shows output.
In the background on the server, the R script formats the user inputs and saves them to a local input file. Then R calls a system command to run the executable program.
My concern is that if multiple users run the web app at the same time, it is possible that the input file generated by the first user will be overwritten by the second user's input before it is read by the executable program.
One way to resolve the conflict is to have R create a temporary folder and generate/run the input file under that folder for each user. But I'd like to know whether there is a better or automatic way to resolve this potential conflict with Shiny. For example, if you use Shiny's fileInput, uploaded files are automatically stored in a temporary folder.
Update
Thanks for the advice, @Symbolix and @Mike Wise.
I read the persistent data storage article before, but I don't think it is exactly what I wanted. Maybe my understanding is not correct. I ended up creating a temporary folder and running my executable from there.

Loading Excel into application from client side Excel file

I'm building an application that takes as input data stored in an Excel sheet. I want the user to be able to select the file they want to load data from, have the application connect to and read from the Excel file stored on the client's machine, and then close the connection. Can I do this without uploading the Excel file to the server? I'm able to do everything except select the file using a file dialog box and pass the path & file name to the procedure that connects to the Excel file and processes it. I've tried using the file input control, but I'm unable to pass the path & file name to the connection string. Any suggestions as to what other routes I might take?
EDIT:
The application essentially takes user input, either via single inputs into textboxes on the page or via a bulk upload from an Excel spreadsheet, processes the inputs, and spits out a report in Excel format. The only thing displayed on the page is the loaded inputs (via the two methods just described) in a listbox, from which the user can add or remove items.
The short answer is "Only with an ActiveX control in IE unless you write your own plugin for another browser." My opinion is, "you shouldn't."
There is a good discussion already on Stack Overflow: How to read an excel file contents on client side?
The long answer, given the rest of the information you have provided, is that I would recommend the following:
1) Upload the spreadsheet to the server.
2) Extract the data on the server.
3) Return the data to the client in whatever form suits your situation.
4) Clean up the original file on the server.
Some further recommendations for you, since you provided so much detail in your comments:
Rather than using a GUID to generate your server-side filename, use a timestamp in the format YYYY-MM-DD-HH-MM-SS-Ticks followed by the original filename.
Instead of deleting the data files immediately, each time you add a file, remove any files older than N days.
This way if you have any issues processing your files in the future, you'll be able to retrieve the uploaded file to your development environment and debug it there. Assuming the data isn't personal information or sensitive in some other way, of course.
Cheers!

Should we store data in database?

I'm an ASP.NET beginner, currently working on an "upload/download file" project with ASP.NET and VB.NET as the code-behind language (like SkyDrive's web app).
What I want to ask is about uploading files to the server: must we store the file path, size, and accessed or created dates in a database? As we know, we can use directory listing in System.IO.
Thanks for your help.
You definitely want to store the path of the file. You want a way to find the file ;) Maybe later you will have multiple servers, replication, or other fancy things.
For the rest, it depends a bit on the type of website. If it's going to get high traffic, then store it in the database; this will limit the number of IO calls (which are very slow). It will also be a lot easier to handle sorting and queries (sort by date, pull only the read-only files, ...).
The database will also help if you want to show history or statistics.
You can save the file in some directory and save the path of that file in the database. You can also store the size and created date of that file in the DB. But storing the file itself in the DB is a bit difficult, so rather save the file in a directory and save its path in the DB.
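For example, a metadata table along these lines keeps the bytes on disk and only the descriptive data in the database (every name here is made up for illustration):

    -- hypothetical metadata table; the file itself stays on disk
    create table uploaded_files (
      file_id      integer      primary key,
      file_name    varchar(255) not null,
      file_path    varchar(500) not null, -- where the file lives on disk
      size_bytes   bigint,
      created_date timestamp
    );

    -- the kind of query that is painful against a raw directory listing
    select file_name, size_bytes
      from uploaded_files
     where created_date >= '2016-01-01'
     order by created_date desc;

That also lines up with the sorting and statistics points from the other answers.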
You could store the file information in a database to build some extra features, like avoiding duplicate files, because searching in the database is much faster; a search of the filesystem always kicks off a recursive function call.

How to open infopath templates in code and change data connections URL

I need to iterate through InfoPath templates (XSN files), change the URL of their data connections, and then save the changes back to the templates.
The data connections I want to change point to lists in a SharePoint environment.
So, how could I accomplish this task?
I was thinking of doing this with a console application.
InfoPath definitely doesn't make it easy to deploy to different servers. I have used a PowerShell script, but you can use any console app or scripting language.
Steps to follow:
1. Extract the files from the XSN (either use extrac32 util from MS or rename to zip and use any zip library)
2. Change the data connection (string replace) in manifest.xsf, template.xml, and sampledata.xml
3. Repackage the files into the XSN (either use cabarc util from MS or zip and rename)
It is a pain to have to do all that, but the entire script is less than a page long and runs pretty fast. One caveat I ran into was that I needed a delay between steps 1 and 2 - the files weren't actually finished extracting and my script was trying to change them.

need help in choosing the right tool

I have a client who has set up a testing environment in some AI language. It basically runs some predefined test cases and stores the results as log files (comma-separated txt files). My job is to identify and suggest a reporting system, and I have these options in mind: either
1. import the logs into MSSQL and use the reporting (SSRS) it offers, or
2. import the logs into MySQL and use PHP to develop custom reporting.
I am thinking that going with option 2 is better. The reason for this is that the logs are inconsistent and contain unexpected wild characters that DBs normally don't accept, so I can write some scripts in PHP to clean them before loading them into the database.
If this were your problem, what approach would you suggest?
It depends how fancy you need to be. If the data is in CSV files, you could even go as simple as loading it into Excel (or their favorite spreadsheet tool) and using spreadsheet macros to analyze it.
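If you do go the database route, one common pattern - sketched below in plain SQL, with all table and column names made up - is to land the raw lines in an all-text staging table first, so the load never rejects a row, and then clean and cast them into a typed reporting table:

    -- permissive staging table: everything is text, nothing gets rejected
    create table stg_test_log (
      raw_case_name varchar(400),
      raw_result    varchar(400),
      raw_run_time  varchar(400)
    );

    -- after bulk-loading the log lines into stg_test_log, strip stray
    -- characters and cast into the typed reporting table
    insert into test_results (case_name, result, run_seconds)
    select trim(raw_case_name),
           upper(trim(raw_result)),
           cast(raw_run_time as decimal(10,2))
      from stg_test_log
     where raw_run_time is not null;

That works equally well whether you end up reporting with SSRS on MSSQL or with PHP on MySQL.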
