Building Excel file gives out of memory exception - ASP.NET

I need to export a huge amount of data from an ADO.NET DataTable (which I get from a DB query) to Excel.
I tried the following way:
1. Create an Excel object with a workbook/worksheet on the server side, and use a MemoryStream to write the whole document to the client.
But this gave me an "out of memory" exception, because my memory stream was so huge.
So I replaced this with a new way, as follows:
Write each row from the DataTable as a comma-separated string to the client side.
That way, as soon as we get each row we can write it to the client, so almost no server memory is used.
But this way we can only produce a CSV file, not an Excel file.
Does anybody know how to handle this situation?
Can I use Silverlight to get the data row by row from the server, pass it to the client side, and build the Excel file on the client?

Try SpreadsheetGear
OR
SmartXLS

I'd keep the CSV approach but write to a file and not the memory stream. After you've created the file, I'd use TransmitFile to get it to the browser. You can see more about using TransmitFile here.
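A minimal sketch of that approach, assuming table is the DataTable from the question (the temp path and download file name are placeholders):

// Write each row to a temp file instead of a MemoryStream.
string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".csv");
using (var writer = new StreamWriter(tempPath))
{
    foreach (DataRow row in table.Rows)
    {
        // NOTE: naive join; fields containing commas or quotes need escaping.
        writer.WriteLine(string.Join(",", Array.ConvertAll(row.ItemArray, o => Convert.ToString(o))));
    }
}

Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment; filename=report.csv");
// TransmitFile streams the file to the client without buffering it in server memory.
Response.TransmitFile(tempPath);
Response.End();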

Related

database disk image is malformed

I got an error when trying to query a SQLite file in R, saying Error in rsqlite_fetch(res@ptr, n = n) : database disk image is malformed, likely indicating that the sqlite3 file is somehow malformed. Running PRAGMA integrity_check returns ok.
The file size is 76 GB. There is one table, main, containing 172 columns, with an index.
This was meant to create an easier-to-access method for a series of unstructured files, and so it is intended to be read-only. I have tried reconstituting it several times, but it always seems malformed.
Not sure what else to look at or to do. Any help would be appreciated!

Sending large amount of data for Excel report to client

MS VS 2008, ASP.NET 3.5.
On the client side:
The client selects start and end dates, chooses Excel as the report format, and clicks the "run report" button.
On that click the browser is redirected to reportToExcel.aspx; in reportToExcel.aspx.vb, in the Page_Load event, a stored procedure is executed to retrieve the report data:
oSQLDataReader = oSqlCommand.ExecuteReader()
Then:
Response.ContentType = "application/ms-excel"
Response.AddHeader("Content-Disposition", "attachment; filename=" + MyBase.UserSession.ReportName + ".xls")
Then Response.Write is used to write the retrieved report data into the Response object as an HTML table, like
Response.Write("<td>" & FormatColumnValue(oSQLDataReader.GetValue(I), arrColHeader(I + 1).ColumnFormat) & "</td>"), etc. The last call is Response.End().
I know Response.End should not be used; I plan to substitute it with
context.Response.Flush()
context.ApplicationInstance.CompleteRequest()
but I doubt it will improve response time.
Problem: on the client side it takes 6 minutes to receive 32.5 MB of data. This is too long.
How can I reduce this time?
As I understand it so far: chunking is not possible for an Excel report, and in any case the client wants to receive the Excel report as one whole file.
In order to use Response.TransmitFile, the Excel file has to be created first, then zipped to reduce the amount of data to download, then downloaded. For this to work, Excel would have to be installed on the server, which is not acceptable in our case.
Delivering the data as CSV to the client is not acceptable either: the client would have to import it into Excel, which they would not like to do.
The stored procedure executed from SQL Server Management Studio shows inconsistent run times: from 12 seconds to 4 minutes.
So, are there any other ways to reduce the report 'delivery' time to the client?
Thank you for all replies
You do need to get the stored procedure run time down. But that's a whole question and answer in itself.
Your method of writing out the HTML is slower than it needs to be. Essentially, you are doing repeated string concatenations, which are slow. Consider using a StringBuilder to construct the entire document before writing it to the Response stream.
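For illustration, a rough sketch (in C# rather than VB; oSQLDataReader is the reader from the question):

var sb = new StringBuilder();
sb.Append("<table>");
while (oSQLDataReader.Read())
{
    sb.Append("<tr>");
    for (int i = 0; i < oSQLDataReader.FieldCount; i++)
    {
        // Appending to the StringBuilder avoids per-cell string concatenation
        // and per-cell Response.Write calls.
        sb.Append("<td>").Append(oSQLDataReader.GetValue(i)).Append("</td>");
    }
    sb.Append("</tr>");
}
sb.Append("</table>");
Response.Write(sb.ToString());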
Another option (and perhaps a better one) would be to try something like the free Excel Xml Writer library: http://www.carlosag.net/tools/excelxmlwriter/. I haven't used it, but I've heard good things about it. This would (I believe) let you write your Excel file on the server without needing Excel itself installed.

How to read non-uniform CSV files using C#

I am working on Windows application development using C#. I want to read a CSV file from a directory and import it into a SQL Server database table. I can read and import the CSV file data into the database table successfully when the file content is uniform. But I am unable to insert the file data when a field's content is not uniform. My CSV file's delimiter is actually tab ('\t'), and after splitting out the individual fields I have a field that contains data with embedded spaces, like:
Name
----
xxx
xxx yyy
xx yy zz
and I retrieved that data as xxx,yyy and xx,yy,zz, so the insertion becomes a problem.
How can I insert the data uniformly into a database table?
It's pretty easy.
Just read the file line by line. There's an example on MSDN here:
How to: Read Text from a File
For each line, use the String.Split method with tab as your delimiter. Method documentation and a sample are here:
String.Split Method (Char[], StringSplitOptions)
Then do the inserts with your data.
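A minimal sketch (the file path is a placeholder; requires using System.IO):

using (var reader = new StreamReader(@"C:\data\input.tsv"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Split each line on the tab delimiter.
        string[] fields = line.Split('\t');
        // ... build and execute your INSERT command with these fields ...
    }
}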
If a CSV (or TSV) value contains a delimiter inside of it, then it should be surrounded by quotes. See the spec for more details: https://www.rfc-editor.org/rfc/rfc4180#page-3
So your input file is incorrectly formatted. If you can convince the input provider to fix this issue, that will be the best way to fix the problem. If not, other solutions may include:
visually inspecting and editing the file to fix errors, or
writing your parser program to have enough knowledge of your data expectations that it can correctly "guess" where the real delimiters are.
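If the provider does quote fields that contain delimiters, a parser that understands quoting will handle them for you. A hedged sketch using the BCL's TextFieldParser (requires a reference to Microsoft.VisualBasic; the file path is a placeholder):

using Microsoft.VisualBasic.FileIO;

using (var parser = new TextFieldParser(@"C:\data\input.tsv"))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters("\t");
    parser.HasFieldsEnclosedInQuotes = true; // honors RFC 4180-style quoting
    while (!parser.EndOfData)
    {
        // A quoted field keeps embedded tabs/spaces as a single value.
        string[] fields = parser.ReadFields();
        // ... insert fields into the database ...
    }
}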
If I'm understanding you correctly, the problem is that your code is splitting on spaces instead of on tabs. Given you have read in the lines from the file, all you need to do is:
string[] fileLines = File.ReadAllLines("data.tsv"); // read the file (requires System.IO; path is a placeholder)
foreach (string line in fileLines)
{
    string[] lineParts = line.Split(new char[] { '\t' });
}
and then do whatever you want with each lineParts. The \t is the tab character.
If you're also asking about getting the data into a database, you can just import tab-delimited files with the Import/Export Wizard (assuming you're using SQL Server Management Studio, but I'm sure there are comparable ways to import using other DB management software).

Excel export memory exception

I need to export a huge amount of data from an ADO.NET DataTable (which I get from a DB query) to Excel.
I tried the following way: 1. Create an Excel object with a workbook/worksheet on the server side, and use a MemoryStream to write the whole document to the client.
But this gave me an "out of memory" exception, because my memory stream was so huge.
So I replaced this with a new way, as follows:
Write each row from the DataTable as a comma-separated string to the client side. That way, as soon as we get each row we can write it to the client, so almost no server memory is used.
But this way we can only produce a CSV file, not an Excel file.
Does anybody know how to handle this situation?
Can I use Silverlight to get the data row by row from the server, pass it to the client side, and build the Excel file on the client?
You should create it on the server, then copy it to the client in chunks.
For an example, see this answer.
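A rough sketch of the chunked copy, assuming the workbook has already been written to filePath on the server:

Response.Clear();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment; filename=report.xls");
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[65536]; // 64 KB chunks; the size is arbitrary
    int read;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0 && Response.IsClientConnected)
    {
        Response.OutputStream.Write(buffer, 0, read);
        Response.Flush(); // push each chunk to the client as it is read
    }
}
Response.End();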
If this is for Excel 2007, then the workbook is basically in the Open XML file format.
If you can format the data in your DataTable to conform to Open XML, you can save the file and then just download the entire file.
Read up on Open XML at http://msdn.microsoft.com/en-us/library/aa338205.aspx
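A hedged sketch using the Open XML SDK's streaming OpenXmlWriter, so the sheet is never held in memory as a whole; table stands in for the DataTable from the question:

using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

using (var doc = SpreadsheetDocument.Create("report.xlsx", SpreadsheetDocumentType.Workbook))
{
    WorkbookPart wbPart = doc.AddWorkbookPart();
    WorksheetPart wsPart = wbPart.AddNewPart<WorksheetPart>();

    using (OpenXmlWriter writer = OpenXmlWriter.Create(wsPart))
    {
        writer.WriteStartElement(new Worksheet());
        writer.WriteStartElement(new SheetData());
        foreach (DataRow dataRow in table.Rows)
        {
            writer.WriteStartElement(new Row());
            foreach (object item in dataRow.ItemArray)
            {
                // Each cell is written out immediately rather than kept in a DOM.
                writer.WriteElement(new Cell
                {
                    DataType = CellValues.String,
                    CellValue = new CellValue(Convert.ToString(item))
                });
            }
            writer.WriteEndElement(); // Row
        }
        writer.WriteEndElement(); // SheetData
        writer.WriteEndElement(); // Worksheet
    }

    // Register the streamed worksheet in the workbook.
    wbPart.Workbook = new Workbook(
        new Sheets(new Sheet { Id = wbPart.GetIdOfPart(wsPart), SheetId = 1, Name = "Data" }));
    wbPart.Workbook.Save();
}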

Downloading >10,000 rows from a database table in ASP.NET

How should I go about providing download functionality on an ASP.NET page to download a series of rows from a database table, represented as a LINQ to SQL class that only has primitive types for members (ideally into a format that can be easily read by Excel)?
E.g.
public class Customer
{
    public int CustomerID;
    public string FirstName;
    public string LastName;
}
What I have tried so far:
Initially I created a DataTable, added all the Customer data to this table, and bound it to a DataGrid; then I had a download button that called DataGrid1.RenderControl to an HtmlTextWriter that was then written to the response (with content type "application/vnd.ms-excel"), and that worked fine for a small number of customers.
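Roughly, that looked like this (a reconstruction from the description above; names like customersTable are hypothetical):

var grid = new DataGrid();
grid.DataSource = customersTable; // DataTable filled with the Customer rows
grid.DataBind();

Response.Clear();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment; filename=customers.xls");

using (var sw = new StringWriter())
using (var htw = new HtmlTextWriter(sw))
{
    grid.RenderControl(htw);       // serialize the grid as an HTML table
    Response.Write(sw.ToString()); // Excel opens the HTML table as a sheet
}
Response.End();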
However, now the number of rows in this table is >10,000 and is expected to reach upwards of 100,000, so it is becoming prohibitive to display all this data on the page before the user can click the download button.
So the question is, how can I provide the ability to download all this data without having to display it all on a DataGrid first?
After the user requests the download, you could write the data to a file (.CSV, Excel, XML, etc.) on the server, then send a redirect to the file URL.
I have used the following method from Matt Berseth's blog for large record sets:
Export GridView to Excel
If you have issues with the request timing out, try increasing the HTTP request timeout in web.config.
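For example (the attribute is in seconds; the 600 here is arbitrary):

<system.web>
  <httpRuntime executionTimeout="600" />
</system.web>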
Besides the reasonable suggestion in one of the answers here to save the data to a file on the server first, I would like to also point out that there is no reason to use a DataGrid (it's one of your questions as well). DataGrid is overkill for almost anything. You can just iterate over the records and save them directly, using HtmlTextWriter, TextWriter (or just Response.Write or similar), to a server file or to the client output stream. It seems to me like an obvious answer, so I must be missing something.
Given the number of records, you may run into a number of problems. If you buffer all the data on the server first and then write it to the client output stream, it may be a strain on the server. But maybe not; it depends on the amount of memory on the server, the actual data size, and how often people will be downloading the data. This method has the advantage of not blocking a database connection for too long. Alternatively, you can write directly to the client output stream as you iterate. This may block the database connection for too long, since it depends on the download speed of the client. But again: if your application is of a small or medium size (in audience), then either way is fine.
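A minimal sketch of the direct-streaming variant, where GetCustomers() is a placeholder for your LINQ to SQL query:

Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment; filename=customers.csv");
Response.Write("CustomerID,FirstName,LastName\r\n");
foreach (Customer c in GetCustomers())
{
    // Write each record straight to the output stream; nothing is buffered
    // beyond the current row.
    Response.Write(c.CustomerID + "," + c.FirstName + "," + c.LastName + "\r\n");
}
Response.End();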
You should definitely check out the FileHelpers library. It's a free, excellent utility set of classes to handle just this situation: import and export of data from text files, either delimited (like CSV) or fixed width.
It offers a gazillion options and ways of doing things, and it's FREE, and it works really well in the various projects I'm using it in. You can export a DataSet, an array, a list of objects - whatever it is you have.
It even has import/export for Excel files, too - so you really get a bunch of choices.
Just start using FileHelpers - it'll save you so much boring typing and stuff, you won't believe it :-)
Marc
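For reference, a hedged sketch of the FileHelpers export side (CustomerRecord mirrors the Customer class from the question; the path and records variable are placeholders):

using FileHelpers;

[DelimitedRecord(",")]
public class CustomerRecord
{
    public int CustomerID;
    public string FirstName;
    public string LastName;
}

// Elsewhere, given a list of records to export:
var engine = new FileHelperEngine<CustomerRecord>();
engine.WriteFile(@"C:\exports\customers.csv", records);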
Just a word of warning: Excel has a limitation on the number of rows of data - 65,536 in the legacy .xls format. CSV will be fine, but if your customers are importing the file into Excel they will encounter that limitation.
Why not allow them to page through the data, perhaps sorting it before paging, and then give them a button to just get everything as a CSV file?
This seems like something that DLinq would do well: both the paging and the writing out, as it can fetch just one row at a time, so you don't read in all 100k rows before processing them.
So, for the CSV, you just need to use a different LINQ query to get all of the rows, then start saving them, separating each cell by a separator, generally a comma or tab. That could be something picked by the user, perhaps.
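A rough sketch of both pieces with LINQ to SQL (db is the generated DataContext; pageIndex, pageSize, and writer are placeholders):

// Paging for display: fetch just one page of rows.
var page = db.Customers
             .OrderBy(c => c.CustomerID)
             .Skip(pageIndex * pageSize)
             .Take(pageSize)
             .ToList();

// "Everything as CSV": enumerate the full query and write row by row.
foreach (var c in db.Customers)
{
    writer.WriteLine(c.CustomerID + "," + c.FirstName + "," + c.LastName);
}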
OK, I think you are talking about too many rows to do a DataReader and then loop through to create the CSV file. The only workable way will be to run:
SQLCMD -S MyInstance -E -d MyDB -i MySelect.sql -o MyOutput.csv -s ","
For how to run this from ASP.NET code, see here. Then once that is done, your ASP.NET page will continue with:
string fileName = "MyOutput.csv";
string filePath = Server.MapPath("~/"+fileName);
Response.Clear();
Response.AppendHeader("content-disposition",
    "attachment; filename=" + fileName);
Response.ContentType = "application/octet-stream";
Response.WriteFile(filePath);
Response.Flush();
Response.End();
This will give the user the popup to save the file. If you think more than one of these will happen at a time, you will have to adjust this (for example, by using a unique output file name per request).
So after a bit of research, the solution I ended up trying first was to use a slightly modified version of the code sample from http://www.asp.net/learn/videos/video-449.aspx, formatting each row value in my DataTable for CSV with the following code to avoid potentially problematic text:
private static string FormatForCsv(object value)
{
    var stringValue = value == null ? string.Empty : value.ToString();
    if (stringValue.Contains("\"")) { stringValue = stringValue.Replace("\"", "\"\""); }
    return "\"" + stringValue + "\"";
}
For anyone who is curious about the above, I'm basically surrounding each value in quotes and also escaping any existing quotes by making them double quotes. I.e.
My Dog => "My Dog"
My "Happy" Dog => "My ""Happy"" Dog"
This appears to be doing the trick for now for small numbers of records. I will try it soon with the >10,000 records and see how it goes.
Edit: This solution has worked well in production for thousands of records.
