Generating Excel Documents with an ASP.NET Website

I have an ASP.NET application that helps the user create a GridView with certain data in it. Once this table is generated, I want the user to push a button and be able to save the table as an Excel document. There are two different methods I know of:
Using HtmlTextWriter with ContentType "application/vnd.ms-excel" to send the file as an HttpResponse. I use GridView1.RenderControl(htmlTextWriter) to render the GridView. This almost works, but Excel always shows a warning when the file opens because the content doesn't match the extension. I have tried various content types to no avail. This makes sense, I guess, because I'm using an HtmlTextWriter; it also doesn't seem like good practice.
The second thing I've tried is generating the Excel file using Office Automation. But for the file to be generated, I need to save it to disk and then read it again. From what I have read, this is the only way, because the Excel object only becomes a real Excel file once you save it. I found that the SaveAs method of the Excel class would throw an exception because of write permissions, even when I tried to save in the App_Data folder. So I did some research and found that Office Automation is apparently discouraged for web applications: https://support.microsoft.com/en-us/kb/257757
Microsoft does not currently recommend, and does not support, Automation of Microsoft Office applications from any unattended, non-interactive client application or component (including ASP, ASP.NET, DCOM, and NT Services), because Office may exhibit unstable behavior and/or deadlock when Office is run in this environment.
There surely must be a safe way to have a website generate an Excel file and offer it to the user!? I can't imagine that this problem is unsolved or so rare that nobody cares about it, and yet I can't find any good solution to it.

The easiest (and best) way to create an Excel file is by using EPPlus.
EPPlus sample for a web application:
using OfficeOpenXml;          // EPPlus
using OfficeOpenXml.Style;
using System.Drawing;

using (ExcelPackage pck = new ExcelPackage())
{
    ExcelWorksheet ws = pck.Workbook.Worksheets.Add("Demo");

    //Load the DataTable into the sheet, starting from cell A1. Print the column names on row 1
    ws.Cells["A1"].LoadFromDataTable(tbl, true);

    //Format the header for columns 1-3
    using (ExcelRange rng = ws.Cells["A1:C1"])
    {
        rng.Style.Font.Bold = true;
        rng.Style.Fill.PatternType = ExcelFillStyle.Solid; //Set the background pattern to Solid
        rng.Style.Fill.BackgroundColor.SetColor(Color.FromArgb(79, 129, 189)); //Set color to dark blue
        rng.Style.Font.Color.SetColor(Color.White);
    }

    //Example of how to format column 1 as numeric
    using (ExcelRange col = ws.Cells[2, 1, 2 + tbl.Rows.Count, 1])
    {
        col.Style.Numberformat.Format = "#,##0.00";
        col.Style.HorizontalAlignment = ExcelHorizontalAlignment.Right;
    }

    //Write it back to the client
    Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    Response.AddHeader("content-disposition", "attachment; filename=ExcelDemo.xlsx");
    Response.BinaryWrite(pck.GetAsByteArray());
}
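Note that this snippet assumes tbl is a DataTable that already holds the data you bound to the GridView; EPPlus loads the worksheet from the DataTable rather than from the rendered grid. After Response.BinaryWrite you may also want to end the response (for example with Response.End()) so the rest of the page markup is not appended to the .xlsx stream.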

Related

Can I test the validity of an image file before uploading it in ASP.NET?

I have an ASP.NET web application that allows the user to upload a file from his PC to a SQL Server database (which is later used to generate an image for an <img> tag). Is there an "easy" way to test the image within .NET to validate that it does not contain anything malicious before saving it?
Right now, I use this:
MemoryStream F = new MemoryStream();
Bitmap TestBitmap = new Bitmap(Filename);
TestBitmap.Save(F, System.Drawing.Imaging.ImageFormat.Png);
int PhotoSize = (int)F.Length;
Photo = new byte[PhotoSize];
F.Seek(0, SeekOrigin.Begin);
int BytesRead = F.Read(Photo, 0, PhotoSize);
F.Close();
Creating TestBitmap fails if the file is not an image (e.g. if Filename is the name of a text file), but apparently this doesn't stop a file that is an image with malicious code appended to it from loading as an image. Saving it as a MemoryStream and then writing the stream to a byte array (which is later saved in the database) supposedly fixes this.
To prevent people from passing programs and other information using your photo-upload feature, you can take two main steps:
Read the image and save it again with your own code to strip out anything else (a minimal sketch of this follows after the links below).
Limit the size of each image to a reasonable value.
To prevent someone from uploading bad code and running it on your server, keep the uploads in an isolated folder that has no permission to execute anything. More information about that:
I've been hacked. Evil aspx file uploaded called AspxSpy. They're still trying. Help me trap them‼
And a general topic on the same subject: Preparing an ASP.Net website for penetration testing
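A minimal sketch of the "read and save again" step, assuming System.Drawing is available; the method name and the choice of PNG output are illustrative, not from the answer:

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

public static byte[] ReEncodeImage(Stream uploadedFile)
{
    // Decoding throws if the upload is not a valid image
    using (var original = new Bitmap(uploadedFile))
    using (var ms = new MemoryStream())
    {
        // Re-encoding writes only the pixel data, discarding anything
        // appended to or hidden after the original image stream
        original.Save(ms, ImageFormat.Png);
        return ms.ToArray();
    }
}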

How can I create a true, binary Excel file using pure ASP?

Is it possible to create a binary Excel file in classic ASP?
The common method is using this:
Response.ContentType = "application/octet-stream"
Response.AddHeader "Content-Disposition", "attachment; filename=test.txt"
which does not produce a "true" Excel file.
In .NET you could use the Microsoft.Office.Interop.Excel namespace; it is very easy to use, and if you have access to .NET I'd certainly go for that.
You say "in pure ASP" - if this means without COM objects etc., I don't think it's possible.
If, however, you wish to persevere with classic ASP with the option of using a COM object, you can create full binary spreadsheets using the Interop COM object; to do this you will need Excel installed on the web server.
For example, to open a blank excel file on disk (i.e. a template), change a value and save the finished file:
'#### Ready Excel
Set ExcelApp = CreateObject("Excel.Application")
ExcelApp.Application.Visible = True
'#### Open a blank Excel File
Set ExcelBook = ExcelApp.workbooks.Open("d:\wwwroot2\blankTemplate.xls")
'#### Select a sheet and change the value of a cell
ExcelBook.Worksheets("SheetName").Cells(1, 1).Value = "Lorem Ipsum"
'#### Switch between sheets
ExcelBook.Worksheets("AnotherSheetName").Select
'#### Save the finished file with a different name (to leave the template ready for easy re-use)
ExcelBook.SaveAs "d:\wwwroot2\FinishedFile.xls"
'#### Tidy up
ExcelBook.Close
ExcelApp.Application.Quit
Set ExcelApp = Nothing
Set ExcelBook = Nothing
A good thing about this method is that the Interop COM object mimics the VBA methods very closely, so if you're ever unsure how to do something, record a macro in Excel and view the raw VBA it produces; the methods and parameters will in most cases be very similar to the code you would use with the COM object.

ASP.NET - exporting a table

I have a big problem with exporting my table to Excel file format.
Firstly I created code which runs on the server and allows me to export data to Excel. Because my table is created dynamically from the database, there is nothing WITHIN the table at that stage, so no data were exported.
My second approach was targeting the final compiled table on the client side using either JavaScript or a very nice jQuery plugin called "DataTables" (www.datatables.net). Both of these attempts failed. JavaScript seems to be too complex for me, plus it has difficulties running in Firefox; the plugin, on the other hand, requires a very specific table structure which I am afraid I cannot provide.
So, my new idea is: grab the page just after it is compiled and built on the server, but before it is sent to the browser. Target THE table and source its data using a function on the server. Finally export the data to Excel, and send the page to the browser. Now, is it possible? And if yes, then how?
I am a beginner in the programming world, so any constructive suggestions and criticism would be highly appreciated. I would not mind any hard code examples ;)
You can try doing something like this:
protected void btnExport_Click(object sender, EventArgs e)
{
    Response.Clear();
    Response.Buffer = true;
    Response.ContentType = "application/vnd.ms-excel";
    Response.Charset = "";
    System.IO.StringWriter oStringWriter = new System.IO.StringWriter();
    System.Web.UI.HtmlTextWriter oHtmlTextWriter = new System.Web.UI.HtmlTextWriter(oStringWriter);
    //if you're exporting a table put the table in a placeholder and render
    //the placeholder to the text writer here
    grdJobs.RenderControl(oHtmlTextWriter);
    Response.Write(oStringWriter.ToString());
    Response.End();
}
What you need to do is export your query results to a .CSV file. CSV files can be opened in Excel with no problem at all. http://wiki.asp.net/page.aspx/401/export-to-csv-file/ shows you how to export to the .CSV format.
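A minimal sketch of that idea (the method name and quoting rules are illustrative, not taken from the linked article): write a DataTable out as CSV and send it to the browser as a download.

using System;
using System.Data;
using System.Linq;
using System.Text;

private void ExportToCsv(DataTable table, string fileName)
{
    Response.Clear();
    Response.ContentType = "text/csv";
    Response.AddHeader("content-disposition", "attachment; filename=" + fileName);

    StringBuilder sb = new StringBuilder();

    // Header row from the column names
    sb.AppendLine(string.Join(",", table.Columns.Cast<DataColumn>()
        .Select(c => c.ColumnName).ToArray()));

    // One line per data row, quoting every value so embedded commas and quotes are safe
    foreach (DataRow row in table.Rows)
    {
        sb.AppendLine(string.Join(",", row.ItemArray
            .Select(v => "\"" + Convert.ToString(v).Replace("\"", "\"\"") + "\"").ToArray()));
    }

    Response.Write(sb.ToString());
    Response.End();
}

This keeps everything in memory, which is fine for typical grid-sized exports; for very large result sets you would write rows directly to Response.Output instead.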
You're going to get a lot of suggestions rather than answers on this one. My recommendation would be to try the jQuery plugin table2csv in order to create a more universal file format. But there are ways to target an actual Excel format, like this project.
If you want to export to actual XLS or XLSX instead of just CSV or something that just "opens" in Excel, there are third party tools that can help you with this. One example here:
http://www.officewriter.com

Merging/filling pdf form file with xml data

Let's say I have a PDF form file available on a website which is filled in by the users and submitted to the server. On the server side (ASP.NET) I would like to merge the data that I receive in XML format with the empty PDF form that was filled in, and save it.
As I have found, there are several possible ways of doing it:
Using a PDF form created by Adobe Acrobat and filling it with iTextSharp.
Using a PDF form created by Adobe Acrobat and filling it with the FDF Toolkit .net (which seems to be using iTextSharp internally).
Using pdftk to fill the form.
Using a PDF form file created with Adobe LiveCycle and merging the data with the Form Data Integration Service.
As I have no experience with this kind of task, can you advise which option would be better/easier and give some additional tips?
Thank you in advance.
I would suggest using the 4th approach if possible because it would be cleaner. You would be using solutions specifically tailored for what you are asking to do, but if you don't have the available resources for such a solution I would suggest using the 1st option.
The 1st option is what I have recently dove into. I have found it relatively painless to implement.
Option 1 is possible if the following applies:
You have control over development of the PDF forms.
You have control over the format of the XML data.
You can live with having uncompressed (fast web view = false) PDF files.
Example of implementation:
Use Adobe Acrobat to generate a PDF form. Tip: use Adobe native fonts when generating the forms. Each control you add that does not use a native font will embed the font and bloat the file when it is not compressed, and to my knowledge iTextSharp currently does not produce compressed PDFs.
Use the iTextSharp library to combine the XML data with the PDF form to generate a populated document. Tip: to manually populate a PDF form from XML you must map XML values to control names in the PDF form and match them by page, as shown in the example below.
using (MemoryStream stream = GeneratePDF(m_FormsPath, oXmlData))
{
    byte[] bytes = stream.ToArray();
    Response.ContentType = "application/pdf";
    Response.BinaryWrite(bytes);
    Response.End();
}

/// <summary>
/// This method combines PDF forms with XML data
/// </summary>
/// <param name="m_FormName">pdf form file path</param>
/// <param name="oData">xml dataset</param>
/// <returns>memory stream containing the pdf data</returns>
private MemoryStream GeneratePDF(string m_FormName, XmlDocument oData)
{
    PdfReader pdfTemplate;
    PdfStamper stamper;
    PdfReader tempPDF;
    Document doc;
    MemoryStream msTemp;
    PdfWriter pCopy;
    MemoryStream msOutput = new MemoryStream();

    pdfTemplate = new PdfReader(m_FormName);
    doc = new Document();
    pCopy = new PdfCopy(doc, msOutput);
    pCopy.AddViewerPreference(PdfName.PICKTRAYBYPDFSIZE, new PdfBoolean(true));
    pCopy.AddViewerPreference(PdfName.PRINTSCALING, PdfName.NONE);
    doc.Open();

    for (int i = 1; i < pdfTemplate.NumberOfPages + 1; i++)
    {
        msTemp = new MemoryStream();
        pdfTemplate = new PdfReader(m_FormName);
        stamper = new PdfStamper(pdfTemplate, msTemp);

        // map xml values to pdf form controls (element name = control name)
        foreach (XmlElement oElem in oData.SelectNodes("/form/page" + i + "/*"))
        {
            stamper.AcroFields.SetField(oElem.Name, oElem.InnerText);
        }

        stamper.FormFlattening = true;
        stamper.Close();

        tempPDF = new PdfReader(msTemp.ToArray());
        ((PdfCopy)pCopy).AddPage(pCopy.GetImportedPage(tempPDF, i));
        pCopy.FreeReader(tempPDF);
    }

    doc.Close();
    return msOutput;
}
Save the file, or post the file to the response of your ASP.NET page.
Since you tagged this 'LiveCycle', I take it you have an installation of Adobe LiveCycle running somewhere (optionally, can install it somewhere).
In that case, I'd go for number 4 (with the modification of using the Adobe LiveCycle Forms ES module). The other three will undoubtedly yield compatibility issues in the long run. With the LiveCycle server (running the Forms module), you'll be able to handle any PDF, whether it's old, new, static, dynamic, compressed, Acrobat-based or LiveCycle-based.
You should be able to set things up, have the form send its data to the LiveCycle server, and use that data to populate the form. The fill can then be stored in the server's database, or routed into the PDF form (or any other form) and streamed back to the client.
Create the form using LiveCycle Designer.
The quick-and-dirty option would be the following: set the form to HTTP-post its data (for example as XFDF; see Acrobat for more info) to your ASP server and publish the form on the server (make sure your users don't download the form before opening it, otherwise this won't work - the form has to be opened in the web browser). Then simply capture the submissions as you would capture an HTTP post from a web page. Optionally, save the fill to a database. Then send the captured XFDF stream back to the client (this could also be invoked at a later stage via an HTTP link). The XFDF stream will contain the URL of the form used to fill it out. The client web browser will ask the Acrobat/Adobe Reader plugin to handle the XFDF stream, and the plugin will locate, download and populate the form pointed to by the XFDF.
The user should now be able to save the form AND its fill - no Reader Extensions needed!
You can also use iTextSharp to fill XML data into a Reader Extensions enabled form. There are two things you need to set correctly:
Set PdfReader.unethicalreading = true to prevent a BadPasswordException.
Set append mode in PdfStamper's constructor, otherwise the Adobe Reader Extensions signature becomes broken and Adobe Reader will display the following message: "This document contained certain rights to enable special features in Adobe Reader. The document has been changed since it was created and these rights are no longer valid. Please contact the author for the original version of this document."
So all you need to do is this:
PdfReader.unethicalreading = true;

using (var pdfReader = new PdfReader("form.pdf"))
{
    using (var outputStream = new FileStream("filled.pdf", FileMode.Create, FileAccess.Write))
    {
        using (var stamper = new iTextSharp.text.pdf.PdfStamper(pdfReader, outputStream, '\0', true))
        {
            stamper.AcroFields.Xfa.FillXfaForm("data.xml");
        }
    }
}
See How to fill XFA form using iText?

How can I export my ASP.NET page to Excel?

How can I export the data in my webapp to an Excel sheet from ASP.NET (VB.NET,SQL 2005)?
Change the ContentType of your ASP.NET page:
Response.ContentType = "application/ms-excel"
One of my most popular blog posts is about how to generate an Excel document from .NET code, using the Excel XML markup (this is not OpenXML, it's standard Excel XML) - http://www.aaron-powell.com/linq-to-xml-to-excel
I also link off to an easier way to do it with VB 9.
Although this is .NET 3.5 code it could easily be done in .NET 2.0 using XmlDocument and creating the nodes that way.
Then it's just a matter of setting the right response headers and streaming it back in the response.
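A minimal sketch of that approach (this is not the blog's code; the sheet name and cell contents are illustrative): build an Excel 2003 SpreadsheetML document with LINQ to XML and stream it back.

using System.Xml.Linq;

XNamespace ss = "urn:schemas-microsoft-com:office:spreadsheet";

XDocument workbook = new XDocument(
    new XProcessingInstruction("mso-application", "progid=\"Excel.Sheet\""),
    new XElement(ss + "Workbook",
        new XAttribute(XNamespace.Xmlns + "ss", ss),
        new XElement(ss + "Worksheet",
            new XAttribute(ss + "Name", "Report"),
            new XElement(ss + "Table",
                new XElement(ss + "Row",
                    new XElement(ss + "Cell",
                        new XElement(ss + "Data",
                            new XAttribute(ss + "Type", "String"),
                            "Hello from LINQ to XML")))))));

Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("content-disposition", "attachment; filename=Report.xls");
workbook.Save(Response.Output);
Response.End();

Keep in mind that newer Excel versions may still warn that the content does not match the .xls extension, since the file is really XML - the same warning the original question describes.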
SpreadsheetGear for .NET will do it. You can find a bunch of live ASP.NET samples with C# & VB.NET source on this page.
Disclaimer: I own SpreadsheetGear LLC
If you can display your data in a GridView control, it inherently supports "right-click-->Export to Excel" without having to write any code whatsoever.
SQL Server Reporting services would be the best way to export data from an application into Excel.
If you don't have access to / don't want to use Reporting Services, then depending on the data you want to extract/format, using a CSV structure instead of Excel may be easiest.
Use the Microsoft.Office.Interop.Excel DLLs to create Excel files with your data and then provide links to download the files using Hunter Daley's download method...
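A minimal sketch of that approach, assuming C# 4's optional COM arguments (the method name and cell values are illustrative; also bear in mind the KB article quoted at the top of this page discourages automating Office on a server):

using Excel = Microsoft.Office.Interop.Excel;

public static void CreateWorkbook(string outputPath)
{
    Excel.Application excel = new Excel.Application();
    try
    {
        Excel.Workbook workbook = excel.Workbooks.Add();
        Excel.Worksheet sheet = (Excel.Worksheet)workbook.Worksheets[1];

        // Write a header row and one data row
        sheet.Cells[1, 1] = "Name";
        sheet.Cells[1, 2] = "Amount";
        sheet.Cells[2, 1] = "Widgets";
        sheet.Cells[2, 2] = 12.5;

        workbook.SaveAs(outputPath);
        workbook.Close(false);
    }
    finally
    {
        // Always shut the hidden Excel instance down, or it will linger on the server
        excel.Quit();
    }
}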
As a general solution, you may want to consider writing a handler (.ashx) for exporting - and pass it either the query parameters needed to recreate the query that generates the data, or an identifier to get the data from the cache (if cached). Depending on whether CSV is sufficient for your Excel export, you could just format the data and send it back, setting the ContentType as @Hunter suggests, or use the primary interop assemblies (which would require Excel on the server) to construct a real Excel spreadsheet and serialize it to the response stream.
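A minimal sketch of the .ashx idea (the handler name, query-string parameter and placeholder CSV body are illustrative, not from the answer): the handler receives an identifier, looks up the data, and streams it back with a download-friendly content type.

using System.Web;

public class ExportHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // e.g. /ExportHandler.ashx?reportId=42
        string reportId = context.Request.QueryString["reportId"];

        // Re-run the query (or pull from cache) for this identifier here;
        // the CSV below is a placeholder for that data.
        string csv = "Id,Name\r\n" + reportId + ",Example";

        context.Response.ContentType = "text/csv";
        context.Response.AddHeader("content-disposition",
            "attachment; filename=export.csv");
        context.Response.Write(csv);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}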
I prefer to use an OLEDB connection string:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Excel.xls;Extended Properties="Excel 8.0;HDR=Yes;IMEX=1";
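A minimal sketch of using that connection string to write a worksheet (the sheet and column names are illustrative; note that IMEX=1 is an import setting and is usually dropped when writing):

using System.Data.OleDb;

string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Excel.xls;" +
                 "Extended Properties=\"Excel 8.0;HDR=Yes\"";

using (OleDbConnection conn = new OleDbConnection(connStr))
{
    conn.Open();

    // Creates a worksheet named Export with two typed columns
    using (OleDbCommand create = new OleDbCommand(
        "CREATE TABLE [Export] ([Name] VarChar(255), [Amount] Double)", conn))
    {
        create.ExecuteNonQuery();
    }

    // Insert one row; the Jet provider uses positional (?) parameters
    using (OleDbCommand insert = new OleDbCommand(
        "INSERT INTO [Export] ([Name], [Amount]) VALUES (?, ?)", conn))
    {
        insert.Parameters.AddWithValue("?", "Widgets");
        insert.Parameters.AddWithValue("?", 12.5);
        insert.ExecuteNonQuery();
    }
}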
Not sure about exporting a page, but if you just want to export a dataset or datatable:
HttpContext.Current.Response.Clear()
HttpContext.Current.Response.AddHeader("content-disposition", String.Format("attachment; filename={0}", fileName))
HttpContext.Current.Response.ContentType = "application/ms-excel"
Dim sw As StringWriter = New StringWriter
Dim htw As HtmlTextWriter = New HtmlTextWriter(sw)
Dim table As Table = New Table
' populate the Table control from your DataSet/DataTable before rendering it
table.RenderControl(htw)
' render the htmlwriter into the response
HttpContext.Current.Response.Write(sw.ToString)
HttpContext.Current.Response.End()
I use almost exactly the same code as CodeKiwi. I would use that if you have a DataTable and want to stream it to the client browser.
If you want a file, you could also do a simple loop through each row/column, create a CSV file and I guess provide a link to the client - you can use a file extension of CSV or XLS. Or if you stream the resulting file to the client it will prompt them if they want to open or save it to disk.
The interops are (well were last time I tried them) great for small datasets, but didn't scale well - horrifically slow for larger datasets.
