VB.net - RsData / Cleaning up code - asp.net

I have a question regarding VB.net and the use of rsData connections to an SQL database.
Basically we have a few inline pages that display course information for the courses my institution runs. The code connects to a SQL DB and pulls live data through directly in the following format:
html += "<tr><td>" & rsData("M_Start") & "</td><td>" & rsData("WEEKS") & "</td><td>" & rsData("DAYSTIME") & "</td></tr>"
Now I was wondering whether people would suggest pulling directly from an open DB connection or mapping the rsData results to strings? All data connections are opened and closed once they have done their work, and we have around five different procedures that run within the page.
I'm worried that the code isn't as clean as it could be and would really like to tidy up this inherited nightmare. Also, can anyone share best practices for inline code and multiple data connections?
Thanks!

It's difficult to give you a full solution on how to clean up the code without seeing it all. Better ways to display your data might be a GridView or Repeater (there is a sketch of that at the end of this answer).
However, if you are going to build up the HTML in a String variable I'd suggest doing the portion you've posted like this:
Dim html As New Text.StringBuilder
html.Append(String.Format("<tr><td>{0}</td><td>{1}</td><td>{2}</td></tr>",
rsData("M_Start"),
rsData("WEEKS"),
rsData("DAYSTIME")))
It makes the code more readable, and a StringBuilder performs better than repeatedly concatenating a String variable.
I'm not sure how you are dealing with repetition, but you could then do something like this (assuming rsData belongs to a DataTable):
Const htmlRowFormat As String = "<tr><td>{0}</td><td>{1}</td><td>{2}</td></tr>"
Dim html As New Text.StringBuilder
For Each dr As DataRow In rsDataTable.Rows
html.Append(String.Format(htmlRowFormat,
rsDataTable("M_Start"),
rsDataTable("WEEKS"),
rsDataTable("DAYSTIME")))
Next
To get your HTML, call html.ToString().
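If you do go the GridView route mentioned above, the data access can be pulled out of the inline code entirely. A rough C# sketch (the VB.NET equivalent is nearly identical; the Courses table name, the CoursesGrid ID and the connection-string name are assumptions, not from the original code):
using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

protected void Page_Load(object sender, EventArgs e)
{
    string connStr = ConfigurationManager.ConnectionStrings["CoursesDb"].ConnectionString;
    using (var conn = new SqlConnection(connStr))
    using (var da = new SqlDataAdapter("SELECT M_Start, WEEKS, DAYSTIME FROM Courses", conn))
    {
        var dt = new DataTable();
        da.Fill(dt);                    // Fill opens and closes the connection itself
        CoursesGrid.DataSource = dt;    // CoursesGrid is a GridView declared in the markup
        CoursesGrid.DataBind();         // the GridView renders the <tr>/<td> markup for you
    }
}
That keeps each connection open only for the duration of the Fill and removes the hand-built HTML altogether.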

Related

Iterating through IHtmlElementCollection

I have a VB web application that needs to read information from an existing webpage on the internet, so I use the mshtml library. I read the HTML into an IHTMLDocument3 interface. After that I iterate through an IHTMLElementCollection, and everything worked fine in the Visual Studio 2010 debugger. At least, the first time. When I debug the code a second time, after iterating a few elements, the next elements return Nothing and I get an exception. (When I break into the code, the IHTMLElementCollection shows 0 items.) When I rename all the variables, it runs properly, but again, only the first time.
Here's the code I use to debug. I have commented out the actual code because it throws an exception (null reference). Do I need to manually release a collection or something, or am I doing something stupid?
'global variable
Private tables As IHTMLElementCollection
...........................................
Dim tableChildren As IHTMLElementCollection = tables(3).children
Dim trElements As IHTMLElementCollection = tableChildren.item(0).getElementsByTagName("tr")
Dim intCount As Integer 'just for debugging purposes
For Each element As IHTMLElement In trElements
intCount += 1 'for debugging purposes
Debug.Print(intCount.ToString & vbNewLine & element.innerHTML)
'strLine1 = element.children(0).innerText
'strLine2 = element.children(1).innerText
'and so on...
Next
I assume that by this point you've already resolved the problem one way or another, but I thought I'd suggest the HtmlAgilityPack. It has the advantage of being written to support just this type of scenario, and (as far as I know) is a native .NET library rather than being COM-based. It might be a better fit for your situation.
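To give a feel for it, here is a rough C# sketch of the same debugging loop using HtmlAgilityPack (the VB.NET translation is direct; the URL is a placeholder and real code should check the SelectNodes results for Nothing/null):
using System;
using System.Diagnostics;
using HtmlAgilityPack;

var web = new HtmlWeb();
HtmlDocument doc = web.Load("http://example.com/page.html");   // hypothetical URL

// roughly the equivalent of tables(3) ... getElementsByTagName("tr")
var tables = doc.DocumentNode.SelectNodes("//table");
var trElements = tables[3].SelectNodes(".//tr");

int intCount = 0;   // just for debugging purposes, as in the original
foreach (HtmlNode tr in trElements)
{
    intCount++;
    Debug.Print(intCount + Environment.NewLine + tr.InnerHtml);
    // var strLine1 = tr.SelectNodes(".//td")[0].InnerText;
    // var strLine2 = tr.SelectNodes(".//td")[1].InnerText;
}
Because everything here is a plain managed object, there is no COM collection that can go stale between debugging sessions.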

Need a quick and simple classic asp page to query records from a sql server database

I know this seems elementary, but I have been looking for 2 days and all I find are snippets that don't work. I am simply trying to have a web page dynamically display the contents of a table with 4 columns.
Need by tomorrow!
Help!
Thank you!
Here's the simplest way to do it. This is assuming your server is SQL Server. If not, head to http://connectionstrings.com and look up the specifics for your server. That site is awesome and I find myself on it all the time.
set rs = server.CreateObject("ADODB.Recordset")
rs.open "select col1 from table1", "provider=sqloledb.1;uid=user;pwd=password;database=database;Server=server;"
do while rs.EOF = false
    response.write rs("col1")
    rs.MoveNext
loop
rs.Close            ' always close the recordset when you're finished with it
set rs = nothing
What's going on here is that we're using Microsoft's ADO database library. I'm creating a Recordset object and calling its open method. Provided to the open method are the SQL statement I want to execute and the specifics on how to connect to the database. The specifics on how to connect to the database are commonly referred to as a "connection string." The site mentioned above is an invaluable resource in figuring out exactly what this should look like; 99% of the time, any problems I've run into have been an invalid connection string. Once the recordset is opened, I loop through the returned records in the while loop and write the data out to the page.
DON'T FORGET THE CALL TO rs.MoveNext!!! I've forgotten it a handful of times over the years, and you wind up with an infinite loop.

Looking for a SQL injection demonstration

I'm a web applications developer, using Classic ASP as server side script.
I always protect my apps from SQL injection by using a simple function that doubles single apostrophes in string parameters.
Function ForSQL(strString)
ForSQL = Replace(strString, "'", "''")
End Function
For numeric parameters, I use the CInt, CLng and CDbl functions.
I often write concatenated queries; I don't always use stored procedures and I don't always validate user input.
I'd like to ask you if someone can show me a working attack against this line of code:
strSQL = "SELECT Id FROM tUsers WHERE Username='" & _
ForSQL(Left(Request.Form("Username"),20)) & "' AND Password='" & _
ForSQL(Left(Request.Form("Password"),20)) & "'"
It may be a trivial question, but I've never found any kind of attack that works against it.
I've always used "sqli helper 2.7" (you can download it) to find most/all SQL injection points. I'm not sure if this will help at all, but it will at least test for all of the SQL comment tricks and everything. I remember on one of my sites it found a main SQL injection that could dump all of my database data. It's not exactly what you're looking for, but it might be able to find a way through.
There is no functioning SQL injection for input sanitized this way. The downside is that when you retrieve data from the database you have to replace the doubled apostrophes:
sDataRetrievedFromDatabase = Replace(sDataRetrievedFromDatabase, "''", "'")

asp.net form output: streamwriter to write to file versus database connection

I have to make an ASP.net form and only need to gather 3 fields: name, bday and email.
Do you think it's best to write the info to a CSV or XML file, or do you think it's worth it to write to a SQL DB and then export from there to a file?
I'm of the opinion that just writing to a flat file is best, because it's going to need to be exported to a CSV/XML file anyway so it can be appended to an Excel file.
I'd use something like StreamWriter or FileStream in my C# submit-button handler:
// note: string.Concat writes the values with no separator; for a real CSV you
// would join them with commas and escape any embedded commas or quotes
using (StreamWriter sw = new StreamWriter(filename, true))
{
    sw.WriteLine(string.Concat(
        textBox1.Text,
        textBox2.Text,
        textBox3.Text,
        textBox4.Text,
        textBox5.Text,
        textBox6.Text,
        textBox7.Text));
}
Am I overlooking shortcomings of using CSV and StreamWriter? For example, do any weird things happen when the file gets to a certain size?
Also, how does StreamWriter compare to FileStream, or should I be looking at a different method entirely?
Even for small applications you will never regret using a database, especially if they ever change or grow. Text files are much harder to change if you later want to store other data.
File access from web applications can be challenging at best. If you do use a database, I would ultimately look into XML/XSD and using DataSets.
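If the database route appeals, here is a minimal sketch of the insert side, assuming a SQL Server table named Submissions(Name, Birthday, Email) and a connection string called "FormDb" in web.config (all hypothetical names):
using System.Configuration;
using System.Data.SqlClient;

string connStr = ConfigurationManager.ConnectionStrings["FormDb"].ConnectionString;
using (var conn = new SqlConnection(connStr))
using (var cmd = new SqlCommand(
    "INSERT INTO Submissions (Name, Birthday, Email) VALUES (@name, @bday, @email)", conn))
{
    // parameters avoid both SQL injection and CSV-escaping headaches
    cmd.Parameters.AddWithValue("@name", textBox1.Text);
    cmd.Parameters.AddWithValue("@bday", textBox2.Text);
    cmd.Parameters.AddWithValue("@email", textBox3.Text);
    conn.Open();
    cmd.ExecuteNonQuery();
}
Exporting to CSV or Excel later is then a single SELECT, and concurrent submissions never fight over a file lock.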

Downloading >10,000 rows from database table in asp.net

How should I go about providing download functionality on an asp.net page to download a series of rows from a database table represented as a linq2sql class that only has primitive types for members (ideally into a format that can be easily read by Excel)?
E.g.
public class Customer
{
public int CustomerID;
public string FirstName;
public string LastName;
}
What I have tried so far.
Initially I created a DataTable, added all the Customer data to this table and bound it to a DataGrid, then had a download button that called DataGrid1.RenderControl to an HtmlTextWriter that was then written to the response (with content type "application/vnd.ms-excel") and that worked fine for a small number of customers.
However, now the number of rows in this table is >10,000 and is expected to reach upwards of 100,000, so it is becoming prohibitive to display all this data on the page before the user can click the download button.
So the question is, how can I provide the ability to download all this data without having to display it all on a DataGrid first?
After the user requests the download, you could write the data to a file (.CSV, Excel, XML, etc.) on the server, then send a redirect to the file URL.
I have used the following method from Matt Berseth's blog for large record sets.
Export GridView to Excel
If you have issues with the request timing out, try increasing the HTTP request timeout in web.config:
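Something along these lines (a sketch; executionTimeout is in seconds and the value shown is just an illustration):
<system.web>
  <!-- default is 110 seconds -->
  <httpRuntime executionTimeout="600" />
</system.web>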
Besides the reasonable suggestion in one of the answers here to save the data to a file on the server first, I would also like to point out that there is no reason to use a DataGrid (it's one of your questions as well). DataGrid is overkill for almost anything. You can just iterate over the records and save them directly, using HtmlTextWriter, TextWriter (or just Response.Write or similar), to the server file or to the client output stream. It seems to me like an obvious answer, so I must be missing something.
Given the number of records, you may run into a number of problems. If you buffer all the data on the server before writing it to the client output stream, it may be a strain on the server. But maybe not; it depends on the amount of memory on the server, the actual data size, and how often people will be downloading the data. This method has the advantage of not blocking a database connection for too long. Alternatively, you can write directly to the client output stream as you iterate. This may block the database connection for longer, since it depends on the download speed of the client. But again, if your application is small or medium-sized (in audience), then anything is fine.
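To make the streaming option concrete, here is a rough C# sketch of writing rows straight to the client output stream with no DataGrid, assuming a LINQ to SQL DataContext named db with a Customers table (the names are illustrative, and real code should escape the values):
Response.Clear();
Response.ContentType = "text/csv";
Response.AppendHeader("content-disposition", "attachment; filename=customers.csv");
Response.Write("CustomerID,FirstName,LastName\r\n");
foreach (var c in db.Customers)   // rows are fetched as you iterate, not all at once
{
    Response.Write(c.CustomerID + "," + c.FirstName + "," + c.LastName + "\r\n");
}
Response.End();
In practice the values should be quoted and escaped, along the lines of the FormatForCsv helper further down this thread.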
You should definitely check out the FileHelpers library. It's an excellent, free set of utility classes for handling just this situation: importing and exporting data from text files, either delimited (like CSV) or fixed width.
It offers a gazillion options and ways of doing things, it's FREE, and it works really well in the various projects I'm using it in. You can export a DataSet, an array, a list of objects - whatever it is you have.
It even has import/export for Excel files, too - so you really get a bunch of choices.
Just start using FileHelpers - it'll save you so much boring typing and stuff, you won't believe it :-)
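To make that concrete, a hypothetical minimal sketch (the record class mirrors the Customer class from the question; the output path and the customers collection are assumptions):
using FileHelpers;

[DelimitedRecord(",")]
public class CustomerRecord
{
    public int CustomerID;
    public string FirstName;
    public string LastName;
}

// customers is an IEnumerable<CustomerRecord>, e.g. projected from the Linq2Sql query
var engine = new FileHelperEngine<CustomerRecord>();
engine.WriteFile(Server.MapPath("~/customers.csv"), customers);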
Marc
Just a word of warning, Excel has a limitation on the number of rows of data - ~65k. CSV will be fine, but if your customers are importing the file into Excel they will encounter that limitation.
Why not allow them to page through the data, perhaps sorting it before paging, and then give them a button to just get everything as a CSV file?
This seems like something that DLinq would do well, both the paging and the writing out, as it can fetch one row at a time, so you don't read in all 100k rows before processing them.
So, for CSV, you just need to use a different LINQ query to get all of the rows, then start to save them, separating each cell by a separator, generally a comma or tab. That could be something picked by the user, perhaps.
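As a loose illustration of that approach, assuming a LINQ to SQL DataContext named db (pageIndex, pageSize and the sort column are placeholders):
int pageSize = 50;
int pageIndex = 0;   // whichever page the user is on
var page = db.Customers
             .OrderBy(c => c.LastName)        // sort before paging
             .Skip(pageIndex * pageSize)
             .Take(pageSize)
             .ToList();
// bind "page" to the grid; the "download everything" button runs the unpaged
// query and writes each row out with the separator the user picked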
OK, I think you are talking about too many rows to use a DataReader and then loop through to create the CSV file. The only workable way will be to run:
SQLCMD -S MyInstance -E -d MyDB -i MySelect.sql -o MyOutput.csv -s ","
For how to run this from ASP.Net code see here. Then once that is done, your ASP.Net page will continue with:
string fileName = "MyOutput.csv";
string filePath = Server.MapPath("~/"+fileName);
Response.Clear();
Response.AppendHeader("content-disposition",
"attachment; filename=" + fileName);
Response.ContentType = "application/octet-stream";
Response.WriteFile(filePath);
Response.Flush();
Response.End();
This will give the user the popup to save the file. If you think more than one of these will happen at a time you will have to adjust this.
So after a bit of research, the solution I ended up trying first was to use a slightly modified version of the code sample from http://www.asp.net/learn/videos/video-449.aspx and format each row value in my DataTable for CSV using the following code to try to avoid potentially problematic text:
private static string FormatForCsv(object value)
{
var stringValue = value == null ? string.Empty : value.ToString();
if (stringValue.Contains("\"")) { stringValue = stringValue.Replace("\"", "\"\""); }
return "\"" + stringValue + "\"";
}
For anyone who is curious about the above, I'm basically surrounding each value in quotes and also escaping any existing quotes by making them double quotes. I.e.
My Dog => "My Dog"
My "Happy" Dog => "My ""Happy"" Dog"
This appears to be doing the trick for now for small numbers of records. I will try it soon with the >10,000 records and see how it goes.
Edit: This solution has worked well in production for thousands of records.
