Reasons to remove IDs from Controls on pages - asp.net

It is possible to remove IDs from pages being rendered in .NET. This can be done simply by setting the ID of an element that has the runat="server" attribute to null. Obviously this shouldn't be done for controls that have to be evaluated or used in postback scenarios. I am very curious how widely removing IDs will be used now that people know this can be done. I know that by removing ID values you are able to save some bandwidth, but what would be a reason to start using this method?

If you're really concerned with performance, I'd worry less about the verbosity of the .NET control IDs (which is a real pet dislike of mine) and more about the overall postback model.
The whole "send the state to the client so that it can post it back to the server" approach is woefully inefficient in both latency and bandwidth terms.
If it's a new project it's probably worth using MVC instead; if it's an existing one, try turning on page compression in IIS.
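For reference, here is a minimal sketch of the technique the question describes, assuming a display-only control; the control name is hypothetical and not from the original post:

protected void Page_PreRender(object sender, EventArgs e)
{
    // Assumption: lblGreeting is a display-only Label declared with runat="server"
    // and is never needed during postback processing.
    // Clearing the ID keeps an id attribute from being emitted for it.
    lblGreeting.ID = null;
}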

Related

ASP.NET OutputCacheAttribute too good to be true?

So I've been looking into effective ways to take the load off of my database in my ASP.NET application, and I've run into System.Web.Mvc.OutputCacheAttribute. I've used caching based on System.Web.HttpRuntime.Cache before, and this seems to be pretty much functionally equivalent.
I've done a lot of research on it, and everything I've seen portrays it as some sort of silver bullet for caching requests, as long as you configure it effectively. I find this hard to believe. I understand that, to a degree, all effective caching really takes is storing the output data based on certain conditions, but it still seems too easy to just tack on an attribute and have your application magically perform better.
Has anyone had any experience with the benefits/drawbacks of using Output Caching in ASP.NET? If so, what are the pain points of using this approach to caching?
Caching can do wonders, trading memory for lower latency. The devil's in the "configuring it effectively."
The important thing is to nail down for yourself what acceptable behavior is in the application, e.g., is it OK if the "top 3 posts" on the front page are up to 1 minute old? Is it OK if the "current users online" list is up to 30 seconds old? Is it OK if the main page takes 0.75 seconds to load, or does it need to be faster? Your answers to these questions will determine what should or should not be cached. Profile your application so you understand where the real performance bottlenecks are, and why they exist, so you know where to focus your optimization/caching efforts.
There are many forms of caching available in a .Net application. OutputCache is just one form:
Application-Level Caching (shared by everything in the application - Application[Key])
Object Caching (automatically managed with cache invalidation callbacks - Cache[Key])
Output Caching (caching the generated output of aspx pages/parts - OutputCache property)
Per-Request Caching (caching calculated data during a single request - Context[key])
Session Caching (caching data specific to a user's session - Session[key])
They all have their pros and cons, and a well-designed application will probably make use of most or all of these forms of caching. If you want some points to consider with OutputCache, here are a few:
Try to cache parts of a page rather than a full page, because they are more likely to be re-usable. Building your pages out of components like a UserControl can help here.
Be careful with using a set of parameters that vary greatly, such as a QueryString parameter that is different per item id, because you will end up generating a lot of cached copies that are used infrequently, consuming lots of memory with very little benefit.
Note that OutputCache merely saves the generated output of the ASPX markup, so it will not work as well as other caching types on a dynamic page that changes form based on user input.
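As a concrete illustration of the MVC attribute the question mentions, here is a minimal sketch; the controller, the repository call, and the duration are my own assumptions, not from the original posts:

using System.Web.Mvc;

public class ProductController : Controller
{
    // Cache the rendered output of this action for 60 seconds,
    // keeping a separate cached copy per value of the "id" parameter.
    [OutputCache(Duration = 60, VaryByParam = "id")]
    public ActionResult Details(int id)
    {
        // Hypothetical data access; this is the expensive call the cache avoids.
        var model = ProductRepository.GetById(id);
        return View(model);
    }
}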
From my experience, there is one very obvious and very often forgotten thing about this attribute.
It is the fact that a method whose output is cached won't even be executed once the output has been cached. So, if the code behind the action has side effects, they won't take place (e.g. logging to the database the fact that a user visited the page).
I have seen at least a few very nasty bugs because of that.
Short advice: use it sparingly and be 101% sure that every dev on the team knows exactly how it works.
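To make that gotcha concrete, here is a minimal sketch with a hypothetical audit-logging call; once the output is cached, the action body, and therefore the logging, does not run again until the cache entry expires:

using System.Web.Mvc;

public class ReportController : Controller
{
    [OutputCache(Duration = 300, VaryByParam = "none")]
    public ActionResult Monthly()
    {
        // Hypothetical side effect: this only executes on a cache miss,
        // so visits served from the cache are never logged.
        AuditLog.RecordVisit(User.Identity.Name, "Monthly report");
        return View(ReportBuilder.BuildMonthly()); // hypothetical report builder
    }
}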

How to increase the performance of an asp.net application?

Hi,
I want to increase the performance of my ASP.NET application when multiple users access it, about 5000 users.
How can we do this?
Your ASP.NET application's performance depends on many things, and you can improve it in many ways. Your question is very subjective, so the answer is really a set of best practices for improving an ASP.NET application's performance.
I have gathered some tips from the net; unfortunately, I cannot remember where from. Search on any item and you will find many resources to help you implement it:
Use Cache:
Page output caching.
Page fragment caching.
Data caching.
Avoid frequent trips to the database.
Use DB-level paging. Don't retrieve unnecessary data that isn't going to be shown on the current page.
Be careful with Session variables. Usually you should avoid session variables, because each ASP page runs in a different thread and session calls are serialized one by one, which slows down the application. Instead of session variables you can use the QueryString collection or hidden form fields to hold the values.
Select the Release mode before making the final build of your application.
Set debug="false" under compilation: <compilation defaultLanguage="c#" debug="false">
Avoid Inline JavaScript and CSS
Use finally blocks to release resources (not needed when you are already using a using statement).
Avoid exceptions for flow control: use an if condition to check for the error state instead.
Check Page.IsPostBack to avoid re-executing code unnecessarily.
Use a single CSS file instead of multiple CSS files.
Use client-side validation (but you still have to validate on the server side as well).
Turn off tracing unless it is required.
Turn off Session State, if not required.
Disable ViewState when not required.
Try to use StringBuilder instead of string.
It is better to use StringBuilder instead of String when a string is amended repeatedly. Every amendment of a string allocates a new string in a different memory location, whereas a StringBuilder appends into a single buffer (see the sketch at the end of this list).
Never read an object's value directly over and over; first copy the value into a local variable and then use that. Repeated member access takes more time than reading a local variable.
Avoid code like x = x + 1; it is better to use x += 1.
Data access techniques: DataReaders provide a fast and efficient method of data retrieval, and a DataReader is much faster than a DataSet as far as performance is concerned. But it's up to you to balance features against performance.
Use the Repeater control instead of DataGrid or DataList, because it is efficient, customizable, and programmable.
Reduce cookie size.
Compress CSS, JavaScript and Images.
Use server-side compression software such as Port80's.
Make your page files as light as possible; that is, try to avoid unnecessary markup, e.g. use div elements instead of tables.
Write static messages in a div and make it visible when necessary. This is faster than letting the server set the Text property of your label or div.
Retrieve the necessary data from the database in one go where possible; avoid adding extra database trips. To do this, combine the data fields from different tables and select them together.
Remove blank spaces from your HTML; they add to its size. You can use a regular expression to strip white space; I will post the code for that in a later post.
For ASP.NET 2.0 and higher, use master pages; they will improve your performance.
Use ADO.NET asynchronous calls for ADO.NET methods; ASP.NET 2.0 and higher supports them. If you are using the same procedure or command multiple times, use ADO.NET's Prepare command; it will improve performance.
Do IIS performance tuning as per your requirement.
Disable view state for your controls if possible. If you are using ASP.NET 2.0 or higher, use control state instead of view state where appropriate, or store view state in session or a database by overriding the default methods for storing view state.
Use Ajax in your application wisely; lots of Ajax calls on a single page will also decrease your performance.
Call web services from JavaScript instead of the server side, and use asynchronous calls when invoking a web method on a web service.
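To illustrate the StringBuilder tip above, here is a minimal sketch; the loop bounds and strings are arbitrary placeholders:

using System.Text;

// Concatenating strings in a loop allocates a brand-new string on every iteration.
string slow = "";
for (int i = 0; i < 1000; i++)
{
    slow += i + ",";
}

// A StringBuilder appends into a single growing buffer instead.
StringBuilder sb = new StringBuilder();
for (int i = 0; i < 1000; i++)
{
    sb.Append(i).Append(',');
}
string fast = sb.ToString();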

Best Practices for Passing Data Between Pages

The Problem
In the stack that we re-use between projects, we are putting a little bit too much data in the session for passing data between pages. This was good in theory because it prevents tampering, replay attacks, and so on, but it creates as many problems as it solves.
Session loss itself is an issue, although it's mostly handled by implementing Session State Server (or by using SQL Server). More importantly, it's tricky to make the back button work correctly, and it's also extra work to create a situation where a user can, say, open the same screen in three tabs to work on different records.
And that's just the tip of the iceberg.
There are workarounds for most of these issues, but as I grind away, all this friction gives me the feeling that passing data between pages using session is the wrong direction.
What I really want to do here is come up with a best practice that my shop can use all the time for passing data between pages, and then, for new apps, replace key parts of our stack that currently rely on Session.
It would also be nice if the final solution did not result in mountains of boilerplate plumbing code.
Proposed Solutions
Session
As mentioned above, leaning heavily on Session seems like a good idea, but it breaks the back button and causes some other problems.
There may be ways to get around all the problems, but it seems like a lot of extra work.
One thing that's very nice about using session is the fact that tampering is just not an issue. Compared to passing everything via the unencrypted QueryString, you end up writing much less guard code.
Cross-Page Posting
In truth I've barely considered this option. I have a problem with how tightly coupled it makes the pages -- if I start doing PreviousPage.FindControl("SomeTextBox"), that seems like a maintenance problem if I ever want to get to this page from another page that maybe does not have a control called SomeTextBox.
It seems limited in other ways as well. Maybe I want to get to the page via a link, for instance.
QueryString
I'm currently leaning towards this strategy, like in the olden days. But I probably want my QueryString to be encrypted to make it harder to tamper with, and I would like to handle the problem of replay attacks as well.
There's an article about this on 4 Guys from Rolla.
However, it should be possible to create an HttpModule that takes care of all this and removes all the encryption sausage-making from the page. Sure enough, Mads Kristensen has an article where he released one. However, the comments make it sound like it has problems with extremely common scenarios.
Other Options
Of course this is not an exhaustive look at the options, but rather the main options I'm considering. This link contains a more complete list. The ones I didn't mention, such as Cookies and the Cache, are not appropriate for the purpose of passing data between pages.
In Closing...
So, how are you handling the problem of passing data between pages? What hidden gotchas did you have to work around, and are there any pre-existing tools around this that solve them all flawlessly? Do you feel like you've got a solution that you're completely happy with?
Thanks in advance!
Update: Just in case I'm not being clear enough, by 'passing data between pages' I'm talking about, for instance, passing a CustomerID key from a CustomerSearch.aspx page to Customers.aspx, where the Customer will be opened and editing can occur.
First, the problems you are dealing with relate to handling state in a stateless environment. The struggles you are having are not new, and this is probably one of the things that makes web development harder than Windows development or the development of an executable.
With respect to web development, you have five choices, as far as I'm aware, for handling user-specific state, all of which can be used in combination with each other. You will find that no one solution works for everything; instead, you need to determine when to use each solution:
Query string - Query strings are good for passing pointers to data (e.g. primary key values) or state values. Query strings by themselves should not be assumed to be secure, even if encrypted, because of replay attacks. In addition, some browsers have a limit on the length of the URL. However, query strings have some advantages: they can be bookmarked and emailed to people, and they are inherently stateless if not used with anything else.
Cookies - Cookies are good for storing very tiny amounts of information for a particular user. The problem is that cookies also have a size limitation, after which the data is simply truncated, so you have to be careful about putting custom data in a cookie. In addition, users can delete cookies or block their use (although that would prevent use of standard Session as well). Similar to query strings, cookies are better, IMO, for pointers to data than for the data itself unless the data is tiny.
Form data - Form data can carry quite a bit of information, though at the cost of post times and, in some cases, reload times. ASP.NET's ViewState uses hidden form variables to maintain information. Passing data between pages using something like ViewState has the advantage of working more nicely with the back button, but it can easily create ginormous pages which slow down the experience for the user. In general, the ASP.NET model is not built around cross-page posting (although it is possible) but around posting back to the same page and from there navigating to the next page.
Session - Session is good for information that relates to a process with which the user is progressing or for general settings. You can store quite a bit of information into session at the cost of server memory or load times from the databases. Conceptually, Session works by loading the entire wad of data for the user all at once either from memory or from a state server. That means that if you have a very large set of data you probably do not want to put it into session. Session can create some back button problems which must be weighed against what the user is actually trying to accomplish. In general you will find that the back button can be the bane of the web developer.
Database - The last solution (which again can be used in combination with others) is that you store the information in the database in its appropriate schema with a column that indicates the state of the item. For example, if you were handling the creation of an order, you could store the order in the Order table with a "state" column that determines whether it was a real order or not. You would store the order identifier in the query string or session. The web site would continue to write data into the table to update the various parts and child items until eventually the user is able to declare that they are done and the order's state is marked as being a real order. This can complicate reports and queries in that they all need to differentiate "real" items from ones that are in process.
One of the items mentioned in your later link was Application Cache. I wouldn't consider this to be user-specific since it is application wide. (It can obviously be shoe-horned into being user-specific but I wouldn't recommend that either). I've never played with storing data in the HttpContext outside of passing it to a handler or module but I'd be skeptical that it was any different than the above mentioned solutions.
In general, there is no one solution to rule them all. The best approach is to assume on each page that the user could have navigated to that page from anywhere (as opposed to assuming they got there by using a link on another page). If you do that, back button issues become easier to handle (although still a pain). In my development, I use the first four extensively and on occasion resort to the last solution when the need calls for it.
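A minimal sketch of the "state column" idea from the database option above, using plain ADO.NET; the table, column, and connection-string names are hypothetical:

using System.Data.SqlClient;

// When the user finally confirms the order, flip its state from draft to real.
// Everything written up to this point stays in the Orders table as a draft row.
public static void MarkOrderAsReal(string connectionString, int orderId)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "UPDATE Orders SET State = 'Confirmed' WHERE OrderId = @OrderId", conn))
    {
        cmd.Parameters.AddWithValue("@OrderId", orderId);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}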
Alright, so I want to preface my answer with this: Thomas clearly has the most accurate and comprehensive answer so far for people starting fresh. This answer isn't in the same vein at all. My answer is coming from a "business developer's" standpoint. As we all know too well, sometimes it's just not feasible to spend money re-writing something that already exists and "works"... at least not all in one shot. Sometimes it's best to implement a solution which will let you migrate to a better alternative over time.
The only thing I'd say Thomas is missing is client-side JavaScript state. Where I work, we've found customers are coming to expect "Web 2.0"-type applications more and more. We've also found these sorts of applications typically result in much higher user satisfaction. With a little practice, and the help of some really great JavaScript libraries like jQuery (we've even started using GWT and found it to be AWESOME), communicating with JSON-based REST services implemented in WCF can be trivial. This approach also provides a very nice way to start moving towards an SOA-based architecture and a clean separation of UI and business logic.
But I digress.
It sounds to me as though you already have an application, and you've already stretched the limits of ASP.NET's built-in session state management. So... here's my suggestion (assuming you've already tried ASP.NET's out-of-process session management, which scales significantly better than the in-process/on-box session management, and it sounds like you have because you mentioned it): NCache.
NCache provides you with a "drop-in" replacement for ASP.NET's session management options. It's super easy to implement, and could "band-aid" your application more than well enough to get you through - without any significant investment in refactoring your existing codebase immediately.
You can use the extra time and money to start reducing your technical debt by focusing new development on things with immediate business-value - using a new approach (such as any of the alternatives offered in the other answers, or mine).
Just my thoughts.
Several months later, I thought I would update this question with the technique I ended up going with, since it has worked out so well.
After playing with more involved session state handling (which resulted in a lot of broken back buttons and so on) I ended up rolling my own code to handle encrypted QueryStrings. It's been a huge win -- all of my problem scenarios (back button, multiple tabs open at the same time, lost session state, etc) are solved and the complexity is minimal since the usage is very familiar.
This is still not a magic bullet for everything but I think it's good for about 90% of the scenarios you run into.
Details
I built a class called CorePage that inherits from Page. It has methods called SecureRequest and SecureRedirect.
So you might call:
SecureRedirect(String.Format("Orders.aspx?ClientID={0}&OrderID={1}", ClientID, OrderID))
CorePage parses out the QueryString and encrypts it into a QueryString variable called CoreSecure. So the actual request looks like this:
Orders.aspx?CoreSecure=1IHXaPzUCYrdmWPkkkuThEes%2fIs4l6grKaznFGAeDDI%3d
If available, the currently logged in UserID is added to the encryption key, so replay attacks are not as much of a problem.
From there, you can call:
X = SecureRequest("ClientID")
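The author offers to share the actual code further down; in the meantime, here is a minimal sketch of what such a base page might look like. This is purely a reconstruction under assumptions (a symmetric-encryption helper and a per-user key), not the original implementation:

using System.Web;
using System.Web.UI;

// Sketch only: CryptoHelper is a hypothetical symmetric-encryption helper,
// not part of the original (unpublished) implementation.
public class CorePage : Page
{
    protected void SecureRedirect(string url)
    {
        int q = url.IndexOf('?');
        string path = q < 0 ? url : url.Substring(0, q);
        string query = q < 0 ? "" : url.Substring(q + 1);

        // Encrypt the whole query string into a single opaque CoreSecure token.
        string token = HttpUtility.UrlEncode(CryptoHelper.Encrypt(query, EncryptionKey));
        Response.Redirect(path + "?CoreSecure=" + token);
    }

    protected string SecureRequest(string key)
    {
        string token = Request.QueryString["CoreSecure"];
        if (string.IsNullOrEmpty(token))
            return null;

        string query = CryptoHelper.Decrypt(token, EncryptionKey);
        return HttpUtility.ParseQueryString(query)[key];
    }

    // Mixing the logged-in user into the key is what blunts replay attacks.
    private string EncryptionKey
    {
        get { return "app-secret" + (User.Identity.IsAuthenticated ? User.Identity.Name : ""); }
    }
}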
Conclusion
Everything works seamlessly, using familiar syntax.
Over the last several months I've also adapted this code to work with edge cases, such as hyperlinks that trigger a download - sometimes you need to generate a hyperlink on the client that has a secure QueryString. That works really well.
Let me know if you would like to see this code and I will put it up somewhere.
One last thought: it's weird to accept my own answer over some of the very thoughtful posts other people put on here, but this really does seem to be the ultimate answer to my problem. Thanks to everyone who helped get me there.
After going through all the above scenarios and answers, and this link on data passing methods, my final advice would be:
COOKIES for:
ENCRYPT[userId's]
ENCRYPT[productId]
ENCRYPT[xyzIds..]
ENCRYPT[etc..]
DATABASE for:
datasets BY COOKIE ID
datatables BY COOKIE ID
all other large chunks BY COOKIE ID
My advice also depends on the statistics below and the details in this link on data passing methods:
I would never do this. I have never had any issues storing all session data in the database and loading it based on the user's cookie. It's a session as far as anything is concerned, but I maintain control over it. Don't give up control of your session data to your web server...
With a little work, you can support sub sessions, and allow multi-tasking in different tabs/windows.
As a starting point, I find that critical data elements, such as a Customer ID, are best put into the query string for processing. You can easily track/filter bad data coming in on these elements, and it also allows for some integration with e-mail or other related sites/applications.
In a previous application, the only way to view an employee or a request record involving them was to log into the application, do a search for the employee or do a search for recent records to find the record in question. This became problematic and a big time sink when somebody from a related department needed to do a simple view on records for auditing purposes.
In the rewrite, I made both the employee Ids and request Ids available through basic URLs of the form "ViewEmployee.aspx?Id=XXX" and "ViewRequest.aspx?Id=XXX". The application was set up to A) filter out bad Ids and B) authenticate and authorize the user before allowing them onto these pages. What this allowed the primary application users to do was send simple e-mails to the auditors with a URL in the e-mail. When they were in a big hurry, during their bulk processing time, they were able to simply click down a list of URLs and do the appropriate processing.
Other session-related data, such as modification dates and maintaining the "state" of the user's interaction with the application, gets a little more complex, but hopefully this provides a starting point for you.
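A minimal sketch of the filter-and-authorize pattern described above; the page, security-check, and data-binding names are hypothetical:

using System;
using System.Web.UI;

public partial class ViewEmployee : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // A) Filter out bad Ids coming in on the query string.
        int employeeId;
        if (!int.TryParse(Request.QueryString["Id"], out employeeId) || employeeId <= 0)
        {
            Response.Redirect("~/NotFound.aspx");
            return;
        }

        // B) Authenticate/authorize before showing anything (hypothetical check).
        if (!SecurityService.CanViewEmployee(User, employeeId))
        {
            Response.Redirect("~/AccessDenied.aspx");
            return;
        }

        BindEmployee(employeeId); // hypothetical data-binding method
    }
}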

Design Decision - Javascript array or http handler

I'm building a Web Page that allows the user to pick a color and size. Once they have these selected I need to perform a lookup to see if inventory exists or not and update some UI elements based on this.
I was thinking that putting all the single-product data into a multidimensional JavaScript array (there are only 10-50 records for any page instance) and writing some client-side routines around it would be the way to go, for two reasons: one, it keeps the UI fast, and two, it minimizes callbacks to the server. What I'm worried about with this solution is code smell.
As an alternative, I'm thinking about a more AJAX-purist approach of using HTTP handlers and JSON, or perhaps a hybrid with a bit of both. My question is: what are your thoughts on the best solution to this problem using the ASP.NET 2.0 stack?
[Edit]
I also should mention that this page will be running in a SharePoint environment.
Assuming the data is static, I would vote option #1. Storing and retrieving data elements in a JavaScript array is relatively foolproof and entirely within your control. Calling back to the server introduces a lot of possible failure points. Besides, I think keeping the data in-memory within the page will require less code overall and be more readable to anyone with a more than rudimentary understanding of JavaScript.
I'm against Ajax for such tasks, and vote for (and have implemented) the first option.
As far as I understand, you won't create code smells if the JS part is generated by your server side.
From a user's point of view, Ajax is an experience-killer for wireless browsing, since any little glitch or mis-service will fail or simply lengthen the interaction by a factor of 20(!).
I've implemented even more records than yours on my site, and the users love it. Since some of my users use internet cafés or dubious hotel wifi, it wouldn't work otherwise.
Besides, Ajax makes your server-vs-client interaction code much more complex, IMO, which is the trickiest part of web programming.
I would go with your second option by far. As long as the AJAX call isn't performing a long running process for this case, it should be pretty fast.
The application I work on does lots with AJAX and HttpHandler, and our calls execute fast. Just ensure you are minimizing the size of your JSON returned in the response.
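A minimal sketch of what such a handler might look like on the ASP.NET 2.0 stack; the inventory-service call, query-string parameters, and JSON shape are all assumptions for illustration:

using System.Web;

// InventoryHandler.ashx.cs - returns whether a given color/size combination is in stock.
public class InventoryHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string color = context.Request.QueryString["color"];
        string size = context.Request.QueryString["size"];

        // Hypothetical data-access call.
        bool inStock = InventoryService.IsInStock(color, size);

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"inStock\": " + (inStock ? "true" : "false") + "}");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}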
Go with your second option. If there are that few items involved, the AJAX call should perform fairly well. You'll keep your code off the client side, hopefully prevent any browser based issues that the client side scripting might have caused, and have a cleaner application.
EDIT
Also consider that client-side script can be modified by the user. If there's no other validation occurring on the user's selection, this could allow them to configure a product that is out of stock.

ajaxified auto suggest

I have a search module with an Auto Suggest feature to build in ASP.NET.
The search criterion is Training Name, and there is a table in the database that stores trainings. There could be as many as 30,000 trainings in the table, so I have to be very careful in selecting the approach, keeping performance in mind.
There could be about 3000 users logged into the system simultaneously. When a user starts typing a training name, the system should auto-suggest.
The approaches that came to my mind were as follows:
Cache object - There would be a database hit after the user types 3 characters (e.g. "saf"), and the system would search the activity table for all trainings starting with "saf" and cache them. Subsequent requests would go through this cache.
But the problem with this approach is that if there are 3000 concurrent users and they all search for different combinations of 3 letters, the cache would just blow up.
Client-side caching - I did not think much about this. The only drawback I see here is that we might have to purge the temporary internet folder periodically.
Using Session - I thought to rule this out completely, as I thought it would hurt performance.
Can you please suggest the best, or any other, approach I can take here? I am looking for any information/ideas you have on this.
Thank you so much
Deepa.
My favourite jQuery plug-in for doing that (if you intend to use jQuery) is Flexbox.
It has a really impressive list of features.
You could use the jQuery Auto Complete plugin, which has caching features built in.
$(document).ready(function()
{
    $(".landingpage").autocomplete('/AutoSuggestHandler.ashx',
    {
        minChars: 1,
        matchSubset: 1,
        autoFill: false,
        delay: 10,
        scroll: false
    }).result(OnResultSelected);
});
Furthermore, you could specify output caching on the generic handler to accommodate the need for caching across users.
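A minimal sketch of what the generic handler referenced above might look like; the "q" parameter name, the one-suggestion-per-line plain-text response, the repository call, and the cache duration are all assumptions for illustration:

using System;
using System.Web;

// AutoSuggestHandler.ashx.cs
public class AutoSuggestHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Assumption: the plugin sends the typed text in the "q" parameter.
        string prefix = context.Request.QueryString["q"] ?? "";

        // Let downstream caches share responses for the same prefix across users.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(5));
        context.Response.Cache.VaryByParams["q"] = true;

        context.Response.ContentType = "text/plain";
        // Hypothetical lookup; one suggestion per line.
        foreach (string training in TrainingRepository.FindByPrefix(prefix, 10))
        {
            context.Response.Write(training + Environment.NewLine);
        }
    }

    public bool IsReusable
    {
        get { return true; }
    }
}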
I think your first approach will work.
Make sure there is an index on the field - you probably won't need to index the whole field. This should give the database a decent boost. You may need to look at full-text indexing depending on how your search works, or even use an external library like Lucene for the index if performance is an issue.
Cache the object, or even the resulting XML/JSON from the queries, to improve performance.
You should also set the HTTP headers so that browsers cache the XML/JSON as well.
Your posting really contains two questions:
How can I get autocomplete on my webpage?
I am concerned about performance due to a large number of queries hitting my database at the same time.
My answers...
1: We've found the ASP.NET AJAX AutoComplete Extender works well on all modern browsers, provides a slick user experience and is pretty easy to implement.
In your web application you need to create a web service that has a method with a specific signature (covered in the documentation linked to above).
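From memory, the expected shape of that web service is roughly the following; the service name and the repository call are placeholders, and the exact attributes and signature are covered in the extender's documentation linked above:

using System.Web.Services;
using System.Web.Script.Services;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]
public class TrainingService : WebService
{
    // The extender calls a method that takes the typed prefix and a count,
    // and returns the matching suggestions as a string array.
    [WebMethod]
    [ScriptMethod]
    public string[] GetCompletionList(string prefixText, int count)
    {
        // Hypothetical lookup against the trainings table.
        return TrainingRepository.FindByPrefix(prefixText, count);
    }
}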
2: Have you proven that you actually have a performance bottleneck with this part of your project? I'd recommend setting up a test harness and hitting your database with a large number of autocomplete queries to see how much it can take. Be wary of premature optimization.
