Hi
I want to improve the performance of my ASP.NET application when many users, about 5,000, access it concurrently.
Can we do this?
Your ASP.NET application's performance depends on many things, and you can improve it in many ways. Your question is very subjective, so the answer is really a set of best practices for improving ASP.NET application performance.
I have gathered some tips from the net. Unfortunately, I cannot remember where. Search on any item and you will find many resources that can help you implement it:
Use Cache:
Page output caching.
Page fragment caching.
Data caching.
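As a rough illustration of data caching using the cache-aside pattern (ProductList and ProductRepository are hypothetical placeholders, not any particular API):

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCache
    {
        // Returns the product list from the cache, loading it from the database
        // only when the cached copy is missing or has expired.
        public static ProductList GetProducts()
        {
            const string key = "AllProducts";
            ProductList products = HttpRuntime.Cache[key] as ProductList;
            if (products == null)
            {
                products = ProductRepository.LoadAll();    // hypothetical data access
                HttpRuntime.Cache.Insert(
                    key,
                    products,
                    null,                                   // no cache dependency
                    DateTime.UtcNow.AddMinutes(5),          // absolute expiration
                    Cache.NoSlidingExpiration);
            }
            return products;
        }
    }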
Avoid frequent trips to database.
Use DB-level paging. Don't retrieve data that isn't going to be shown on the current page.
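A sketch of DB-level paging against SQL Server 2005+ using ROW_NUMBER(); the Products table, its columns, and the "Shop" connection string name are assumptions:

    using System.Data;
    using System.Data.SqlClient;

    public static class ProductPaging
    {
        // Only one page of rows ever crosses the wire from the database.
        public static DataTable GetProductPage(int pageIndex, int pageSize)
        {
            const string sql = @"
                SELECT Id, Name, Price FROM (
                    SELECT Id, Name, Price,
                           ROW_NUMBER() OVER (ORDER BY Name) AS RowNum
                    FROM Products) AS Paged
                WHERE RowNum BETWEEN @First AND @Last";

            string connectionString = System.Configuration.ConfigurationManager
                .ConnectionStrings["Shop"].ConnectionString;

            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
                cmd.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);

                DataTable table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table);
                return table;
            }
        }
    }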
Be careful with Session variables. Requests from the same session that need read/write session state are serialized and processed one at a time, so heavy use of session can slow the application down. Instead of session variables you can often use the QueryString collection or hidden form fields to hold the values.
Select the Release mode before making the final Build for your application.
Set debug="false" under the compilation element: <compilation defaultLanguage="c#" debug="false">
Avoid Inline JavaScript and CSS
Use finally blocks to release resources (not needed when a using statement already disposes of them).
Avoid exceptions for control flow: check the condition with an if statement instead of letting an exception be thrown.
Check Page.IsPostBack to avoid re-executing code unnecessarily on postbacks.
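For example (BindCategories is a placeholder for whatever expensive binding the page does on first load):

    using System;

    public partial class ProductPage : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Only do the expensive data binding on the first request;
            // on postbacks the control contents are restored from view state.
            if (!Page.IsPostBack)
            {
                BindCategories();
            }
        }

        private void BindCategories()
        {
            // e.g. CategoryList.DataSource = ...; CategoryList.DataBind();
        }
    }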
Use a single CSS file instead of multiple CSS files.
Use client-side validation (but always validate on the server side as well).
Turn off tracing unless it is required.
Turn off Session State, if not required.
Disable ViewState when not required.
Try to use StringBuilder instead of string concatenation.
It is better to use StringBuilder when a string is amended repeatedly. Each amendment of a string allocates a new memory location, whereas a StringBuilder keeps appending into a single buffer.
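For example, building a large string in a loop:

    using System.Text;

    public static class CsvHelper
    {
        // String concatenation would allocate a new string on every pass;
        // StringBuilder appends into a single growing buffer instead.
        public static string Join(string[] values)
        {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < values.Length; i++)
            {
                if (i > 0)
                {
                    sb.Append(',');
                }
                sb.Append(values[i]);
            }
            return sb.ToString();
        }
    }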
Never read a property or object value repeatedly inside a loop; read it into a local variable first and then use the variable. Repeated member access takes more time than reading a local variable.
Avoid using code like x = x +1; it is always better to use x+=1.
Data access techniques: DataReaders provide a fast and efficient method of data retrieval. A DataReader is much faster than a DataSet as far as performance is concerned, but you have to decide how to balance features against performance.
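A typical DataReader read loop looks something like this (the query is just an example):

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class ProductQueries
    {
        // Forward-only, read-only retrieval: no DataSet/DataTable overhead.
        public static List<string> GetProductNames(string connectionString)
        {
            List<string> names = new List<string>();
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand("SELECT Name FROM Products", conn))
            {
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        names.Add(reader.GetString(0));   // read by ordinal
                    }
                }
            }
            return names;
        }
    }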
Use the Repeater control instead of DataGrid or DataList, because it is efficient, customizable, and programmable.
Reduce cookie size.
Compress CSS, JavaScript and Images.
Use server-side compression software such as Port80's.
Make your page files as light as possible; avoid unnecessary markup, e.g. use div elements instead of tables.
Write static messages in a div and show it only when necessary. This is faster than having the server set the Text property of a Label or the contents of a div.
Retrieve all the data you need from the database at once, if possible. Avoid extra database trips as far as possible; for example, combine the data fields from different tables and select them in a single query.
Remove blank spaces from your HTML; they increase the page size (KB). You can use a regular expression to remove white space; a rough sketch follows below.
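A rough sketch of the idea (render the page into a buffer, then collapse the whitespace between tags); note that a naive pattern like this will also touch the contents of <pre> and <textarea> elements:

    using System.IO;
    using System.Text.RegularExpressions;
    using System.Web.UI;

    public class WhitespaceStrippingPage : Page
    {
        // Renders the page into a buffer, removes the whitespace between tags,
        // then writes the smaller result to the real output stream.
        protected override void Render(HtmlTextWriter writer)
        {
            StringWriter buffer = new StringWriter();
            base.Render(new HtmlTextWriter(buffer));

            string html = Regex.Replace(buffer.ToString(), @">\s+<", "><");
            writer.Write(html);
        }
    }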
For ASP.NET 2.0 and higher, use master pages. They will improve your performance.
Use asynchronous ADO.NET calls where possible (supported in ASP.NET 2.0 and later). If you execute the same procedure or command multiple times, use ADO.NET's Prepare method; it will improve performance.
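For the Prepare part, a sketch (the table and column names are assumptions):

    using System.Data;
    using System.Data.SqlClient;

    public static class VisitLogger
    {
        // One command, prepared once, executed many times with different parameter values.
        public static void InsertVisits(string connectionString, int[] productIds)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO ProductVisits (ProductId) VALUES (@ProductId)", conn))
            {
                conn.Open();
                cmd.Parameters.Add("@ProductId", SqlDbType.Int);
                cmd.Prepare();

                foreach (int id in productIds)
                {
                    cmd.Parameters["@ProductId"].Value = id;
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }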
Do IIS performance tuning as per your requirement.
Disable view state for your controls where possible. If you are using ASP.NET 2.0 or higher, use control state instead of view state where appropriate, or store view state in Session or a database by overriding the default methods for persisting it.
Use Ajax in your application wisely. Lots of Ajax calls on a single page will also decrease your performance.
Call web services from JavaScript instead of from server-side code, and use asynchronous calls to invoke a web method on a web service.
So I've been looking into effective ways to take the load off of the database in my ASP.NET application, and I've run into System.Web.Mvc.OutputCacheAttribute. I've used caching based on System.Web.HttpRuntime.Cache before, and it seems to be pretty much functionally equivalent.
I've done a lot of research on it, and everything I've seen portrays it as some sort of silver bullet for caching requests as long as you configure it effectively. I find that hard to believe. I understand that all it really takes (to a degree) for some effective caching is storing the output data based on certain conditions, but it still seems too easy to just tack on an attribute and have your application magically perform better.
Has anyone had any experience with the benefits/drawbacks of using Output Caching in ASP.NET? If so, what are the pain points of using this approach to caching?
Caching can do wonders, by trading latency for memory. The devil's in the "configuring it effectively."
The important thing is to nail down for yourself what is acceptable behavior in the application, e.g., is it ok if the "top 3 posts" on the front page is up to 1 minute old? Is it ok if the "current users online" list is up to 30 seconds old? Is it ok if the main page takes 0.75 seconds to load, or does it need to be faster? Your answers to these questions will determine what should or should not be cached. Profile your application so you understand where the real performance bottlenecks are, and why they exist, so you know where to focus your optimization/caching efforts.
There are many forms of caching available in a .Net application. OutputCache is just one form:
Application-Level Caching (shared by everything in the application - Application[Key])
Object Caching (automatically managed with cache invalidation callbacks - Cache[Key])
Output Caching (caching the generated output of aspx pages/parts - OutputCache property)
Per-Request Caching (caching calculated data during a single request - Context[key])
Session Caching (caching data specific to a user's session - Session[key])
They all have their pros and cons, and a well-designed application will probably make use of most or all of these forms of caching. If you want some points to consider with OutputCache, here are a few:
Try to cache parts of a page rather than a full page, because they are more likely to be re-usable. Building your pages out of components like a UserControl can help here.
Be careful with using a set of parameters that vary greatly, such as a QueryString parameter that is different per item id, because you will end up generating a lot of cached copies that are used infrequently, consuming lots of memory with very little benefit.
Note that OutputCache is merely saving the generated output of the ASPX markup. So it will not work as well as other caching types in a dynamic page that changes form based on user input.
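As a small illustration of the VaryByParam point (the controller, action, and repository are hypothetical):

    using System.Web.Mvc;

    public class ProductsController : Controller
    {
        // One cached copy per distinct id, kept for 60 seconds. If id has
        // thousands of distinct values, expect thousands of cached copies,
        // which is exactly the memory trap described above.
        [OutputCache(Duration = 60, VaryByParam = "id")]
        public ActionResult Details(int id)
        {
            var product = ProductRepository.Find(id);   // hypothetical data access
            return View(product);
        }
    }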
From my experience, there is one very obvious and very often forgotten thing about this attribute.
It is the fact that the action method whose output is cached won't even be executed once a cached copy exists. So, if the code behind the action has side effects, they won't take place (e.g. logging to the database the fact that a user visited the page).
I have seen at least a few very nasty bugs because of that.
Short advice: use it sparingly and be 101% sure that every dev on the team knows very well how it works.
It is possible to remove IDs from pages being rendered in .NET. This can be done by simply setting the ID of an element that has runat="server" to null. Obviously this shouldn't be done for controls that have to be evaluated or used in postback scenarios. I am very curious how widely removing IDs will be used by people who know this can be done. I know that by removing ID values you can save some bandwidth, but what would be a good reason to start using this method?
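For illustration, this is the kind of thing being described (staticBanner is a hypothetical, purely presentational <div id="staticBanner" runat="server"> that is never used on postback):

    using System;
    using System.Web.UI.HtmlControls;

    public class ProductPage : System.Web.UI.Page
    {
        // Corresponds to <div id="staticBanner" runat="server"> in the markup.
        protected HtmlGenericControl staticBanner;

        protected void Page_PreRender(object sender, EventArgs e)
        {
            // Clearing the ID stops the id attribute from being rendered,
            // shaving a few bytes off the generated HTML.
            staticBanner.ID = null;
        }
    }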
If you're really concerned with performance I'd perhaps worry less about the verboseness of the .NET control IDs (which is a real pet dislike of mine) and worry more about the overall postback model.
The whole "send the state to the client so that it can post it back to the server" model is woefully inefficient in both latency and bandwidth terms.
If it's a new project it's probably worth using MVC instead, or if it's an existing one, try turning on page compression in IIS.
I am trying to find a specific key within the current Cache.
The problem is, my keys in the cache are composite, and I would like to run something like a LINQ Where expression over them.
Is this possible? If so, how? Does it reduce performance on the server?
Thanks
The whole idea behind a key is that it enables direct lookup of the item. If you have to scan all the items in the cache to find what you're looking for that's not going to perform very well at all. If you're using AppFabric Caching you can "tag" similar items with the same tag and then pull back all the items from the cache with that "tag" with a single call, but there is no such concept in the built in standard ASP.NET caching classes.
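For what it's worth, the built-in Cache is enumerable as DictionaryEntry items, so a LINQ Where over the keys is technically possible; just be aware it scans every cached item on each call. A rough sketch (the "product:" key prefix is just an example convention):

    using System.Collections;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;

    public static class CacheSearch
    {
        // Scans the whole cache and returns the values whose keys match the prefix.
        // This is O(n) in the number of cached items, so use it sparingly.
        public static List<object> FindByKeyPrefix(string prefix)
        {
            return HttpRuntime.Cache
                .Cast<DictionaryEntry>()
                .Where(entry => entry.Key is string && ((string)entry.Key).StartsWith(prefix))
                .Select(entry => entry.Value)
                .ToList();
        }
    }

Usage would be something like var products = CacheSearch.FindByKeyPrefix("product:");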
This is quite a lengthy post, so bear with me. I'm not sure whether it is primarily about ASP.NET Session State behaviour, NInject, application design, or refactoring. Read on and then you can decide... :-)
Background
First, a bit of background. We are trying to refactor a large webshop into a more maintainable, structured design. The webshop is currently running on .NET 3.5, but the design is more of a hangover from the classic ASP days. Obviously we cannot tackle everything in one go, so many of the features/technologies/approaches have to be taken as a given. With that in mind...
The app maintains everything to do with the current session (user profile, cart, session choices, etc.) in a context object which is simply a large XML document that gets serialized to and deserialized from the Session as a string. The XML format is also important because the rendering is done via XSLT.
This has led to a number of problems:
It's a kind of God object with far too many concerns.
It's loosely typed and relies too much on XML manipulation / XPath.
There is no standard way / pattern for retrieving the session XML document or for writing it back. We have a horrible mixture of methods that take the document in as a parameter, modify it and return it, methods that retrieve it themselves, modify it and save it back to session, etc. This has led to a lot of hard-to-trace bugs, over-use of serializing/deserializing from the Session, and so on.
Our Solution
What we have done is introduce a strongly-typed wrapper around the XML document, which breaks it up into different concerns and manages its lifecycle transparently to the rest of the app.
What we are aiming for is the following workflow:
At the beginning of the request, we populate the session document from the XML string stored in the session.
The rest of the app interacts with it only through the strongly typed wrapper. The whole app uses the same instance and does not have to worry about when to retrieve or save the state back to session.
At the end of the request, the underlying xml document is serialized back to the Session.
Since we are using NInject(v1) as the IOC of choice, we decided to use this to manage the lifecycle of our context object. The context object was wrapped with the OnePerRequest attribute and the dispose method was hooked up to a method that would save the xml document back to Session as a string.
It doesn't work...
We soon hit a problem: the NInject OnePerRequest module didn't appear to have access to session state. The first thing we tried was a hack where we kept the Session object in a variable to make sure we could still write to it. This appeared to work on a development machine, but it became obvious it didn't once we moved to out-of-process session state.
It still doesn't work...
We tried inheriting from the OnePerRequest behaviour/module and adding the IRequiresSessionState marker interface (OnePerRequestRequiresSessionState). However, this was not enough, because the method NInject uses to release references and clean up is hooked to the EndRequest event. Session is available in EndRequest, but it has already been serialized to the out-of-process state server, so changing something at that point is not reflected when the session string is retrieved at the beginning of the next request.
We then decided to change the event to hook into. We ditched EndRequest and hooked our OnePerRequestRequiresSessionState "release all" method to the PostRequestHandlerExecute event, which fires BEFORE the session data gets serialized out of process.
It works... then it doesn't...
This seemed to work, on a single server and on a web farm. Then we noticed weird behaviour. There seemed to be two different versions of the context, and you would randomly switch between them. Add something to the cart and it's not there; browse to another product and the previous product would show up in the cart.
After some tracing, we discovered the culprit: Response.Redirect. Sprinkled throughout the site in literally hundreds of places is Response.Redirect(url);. With this version of the redirect, the execution of the page is stopped immediately. This means that PostRequestHandlerExecute is not fired and the current version of the Context object is not thrown away by NInject... and everything falls apart. New versions are not created properly, etc. EndRequest is fired which is why the normal NInject OnePerRequest module works fine with it, just not our bastardized version that tries to use session state.
Of course, there is an override to Response.Redirect where you can pass a boolean value in to tell it whether to terminate the existing page or continue to execute - Response.Redirect(url,false). Continuing obviously fires our event and everything works but... it continues to execute the rest of the page! This means executing everything that comes after the call to Redirect and we have absolutely no idea what that means (since the existing site expects it to stop).
What next?
So, any suggestions on what to do? So far we've discussed:
Abstracting our redirect behaviour and going through a central method that controls the redirect (perhaps hacking out a way to call the PostRequestHandlerExecute event, or maybe a custom Redirect event that our NInject module can also subscribe to and clean up); a rough sketch of what we mean follows below.
Seeing if there is a way we can force the Session object to save in EndRequest if it hasn't been saved previously in PostRequestHandlerExecute, and doing the NInject clean-up in EndRequest.
Removing our dependency on Session completely and using another storage mechanism: DB, document DB, distributed hash table, etc.
Any advice? Suggestions we haven't thought of? Things you've tried that have / haven't worked?
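For the first option, the central redirect helper might look roughly like this sketch; ContextStore.SaveCurrent() is a hypothetical hook that persists the XML context (it could just as well raise a custom event that the NInject module subscribes to for its clean-up):

    using System.Web;

    public static class Redirector
    {
        public static void Redirect(string url)
        {
            // Hypothetical hook: persist the strongly typed context back to Session
            // (or raise an event the NInject module listens to) before the redirect.
            ContextStore.SaveCurrent();

            // Ends the response immediately, as the existing site expects;
            // PostRequestHandlerExecute will not fire, but the state is already saved.
            HttpContext.Current.Response.Redirect(url, true);
        }
    }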
I think you're on the right track. Here are some thoughts I had:
In addition to the strongly typed wrapper you have, I'd suggest a facade for accessing the context object that returns your wrapper, something like an IContextProvider. That way you can introduce it piecemeal, and then when it's fully integrated, you can refactor the provider without breaking the things that use it. I can't tell, but you might have already done this. It'll also be easier to change your persistence mechanism if you choose to. Once you have all the dependencies isolated from the context object, I would suggest changing it to not persist as XML; session state will store a binary object much faster, and you can always serialize to XML when you need to do transforms.
I don't think that Ninject is the correct mechanism for what you're trying to do. It's difficult to signal the end of the request in Ninject, since garbage collection can't be depended on. Have you considered using an IHttpModule instead? You can use AcquireRequestState and ReleaseRequestState, or EndRequest, to handle getting/setting the context in Session, and only allow the app to get to the context object through the facade.
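A minimal sketch of that shape, assuming a SessionXmlContext wrapper with Load/Save methods (placeholder names for the strongly typed wrapper described above); I've used PostAcquireRequestState and PostRequestHandlerExecute so the code runs after session state is loaded and before it is serialized back out of process:

    using System;
    using System.Web;

    public class ContextLifecycleModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            // Session state has been loaded by this point, so the XML string is available.
            app.PostAcquireRequestState += delegate
            {
                SessionXmlContext context = SessionXmlContext.Load(app.Context.Session);   // placeholder
                app.Context.Items["XmlContext"] = context;                                 // per-request storage
            };

            // Fires before out-of-process session state is serialized back.
            app.PostRequestHandlerExecute += delegate
            {
                SessionXmlContext context = app.Context.Items["XmlContext"] as SessionXmlContext;
                if (context != null)
                {
                    context.Save(app.Context.Session);   // placeholder: write the XML string back
                }
            };
        }

        public void Dispose() { }
    }

Note that the Response.Redirect caveat from the question still applies: if the response is ended early, PostRequestHandlerExecute is skipped, so a central redirect method is still worth having.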
If you're on a web farm, you're probably using a database for your Session storage anyway, so putting your context into a DB won't be much different.
Firstly, while it's good to demonstrate you've put in the work (and I and others may not have replied if it wasn't clear how much you're interested in a resolution)... that's a massive wall of text! Here's a +1 on your way towards a bounty for a complete response that talks about the Ninject ASP.NET extensions and how they apply to each individual element of your issue. Having said that, hopefully someone will come along with a real resolution for you.
Even though it's [very] 2.0 specific, Nate's Cache and Collect Post is required reading. While it seems you're pretty au fait with the tradeoffs involved and have debugged deep in, the article is well worth a few reads.
I'd also consider moving to V2 of Ninject - a lot of this stuff has been revised significantly. It's not magically going to work, but represents a mature rewrite based on a lot of learning from V1. Have you read the (V1 or) V2 unit tests for Ninject? They'll show you the low level tools at your disposal in order to realise your goals.
Bottom line for me is that you need to work out a strategy for your state management independent of DI, and then by all means use the container/DI system as a part of the implementation.
I'm building a Web Page that allows the user to pick a color and size. Once they have these selected I need to perform a lookup to see if inventory exists or not and update some UI elements based on this.
I was thinking that putting all the single product's data into a multidimensional JavaScript array (there are only 10-50 records for any page instance) and writing some client-side routines around that would be the way to go, for two reasons: it keeps the UI fast, and it minimizes callbacks to the server. What I'm worried about with this solution is code smell.
As an alternative, I'm thinking about a more AJAX-purist approach using HTTP handlers and JSON, or perhaps a hybrid with a bit of both. My question is: what are your thoughts on the best solution to this problem using the ASP.NET 2.0 stack?
[Edit]
I also should mention that this page will be running in a SharePoint environment.
Assuming the data is static, I would vote option #1. Storing and retrieving data elements in a JavaScript array is relatively foolproof and entirely within your control. Calling back to the server introduces a lot of possible failure points. Besides, I think keeping the data in-memory within the page will require less code overall and be more readable to anyone with a more than rudimentary understanding of JavaScript.
I'm against Ajax for such tasks, and vote for (and have implemented) the first option.
As far as I understand, you won't create code smells if the JS part is generated by your server side.
From a user's point of view, Ajax is an experience-killer for wireless browsing, since any little glitch or failed request will either break the interaction or lengthen it by a factor of 20(!).
I've implemented even more records than yours on my site, and the users love it. Since some of my users are in internet cafés or on dubious hotel wifi, it wouldn't work otherwise.
Besides, Ajax makes your server-vs-client interaction code much more complex, IMO, which is the trickiest part in web programming.
I would go with your second option by far. As long as the AJAX call isn't performing a long running process for this case, it should be pretty fast.
The application I work on does lots with AJAX and HttpHandler, and our calls execute fast. Just ensure you are minimizing the size of your JSON returned in the response.
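For reference, a bare-bones handler for the lookup might look roughly like this; Inventory.Lookup is a placeholder, and JavaScriptSerializer assumes .NET 3.5 is available (on a pure 2.0 stack you'd build the JSON string by hand and use a named DTO instead of the anonymous type):

    using System.Web;
    using System.Web.Script.Serialization;

    // Minimal AJAX endpoint, e.g. /InventoryLookup.ashx?color=red&size=M
    public class InventoryLookupHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string color = context.Request.QueryString["color"];
            string size = context.Request.QueryString["size"];

            var result = new
            {
                color = color,
                size = size,
                inStock = Inventory.Lookup(color, size)   // placeholder for the real inventory query
            };

            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(result));
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }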
Go with your second option. If there are that few items involved, the AJAX call should perform fairly well. You'll keep your code off the client side, hopefully prevent any browser based issues that the client side scripting might have caused, and have a cleaner application.
EDIT
Also consider that client-side script can be modified by the user. If no other validation is occurring on the user's selection, this could allow them to configure a product that is out of stock.