Convert HttpContext current cache to LINQ - asp.net

I am trying to find a specific key within the current Cache.
The problem is that my keys in the cache are composite, and I would like to run something like a LINQ Where expression over them.
Is this possible? If so, how? And does it reduce performance on the server?
Thanks

The whole idea behind a key is that it enables direct lookup of the item. If you have to scan all the items in the cache to find what you're looking for that's not going to perform very well at all. If you're using AppFabric Caching you can "tag" similar items with the same tag and then pull back all the items from the cache with that "tag" with a single call, but there is no such concept in the built in standard ASP.NET caching classes.
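That said, the built-in Cache does implement IEnumerable, so a LINQ Where over the keys is possible if you accept that cost. A minimal sketch, assuming your composite keys are strings that share a searchable prefix (CacheSearch and FindByKeyPrefix are hypothetical names, not part of any API):

using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Web;

public static class CacheSearch
{
    // Enumerating the Cache yields DictionaryEntry items, so LINQ's Where
    // can filter on the composite string key. Note this walks every cached
    // item -- an O(n) scan, unlike a direct Cache["key"] lookup.
    public static List<object> FindByKeyPrefix(string prefix)
    {
        return HttpContext.Current.Cache
            .Cast<DictionaryEntry>()
            .Where(e => ((string)e.Key).StartsWith(prefix))
            .Select(e => e.Value)
            .ToList();
    }
}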


Querying the Cache using Linq in asp.net

Search is the most used feature on our website, and the search query is the most CPU-intensive, complex and frequent query that executes on our db, causing heavy CPU usage on the db server. To reduce the load on the db we have been looking at various caching strategies. For now, we intend to use the ASP.NET Cache.
The idea is to have an in-memory db of the most frequently/recently created/accessed objects in the cache and then query that in-memory db using LINQ to come up with search results. My initial thought was to cache a List of the Users and then query or modify this List using LINQ. But given the complexities of multiple threads accessing or trying to modify a List, I was looking at other options.
Which is when I thought that instead of caching a List, I could cache the individual User objects with their Id as the key, and try to query the Cache. At http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx I see that the Cache has an extension method AsQueryable, but I am not sure what this means. The Cache is a key-value pair, so with AsQueryable will I be able to query the keys and get a set of User objects, or will I be able to query the User objects and get my desired result?
Before you start this you really need to have some measurability in place around it -- there is no way to figure out if your changes help or hurt without having some good, solid data to make that judgement on. Performance, especially performance at scale, isn't something you can think or guess your way through. You have to know your way through it.
As for your solution, I think you might well make the problem worse, or at least create another problem here. Your database server is theoretically designed to handle arbitrary user queries across vast information sets efficiently. LINQ is awesome, but it is not really meant to be an ad-hoc search engine -- it doesn't have the sorts of indexing capabilities one really expects from search engines. Just because something can expose itself as an IQueryable doesn't mean you should treat it that way. And even if you've got a way to efficiently search the cache, you've got other problems to get past -- how do you identify what is most frequently used? And how do you manage the ASP.NET cache so it doesn't start ejecting things when it gets low on memory?
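To make the key/value point concrete: the Cache enumerates as DictionaryEntry pairs, so even via AsQueryable you query entries, not User objects, and you must project the values out yourself. A minimal sketch, assuming User objects were cached under their Id (the class names and search criterion here are illustrative, not from the posts):

using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Web;

public class User { public int Id; public string Name; }   // stub for illustration

public static class CachedUserQuery
{
    // Enumerating the Cache yields DictionaryEntry pairs, not Users, so the
    // values have to be projected out and filtered -- a full O(n) scan per
    // search, with none of the indexing a database would give you.
    public static List<User> FindByName(string term)
    {
        return HttpContext.Current.Cache
            .Cast<DictionaryEntry>()
            .Select(e => e.Value)
            .OfType<User>()
            .Where(u => u.Name.Contains(term))
            .ToList();
    }
}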
You would probably be better served here by:
Starting with some good old-fashioned database tuning -- why are your queries so slow and expensive? Are you missing an index somewhere?
Looking at caching the results page output, especially if your search URLs are GET-able, as that is pretty easy to manage. This is a great short-term solution if the site is melting.
Looking at building the search bits properly. Using LIKE %whatever% is not a proper search. Full-text indexes in your database are a good start; something like lucene.net is probably better.
No, I could not use AsQueryable to query the User objects and get the desired result I was looking for. So for now I will be using a static List, though I know I will have to change that sooner rather than later.
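For what it's worth, the multi-threading complexity that pushed the question away from a List can be contained with a reader/writer lock. A minimal sketch of a thread-safe static list, reusing the illustrative User shape from the sketch above (UserSearchCache and its method names are assumptions, not from the original posts):

using System.Collections.Generic;
using System.Linq;
using System.Threading;

public static class UserSearchCache
{
    private static readonly List<User> _users = new List<User>();
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Writes are serialized while many concurrent searches can still proceed.
    public static void AddOrReplace(User user)
    {
        _lock.EnterWriteLock();
        try
        {
            _users.RemoveAll(u => u.Id == user.Id);
            _users.Add(user);
        }
        finally { _lock.ExitWriteLock(); }
    }

    public static List<User> Search(string term)
    {
        _lock.EnterReadLock();
        try
        {
            // ToList() materializes the results before the lock is released.
            return _users.Where(u => u.Name.Contains(term)).ToList();
        }
        finally { _lock.ExitReadLock(); }
    }
}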

How to increase the performance of an ASP.NET application?

Hi,
I want to increase the performance of my ASP.NET application when many users access it simultaneously, around 5000 users.
Can we do this?
Your ASP.NET application's performance depends on various things, and you can improve your site's performance in various ways. Your question is very subjective, and of course the answer will be a set of best practices for improving an ASP.NET application's performance.
I have gathered some tips from the net. Unfortunately, I cannot remember where. Search on any item and you will find many resources that can help you implement it:
Use Cache:
Page output caching.
Page fragment caching.
Data caching.
Avoid frequent trips to database.
Use DB-level paging. Don't retrieve unnecessary data that's not going to be shown in the current page.
Be careful with session variables. Usually you should avoid them: requests from the same session that use session state are serialized and handled one by one, which slows down the application. Instead of session variables you can use the QueryString collection or hidden form fields to hold the values.
Select the Release mode before making the final Build for your application.
Set debug=false under compilation: <compilation defaultLanguage="c#" debug="false">
Avoid Inline JavaScript and CSS
Use a finally block to release resources (not needed when a using statement already disposes them).
Avoid using exceptions for control flow; use an if condition to check for the failure case instead.
Check Page.IsPostBack to avoid re-running initialization code on postbacks.
Use a single CSS file instead of multiple CSS files.
Use client-side validation (but always validate on the server side as well).
Turn off tracing unless it is required.
Turn off Session State, if not required.
Disable ViewState when not required.
Try to use StringBuilder instead of string.
It is better to use StringBuilder instead of String when strings are amended repeatedly: every amendment of a string allocates a new memory location, whereas a StringBuilder appends into a single buffer (see the StringBuilder sketch after this list).
Never read an object's value repeatedly; copy it into a local variable first and then use that, since member access takes more time than reading a local variable.
Avoid code like x = x + 1; it is always better to use x += 1.
Data access techniques: DataReaders provide a fast and efficient method of data retrieval. A DataReader is much faster than a DataSet as far as performance is concerned, but it is up to you to balance features against performance (see the DataReader sketch after this list).
Use the Repeater control instead of DataGrid or DataList, because it is efficient, customizable, and programmable.
Reduce cookie size.
Compress CSS, JavaScript and Images.
Use server-side compression software such as Port80's.
Make your page files as light as possible: avoid unnecessary markup, e.g. use div elements instead of tables.
Write static messages in a div and make it visible when necessary; this is faster than having the server set the Text property of your label or div.
Retrieve the necessary data from the database in one go where possible; avoid extra database trips. To do this, combine the data fields from the different tables and select them in a single query.
Remove blank spaces from your HTML; they add to its size in KB. You can use a regular expression to remove white space. I will post the code for removing white space in later posts.
For ASP.NET 2.0 and higher, use master pages. They will improve your performance.
Use ADO.NET asynchronous calls for ADO.NET methods; ASP.NET 2.0 and higher support this. If you are using the same procedure or command multiple times, use the ADO.NET Prepare command; it will improve your performance.
Do IIS performance tuning as per your requirements.
Disable view state for your controls where possible. If you are using ASP.NET 2.0 or higher, use control state instead of view state, or store view state in session or database by overriding the default methods for storing view state.
Use Ajax in your application wisely; lots of Ajax calls on a page will also decrease your performance.
Call web services from JavaScript instead of from the server side, and use asynchronous calls to invoke web methods.
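As promised in the StringBuilder tip above, a minimal sketch contrasting the two approaches (the class and method names are illustrative):

using System.Text;

public static class StringBuilderDemo
{
    public static string Concatenate(string[] parts)
    {
        // String concatenation in a loop copies the whole string each pass:
        //   string result = "";
        //   foreach (string p in parts) result += p;   // O(n^2) copying
        //
        // StringBuilder appends into a single growing buffer instead:
        StringBuilder sb = new StringBuilder();
        foreach (string p in parts)
            sb.Append(p);
        return sb.ToString();
    }
}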
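And the DataReader sketch referenced in the data access tip: a forward-only, read-only loop over a SqlDataReader (the connection string, table and column names are placeholders):

using System.Collections.Generic;
using System.Data.SqlClient;

public static class TrainingData
{
    public static List<string> GetTrainingNames(string connectionString)
    {
        var names = new List<string>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Name FROM Trainings", conn))
        {
            conn.Open();
            // The reader streams rows one at a time instead of buffering the
            // whole result set in memory the way a DataSet would.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    names.Add(reader.GetString(0));
            }
        }
        return names;
    }
}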

node_load or direct query?

What rule of thumb do you use for deciding to use node_load() or just writing a direct db_query()?
In a situation I'm looking at right now I need to get some node data and resolve data on two nodereference fields. So that would be 3 calls to node_load(). At some point here, would it be more efficient to construct the query with Joins directly?
This is for use in a self contained module that won't be distributed or used anywhere else, so I don't believe I need to worry about subverting node modification hooks (or do I?).
Edit:
Thinking about my question more, node_load() is only really applicable when you have one node to grab (and then maybe drilling down further into nodereferences, like in my example). But as soon as you need to return more than one node based on some criteria, you're pretty much forced to use db_query(), right? Does Drupal have any abstracted API for writing queries like this?
Not a full answer (I'm not sure myself), just some hints.
node_load() is using a static cache (in Drupal 7, you can even use the entity_cache module to make it a permanent cache). If the nodes you are loading are being used a second time on the same page, that call will be free.
Querying CCK tables is tricky. The schema structure can change completely based on configuration, for example when a field is switched between single and multiple values.
The reasoning behind using API methods for DB calls over direct DB calls is to provide a DB abstraction layer, so that your app could move between supported database engines, and so that it gracefully handles any schema changes (however unlikely) that core or a module may make to the tables in question. It's also likely easier, as #Berdir says, for CCK fields and node reference fields, but that depends on which you are more confident with: the Drupal API and PHP, or MySQL. The payoff of doing it the Drupal way is increased future productivity and understanding of the codebase and what is possible :)
Oh and my rule of thumb is - Do it the Drupal way if at all possible (possible being variable depending on app time/cost/performance/whatever requirements)

LINQ/EDM cache's efficiency in a webapp

I was just learning how to use the LINQ/EDM combination by building a simple user-thread-and-comment webapp, retrieving and updating data, as part of evaluating it.
When I turn on SQL Profiler, I rarely see a SQL statement executed by my app.
I'm starting to really like how well it keeps things cached, because as soon as I add new data, it magically updates itself before I can blink.
But is that something I should be scared of?
My concern is what happens when I use this to make a webapp with some real traffic (whatever hit count brings this approach to its limit).
Should I keep a single context object at app-level, so that different sessions can benefit from each other's cache entries?
Or should I do the create-and-release on each page submission?
I know this sounds like an open-ended question, but I really do have a concrete question: how does the Entity Framework cache its data when using LINQ?
On the ObjectContext question you should use a lifetime of per-page-request cycle or smaller. It's designed for a unit of work and not for the application lifetime. Search SO for "ObjectContext lifetime" or "DataContext lifetime" and you'll see this is a common question.
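A minimal sketch of that per-request unit-of-work pattern, assuming a hypothetical MyEntities ObjectContext generated from the EDM (Comment and its ThreadId property are assumed shapes, not from the question):

using System.Collections.Generic;
using System.Linq;

public static class CommentRepository
{
    // One short-lived context per unit of work (e.g. per page request),
    // disposed as soon as the work completes -- never a shared app-level
    // instance reused across sessions.
    public static List<Comment> GetCommentsForThread(int threadId)
    {
        using (var context = new MyEntities())
        {
            return context.Comments
                          .Where(c => c.ThreadId == threadId)
                          .ToList();
        }
    }
}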

ajaxified auto suggest

I have to build a search module with an auto-suggest feature in ASP.NET.
The search criterion is the training name, and there is a table in the database that stores trainings. The table could hold as many as 30,000 trainings, so I have to be very careful in selecting an approach, keeping performance in mind.
There could be about 3000 users logged into the system simultaneously. When a user starts typing a training name, the system should auto-suggest matches.
The approaches that came to my mind are as follows:
Cache object - There would be a database hit after the user types 3 characters (e.g. saf), and the system would search the activity table for all trainings starting with saf and cache them. Subsequent requests would go through this cache.
But the problem with this approach is that if there are 3000 concurrent users and they all search for different three-letter combinations, the cache would just blow up.
Client-side caching - I did not think much about this. The only drawback I see here is that we might have to purge the temporary internet folder periodically.
Using Session - I ruled this out completely, as I thought it would hurt performance.
Can you please suggest the best approach, or any different approach I can take here? I am looking for any information/ideas you have on this.
Thank you so much
Deepa.
My favourite jQuery plug-in for this (if you intend to use jQuery) is Flexbox.
It has a really impressive list of features.
You could use the jQuery Auto Complete plugin, which has caching features built in.
$(document).ready(function()
{
    $(".landingpage").autocomplete('/AutoSuggestHandler.ashx',
    {
        minChars: 1,
        matchSubset: 1,
        autoFill: false,
        delay: 10,
        scroll: false
    }).result(OnResultSelected);
});
Furthermore, you could specify output caching on the generic handler, to accommodate the need for caching across users.
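A minimal sketch of what that handler side might look like. Generic handlers don't take the @OutputCache page directive directly, so the cache headers are set in code instead (the handler body, the "q" parameter name and LookupTrainings are placeholders, not from the answer):

using System;
using System.Web;

// Hypothetical generic handler behind /AutoSuggestHandler.ashx.
public class AutoSuggestHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The autocomplete plugin is assumed to send the typed prefix as "q".
        string prefix = context.Request.QueryString["q"] ?? string.Empty;

        // Let the browser and downstream caches reuse the response for a
        // given prefix across users for 60 seconds.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.Now.AddSeconds(60));
        context.Response.Cache.VaryByParams["q"] = true;

        context.Response.ContentType = "text/plain";
        foreach (string name in LookupTrainings(prefix))
            context.Response.Write(name + "\n");
    }

    private static string[] LookupTrainings(string prefix)
    {
        // Placeholder: query the trainings table (ideally via an indexed
        // "starts with" search) and return the matching names.
        return new string[0];
    }

    public bool IsReusable { get { return true; } }
}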
I think your first approach will work.
Make sure there is an index on the field - you probably won't need to index the whole field. This should give the database a decent boost. You may need to look at full-text indexing depending on how your search works, or even use an external library like Lucene for the index if performance is an issue.
Cache the objects, or even the resulting XML/JSON from the queries, to improve performance.
You should also set the HTTP headers so that browsers cache the XML/JSON as well.
Your posting really contains two questions:
How can I get autocomplete on my webpage?
I am concerned about performance due to a large number of queries hitting my database at the same time.
My answers...
1: We've found the ASP.NET AJAX AutoComplete Extender works well on all modern browsers, provides a slick user experience and is pretty easy to implement.
In your web application you need to create a web service that has a method with a specific signature (covered in the documentation linked to above).
2: Have you proven that you actually have a performance bottleneck with this part of your project? I'd recommend setting up a test harness and hitting your database with a large number of autocomplete queries to see how much it can take. Be wary of premature optimization.
