How to cache data source contents? - ASP.NET

Suppose I'm retrieving some records from my SQL Server database and filling them into a DataSet or a DataTable. How can I cache the DataSet or DataTable contents?
DataTable dt1 = new DataTable();
DataTable dt2 = new DataTable();
DataSet ds = new DataSet();
SqlCon.Open();
// SqlClient parameters are prefixed with @, not #
string sq = "exec RetrieveLastVariableWorkerInfo @RealCode";
SqlCom = new SqlCommand(sq, SqlCon);
SqlCom.Parameters.Add("@RealCode", SqlDbType.NChar).Value = RealCode;
da = new SqlDataAdapter(SqlCom);
da.Fill(dt1);

Basically you need to use the Cache object provided by ASP.NET. This is shared globally across your application, and any objects stored there are available to all page requests. You can specify the amount of time an object should stay in the cache (either an absolute time or a sliding expiration) and define a dependency that will remove the cached object when it is triggered; e.g. a SQL dependency will remove the object from the cache if a change is made to the table(s) the data came from.
This MSDN article gives a good overview of the subject, specifically the section on Caching API, Using the Object Cache.
You may also want to look at the CacheDependency class (specifically SqlCacheDependency in your case) to refresh your cached items as changes occur in your database.
Things to be aware of are the size of the objects you are storing, how long they should be kept before they are outdated and removed from the cache, and whether any external triggers should remove them.

ASP.NET offers several APIs for storing data.
The Cache object can be used to store any object, and the API is pretty flexible:
Cache["MyObject"] = dt1;
Cache.Insert("MyObject", dt1)
This will put your data table into the system cache. You can place expiration times on the cache entry in case the data goes stale, or link it to an external dependency (such as SQL Server or a file) so that changes in the external resource automatically invalidate the cache.
It's also worth noting that ASP.NET may evict items from the cache automatically if memory becomes tight. Any data you place in the cache is available to all requests.
If the data is more specific to one user, you could place the data table into the user's Session object. The Session is generally linked to one user's browsing session by a unique ID. This way you can read several different data tables into memory without any conflict.

Related

How to not lose table values in ASP.NET when Page_Load is invoked

I created a dynamic table in VB.NET on an ASP.NET page and it displays correctly.
But when I try to read the table in a click event with tablename.rows.count, it shows 0, even though on the page I can see the table and its rows.
How do you persist the table between postbacks? You state that you created a "dynamic" table, which suggests that it isn't stored on disk or in a database somewhere. The quick way to deal with this is to store it in the Session object like so:
Session("MyDataTable") = MyDataTable
In the click event you retrieve the DataTable like so:
MyDataTable = Session("MyDataTable")
Using the basic Session object requires that the client has session cookies turned on. There are other ways to deal with session state (and persisting data), but this is the quick and dirty way.

What should be best approach to keep data in memory temporarily(at user level & for reuse the data) in ASP.NET?

Currently I am using Session to keep the DataTables in memory, but after doing some research I learned that this is not a good practice. For example:
Session("Syllabus") = RegistartionLogic.GetSyllabusInfo(Session("StudentID"))
Requirement:
The items of a dropdown will differ based on student type.
The dropdown data is fetched from the DB, and these controls are used in more than one screen.
Multiple DB calls from different screens for the same data are not preferred.
So I need to make only one DB call, keep the data in memory, and read the data from memory from then on.
I tried with 'Cache' as well, but the issue was that the cache is not unique to the user: unlike Session, the scope of cached data is the whole application domain, so every user can access these objects.
Kindly help me out.
For your scenario, HttpContext.Current.Cache should work.
Yes, the cache is not unique to the user, but the cache key can be made unique.
var studentId = GetStudentIdFromRequest();
var cacheKey = "SyllabusInfoCacheKey_" + studentId;
Then you can make use of the unique cache key to insert and later get values for that particular student.
Session data is also stored at the application (server) level; every user has an ASP.NET_SessionId cookie that is sent from the client side and used on the server side to store/retrieve that user's values.
Note: for Session and the ASP.NET Cache to work in a load-balanced environment, the load balancer should use sticky sessions.

LINQ to SQL - updating records

Using ASP.NET 4 with C#.
In my data access layer I have methods for saving and updating records. Saving is easy enough but the updating is tedious.
I previously used SubSonic, which was great as it had active record support: if I loaded a record, changed a few entries and then saved it again, it recognised this as an update and didn't save a new entry in the DB.
I don't know how to do the same thing in LINQ. As a result my workflow is like this:
Web page grabs 'Record A' from the DB
Some values in it are changed by the user.
'Record A' is passed back to the data access layer
I now need to load 'Record A' again (calling it 'SavedRecord A'), update all values in this object with the values from the passed 'Record A', and then update/save 'SavedRecord A'!
If I just save 'Record A' I end up with a new entry in the DB.
Obviously it would be nicer to just pass Record A and do something like:
RecordA.Update();
I'm presuming there's something I'm missing here, but I can't find a straightforward answer online.
You can accomplish what you want using the Attach method on the Table instance, and committing via the SubmitChanges() method on the DataContext.
This process may not be as straightforward as we would like, but you can read David DeWinter's LINQ to SQL: Updating Entities for a more in-depth explanation/tutorial.
Let's say you have a Product class in your model; then (note this example uses Entity Framework rather than LINQ to SQL) you would do this:
var _db = new MyDbContext(); // your DbContext-derived context (name assumed)
var _product = (from p in _db.Products
                where p.Id == 1 // suppose you are getting the first product
                select p).First(); // this fetches the first record
_product.ProductName = "New Product";
_db.SaveChanges();
// If the entity is detached from the context, mark it as modified first:
_db.Entry(_product).State = EntityState.Modified;
_db.SaveChanges();
Here is another example, using Attach:
public static void Update(IEnumerable<Sample> samples, DataClassesDataContext db)
{
    db.Samples.AttachAll(samples);
    db.Refresh(RefreshMode.KeepCurrentValues, samples);
    db.SubmitChanges();
}
If you attach your entities to the context and then call Refresh (with KeepCurrentValues selected), LINQ to SQL will fetch those entities from the server, compare them, and mark as updated those that differ.
When LINQ to SQL updates a record in the database, it needs to know exactly which fields were changed so that it updates only those. You basically have three options:
1. When the updated data is posted back to the web server, load the existing data from the database, assign all properties on the loaded object, and call SubmitChanges(). Any properties that are assigned their existing value will not be updated.
2. Keep track of the unmodified state of the object and use Attach with both the unmodified and modified values.
3. Initialize a new object with all the state required by the optimistic concurrency check (if enabled, which it is by default), attach it, and finally update any changed properties after the attach so that the DataContext change tracker is aware of those updates.
I usually use the first option as it is the easiest. There is a performance penalty in making two DB calls, but unless you're doing lots of updates it won't matter.

updating batches of data

I am using a GridView in ASP.NET and editing data with the EditCommandField (as we know, after the edited row is updated, the database is updated automatically). Instead, I want to commit the update queries to the database inside a transaction (begin/commit, including rollback) after clicking some button (after some events, for example), rather than inserting or updating the edited data from the grid directly in the DB. So I want to save the edits somewhere temporarily (possibly many edited rows, not just one) and then confirm the transaction, updating the real tables in the database.
Any suggestions are welcome...
I've found some good and very helpful links, like:
http://www.asp.net/learn/data-access/tutorial-63-cs.aspx
http://www.asp.net/learn/data-access/tutorial-66-cs.aspx
etc...
Well, you can store your edited data in a DataTable in Session, and then pass this data table to the database as a bulk insert. Two options are available for this:
If you are using SQL Server 2005 you can use OpenXML to achieve this, as I have described here.
If you are using SQL Server 2008 you can use table variables like I did here.
I hope it helps.
First way:
Create a session variable that holds your DB object (a DataTable or mapped objects).
The GridView should work with this instance instead of sending the data to the database.
Once editing is finished, you can take the object from the session and save it the way you normally do.
Second way:
I would use JavaScript to collect all changes on the client side while the user is editing, as an array of objects (each object being a separate row).
Once the editing is done, you can create a JSON string from the collection and pass it to the server.
If your JSON object's shape matches a server-side class, you can use JavaScriptSerializer to deserialize the string into a collection of objects.
After that, you can save your objects the way you normally do.

Efficeintly maintaining a cache of distinct items in a huge DB table

I have a very large (millions of rows) SQL table which represents name-value pairs (one column for the name of a property, the other for its value). In my ASP.NET web application I have to populate a control with the distinct values available in the name column. This set of values is usually not bigger than 100, most likely around 20. Running the query
SELECT DISTINCT name FROM nameValueTable
can take a significant time on this large table (even with the proper indexing etc.). I especially don't want to pay this penalty every time I load this web control.
So caching this set of names should be the right answer. My question is: how do I promptly update the cached set when a new name appears in the table? I looked into the SQL Server 2005 Query Notification feature, but the table gets updated frequently while an actual new distinct name appears very seldom. The notifications would flow in all the time, and the web server would probably waste more time processing them than it saved by caching.
I would like to find a way to balance the time used to query the data, with the delay until the name set is updated.
Any ideas on how to efficiently manage this cache?
A little normalization might help. Break the property names out into a new table, with an FK back to the original table using an int ID. You can then query the new table to get the complete list, which will be really fast.
Figuring out your pattern of usage will help you come up with the right balance.
How often are new values added? Are new values always unique? Is the table mostly updates? Do deletes occur?
One approach may be a SQL Server INSERT trigger that checks the cached lookup table to see if its key is there and, if not, adds it.
Add a unique increasing sequence column MySeq to your table. You may want to try clustering on MySeq instead of your current primary key so that the DB can build a small set and then sort it.
SELECT DISTINCT name FROM nameValueTable WHERE MySeq >= ?;
Set ? to the highest MySeq value your cache has already seen.
You will always have a lag between your cache and the DB, so if this is a problem you need to rethink the flow of the application. You could try making all requests flow through your cache/application if you manage the data:
requests --> cache --> db
If you're not allowed to change the actual structure of this huge table (for example, because huge numbers of reports rely on it), you could create a holding table of these 20 values and query against that. Then, on the huge table, have a trigger that fires on INSERT or UPDATE, checks whether the new NAME value is in the holding table, and adds it if not.
I don't know the specifics of .NET, but I would pass all the update requests through the cache. Are all the update requests made by your ASP.NET web application? Then you could make a proxy object for your database and have all the requests directed to it. Given that your database only holds key-value pairs, it is easy to use a map as the cache inside the proxy.
Specifically, in pseudocode, all the requests would be handled as follows:
// the client invokes cache.get(key)
if (cacheMap.has(key)) {
    return cacheMap.get(key);
} else {
    cacheMap.put(key, database.retrieve(key));
    return cacheMap.get(key);
}
// the client invokes cache.put(key, value)
cacheMap.put(key, value);
if (writeThrough) {
    database.put(key, value);
}
Also, in the background you could have an evictor thread which ensures that the cache does not grow too big. In your scenario, where a set of values is frequently accessed, I would use an eviction strategy based on time-to-idle: if an item is idle for more than a set amount of time, it is evicted. This ensures that frequently accessed values remain in the cache. Also, if your cache is not write-through, the evictor needs to write entries back to the database on eviction.
Hope it helps :)
-- Flaviu Cipcigan
