I have a user control that fetches data from a database, and it takes a lot of time; the whole web application has become slow. What should I do to make page loading faster?
I will answer in a general way.
If you only fetch data, without calculations, then:
Database Part
a. Optimize your SQL query; be sure that you use the right indexes on the database.
b. Do not load more data than you want to show, and do paging.
c. If you fetch data from too many tables at the same time, create a new 'flat' table and pre-render your results into it on a regular schedule on a background thread.
Page Part
a. While you load data, show it immediately rather than buffering it all, and flush the response from time to time.
If you fetch data with calculations, then do the calculations on a background thread on a schedule before you show them, place the results in another flat table, and show that table.
For example, here is how to show the data while you retrieve it:
<%
    // Write each row as soon as it is available and flush the response
    // periodically so the browser can start rendering early.
    // GetNextData and RenderMyTableData are placeholders for your own helpers.
    const int kathePoteFlush = 20;   // flush every 20 rows
    int rowCount = 0;
    object data;
    while (GetNextData(out data))
    {
        Response.Write(RenderMyTableData(data));
        if (++rowCount % kathePoteFlush == 0)
            Response.Flush();
    }
%>
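Note that ASP.NET buffers the response by default (Response.Buffer is true), so nothing reaches the browser until a flush happens; the periodic Response.Flush calls above are what let the table appear progressively while the rest of the data is still being fetched.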
First, is there an official CS term for sending things between the front end and back end? I just made up "the wall" but I would like a cooler term.
So in appmaker it seems you cannot pass whole records through to the backend (although you can handle them on either end).
So basically what I was doing was:
get a set of records and divide them into chunks
var records = app.datasources.filesToProcess.items;
call the backend process once per chunk with this:
google.script.run.withSuccessHandler(onSuccess).backendProcess(records, start, end);
This allows for a kind of multithreading. The problem is passing the records. Is there an easy way to get just the IDs from a set of records client-side, so I can pass those as an array in place of the records? Passing the record object itself gives an error.
Just do the following:
var records = app.datasources.filesToProcess.items.map(function(item) {return item.id;});
and then you can use
function backendProcess(records, start, end) {
  var items = app.models.YourModel.getRecords(records);
}
on the backend.
Right now my website (ASP.NET) has a database that it pulls data from, and the more data is entered, the longer the page gets. I am wondering how I can make it so that, for example, once it pulls ten items from the database it starts a new page and puts the next ten items on it. Basically like how http://fmylife.com does it.
Thanks
You should take a look at the DataPager control.
To create a solution that works in the long run, you need to let the paging selection end up in your SQL WHERE clause, so that on each round trip to the database you only retrieve the number of records specified in the PageSize property of the DataPager control. Otherwise it will lose its performance as the number of records in the database increases.
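For illustration, here is a minimal sketch of a data-access method that does the paging in the query itself; the table and column names are hypothetical, and the OFFSET/FETCH syntax assumes SQL Server 2012 or later (on older versions, a ROW_NUMBER query achieves the same effect).

using System.Data;
using System.Data.SqlClient;

// Returns a single page of rows; startRowIndex and maximumRows match the
// values a DataPager-driven ListView passes to a paged select method.
public static DataTable GetItemsPage(string connectionString, int startRowIndex, int maximumRows)
{
    const string sql = @"
        SELECT Id, Title, CreatedOn
        FROM dbo.Items
        ORDER BY Id
        OFFSET @StartRowIndex ROWS
        FETCH NEXT @MaximumRows ROWS ONLY;";

    var page = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@StartRowIndex", startRowIndex);
        command.Parameters.AddWithValue("@MaximumRows", maximumRows);
        connection.Open();
        page.Load(command.ExecuteReader());
    }
    return page;
}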
I have a GridView which was working fine with a small dataset in development. In production it has to bind to thousands of records, which makes it slow to load. Is there a way to improve performance, such as retrieving the data during the GridView's PageIndexChanging event?
Also, chances are you only want to bind it once, so you should (if you don't already):
if (!IsPostBack)
{
    DatabindGridLogicHere();
}
This way your GridView will only have to hit the db the first time to get the data.
First and foremost, turn off ViewState.
You should tell your data source to take fewer records, and then enable paging in your grid and data source.
You can set the "AllowPaging" property to true on your GridView and give it a page size, say 10. Then write your data retrieval logic to return one batch of data at a time instead of the whole set at once.
When you write the SQL query, make sure to order it by an ID.
Thus, if the page index is 1, you can take the first batch of data by passing a page index of 1 and a page size of 10.
The logic will be:
SELECT [RequiredFields]
FROM [YourDataSource]
WHERE Id >= (PageIndex - 1) * PageSize
  AND Id < PageIndex * PageSize
ORDER BY Id
On the first page it returns the entries with IDs 0 to 9; on the second page it returns the entries with IDs 10 to 19, assuming the page size is 10. You can change the page size and the query logic as you wish.
But if sorting is enabled (or the IDs are not contiguous), this will not produce accurate results.
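On the page side, here is a minimal sketch of wiring this up to the GridView's PageIndexChanging event; GetPageOfData is a hypothetical helper that runs a paged query like the one above.

// Assumes a GridView named GridView1 with AllowPaging="true" and PageSize="10".
// GetPageOfData is a hypothetical helper that executes the paged query above;
// the query treats the page index as 1-based, while GridView.PageIndex is 0-based.
protected void GridView1_PageIndexChanging(object sender, GridViewPageEventArgs e)
{
    GridView1.PageIndex = e.NewPageIndex;
    GridView1.DataSource = GetPageOfData(e.NewPageIndex + 1, GridView1.PageSize);
    GridView1.DataBind();
}

Because only one page of rows is bound at a time, the built-in pager will not know the total row count; you would either render your own pager or use an ObjectDataSource with a SelectCountMethod to supply it.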
I have been tasked with sifting through the worst classic ASP spaghetti I've ever come across.
The script opens a series of recordsets in sequence, getting one record at a time. As each record is built it takes the ID and passes it to the next loop, which gets data and passes the ID on to the next loop. It continues in this manner and builds an unordered list, kicking out the required HTML as it goes.
Here are my efforts so far:
Have a class delivering data via SqlDataReaders and output these to nested Repeaters (this failed because I was not able to loop and get the ID).
Have a DataTable populated with all the required data, then use DataTable.Select to filter it.
Have 4 DataReaders looping and building the UL ArrayLists (I couldn't get the IDs to match up).
Please can you suggest the best method?
Yes, of course it can be rewritten in ASP.NET - note that I said rewritten, not just refactored; there is no saving that code (which is okay for what it does, but things are different with ASP.NET).
To be honest, I didn't even check the code, it hurt my eyes. A lot. But in general you can use a nice SqlConnection object and a SqlCommand, call a stored procedure and get a nice SqlDataReader full of data, from which you can build a DataTable or an IEnumerable list of data objects. You then have a repeater-type control (ListView, GridView, etc.) in the UI; simply binding your DataTable or list to that control will render the results.
With the ListView, you specify a template for each data item that is rendered. For a GridView you specify the columns (or column templates) and which properties on each data item the columns should bind to.
When you retrieve the data, you can leave it as a DataTable, or translate it into something else like a list (or array) of data objects. As long as your list/array implements IEnumerable, you should be able to just assign it to the DataSource property of the aforementioned repeater control, call DataBind(), and it will perform its magic.
You don't even have to use a SqlCommand and DataTable - you could use LINQ to SQL and bind the results straight to your repeater control.
This is just a high-level overview of how you could do it; there are a couple of different ways. Once done, your code is going to be way cleaner and more maintainable than the classic ASP code.
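As a rough sketch of that flow (the stored procedure name, connection string and control ID below are placeholders):

using System.Data;
using System.Data.SqlClient;

protected void Page_Load(object sender, EventArgs e)
{
    if (IsPostBack) return;

    // One database call: run a stored procedure and load the flat result set.
    var data = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.GetMenuData", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        connection.Open();
        data.Load(command.ExecuteReader());
    }

    // Bind it to a repeater-type control (here a ListView declared in the markup).
    ItemsListView.DataSource = data;
    ItemsListView.DataBind();
}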
Edit: your main issue is how to produce an ordered list, which is what the current code is doing with its nested loops supplying IDs for the detail items.
I would suggest you take a step back and rethink the SQL. In fact, throw the current SQL away; it is incredibly inefficient with the tools you have today. It takes literally tenths of a second, and just one database call, to return a flat table of data. Your first instinct may be "but I don't want to return too much data!" - relax, even returning several thousand rows can be a sub-second operation if done correctly. You can also restrict the returned data by passing parameters to the stored proc, or appending them to the dynamic SQL statement that you construct (although it pains me hugely to mention dynamic SQL - I think it is evil, but some people still use it; I would not recommend it unless it was your only option). To sum up: how your data is returned from the database and how it looks on screen are two different things; don't let one guide the other. You can get the data from the database, then manipulate it before rendering it to the UI.
If you still want to show the data as an ordered list, then use a ListView and define a template for each data item; you can use LINQ to group or filter the data as you like (the data item template can contain whatever HTML or ASP.NET controls you like, and that template gets rendered for each data item in the list). Alternatively, you could use a GridView and let its grouping capability do the work for you - just specify which column(s) you want to group on, and the GridView takes care of the rendering.
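For instance, a hypothetical way to group the flat rows in code before binding, continuing from the DataTable loaded in the earlier sketch (the column and control names are made up; the DataTable LINQ extensions come from System.Data.DataSetExtensions):

// Group the flat rows by a parent column, then bind the groups; an outer
// ListView renders each group header and an inner ListView renders its Items.
var groups = data.AsEnumerable()
    .GroupBy(row => row.Field<string>("CategoryName"))
    .Select(g => new { Category = g.Key, Items = g.ToList() })
    .ToList();

CategoriesListView.DataSource = groups;
CategoriesListView.DataBind();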
I have a DataGrid control which is bound to a table containing more than 1000 records, and I want to show only 25 records at a time. I have used paging in the DataGrid, but each time the next page index is set, the query is fired again, and this takes a lot of time. What is the easiest way to bind data in this case to improve performance?
I don't recommend caching, since the whole data set would have to be returned to the server the first time anyway.
You can improve performance by using custom paging queries to the database.
Assuming you're working with at least SQL Server 2005, here's a great article for your purpose, with different benchmarking results.
Have you considered caching your data set? Then you would only need to query the data if the cache is empty or expired.
When you handle your Page Changed event, you need to pull in the new page information. You will need to create a stored procedure that takes CurrentPageNumber and PageSize as arguments.
This is in addition to any other arguments you already supply when bringing down your data.
In the SP, you fill up a temporary table or table variable (or you can use a CTE) with the data that you are returning, but also the RowNumber.
Based upon your CurrentPageNumber argument, you are able to return all the results between CurrentPageNumber * PageSize and (CurrentPageNumber + 1) * PageSize - 1 (assuming a zero-based CurrentPageNumber).
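On the ASP.NET side, a minimal sketch of calling such a stored procedure from the DataGrid's page-changed handler; the procedure name, connection string and grid ID are placeholders, and with true custom paging you would also set AllowCustomPaging and VirtualItemCount so the pager knows the total row count.

using System.Data;
using System.Data.SqlClient;

// Hypothetical stored procedure dbo.GetPagedItems(@CurrentPageNumber, @PageSize);
// only one page of rows is pulled from the database per postback.
protected void ItemsGrid_PageIndexChanged(object source, DataGridPageChangedEventArgs e)
{
    ItemsGrid.CurrentPageIndex = e.NewPageIndex;

    var page = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.GetPagedItems", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@CurrentPageNumber", e.NewPageIndex);
        command.Parameters.AddWithValue("@PageSize", 25);
        connection.Open();
        page.Load(command.ExecuteReader());
    }

    ItemsGrid.DataSource = page;
    ItemsGrid.DataBind();
}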
Here's a good resource:
How to return a page of results from SQL?