Asynchronous processing in ASP.NET

On my page load I am loading a few lists. The page also has filter conditions, and it is taking around 30 seconds to get the filtered records from the database. The reason: the database is big and I have to navigate 9 tables to get the list of records depending on the selected values.
What is the simplest way to achieve asynchronous processing?

How are you structuring your SQL? If you have to access 9 tables it seems like a view would be a more appropriate solution than joining 9 tables.

When your page load is finished, make an AJAX call with jQuery to a service. That service generates the output HTML that you need. You can place a loader in the div/table where the results are going to be displayed.
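A minimal sketch of the server side of that idea, as a generic handler (.ashx) that the jQuery call would hit; the handler name, stored procedure, connection-string name, and markup are all assumptions:

    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;
    using System.Text;
    using System.Web;

    // Hypothetical FilteredList.ashx: runs the slow 9-table query after the page
    // has already rendered, and returns an HTML fragment the client inserts into
    // the loader div.
    public class FilteredListHandler : IHttpHandler
    {
        private static readonly string connectionString =
            ConfigurationManager.ConnectionStrings["Db"].ConnectionString; // "Db" is a placeholder

        public void ProcessRequest(HttpContext context)
        {
            string filter = context.Request.QueryString["filter"] ?? "";

            var table = new DataTable();
            using (var con = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.GetFilteredRecords", con)) // assumed proc
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@Filter", filter);
                new SqlDataAdapter(cmd).Fill(table);
            }

            // Build a simple HTML fragment; the jQuery success callback swaps it
            // into the div that currently shows the loader.
            var html = new StringBuilder("<table>");
            foreach (DataRow row in table.Rows)
                html.AppendFormat("<tr><td>{0}</td></tr>",
                    context.Server.HtmlEncode(row[0].ToString()));
            html.Append("</table>");

            context.Response.ContentType = "text/html";
            context.Response.Write(html.ToString());
        }

        public bool IsReusable { get { return false; } }
    }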
Maybe you should refactor your query; 30 seconds is really long. Are you using keys, indexes, temp tables, full-text searches, etc. to optimize your query?

Related

Need to run multiple copies of an ASP.NET website with DataGridView sharing a single URL (C#)

I have a drop-down menu which fills 4 DataGridViews based on the branch selected, or, when the start button is pressed, loops through all 80 branches.
There are 4 SQL Server procs, 1 per DataGridView, each against its own SQL table, read access only.
I need to access multiple copies over a single URL.
Database retrieval time = number of copies run (the single ASP.NET website over a single URL, called multiple times) * database runtime.
So if data retrieval takes 30 seconds, running 3 copies takes 90 seconds and seems to fragment the data or time out.
I'm using NOLOCK hints, so there shouldn't be deadlocks.
But I need to optimize this.
Should I create one web service, and would this solve the problem by hitting the database only once instead of once per URL hit?
Thank you.
David
Thank you. The timer was taking over and behaving differently on the server than on my local machine. Also, the UI, timer, and database were out of sync. Adding a Thread.Sleep helped, as did a longer interval on the timer. Putting all the database calls together, instead of one connection per database call, also helped. Now it all runs at the same time.
The main takeaway, I think, is that the timer interval and the Thread.Sleep were the key changes.
I also had a UI button to which I added some code so that once it's pushed, repeated pushes don't do anything (a rough sketch of that guard is below).
Thank you to everyone that posted an answer.
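For anyone finding this later, a minimal sketch of that button guard; btnStart is an illustrative name and this assumes a standard WebForms Button:

    // Disable the start button on the client after the first click so that
    // repeated clicks during the long postback are ignored.
    protected void Page_Load(object sender, EventArgs e)
    {
        // Post back via __doPostBack instead of a submit button, so disabling
        // the button client-side doesn't swallow the postback itself.
        btnStart.UseSubmitBehavior = false;
        btnStart.OnClientClick = "this.disabled = true; this.value = 'Running...';";
    }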
Well, this will come down not so much to the number of records pulled, but to whether you are executing multiple SQL statements over and over.
I mean, to fill 4 grids with 4 queries? That's going to be pretty much instant, assuming the record set size for each grid is, say, in the 100-row range. Such a button click and filling the grids should take very little time.
And even if you're using a RowDataBound event, once again, it will run fast, but ONLY if you're not executing a whole bunch of additional SQL queries. So the number of "hits", i.e. SQL statements sent to the database, is what for the most part determines the speed of this setup.
So say you have one grid that pulls 100 rows, but then the next grid needs data based on 100 rows' worth of "new" SQL queries. In that case, what you can often do is fill a recordset with the child data and filter against that recordset, and thus not have to execute 100 SQL queries, but only 1 query (a sketch of this is below).
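A hedged sketch of that "one child query, filter in memory" pattern; the table, column, and control names are made up:

    using System;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // Pull ALL child rows with one query, then filter in memory for each grid
    // row, instead of issuing one SQL query per row from RowDataBound.
    public partial class BranchPage : System.Web.UI.Page
    {
        private DataTable childData;

        protected void Page_Load(object sender, EventArgs e)
        {
            string cs = ConfigurationManager.ConnectionStrings["Db"].ConnectionString; // placeholder name
            using (var con = new SqlConnection(cs))
            using (var da = new SqlDataAdapter(
                "SELECT ParentID, Amount FROM dbo.ChildTable", con))   // assumed table
            {
                childData = new DataTable();
                da.Fill(childData);
            }
        }

        protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e)
        {
            if (e.Row.RowType != DataControlRowType.DataRow) return;

            string parentId = DataBinder.Eval(e.Row.DataItem, "ParentID").ToString();

            // Filter the in-memory table - no extra round trip to SQL Server.
            DataRow[] children = childData.Select("ParentID = " + parentId);

            // ...use 'children' to populate labels or a nested grid in this row.
        }
    }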
So, this will really come down to how many separate SQL queries you execute in total.
To fill 4 grids with 4 queries? I don't see that as being a problem, and thus we are no doubt missing some big "whopper" of a detail you haven't shared with us.
Expand your question with how many SQL statements are generated in total - that's the bottleneck here. Reduce that, and your performance should be just fine.
And if the 4 simple stored procedures have "cursors" that loop and again generate many SQL commands, get rid of that.
4 basic SQL query pulls are nothing - something else is at work that you're not sharing. Why would each single stored procedure take so long? That's the detail we are missing here.

Grid loading too slow without pagination

We have developed an ASP.NET web application, and we are using the ASP.NET GridView to display and edit records.
Here we have 5000 rows and 23 columns in a single grid, and it is taking a long time to bind. Our client refuses the pagination option. How can we make the binding faster with 5000 to 7000 records?
Please Advise.
Thanks
Mayil.M
Where does your data come from? Is it a database or another external resource?
You can use caching so you don't load the whole dataset from the external resource each time, but from memory instead (a rough sketch of this follows below). Please note this solution will not work well if your data changes often.
Another approach would be to use some kind of partial-loading mechanism, for example via Ajax. This will, however, require a change of approach, as I am not sure the GridView control supports this. You would have to create a custom control and then make sequential requests (using, for example, Ajax) for smaller chunks (e.g. 200 records) of data and display them. Eventually you will have the complete set, but the data will be available sooner.
Finally, you can combine both to make it even faster.
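A minimal cache-aside sketch for the caching suggestion; the cache key, query, and 5-minute expiration are assumptions to adapt:

    using System;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;
    using System.Web;
    using System.Web.Caching;

    // Load the grid's DataTable once, keep it in memory for a few minutes,
    // and rebind from the cache on subsequent requests.
    private DataTable GetGridData()
    {
        var table = HttpRuntime.Cache["GridData"] as DataTable;   // "GridData" is an arbitrary key
        if (table != null)
            return table;                                         // served from memory

        table = new DataTable();
        string cs = ConfigurationManager.ConnectionStrings["Db"].ConnectionString; // placeholder
        using (var con = new SqlConnection(cs))
        using (var da = new SqlDataAdapter("SELECT * FROM dbo.Records", con))      // assumed query
        {
            da.Fill(table);
        }

        HttpRuntime.Cache.Insert("GridData", table, null,
            DateTime.UtcNow.AddMinutes(5),       // absolute expiration
            Cache.NoSlidingExpiration);

        return table;
    }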
You should implement your own paging mechanism. The problem is that DataBind retrieves all 7000 records even though only, say, 20 are displayed/rendered. Create, for example, a stored procedure that fetches only the selected range of records (if you are on page 2 you need to display only row numbers > 20 and <= 40, assuming your page size is 20). Use a SQL Server CTE to compute the row number (on the SQL Server side) and features like BETWEEN. This stored procedure would return only those records you really need. Then change your GridView to get its data from this stored procedure.
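A rough sketch of the calling side, assuming a stored procedure named dbo.GetRecordsPage built around ROW_NUMBER() in a CTE, with @PageIndex/@PageSize parameters (all names are made up):

    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    // Fetch just one page of rows from SQL Server instead of all 7000.
    private DataTable GetPage(int pageIndex, int pageSize)
    {
        var table = new DataTable();
        string cs = ConfigurationManager.ConnectionStrings["Db"].ConnectionString; // placeholder
        using (var con = new SqlConnection(cs))
        using (var cmd = new SqlCommand("dbo.GetRecordsPage", con))  // hypothetical proc
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
            cmd.Parameters.AddWithValue("@PageSize", pageSize);
            new SqlDataAdapter(cmd).Fill(table);
        }
        return table;
    }

    // Usage: bind only the current page to the grid.
    // GridView1.DataSource = GetPage(currentPage, 20);
    // GridView1.DataBind();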
You can load data on scroll, like the Facebook wall.

Slow ASP.NET GridView with 2 million records

I have a task which gets more than 2 million records from SQL Server and populates them into an ASP.NET GridView.
The problem is that the query takes more than 2 minutes to return the records, even though the query itself is now fully optimized.
When I move from one page to another via pagination, it takes the same time again and hits the server.
So I need a solution where page movement doesn't take that long, or where only 50 or 100 records are fetched in each request.
Thanks,
Nauman
Use paging in GridView - check this link
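For reference, enabling the GridView's built-in paging from code-behind looks roughly like this (GridView1, BindGrid, and a page size of 50 are illustrative):

    // Enable built-in paging so only one page of rows is rendered at a time.
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            GridView1.AllowPaging = true;
            GridView1.PageSize = 50;
            BindGrid();               // your existing query-and-bind routine
        }
    }

    protected void GridView1_PageIndexChanging(object sender, GridViewPageEventArgs e)
    {
        GridView1.PageIndex = e.NewPageIndex;
        BindGrid();                   // re-query (or re-read from cache) and rebind
    }

Note that built-in paging still pulls the whole result set on every bind; the custom-paging suggestion below avoids that by fetching only the current page from SQL Server.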
Also adjust display properties, such as rendering the header with only the visible cells available, to load the grid faster.
It's even better if you bind the grid data using jQuery rather than from the server side.
Use this link to get started
Instead of using a GridView, you can use a Repeater or even jQuery templates, also with custom paging. That will be even faster.
If you are fetching 1000 records and displaying just 50 using pagination, this is really a waste. It's better to fetch only 50 records each time; this would be much faster. Go through the following link: Custom Paging in ASP.Net GridView using SQL Server Stored Procedure.
It's a problem on the SQL Server side...
Optimize the database, and consider separating the database and the table.

Autocomplete optimization for large data sets

I am working on a large project where I have to present an efficient way for a user to enter data into a form.
Three of the fields of that form require a value from a subset of a common data source (SQL table). I used jQuery and jQuery UI to build an autocomplete, which posts to a generic HttpHandler.
Internally the handler uses LINQ to SQL to grab the data required from that specific table. The table has about 10 different columns, and the LINQ expression uses SqlMethods.Like() to match the single search term against each of those 10 fields.
The problem is that the table contains some 20K rows. The autocomplete works flawlessly, except that the sheer volume of data introduces delays, in the vicinity of 6 seconds or so (when debugging on my local machine) before it shows up.
The jQuery UI autocomplete has 0 delay, queries on the third keystroke, and the result of the post is rendered as Facebook-style multi-row selectable options. (I almost had to rewrite the autocomplete plugin...)
So the problem is data vs. speed. Any thoughts on how to speed this up? The only two thoughts I had were to cache the data (how/where?), or to use a straight-up SqlDataReader for data access.
Any ideas would be greatly appreciated!
Thanks,
<bleepzter/>
I would look at only returning the first X rows using the .Take(10) LINQ method. That should translate into a sensible SQL call, which will put much less load on your database. As the user types, there will be fewer and fewer matches, so they will only see the data they require.
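A hedged LINQ to SQL sketch of that; db, term, the Customers table, and the column names are placeholders:

    using System.Data.Linq.SqlClient;
    using System.Linq;

    // Let SQL Server return only the first 10 matches (translated to TOP (10))
    // instead of every row that matches the term.
    var matches = (from c in db.Customers                       // hypothetical table
                   where SqlMethods.Like(c.Name, "%" + term + "%")
                      || SqlMethods.Like(c.City, "%" + term + "%")
                   orderby c.Name                               // meaningful ordering helps
                   select new { c.Id, c.Name })
                  .Take(10)
                  .ToList();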
I normally reckon 10 items is enough for the user to understand what is going on and still get to the data they need quickly (see the amazon.com search bar for an example).
Obviously, if you can sort the data in a meaningful fashion, then the 10 results will be much more likely to give the user what they are after quickly.
Returning the top N results is a good idea for sure. We found (querying a potential list of 270K) that returning the top 30 is a better bet for the user finding what they're looking for, but that COMPLETELY depends on the data you are querying.
Also, you REALLY should set the delay to something sensible like 100-300 ms. When you set the delay to ZERO, once you hit the 3-character trigger, effectively EVERY. SINGLE. KEY. STROKE. is sent as a new query to your server. This could easily have the unintended and unwelcome effect of slowing down the response even MORE.

How to make the AJAX Control Toolkit autocomplete super fast and bind it on the client side

I have used the AJAX Control Toolkit's AutoComplete on a page which gets data from a web service. This autocomplete is slow: at the moment I only have 10 to 20 records in the table, and it takes about 3 to 5 seconds to search and show results in the autocomplete. Users have to wait about 4 seconds on average to see data.
I don't understand how to make it super fast; please guide me. Is it possible to bind the autocomplete on the client side? My idea is to get the data from the server at page load, put it in an array in JavaScript, and have the autocomplete read from the client side as the user types.
The problem may be that you are getting the data from the web service, which may take a few seconds. Why don't you cache the data in the service itself (in a hash map or a list) and then periodically (say, every 2 minutes) call the web service to get the latest data?
Hence, when your autocomplete plugin requests the latest autocomplete data, you will return the cached values and not the actual values from the web service.
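A minimal sketch of that idea for an AJAX Control Toolkit-style completion service; the service name, the 2-minute window, and LoadNamesFromDatabase are assumptions:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Services;

    [WebService(Namespace = "http://example.com/")]
    [System.Web.Script.Services.ScriptService]
    public class AutoCompleteService : WebService
    {
        // Cached list shared by all requests, refreshed at most every 2 minutes.
        private static List<string> cachedNames = new List<string>();
        private static DateTime lastRefreshUtc = DateTime.MinValue;
        private static readonly object refreshLock = new object();

        [WebMethod]
        public string[] GetCompletionList(string prefixText, int count)
        {
            if (DateTime.UtcNow - lastRefreshUtc > TimeSpan.FromMinutes(2))
            {
                lock (refreshLock)
                {
                    if (DateTime.UtcNow - lastRefreshUtc > TimeSpan.FromMinutes(2))
                    {
                        cachedNames = LoadNamesFromDatabase();   // the slow call, now rare
                        lastRefreshUtc = DateTime.UtcNow;
                    }
                }
            }

            // Filter the in-memory list - this is what makes each keystroke fast.
            return cachedNames
                .Where(n => n.StartsWith(prefixText, StringComparison.OrdinalIgnoreCase))
                .Take(count)
                .ToArray();
        }

        private static List<string> LoadNamesFromDatabase()
        {
            // Query SQL Server (or the original web service) here; omitted for brevity.
            return new List<string>();
        }
    }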
I've noticed that some sites will store the hash map/list on another page and reference that page from the autocomplete function. Therefore the loading of said page will not be impacted, and the autocomplete will be extremely fast (practically instantaneous). Also, you can maintain that list at your leisure once a minute/hour/day/month/year, and it will be completely independent of the user's experience.
