I have used the AJAX Control Toolkit's AutoComplete on a page which gets its data from a web service. The autocomplete is slow: at the moment I only have 10 to 20 records in the table, yet it takes about 3 to 5 seconds to search and show results. Users have to wait about 4 seconds on average to see the data.
I can't work out how to make it fast, so please guide me. Is it possible to bind the autocomplete on the client side? My idea is to get the data from the server on page load, put it in a JavaScript array, and have the autocomplete read from that array on the client as the user types.
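A minimal sketch of that idea in C#, assuming the list stays small enough to embed in the page; autoCompleteItems and GetItemsFromService are illustrative names, not real APIs:

    using System;
    using System.Web.UI;

    public partial class SearchPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Emit the items into a client-side JavaScript array on page load,
            // so the autocomplete can filter entirely on the client.
            foreach (string item in GetItemsFromService()) // hypothetical web service call
            {
                ClientScript.RegisterArrayDeclaration(
                    "autoCompleteItems", "'" + item.Replace("'", "\\'") + "'");
            }
        }

        private string[] GetItemsFromService()
        {
            // Replace with the real web service call.
            return new string[0];
        }
    }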
The problem may be that you are fetching the data from the web service on every request, which can take a few seconds. Why don't you cache the data on the server itself (in a HashMap or a List) and then periodically (say every 2 minutes) call the web service to pick up the latest data?
Then, when your autocomplete plugin asks for data, you return the cached values instead of hitting the web service each time.
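A minimal sketch of that caching idea in C#; FetchFromWebService is a hypothetical placeholder for the real web service call:

    using System;
    using System.Collections.Generic;

    // Server-side cache that refreshes from the web service at most every 2 minutes.
    public static class AutoCompleteCache
    {
        private static List<string> _items = new List<string>();
        private static DateTime _lastRefresh = DateTime.MinValue;
        private static readonly object _lock = new object();

        public static List<string> GetItems()
        {
            if (DateTime.UtcNow - _lastRefresh > TimeSpan.FromMinutes(2))
            {
                lock (_lock)
                {
                    // Re-check inside the lock so only one request refreshes.
                    if (DateTime.UtcNow - _lastRefresh > TimeSpan.FromMinutes(2))
                    {
                        _items = FetchFromWebService();
                        _lastRefresh = DateTime.UtcNow;
                    }
                }
            }
            return _items;
        }

        private static List<string> FetchFromWebService()
        {
            // Hypothetical placeholder: replace with the real web service call.
            return new List<string>();
        }
    }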
I've noticed that some sites store the hash map/list on another page and reference that page from the autocomplete function. The loading of the main page is therefore not impacted, and the autocomplete is extremely fast (practically instantaneous). You can also maintain that list at your leisure, once a minute/hour/day/month/year, completely independently of the user's experience.
I have a table of records with a column called processed. My web page uses sessions, because every user who opens the page should see different data from the table. Initially processed = 0; when I select data for a user I update the selected rows (where processed = 0) to processed = 2, so that if another user opens the page he gets different data. The problem is that if a user closes the page without doing anything, I need to set his rows back to processed = 0, but I can't handle an event on the close button of the page. I can't rely on log out either, because users may close the page without logging out. Does anyone have an idea how I can manage this?
Note that I'm using ASP.NET with VB.NET.
For what you need, don't rely on a browser action such as the close button (what if the internet connection goes down for one of the users? :) ).
The simple way to achieve this is to perform some kind of polling at both levels, i.e. in the DB (since your flag is stored in the DB) and in the code. Polling means repeatedly making a request for a certain operation at a fixed interval of time. Obviously polling is taxing, but it is one possible solution.
Another way is to create a long-lived HTTP request as soon as the user logs in, which neither party (client or server) breaks, and set the flag back to 0 as soon as it ends. Keeping a parallel long-lasting request alongside all the others is not simple, though. This technique is known as Comet.
So I would recommend making a constant AJAX request, say every 2 minutes, to update a certain field, say LastActive, in the user accounts table. This constitutes your code-side polling (a sketch of the server end follows below).
Then create a SQL Server job which constantly monitors this LastActive field and, if the difference between it and the current time is more than, say, 2 minutes 10 seconds, sets processed = 0 for that user.
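A minimal sketch of that ping endpoint in C#, assuming a UserAccounts table with UserId and LastActive columns, a connection string named Main, and the user id stored in session at login; all of these names are illustrative:

    using System;
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Web;
    using System.Web.SessionState;

    // Generic handler the client pings via AJAX every 2 minutes to prove it is alive.
    public class KeepAliveHandler : IHttpHandler, IRequiresSessionState
    {
        public void ProcessRequest(HttpContext context)
        {
            // Assumes the user id was stored in session at login.
            string userId = (string)context.Session["UserId"];
            string connStr = ConfigurationManager.ConnectionStrings["Main"].ConnectionString;

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "UPDATE UserAccounts SET LastActive = GETDATE() WHERE UserId = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", userId);
                conn.Open();
                cmd.ExecuteNonQuery();
            }

            context.Response.ContentType = "text/plain";
            context.Response.Write("ok");
        }

        public bool IsReusable { get { return true; } }
    }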
You can also look into the session-expiry settings (SessionExpire, or whatever it is called), if you are using ASP.NET Forms Authentication together with Session.
I have a table in SQL Server which contains more than 12,000 rows. When I load all the rows into an ASP.NET web page it takes 10 to 15 minutes.
Please help me load the data in seconds.
Which part of the process is taking the most time? There are many distinct stages to this action:
Query execution time in SQL Server.
Transfer time from SQL Server to your data reader (ADO.NET? across a network?)
Binding the data to the grid in ASP.NET.
Transferring the rendered HTML to the client.
Only when you know exactly what is slow can you properly optimise.
Solutions:
You can use gzip compression. It won't bring load times down to seconds by itself, but you will see good improvements in load time.
Write a handler which returns results in sets of 100 and display them asynchronously using jQuery, as on Facebook, where more stories are fed in as you scroll down. You can do a similar thing for your website, or have a "Load More" button which does the same thing, except that the user has to press the button (see the sketch after this list).
Note: Implementing the above is not that hard.
Alternatively, you could find a third-party grid control which has on-demand virtual paging support.
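A minimal sketch of such a paging handler in C#; GetRows is a hypothetical data-access helper standing in for a paged SQL query:

    using System.Web;
    using System.Web.Script.Serialization;

    // Returns one page of rows as JSON; the client asks for ?page=0, ?page=1, ...
    public class PagedDataHandler : IHttpHandler
    {
        private const int PageSize = 100;

        public void ProcessRequest(HttpContext context)
        {
            int page;
            int.TryParse(context.Request.QueryString["page"], out page);

            var rows = GetRows(page * PageSize, PageSize); // hypothetical helper
            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(rows));
        }

        private object GetRows(int offset, int count)
        {
            // Replace with a paged query, e.g. using ROW_NUMBER() or OFFSET/FETCH.
            return new object[0];
        }

        public bool IsReusable { get { return true; } }
    }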
I'm looking to build an AJAX reporting page. By default it loads today's report. The page has a calendar control, and when the user clicks a date, the gridview reloads with the corresponding data. Is it considered good practice to do the following:
1) on the first page load, query the data for the page
2) put the query result in the session object and display it in a gridview
3) if the user requests new data, get new data from the query with different parameters
4) put the result of the second query in the session object and display it
5) if the user then requests the data from the first query, get it from the session object
6) do the sorting and paging with the data held in the session.
Note: the data from each query will contain about 300-500 rows and about 15 columns. I'd like to do all this with AJAX calls. What are some suggestions, and what pitfalls should I avoid?
Thanks.
I would use Backbone.js:
Server produces report in JSON format.
Client has a Backbone.js Model for this report, which binds to the JSON endpoint.
Client renders the Report Model as a Backbone view.
Client reloads the report from server only when appropriate.
Reports from previously viewed days will still be around on the client as Backbone Model instances, so you don't need to reload from the server unless the user forces a refresh. I believe this is your main concern?
You're probably still in the realm of can-do-without-a-client-side-framework, but if you plan on building more of these pages or they get any more complex, the code can turn to spaghetti pretty quickly without something like Backbone.js.
PS. I just noticed this is .NET related. I know nothing about .NET so maybe there's a built-in client-side framework that can do something similar.
EDIT (updated after reading comment):
For server-side caching, I think either a denormalized report table in the DB or a separate dedicated cache store (e.g. memcache) is better practice than the session object.
It depends, though. If there were, say, one possible report per user per day, and you didn't have memcache set up, and you didn't want to use the DB for whatever reason, then it could make sense to store the report in the session object. However, if each day's report is the same for all users, you're now caching it N times instead of once. It can also be hard to invalidate from an external hook, and the user loses their cache when they log out.
So I would probably just use a typical get-or-set pattern: try to load the report from the cache first, and fall back to the DB. Then invalidate/update the cached report only when the user forces a refresh, or when the data used to create the report has changed. The AJAX call requests the report by date, or however a report is identified (a sketch follows below).
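A minimal get-or-set sketch in C# using the ASP.NET application cache; LoadReportFromDb is a hypothetical helper standing in for the real report query:

    using System;
    using System.Web;

    public static class ReportCache
    {
        // Try the cache first; fall back to the DB and cache the result.
        public static object GetReport(DateTime date)
        {
            string key = "report-" + date.ToString("yyyy-MM-dd");
            object report = HttpRuntime.Cache[key];
            if (report == null)
            {
                report = LoadReportFromDb(date); // hypothetical DB query
                HttpRuntime.Cache.Insert(key, report, null,
                    DateTime.UtcNow.AddHours(1),                    // absolute expiry
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }
            return report;
        }

        private static object LoadReportFromDb(DateTime date)
        {
            // Replace with the real query that builds the report.
            return new object();
        }
    }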
Since you are hoping to use the data in JavaScript AJAX scenarios, it would make the most sense to create an HTTP handler to query and return the needed result sets on demand.
Using the session object is not a solution, because it cannot be accessed asynchronously. As a result, your page would not be able to query this data to feed your JavaScript objects (unless you created an HTTP handler to send it back, but that would be pointless when you could just query the data in the HTTP handler directly).
You are forgetting about windows. A client isn't a window; a client is a browser, and it can contain many windows/tabs. You need to make sure you are rendering/feeding the correct window. Usually I handle this by submitting hidden values.
The problem is separating resuming a session from starting a new window.
I wouldn't bother holding more than one copy of the query in the session. The primary reason you'd want to hold it in the session is presumably to improve sorting/paging speed. Users expect those to be relatively fast, whereas choosing a new date is allowed to be slower. Plus, what's the likelihood that they'll really reload the first query?
The other answers point out good pitfalls of storing data in the session in general.
I have a web app that consumes a web service. The main page runs a search by passing parameters to a particular web service method, and I bind the results to a gridview.
I have implemented sorting and paging on the grid by putting the datatable that the grid is bound to in the viewstate, then reading/sorting/filtering it as necessary and rebinding to the grid.
Now that the amount of data coming back from the web service has increased dramatically, when I try to page or sort I receive the following errors:
The connection was reset
The connection to the server was reset while the page was loading.
I have searched around a bit, and it seems that a very large viewstate is to blame for this.
But surely the only other options are to:
Limit the results
Stick the datatable in the session rather than the viewstate
Something else I am unaware of
Previously I did keep the datatable in the session, as some of this data needed to persist from page to page (it was not being posted, so the viewstate was not an option). As the amount of data rose and the need to persist it went away, I moved to the viewstate instead, thinking it was a better option than the session because of the amount of data the session would have to hold and the number of users using the app.
It appears maybe not.
I thought that when the viewstate got very big, .NET would split it over more than one hidden viewstate field, but it seems all I'm getting is one mammoth viewstate that I have trouble even viewing in the source.
Can anyone enlighten me as to how to avoid this error, if it is indeed caused by the amount of data in the viewstate?
It sounds like you're caching the whole dataset for all pages even though you are only presenting one page of that data. I would change your pagination to require only the data for the current page the user is on.
If the query is heavy and you don't want to keep calling it over and over because there is a lot of paging back and forth (you should test the typical usage pattern), then I would implement some type of caching on the web service end, caching page by page (per user if the data is specific to a user) and expiring it rather quickly (e.g. after a few minutes).
I think you need to limit the total amount of data you're dealing with. Changing your code so it doesn't pass back extra data that might never be needed is a good place to start.
EDIT: Based on your comments:
You can't change the web service
The user can manipulate the query by filtering or sorting
There is a large amount of data returned by the web service
The data is user specific
Well, I think you have a perfect case for using the session then. This can tax the server when there are large numbers of users and large amounts of data, so you might want to implement some logic to clear the data from the session rather than waiting for it to expire (for example, on certain landing pages you know the user will visit when they are done, clear the session data).
You really want to get it out of the viewstate, because it is a huge bandwidth hog. Just look at your physical page size: that data is being passed back and forth with every action. Moving it to the session would eliminate that bandwidth usage and still allow you to do everything you need.
You could also look at the data the web service brings back and store it in a custom object that you make as 'thin' as possible. If you store a DataSet or a DataTable in your session, those objects carry extra overhead you probably don't need, so store the data as an array of some custom thin object and bind to that. You would need to map the result from the web service to your custom object, but this is a good way to cut down on memory usage (see the sketch below).
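A minimal sketch of that mapping in C#; the column names (Id, Name, Price) are illustrative, not taken from your actual web service:

    using System;
    using System.Collections.Generic;
    using System.Data;

    // A 'thin' object holding only the columns the grid actually needs.
    public class ResultRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    public static class ResultMapper
    {
        // Map the heavy DataTable to a lightweight list before putting it in session.
        public static List<ResultRow> ToThinList(DataTable table)
        {
            var list = new List<ResultRow>(table.Rows.Count);
            foreach (DataRow row in table.Rows)
            {
                list.Add(new ResultRow
                {
                    Id = (int)row["Id"],
                    Name = (string)row["Name"],
                    Price = (decimal)row["Price"]
                });
            }
            return list;
        }
    }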
Let me know if there is something else I am missing.
I wouldn't put the data in either the viewstate or the session. Instead, store the bare minimum information needed to re-request the dataset from the web service (in the viewstate, the session, or even on the URL), then call the web service with that information and re-fetch the data on each request. If necessary, look at some form of caching (e.g. memcache) to improve performance.
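A minimal sketch of the idea in C#; SearchCriteria, the field names, and the searchService proxy are all illustrative:

    using System;

    // Store only what is needed to reproduce the query, not the results themselves:
    // a few dozen bytes in session/viewstate instead of a whole datatable.
    [Serializable]
    public class SearchCriteria
    {
        public string Keyword { get; set; }
        public string SortColumn { get; set; }
        public int Page { get; set; }
    }

    // In the page, rebuild the grid from the stored criteria on every request:
    //
    //   var criteria = (SearchCriteria)Session["criteria"];
    //   var results = searchService.Search(criteria.Keyword,     // hypothetical
    //                                      criteria.SortColumn,  // web service proxy
    //                                      criteria.Page);
    //   grid.DataSource = results;
    //   grid.DataBind();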
I have a couple of customer websites that have been in use for 4+ years. One of the pages on both sites contains drop-downs that now hold 3000+ items. My initial attempts at a solution were to add new pages to both sites, where one site uses Silverlight and the other uses AJAX.
The Silverlight page currently performs better than the AJAX page once the control has loaded, but it requires the user to have Silverlight or the permissions to install it. The AJAX version still requires an initial download of all of the data to populate the drop-downs when the site first loads.
The AJAX version also still uses the viewstate and sends 400k+ to the server on every request.
What I would like to do is use the ajax version but cache the drop down data on the client and only download the data once a day.
If I save the data using ASP.NET to isolated storage (I have that part sorted), is it possible to access it using client-side code such as jQuery?
Personally, there's no way I would use a drop-down for 3000+ items. Not only is there a problem with data transfer and viewstate, but it's also a pain for any user to scroll through that many items to find the option they require.
Have you considered something like this:
http://jquery.bassistance.de/autocomplete/demo/
You have a textbox that says something like 'start typing'... the second the user types the first letter of what they are looking for, an ajax query is made grabbing all the entries that begin with that letter.
Given that there are 26 letters in the alphabet, each query will return roughly 4% of the data on average, i.e. about 120 entries rather than 3000! Also, as users get to know your system, they can type more and more letters and find what they are looking for much faster. It beats scrolling through a list of 3000 entries, and it makes your application easier to use, more responsive, and more network-friendly! A server-side sketch follows below.
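A minimal sketch of the server side of that idea in C#; GetAllEntries is a hypothetical helper that would return the full (ideally cached) list:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;
    using System.Web.Script.Serialization;

    // Returns only the entries starting with the typed prefix, e.g. ?q=ab
    public class AutoCompleteHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string prefix = context.Request.QueryString["q"] ?? string.Empty;

            var matches = GetAllEntries() // hypothetical cached list of all 3000+ items
                .Where(e => e.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
                .Take(20) // cap the payload; the user can keep typing to narrow further
                .ToList();

            context.Response.ContentType = "application/json";
            context.Response.Write(new JavaScriptSerializer().Serialize(matches));
        }

        private IEnumerable<string> GetAllEntries()
        {
            // Replace with the real (ideally cached) data source.
            return new List<string>();
        }

        public bool IsReusable { get { return true; } }
    }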