AJAX: loading data in the background or creating a new SQL table - ASP.NET

I am making a website where users can see uploaded photos.
On any page, the user can click a button to "see three more photos", up to a limit of 100 photos per page.
Basically, I can use AJAX to read the next 3 rows from the photos table and send them to the user, but while the user is browsing the site, the DB has most likely already been updated and changed.
I have come up with two solutions and I can't decide between the two:
1) Use AJAX to load the first 100 items (just the photo URLs) into an array in the background, and use JavaScript to display them on the page when the user clicks "see three more".
2) Create a new temporary table in my DB containing the first 100 rows, so I can query it with AJAX every time, knowing it will not be changed by another user uploading a new photo.
The goal is to find an approach that is not too time-consuming for the server (in case there are a lot of requests and many temporary tables have to be created), and also not too heavy or bandwidth-consuming for the user (I don't want to load 10 MB of stuff into the client's browser "just in case" he pushes a button).
Which option would be the better one?
Is there a third option I'm not thinking about?
Thanks guys!

I would say that creating temporary tables isn't a good solution for the reasons you have already mentioned; you'll be doing a lot of cleanup work and you're not really solving your other problem, which is heavy bandwidth consumption.
I think the problem is your user experience expectation, which I'm not really clear on.
So - I'm a user of your site, I go to a page and see some (how many?) photos - with a button that says "see 3 more". I click it and see the next 3 photos, loaded using AJAX.
Does it matter if 50 new photos are added between each time I click "see 3 more"? Surely I want to see the next 3, not the newest 3?
Loading "the next 3 via AJAX" doesn't mean you have to preload the 100 URLs into a client-side array; just keep the most recent image ID or most recent date/time client-side, and pass this to your server-side function in "see 3 more". It should be straightforward enough to return the next most recent 3 URLs (if there are any) and then update your "current" image ID.
Does that help?

Related

Adobe Target, placing user in experience based on URL contains

If I have an Adobe Target experience that shows content in Experience A to 50% of users and content in Experience B to the other 50% of users...how can I insert someone into one of these two experiences?
I was thinking of having a button the user can click that has a URL parameter added to it, for example ?exp1, and then a different button that would have ?exp2.
But if I use the refinement 'URL contains exp1 or exp2' in each of the experiences in Target, then that is the only time the mbox will fire, whereas I want them to fire on the original page that the mbox is on.
Any help is greatly appreciated...thank you all!
Adobe Target will serve up your two experiences without having a user click a button. You can have Experience A hard-coded on your site. Then, when you go to make your Target A/B test, just enter the URL in the first pop-up box that asks for the activity URL.
Then on the next page, Experience A should be what is hard-coded and live on your site. Select Experience B and code up your second experience. When done, you will select your audience - most likely all visitors - and then make sure you're set to a 50/50 split.
This way a visitor will automatically be shown either A or B when they come to your site. The Target mbox fires when the page loads and automatically decides who is shown what. One interesting quirk with Adobe Target is that it doesn't send one visitor to A, the next to B, the next to A, and so on, as you might expect; sometimes it sends a bunch to A back to back before sending some to B. It works out to a 50/50 split over time. Also, the first 24 hours of data may look a little funky, as there is sometimes latency in data processing. Hope this helps.

Show a list of pages that are currently open

I have a task to list all pages that are open at a given moment and show how many people are on each page.
I am looking for a way to make that happen without keeping any DB records or saving information to a text file or something like that. (Not necessarily, then. Of course I am going to save that info to a DB; I just wanted to know the logic of catching opened page addresses.)
I can of course keep track of every page that has been opened up to that time, but I want the page address to appear in the list when someone opens that address and disappear when no user is browsing it any more.
Can you give me some ideas how to make that happen using ASP.NET?
Note: I am using web forms with asp.net 4.5
Thanks!
"I just wanted to the logic of catching opened page addresses"
Use JavaScript in a timed loop (on load and then every 30 seconds, perhaps) on every page to asynchronously post to a page on your server, sending information that identifies the current page. This will give you a good idea of how many people are currently on each page.
Store this information in a DB in your code-behind and use it to report as you wish.
Of course, if a user leaves their browser open on one of these pages or opens another tab, it will still be reported as 'open'.
To get the current URL in JavaScript you can use:
var pathname = window.location.pathname;
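Putting the two together, a rough sketch of that timed loop, assuming jQuery and a made-up TrackPage.ashx handler on the server:

    function reportPageOpen() {
        // 'TrackPage.ashx' is a made-up handler name; it just records
        // "this path was reported open at this time" server-side.
        $.post('TrackPage.ashx', { page: window.location.pathname });
    }

    $(function () {
        reportPageOpen();                    // once on load...
        setInterval(reportPageOpen, 30000);  // ...and then every 30 seconds
    });

Pages whose last report is older than, say, a minute can then be treated as closed when you build the list.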
In Google Analytics you can see which pages are being used in near-real time.
Why not use that to solve this issue - it's easy to set up.

VB.Net application - display a message to the user whilst the application is starting up

I have recently created an application where a lot of data is loaded into objects when the application starts up, and other data as it is required. For example if the user requests the catalogue page then it will load all the top level category data into objects of type Category. This will then stay there to be used by other users (who will therefore not have to load this data into objects) and can be altered by admin if they happen to login during the same application instance. I know this is not the most efficient solution, as pointed out below, but it works and the page load, at the moment, is not too long. It is very quick if most of the required data is already loaded into objects. It is also tailored to the business' needs - unlike other techniques such as Linq-to-SQL.
The problem I am facing is when a page is requested that requires lots of data about different types of object to be displayed. For example, when a catalogue page is requested that displays information on a product which can be bought, it then loads all the products and categories (as the products reference the Category object, not just the category name).
I would like to display a loading symbol with a message whilst all this data is being loaded into objects, so the user knows it's not just stuck in a loop or anything. Is there any way to do this? I am open to using JS / jQuery if I need to.
Thanks in advance.
Regards,
Richard
PS: I am working on ways to make it more efficient, such as using HashTables or HashMaps. However, this is taking time, as there are so many different types of item (News, Events, Catalogue Item - Range, Collection, Design, RangeCollection, CollectionDesign, RangeCollectionDesign and RangeDesign - Users, PageViews, and the list goes on).
Please correct me if I'm wrong, but I do believe that JavaScript is required in order to display a "loading" image... Using server-side scripting alone would typically require an entire page load after all the content loads, unless you want to start messing with IFrames.
This is a job for AJAX. A common solution to your problem is to have a small page that displays a loading icon. The page has some JavaScript that makes additional HTTP requests to the server to download the rest of the page. jQuery has a "$.ajax" method that is designed to simplify this process.
I would suggest looking at the .ajax method in the jQuery documentation. Unfortunately, it can be a rather delicate process to get all the scripting code right, and it takes a while to learn it all.
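As a rough illustration of that pattern (the endpoint name and element IDs are made up for the example), the initial page holds only the spinner and an empty container, and the heavy content is pulled in afterwards:

    $(function () {
        $('#loading').show();                  // spinner markup already on the page
        $.ajax({
            url: 'LoadCatalogue.aspx',         // made-up endpoint returning the catalogue markup
            dataType: 'html',
            success: function (html) {
                $('#catalogue').html(html);    // swap the heavy content in once it arrives
            },
            error: function () {
                $('#catalogue').text('Sorry, the catalogue could not be loaded.');
            },
            complete: function () {
                $('#loading').hide();          // hide the spinner either way
            }
        });
    });

The user sees the loading message immediately, while the slow object-loading work happens during the second request.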

ASP.NET/Javascript: Loading huge data in browser

I have a GUI that shows, let's say, Customer Orders. When my client nailed down the requirements, he asked me to keep pagination like this:
Show Items Per Page : 10/50/150
For each customer there could be thousands of orders, and each order has at least 50 attributes to show on the screen. So assume a 50-column HTML table with 2000 or 3000 records, spanning multiple database tables (anyway, that is a different story).
Things were a breeze until yesterday; now my client has come up with a new change request, in which he specified Show Items like this:
Show Items Per Page : 10/50/150/All
Yes, he wants to see 2000 or 3000 records just by selecting the "All" option. Internally this is not a big change - I would just go back and remove the filters I apply on row count, etc. - but when it is loaded in the GUI it really sucks... the view state is huge, and so on.
I know this is a standard problem. How do you guys deal with it? I cannot convince my client to remove this "All" option; he is stuck on it. (The reason is simple: he has a big 42" screen where he can easily see 1000 items on one page.)
I also tried using JavaScript to prepare the DOM in an AJAX call... but still, inserting 2000 TDs is really slow.
Any help is greatly appreciated.
Some Extra Info
This is an intranet application, or else it is accessed through a VPN connection.
This problem is about browser performance.
I suppose you can do two things:
1) You can use <div> instead of <table> (this is possible with CSS), because the browser does not render a table until its closing tag. The page will still take a long time to load, but it will render the first results faster.
2) If you use AJAX + JSON and build every <tr> piece by piece, you can construct the whole thing in memory and only then put it into the DOM. That will be faster because the browser will not re-render every time you add another row.
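For point 2, a minimal sketch assuming jQuery and a made-up Orders.ashx endpoint; the parameter and field names (customerId, orderId, orderDate) are assumptions about the JSON shape:

    $.getJSON('Orders.ashx', { customerId: 123 }, function (rows) {
        var html = [];
        $.each(rows, function (i, row) {
            html.push('<tr><td>' + row.orderId + '</td><td>' + row.orderDate + '</td></tr>');
        });
        // one single DOM update instead of 2000-3000 individual appends
        $('#ordersTable tbody').html(html.join(''));
    });

Building the markup as an array of strings and joining it once keeps both the string handling and the DOM work cheap.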
If you want, you can load the data in a sort of installments - somewhat like pagination, though not quite pagination to be precise. You can label your installments/pages with a proper ID and load them one after another via AJAX calls. You can even show a progress bar to indicate how much data has actually loaded. Append this data to the table you are displaying the data in. I would not go about using server controls for this... you have to handle it via JavaScript or jQuery.
You might want to append table rows incrementally.
When the client scrolls close to the bottom of the page, fire an AJAX call, return the next page and render it.
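A rough sketch of that scroll-triggered load, assuming jQuery and a made-up Orders.ashx endpoint that returns the next page of rows as HTML:

    var nextPage = 2;       // page 1 is assumed to be rendered by the server
    var loading = false;

    $(window).scroll(function () {
        var nearBottom = $(window).scrollTop() + $(window).height()
                             > $(document).height() - 200;
        if (nearBottom && !loading) {
            loading = true;
            $.get('Orders.ashx', { page: nextPage }, function (rowsHtml) {
                $('#ordersTable tbody').append(rowsHtml);
                nextPage++;
                loading = false;
            });
        }
    });
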
But the best solution would be to convince your client that this is not how web applications work. We had a similar situation - a pure nightmare.
Instead of an ASP.NET GridView, you'd be better off using a DataRepeater.
Better yet, if you are not constrained by technology, you can use Microsoft Ajax Preview 4 with WCF REST Services. You would just need to find some hacks to "stream" data from the service and display it.
There is also jQuery Grid (if you don't want to use Microsoft Ajax Preview 4), which supports JSON serialization.

Architecture question involving search & session state

I have a grid with several thousand rows that can be filtered and sorted. On each row you can click a details button, which brings you to a new page with detailed information about that row. Because it is a button, you can't middle-click or right-click to open it in a new tab. In addition, when clicking back you lose your filters and search results.
To solve this problem, I considered the following: switch the buttons to links, and when filtering and searching, use GET instead of POST requests. This way you could open the detail pages with a right-click or middle-click, and if you followed a link normally, back would work properly.
This change was not made, however. We were asked to add a 'next result / previous result' set of buttons on the details page that would allow you to navigate. While not an elegant solution, it would at least work.
I proposed adding querystring parameters to the details page that would regenerate the search query based on the filters and allow you to get the next and previous results in code.
A team member took issue with this solution. He believes it is a waste of server resources to re-query the database. Instead, a solution was proposed to add a session variable that holds the list of results, which you could then use to navigate.
I took issue with that because you can't have multiple tabs open without breaking navigation, and new results aren't appended to the list in real time. Also, if you're worried about optimization, session would be the last thing to use, since it eats memory and prevents server replication... unless you store the results back in the database.
What's the best solution?
Session doesn't sound like a winner; it won't scale with lots of users.
Hitting the database repeatedly does seem unnecessary, but it depends on the cost - how many users, how often would they refresh/filter and what is the cost of that query?
If you do use querystrings you could cache the pages by parameter.
What about some AJAX code on that button to retrieve the details - leave the underlying grid in place and display the details in a div/panel or a new window/tab?
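A minimal sketch of that last idea, assuming jQuery, detail links carrying a data-id attribute, and a made-up OrderDetails.aspx endpoint:

    // the grid, its filters and the current results all stay on the page;
    // only the details panel changes.
    $('#resultsGrid').on('click', 'a.details', function (e) {
        e.preventDefault();
        $('#detailsPanel').load('OrderDetails.aspx', { id: $(this).data('id') });
    });

Because the grid never leaves the page, the back-button and multiple-tab problems largely disappear, and no session state is needed.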
