Autogenerate List of Links - asp.net

I have two tables, one for Regions and one for those regions' Areas. I want to automatically create a list of links like:
Region1
a.Area1
b.Area2
c.Area3
Region2
a.Area1
etc.
This list should be generated automatically when my page loads.
The items in the list are not just text: when I click Area1, I will call a function, passing the Area1 and Region1 IDs, to do some action. Keep in mind that the user might click Area1 under Region1, or might just click Region1 itself.
I am using ASP.NET 3.5 and VB.NET.

I think the reason your question is not getting answered is that it involves too many fundamentals. If someone writes all this for you from scratch, you'll have difficulty taking it any further. I would recommend looking at some ASP.NET beginners' articles: work out how to write code that reacts to control events (e.g. LinkButton clicks), how to query a database, how to dynamically populate a page, etc. Then, if you have any more specific questions, ask again.
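That said, to give a sense of the pieces involved (a rough sketch, not a drop-in solution), here is roughly what it could look like. It is C# rather than VB.NET, and the table and column names (Regions, Areas, RegionId, AreaId, Name), the connection string and the linksPlaceHolder control are all assumptions you would replace with your own:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class RegionLinks : Page
    {
        private const string connectionString = "..."; // your connection string here

        // Dynamic controls must be recreated on every request (not just the first
        // load), otherwise their Command events will not fire on postback.
        protected void Page_Init(object sender, EventArgs e)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                var regions = new DataTable();
                new SqlDataAdapter("SELECT RegionId, Name FROM Regions", conn).Fill(regions);

                foreach (DataRow region in regions.Rows)
                {
                    AddLink(region["Name"].ToString(), "region", region["RegionId"].ToString());

                    var areaCmd = new SqlCommand("SELECT AreaId, Name FROM Areas WHERE RegionId = @id", conn);
                    areaCmd.Parameters.AddWithValue("@id", region["RegionId"]);
                    var areas = new DataTable();
                    new SqlDataAdapter(areaCmd).Fill(areas);

                    foreach (DataRow area in areas.Rows)
                    {
                        // Pack both IDs so the handler knows the region and the area.
                        AddLink("-- " + area["Name"], "area", region["RegionId"] + ";" + area["AreaId"]);
                    }
                }
            }
        }

        private void AddLink(string text, string commandName, string commandArgument)
        {
            var link = new LinkButton { Text = text, CommandName = commandName, CommandArgument = commandArgument };
            link.Command += Link_Command;
            linksPlaceHolder.Controls.Add(link);
            linksPlaceHolder.Controls.Add(new LiteralControl("<br />"));
        }

        private void Link_Command(object sender, CommandEventArgs e)
        {
            if (e.CommandName == "region")
            {
                string regionId = (string)e.CommandArgument;
                // ... do something with the whole region
            }
            else
            {
                string[] ids = ((string)e.CommandArgument).Split(';');
                // ids[0] is the RegionId, ids[1] is the AreaId
            }
        }
    }

The key detail is that the links are rebuilt in Page_Init on every request; if you only build them on the first load, the clicks will post back but the handlers will never run.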

Related

Error while trying to publish an Infopath 2013 form to Sharepoint 2013 Document Library

I am a newbie in InfoPath and SharePoint. I am trying to create a form in InfoPath 2013 and publish it as a document library to SharePoint. I have some 60 fields that need to be added together into another field. When I tried to use the Design Checker, it threw the error shown in the screenshot below, but it accepts the formula if I key in only 45 fields in the Insert Formula text area. Is there any limitation on the number of fields that can be entered in Insert Formula? When I use PREVIEW in InfoPath it works fine; this error pops up only when I try to publish to SharePoint. Any ideas on how to resolve this? Thanks in advance.
InfoPath preview is rendered with the InfoPath Filler. The browser experience has always been different, and the Filler preview is not a reliable check for the browser experience. You may have hit the limits of what a browser form can do. I don't have the numbers or limits, though.
Looking at the error message, you seem to be amassing an awful lot of calculations in one single field. My gut feeling is that this is very bad information architecture. What is the purpose of the form? What are you trying to achieve? Why would anyone have 60 fields in a form?
It looks as if you are summing a large number of cells. InfoPath is not a spreadsheet.
Use repeating tables to capture similar data. Then you can total the table entries with a standard IP function.
This looks like a sum of all the items a restaurant has on the menu. This is a perfect case for a repeating table. Don't use all 60 items on the menu in a list of 60 fields all in one form. That is overkill and not user-friendly. Create a repeating table structure where the user selects one of the 60 items and enters the transaction data. Each row of the repeating table can have another item of the list of 60. The grand total will be calculated from the entries.
If that is not viable, use helper fields to calculate sub totals by item category, and create a grand total from all the category totals.

How to get the back button and session state to work together?

There may be an easy answer to this and I just don't see it because I am too close to the project - so be it.
I have an ASP.NET 2.0 search application. It is a series of pages that starts with a search form and ends with the results being displayed to the user. Between the search and results pages is a filter page that presents a series of filters the user can apply to narrow down the search results. I execute the initial search on the search page and store the results in the session. If the initial search results total more than a certain number (let us say 50 for the sake of the example), the user is taken to the filter page. There they are presented with a number of filters they can apply to the results.
Once the selected filters have been applied to the search results, if the count is still more than 50 they stay on the filter page, with only the filters they have not yet selected displayed. If the count is less than 50, they are taken to the results page. If they are on the filter page but wish to see the results anyway, there is a button that takes the current state of the results and sends the user to the results page.
Here is my problem: if I am on the results page after applying some filters and click the back button (none of the pages cache), how can I get the previous state of the search results, BEFORE whatever filters I selected had been applied? Even further, if I got to the results page after a series of, say, 4 "apply filter" steps (apply filter - still over 50, apply another filter - still over 50, apply yet another filter - still over 50, and finally apply another filter - yay! under 50, go to the results page), how do I get each version of the ever-shrinking search results from the session as I keep hitting the back button?
Sorry if this is a bit weird and not that easy to understand - this is one of those problems that is not simple enough to wrap up in a few sentences.
I am most eager for any thoughts (pertaining to the question at hand) or questions.
UPDATE -
FYI, I did not decide on the multi-page design. A requirement stated that it follow the flow of an existing third-party search app (reverse engineering is wonderful, right?).
Thanks
Not to be too critical, but what you described sounds like a seriously jacked up way of handling search.
Typically, your search criteria and results are on the same page. When you modify the criteria and click search, you should just display the top 50 results and let the user know there are more. This can't be any more expensive an operation than what you've described, because you have to run the queries in both circumstances anyway.
Take a look at NewEgg.com and try their "advanced search" from this page. You'll notice there are about 20 different criteria factors on the left. As you add a new criteria, the bread crumb at the top of the page changes. They have a little (x) next to each in the bread crumb so you can quickly eliminate any criteria from your search results. Voila no back button needed.
Note that at no point do you need session state to handle this. At most you could use hidden form fields which would still support back button usage in the browser, if they really wanted to.
Use your current session-based parameters as is, but let any query-string parameter override them. This way you keep your values without appending them to every URL, yet still adjust to any previous manual data entry (query strings in the browser history).
(And don't use POST for search.)
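A minimal sketch of that idea, assuming page code-behind and a hypothetical "category" parameter and filters.aspx page: the query string wins, the session is only the fallback, and each filter step redirects to a URL that carries its own state:

    // Hypothetical helper: query string overrides, session is the fallback.
    private string GetSearchParam(string key)
    {
        string fromUrl = Request.QueryString[key];
        if (!string.IsNullOrEmpty(fromUrl))
        {
            Session[key] = fromUrl;        // keep the session in step with the URL
            return fromUrl;
        }
        return Session[key] as string;     // fall back to the last known value
    }

    // When a filter is applied, bake the new state into the URL you redirect to,
    // so each entry in the browser history describes one filter state.
    private void ApplyFilter(string category)
    {
        Response.Redirect("filters.aspx?category=" + Server.UrlEncode(category));
    }

That way Back naturally re-creates the previous combination of filters instead of depending on a single, constantly overwritten session object.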

Search through Drop down list in asp.net

I am making a page in ASP.NET using C#. My page consists of two drop-down lists, one for state and one for subjects; both are populated from the database. There is a third database table for storing user details, and that table consists of name, class and subject.
Now my problem is that on the click of the search button I want to search the users table for the values the person selected in the drop-down lists on the master page, and display the result on the next page.
Please help with what to do in the search button handler, and what to write on the second page where I want to display the results.
Your help is badly needed
Thanks in advance.
This sounds like a homework assignment... Have you tried to solve this on your own already? What did you try, and what, if any, problems did you run into?
Here's one of a thousand tutorials that will walk you through filtering a results page using a drop-down list.
http://msdn.microsoft.com/en-us/library/aa581789.aspx
Good luck!
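In case it helps to see the shape of it, here is a rough sketch assuming the button and drop-down lists live together on the master page, and using hypothetical control, page and column names (StateDropDownList, SubjectDropDownList, Results.aspx, ResultsGridView, Users): the button packs the selections into the query string, and the results page reads them back and queries the table with parameters.

    // On the page/master page with the search button:
    protected void SearchButton_Click(object sender, EventArgs e)
    {
        Response.Redirect("Results.aspx?state=" + Server.UrlEncode(StateDropDownList.SelectedValue)
                        + "&subject=" + Server.UrlEncode(SubjectDropDownList.SelectedValue));
    }

    // In the Results.aspx code-behind: read the values back and query the users table.
    protected void Page_Load(object sender, EventArgs e)
    {
        string state = Request.QueryString["state"];
        string subject = Request.QueryString["subject"];

        using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
        {
            var cmd = new System.Data.SqlClient.SqlCommand(
                "SELECT Name, Class, Subject FROM Users WHERE Subject = @subject", conn);
            cmd.Parameters.AddWithValue("@subject", subject);
            // add a State condition here too if your users table stores it

            var results = new System.Data.DataTable();
            new System.Data.SqlClient.SqlDataAdapter(cmd).Fill(results);
            ResultsGridView.DataSource = results;
            ResultsGridView.DataBind();
        }
    }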

Dynamic Data Web Site is unusable due to slowness

I have created a small "Dynamic Data Web Site" using the Entity Framework. I've no experience with this really, but it looks very interesting. Anyway, I have a single table being displayed on a single web page. The table contains over 21000 rows and the page limits me to 10 records per page, which is all fine.
My problem is that the page is incredibly slow. I'm guessing that maybe every row in the table is being loaded whenever I try to navigate, but I can't be sure this is the cause.
How can I increase the performance of the page? I want to be able to click through pages of results quickly and easily. It currently takes more than 60 seconds to click to the next set of results.
This is usually caused by filters on a table where the filter has MANY rows. You could fix this using the Autocomplete filter, which pre-filters the data based on what the user types in.
You can get this filter and others from my NuGet package, Dynamic Data Custom Filters.
Also try having a look at it using Ayende's EFProf. It is a commercial product but it has a free 30-day trial. It can sometimes point out silly things you are doing and suggest some ways to optimise your data access.
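Independent of that package, it is also worth confirming that paging actually happens in the database rather than in memory. A minimal LINQ to Entities sketch (the context and entity names MyEntities/Record/RecordId are made up, and it needs "using System.Linq;" and "using System.Collections.Generic;"):

    // Just a general check that the paging is translated into SQL,
    // so only one page of rows is fetched per request.
    private List<Record> GetPage(int pageIndex, int pageSize)
    {
        using (var context = new MyEntities())
        {
            return context.Records
                          .OrderBy(r => r.RecordId)     // OrderBy is required before Skip/Take
                          .Skip(pageIndex * pageSize)   // executed in the database,
                          .Take(pageSize)               // not on 21000 in-memory rows
                          .ToList();
        }
    }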

How to provide multiple search functionality in website?

I am developing a web application, in which I have the following type of search functionality;
Normal search: the user enters a search keyword to search the records.
Popular: this is not exactly a search; it displays the popular records on the website, much as Digg and other social bookmarking sites do.
Recent: here I display the recently added records on my website.
City Search: here I present city names to the user, like "Delhi", "Mumbai" etc., and when the user clicks a city link, all records from that particular city are displayed.
Tag Search: same as the city search, I have tag links; when the user clicks a tag, all records marked with that tag are displayed.
Alphabet Search: same as city and tag, this functionality has links for letters like "A", "B", etc., and when the user clicks a letter, all records starting with that particular letter are displayed.
Now, my problem is that I have to provide the searches listed above, but I cannot decide whether to go with a single page (result.aspx) that displays the records for every kind of search, working out from the query string which search the user is using and what data to display. For example, if I am searching for the city Delhi and the tag delhi-hotels, the URLs would be:
For City: www.example.com/result.aspx?search_type=city&city_name=delhi
For Tags: www.example.com/result.aspx?search_type=tag&tag_name=delhi-hotels
For Normal Search: www.example.com/result.aspx?search_type=normal&q=delhi+hotels+and+bar&filter=hotlsOnly
Now, I feel the above idea of using a single page for all searches is messy, so I thought of a cleaner idea: using a separate page for each type of search, as in:
For City: www.example.com/city.aspx?name=delhi
For Tags: www.example.com/tag.aspx?name=delhi-hotels
For Normal Search: www.example.com/result.aspx?q=delhi+hotels+and+bar&filter=hotlsOnly
For Recent: www.example.com/recent.aspx
For Popular: www.example.com/popular.aspx
My new idea is cleaner, and it tells the user specifically which page is for what; it also gives him an idea of where he is now and what records he's seeing. But the new idea has one problem: if I have to change anything in the way search results are displayed, I have to make the change in every page, one by one. I thought of a solution for that problem too: using a user control inside a Repeater control, I'll pass my values one by one to the user control to render the HTML for each record.
Everything is fine with the new idea, but I still cannot decide which idea to go with. Can anyone share their thoughts on this problem?
I want to implement an idea which will be easy to maintain, SEO-friendly (gives my website a good ranking) and user-friendly (easy for users to use and understand).
Thanks.
One thing to mention on the SEO front:
As a lot of the "results" pages will be linking through to the same content, there are a couple of advantages to appearing* to have different URLs for these pages:
Some search engines get cross if you appear to have duplicate content on the site, or if there's the possibility of almost infinite lists.
Analysing traffic flow.
So, for point 1, as an example, you'll notice that SO has numerous ways of finding questions, including:
On the home page
Through /questions
Through /tags
Through /unanswered
Through /feeds
Through /search
If you take a look at the robots.txt for SO, you'll see that spiders are not allowed to visit (among other things):
Disallow: /tags
Disallow: /unanswered
Disallow: /search
Disallow: /feeds
Disallow: /questions/tagged
So the search engine should only find one route to the content rather than three or four.
Having them all go through the same page doesn't allow you to filter like this. Ideally you want the search engine to index the list of Cities and Tags, but you only need it to index the actual details once - say from the A to Z list.
For point 2, when analysing your site traffic it will be a lot easier to see how people are using your site if the URLs are meaningful and the results aren't hidden in the form header - many decent stats packages allow you to report on query-string values, and if you have "nice" URLs this is even easier. Having this sort of information will also make selling advertising easier, if that's what you're interested in.
Finally, as I mentioned in the comments to other responses, users may well want to bookmark a particular search - having the query baked into the URL one way or another (query strings or rewritten URLs) is the simplest way to allow this.
*I say "appearing" because as others have pointed out, URL rewriting would enable this without actually having different pages on the server.
There are a few issues that need to be addressed to properly answer your question:
You do not necessarily need to redirect to the result page before being able to process the data. The page or control that contains the search interface could, on submission, process the submitted search parameters (and the type of search) and initiate the call to the database or intermediary web service that supplies the search results. You could then use a single results page to display the retrieved data.
If you must pass the submitted search parameters via the query string to the result page, then you would be much better off using a single result page which parses those parameters and displays the results conditionally (a minimal sketch follows these points).
Most users do not rely on the url/querystring information in the browser's address bar to identify their current location in a website. You should have something more visually indicative (such as a Breadcrumbs control or header labels) to indicate current location. Also, as you mentioned, the maintainability issue is quite significant here.
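To illustrate the second point, here is a minimal sketch of that single result.aspx dispatching on the query string. The search_type values mirror the query strings from the question; BindResults and the SearchBy* methods are hypothetical stand-ins for your data access:

    protected void Page_Load(object sender, EventArgs e)
    {
        string searchType = Request.QueryString["search_type"] ?? "normal";

        switch (searchType)
        {
            case "city":
                BindResults(SearchByCity(Request.QueryString["city_name"]));
                break;
            case "tag":
                BindResults(SearchByTag(Request.QueryString["tag_name"]));
                break;
            default:
                BindResults(SearchByKeyword(Request.QueryString["q"],
                                            Request.QueryString["filter"]));
                break;
        }
    }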
I would definitely not recommend the second option (using separate result pages for each kind of search). If you are concerned about SEO, use URL rewriting to construct URL "slugs" to create more intuitive paths.
I would stick with the original result.aspx result page. My reasoning for this from a user point of view is that the actual URL itself communicates little information. You would be better off creating visual cues on the page that states stuff like "Search for X in Category Y with Tags Z".
As for coding and maintenance, since everything is so similar besides the category it would be wise to just keep it in one tight little package. Breaking it out as you proposed with your second idea just complicates something that doesn't need to be complicated.
Ditch the querystrings and use URL rewriting to handle your "sections".. much better SEO and clearer from a bookmark/user readability standpoint.
City: www.example.com/city/delhi/
Tag: www.example.com/tag/delhi-hotels/
Recent: www.example.com/recent/
Popular: www.example.com/popular/
Regular search can just go to www.example.com/search.aspx or something.
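If you are on ASP.NET 3.5 without a rewriting module, a bare-bones sketch of the idea in Global.asax might look like this (hand-rolled parsing purely for illustration; depending on your IIS version, extensionless URLs may also need a wildcard mapping to reach ASP.NET):

    // Map the friendly URLs above back onto the existing result.aspx query strings.
    // The address bar keeps the friendly URL; only the internal path is rewritten.
    void Application_BeginRequest(object sender, EventArgs e)
    {
        string path = Request.Path.ToLowerInvariant();   // e.g. /city/delhi/

        var match = System.Text.RegularExpressions.Regex.Match(path, @"^/(city|tag)/([^/]+)/?$");
        if (match.Success)
        {
            string type = match.Groups[1].Value;          // "city" or "tag"
            string name = match.Groups[2].Value;          // "delhi", "delhi-hotels"

            Context.RewritePath("~/result.aspx", null,
                                "search_type=" + type + "&" + type + "_name=" + name);
        }
    }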
