I've been learning about AJAX and jQuery to build web applications, and now I see how powerful these tools are. Because of this, some questions came up about the network traffic generated by standard ASP.NET applications that don't use those techniques.
It is known that every control with the runat="server" attribute set stores its current values in the view state, which is encoded and placed inside a hidden input in the response sent to the user.
However, every little action on the page triggers a post to the server, sending back the entire page's values. Depending on the complexity of the page, this can hurt the application because it generates a lot of unnecessary traffic.
An example: I've built a page that is about 155 KB rendered (62 KB of which is just the view state). So every post to the page returns a newly rendered page of similar size, even if its contents haven't changed. In an intranet environment this seems like nothing, but on the web it would be inappropriate.
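To put a number on that overhead, here is a small sketch in plain JavaScript that estimates how much of a rendered page the __VIEWSTATE hidden field takes up. The field name is the standard one ASP.NET emits, but `viewStateShare` is a made-up helper, and the regex is a simplification that assumes the `id` attribute appears before `value` in the markup:

```javascript
// viewStateShare is a hypothetical helper for illustration: given a page's
// rendered HTML, estimate how many bytes the __VIEWSTATE field occupies.
// Assumes the id attribute precedes value, as in ASP.NET's default output.
function viewStateShare(html) {
  const match = html.match(/id="__VIEWSTATE"[^>]*value="([^"]*)"/);
  const viewStateBytes = match ? match[1].length : 0;
  return {
    viewStateBytes,
    totalBytes: html.length,
    percent: Math.round((viewStateBytes / html.length) * 100),
  };
}

// A toy page with a 620-byte view state blob standing in for the real thing:
const page =
  '<html><form><input type="hidden" id="__VIEWSTATE" value="' +
  'A'.repeat(620) + '" /><p>content</p></form></html>';
console.log(viewStateShare(page));
```

Running something like this against a saved copy of the rendered page makes it easy to see what fraction of every postback is pure view state.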
What do you think about this question? Am I wrong?
My opinion is that if you don't like the purely server-side nature of vanilla ASP.NET, you should just include the very techniques you mentioned. There is lots of documentation and many step-by-step guides to help you understand how to use a mix of client side and server side techniques with ASP.NET.
What do I think about this question? Which question? I'm not sure you have a question other than your question about your mystery question.
I'll just leave this here: http://www.asp.net/ajax
Our team is building a new web site with ASP.NET. We plan to use a 3-tier architecture. The problem is that the controls shown on the web page need to be changed all the time according to the customer's requirements, which means adding a pair of label/textbox, or removing a pair of label/dropdownlist, whenever the customer needs it. So the layout needs to be flexible and make it easy to add or remove controls, even though it just shows some simple product information like price, discount, tax, etc.
The previous version of the web site saved all the control information in a database, like control name, control type (textbox, label, dropdownlist), which page and panel it belongs to, etc. You can see there is a big performance hit because every time there is a request to this page, it needs to get all the required controls from the database and add them to the page manually, no matter whether the request is a postback or not.
We thought about adding the controls directly to the .aspx page, but in this case it will be difficult to change them later. We also considered holding all the controls' information in XML files, which may give a little performance advantage, but it still needs to render the controls all the time.
So this is the problem we have: improving the app's performance while meeting the users' needs at the same time. Could anyone help me with solutions or ideas?
PS: you can also ask questions if I didn't make it clear enough. Best regards.
This sounds like a good situation for User Controls. If all you're doing is toggling child-control visibility, then creating a user control with toggleable visibility properties should meet your needs. You can still use your backend to toggle visibility, but you'll only need to pull yes/no flags from the db instead of entire page schemas.
From an architectural standpoint, User Controls are great because they encourage modularity and code reuse, and lend themselves well to version control (UserControlV1.cs, UserControlV2.cs, etc.). The point on version control is especially valuable in cases where change requests require logic updates, or where you simply need to revert to a build that existed x iterations ago.
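A sketch of the flag idea in plain JavaScript (the field names and the shape of `fieldFlags` are invented for illustration; in the real app the flags would come from the database and drive the user control's Visible properties):

```javascript
// Instead of storing full control definitions in the database, store one
// boolean per field. The page/user control stays fixed; only visibility
// toggles based on the flags row.
function visibleFields(flags) {
  return Object.keys(flags).filter(name => flags[name]);
}

// Hypothetical flags row loaded from the database for one customer:
const fieldFlags = { price: true, discount: true, tax: false };
console.log(visibleFields(fieldFlags)); // → ['price', 'discount']
```

The db round-trip then carries a handful of booleans instead of an entire page schema, which is where the performance win comes from.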
Now that is what I call a flexible web application.
the controls shown on the web page need to be changed all the time
Who will change the controls? The client? Can you not just update the .aspx file and publish it to the server every time a control is requested to be changed?
But anyway, it's an interesting question. There is nothing else really that can be done except using an XML file.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
I just finished an intro to web dev course in my CS program and came away wondering something simple. When should you use JavaScript (client-side) instead of server-side (we used PHP but anything applies) code? Vice-versa as well.
There is no recipe for deciding that. A few notes:
security and validation should always be present at the server side (sometimes duplicated in the client).
the client-side should contain only UI-logic. No business logic.
logically, everything that accesses a database should be on the server.
Of course, if your application is a RIA (rich internet app), then you can have logic on the client. So it all depends.
JavaScript should only be used to manipulate the UI of the page. You can also do certain validations with it; however, there must be corresponding validation on the server side. For any data manipulation, business logic, etc., you should always use server-side code.
Here are some cases where you will use client-side code:
Changing the look (UI) of the page, e.g. dynamically showing/hiding some elements
Validate user inputs (this should also be done on server side)
Cases where to use server-side code:
Validation of user inputs (should always be done on server side irrespective of whether done on client side or not.)
User authentication
Business logic (deciding what to show to which users, calculations)
Database access
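As an illustration of the "validate on both sides" point above, here is one rule written as a pure function (the rule itself is invented for the example). The same logic runs in the browser for instant feedback and must be repeated on the server, since anything client-side can be bypassed:

```javascript
// Example rule: quantity must be a whole number between 1 and 100.
// Client: called on input to show an error message immediately.
// Server: the equivalent check runs again on the posted value before
// touching the database, because the client check can be disabled.
function isValidQuantity(value) {
  const n = Number(value);
  return Number.isInteger(n) && n > 0 && n <= 100;
}

console.log(isValidQuantity('5'));   // true
console.log(isValidQuantity('abc')); // false
```

Keeping the rule as a small pure function makes it easy to port the same check to the server-side language.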
IMHO, use server-side code if you can. All client-side code can be manipulated, or may not run at all because the browser doesn't support it.
I have several pages of my web application done. They all use the same master page, so they all look very similar except, of course, for the content. It's technically possible to put everything inside one big UpdatePanel so that, instead of jumping from page to page, the user always stays on the same page and the links trigger __doPostBack callbacks that update the appropriate panel.
What could be the problem(s) with building my site like this?
Well, "pages" provide what is known as the "service interface layer" between your business layer and the HTTP aspect of the web application. That is, all of the HTTP, session, and related aspects are "converted" into regular C# types (string, int, custom types, etc.), and the page then calls methods in the business layer using regular C# calling conventions.
So if you have only one update panel in your whole application, what you're effectively saying is that one page (the code-behind portion) will have to handle all of the translation between HTTP concerns and the business layer. That'll just be a mess from a maintainability perspective and a debugging perspective.
If you're on a team, each of you will potentially be modifying the same code-behind. This could be a problem for some source control systems, and one or more of you could define the same method name with the same signature but different implementations. That won't be easy to merge.
From a design perspective, there is no separation of concerns. If you have a menu or hyperlink in a business application, it most likely represents a different concern. Not a good design at all.
From a performance perspective, you'll be loading all of your system's functionality no matter what function your user is actually performing.
You could still give users a one-page experience and redirect the callbacks to handlers for the specific areas of concern. But I'd think real hard about the UI and the actual user experience you'll be providing. It's possible that you'll have a clutter of menus and other functionality when you combine everything into one page.
Unless the system you are building is really simple, has no potential to grow beyond what it currently is, and providing your users with a one-page experience would truly add value and improve the user experience, I wouldn't go down this route.
When you have a hammer, everything looks like a nail.
It really depends on what you are trying to do. Certainly, if each page is very resource-intensive, you may have faster load times if you split them up. I'm all for simplicity, though, and if you have a clean and fast way of keeping users on one page and using AJAX to process data, you should definitely consider it.
It would be impossible to list the downsides of an AJAX solution, though, without more details about the size and scope of the Web application you are building.
Recently I was asked the following questions in an interview:
How will you do performance optimization over your jQuery in ASP.NET?
How many script managers can you have in an ASP.NET application? Why? (AJAX related)
Could someone explain the answer to these questions? I have no idea about either of them.
You can only have one ScriptManager on a page. The script manager has several responsibilities, such as loading the MS Ajax libraries, creating proxy classes for web services, and enabling partial page rendering (e.g. UpdatePanel) support. It doesn't make sense to have more than one per page, and you'll get an exception if you try to do this.
If you need to load additional scripts or references, in a user control for example, you can use the ScriptManagerProxy class.
Regarding (1)
Use tools like Firebug or dynaTrace to profile and dig into the code.
jQuery didn't come out of nowhere; it's JavaScript, so one has to know JavaScript well to optimize it.
Adhering to good JavaScript coding practices helps big time.
Always remember that every JS file and line of code you write is transferred to the client for execution, so be aware of that while writing code, as it can cause issues. If you decide to include 4 JS files totaling 400 KB, you are putting a burden on the client, especially on slow connections.
And for (2), refer to this link: http://forums.asp.net/t/1073734.aspx
One page can only have one script manager.
Hope that helped :)
Watch your selectors, especially when you are working with .NET. You don't want to run the same selector multiple times. Instead you would want to define a javascript variable to hold the selector and then use that variable...that way jQuery is not having to find the same selector multiple times.
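A minimal sketch of that caching pattern. A counting stub stands in for jQuery here so the example is self-contained; with real jQuery the pattern is identical (save `$('#grid tr')` into a variable once, then reuse it):

```javascript
// Stub "$" that counts how many times a selector lookup runs. In real
// jQuery, each call is a DOM traversal whose cost grows with the page.
let lookups = 0;
const $ = selector => { lookups += 1; return { selector }; };

// Uncached: every call repeats the (potentially expensive) traversal.
$('#grid tr'); $('#grid tr'); $('#grid tr');
console.log(lookups); // 3

// Cached: one traversal, then the saved result is reused everywhere.
const $rows = $('#grid tr');
// ...use $rows wherever the selector result is needed...
console.log(lookups); // 4 (only one more lookup)
```

This matters especially with .NET server controls, whose generated ClientIDs produce long selectors that are easy to repeat by accident.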
You can have 1 ScriptManager per page.
A page can contain only one script manager in its hierarchy, according to the documentation. For jQuery optimization, it is important to use minified versions of all JS files, and profiling tools such as Firebug are helpful.
This would be a normal statement for performance optimization:
Performance is an important aspect of the modern day web application development. Not only does it make a site seamless to use, but also increases the scalability of the website and makes it future proof. In this article, we will look at various aspects of improving the performance of ASP.NET web applications. We will only concentrate on the browser/web server side performance as opposed to server/app server/database server performance optimizations.
You can have only one ScriptManager per page; otherwise you will get an exception when you run the application.
You can also refer to this link for interview questions and answers.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
As I understand it, URL rewriting is not the only thing one needs to do to make a website SEO friendly. You also need to maximize the use of divs (instead of tables), reduce JavaScript and Flash, and have clean HTML.
I need to know how this can be achieved when one uses ASP.NET controls. ASP.NET sends loads of stuff to the screen that, in technologies like PHP, can be delivered with much cleaner code.
Can anybody tell me whether there is a way to force ASP.NET to render cleaner code and work with divs instead of tables when one uses a DataGridView?
I would also appreciate suggestions for making an existing website SEO friendly that was coded in ASP.NET / C# 2.0.
Making your site's pages "SEO friendly" is really about ensuring that search engines (Google) can understand the content on the pages. Using "semantic" HTML markup can go a long way toward helping the search engines.
ASP.NET doesn't so much make it hard to do semantic markup as it does make it easy NOT to.
Wrapping a sub-heading in an <h2> tag and styling the <h2> helps the search engine understand that a particular string of text has more weight than other text on the page. ASP.NET makes it easy to fall into the trap of just using a Label server control and styling it to look like a heading.
GridView data controls render tables. If your repeating data would be better understood with more semantic markup, consider using a Repeater control, or a ListView control if you need to support paging, etc.
Step 1 to SEO optimization is understanding semantic markup. Then you can find the appropriate ASP.NET controls to achieve your optimized SEO output.
Server controls have been the main selling point of ASP.NET WebForms. They have allowed developers to quickly put up pages without thinking about HTTP, HTML, CSS, JavaScript, SEO, or anything else. Yet exactly this kind of knowledge is what you will need to consistently create quality markup that is SEO-friendly.
If you absolutely wish to stay with WebForms, you need to look at what output the controls you use render. If you don't like it, then you may have to redefine their rendering algorithms or, better, create your own controls.
Also get a URL rewriting module (or use the one included in .NET 3.5 SP1, the one used by the ASP.NET MVC framework) and define good-looking, self-describing URLs for your existing pages. Also take advantage of the header tags (H1...H6); search engines look at them to see what the page says it is about.
I wouldn't worry about divs vs. tables and validation; it is not clear how relevant these are for SEO, and there are too many widely differing opinions on these matters, each with proofs to support its point of view. What does matter is the content. As they say, content is king.
What I would pay attention to is the view state that ASP.NET injects into pages. It is widely known that the closer to the beginning of the page the content is, the better for search engines. ASP.NET steals the beginning of the page by putting there an often huge block of serialized view state (which under some circumstances can reach megabytes). Try to turn off view state for your pages if you can (if your server logic can be adapted to stateless operation). This will be a very important step.
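If view state can't be turned off, the "content first" concern can still be addressed: a common WebForms trick is to move the __VIEWSTATE field to the end of the form, normally done by overriding Render on a base page class. The string transformation below is just a JavaScript sketch of that idea applied to sample markup, not production code:

```javascript
// Pull the __VIEWSTATE input out of the markup and re-append it just
// before </form>, so real content appears earlier in the page source.
function moveViewStateToBottom(html) {
  const re = /<input[^>]*__VIEWSTATE[^>]*\/>/;
  const m = html.match(re);
  if (!m) return html; // no view state field: leave the page untouched
  return html.replace(re, '').replace('</form>', m[0] + '</form>');
}

const before =
  '<form><input type="hidden" id="__VIEWSTATE" value="..." /><p>content</p></form>';
console.log(moveViewStateToBottom(before));
// → <form><p>content</p><input type="hidden" id="__VIEWSTATE" value="..." /></form>
```

The browser and the postback machinery don't care where the hidden field sits inside the form, but crawlers reach the real content sooner.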