I have an ASP.NET AJAX website which follows this structure:
Controls (.ascx): TopMenu, LeftPanel, RightPanel, Footer. All are very simple controls and don't require any database connection or server-side code!
One body div (AJAX)
Every time the website starts, the four controls load first, then comes the AJAX body. Performance is pretty good in the development environment.
But when I uploaded the precompiled site to the host, it always takes quite a long time to start up; after the first load, the performance is good.
What I can't understand is this: as far as I know, the four .ascx controls are rendered first, which means the page is loaded to the client before the AJAX content. So what's causing the slow start-up?
P.S.:
I did set debug="false" on the <compilation> element in web.config (see the snippet below)
I compiled the site using the Publish tool in VS 2010 (Release mode, and not allowing the site to be updatable ...)
I have no images on the site; it's a very simple site
I've checked similar topics, and even posted a question not so long ago about this, but still without success
my site: http://iketqua.net
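For reference, here is a minimal sketch of the relevant web.config fragment. The actual attribute is debug on the <compilation> element (there is no compilation=false key); the targetFramework value is assumed from the VS 2010 setup:

    <configuration>
      <system.web>
        <!-- debug="false" turns off debug compilation and enables normal
             release-mode caching and batching behavior -->
        <compilation debug="false" targetFramework="4.0" />
      </system.web>
    </configuration>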
Running the Network analysis in Google Chrome against your site, what is blocking the render is a huge delay on page load: the page performs a lot of calculations, and it takes a long time before it starts to get data.
Also, the Google Analytics script should be placed at the bottom of your page, together with the other external scripts for Google Plus, Facebook Like, etc.; a placement sketch follows below.
Also, there are two fonts in this CSS that cannot be loaded, and this adds an almost 3-second delay:
http://iketqua.net/Styles/Fonts/MyriadPro/font.css
(source: planethost.gr)
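As an illustration of the script-placement advice above (a hedged sketch; the script file name is a placeholder, not a real file on the site):

    <body>
        <!-- ...page content renders first... -->

        <!-- external/analytics scripts last, just before </body>,
             so they don't block rendering of the content above -->
        <script src="/Scripts/analytics-loader.js"></script>
    </body>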
If you are referring to the very first request after deployment to production, I don't think there's much you can do about it. The first ASP.NET request will always be slow, even for a precompiled site, because the server still needs to load assemblies and other resources on the server side.
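One common mitigation (a hedged sketch, not part of the answer itself): warm the site up yourself after each deployment, so the slow first hit is taken by a script rather than a real visitor. A minimal C# console sketch:

    // Warm-up sketch: request the home page once after each deployment.
    // The URL is the OP's site; substitute your own.
    using System;
    using System.Net;

    class WarmUp
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                client.DownloadString("http://iketqua.net/");
                Console.WriteLine("Site warmed up.");
            }
        }
    }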
But, if you are talking about first load from the client-side perspective, then just by running Chrome Developer Tools I can see that your site's home page is quite heavy (44 requests, ~4 seconds to load), which explains why the first load takes some time and subsequent requests are quicker...mainly because most of those 44 requests get cached by the browser. Now, in your dev environment it happens quickly because there is no significant network latency and there are no connection hops; once you move to production, network latency and connection hops play a big role in performance...that's why many sites use CDNs.
Suggestions
Make your site lighter. There are many things you can avoid. For example:
This background image (http://iketqua.net/img/header_bg.png) is unnecessary because it is a plain color, which you can easily achieve with CSS. That translates to one request less.
Use bundling and minification tools to minify and merge style sheets and JS files (see the sketch after this list).
Optimize your CSS. Take the time to review your CSS and clean it up. I can't believe that such a simple page can be requesting 9 CSS files...probably most of them come from open-source frameworks (jQuery UI, DatePick, etc.).
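A hedged sketch of the bundling idea using ASP.NET's System.Web.Optimization package (the file paths are hypothetical placeholders, not the site's actual files):

    // Merges and minifies CSS/JS so the browser makes one request per bundle
    // instead of nine. Requires the Microsoft.AspNet.Web.Optimization package.
    using System.Web.Optimization;

    public class BundleConfig
    {
        public static void RegisterBundles(BundleCollection bundles)
        {
            bundles.Add(new StyleBundle("~/bundles/css").Include(
                "~/Styles/site.css",
                "~/Styles/jquery-ui.css"));

            bundles.Add(new ScriptBundle("~/bundles/js").Include(
                "~/Scripts/jquery-{version}.js",
                "~/Scripts/site.js"));
        }
    }

Call BundleConfig.RegisterBundles(BundleTable.Bundles) from Application_Start, then reference the bundles in the page with Styles.Render("~/bundles/css") and Scripts.Render("~/bundles/js").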
I lack the permissions to post this as a comment, but if the site is fine in the development environment, it may be something as simple as the capacity of the host or the connection to the host.
After the first load, the performance is good
I'd be inclined to think this is due to the site being cached.
Related
I am trying to prevent the user from downloading the page as an .html or .aspx file from the browser.
Or is there a way to change the content of the file if it is downloaded?
This is a complex area, with lots of moving parts. The short answer is "there is no way to do this with 100% success; there are a few things you can do which make it harder".
Firstly, you can include JavaScript to disable the right click context menu. This doesn't stop Ctrl+S, but might discourage casual attempts.
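A minimal sketch of that first approach (plain browser JavaScript; easily bypassed, as noted):

    // Suppress the right-click context menu. This only deters casual users;
    // it does nothing against Ctrl+S, view-source, or non-browser clients.
    document.addEventListener('contextmenu', function (e) {
        e.preventDefault();
    });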
Secondly, you can use DRM in the browser (though this is primarily aimed at protecting media content). As browser support is all over the place, this isn't realistic right now.
Thirdly, you could write your site as a single page web application, and build some degree of authentication into the "retrieve content" logic. This way, saving the page to disk wouldn't bring the content along, just the "page furniture". However, any mechanism you include to only download content when you think you should is likely to be easily subverted by anyone who is moderately motivated.
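A hedged sketch of that third approach (the endpoint, element ID, and token storage are invented for illustration):

    // The saved HTML contains only the shell; the content arrives via an
    // authenticated call, so "Save Page As" captures the page furniture
    // but not the data. A motivated user can still capture the response.
    fetch('/api/content', {
        headers: { 'Authorization': 'Bearer ' + sessionStorage.getItem('token') }
    })
        .then(function (res) { return res.text(); })
        .then(function (html) {
            document.getElementById('content').innerHTML = html;
        });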
Also, any steps you take to stop people persisting your pages locally are likely to break the caching mechanisms on which the internet depends for performance, so your site would likely be dramatically slower.
No, you can't stop them.
Consider how the web actually works here: once the user has visited your website and loaded your page into their browser, they have already downloaded it - the web page was transmitted from your server to their computer and appeared on their screen.
All they have to do then is click the Save button to keep it permanently on their disk. That doesn't involve downloading it again, it just copies the page data from a temporary folder to a permanent one. Of course it's also possible for people to use another HTTP client (i.e. not a browser, but maybe an existing program, or some code they wrote themselves) to visit the URL of your page and save the returned contents.
It's not clear what problem you think you would solve by stopping people from saving pages. Saving the page is something done within the browser - you as a site developer don't control the user's browser, so you can't prevent that. And if you stop them from downloading your page in the first place then - by definition - you also stop them from using your website...which kind of defeats the point of having one :-).
If you've got some sort of worry about security, you'll have to clarify exactly what you are concerned about, and maybe you can get advice about a sensible way to deal with it.
I have a WordPress site hosted on a server. I have verified the server configs; everything is fine. There are no disk space or RAM issues.
When I load my site URL, in my Network tab I see that the first request, "instant-loan/", took 1 minute, and all the remaining requests come in much faster after that. Of course they loaded from cache in the image I shared, but even if I open the site in incognito, the rest of the requests complete in milliseconds. This happens on all pages.
What could be the issue here? I have been searching for a possible cause for a very long time.
I have run site performance tests in Google PageSpeed Insights and GTmetrix. They both gave the same results, below:
Page load: 35 sec.
They only say to optimize images and JS content.
The total page size is 6 MB.
214 requests are being processed.
[NEW EDIT]
Below is the performance result. It shows a lot of idle time, while the JS and image rendering seem to happen quite fast: a maximum of 4 seconds to render and load the site. Thus I assume there is something wrong with the server. I have minified CSS and JS, and I use only compressed images. Could this be caused by a MySQL connection issue? Has anyone faced this?
Here is my page URL: www.1800-gifts.com/USA/Cake-Delivery. That page, and other pages like it, all load very slowly even though I have caching and compression enabled. I have tried calling GoDaddy, my hosting provider, but they have given no positive response.
My developer tells me it is a server issue, but I can't find any issues with the server; it seems fine.
The website is developed in ASP.NET 4.0; the database is MSSQL 2012 R2.
The server is a VPS with 2 GB of RAM. I have 2 GB of data in the database, and some tables contain more than 100k records.
Please look at my site and give me suggestions. I have checked Google PageSpeed and other tools, and they all report different views.
I am not sure if this is the cause, but if you open the developer tools (F12) and load the site in Chrome, you will see that the Cake-Delivery page is the one causing the loading time (44 s). You will also notice that there are jQuery errors on the page.
This could possibly be part of the problem.
EDIT:
After looking at the linked page, I think Erik is right: jQuery is not the issue.
The person developing the site needs to completely revisit the way the page works. A massive amount of work is happening in the page's load event, and the operations used are hack-and-slash ways of doing things for which there are already built-in methods. This is simply a page taking forever to load due to bad coding.
I would suggest the developer returns to the drawing board.
There are a lot of great tools that look at your page and tell you what might be wrong with it. Analyzing your page with GTmetrix, for example, gives you a detailed report. There are also important fixes you can work on right away, for example:
Enabling gzip compression (see the web.config sketch after this list)
Minifying CSS, HTML and JS
Concatenating scripts
and a lot more. I also recently wrote an article showing important optimizations for web performance.
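Since the site in question is ASP.NET on IIS, here is a hedged sketch of enabling compression via web.config (this assumes the IIS static and dynamic compression features are installed; it is not part of the original answer):

    <configuration>
      <system.webServer>
        <!-- gzip static files (css/js) and dynamic responses (.aspx output) -->
        <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      </system.webServer>
    </configuration>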
Looking at the waterfall chart of your page (also available on GTmetrix) shows that the biggest problem is indeed your server. It takes 16 seconds to receive a response to the first request (time to first byte). There is clearly something wrong!
There are a lot of things that could be wrong on your server. You should test your database queries (are they slow? How many are performed per page load?).
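Since the database is SQL Server, one way to look for slow queries is the built-in DMVs (a sketch, not from the original answer; run it in SSMS against the production server):

    -- Top 10 cached query plans by average elapsed time (microseconds).
    SELECT TOP 10
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
        qs.execution_count,
        SUBSTRING(st.text, 1, 200) AS query_text_start
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
    ORDER BY avg_elapsed_us DESC;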
I have a simple blank page without any source code. The page is also taking a very long time to load. I am not able to understand the reason for this.
The domain is getting a high number of requests.
What exact settings need to be changed in IIS 7.0 so that it will be faster?
Please help.
ASP.NET pages always have an initial delay on the first request made after a file has been created/edited/uploaded, because the server needs to recompile them; however, it shouldn't be more than 2-3 seconds in practice, and it does not affect subsequent page loads.
The only thing I can think of is an overloaded server. Assuming you're on a shared hosting package, I recommend you find another ISP. If not, then I'm afraid there's a lot more to it than just a "make pages load faster" switch hidden away.
I am testing an ASP.NET website, and for that I have turned on logging in IIS 6.0.
Following are the observations during testing:
Each link, PNG image, MS Chart and CSS file is requested separately, one after another.
A request for, say, the login page takes around 30-45 seconds to complete, even though the page contains only 6 images; the log file shows a separate request for each image, one after another.
Can anybody help me improve the site's performance? Also, is it possible for all requests to be sent to the server in parallel?
Yes, it is possible to improve the app's speed by parallelizing the downloads!
I recommend going through Google Page Speed and Yahoo's YSlow and reading the practices they propose. I found them informative.
http://code.google.com/speed/page-speed/
http://developer.yahoo.com/yslow/help/index.html
Thanks
First of all, have you checked the web site's Performance tab in IIS? Limits could have been set there. Also check that keep-alives are enabled (Web Site tab).
Then you should profile your server using System Monitor.
If everything mentioned is OK, you should check the client side and what's between the client and the server.
What's happening is that the browser makes HTTP requests to the server for each object it finds on the page. You can eliminate those requests, or reduce how often they happen, by enabling client-side caching. For static files, you can configure that in IIS.
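On IIS 7+ this can be done in web.config, as sketched below; IIS 6.0, as in the question, uses the HTTP Headers tab in the IIS MMC instead. The seven-day max-age is an arbitrary example value:

    <configuration>
      <system.webServer>
        <staticContent>
          <!-- Send Cache-Control: max-age so browsers reuse static files -->
          <clientCache cacheControlMode="UseMaxAge"
                       cacheControlMaxAge="7.00:00:00" />
        </staticContent>
      </system.webServer>
    </configuration>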
You can parallelize requests for images (not JS files) by assigning them to different domains; if they are all in a single domain, the browser will request only two at a time.
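A hedged illustration of that domain-sharding idea (img1/img2 are hypothetical subdomains, both aliases for the same server):

    <!-- Browsers of this era open ~2 connections per hostname, so spreading
         images across hostnames lets more of them download in parallel -->
    <img src="http://img1.example.com/charts/sales.png" alt="Sales chart" />
    <img src="http://img2.example.com/images/logo.png" alt="Logo" />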
However, your question opens the door to a big subject. In an attempt to provide a detailed answer, I ended up writing a book on the subject, called Ultra-Fast ASP.NET. I cover the answer to the OP's question in great detail in Chapter 2.