My page is almost complete! I am in the process of adding some more performance-boosting features. I have come across a piece of code to enable caching on a page, but would like to know whether it is a good idea.
My page:
Consists of two iframes: one with static content and one with YouTube content (embedded YouTube videos that my company created)
Has 13 static images which won't change regularly
Consists of a lot of tables.
And, would it also be a good idea to place the YouTube iframe in an asp:UpdatePanel?
Thanks in advance.
We have set up several of our websites on Apple News now. For half of our sites (built on one platform), images are displayed in the Article List view 95% of the time. The other half of our sites display an image in the Article List 0% of the time.
However, images are reliably displayed on the Article View page.
The images used are almost certainly NOT the one provided in the RSS feed in the media thumbnail tag; they are almost certainly being scraped from the article itself. Images are set properly in the OG tags.
This behaviour is consistent between iPads and iPhones.
Any ideas would be welcome.
I can think of a couple of things; this may or may not constitute a complete answer, but here are some ideas.
Self-host the images on a web server that logs all requests (nginx, AWStats, etc.)
Wrap the images or their URLs in a tracking-URL script (this can even be monetized), where you submit an image URL and receive a new one in return (sometimes called in-image advertising, or done with a link tracker/redirector)
Embed a JavaScript snippet into the theme, or wherever Apple News lets you, that tracks each image being loaded; see the sketch after this list.
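For that last idea, a minimal sketch of load tracking in JavaScript; the /track endpoint is hypothetical:

// Report each image load to a tracking endpoint so requests can be
// counted server-side. Images that finished loading before this runs
// are missed; a fuller version would also check img.complete.
document.querySelectorAll('img').forEach(function (img) {
  img.addEventListener('load', function () {
    navigator.sendBeacon('/track', img.src); // hypothetical endpoint
  });
});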
Any further ideas along these lines really depend on how much control you have, and on your skills.
Hope I helped you with some good ideas. Willing to help more.
I have installed and customized WooCommerce Product Pages on my WordPress site, but one of the product category pages takes about 7 seconds on average to load. Other category pages load in around 3 seconds. I am struggling to find the reason for this. There are fewer products on this page than on other pages, and fewer sub-categories. I have installed plug-ins such as 'W3TC' and 'Better WordPress Minify', but they haven't made much difference.
Has anyone else experienced an issue like this and if so, would you mind sharing how you resolved it?
Any help would be greatly appreciated.
Thanks
Using caching plugins is fine and dandy, but the reason these pages load slowly is simply the data model that WordPress uses: post types and their metadata look-ups. The only way to truly get speeds up is good hosting and turning on an object cache on the server.
We enabled this on a WP Engine site and it was night and day: 12 seconds turned into 2.5 seconds.
Object caching
Object caching is designed to capture queries to the database and store their results in memory. This allows you to run an "expensive" query (one that takes a long time) once, and then reuse the results. When used properly, object caching can give your site a speed boost by reducing the time spent accessing the database. Please note that this change can take a while to take effect.
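The idea is language-agnostic; here is a minimal cache-aside sketch in JavaScript, where queryDatabase stands in for a hypothetical expensive call:

// Cache-aside: serve repeated queries from memory; only a cache miss
// touches the database, and the result is remembered for next time.
var cache = new Map();

function queryDatabase(sql) {
  // Hypothetical stand-in for an expensive database round trip.
  return 'rows for: ' + sql;
}

function cachedQuery(sql) {
  if (cache.has(sql)) {
    return cache.get(sql); // served from memory
  }
  var result = queryDatabase(sql);
  cache.set(sql, result);
  return result;
}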
There can be many reasons for a WordPress page to load slowly, but your problem seems to be unusual.
Here are some useful tips to speed up your page loading:
Optimize Your Images
The page on which you are having the issue might contain high-resolution images.
Avoid displaying Flash on your page
Avoid too many advertisements
Cut unnecessary ads from the page.
Do not use inline cascading style sheets
Instead of using inline styles, create a CSS file and reference that file on every page of your site; the browser can cache it, which likewise helps download speed.
Put stylesheets at the top - Put scripts at the bottom
Load JavaScript at the bottom of the page; this helps the page render fast. When the browser downloads a script, it blocks other downloads, so any parallel downloading stops while the browser requests the JavaScript.
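In the same spirit, non-critical scripts can be injected after the page has loaded; a minimal sketch, assuming a hypothetical analytics.js:

// Inject a non-critical script only after the page has finished
// loading, so it never blocks the download of HTML, CSS, or images.
window.addEventListener('load', function () {
  var script = document.createElement('script');
  script.src = '/js/analytics.js'; // hypothetical non-critical script
  script.async = true;
  document.body.appendChild(script);
});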
Use CSS Sprites
A CSS sprite is a single image composed of many smaller images, used by your design as something of a map containing the coordinates of all the images. Some clever CSS shows the proper section of the sprite when your design is loaded.
This way you do not have to load the many separate images used on your site; loading a single sprite image does all the work.
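Sprites are normally wired up in a stylesheet; purely as an illustration of the coordinate idea, here it is expressed through JavaScript style properties, with a hypothetical sprite.png holding 32x32 icons:

// Show one 32x32 region of a sprite sheet by offsetting the
// background image; only the selected icon is visible.
var icon = document.getElementById('cart-icon'); // hypothetical element
icon.style.width = '32px';
icon.style.height = '32px';
icon.style.backgroundImage = 'url(/images/sprite.png)'; // hypothetical sprite
icon.style.backgroundPosition = '-32px 0px'; // the x offset picks the icon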
Limit Your External Scripts
An external script loading on that page might be the issue. Check which external scripts the page requests and limit them.
Add LazyLoad to your images
You can use this technique to defer loading images until they scroll into view, so the page loads part by part.
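A minimal sketch of the idea, assuming your image tags keep the real URL in a data-src attribute:

// Load each image's real URL only when it scrolls near the viewport.
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.dataset.src; // swap in the real URL
      observer.unobserve(entry.target); // each image only needs this once
    }
  });
});

document.querySelectorAll('img[data-src]').forEach(function (img) {
  observer.observe(img);
});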
Control the amount of post revisions stored
I saved this post to draft about 8 times. WordPress, left to its own devices, would store every single one of those revisions, indefinitely. You can cap this by setting the WP_POST_REVISIONS constant in wp-config.php.
Turn off pingbacks and trackbacks
Let me know if these tips resolve the problem for your site.
The list of suggestions that WisdmLabs mentions above is great!
However, I'm not sure if you've seen the WordPress plugin called W3 Total Cache. It has a load of built-in functionality to automatically improve the performance of your WordPress web pages.
It's free and worthwhile using if you are looking to improve the performance across your whole site.
https://wordpress.org/plugins/w3-total-cache/
Hypothetical Situation: I have a small obscure website called "miniatureBoltsInCarburetors.com" which provides content about the miniature bolts which hold a carburetor together as well as some general related automotive information. My site also has a single page which allows someone to find the missing bolt in their carburetor, and while no one will access this page directly from my website, one billion other popular automotive sites have embedded this single page in their website using an iframe, yet not included a link back to my site.
I recognize that this question is related to SEO, which is considered off topic; however, all of the many SEO-related forums discuss the marketing steps one could take, not the programming steps or strategies, so I hope others will allow this question to be answered here.
I wish my site "miniatureBoltsInCarburetors.com" to rank high for general automotive searches. What could I do so that the third-party sites embedding an iframe of my page improve my ranking? Could using JavaScript in the iframe to create a link on the parent page provide any value? What about having my server, when it renders the page, use PHP to get the referring URL from $_SERVER and include it in the content?
I am providing a solution here. Not sure if this is what you want though.
In the page that other websites embed in an iframe, you can put the JavaScript below. It checks whether the page is opened inside an iframe or directly in the browser.
When the check tells you the page is opened in an iframe, you can, on a click or automatically, navigate the visitor to your website (see the sketch after the function).
// This works in all browsers
function inIframe() {
  try {
    // If the page is framed, window.self and window.top differ.
    return window.self !== window.top;
  } catch (e) {
    // Cross-origin access to window.top can throw; assume we are framed.
    return true;
  }
}
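Building on that check, a minimal sketch of the navigate-to-your-site idea; the link text and placement are just an assumption:

// If the page is framed, append a visible link that opens the full
// site in the top-level window instead of inside the frame.
if (inIframe()) {
  var link = document.createElement('a');
  link.href = 'http://miniatureBoltsInCarburetors.com/'; // your site
  link.target = '_top'; // navigate the whole window, not just the frame
  link.textContent = 'Open on miniatureBoltsInCarburetors.com';
  document.body.appendChild(link);
}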
Also, for your reference, you can check the URL below.
How to prevent my site page to be loaded via 3rd party site frame of iFrame
Hope it helps.
Iframes are seen as separate pages by Google. Your approach may end up being penalized because the content is sourced from an untrusted site. According to Google Webmaster Support:
Frames can cause problems for search engines because they don't correspond to the conceptual model of the web. Google tries to associate framed content with the page containing the frames, but we don't guarantee that we will.
One of the best approaches to rank higher for a specific keyword is to make multiple related sites. In your case, a three-to-four-page site about carburetors, bolts, and the other things your primary site contains would do it. These mini-sites will be more focused on the subject due to their low page count. Of course, they should contain unique articles on each page. Then link from the mini-sites to the primary site and you can see the dramatic change.
In fact, the thing you are trying to do resembles a tactic that occasionally worked a few years ago for pushing competitors down in the rankings. It is still a risk today.
I see. You don't want to mess up the page for your own site, but you want to do something with all the uncredited embeddings.
The solution is fairly simple:
Create a copy of the page.
Switch your site to use the copy.
Amend the version that countless other sites are embedding, so that there is a small link back to you. Or, add an iframe-blocker script that will load your site (a sketch follows this list).
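A minimal sketch of such an iframe blocker in JavaScript:

// Frame-buster: if another site has framed this page, replace the
// top-level window with this page itself.
if (window.top !== window.self) {
  window.top.location = window.self.location;
}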
If the page is active (i.e. the user interacts with it to find the missing bolt), you could include a sales message with the response, encouraging the user to visit your site.
I think that your goal is getting your link onto these other sites long enough to get indexed by Google before it is noticed by the people doing the embedding, so it's a bit of a balancing act.
I see conflicting advice about how Google indexes iframes. You could use a PageRank checker to see whether the existing iframe page URL has PageRank, and compare it to the page that you embed it on.
I don't think you need to worry.
Googlebot does seem to crawl through iframes, but the web page containing the iframe is not credited for that content. In other words, the PageRank of that particular page does not change because of content pulled in through an iframe.
is IFrame crawled by Google?
Do robots crawl iframes?
I have an album page (using ASP.NET MVC + MongoDB) which has the following parts:
#1 List of thumbnails displayed as a grid
Note: I load a few fields for each video in the album: URL, length, caption, tags. If the user hovers over a thumbnail I show some of them.
#2 Then I have related albums
#3 Then I have List details
#4 Then I have current video details (including description which can be 5K characters)
#5 Then all comments of the current video.
As you can see, the list of thumbnails, related albums, and list details are the same for all videos.
But the video details and video comments are different for each video.
I need to stream the video and get the video details and comments each time the user clicks a different thumbnail.
What is the best way to do it?
Option 1: Redirect the user to a new URL on each click (this will reload the common pieces).
Option 2: Use Ajax to load the comments and video details and update the page (more coding).
Is there a way to leverage caching for the common pieces?
Please share your thoughts.
Thanks
If it is in scope, I would probably also look into using one of the many frameworks for creating a single-page application, like Backbone (which I happen to like a lot).
If it is only a caching solution that you are looking for, MVC has an OutputCache attribute that can be applied to a controller or to single actions and allows fine-tuning and managing of caching behaviours: server side, client side, parameter based, and so on (for example, [OutputCache(Duration = 60, VaryByParam = "id")] on an action).
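For the Ajax option in the question, a minimal sketch; the endpoint URLs and element ids here are hypothetical:

// Update only the per-video pieces in place, leaving the common
// pieces (thumbnails, related albums, list details) untouched.
function showVideo(videoId) {
  fetch('/videos/' + videoId + '/details')
    .then(function (res) { return res.json(); })
    .then(function (details) {
      document.getElementById('video-details').textContent = details.description;
    });
  fetch('/videos/' + videoId + '/comments')
    .then(function (res) { return res.json(); })
    .then(function (comments) {
      var list = document.getElementById('comments');
      list.innerHTML = ''; // clear comments from the previous video
      comments.forEach(function (c) {
        var li = document.createElement('li');
        li.textContent = c.text;
        list.appendChild(li);
      });
    });
}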
I've got a Flex 3 project. One of the problems I have is that not very much of its content is indexed by Google. Currently, I pull data from a MySQL database, so the Googlebot doesn't see most of the site.
My goal is to increase the amount of content indexed by Google, improve the SEO, and improve SERPs.
I thought that instead of pulling the data from the database, I would change the project's architecture and create separate "pages". So, in my case, I would compile each puzzle separately and upload it to the server in its own directory. This way the info in each puzzle would get indexed.
The negative is that if I add a puzzle, I'd have to add a link to it in all of the puzzles that are already on the server. I would have to add the link, re-compile each puzzle and upload it to the server. Is there a way to get around this problem? Also, if I wanted to communicate some data from one puzzle to another in the future, I wouldn't be able to do so.
Any suggestions?
Thank you.
-Laxmidi
The usual way to achieve this goal is to develop a hidden parallel site in HTML.
On the first page you will have your Flash and, hidden by JavaScript, a list of links to the other pages. These links will be parsed by the robots. Ideally, the href targets are virtual (look up "URL rewriting"). On each "fake" page, your server-side language will print the content or links from your database AND the Flash. The Flash will be provided with a string explaining where it is and what it's supposed to show.
Ex: http://www.mysite.com/category1/content7. The URL rewriting sends this request to http://www.mysite.com/index.php?uri=category1/content7. The page should display the Flash with the FlashVar "uri=category1/content7". The Flash then knows which content it has to display, so when a user comes from Google following this link, they will find the content they were looking for.
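The answer assumes PHP; purely to illustrate the routing idea in JavaScript, here is the same pattern sketched with Node and Express (all names hypothetical):

// One real page per virtual URL: plain HTML content for robots plus
// the Flash embed, which is told what to display via FlashVars.
const express = require('express');
const app = express();

function lookupContent(uri) {
  // Hypothetical stand-in for a database lookup.
  return 'Article text for ' + uri;
}

app.get('/:category/:content', function (req, res) {
  const uri = req.params.category + '/' + req.params.content;
  res.send(
    '<p>' + lookupContent(uri) + '</p>' +
    '<object data="site.swf">' +
    '<param name="FlashVars" value="uri=' + uri + '">' +
    '</object>'
  );
});

app.listen(3000);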
All linking and content for SEO should be in HTML; don't trust robots' ability to read Flash.
Have a look at Adobe's reference on deep linking.
You can generate the website's sitemap.xml with a daily cron process, such that the URLs encode the application state you need. Each URL encodes whatever content needs to be retrieved from the db, with just one index.html page.
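A minimal sketch of that cron job in JavaScript (Node); the content ids and domain follow the example above and would really come from the database:

// Nightly job: write sitemap.xml with one URL per piece of content.
const fs = require('fs');

const contentIds = ['category1/content7', 'category2/content3']; // from the db
const urls = contentIds.map(function (id) {
  return '  <url><loc>http://www.mysite.com/' + id + '</loc></url>';
});

fs.writeFileSync('sitemap.xml',
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  urls.join('\n') + '\n</urlset>\n');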
good luck!