LinkedIn not utilizing og:image - WordPress

I've got a WordPress site with multiple share buttons on its entries.
We designed it so there are no individual entry pages to view; the entries are podcasts and videos. The listing page has a minimum of 10 entries, each with share buttons.
Currently the share links and titles are working correctly, but LinkedIn is not recognizing the og:image and is instead picking up the default logo for the site itself.
I read another post on Stack Overflow that said it might be an issue for LinkedIn if the image link is served over SSL, but I find that hard to believe.
The other issue I'm struggling with: the docs say that once an image is scraped, it stays cached for approximately 7 days.
I had a similar issue with Facebook, and there's a debugger that allows you to rescrape the page, which lets me verify my changes worked.
My two questions are: first, is there something other than og:image I should be specifying? Since I can't specify it per post, it's in the head of the listing page itself; I would think LinkedIn would pick that up, no?
Second, is there a way a developer can re-check after the meta info has been changed to see if the changes worked, without having to wait out the cache TTL?

Try this to get around the cache:
url/link?blah=1
url/link?blah=2
url/link?blah=3
This should trick it into thinking it's a new page each time.
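For example, here's a minimal sketch (assuming the share buttons are plain anchor links you build in JavaScript; the blah parameter name and the bump number are throwaway values, nothing LinkedIn requires) of producing a cache-busted share link:

// Append a throwaway query parameter so LinkedIn's scraper sees a "new" URL
// and fetches the page again instead of serving its cached preview.
function cacheBustedShareUrl(pageUrl, bump) {
  var url = new URL(pageUrl);
  url.searchParams.set('blah', String(bump)); // any unused parameter name works
  return 'https://www.linkedin.com/shareArticle?mini=true&url=' +
    encodeURIComponent(url.toString());
}

// Bump the number each time you change the og:image and want LinkedIn to re-check.
console.log(cacheBustedShareUrl('https://example.com/podcasts/', 2));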
Can I get a link to test?

Anthony Walz posted the correct answer. Over email he also helped with another problem I had, which fixed a new issue I didn't realize I had until I looked:
my LinkedIn shares were not picking up the show title; they were picking up the page description instead (I have several podcasts showing on one page, we don't use individual post pages, and they all play from the listing).
He pointed me to the developer docs on formatting sharing links, which give a real-world example, here:
https://www.linkedin.com/shareArticle?mini=true&url=http://developer.linkedin.com&title=LinkedIn%20Developer%20Network&summary=My%20favorite%20developer%20program&source=LinkedIn
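In case it helps anyone else with one listing page and many entries, here is a rough sketch of how the per-entry share links can be built. The .podcast-entry / .linkedin-share selectors and the data-title / data-summary attributes are just assumptions about the markup, not anything LinkedIn requires:

// Build a LinkedIn shareArticle link for each entry on the listing page.
// Assumes each entry carries data-title / data-summary attributes in the markup.
document.querySelectorAll('.podcast-entry .linkedin-share').forEach(function (link) {
  var entry = link.closest('.podcast-entry');
  var shareUrl = 'https://www.linkedin.com/shareArticle' +
    '?mini=true' +
    '&url=' + encodeURIComponent(window.location.href) +
    '&title=' + encodeURIComponent(entry.dataset.title || '') +
    '&summary=' + encodeURIComponent(entry.dataset.summary || '') +
    '&source=' + encodeURIComponent(document.title);
  link.setAttribute('href', shareUrl);
});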
Thanks a ton for the assist, Anthony!

Related

Had all relevant Open Graph social meta (created by AIOS) but the Facebook debugger failed to recognize it

I know this is an old topic, but I've dug through dozens of related questions and solutions and none of them works.
Some of the questions I've read:
Can't fix: 'og:image' property should be explicitly provided, even if a value can be inferred from other tags
Wordpress All In One SEO plugin not sharing to Facebook wall
My test URL for the Facebook debugger:
https://trangthietbiytehcm.com
I also debugged via the Facebook Sharing debugger and the Facebook Object debugger, and checked the post's head tag (screenshots omitted).
I should also add that sometimes this post displays well (including the image thumbnail) on my Facebook page, but most of the time it does not.
I've cleared my WordPress caches.
I can definitely see the Open Graph tags in my post's head, but Facebook fails to read them. I do not know the reason behind this problem. Please help me figure it out, thanks!
EDIT:
Sometimes Facebook does receive my blog post correctly (the capture showing this is omitted). The post link is: https://trangthietbiytehcm.com/uncategorized/san-pham-test-lan-2/
And I want to add that the above test link (https://trangthietbiytehcm.com/uncategorized/bai-viet-mau/) behaves exactly the same way.
EDIT (UPDATE):
After months of being frustrated with this problem, I even tested on another new site on a new host server. I've finally found that the cause of this problem is the page cache function of W3 Total Cache.
We've found this issue to be related to W3 Total Cache, too. Specifically, it's the WordPress Dashboard > Performance > Minify > "Eliminate render-blocking CSS by moving it to HTTP body" feature. What this does is shove all the CSS right after the opening tag, moving all the other head tags down.
And it seems Facebook (and Twitter?) only parses a fixed amount of the page before deciding to try to render it... and if your CSS is long, well, it never sees your og:image tag (or any of the others, like title, etc.).
We have yet to find a good fix which still allows this feature to be used.
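As a quick way to check whether you're hitting this, here's a rough sketch (run as a Node 18+ ES module for the built-in fetch; Facebook doesn't document a byte limit, so treat the output as a hint, not a verdict) that reports how far into the served HTML the og:image tag actually sits:

// Fetch the page and report how far in the og:image tag appears,
// to see whether inlined CSS has pushed it deep into the document.
const pageUrl = process.argv[2] || 'https://example.com/'; // pass your own URL

const html = await (await fetch(pageUrl)).text();
const offset = html.indexOf('og:image');

if (offset === -1) {
  console.log('No og:image tag found in the served HTML at all.');
} else {
  console.log('og:image first appears ' + offset + ' characters (~' +
    Math.round(offset / 1024) + ' KB) into the page.');
}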

WordPress and Cloudflare: how to update (cached?) images

I have my blog (WordPress) website hosted at bluehost.com. A few months ago I decided to enable Cloudflare (through cPanel).
It's all working well (and I am seeing better overall performance etc) BUT I have a small issue I am dealing with.
I posted this blog some 10 days ago: http://it20.info/2016/01/why-docker-containers-and-docker-oss-docker-inc/
A few days later I had to change picture #2 (of 3) to tweak it a bit. The old picture says "Unikernel" in the red rectangle and the new picture (I uploaded) says "Unikernel/vm".
Note that inside the blog post I make an external reference to the picture (in the html code):
http://www.it20.info/misc/pictures/WhyDocker-ContainersAndDockerOSS-DockerInc2.jpg
If you point STRAIGHT to the picture you will see the new version (so I know I have updated it properly).
However the blog post still shows the old picture (as if Cloudflare is caching it indefinitely).
If in the blog post I right click on the image and do a "view image" (Firefox) it points to: http://i0.wp.com/www.it20.info/misc/pictures/WhyDocker-ContainersAndDockerOSS-DockerInc2.jpg?resize=640%2C392
(which shows the OLD image).
Funnily enough, if I remove the "?resize=640%2C392" it shows the proper picture.
I am trying to figure out a proper procedure to 1) write a blog post that refers to pictures as external links 2) possibly update said picture via an FTP upload and 3) have Cloudflare render the updated picture.
Thanks.
The root of your problem could be that query string after the image URL:
?resize=640%2C392
In Cloudflare, go to the caching settings and check whether your current caching level is "Standard". If it is, try changing it to either "No query string" or "Ignore query string".
As for a proper procedure for future posts where images could change: as an alternative to purging all your site files in Cloudflare, or selectively purging just the image file in question, would it be feasible for you to simply change the name of the updated image file, or keep the same name but upload it to a different directory? And of course update the image src in your HTML as well.
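If you do want the selective-purge route, here's a rough sketch of purging just that one image through Cloudflare's cache purge API (the zone ID and API token are placeholders you'd take from your own dashboard; run as a Node 18+ ES module for the built-in fetch):

// Purge one cached file from Cloudflare by URL.
const ZONE_ID = 'your-zone-id';      // placeholder: copy from the Cloudflare dashboard
const API_TOKEN = 'your-api-token';  // placeholder: a token with cache purge permission

const res = await fetch('https://api.cloudflare.com/client/v4/zones/' + ZONE_ID + '/purge_cache', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer ' + API_TOKEN,
    'Content-Type': 'application/json',
  },
  // Purging by URL only clears that exact URL; cached query-string variants
  // (like ?resize=640%2C392) may need to be listed separately.
  body: JSON.stringify({
    files: ['http://www.it20.info/misc/pictures/WhyDocker-ContainersAndDockerOSS-DockerInc2.jpg'],
  }),
});
console.log(await res.json());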
Good luck

Site producing bad URLs?

I'm using a custom Genesis child theme, and lately I've been noticing that many false articles have been showing up in Webmaster Tools (example list omitted).
I haven't written these, nor are they topics my site focuses on, so I have no clue why they are showing up. So far, I've had to delete about a hundred of these. I read on a forum that this can be due to my theme generating bad URLs, but I'm not sure what that means, nor do I know how to fix it. What could be causing this?
I believe this problem is due either to your website being hacked, or to Google trying to crawl or follow a link within your content that is not really a link.
This is what Webmaster Tools tells you about the problem:
In Crawl Errors, you might occasionally see 404 errors for URLs you don't believe exist on your own site or on the web. These unexpected URLs might be generated by Googlebot trying to follow links found in JavaScript, Flash files, or other embedded content.
To find out whether your website has been hacked, first get this total: number of WordPress pages + number of posts + number of categories + number of PDFs or other files + number of images. Then do a Google search using the following query (without the quotes): "site:yourdomain.com". If the result count is wildly greater than the calculated total, then your website has almost certainly been hacked.
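A rough way to get that total without counting by hand is sketched below. It assumes the WordPress REST API is enabled (it is by default on recent versions) and it won't see PDFs or files stored outside the media library, so treat the number as approximate:

// Sum posts, pages, categories and media items using the X-WP-Total header
// that each WordPress REST API collection endpoint returns (Node 18+ ES module).
const site = 'https://yourdomain.com'; // placeholder: your own domain

let total = 0;
for (const type of ['posts', 'pages', 'categories', 'media']) {
  const res = await fetch(site + '/wp-json/wp/v2/' + type + '?per_page=1');
  const count = Number(res.headers.get('X-WP-Total') || 0);
  console.log(type + ': ' + count);
  total += count;
}
console.log('Calculated total: ' + total + ' (compare with the site: result count in Google)');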
If you believe that your website is not hacked, try to find where these links are being generated. Here is the trick: go to the Webmaster Tools report, click on one of those links, and check the "Linked from" tab. There should be one or more pages listed showing where these unexpected links are coming from.
Two possible outcomes:
The page the link was found on is from your own website: go to that page, open the source code, and do a Ctrl+F search for that link; if it's found, check which section or piece of content is generating the problem.
The page the link was found on is NOT from your own website: in this case, try to contact the owner of the other site and ask for the link to be removed. If that's not possible, I highly recommend you create a 404 page within your WordPress installation with some useful links. Google how to do this; there are plenty of resources.
Hope this helps

Web crawlers and IFrames

Hypothetical situation: I have a small, obscure website called "miniatureBoltsInCarburetors.com" which provides content about the miniature bolts that hold a carburetor together, as well as some general related automotive information. My site also has a single page which allows someone to find the missing bolt in their carburetor, and while no one will access this page directly from my website, one billion other popular automotive sites have embedded this single page in their websites using an iframe, yet have not included a link back to my site.
I recognize that this question is related to SEO, which is considered off topic; however, all of the many SEO-related forums discuss the marketing steps one could take, not the programming steps or strategies, so I hope others will allow this question to be answered here.
I wish my site "miniatureBoltsInCarburetors.com" to rank highly for general automotive searches. What could I do so that the third-party sites which embed an iframe of my page improve my ranking? Could using JavaScript in the iframe to create a link on the parent page provide any value? What about, when my server renders the page, using PHP to get the referring URL from $_SERVER and including it in the content?
I am providing a solution here; not sure if this is what you want, though.
In the page that other websites embed in an iframe, you can put the JavaScript below. It checks whether the page has been opened inside an iframe or directly in the browser.
Using this check, when you see the page is opened in an iframe, you can navigate to your website when the user clicks on something.
// This works in all browsers: comparing window.self and window.top tells us
// whether this page is the top-level document or inside a frame.
function inIframe() {
  try {
    return window.self !== window.top;
  } catch (e) {
    // Some browsers restrict access to window.top across origins; if that
    // happens we must be inside a frame anyway.
    return true;
  }
}
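For example, a minimal sketch of using this check to send the visitor to your site on click (the find-bolt element id is a made-up example of whatever the user actually interacts with on the embedded page):

// Hypothetical usage: when the page is embedded, clicking the main button
// sends the visitor to the full site instead of staying inside the frame.
document.addEventListener('DOMContentLoaded', function () {
  if (inIframe()) {
    var button = document.getElementById('find-bolt'); // assumed element id
    if (button) {
      button.addEventListener('click', function () {
        window.top.location.href = 'https://miniatureBoltsInCarburetors.com/';
      });
    }
  }
});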
Also, for your reference, you can check the link below.
How to prevent my site page to be loaded via 3rd party site frame of iFrame
Hope it helps.
Iframes are seen as separate pages by Google. Your approach may end up being penalized for being sourced from an untrusted site. According to Google Webmaster Support:
Frames can cause problems for search engines because they don't correspond to the conceptual model of the web. Google tries to associate framed content with the page containing the frames, but we don't guarantee that we will.
One of the best approaches to rank higher for a specific keyword is to make multiple related sites. In your case, a 3-4 page site about carburetors, bolts, and the other things your primary site covers would do it. These mini sites will be more focused on the subject due to the lower page count. Of course, they should contain unique articles on each page. Then link from the mini sites to the primary site and you can see a dramatic change.
In fact, the thing you are trying to do was once a tactic used to push competitors down in the rankings; it worked occasionally a few years ago. Now, it is still a risk.
I see. You don't want to mess up the page for your own site, but you want to do something with all the uncredited embeddings.
The solution is fairly simple:
Create a copy of the page.
Switch your site to use the copy.
Amend the version that countless other sites are embedding, so that there is a small link back to you (a sketch of this is below). Or, add an iframe blocker script that will load your site.
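For the "small link back" option, a minimal sketch (the class name and link wording are placeholders; style it however fits the page):

// Place this near the end of the body of the embedded page: when it is framed,
// it appends a small credit link that opens the home site over the embedding page.
if (window.self !== window.top) {
  var credit = document.createElement('a');
  credit.href = 'https://miniatureBoltsInCarburetors.com/';
  credit.target = '_top';  // navigate the embedding page, not just the frame
  credit.textContent = 'Tool provided by miniatureBoltsInCarburetors.com';
  credit.className = 'embed-credit';  // placeholder class for styling
  document.body.appendChild(credit);
}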
If the page is active (i.e. the user interacts with it to find the missing bolt), you could include a sales message with the response, encouraging the user to visit your site.
I think that your goal is getting your link onto these other sites long enough to get indexed by Google before it is noticed by the people doing the embedding, so it's a bit of a balancing act.
I see conflicting advice about how Google indexes iframes. You should use a PageRank checker to see if the existing iframe page URL has PageRank, and compare it to the page that you embed it on.
I don't think you need to worry.
Googlebot does seem to crawl through iframes, but the web page containing that iframe is not credited for the content. In other words, the PageRank of that particular web page does not change due to content pulled in from an iframe.
is IFrame crawled by Google?
Do robots crawl iframes?

FB Like button not working properly, counts multiple posts as one

My problem is that I was using a "Like button" plugin on WordPress and I didn't like the looks of it. I deactivated the plugin and then tried the manual XFBML button code. That screwed up the counts on some of the posts, lumping them all into one. I reverted the changes, re-enabled the plugin, and deleted the code I had added, but the problem persists. Some of the posts share the same count box, and when you "Like" any of those posts, only the last one appears on Facebook.
Is it possible that this is a cache issue, or something that is wrong in the code?
I tried reverting the "og:type" meta tag from "blog" to "website" but it didn't allow me to; could this be the problem?
Why do some of the posts share that count box if the links are not the same?
And the weirdest thing: why do only some of the posts have the issue while others are shared correctly?
As an example, say posts 1, 5, and 7 share the same count box (+200). When you "Like" any of them, only the last of them (the most recent) makes it to the FB wall.
This doesn't happen with the new posts, only with some of the old ones.
In case it helps, you can see it live here: http://elrincondelacritica.com/
Thanks in advance.
By the way, if you need any piece of code just ask. This is not my page and I really need to fix it ASAP, especially because the site is online and running.
Thank you.
The FB debug tool shows that you didn't supply the fb:app_id, which seems to be crucial for rendering in the news feed.
See: https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Felrincondelacritica.com%2F
You can create a Facebook App here: http://developers.facebook.com/apps
Once you've done that you can add the fb:app_id by adding this into your header:
<meta property="fb:app_id" content="000yourAPID000"/>
Thanks, mate, for the share. Next time I make a website I'll make sure to create an app too.
And by the way, the error was because the og:type meta tag had changed from "website" to "article". What I did was leave it as "article" (rather than trying to force og:type back to "website") and run each "bad" post through the FB Debugger. There were 30 or so... but it worked, and now the new posts work too, each one starting from a zero count and incrementing correctly on its own.
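In case anyone else has a pile of old posts to refresh, here's a rough sketch of forcing the re-scrape through the Graph API instead of pasting every link into the Debugger by hand (it assumes you already have a valid access token, runs as a Node 18+ ES module, and the post URLs are just examples):

// Ask Facebook to re-scrape a list of URLs via the Graph API (scrape=true),
// which refreshes its cached Open Graph data for each one.
const ACCESS_TOKEN = process.env.FB_ACCESS_TOKEN; // assumed: an existing app/user token

const urls = [
  'http://elrincondelacritica.com/an-old-post/',      // example URLs only
  'http://elrincondelacritica.com/another-old-post/',
];

for (const url of urls) {
  const res = await fetch('https://graph.facebook.com/', {
    method: 'POST',
    body: new URLSearchParams({ id: url, scrape: 'true', access_token: ACCESS_TOKEN }),
  });
  console.log(url + ' -> ' + res.status);
}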
