Adopt another site's Open Graph properties

I'm implementing the Open Graph protocol on my site, and I'm curious if it is possible to "adopt" another site's Open Graph properties within my site's web pages.
For example, suppose a user makes a post to my site that contains a link to an article (such as the New York Times). They then share that post on platforms such as Facebook and Twitter, and I want the preview image on those platforms to contain the Open Graph image from the New York Times article web page.
Is this possible?

It's possible. When the user submits the post, check the links it contains and detect the one pointing at the NYT article. Then fetch and parse the NYT page yourself, extract the Open Graph tags you want, and reuse those values in your own page's OG tags.
It's a fairly heavy process, though, and keep in mind that Google penalizes duplicate content taken from other websites.
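As a rough illustration, here is a minimal server-side sketch in Node.js (18+, for the built-in fetch). The regular expression is a deliberately naive way to pull out og: meta tags, and the article URL is made up; a real implementation should use a proper HTML parser:

// Naive matcher: assumes property="og:..." appears before content="..."
const OG_TAG = /<meta[^>]+property=["']og:([^"']+)["'][^>]+content=["']([^"']*)["']/gi;

async function fetchOpenGraph(url) {
  const res = await fetch(url);
  const html = await res.text();
  const props = {};
  for (const match of html.matchAll(OG_TAG)) {
    props[match[1]] = match[2]; // e.g. props.image, props.title
  }
  return props;
}

// Example: reuse the article's og:image in your own page's OG tags.
fetchOpenGraph('https://www.nytimes.com/some-article.html')
  .then(og => console.log(og.image, og.title));

Note that the order of the property and content attributes varies between sites, which is one more reason to prefer a real parser over a regex.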

Related

How to add a bespoke social sharing message to a specific page

I've used services like 'Add This' for a while but now I need to add a couple of specific bits of functionality to an ecommerce order completion page. It's to work like Amazon's order thank you page where it allows you to post a message to Facebook saying something like 'I just bought a widget on Amazon'.
Equally I'm looking for the equivalent in Twitter.
I've added a bunch of OG tags and share buttons but can't get it to do what I need. From further reading it sounds like I might need to create a Facebook app of some sort and use FB.ui to create the post to the user's wall. I was hoping to do this without getting tangled up in that level of permissions etc., but maybe that's not possible any more?
This is being developed on asp.net C#, in case there's a library that I haven't found in my searching.
Can anyone familiar with this type of development point me in the right direction?
For Twitter, the simplest way is to use Web Intents.
For example, if you want to share the text
I love http://example.com
URL encode the text to I%20love%20http%3A%2F%2Fexample.com and use the Twitter Web Intent URI. E.g.
https://twitter.com/intent/tweet?text=I%20love%20http%3A%2F%2Fexample.com
When the user clicks on that link (try it!) or is directed there by your service, they'll be prompted to share that text.
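In JavaScript, the encoding step is just encodeURIComponent; a minimal sketch (the share text is a placeholder):

// Build a Twitter Web Intent URL for an arbitrary piece of text.
function tweetIntentUrl(text) {
  return 'https://twitter.com/intent/tweet?text=' + encodeURIComponent(text);
}

// E.g. wire it to a share link on the order confirmation page:
var shareUrl = tweetIntentUrl('I just bought a widget on example.com http://example.com');
// window.open(shareUrl); // or set it as the href of the share button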

Web crawlers and IFrames

Hypothetical Situation: I have a small obscure website called "miniatureBoltsInCarburetors.com" which provides content about the miniature bolts which hold a carburetor together as well as some general related automotive information. My site also has a single page which allows someone to find the missing bolt in their carburetor, and while no one will access this page directly from my website, one billion other popular automotive sites have embedded this single page in their website using an iframe, yet not included a link back to my site.
I recognize that this question is related to SEO, which is considered off topic; however, the many SEO-related forums discuss the marketing steps one could take rather than the programming steps or strategies, so I hope this question can be answered here.
I wish my site "miniatureBoltsInCarburetors.com" to be ranked high for general automotive searches. What could I do so that the third-party sites embedding this page in an iframe improve my ranking? Could using JavaScript in the iframe to create a link on the parent page provide any value? What about having my server, when it renders the page, use PHP to read the referring URL from $_SERVER and include it in the content?
I am providing a solution here; not sure if this is what you want, though.
In the page that other websites embed in an iframe, you can put the JavaScript below. It checks whether the page is opened inside an iframe or directly in the browser.
When the check tells you the page is framed, you can react accordingly, for example by navigating to your website when the user clicks something.
// This works in all browsers
function inIframe() {
  try {
    // If we are framed, window.top is a different window than our own.
    return window.self !== window.top;
  } catch (e) {
    // Some browsers throw on cross-origin access; if that happens,
    // we are certainly inside an iframe.
    return true;
  }
}
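Building on that check, a minimal sketch of surfacing a credit link when the page is framed (the link text and URL are placeholders):

// If we are framed, add a visible link back to the original site.
// target="_top" makes the link replace the embedding page, not just the frame.
if (inIframe()) {
  var link = document.createElement('a');
  link.href = 'https://miniatureboltsincarburetors.com/';
  link.target = '_top';
  link.textContent = 'Find your missing bolt on miniatureBoltsInCarburetors.com';
  document.body.appendChild(link);
}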
Also, for your reference, you can check the URL below.
How to prevent my site page to be loaded via 3rd party site frame of iFrame
Hope it helps.
Iframes are seen as separate pages by Google, so your approach may end up being penalized as content sourced from an untrusted site. According to Google Webmaster Support:
Frames can cause problems for search engines because they don't correspond to the conceptual model of the web. Google tries to associate framed content with the page containing the frames, but we don't guarantee that we will.
One of the better approaches to rank higher for a specific keyword is to make multiple related sites. In your case, a few three-to-four-page sites about carburetors, bolts, and the other things your primary site covers would do it. These mini sites will be more tightly focused on the subject because of their smaller page count; of course, they should contain unique articles on each page. Then link from the mini sites to the primary site and you should see a dramatic change.
In fact, what you are trying to do resembles a tactic that was occasionally used a few years ago to push competitors down in the rankings. It is still a risk today.
I see. You don't want to mess up the page for your own site, but you want to do something with all the uncredited embeddings.
The solution is fairly simple:
Create a copy of the page.
Switch your site to use the copy.
Amend the version that countless other sites are embedding, so that there is a small link back to you. Or, add an iframe blocker script that will load your site.
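For the iframe blocker, a minimal sketch (the target URL is a placeholder):

// Frame-buster: if this page is being displayed inside a frame,
// navigate the top-level window to our own site instead.
if (window.self !== window.top) {
  // Assigning to top.location is allowed cross-origin because it is a
  // navigation, though sandboxed iframes may block it.
  window.top.location.href = 'https://miniatureboltsincarburetors.com/';
}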
If the page is active (ie user interacts with it to find the missing bolt) you could include a sales message with the response encouraging the user to visit your site.
I think that your goal is getting your link onto these other sites long enough to get indexed by Google before it is noticed by the people doing the embedding, so it's a bit of a balancing act.
I see conflicting advice about how Google indexes iframes. You should use a PageRank checker to see if the existing iframe page url has PageRank, and compare it to the page that you embed it on.
I don't think you need to worry.
Googlebot does seem to crawl through iframes, but the web page containing the iframe is not credited for that content. In other words, the PageRank of that particular page does not change because of content pulled in through an iframe.
is IFrame crawled by Google?
Do robots crawl iframes?

MVC page design using caching for best performance and UX

I have an album page (using ASP.NET MVC + MongoDB) which has the following parts:
#1 List of thumbnails displayed as a grid
Note: I load a few fields for each video in the album: url, length, caption, tags. If the user hovers over a thumbnail I show some of them.
#2 Then I have related albums
#3 Then I have List details
#4 Then I have current video details (including description which can be 5K characters)
#5 Then all comments of the current video.
As you can see, the list of thumbnails, related albums, and list details are the same for all videos, but the video details and comments are different for each video.
I need to stream the video and fetch its details and comments each time the user clicks a different thumbnail. What is the best way to do that?
Option 1: redirect the user to a new URL on each click (causes reloading of the common pieces).
Option 2: use Ajax to load the comments and video details and update the page in place (more coding).
Is there a way to leverage caching common pieces?
Please share your thoughts.
Thanks
If it is in scope, I would probably also look into using one of the many frameworks for creating a Single Page Application, like Backbone (which I happen to like a lot).
If it is only a caching solution that you are looking for, MVC has an OutputCache attribute that can be applied to a controller or to single actions and allows fine-tuning and managing of caching behaviours: server side, client side, parameter based, etc.
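For Option 2, a minimal client-side sketch, assuming hypothetical JSON endpoints /api/videos/{id} and /api/videos/{id}/comments exposed by your controllers, plus placeholder element ids and data-video-id attributes on the thumbnails:

// Swap in the clicked video's details and comments without reloading
// the thumbnail grid, related albums, or list details.
async function showVideo(videoId) {
  const [details, comments] = await Promise.all([
    fetch('/api/videos/' + videoId).then(r => r.json()),
    fetch('/api/videos/' + videoId + '/comments').then(r => r.json())
  ]);
  document.getElementById('video-details').textContent = details.description;
  document.getElementById('video-comments').textContent =
      comments.map(c => c.text).join('\n');
}

// Wire every thumbnail to the handler.
document.querySelectorAll('.thumbnail').forEach(t =>
  t.addEventListener('click', () => showVideo(t.dataset.videoId)));

The common pieces are then rendered once by the server (and can be cached with OutputCache), while only the per-video parts travel over the wire.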

Spotify integration with Facebook OpenGraph

I'm trying to reproduce some cool things from the Spotify Open Graph integration, but there is one thing I don't understand how they do:
When you go to your Spotify app profile (mine: https://www.facebook.com/antonio.mendespinto/music) you can see that the musician links point to the Facebook page and not to the Spotify web pages (http://open.spotify.com/artist/7CajNmpbOovFoOoasH2HaY). How do they do that?
Also, is this what lets Facebook build, behind the scenes, the nice box at the top of the artist page (https://www.facebook.com/ogp/464730384564/) showing friends' interactions with the artist and with Spotify?
Everything seems to point to the Facebook pages instead of the Spotify pages. How do they do that?
Yes, Spotify uses Facebook Open Graph Music, a predefined set of objects and properties for music.
https://developers.facebook.com/docs/opengraph/music/
Then I guess the Spotify account is marked in a way that makes this available. It is possible that this is what makes Facebook show the nice box in the artist page.
I work at Spotify, but I am not really sure about all the details of this. I know other music streaming services also use this, but I am not sure if it still requires a special account. It did in the beginning. Spotify was one of the first users of Open Graph.
The destination of the links inside Open Graph artifacts are left to the discretion of the developer. Say you're writing an app that lets people share restaurant tips. When you post a "Tip" object to OG, you naturally would include a link to the restaurant. As the app developer, you could choose the restaurant's web page, its Yelp page, its OpenTable page, your own representation of the restaurant page on your web site or any other web page on the internets. :-)
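As a hedged sketch of what posting such an action looked like through the (since deprecated) Open Graph actions endpoint; the resttips namespace, leave_tip action, and object URL here are made up:

// POST an Open Graph action: "<user> left a Tip on <restaurant>".
// The object URL is entirely your choice: your own restaurant page,
// its Yelp page, its OpenTable page, and so on.
const USER_ACCESS_TOKEN = '...'; // obtained via Facebook Login
const params = new URLSearchParams({
  restaurant: 'https://myapp.example.com/restaurants/cpk', // hypothetical page
  access_token: USER_ACCESS_TOKEN
});
fetch('https://graph.facebook.com/me/resttips:leave_tip', {
  method: 'POST',
  body: params
}).then(r => r.json()).then(console.log);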
Being faced with a similar situation, I chose to use my own application's web page representing a restaurant. I experimented with using the restaurant's Facebook page (which I had to look up using the Graph API for search) as well as a third-party provider of restaurant information, e.g. Yelp. Using the Facebook page, my app felt more tightly integrated with Facebook, but I didn't get the luxury of having my own Facebook app metadata. Because I chose to link to my own restaurant page, I was able to set and retrieve whatever metadata I wanted, which really came in handy later when I started configuring aggregations.
I don't know how Spotify data surfaces on artist pages, nor do I know how they managed to shoehorn song AND album objects into each listen post on Open Graph, e.g. "Chris listened to Torn and Frayed on Exile on Main Street." I could only ever get ONE object linked to an action, e.g. "Chris left a tip on California Pizza Kitchen." My assumption is that, since they were one of the (if not the only) Facebook Open Graph launch partners, they probably had some inside help.

Flex 3: Project Architecture & SEO

I've got a Flex 3 project. One of the problems I have is that not very much of its content is indexed by Google. Currently, I pull data from a MySQL database, so the Googlebot doesn't see most of the site.
My goal is to increase the amount of content indexed by Google, improve the SEO, and improve SERPs.
I thought that, instead of pulling the data from the database, I would change the project's architecture and create separate "pages". So, in my case, I would compile each puzzle separately and upload it to the server in its own directory. This way the info in each puzzle would get indexed.
The negative is that if I add a puzzle, I'd have to add a link to it in all of the puzzles that are already on the server. I would have to add the link, re-compile each puzzle and upload it to the server. Is there a way to get around this problem? Also, if I wanted to communicate some data from one puzzle to another in the future, I wouldn't be able to do so.
Any suggestions?
Thank you.
-Laxmidi
The usual way to achieve this goal is to develop a hidden parallel site in HTML.
On the first page you will have your Flash and, hidden by JavaScript, a list of links to the other pages. These links will be parsed by the robots. Ideally, the href targets are virtual URLs (look up "URL rewriting"). On each "fake" page, your server-side language prints the content or links from your database AND the Flash. The Flash is passed a string telling it where it is and what it's supposed to show.
Ex: http://www.mysite.com/category1/content7. The URL rewriting sends this request to http://www.mysite.com/index.php?uri=category1/content7. The page should display the Flash with the FlashVar "uri=category1/content7". The Flash knows which content it has to display, so when a user comes from Google by following this link, he will find the content he was looking for.
Every link and piece of content intended for SEO should be in HTML; don't trust robots' ability to read Flash.
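As a sketch of the rewriting idea (using Node/Express here rather than the PHP the answer assumes; the route and markup are illustrative):

const express = require('express');
const app = express();

// A virtual URL such as /category1/content7 is handled by one route.
app.get('/:category/:content', (req, res) => {
  const uri = req.params.category + '/' + req.params.content;
  // Print the crawlable HTML content from the database AND embed the
  // Flash, passing the same uri as a FlashVar so it shows the same thing.
  res.send(`
    <h1>${uri}</h1>
    <p>...crawlable content for ${uri}, pulled from the database...</p>
    <object data="/puzzle.swf">
      <param name="FlashVars" value="uri=${uri}">
    </object>`);
});

app.listen(8080);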
Have a look at Adobe's reference on deep linking.
You can generate the website's sitemap.xml with a daily cron process, such that each URL encodes the application state you need. Each URL encodes whatever content has to be retrieved from the DB, with just one index.html page; see the sketch below.
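A minimal sketch of such a cron job in Node.js; the puzzle list and base URL are placeholders for whatever your database actually returns:

const fs = require('fs');

// Pretend this list came from the database.
const puzzles = ['category1/content7', 'category1/content8', 'category2/content1'];

const urls = puzzles.map(p =>
  '  <url><loc>http://www.mysite.com/index.html#' + p + '</loc></url>').join('\n');

fs.writeFileSync('sitemap.xml',
  '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  urls + '\n</urlset>\n');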
good luck!
