Getting an AdSense account approved on a Meteor website

I am having difficulty getting approved with AdSense. Google says there is not enough content, but I have many blog articles, no inappropriate content or copyright infringement, and the ad code is in place in the footer.
I believe the issue may be caused by my site using client-side rendering (the Meteor JavaScript framework).
This means that if I run:
$ curl http://www.dales-sports-media.com
I get mostly empty HTML (meta and html tags, but nothing in the body).
Sharing articles from my site to Facebook and Twitter seems to work fine, though.
Is it possible that Google's AdSense approval bot is unable to see the fully rendered page?
Has anyone successfully applied for an AdSense account with a Meteor web app?
Thanks,
Mick

What you need is Prerender, a service that renders and caches your pages; bots are then served that version, so they get the full HTML body.
Set up nginx in front of your Meteor app so that it uses proxy_pass to forward traffic from port 80 to your Meteor app on, for example, localhost port 3000.
Then use this nginx config file as a guideline to set up Prerender: https://gist.github.com/thoop/8165802
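A trimmed-down sketch of what that gist does (the token is a placeholder from your Prerender.io account, the crawler list is abbreviated, and port 3000 is just an example; the full gist handles more edge cases):

server {
    listen 80;
    server_name www.dales-sports-media.com;

    location / {
        # Placeholder token from your Prerender.io account.
        proxy_set_header X-Prerender-Token YOUR_TOKEN;

        set $prerender 0;
        # Flag known crawlers by user agent.
        if ($http_user_agent ~* "googlebot|bingbot|twitterbot|facebookexternalhit|linkedinbot") {
            set $prerender 1;
        }
        # Never prerender static assets.
        if ($uri ~* "\.(js|css|png|jpg|jpeg|gif|svg|ico)$") {
            set $prerender 0;
        }

        if ($prerender = 1) {
            # Bots get the prerendered snapshot.
            rewrite .* /$scheme://$host$request_uri? break;
            proxy_pass http://service.prerender.io;
        }
        if ($prerender = 0) {
            # Normal visitors go straight to the Meteor app.
            proxy_pass http://127.0.0.1:3000;
        }
    }
}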
If you're limited and can't install your own web server, make sure you've tried the spiderable package:
$ meteor add spiderable

Related

Google Tag Manager loads something over HTTP instead of HTTPS

I have implemented the Google Tag Manager script on my website exactly as described in this tutorial.
But it loads something over HTTP instead of HTTPS, which causes the security check on my website to fail (no padlock is shown).
Can I fix it, or is it Google's fault?
Thank you very much! :)
This is not Google's fault. The tag in your picture is not part of GTM.
Apparently "exelator" is malware that tries to redirect your requests to unsafe websites (in order to steal your data). So this is something on your local system; you need to run antivirus/anti-malware software to remove it.

Use WordPress as a CMS and access content via REST API

I have used Contentful before to host content and access it in my application via REST. It works great; however, it is not free.
I am trying to find out whether I can host my app's content similarly on WordPress and access it in my app using its REST APIs.
Does anybody know if we can, and how? It is not that straightforward to figure out on their website.
PS: I don't care about the security of the website content.
WordPress has an embedded REST API, so this can be done easily. Check the REST API Handbook at the following URL: https://developer.wordpress.org/rest-api/
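As a quick sketch of what that looks like (example.com stands in for your own WordPress install), the core content lives under /wp-json/wp/v2/, and standard query parameters like per_page and _fields let you trim the response:

# Fetch the five most recent posts, keeping only a few fields of each.
curl -s "https://example.com/wp-json/wp/v2/posts?per_page=5&_fields=id,title,link"

The response is a JSON array of post objects, so it drops into an app much the way a Contentful response does.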

Making your site shareable on LinkedIn

I'm having a few issues with making our site shareable on LinkedIn and I'm at a loss. The og: meta tags all look fine and the Facebook scraper picks them up fine, but the LinkedIn scraper does not... and the images etc. are not in a protected folder or anything like that.
Inspecting the GET request to the url-preview?url= endpoint in the developer tools shows that the images etc. aren't there.
The image is less than 1 MB and all og: meta tags are obeyed. The only thing that may not be 100% right is the image ratio, which is not 1/4 or 4/1 (it's 2/1)... but that is only a recommendation and not a hard and fast rule.
Does LinkedIn provide something similar to Facebook's debugger (https://developers.facebook.com/tools/debug/) where you can test the scraper and re-run it? Or is there another way to debug this? Any help appreciated.
https://www.hipla.co.uk (the page I'm trying to share).
cheers
It transpires LinkedIn doesn't offer a facility similar to Facebook's or Twitter's for testing the OG meta tags and re-scraping the page. It caches a page for 7 days and then scrapes again. However, you can refresh the LinkedIn crawler cache simply by appending GET params to the URL, e.g. https://www.hipla.co.uk?123.
I eventually figured out what our issue was. We were using a wildcard certificate (for multiple domains, so we could have a single SSL cert for multiple subdomains), which meant we had to set the server name in the Apache default-ssl.conf file, and we had a typo in it for the www instance. That gave the LinkedIn crawler an SSL error, which isn't debuggable (if that's a word) through LinkedIn, but it was spotted because we got an SSL error when testing the Twitter metadata tags with the Twitter Card Validator. Hope this helps anyone else who has a typo in their SSL settings. Note that the SSL error was not visible in a browser; everything looked fine there.
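A command-line check can surface this kind of certificate/hostname mismatch without any third-party validator (www.example.com is a placeholder for the hostname you're testing):

# Show the subject and validity dates of the certificate the server actually
# presents for this hostname (SNI); a subject that doesn't match the hostname
# explains crawler-side SSL errors that a browser may paper over.
openssl s_client -connect www.example.com:443 -servername www.example.com </dev/null 2>/dev/null | openssl x509 -noout -subject -dates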

Website not posting to Facebook: security & app id issues

I'm a new WordPress designer. My site runs the Tesseract theme and is built with Beaver Builder.
PROBLEM: When I post my website (https://louiseclark.tech) on Facebook, it gets removed after a couple of minutes. Now when I try to post my site it gives me this message: "It looks like a link you're sharing might be unsafe. If you can, please remove this link: louiseclark.tech. Note: The unsafe link might be on the page you're linking to."
What I've done to try to resolve it:
When I ran my site through the Facebook debugger I got this message:
The 'fb:app_id' property should be explicitly provided, Specify the app ID so that stories shared to Facebook will be properly attributed to the app. Alternatively, app_id can be set in url when open the share dialog.
I created an app ID following this instructional video: https://www.youtube.com/watch?v=V97h03H21y0
I pasted my app ID into my Yoast SEO plugin under the Facebook category.
Checked my Google Webmaster Tools sitemap; all is verified and the sitemap is set.
The SSL certificate is set; I checked with my hosting company, SiteGround. When I asked them about this problem they didn't really feel the security issues were from their side.
I've reported this problem to the black hole that is Facebook support.
Thank you for any insight.
In case anyone sees this thread, I found the solution.
When I moved my WordPress sites to managed WordPress hosting I also migrated them to https with SSL certificates. While the pages were migrated and displayed over https just fine, the images still used their old http URLs.
I did two things:
I installed the SSL Content Fixer plugin. This worked for some images but not others.
I installed the Better Search Replace plugin. I had found the specific insecure images using Firefox: from my page in Firefox, I went to Tools -> Page Info -> Media, which showed me every image/JS/CSS call on the page. Finding these images allowed me to use the plugin to make the changes.
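For anyone comfortable on the command line, WP-CLI's search-replace command is a plugin-free way to do the same http-to-https rewrite (the URLs are placeholders; --skip-columns=guid is the usual precaution so post GUIDs stay stable):

# Dry run first: report what would change without writing anything.
wp search-replace 'http://example.com' 'https://example.com' --skip-columns=guid --dry-run

# Then run it for real once the counts look sane.
wp search-replace 'http://example.com' 'https://example.com' --skip-columns=guid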
It worked. I'm quite sure knowing how to code my site would have served me better in this situation, but I'm a newbie and this is what I could come up with.
What I learned: it's a red flag when a secure site embeds non-secure objects/images (mixed content).

How to test my local page with Fetch as Googlebot

I have written a page and need to test it locally.
How can I see the result of my development site served from my local machine using Google's "Fetch As Google" feature in Webmaster Tools?
(disclaimer: more of a comment than an answer)
This is an excellent question and there are amazingly few sources on the web for a solution.
Fetch as Googlebot - http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=158587
The Fetch as Googlebot tool lets you see a page as Googlebot sees it. This is particularly useful if you're troubleshooting a page's poor performance in search results. For example, if you use rich media files to display content, the page returned by the tool may not contain this content if Google can't crawl it effectively. You can choose to fetch a page as Google's regular web crawler sees it or, if you publish mobile content, as our mobile crawlers do.
I followed the link above and tried out User Agent Switcher but it doesn't accomplish what the asker is looking for. See this thread: chrispederick.com/forums/viewtopic.php?id=788
You can change the user agent settings to be the same as GoogleBot, for example, but I'm not sure if sites also change their appearance based on the headers the search bot sends. Changing the headers is beyond the scope of the extension, however.
And chrispederick.com/forums/viewtopic.php?id=259
Q: For example, if I put googleBot, I'd like it to emulate Google's spider.
A: The User Agent Switcher has always been designed to be a simple, light-weight solution, so I'm not planning on adding anything like this.
In short, I don't think there is a solution. This would be a great opportunity for a Google app.
Do you mean you want to see how your site will react to the Google web crawler?
For this you could use Firefox with the User Agent Switcher add-on.
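The same idea works from the command line: send Googlebot's user-agent string with curl and see what HTML comes back (the URL is a placeholder for wherever your dev server listens). Note that, like the add-on, this only changes the user agent; it doesn't execute JavaScript the way Googlebot does.

# Request the page identifying as Googlebot.
curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://localhost:3000/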
In order to test your localhost website with the official Google tools, you can use Ngrok as I described in this post: https://www.aymen-loukil.com/en/blog-en/how-to-test-localhost-website-with-google-seo-tools/
Fetch as Google cannot be used directly with a non-verified domain in Google Search Console (Webmaster Tools). A trick to view it is to iframe your Ngrok URL in another verified domain:
- You should have a website verified in Search Console
- Make an iframe that loads the Ngrok URL of your localhost webpage
The Ngrok + Fetch as Google combination is great, but you will need to go through verification on the Google tools side each time you launch ngrok.
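For reference, exposing a local dev server through ngrok is a one-liner (port 3000 is an assumption; use whatever port your app listens on). It prints a public https URL that tunnels to your machine:

# Tunnel local port 3000 to a public https URL Google's tools can reach.
ngrok http 3000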
In my case I just needed to check whether server-side rendering was being done properly, so I went to Google Chrome's settings and disabled JavaScript:
Settings >> Advanced >> Content settings >> JavaScript >> Allowed (toggle off)
This let me confirm that the page was being rendered 100% on the server side (Next.js server-side rendering) and that no JavaScript rendering was being run on the client side.
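The same check works without a browser, since curl never executes JavaScript; if the markup you expect shows up in the raw response, it was rendered on the server (the URL and the grepped tag are placeholders):

# Server-rendered markup appears in the raw response; client-rendered markup doesn't.
curl -s http://localhost:3000/ | grep -i "<h1"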
