I found this img src
http://pixel.wp.com/g.gif?v=ext&j=1%3A4.3.1&blog=127730128&post=2&tz=0&srv=hurbtrade.com&host=mywebsite.com&ref=&rand=0.8617862838961312
The weird thing is that it is generated by a WordPress website.
The image is of a skull, so does anyone know what this is?
Try fetching this image by entering its URL directly in your browser.
I get a tiny gif that is a mere 6 by 5 pixels in size.
The typical use of "pixel images" is ad/visitor tracking and statistics across different sites. The request parameters included in the URL, along with the HTTP referer, inform the pixel provider (in this case wp.com) about the page view. If the visitor to your site happens to be already known to the pixel issuer, the request will also carry a cookie that tells them WHO the visitor is.
This can be used for statistics.
This can be used to track effectiveness of ads (conversion rates after someone clicked on an ad and went to a landing page, for instance).
This can be used to track your interests (do you visit a lot of sites about motor cycles or swimming?), etc.
This way you can have tracking pixels from several issuers on one page.
The issuer wp.com (i.e. WordPress.com) is a hint that the site uses Jetpack and probably its included statistics package.
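To illustrate the mechanism, a tracking pixel is usually fired by creating an image whose URL carries the page-view data as query parameters. A minimal JavaScript sketch (the firePixel helper and its parameters are hypothetical; only the parameter names mirror the URL above):

// Hypothetical sketch of how a stats script might fire a tracking pixel.
function firePixel(blogId, postId) {
  var img = new Image();
  img.src = 'https://pixel.wp.com/g.gif' +
      '?blog=' + encodeURIComponent(blogId) +
      '&post=' + encodeURIComponent(postId) +
      '&host=' + encodeURIComponent(location.hostname) +
      '&ref=' + encodeURIComponent(document.referrer) +
      '&rand=' + Math.random(); // cache-buster so the request is never served from cache
  // Any wp.com cookie the browser already holds is typically sent with this
  // request automatically (subject to the browser's cookie settings), which is
  // what lets the issuer recognise returning visitors.
}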
I manage the analytics of a website that uses a headless setup, and I have noticed an unusual number of page_view events on some of the pages.
Perhaps it has something to do with the website being headless, meaning that the URL doesn't change/refresh when clicking, even though the content on the site changes as if there had been a URL redirect.
Does this make sense? Does anyone have any good suggestions on why my events might be off?
My first thought was that the event tracking configuration wasn't set up correctly, resulting in multiple pageviews on the wrong pages (i.e. first page visit → 2nd page → 3rd page = three pageview fires on first page), but upon investigation this doesn't seem to be the problem.
I checked for bot traffic and it doesn't seem to be that, as we're also tracking through UA and Matomo and those numbers look far more plausible.
First, what you've described is not necessarily a headless website; it's just a misconfigured Single Page Application that doesn't bother updating the URL. That is a huge SEO issue, but not a blocker for analytics. And when an SPA affects analytics, the most common symptom is fewer events, not more.
If bot traffic inflates one analytics system as a side effect of whatever it does, it will inflate pretty much any other analytics system in a similar way, so if the numbers in UA and Matomo look alike, that doesn't rule out bots. Especially if your GTM sends events to both systems on the same triggers.
Now, there are ways to debug this beyond just browsing the website and watching the tracking on a few pages.
In cases like this, you want to use data to debug your tracking.
Build a report (a custom report, or just a pregenerated UA report) in which you compare the anomalous traffic period to the previous period, so that you have a baseline. Then, for whatever dimension you're looking at, find the value or few values of that dimension that contain most of the anomalous traffic. This tells you whether any dimension contains an outlier that would explain the nature of the anomaly.
Dimensions that I would look at right away are: hostname, country, hour of the day, source, page, landing page, exit page, referrer. I would also take a quick look at all the conversion numbers, bounce rate and avg time on site.
If all of these look organic, then I would presume natural growth of good traffic.
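Separately, if the site pushes virtual pageviews from its router (as most SPA setups tracked through GTM do), it's worth a quick sanity check that the same path never gets pushed twice in a row. A minimal, hypothetical sketch of such a guard (the event and variable names are placeholders):

// Only push a virtual pageview when the path actually changed.
var lastTrackedPath = null;

function trackVirtualPageview() {
  var path = window.location.pathname + window.location.search;
  if (path === lastTrackedPath) {
    return; // e.g. the router fired pushState and replaceState for the same view
  }
  lastTrackedPath = path;
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: 'virtualPageview', pagePath: path });
}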
Can someone tell me the best way to measure users who go to the home page, watch a video and then go to another page on the same website, versus users who go to the website and don't watch the video? I would like to produce a measurement comparing users who watched the video and then went somewhere else on the same website against users who didn't watch the video. My assumption is that users who watch the video go on to another page at a higher rate than users who don't.
Sequence segments are your friend! They allow you to create a segment based on Users or Sessions performing actions on your site in a particular order.
Are you tracking video plays via event tracking?
If so, you could create a sequence segment based on sessions, where the sequence starts anywhere in the session, with step 1 based on the event, and step 2 immediately follows as a pageview of any page on the site excluding the home page (so as to not include possible home page refreshes).
This would give more granular data than (for example) a sequence segment including someone simply visiting the homepage and then navigating to another page, without actually having watched any of the video.
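For instance, if the plays aren't tracked yet, a minimal sketch of sending a play event for an HTML5 video in Universal Analytics (the #home-video element is a placeholder, and this assumes the standard ga() tracker is already on the page):

// Send a GA event the first time the home page video starts playing.
var video = document.getElementById('home-video');
var playTracked = false;
video.addEventListener('play', function () {
  if (playTracked) { return; } // only count the first play
  playTracked = true;
  ga('send', 'event', 'Video', 'play', window.location.pathname);
});

That event would then be step 1 of the sequence segment described above.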
Couple of good articles on Sequence Segments that are worth the read:
https://www.bounteous.com/insights/2016/04/04/sequence-segments-more-accurate-reporting/
https://online-metrics.com/how-to-leverage-sequence-based-segments-in-google-analytics/
I'm currently running an experiment without redirects, using Google Analytics, but I'm running into some issues.
The case
I work for a company that has two websites, with two separate brands, selling the same product. We are now planning to merge the brands, one of the reasons being lower maintenance costs.
To see how this would affect sales, we are running an A/B test. The test consists of changing the logo of the sites and displaying information about the brand merge in the variant. The original is the website without changes.
We have some requirements to do it:
We use a CMS that has no support for the Google Analytics Experiments tag (we get errors when we install it and are unable to run it).
We need to run the test across all pages of our websites. Each site also has a subdomain that the user is redirected to in order to place an order.
We don't have time to wait for the experiment to finish on its own, so we came up with the idea of tracking bounces (rejections) and sales using a duplicate pageview with "/variant" in the URL and in the title.
To do that, I used Content Experiments without redirects, via Google Tag Manager.
Configuration of the Experiment
In Google Tag Manager, I load the Content Experiments JavaScript API and define the choosenVariation variable on all pages of both websites and subdirectories.
I listen for the "gtm.load" event, to know when the page has finished loading all elements, and then change the DOM in three ways: changing the logo, adding the content about the merge, and adding an item to the main menu. All of this is done through JavaScript.
Along with the DOM changes, I push a dataLayer event called VirtualPageView and pass the corresponding URL with "/variant" and the title with "Variant".
When that dataLayer event fires, I send a new pageview with the variant information.
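For context, a minimal sketch of what a GTM Custom HTML tag along these lines might look like (the experiment ID, selector and image path are placeholders; cxApi.chooseVariation() is the Content Experiments JavaScript API call mentioned above):

<script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>
<script>
  // Ask the Content Experiments API which variation this visitor gets.
  var choosenVariation = cxApi.chooseVariation();

  if (choosenVariation === 1) {
    // Variation: swap the logo, add the merge notice, add the menu item (placeholders).
    document.querySelector('.site-logo img').src = '/img/new-brand-logo.png';

    // Duplicate pageview with "/variant" so the variation can be reported on.
    window.dataLayer = window.dataLayer || [];
    window.dataLayer.push({
      event: 'VirtualPageView',
      virtualUrl: window.location.pathname + '/variant',
      virtualTitle: document.title + ' - Variant'
    });
  }
</script>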
The problem
The experiment is running fine, but when a user gets the B variant of the experiment and proceeds to a subdomain of one of our websites to place an order, it seems that another test is somehow run there, and the user may end up with the A variation.
We are trying to persist the original session and the client ID across the domain and subdomain, so that a user who saw the different logo keeps seeing it on the way to placing the order.
I saw this page about Running Experiments across Subdomains, but it's about Classic Analytics and the classic experiments, and we are using Universal Analytics with Content Experiments without redirects.
I don't know if my explanation was clear enough, so if you have any doubts, please ask me. I don't have deep knowledge of Google Analytics or Content Experiments either, so if you have a better way to do this, please tell me.
I came up with a solution to our problem. We agreed to use the experiment only on the pages of the main domain, so I can change the content on the subdomain pages by other means:
When a user visits our main domain, I create a cookie through Google Tag Manager that stores which variation was chosen for the user (0 for the original and 1 for the variation).
When this user goes to our subdomain to place an order, still via GTM, I check the cookie's value. If it is equal to 1 (the variation), I change the logo and the menu according to our previous configuration, and I send a virtual pageview to help us check the data.
So far, this is working properly.
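For reference, a minimal sketch of that cookie handoff (the cookie name, domain and selector are placeholders; the key point is setting the cookie on the top-level domain so the subdomain can read it):

// On the main domain, right after the variation has been chosen:
document.cookie = 'abVariation=' + choosenVariation +
    '; path=/; domain=.example.com; max-age=' + (30 * 24 * 60 * 60);

// On the subdomain, read the cookie and apply the variation content:
var match = document.cookie.match(/(?:^|;\s*)abVariation=(\d)/);
if (match && match[1] === '1') {
  document.querySelector('.site-logo img').src = '/img/new-brand-logo.png';
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'VirtualPageView',
    virtualUrl: window.location.pathname + '/variant'
  });
}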
We've set up a new mini-site with extensive social sharing, including LinkedIn. Lots of OpenGraph tagging, the works. We have chosen specific images to be shown when sharing by using the og:image meta property.
The images work fine on Facebook and Pinterest, but are not working properly on LinkedIn. Here's the OG image tagging:
<meta property="og:image" content="https://img.mshanken.com/d/wso/Articles/2016/ST_TheBreakers070516_1600.jpg">
But if you click the LinkedIn icon we have set up at the bottom of our page, you end up on a share page that does NOT show the image.
Weirder still, if you inspect that share preview, the image IS in the source code:
<div class="image-thumbs-container">
<img src="https://media.licdn.com/media-proxy/ext?w=180&h=110&f=c&hash=q0uvWygJS2HJrhZZ2qZGdYu2Tig%3D&ora=1%2CaFBCTXdkRmpGL2lvQUFBPQ%2CxAVta5g-0R6jnhxUzw8p4aCKqEH-50hKCoaTFXP-RFTovozTPCKqZsXfeLS-xzl5HHRU4kZnLrT9AnPhFZO5KoyAfNpxi4m_ZMc" width="130" alt="Preview of the share image" data-orig-url="https://img.mshanken.com/d/wso/Articles/2016/ST_TheBreakers070516_1600.jpg" data-width="" data-height="" data-size="" data-position="1" class="active">
</div>
What do we need to do to get that image showing up on LinkedIn shares?
I was having the same issue last night. Spent hours researching solutions. Finally I contacted LinkedIn about this issue and they responded right away. Their development team has implemented a new tool called "Post Inspector", which allows you to optimize content sharing. Literally, in just minutes this worked.
All you have to do is type in your URL and it does all the busy work, i.e. verifying that properties such as image, author, title, description, publication date, etc. are coded correctly. Not only does it verify them, it also tells you what to include and what is missing.
Here is the website to use Post Inspector:
https://www.linkedin.com/post-inspector/
Couple of things it could be:
The dimensions (1600x900) and size (220 KB) are within LinkedIn's requirements. However, your aspect ratio is 16:9 rather than the recommended 4:1 / 1:4.
Max file size: 1 MB
Minimum image dimensions: 80 x 150 pixels
Recommended aspect ratio: 4:1 or 1:4
Making Your Website Shareable on LinkedIn
Your image URI is HTTPS; it could be that they are unable to retrieve your image. Have you tried with an HTTP image?
Note: If the image meets the requirements, but it still does not appear in updates on LinkedIn, your website may be blocking us from pulling the image or the image may be located on a protected directory or website.
Making Your Website Shareable on LinkedIn
Has the image changed since the first time LinkedIn crawled your page for it? They do cache for ~7 days.
The first time that LinkedIn's crawlers visit a webpage when asked to share content via a URL, the data it finds (Open Graph values or our own analysis) will be cached for a period of approximately 7 days. This means that if you subsequently change the article's description, upload a new image, fix a typo in the title, etc., you will not see the change represented during any subsequent attempts to share the page until the cache has expired and the crawler is forced to revisit the page to retrieve fresh content.
Shared Content Caching
Chiming in from the future: I faced this issue today, as our site update wasn't displaying the proper image. In my case the solution was simple: post the link with a throwaway query string, like https://url.com/?jhskjsh. That forced LinkedIn to fetch the meta tags again, which then displayed my image instead of the grey square it previously showed.
Trying to advertise a website on Google Adwords, I got "disapproved" due to 'Invalid HTTP Response Codes'.
The website runs fine as far as I can see. I suspect this may happen due to using iframes, which contain "3rd party" websites, which may produce errors over which the advertised parent website has no control (you can examine the actual website http://ambatya.com).
Can this be the case? And if so, do I have any quick method for solving this without compromising the website's functionality?
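One quick way to rule the basics out is to check what status codes the landing page and the iframe sources actually return. A minimal sketch (Node.js 18+ for the built-in fetch; the second URL is a placeholder for whatever the iframes load):

// check-status.js - print the HTTP status for each URL after following redirects
const urls = [
  'http://ambatya.com/',
  'https://example.com/embedded-page' // placeholder for an iframe source
];

(async () => {
  for (const url of urls) {
    try {
      const res = await fetch(url, { redirect: 'follow' });
      console.log(res.status, url);
    } catch (err) {
      console.log('FAILED', url, err.message);
    }
  }
})();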
You can use an individual landing page for AdWords.
I looked at your website and it has 4 different sections: Animal, Culture & Art, Food, and Fashion.
According to Google AdWords policy,
they will not approve your page if it contains hatred; violence; harassment; racism; sexual, religious, or political intolerance; organizations with such views; content that's likely to shock or disgust; or content that's exploitative or appears to unfairly capitalize at the expense of others.
As far as I can see, your website has many pictures within the iframes which may be inappropriate, so it will not be approved.
For more detail, please check:
https://support.google.com/adwords/answer/6316?hl=en
https://support.google.com/adwordspolicy/answer/6008942?hl=en#con
If you still have any issues, let me know.