I have a Drupal site which allows users to sign up and add their own content. I am trying to create something that emails users every week and shows them statistics such as how many people viewed their content. I have turned on the Statistics module and enabled "Count content views". I moved the "Popular content" block into the correct region, but it only shows the most popular content pages, not the number of hits. I would like to be able to show the number of hits. I was also wondering whether there is a programmatic way to print the number of page hits, so I could do it that way. Thanks
Check out the Statistics module's documentation. You can use the statistics_get($nid) function to get the total view count for a node.
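For example, a minimal sketch using that function (Drupal 6/7 API; the node ID below is just a placeholder):

<?php
// Fetch the counters recorded by the Statistics module for one node.
// statistics_get() returns an array with 'totalcount', 'daycount' and
// 'timestamp', or FALSE if the node has no recorded views yet.
$nid = 123; // placeholder node ID
$stats = statistics_get($nid);

if ($stats) {
  print t('@total total views (@today today)', array(
    '@total' => $stats['totalcount'],
    '@today' => $stats['daycount'],
  ));
}
?>

The weekly email could read the same array inside whatever cron hook assembles the message, looping over each user's nodes.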
Try the Google Analytics Reports module. The Statistics module has a big impact on performance because of the extra database writes on every page view.
I've been struggling for some time to get an answer, and still can't find one on the web. I would like to do a seemingly simple thing:
1) Facebook page A sends me some visitors through a link to MYPAGE.com/?utm_campaign=mycampaign& etc. etc.
2) I count the unique page views (not users) received from people who have clicked this link, and no other visits are counted as part of these (e.g. a direct visit from a returning visitor who first came to the site through that campaign should not be included in the count)
This way, I'd like to monitor exactly the unique page views coming from the different Facebook pages I have a partnership with. Another thing I cannot figure out: how do I make this work on subdomains too?
Best regards
Step 1 - You need a different utm_campaign value for each Facebook page (example links are sketched after these steps).
Step 2 - Create a Google Analytics segment for that specific campaign.
Use "Filter Sessions" as you only want the sessions that came straight from Facebook.
Step 3 - Use Behaviour -> Site Content reports to see which pages those users visited.
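For Step 1, the tagged links could look something like this (MYPAGE.com is from the question; the utm_source/utm_medium values and campaign names are only illustrative):

https://MYPAGE.com/?utm_source=facebook&utm_medium=social&utm_campaign=partner_page_a
https://MYPAGE.com/?utm_source=facebook&utm_medium=social&utm_campaign=partner_page_b

With a distinct utm_campaign per partner page, the segment in Step 2 can filter on that campaign value alone, and the same tagging works across subdomains as long as they all send data to the same Google Analytics property.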
I'm just getting to grips with Google Analytics for a site I'm doing some content management for, and want to know how to look at the traffic for pages created recently/in specific timeframes.
Anyone got any ideas?
You would need to log the page creation date as a custom dimension. Then you'd need to select your timeframe, and filter by your custom dimension (via regex, since you cannot apply the date filter to a custom dimension).
If a page does not have any hits in the selected timeframe it will not show up at all, and in any case you will only see metrics from the hits in the selected timeframe, even if the page was created before that.
So this is sort of possible, but rather more complicated than one would assume.
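To make the regex step concrete: if the custom dimension stores the creation date as, say, 2016-05-12 (the format is whatever you choose when you populate the dimension), a dimension filter with a pattern like the one below would restrict the report to pages created in May 2016:

^2016-05

Anything more precise (a specific day, a range of months) is just a matter of widening or tightening that expression.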
I have a large number of items, and each item has a page like site/item_show.aspx?id=The_ID_here. There are tens of thousands of items, and nearly two thousand are added each day. Furthermore, each item has a description on its page, so each item's page should be crawled by search engines.
My question is, with this amount of data: how can I generate sitemaps, or something like that, to make all items visible to Google and other search engines?
It is clear that I cannot show all items on the first pages, but I could make pages that simply contain links to the items, a few dozen per page, just for search engines. Would that work, or is there anything better I could do to get the items indexed by Google?
Essentially there are 3 methods which will help you with mass indexing:
1. Create an XML sitemap for all of your pages and link to it from your home page (a sketch of the format follows this list).
2. You should have Google Webmaster Tools set up, and you can submit that same XML file there.
3. Have an organized category structure. Depending on the type of your site, think about a logical category structure; for example, in e-commerce stores all products are categorized by a main category, then a sub-category, and sometimes by brand, etc. Of course you should do this via your shopping-cart platform. Just remember that if you begin changing the URL structure, you'll need to take care of the redirects from the old URLs to the new ones.
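To make method 1 concrete: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs, so with tens of thousands of items (and roughly two thousand new ones per day) you would normally split the URLs across several files and tie them together with a sitemap index, roughly like this (file names, domain and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap_items_1.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap_items_2.xml</loc>
    <lastmod>2015-06-01</lastmod>
  </sitemap>
</sitemapindex>

Each referenced file then lists up to 50,000 item pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/item_show.aspx?id=12345</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
</urlset>

The index file is the one you submit in Google Webmaster Tools (method 2), and you only need to regenerate the newest sitemap file each day as items are added.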
First, use XML sitemaps and submit those to Google (note that I said sitemapS - more than one).
Next, ensure that your on-site content is nicely organised into categories and sub-categories - ideally you'd want all elements to be reachable in as few clicks as possible without users (or Googlebot) having to resort to the search function.
Finally, ensure that your more popular / important items are featured on the homepage or 1-2 clicks deep, and get links and social shares to those specific product pages.
Be popular and get links to your site. Have a good server which can handle the crawl.
As Matt Cutts put it: "There is also not a hard limit on our crawl. The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we'll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we'll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline..."
https://www.stonetemple.com/matt-cutts-interviewed-by-eric-enge-2/
Recently, I have been developing a blogging system for my own use, but I have some questions about the implementation of pagination.
The pagination I want is:
10 results per page
When a user visits the home page (/), they should see the current page they are on (in this example, page one), and if there are more than 10 results, they should be able to see a 'Next' link.
If there are results before this page and also results after this page, then the page should have 'Prev' and 'Next' links and the current page number.
This question came up when I was designing the pagination algorithm:
If I want to implement this functionality, I need to know the total number of published posts (if a post is not published, it shouldn't be displayed), so maybe I need to write SQL like this:
SELECT COUNT(*) FROM post WHERE published IS TRUE;
And I could combine the count above with the current page number the user is on to compute the pagination result.
But the question I want to ask is: if there are many records, performance will be terrible, so how could I make this faster?
I see that the pagination of a blog with thousands of posts built on top of WordPress is very fast, and I want to know how I could achieve that speed.
Thanks a lot.
Go through a sample PHP pagination implementation. Results won't be slow because of the LIMIT offset, recordsPerPage part of the MySQL query.
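A minimal sketch of that approach, assuming PDO and the post table/published column from the question (the created_at column, connection details and page-parameter handling are placeholders):

<?php
// Hypothetical pagination sketch: 10 published posts per page.
$perPage = 10;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=blog;charset=utf8', 'user', 'pass');

// Total number of published posts, used to decide whether a 'Next' link is needed.
$total = (int) $pdo->query('SELECT COUNT(*) FROM post WHERE published IS TRUE')->fetchColumn();

// Fetch only the current page's rows; LIMIT offset, count keeps each result set small.
$stmt = $pdo->prepare('SELECT * FROM post WHERE published IS TRUE ORDER BY created_at DESC LIMIT :offset, :per_page');
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->bindValue(':per_page', $perPage, PDO::PARAM_INT);
$stmt->execute();
$posts = $stmt->fetchAll(PDO::FETCH_ASSOC);

$hasPrev = $page > 1;
$hasNext = $offset + $perPage < $total;

Both queries stay fast if the published column (and whatever column you sort on) is indexed; very large blogs also tend to cache the total count instead of recomputing it on every request.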
I have a website that features a call to action/promotion button on nearly all pages of the site.
I have currently configured a conversion funnel that shows me how many people arrive on the call-to-action page, and then how many of them make it through to successfully complete the action.
What I want to see, though, is how many unique visitors over the reporting period see the banner at the top of the funnel.
eg. Something like this:
Visitors accessing website: 1000
Visitors clicked on call to action page: 100
Visitors successfully submitted call to action form: 45
My initial thought was to do this using the front page only, but I forgot that this banner/call-to-action ad is featured on many pages around the website. Many people find the site through SEO and never even pass through the front page.
Is it possible to use a wildcard for a domain or something similar in Google Analytics? Or maybe I am approaching this the wrong way.
Last of all - I know I can accomplish this by pulling up 2 reports: site-wide unique visitors, and comparing that to how many people hit the first stage of the existing conversion funnel. But it's a hassle to have to do this manually on a regular basis.
When using funnel analysis, it is normal to have funnel steps that represent more than one URL. Take the basic case of e-commerce sites, where the final goal may be the same transaction-completion page, but the funnel step corresponding to the product page can be triggered by many different product pages, not just one.
Based on the page URL structure of your website, you can choose either of the 2 match types below to add multiple URLs to a single step:
1. Begins with: if all the different pages displaying the ad share a common set of characters at the beginning of the URL, use this.
2. Regular Expression Match: if the pages that contain your banner ad have totally unrelated URLs, then find a suitable regex that can capture all of those URLs (see the sketch below).
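For example, if the banner appeared on a handful of unrelated pages (these paths are purely hypothetical), a single regex step such as the following would match any of them:

^/(home|pricing|blog/.*|contact)$

That step, placed at the top of the funnel ahead of the call-to-action page and the form-completion page, gives you the 1000 → 100 → 45 style breakdown from the question in one report instead of two.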