I am trying to set up Google Analytics (Universal/analytics.js) to track a user account set-up funnel on a single website with many subdomains. On each subdomain, a user can express interest in creating an account, and then create the account. Once the account is created, they leave the subdomain and arrive on the main domain.
For the subdomain foo, the flow is like this:
foo.maindomain.com - register interest
foo.maindomain.com/inv/acc0unTt0k3n - enter account set-up details
maindomain.com/extra_information - info is supplied
maindomain.com/home - goal end point reached
I have set up a View for each subdomain, e.g. 'Viewing foo.maindomain.com', and each View has a Filter that accurately shows visits to foo.maindomain.com and foo.maindomain.com/inv/acc0unTt0k3n.
I don't know how to go about tracking traffic through this whole funnel. My ideal end-goal would be to track this funnel for all subdomains combined, but I would be satisfied for now with getting a Goal or Funnel that worked for each subdomain individually.
Inside my 'Viewing foo.maindomain.com' View, I have attempted to create a Goal to track this. I can capture the first two steps by creating:
Goal Type > Destination
Destination: /inv/
Funnel: /
This gives a 20% conversion rate, which matches up with my server data for account creations. But if I try to change the Destination to maindomain.com/boarding, and add /inv/ as another step in the Funnel, it no longer works ("Verify this Goal" returns 0).
How can I create a Goal that captures all of these steps?
Make sure you are using the same UA code on every subdomain.
You should then just be able to see all the subdomains in the standard view out of the box - UA tracks across subdomains automatically. See the Google documentation for details.
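For reference, a minimal analytics.js sketch of what "the same UA code" could look like on every subdomain (the property ID is a placeholder; 'auto' lets the tracking cookie be set on maindomain.com so a session survives the hop from foo.maindomain.com to maindomain.com):
ga('create', 'UA-XXXXXX-X', 'auto'); // same property ID on every subdomain; cookieDomain 'auto' resolves to maindomain.com
ga('send', 'pageview');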
You don't need to set up a separate View for each subdomain unless you have a specific reason to.
Name your master View "Rollup" if you like.
You can see all subdomains in your "Hostname" report for that view.
You may want to overwrite your pageview names so they also include the subdomain, so you can tell them apart (if, say, subdomain1.domain.com/index.html and subdomain2.domain.com/index.html both exist, standard page tracking will aggregate them into a single /index.html entry). This can be done via filters:
Filter Type: Custom filter > Advanced
Field A: Hostname Extract A: (.*)
Field B: Request URI Extract B: (.*)
Output To: Request URI Constructor: $A1$B1
I'm trying to set up conversion tracking on a site where a form can be filled in on different pages. The only thing that identifies the confirmation URL after a form is filled in is ?surveySuccess, so pages where a form has been filled in could be:
domain.com/landingpage?surveySuccess
or
domain.com/landingpage/subpage?surveySuccess
I just want to set up one goal, so I can track all the forms that are filled in. I've tried with RegEx, but with no luck.
Any ideas?
Try these goal settings:
Goal setup: Custom
Type: Destination
Match type: Regular expression
Value: \?surveySuccess
Case sensitive: unchecked
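As a quick sanity check (plain JavaScript, outside GA itself), that pattern matches both example confirmation URLs:
var goalRegex = /\?surveySuccess/;
goalRegex.test('/landingpage?surveySuccess'); // true
goalRegex.test('/landingpage/subpage?surveySuccess'); // true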
Let's say I have a site and the user comes into it with a parameter:
http://example.com&url=blahblahblah
How do I go about passing along the url value from the parameter into Google Analytics?
1) User comes to the page with a url in the params
2) User clicks a download link with a GA tracking code attached to it, which was generated from the GA account, like this:
http://example.com/download/param1=dkljdf&_ga=1.149898996.39207121.1424368466
You have to create a custom Dimension and a metric for that.
About custom Dimensions and metrics:
https://developers.google.com/analytics/devguides/platform/customdimsmets
After you have created a dimension, you can combine it with metrics in your reports (per view, for example).
Follow the steps here for Universal Analytics:
https://developers.google.com/analytics/devguides/collection/analyticsjs/custom-dims-mets
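As a rough analytics.js sketch of how the dimension could be filled from the incoming parameter (the dimension index 1 and the "url" parameter name are assumptions taken from the question; adjust them to whatever you created in the admin):
// read the "url" query parameter and attach it to the pageview as custom dimension 1
var match = window.location.search.match(/[?&]url=([^&]*)/);
if (match) {
  ga('set', 'dimension1', decodeURIComponent(match[1]));
}
ga('send', 'pageview');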
BUT:
I think what you actually want to know is the number of visitors who click a download link on your site and who came from, let's say, "blahblahblah" as their web origin (or via some other method).
For that, there is the utm_source parameter, which you can receive directly in the URL.
So instead of http://example.com?url=origin you should receive http://example.com?utm_source=origin
This way you don't have to do anything yourself; Analytics will take care of it, and you can get a report of clicks by source.
Or just use the referrer, in case all the incoming visitors come from other websites:
ga('set', 'referrer', 'http://example.com');
And a final option is to use Events:
_gaq.push(['_trackEvent', 'ReferencedVisitors', jsVarWhereYouHaveTheOrigin]);
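Since the question is about analytics.js, the equivalent event call would look roughly like this (reusing the same placeholder variable for the origin):
ga('send', 'event', 'ReferencedVisitors', jsVarWhereYouHaveTheOrigin);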
I want to have content groups in Google Analytics and I'm using Google Tag Manager to implement them. The way to do it, according to their reference, is to create a lookup table that is using the url_path macro to filter URLs. The url_path only gives the path of the URL, stripping the end of it, so for a url http://www.example.com/hello/index.html the result would be /hello/.
I want to group my users' account pages which are like: http://www.example.com/accounts/profile/user1/
The problem with the above macro is that it would return /accounts/profile/user1 which is not what I want. I only want to keep /accounts/profile/.
How could I accomplish that using this macro?
To help you along: in GTM you only have to configure the "Content Grouping" part (and take special care with the index you enter). Everything else happens on the GA back end, where you declare your content groups and where each content group is given an index (the index you have to keep in sync with GTM).
For some GA accounts you have to wait around 48 hours until you get some data. If your hit is OK, you can see your content grouping information in the utmpg variable (for example: "1:Accueil,2:Page de destination | Actualité | ---,5:www.ouest-france.fr/home").
Hope this helps you understand.
Fanny
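On the path-truncation part of the question: one approach is a Custom JavaScript macro in GTM that returns only the segment you care about. A minimal sketch, assuming the page path is available via the built-in {{url path}} macro and that all profile pages start with /accounts/profile/:
function() {
  // collapse /accounts/profile/user1/ (and similar) down to /accounts/profile/
  var path = {{url path}};
  return path.indexOf('/accounts/profile/') === 0 ? '/accounts/profile/' : path;
}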
I have a web app. The home page has two main actions:
Sign up to the application
Log into the application
I have a goal set up for sign-ups. I am trying to track the goal conversion rate of users who have never logged into the application before.
The problem I have at the moment is that the conversion rate is being skewed by users who are visiting the homepage simply to log in.
Is there a simple way of doing this?
Thanks very much,
Ben
For the kind of tracking you are looking for, there needs to be some coding both on your system and in Google Analytics.
First, I would recommend giving your system a way to know the number of log-ins each user has made (for example, a counter in your database).
Now, to implement that, you will need to set a visit-level Custom Variable in Google Analytics to segment the users from the non-users. In that Custom Variable you can store both the user ID and the number of logins the user has made.
This goes on the login page; your code should look like:
_gaq.push(['_setCustomVar',
  1,                    // custom variable slot (1-5)
  'Member Login',       // name
  'NUMBER OF LOGINS',   // value - set this from your system
  2                     // scope: 2 = visit (session) level
]);
Remember that this code should go before tracking the pageview.
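Put together, the order on the login page would look roughly like this (a sketch; loginCount is a hypothetical variable your server-side code would write into the page):
_gaq.push(['_setCustomVar', 1, 'Member Login', String(loginCount), 2]); // set the visit-level custom variable first
_gaq.push(['_trackPageview']); // then track the pageview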
After setting this up, you should use Advanced Segments to check those specific users. One case could be (using the example above):
Set up an Advanced Segment that "Excludes" the Custom Variable (Key 1) - Matching RegExp: .*
This will give you all the visits from users who have never logged in.
Something else you could do is set up Advanced Segments to check for a specific number of logins:
Set up an Advanced Segment that "Excludes" the Custom Variable (Key 1) - Matching RegExp: [^1]
This will give you all the users that logged in more than once.
You can find more info on Custom Vars here.
I am looking for a way to set up a Google Analytics sandbox environment that will allow me to test out my custom JS code in near real time.
My app will be using custom variables for advanced segmentation, and I would like to test out multiple scenarios quickly, as opposed to setting up a dummy GA account and waiting a whole day to confirm each test.
Thanks
Great question.
For GA, server updates occur every four hours, and after every sixth such update, the entire set is recalculated, which means a 24-hour lag from code change to reliable feedback. This delay also applies to most customizations to the GA Browser (e.g., "custom filters").
So if you are going to use GA as your web metrics system, and you expect to actually rely on those data, then a test rig is essential.
For me, it's useful to group test systems for client-side analytics using two rubrics: (i) complete, self-contained (closed-loop) systems; or (ii) simpler automated data pulls from the production system (by "production system" here I mean GA's system, not the Site whose pages the GA code is tracking).
For the latter, just add this line to each page of your Site that contains the GA tracking code, just below '_trackPageview()':
pageTracker._setLocalRemoteServerMode();
That line will cause a copy of each transaction line to be logged to your server's activity log--so in essence, you get the data captured by GA in real time. That's all you need to do to capture the data; to parse it, you can use, for instance, any of the excellent open-source web log analyzers like AWStats, or roll your own.
This is simple and reliable--but all it can do is tell you (in real time) "does the analytics code I just implemented on pages served by my production server actually work?"
Usually, that's not good enough--you would rather know if your code will work before it's on your production server. To do that, you need to simulate the production environment and find a way to access in real-time the data GA collects.
This kind of test rig is a little more involved, but still not difficult.
In sum, it requires these steps:
host/serve the ga.js and the tracking pixel locally;
log the __utm.gif requests (in the GA data flow, each request corresponds to one logged transaction); and
parse the headers into some convenient human-readable form.
If you want more detail than that (ie, a step-by-step implementation), here it is:
I. Hosting/Serving the GA Script (& automating updates)
To do that, you can create a small shell script like this one to wget the latest ga.js version into your local directory (replacing the extant version it finds there).
#!/bin/sh
rm /My_Sites/sitename.com/analytics/ga.js
cd /My_Sites/sitename.com/analytics/
wget http://www.google-analytics.com/ga.js
chmod 644 /My_Sites/sitename.com/analytics/ga.js
cd ${OLDPWD}
exit 0;
(Thanks to AskApache.com, which provided the original motivation and config details to do this in a production context.)
II. Create __utm.gif file
This is just a transparent 1x1 pixel gif image, which you will place in your Site directory (it doesn't matter where; it just needs to match the location referenced in your pages).
III. Log the __utm.gif Requests
For a testing protocol in which you are the source of the client-side activity (e.g., you want to verify the cross-browser fidelity of some event-tracking code you've added to a page on your Site, so you automate 5000 clicks on the button you just wired up, serving the page from a dev server set up for this purpose), it's probably simplest to just log the Request Headers, because it's in those headers that the GA script directs the client to gather various data from the DOM, from the location bar (url), and from prior http headers, and append them to a request for a resource on the GA server (__utm.gif, which is just a 1x1 transparent pixel).
For this type of protocol, I use the Firefox addon LiveHTTPHeaders. You install it like any other Firefox addon, a few mouse clicks is all. Next, open it, and click the "Generator" tab. From this window, you can see the actual requests in real time. At the bottom of the window is a 'save' button to store the log. I find it easier to configure LiveHTTPHeaders to log only the __utm.gif requests; to do that, just click the 'Edit' tab and create a simple filter to exclude everything except these particular gif images (using the check boxes and the large text box on the right).
Other kinds of test protocols require you to work from your Server Activity Logs; in that case just add this line to each page of your Site, just below _trackPageview():
pageTracker._setLocalRemoteServerMode();
IV. Parse those logged requests so you can actually read them
So now your log will contain individual transaction lines, each one of which is a string appended to an HTTP Request for the GA tracking pixel. This string is just a concatenation of key-value pairs, and each key begins with the letters "utm" (probably for "urchin tracker"). Each of these parameters corresponds to a variable that you see in the GA Dashboard (here's a complete list and description of them). This is all you need to know to build a parser. In more detail:
First, here's a sanitized __utm.gif request (the entries in your LiveHTTPHeaders log):
http://www.google-analytics.com/__utm.gif?utmwv=1&utmn=1669045322&utmcs=UTF-8&utmsr=1280x800&utmsc=24-bit&utmul=en-us&utmje=1&utmfl=10.0%20r45&utmcn=1&utmdt=Position%20Listings%20%7C%20Linden%20Lab&utmhn=lindenlab.hrmdirect.com&utmr=http://lindenlab.com/employment&utmp=/employment/openings.php?sort=da&&utmac=UA-XXXXXX-X&utmcc=__utma%3D87045125.1669045322.1274256051.1274256051.1274256051.1%3B%2B__utmb%3D87045125%3B%2B__utmc%3D87045125%3B%2B__utmz%3D87045125.1274256051.1.1.utmccn%3D(referral)%7Cutmcsr%3Dlindenlab.com%7Cutmcct%3D%2Femployment%7Cutmcmd%3Dreferral%3B%2B
This is my parser (in Python):
# regular expression module imported
import re
pattern = r'\&{1,2}'
pat_obj = re.compile(pattern)
# splitting the gif request on the '&' character
# (which GA originally used to concatenate each piece to build the request)
# (here, the raw __utm.gif request string shown above is bound to the variable 'gfx')
gfx1 = pat_obj.split(gfx)
# create a look-up table to map a descriptive name to each gif request parameter
# (note: this isn't the entire list, which I've linked to above)
keys = "utmje utmsc utmsr utmac utmcc utmcn utmcr utmcs utmdt utme utmfl utmhn utmn utmp utmr utmul utmwv"
values = "java_enabled screen_color_depth screen_resolution account_string cookies campaign_session_new repeat_campaign_visit language_encoding page_title event_tracking_data flash_version host_name GIF_req_unique_id page_request referral_url browser_language gatc_version"
keys = keys.strip().split()
values = values.strip().split()
# create the look-up table
GIF_REQUEST_PARAMS = dict(zip(keys, values))
# parse each request parameter and map the parameter name to a descriptive name:
pattern = r'(utm\w{1,2})=(.*?)$'
pat_obj = re.compile(pattern)
for itm in gfx1:
    m = pat_obj.search(itm)
    if m:
        fmt = '{0:25} {1:10}'
        print(fmt.format(GIF_REQUEST_PARAMS[m.group(1)], m.group(2)))
The result looks like this:
gatc_version 1
GIF_req_unique_id 1669045322
language_encoding UTF-8
screen_resolution 1280x800
screen_color_depth 24-bit
browser_language en-us
java_enabled 1
flash_version 10.0%20r45
campaign_session_new 1
page_title Position%20Listings%20%7C%20Linden%20Lab
host_name lindenlab.hrmdirect.com
referral_url http://lindenlab.com/employment
page_request /employment/openings.php?sort=da
account_string UA-XXXXXX-X
cookies
To avoid making this longer still, I left out the cookies' value. They obviously require a separate parsing step, though it's virtually identical to the step I just showed. Again, each request represents a single transaction, so you can store them as you need to.