Delay when publishing a new GTM container version

I am wondering how many hours it takes for a newly published version of a Google Tag Manager container to take the modifications into account.
I tried to find the answer without any result...
Thanks in advance for your help.
Coki

It does not take hours. Publishing a new container will immediately create a new gtm.js file with your changes.
New visitors will receive the new file immediately. Returning visitors might have a cached version of the file, but GTM sets HTTP cache headers so that the file should not be cached for too long. Some users (in company networks etc.) might sit behind proxy servers that cache an old version of the file.
But most users should receive the updated version of the file within minutes after you have published it. I recommend you send the built-in Container Version variable as a custom dimension to Google Analytics, so you can always check whether changes in your KPIs correspond to specific container versions.
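As a minimal sketch, assuming you want a variable to map into your Google Analytics tag's custom dimension field, a GTM Custom JavaScript variable can simply return the built-in Container Version (in most setups you can also reference the built-in variable in the tag directly, without this wrapper):

// Sketch of a GTM Custom JavaScript variable; {{Container Version}} is the
// built-in GTM variable (enable it under Variables > Built-In Variables).
function () {
  return {{Container Version}};
}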

Related

How can I measure a new vs current website using GA4?

I am launching a new website while still keeping the old one for some months, since the migration will be done in parts.
First I want to have the data from the dev environment and then send the data from the live version.
I would like to include tracking for this new website in my GA4 property, but I am not sure whether adding a new data stream will allow me to differentiate old and new (they have the same subdomain).
It's worth mentioning that some pages of the new website will redirect users to the old one, since some parts will take time to migrate.
Any advice?

Google Tag Manager on localhost

I was asked to implement the Google Tag Manager scripts in a React application. I've added the scripts to the head and body of my HTML file. I've tested the site in GTM's Preview mode and everything seems to be working fine. Clicks and route changes are tracked correctly.
I have one doubt... What about localhost development? Is it going to generate unnecessary hits in the Analytics property (which I have no access to)? Or is it enough to just paste the snippets and that's it?
I can't find an answer so that's why I decided to ask here.
If someone has some experience on that topic - please let me know :)
I would advise you to map separate Google Analytics property IDs using either a dataLayer variable, the URL, or custom JavaScript, returning a different property ID depending on whether users are accessing your website from your localhost development environment, your UAT environment (if any), or your production environment (or others as applicable).
Essentially you're looking at having something that says "if the URL contains 'localhost', return my development property ID", then using this variable name in your Google Analytics tag(s) instead of a static value.
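A minimal sketch of such a GTM Custom JavaScript variable (the property IDs and the UAT hostname prefix are placeholders):

function () {
  var host = document.location.hostname;
  if (host === 'localhost' || host === '127.0.0.1') {
    return 'UA-XXXXXXXX-2';   // development property (placeholder ID)
  }
  if (host.indexOf('uat.') === 0) {
    return 'UA-XXXXXXXX-3';   // UAT property, if any (placeholder ID)
  }
  return 'UA-XXXXXXXX-1';     // production property (placeholder ID)
}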
Yes, unfortunately all of your existing testing is already in your property: if you had the production property ID configured and fired off a bunch of events and pageviews, they were collected and sent there as part of the debugging experience. Generally, though, that's a low concern on a production app, because you make up such a small portion of overall traffic; you're just a couple of blips in the larger dataset.

How do I make a programmatically changed site dashboard refresh without restarting the Alfresco/Tomcat service?

I've created a web-script module extension and have verified that it works correctly. It takes the dashboard.xml and related page.component-X-Y.type~id~dashboard.xml files from one site, deletes all dashboard-related files on another site, then copies the source files to the site that had them deleted.
Pseudo-code:
var siteDashboard = getDashboard("site1-shortname");
siteDashboard = renameShortNames(siteDashboard, "site1-shortname", "site2-shortname");
deleteDashboard("site2-shortname");
createDashboard("site2-shortname", siteDashboard);
renameShortNames just renames the site id inside the dashboard files to the new site's id.
This all works; I've tested and verified it. My problem is that when I go to http://alfrescosite.com/alfresco/s/remoteadm/get/s/sitestore/alfresco/site-data/pages/site/site2-shortname/dashboard.xml it shows me the new dashboard layout from site1-shortname, which is the correct behavior, but when I go to the actual site's dashboard within Alfresco Share it shows the old site2-shortname dashboard. The only way I can get the new dashboard to show is by restarting the Alfresco/Tomcat service. I've even tried looking at the dashboard in a different browser in case it was a local caching issue, but it's not.
Any ideas on how to make the dashboards refresh to the new layout without having to restart the Alfresco/Tomcat service every time?
I figured out what the problem was. I was deleting and recreating the dashboard via Remote API calls to the Alfresco Repository, and doing it that way made the appropriate changes but did not tell Alfresco Share about them.
The solution was to use the Share root object sitedata to remove the component bindings, delete the components, and recreate them through Share, so that the changes are picked up on the front end without the need for a service restart.
Basically this ended up being a modified version of the code in customise-dashboard.post.json.js inside Alfresco Share.
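A rough sketch of the Share-side web script (JavaScript) follows; the sitedata method signatures and the dashboard page id format are assumptions based on customise-dashboard.post.json.js and should be checked against that script for your Alfresco version, and site2ShortName and the dashlet URL are placeholders:

// Assumed id format for a site dashboard page.
var dashboardPageId = "site/" + site2ShortName + "/dashboard";

// 1. Unbind the existing dashlet components so Share drops its cached bindings.
var components = sitedata.findComponents("page", null, dashboardPageId, null);
if (components) {
   for (var i = 0; i < components.length; i++) {
      var c = components[i];
      sitedata.unbindComponent(c.scope, c.regionId, c.sourceId);
   }
}

// 2. Recreate the bindings copied from the source site's dashboard.
var comp = sitedata.newComponent("page", "component-1-1", dashboardPageId);
comp.properties["url"] = "/components/dashlets/my-sites";   // example dashlet URL
comp.save();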

What does increasing the modification attribute do?

In %TRIDION_HOME%\web\WebUI\WebRoot\Configuration\System.config we can increment the modification attribute's value to instruct the Content Manager to force a download of items.
The setting is mentioned on the PowerTools discussion but also on the Skinning the Content Manager Explorer topic on SDL Live Content.
<server version="6.1.0.55920" modification="7">
Alternatives for getting the updated CME include clearing the browser cache (Ctrl+Shift+Delete in Chrome) or adjusting cache settings per user.
Question
Should I use this for any CM-side changes such as GUI extensions, schema changes, or template linked schemas? Or does it only apply to certain parts of the Content Manager Explorer?
In other words, after a schema and template change, what's the best way to make users get the latest versions of components, schema drop-downs, and template selections?
The values of the modification and version attributes become part of the URL of every CSS and JavaScript file that the Tridion UI generates/merges, and of many of the static (image) files too. So the URLs look like this: edit_v.6.1.0.55920.7.aspx?mode=css. Since the browser sees this as a new URL, there is no way it can already have the file in its cache, and thus it will always have to download the files from the server instead of using (possibly outdated) files from its local cache.
This technique of injecting some version information into the URL is known as "URL fingerprinting". Google, for example, commonly embeds a hash value of the file in the URL, ensuring that the fingerprinting happens without requiring developers to increase a version number manually. But whichever way of fingerprinting is used, the technique is a pretty efficient way to ensure that all browsers download the latest version of your code.
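To illustrate the general idea (this is not Tridion code), here is a minimal Node.js sketch of hash-based fingerprinting, where a digest of the file contents is embedded in the URL so any change produces a never-before-cached URL; the paths are placeholders:

var crypto = require('crypto');
var fs = require('fs');

function fingerprintedUrl(filePath, publicPath) {
  // Hash the file contents; any change to the file changes the URL.
  var hash = crypto.createHash('md5')
    .update(fs.readFileSync(filePath))
    .digest('hex')
    .slice(0, 8);
  return publicPath + '?v=' + hash;   // e.g. /ui/edit.css?v=3f2a9c1d
}

console.log(fingerprintedUrl('./edit.css', '/ui/edit.css'));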
If you are developing a GUI extension, you can indeed typically get the same effect by clearing your browser cache or even disabling it completely (for the Tridion domain). But once you roll out your extension to a non-development server, changing the modification attribute is the most certain way to ensure that all your users get the latest JavaScript/CSS changes without each of them having to clear their cache manually.
The URL fingerprinting in Tridion only affects CSS, JavaScript and image files. The actual CMS data (such as Schemas and Components) is loaded using XMLHttpRequests and thus not affected by the modification attribute.
As far as I know,
<server version="6.1.0.55920" modification="7">
this clears only JS- and CSS-related caching. When a user accesses the CM, it loads all the files again, including the latest copies.
Should I use this for any CM-side changes such as GUI extensions, schema changes, or template linked schemas? Or does it only apply to certain parts of the Content Manager Explorer?
For this, the answer is no. Whenever a user changes a schema, the change should be refreshed across all publications, but currently this does not happen in the browser.
Hopefully this will be fixed in upcoming versions.
In other words, after a schema and template change, what's the best way to make users get the latest versions of components, schema drop-downs, and template selections?
Currently the user has to do a forced refresh to get the updated info across all publications.
The SDL Tridion CMS interface caches CMS Items in order to provide faster browsing and loading of its own interface. This does mean that sometimes:
Custom GUI extensions may not display the latest versions of their files
Recently created or modified CMS items may not be shown, or may not show their latest version.
This is why sometimes a new keyword isn't shown within a component field, or a new component template isn't shown when trying to add a component page.
Incrementing the modification number in the <server> node will cause all CMS items to show their latest versions to the CMS user(s). You'll see it uses this value to reference the CSS and JS files used by the CMS GUI.
As a developer I also turn off my Firefox cache (I prefer Firefox for the Firebug extension, which is great for working with GUI extensions), as this means you don't need to go and change this value; a simple browser refresh always seems to do the trick. Turning off the cache is explained here: https://superuser.com/questions/23134/how-to-turn-off-the-firefox-cache

How to track a completed file download in ASP.NET

I have this ASP.NET web site that allows users to download program installation packages (just normal files). I want to be able to track when a download is completed (i.e. the file has been fully downloaded to the user's computer) and then invoke a Google Analytics script that reports a completed download as a 'Goal' (obviously, one of my goals is to increase file downloads).
The problem is that I need to support direct file URLs, as opposed to a "redirect page" solution. This is because a lot of traffic comes from software download sites that explicitly demand a direct file URL when you submit a product; perhaps they do their own file analysis (e.g. virus checking). With this set of limitations, a typical scenario is:
The user visits my product listing on a software download site
The user clicks the "Download" button on this site
The "Download" page is typically a redirect that finally brings the user to my file via the direct URL I've initially submitted, i.e. http://www.ko-sw.com/somefile.exe
If an exact monitoring solution is not possible under these conditions, maybe there is a workaround? What comes to mind is temporarily storing the number of performed downloads on the server and then accessing an administrative page that somehow reports this number to Google Analytics and finally resets it to zero. With this workaround, there is at least no need to attach a JavaScript handler to a non-HTML resource. But even then there are issues:
How to track if a download has completed?
How to track user geolocation and browser capabilities to make them further visible in the reports?
Thanks everybody in advance
According to awstats, an aborted download has HTTP status code 206, so if you analyze the server log for that code you can identify the downloads that were not completed.
@Kerido ~ I'm curious what the business case is here. Are you trying to track installs or downloads? If installs, go with @SamMeiers' solution.
However, if you're trying to track downloads, then the next question is what webserver base are you using? IIS? Apache? Something else?
In IIS, assuming you're using version 7 (or later), you could (easily?) write an HttpHandler that detects when the last bytes of the file have been sent and records a log entry at that point.
On Apache, just set up logging to tell you how many bytes were transferred (a trivial change in httpd.conf) and then parse the logs daily (awstats [amongst others] is pretty good for this, but you might have to write a sed/awk script) to find out how many full transfers were completed. It just depends on how thorough you're trying to be.
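As a rough illustration of that log-parsing approach (a sketch, not production code; the log path, file path, and Combined Log Format layout are assumptions), a download is counted as complete when the status is 200 and the logged bytes match the file's size on disk:

var fs = require('fs');
var readline = require('readline');

// Size of the file being offered for download (assumed local path).
var FILE_SIZE = fs.statSync('/var/www/files/somefile.exe').size;
// Matches: "GET /somefile.exe ..." <status> <bytes>
var logPattern = /"GET \/somefile\.exe[^"]*" (\d{3}) (\d+|-)/;

var completed = 0;
var rl = readline.createInterface({
  input: fs.createReadStream('/var/log/apache2/access.log')
});
rl.on('line', function (line) {
  var m = line.match(logPattern);
  if (m && m[1] === '200' && Number(m[2]) >= FILE_SIZE) {
    completed++;
  }
});
rl.on('close', function () {
  console.log('Completed downloads:', completed);
});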
But I go back to, what's the business case for this? What does it matter if there were unfinished downloads?
It's possible to track links as a goal, which may be of use to you. However, this won't track when the download was completed.
http://www.google.com/support/analytics/bin/answer.py?answer=55529
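If click-based tracking is enough, a minimal sketch using the classic analytics.js event API (this records the click, not a completed transfer; it assumes the direct link to somefile.exe is on the page and analytics.js is already loaded):

var link = document.querySelector('a[href$="somefile.exe"]');
if (link) {
  link.addEventListener('click', function () {
    // Record the download click as a GA event; a goal can then match this event.
    ga('send', 'event', 'Downloads', 'click', 'somefile.exe');
  });
}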
Hope this helps.
Cheers
Tigger
I think @SamMeiers' solution is very good, but you could improve it by calling a web service after the installation completes. One small problem is that the user might install the app in an environment without internet access, so you may need to check whether a connection is available.
You can set a flag when the installation starts; then, when it finishes, check whether the start flag exists. If it does, the app has been both downloaded and installed.
