I'm new to the Google Maps API. My goal is to overlay a series of transparent PNGs on a map over a timespan.
I'm wondering whether a purely JavaScript approach is a good start, or whether it would be easier to do some of it in code-behind in ASP.NET.
I'm also curious whether the Google Maps API v2.x or v3.x is better suited to this task.
You will need to do it in JavaScript, unless you want to reload the page every second or so (which is a horrible idea).
You might as well use the latest version of the Google Maps API.
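For what it's worth, the core pattern in JavaScript is to create one ground overlay per frame and swap them on a timer. A minimal sketch against the v3 API - the image URLs, bounds, timing, and the map variable are all placeholders:

```javascript
// Assumes `map` is an initialized google.maps.Map instance.
// The bounds and frame URLs below are hypothetical placeholders.
var bounds = new google.maps.LatLngBounds(
    new google.maps.LatLng(40.0, -74.5),   // south-west corner
    new google.maps.LatLng(41.0, -73.5));  // north-east corner

var frames = ['frame0.png', 'frame1.png', 'frame2.png'];
var current = 0;
var overlay = null;

function showNextFrame() {
  if (overlay) overlay.setMap(null);        // remove the previous frame
  overlay = new google.maps.GroundOverlay(frames[current], bounds);
  overlay.setMap(map);
  current = (current + 1) % frames.length;  // loop over the timespan
}

setInterval(showNextFrame, 1000);           // advance one frame per second
```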
I recommend avoiding v3 for live sites until its syntax is more stable.
While it's a Google Labs project, Google's policy is that they can make changes to the syntax that will break existing pages, with only a few months' notice in the discussion group. A while ago they changed the names of all the get_ and set_ methods, and changed some of the event names.
Once the API goes out of Labs and into Beta you can expect Google to take care to preserve backward compatibility whenever they make syntax changes. If you use v3 for a live site while it's still in Labs you'll need to monitor the discussion group for announcements of changes that might break your page.
I want to use the results of the Google translation triggered by right-clicking a web page in Chrome, as opposed to using the API. I will use these results as part of a web language-learning tool. I have read this page about adding a logo: https://cloud.google.com/translate/attribution, and have also read the HTML markup requirements at https://cloud.google.com/translate/markup.
My question is: since these terms-and-conditions pages refer to the API, do they also apply to using the translation results obtained via the Chrome menu item? I could use the API, but this is much simpler for my temporary need.
Yes - when you use the right-click translation in Chrome, you can see that it sends a request to the Google Translate API.
That means you have to add the Google attribution.
For a personal website or non-commercial use it might not be that big of an issue, but adding the attribution will still help you avoid future problems.
I am working on a project where I do not have access to the page source, nor can I ask the client's IT team to create an in-page/onload dataLayer (e.g. having the dataLayer defined before the GTM container tag). Is it possible to implement enhanced e-commerce tracking remotely via dataLayer.push, i.e. building the dataLayer as you go? I have very little knowledge of dataLayer.push (which I am starting to read up on). My questions are:
Is dataLayer.push the correct way to move forward? Has anyone done this before?
What issues might I face, e.g. with "add to cart", "remove from cart", or category-page events? Is there any working example? I still don't have a clear view of the workflow here.
What are the downsides of doing it this way, besides the risk that style/CSS-based triggers might break during a future site redesign?
Thanks.
Your main problem is that you need to get the data from somewhere. Without a dataLayer this usually means DOM scraping, and then assembling the scraped data into a dataLayer in a custom JavaScript function. The drawbacks are the same as with your style/CSS-based triggers:
The implementation is strongly coupled to the page layout
Data might be missing, or require cleaning (if mixed with HTML or unrelated text)
It is potentially expensive in client CPU time
Custom JavaScript introduces new points of failure (i.e. can you test rigorously enough to guarantee that there are no side effects?)
If you build workarounds for your client's IT team's shortcomings, remember that you own them - you will be responsible if the workarounds break, have side effects, or need to be amended to account for new features. Make sure that you are very well compensated for that risk (personally, with the experience I have now, I would not do this at all).
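To make the DOM-scraping approach concrete, here is a minimal sketch for an "add to cart" push using the enhanced e-commerce schema - the CSS selectors and page structure are hypothetical and would have to match the client's actual markup:

```javascript
// Hypothetical markup: each product sits in a .product wrapper with
// .product-name and .price children and an .add-to-cart button.
document.addEventListener('click', function (e) {
  var button = e.target.closest('.add-to-cart');
  if (!button) return;

  var product = button.closest('.product');
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'addToCart',
    ecommerce: {
      add: {
        products: [{
          name: product.querySelector('.product-name').textContent.trim(),
          price: product.querySelector('.price').textContent.replace(/[^0-9.]/g, ''),
          quantity: 1
        }]
      }
    }
  });
});
```

Every scraped field here is a potential point of failure - exactly the coupling described above.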
I think it's important to acknowledge here that the Data Layer is just one way of storing this information, and is used solely because it's possible to push data to it from the page.
If you can't write the code into the site itself, don't push information to the data layer. Just keep that information for yourself, in GTM's variables. You'll save yourself a huge headache, and a bit of computation too.
DOM scraping is a perfectly reasonable way to get hold of information, but you will run into some barriers.
You're going to have to write a lot of JavaScript to get the data you need.
Some buttons may turn out to be composed of several elements that you'll have to cover with your triggers etc.
Any changes to the site will potentially ruin your code.
Not everything is available to you, especially verification of data (checking if a purchase went through is probably no longer possible before reporting the transaction).
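As an illustration of keeping the data in GTM variables instead of the dataLayer: a Custom JavaScript variable in GTM is just an anonymous function that returns a value, re-evaluated whenever a trigger asks for it. A sketch, with a hypothetical selector:

```javascript
// Sketch of a GTM "Custom JavaScript" variable: it scrapes the page
// on demand, and nothing ever touches the dataLayer.
function () {
  var el = document.querySelector('.product .price');  // hypothetical selector
  if (!el) return undefined;   // GTM falls back to the variable's default value
  return parseFloat(el.textContent.replace(/[^0-9.]/g, ''));
}
```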
We are just about to release a big update to our website. It is a complete rebuild. Our domain is staying the same and the website generally functions the same way (the business hasn't changed and we are still selling the same stuff). But every page has changed and most page URLs have also changed.
My question is, how should we deal with Google Analytics in this situation? Should we stick with the original GA account and simply start feeding information from the new website instead? Or should we make a new account, or just a new property, or view?
I think it makes sense to stick with the original account and property, so we can easily derive meaningful stats about the effect of the website upgrade on performance. However, I'm worried that having a completely different URL structure mixed in with the older structure will make things difficult to dig through.
Am I right that sticking with the original account and property is sensible in this situation, and does anyone have any other general pointers?
Thanks.
I have been searching around to find out anything that needs to be taken into consideration when upgrading to Universal Analytics.
I found this post:
Google Analytics - Upgrading to Async Code
He explains that if you are not doing anything advanced you should be ok. We have lots of event tracking in place that we would like to keep the same. We also have some Custom Variables I could do without and/or deal with in Custom Dimensions.
Other than that we have a fairly basic setup.
We do have a "keep alive" event in place also that helps determine an accurate time on site.
Another really important question concerns being able to transition from Classic Analytics to Universal. Is this possible? I found an article that said it wasn't, but that article was a couple of months old and I'm not sure if it's still true.
Thoughts?
Universal Analytics is Google's newest tracking code. It is currently in beta phase, so you may want to hold off on it, depending on your resources... In principle it works more or less the same way as the async code. Here are the major points about it:
The syntax is different. All config/tracking is done by making a call to ga()
Some of the "config" arguments have been moved to the GA interface. For example, the names and scope for custom variables are no longer passed as arguments on-page; they are now set within the GA interface. To be more accurate, custom variables no longer exist: Google now offers custom dimensions and metrics as a replacement. Custom dimensions are the closest translation of custom variables (they are pretty much the same in principle), while custom metrics are something of a mix between dimensions and events.
Google currently does NOT offer a way to upgrade your profile(s)/account(s) to Universal Analytics. In order to use Universal Analytics, you have to create a new account or profile. If you want to try Universal Analytics out, Google currently recommends implementing it alongside the traditional or async version you already have implemented.
Google has not currently officially stated when or if they will provide a means to migrate existing profiles, though I personally think they eventually will, since preserving historical data and reducing time and costs associated with migrating is a huge concern to everybody.
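To make the syntax change concrete, a minimal before/after sketch (the tracking ID, dimension index, and event values are placeholders; under Universal Analytics the dimension's name and scope are configured in the admin interface, not in code):

```javascript
// Classic async syntax: custom variable name/scope defined on-page.
_gaq.push(['_setCustomVar', 1, 'memberType', 'premium', 2]);
_gaq.push(['_trackEvent', 'video', 'play', 'intro']);

// Universal Analytics: everything goes through ga(); only the
// dimension's index (here: dimension1) appears in code.
ga('create', 'UA-XXXXX-Y', 'auto');
ga('set', 'dimension1', 'premium');
ga('send', 'event', 'video', 'play', 'intro');
```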
In addition to @Crayon-Violent's answer, be aware that the current Core Reporting API (v3) doesn't support retrieving custom dimension/metric data.
This month Google announced that Classic Analytics properties can be upgraded to Universal without any data loss:
Universal Analytics is a set of technological innovations that improve the way data is collected and processed in Google Analytics. The Universal Analytics Upgrade is a process you can use to upgrade all of your classic Google Analytics properties into Universal Analytics properties without losing any data or changing your account settings. All Google Analytics properties will soon be required to use Universal Analytics. Any properties that don't follow the upgrade process will be auto-transferred to Universal Analytics in the future.
You can upgrade your Analytics property from the Analytics admin panel. https://developers.google.com/analytics/devguides/collection/upgrade/
In case of upgrade problems you can refer to the Google group:
https://groups.google.com/forum/#!forum/ua-upgrade
I recently inherited a project loaded with Google Analytics and I had never worked with them before. The code migration to Universal Analytics was straightforward with the exception of moving from Custom Vars to Dimensions. Google's documentation does little to highlight this.
After some digging around I found a couple of things that I think may help others that are migrating:
1) Set up or edit custom dimensions & metrics
Note that when you set up the Dimensions you are provided with code snippets that you can copy & paste into your project.
2) How to use the code to send custom dimensions & metrics
This documentation will help you understand the provided code snippets and learn how to work with them more effectively.
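For reference, the generated snippets look roughly like this (dimension1/metric1 stand for whichever indexes you created in the admin UI; the values are placeholders):

```javascript
// Send a hit-scoped custom dimension and a custom metric with a pageview.
ga('send', 'pageview', {
  'dimension1': 'premium',  // custom dimension value (index 1)
  'metric1': 42             // custom metric value (index 1)
});
```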
I hope this info spares others some of the pain that I experienced going through this.
I've created a plugin, similar to jQuery Migrate, for backward compatibility of event tracking and other features.
It allows developers to migrate old _gaq.push() calls to the ga() object.
https://github.com/empiricompany/universal-analytics-migrate
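The basic idea behind such a shim, as a simplified sketch (not the plugin's actual code): keep a fake _gaq whose push() translates classic commands into the equivalent ga() calls.

```javascript
// Simplified sketch: legacy _gaq.push() calls are intercepted and
// mapped onto Universal Analytics ga() commands.
window._gaq = {
  push: function (cmd) {
    var method = cmd[0], args = cmd.slice(1);
    if (method === '_trackEvent') {
      ga.apply(null, ['send', 'event'].concat(args));
    } else if (method === '_trackPageview') {
      ga('send', 'pageview', args[0]);
    }
    // ...map any other classic methods still in use
  }
};

// Legacy call sites keep working unchanged:
_gaq.push(['_trackEvent', 'video', 'play', 'intro']);
```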
Running an MVC2 site against IIS7, and I would like to capture more detail of how users traverse the site - ideally to the point of being able to replay a session, including the duration between mouse clicks, as feedback on where people pause and/or backtrack.
I could do this with Flash, but that's no longer an option. Now it's just IIS7 via ASP.NET on Framework 4. IIS7 should be able to provide this via third-party extensions - especially for this sort of niche need. I'm willing to consider client-side .NET components, but this sure seems to be the responsibility of the server.
[oops... does this belong on Server Fault?]
thx
justSteve, here is a solution that we have used:
http://www.seevolution.com/
I don't think that it gives time between clicks, but it does give very detailed tracking considering its price (I don't know if that's an issue). We have really liked it. Fantastic detail.
You could also roll your own solution. Using jQuery and the $(document).click() handler, you can log when users click and where on the screen. Then every couple of minutes, serialize the data and fire it off to the server. You can get extremely fine-grained detail that way. The nice thing with Seevolution is that they've done all of the work for you already, but it probably isn't as detailed as you would like.
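A rough sketch of that roll-your-own approach (the /log endpoint is hypothetical - any MVC action that accepts a POST would do):

```javascript
// Buffer click data client-side and POST it to the server periodically.
var clicks = [];

$(document).click(function (e) {
  clicks.push({
    x: e.pageX,           // click position on the page
    y: e.pageY,
    t: Date.now(),        // timestamp, so durations between clicks can be derived
    target: e.target.tagName
  });
});

setInterval(function () {
  if (!clicks.length) return;
  $.post('/log', { data: JSON.stringify(clicks) });  // hypothetical MVC action
  clicks = [];
}, 2 * 60 * 1000);        // every two minutes
```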
JMax
Maybe not the "in-house" solution you're after, but we are about to implement SessionCam at my company, which seems like a pretty good match for what you're looking for. Not having actually finished implementing it yet, I can't vouch for its quality at this point - but the description of the product certainly matches.
You aren't going to be able to capture the level of detail you need using a solely server-side solution. There needs to be a degree of client-side work - whether in Flash or JavaScript - to capture things such as where the mouse is hovering (for heatmaps etc.).
I personally haven't used this product, but a friend of mine spoke highly of it.
Clicktale