We currently have logging set up with custom metrics that pull the HttpResponseStatus from our logs using (?:HttpResponseStatus=)(...)
When this is put into a dashboard it will show a count of each of the different status codes, e.g. 200, 204.
What I am trying to work out is whether it's possible to have it show them as a percentage out of 100.
For example:
200 95%
204 4%
500 1%
Is that possible using the Stackdriver dashboard and custom metrics? The logs resource type is currently Global.
I have checked the documentation for charts and alerting for Stackdriver and found no mention of representing the results as a percentage.
If you think this might be useful for you and/or other users, you may want to go ahead and file a feature request.
Cloud Monitoring doesn't currently have the ability to calculate percentages in the way you're looking for.
Refer to the documentation:
https://cloud.google.com/monitoring/charts/metrics-selector
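In the meantime, one way to get the breakdown you describe is to post-process exported log entries yourself with the same regex. This is only a sketch, not a Stackdriver feature, and the (\d+) capture group is an assumption standing in for the "(...)" in your metric's filter.

```python
# A minimal sketch: turn raw log lines into the percentage breakdown shown above.
# The (\d+) capture group is an assumption for the "(...)" in the original regex.
import re
from collections import Counter

STATUS_RE = re.compile(r"(?:HttpResponseStatus=)(\d+)")

def status_percentages(log_lines):
    counts = Counter()
    for line in log_lines:
        match = STATUS_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    total = sum(counts.values())
    return {code: 100.0 * n / total for code, n in counts.items()} if total else {}

if __name__ == "__main__":
    sample = [
        "GET /api/a HttpResponseStatus=200",
        "GET /api/b HttpResponseStatus=200",
        "GET /api/c HttpResponseStatus=204",
        "GET /api/d HttpResponseStatus=500",
    ]
    for code, pct in sorted(status_percentages(sample).items()):
        print(f"{code} {pct:.0f}%")
```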
I am trying to get a route between two points using avoid areas with the HERE Maps Routing API, and I am getting the error "Maximum number of avoid areas exceeds limit".
Below you can find the request I am using:
https://route.cit.api.here.com/routing/7.2/calculateroute.json?app_id=#####&app_code=#####g&waypoint0=geo!39.4640023,-0.3850572&waypoint1=geo!39.476885,-0.3801068&mode=fastest;car;traffic:disabled&avoidareas=39.45315053433366,-0.3745426849935516;39.45244111598196,-0.3758222574575006!39.45646192309658,-0.3727107307399733;39.456087897102364,-0.3738696063317133!39.45594467628818,-0.37061955378352013;39.455610302758494,-0.37146705481229625!39.46063897809171,-0.3637087111174383;39.460208373008,-0.36463342201032306!39.46027406507121,-0.3644229889377801;39.45945896807123,-0.36512131930616654!39.45778290983732,-0.36235345142498465;39.45722411730335,-0.36284132909356276!39.458055076124936,-0.3685070306751628;39.45796969111227,-0.369566281083658!39.45960790790132,-0.36670532495457014;39.45880954421065,-0.3687782227883713!39.46786419209955,-0.3788290555558871;39.466598324440575,-0.37952348064968555!39.46629280916266,-0.37952060299424345;39.46579450682472,-0.3798614186868332!39.447906189702366,-0.3865406097869585;39.44771727050539,-0.38799155376945255!39.447906266440604,-0.3860336486039068;39.44767149909636,-0.3866130855790714!39.45518409583871,-0.3836551666444044;39.454907307568014,-0.38405749286187724!39.45964221683283,-0.38704088462136754;39.45899783260966,-0.38824034688297143!39.46042754674725,-0.3884778363064053;39.45992759234617,-0.3890550711354175!39.46052328505597,-0.38689531313812037;39.459738005168106,-0.38822226990415315!39.46193614040639,-0.389429648171608;39.46154553298938,-0.3900677999760695!39.46191503182935,-0.39018276482275266;39.46111159154079,-0.3909970310465749!39.4639823644881,-0.39148296845987174;39.46280010046198,-0.39256505432368666!39.467739617727254,-0.38146113326699044;39.46706936907706,-0.3821015492686101!39.46976679855793,-0.3846784165944088;39.46901057325898,-0.3860335643592155!39.469205752460745,-0.38845565289929074;39.46871087077631,-0.38937325235434783!39.47401789680908,-0.39616459840290014;39.4733842284781,-0.396270815957154!39.47423647867104,-0.39540645561031434;39.47401789680908,-0.39616459840
I would like to know if this limitation exists because I am using a free trial plan or because the API itself has this limitation. If it is the latter, is there any way to work around it?
Below you can find the HERE guide to using avoid areas, in case anyone is interested.
https://developer.here.com/rest-apis/documentation/routing/topics/example-route-avoiding-an-area.html
Thanks,
This is an API limitation. Trial credentials don't limit the functionality of the available services.
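For context, the Routing API caps the number of avoidareas bounding boxes per request (the limit is commonly cited as 20; treat that number as an assumption and check the current documentation). A quick pre-flight check along these lines avoids the failed round trip; merging adjacent boxes into fewer, larger rectangles is the usual way to stay under the cap.

```python
# A minimal pre-flight check on the request URL.
# MAX_AVOID_AREAS is an assumed limit; verify it in the HERE Routing API docs.
from urllib.parse import urlparse

MAX_AVOID_AREAS = 20  # assumption, not confirmed here

def count_avoid_areas(request_url):
    query = urlparse(request_url).query
    for param in query.split("&"):
        if param.startswith("avoidareas="):
            value = param[len("avoidareas="):]
            # Bounding boxes are separated by "!"; each box is "topLeft;bottomRight".
            return len([box for box in value.split("!") if box])
    return 0

def check_request(request_url):
    n = count_avoid_areas(request_url)
    if n > MAX_AVOID_AREAS:
        raise ValueError(
            f"{n} avoid areas requested, assumed limit is {MAX_AVOID_AREAS}; "
            "merge adjacent boxes into fewer, larger rectangles."
        )
    return n
```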
I'm looking for the prices of additional API calls within Google Analytics.
In the developer console, I can enter my billing information, but before I do, I'd like to find a price table somewhere.
I know that the free quota is 50,000 requests per month, and I know we can ask for an extension, but I need much more and I'm ready to pay. The only question is: what does it cost?
Thanks!
That's 50,000 requests per day. If you need more, you can go Premium, but even if you do, some limits will remain the same. Check their page on Collection Limits and Quota. Pricing probably varies with your needs and scope.
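If you stay on the standard tier, the usual advice for Google's reporting APIs is to spread requests out and retry quota errors with exponential backoff rather than failing outright. A rough sketch of that pattern; the status codes and the request_fn callable are illustrative assumptions, not the official client library.

```python
# A generic exponential-backoff sketch for quota-limited Google APIs.
# request_fn is any callable returning an object with a status_code attribute
# (e.g. a requests.Response); the retryable codes are assumptions.
import random
import time

def call_with_backoff(request_fn, max_retries=5):
    for attempt in range(max_retries):
        response = request_fn()
        if response.status_code not in (403, 429, 503):
            return response
        # Quota or rate-limit error: wait 2^attempt seconds plus jitter.
        time.sleep(2 ** attempt + random.random())
    raise RuntimeError("Quota still exceeded after retries")
```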
Say I have an article which has been viewed 100 times and has an Average Visit Duration of 01:00:00 hrs. Is there any way I can break down those statistics and see how long each individual visit lasted?
(I should state that I'm not looking to find out information about particular IP addresses or anything like that. I just want to get some idea of the 'mode visit' - the time most people spent on the page.)
Google Analytics doesn't provide detailed insights into individual visitors. If you want more granular data, try CardioLog Analytics.
Yes, right, Google doesn't provide that. I tend to use Sitemeter in conjunction with Google. Not sure if I recommend Sitemeter though. It does give specifics about individual visitors, but they are very flaky. I don't think I've ever gotten a response from their so-called "tech support" or anything else from them.
The short answer is no, you can't. Google Analytics doesn't provide individual visitor details, as doing so would violate the GA Terms of Service.
However, there are a couple of ways to get at, or close to, this information:
1) Create an advanced segment - use the "Page" dimension and include the URI of the article on your site. Apply it and then look at the city or service provider report - it will show you all visits that viewed the article.
2) Keep a copy of the tracking data sent to Google and process it with on-premises web analytics software that doesn't have the same ToS/privacy restrictions.
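To illustrate option 2: once you have your own copy of the hits, the "mode visit" the question asks about is easy to compute. A sketch, assuming you can already derive a per-visit duration in seconds for the article from that data.

```python
# A sketch of the "mode visit" calculation over per-visit durations (in seconds),
# assumed to come from your own copy of the tracking hits. Durations are bucketed
# so near-identical visits group together.
from collections import Counter

def mode_visit_duration(durations_seconds, bucket_seconds=30):
    """Return the most common duration bucket as a (start, end) range in seconds."""
    buckets = Counter(d // bucket_seconds for d in durations_seconds)
    most_common_bucket, _ = buckets.most_common(1)[0]
    start = most_common_bucket * bucket_seconds
    return start, start + bucket_seconds

# Example: most visits in this sample fall in the 60-90 second bucket.
print(mode_visit_duration([65, 70, 80, 3600, 15, 75, 62]))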
I'm trying to determine why an enterprise wouldn't want to use Google Analytics.
Here are the main reasons I've seen mentioned:
Inability to track clients that have Javascript disabled.
Lack of ownership of the statistics - Google owns the data.
Most of the web clients with Javascript disabled will probably be bots/spiders. This data is interesting, but probably not very useful.
As for the ownership issue, this is a bit paranoid IMO.
What am I missing here? When is Google Analytics not good enough?
Here are my findings from additional research:
Google Analytics is limited to 5 million page views per month - source
If a web site generates more than 5 million pageviews per month, it will need to be linked to an active AdWords account to avoid interruption of service.
Lack of / slow technical support
All Google support is handled through email, and response times can take a week or more. Commercial analytics products often have much faster and more personalized support.
Inability to track files (PDFs, images, etc.)
GA relies on JavaScript, and static files cannot execute JavaScript. The workaround is to tag the link, but this won't track requests that go directly to the file (see the sketch after this list for counting those direct requests in the server's access log).
Limited ability to customize
This is a selling point that I see pushed by commercial analytics tools (WebTrends). However, it's never explained which customizations GA denies but WebTrends allows.
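On the file-tracking point above: requests that go straight to a PDF or image never run the GA tag, but they do show up in the web server's access log, so you can count them there. A small sketch; the combined log format and the file extensions are assumptions to adjust for your server.

```python
# Count direct requests to downloadable files from a combined-format access log.
# The log path and format are assumptions for illustration; adjust to your server.
import re
from collections import Counter

REQUEST_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')
FILE_EXTENSIONS = (".pdf", ".zip", ".png", ".jpg")

def count_file_downloads(log_path):
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = REQUEST_RE.search(line)
            if match and match.group("path").lower().endswith(FILE_EXTENSIONS):
                counts[match.group("path")] += 1
    return counts

if __name__ == "__main__":
    for path, n in count_file_downloads("access.log").most_common(10):
        print(n, path)
```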
The Google Analytics EULA does not allow you to track individual users by identifying them. So if you wanted to add a custom variable for username to track how many times each user logs in, then you would be in a gray zone if not outright violating the EULA.
I use Google Analytics on about 10 sites right now and it's a great tool. In addition to all the analytics stats, you can tie it in with AdSense and it becomes a marketing/revenue tool and not just "wow, look at all these cool user stats". If there were a way to track by user ID in certain circumstances (e.g. if users agreed to it, or if they work for the company that owns the site), then I would have no issues.
Besides, it's free and all you have to do is add JavaScript to the files, so give it a try and see what you think after a few months.
One reason that was, surprisingly, not posted:
timing / speed of reaction
It takes at least 4 hours (up to 24) for GA to update your data.
This is OK for me personally in most cases, but when reacting fast is crucial (news sites, one-off events, etc.) you may want to employ some other solution (Mint comes to mind, but it's of course not the only one out there).
Thought I'd add my two pence worth to this thread, as this is a topic close to my heart and one I've debated with colleagues for years. We've used WebTrends in-house for as long as I can remember, back to version 4 of the log analyzer (how different things were back then!). Since Google Analytics came along, we've come under increasing pressure from certain parts of our business to switch, as "it does everything we need from an analytics tool".
Well, in many senses it does, especially these days. But I championed the integration of our CRM and web analytics tools back in 2006, and as our business isn't e-commerce (the 'conversion' happens offline, sometimes months after the visitor acquisition), we need to integrate in this way to get a true picture of campaign effectiveness and a notion of ROI.
All of this means we need access to the raw data and need to be able to join visitor records on sessionID, etc.; without this access we'd be screwed. I'd love it if we could roll without it, but the current requirements mean we can't, so this alone is a HUGE reason why Google Analytics is not good enough.
Over and out
For tracking desktop software or creating a white-label solution, there are better options.
For white-label and integration-based analytics, I use Mixpanel. For desktop software, I use DeskMetrics.
Google Analytics does not work well with mobile phones. While the iPhone and the Palm may be supported, many existing handsets do not support the JavaScript that Google uses.
If you're based in the UK, then theoretically you could be breaking the Data Protection Act by using Analytics.
If information about your users (like which web pages they're looking at) goes "outside the European Economic Area" and onto Google's servers in the US, then you're breaking the DPA.
Pretty obscure, but you did ask :)
Piwik avoids the problem because you host it on your own servers.
Lack of ownership of the statistics - Google owns the data.
... As for the ownership issue, this is a bit paranoid IMO.
One problem with it is that we can't even access the raw data. We had a use case this week where we wanted a visitor map for an executive presentation. We needed more flexibility in how the visitor map is displayed (we wanted to view the map in the Google Earth plug-in). In GA, you can't. You take what they give you. You can see a map of how many visits came from each city, but you can't export a data file of cities and number of visits to run the data through other tools. So, paranoia aside, there are significant limitations on what you can accomplish with GA.
However this is not a problem if you use Urchin, the self-hosted version of GA: you can export the data and do what you want with it. (And the exported data is richer than the web server log's, as it includes some analysis already.)
Since Piwik is open source, and pluggable, I imagine you could enhance the visitor map plug-in any way you wanted to. And export whatever data you want.
Whether this limitation affects you depends on your needs, obviously.
Update: I've now looked at the GA Data Export API, and it turns out that things you cannot do through the UI (but can with Urchin), you can do with this API. It does look like you can export the visit data I was talking about via a feed (although there are daily traffic caps on those requests). So sprinkle salt heavily on what I wrote above.
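For anyone who wants to reproduce that export, the query is roughly shaped like this; the endpoint, parameter names, and token handling follow the Core Reporting style and are assumptions to verify against the current API documentation (authentication setup is omitted).

```python
# A rough sketch of pulling visits-per-city via the reporting API.
# Endpoint, parameter names and the OAuth token handling are assumptions for
# illustration; check the current Google Analytics API documentation.
import requests  # third-party: pip install requests

def visits_by_city(profile_id, start_date, end_date, access_token):
    response = requests.get(
        "https://www.googleapis.com/analytics/v3/data/ga",
        params={
            "ids": f"ga:{profile_id}",
            "dimensions": "ga:city",
            "metrics": "ga:visits",
            "start-date": start_date,
            "end-date": end_date,
        },
        headers={"Authorization": f"Bearer {access_token}"},
    )
    response.raise_for_status()
    return response.json().get("rows", [])
```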
A couple more points that I've come across:
GA doesn't let you dig beyond full-day statistics; I would often like the ability to investigate whether a traffic dip the previous day was caused by the design update I did at 1pm or the soccer match on TV at 8pm.
GA doesn't offer a workaround for traffic spikes caused by DDoS attacks, Slashdotting, etc. When I'm looking at a GA visitor graph of 2009, all I can see is the 2-million-pageview spike on October 16th, pushing the entire rest of the year down flat against the horizontal axis of the graph. To get a meaningful graph, GA should offer the ability to trim or exclude outlying data points, or the ability to limit/bracket the graph window itself (see the sketch after this list for one way to do this on exported data).
GA doesn't have an event monitoring client (think Reinvigorate's Snoop tool).
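On the outlier point, a practical workaround is to export the daily series and cap the spike yourself before plotting. A sketch over a plain list of daily pageview counts:

```python
# Cap outlier days at a chosen percentile so a one-off spike doesn't flatten
# the rest of the year when you plot the exported series yourself.
def cap_outliers(daily_pageviews, percentile=99):
    ordered = sorted(daily_pageviews)
    # Value at the chosen percentile (nearest-rank method).
    cutoff = ordered[min(len(ordered) - 1, int(len(ordered) * percentile / 100))]
    return [min(value, cutoff) for value in daily_pageviews]

# Example: the 2,000,000-view day is clamped, the rest of the series is untouched.
series = [1200, 1350, 980, 2_000_000, 1100, 1250]
print(cap_outliers(series, percentile=80))
```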
While GA is very user-friendly, I've found it's not as granular as some of the other stats programs (or maybe I'm not looking in the right places). Before the marketing monkeys I work with began pushing GA, we were very satisfied with AWStats. The sheer scope of the data helped us on several occasions hone sites to better suit their audience. While GA is very shiny and laid out well, I personally still prefer the raw numbers like I used to get through AWStats.
Slow data processing speed - can be as low as 15-30 minutes for page views, but may be up to 48 hours for e-commerce
EULA is limiting in some cases
You won't own or have any control over the data; Google's engineers might use it (anonymously) for testing
Anything more complex requires customization - downloads and such are of no issue, but there are limits
Cross-domain tracking via the linker is faulty at best
Visit-based - proper tools report at the visitor level, while GA mostly works on visit-based reporting
Limited number of custom variables usable at one time (5)
No tech support, if you're realistic
Usually, by the time there is a downtime notice, the downtime is already over
API limitations (4 dimensions and 10 metrics at one time, and on top of that not all of them can be used together; see the sketch below for one way to split and join queries)
I have many more, but at the end of the day it is a good tool for its price.
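Regarding the dimension/metric caps: a common workaround is to run two queries that share a key dimension and join the rows client-side. A rough sketch; the report shapes below are illustrative and not tied to a particular client library.

```python
# Work around per-query metric/dimension caps by running two queries that share
# a key dimension (here a date string, as an assumption) and joining the rows locally.
def join_reports(rows_a, rows_b, key_index=0):
    """Each rows_* is a list of rows shaped like [key, metric1, metric2, ...]."""
    by_key = {row[key_index]: row for row in rows_a}
    joined = []
    for row in rows_b:
        key = row[key_index]
        if key in by_key:
            joined.append(by_key[key] + row[key_index + 1:])
    return joined

# Example with two small result sets keyed by date.
first = [["2013-01-01", 120, 80], ["2013-01-02", 150, 95]]
second = [["2013-01-01", 12], ["2013-01-02", 9]]
print(join_reports(first, second))
# [['2013-01-01', 120, 80, 12], ['2013-01-02', 150, 95, 9]]
```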
From a non-technical point of view, I think the most important issue is that some enterprises have strict data security policies: all data should be controlled and managed by the enterprise itself.
If you use Google Analytics, the data is stored on Google's servers. For certain kinds of enterprise, like insurance or financial companies, that policy has to be followed.
I would NOT go with server logs. In fact, I have them disabled on my server. Why, you ask?
For the simple reason that every time you hit my server, that stupid logging program makes an entry in the physical log file on my HDD. So if my server gets 100,000 hits in a day, that's 100,000 HDD write operations.
You think that's cool? Well, it's not. It slows your server down, especially if the log file is huge.
Why would someone even consider doing that to their server? Especially when we're working so hard to minify JavaScript and CSS and make image files 2 KB smaller!
Please do yourself a favor and don't log directly on your server.
At least Google Analytics logs it on Google's servers, so my server stays healthier.
I wouldn't use it for any of my sites, because you're forcing the user to accept your proprietary JavaScript code in their browser, which is bad. Also, giving your data to Google is a really bad idea.
See Piwik for something you can run yourself as free software, eliminating both of these problems.