Has anyone found any documentation or research about what data is transferred to Google Analytics when it's added to a site? The main thing I'm wondering about is POST data, but the details of exactly what is sent would be useful.
I'm considering implementing it on sites that hold a lot of private data. I'm wondering what data Google will capture, if any. (The sites are login-only.) I need proof that I can provide to the users.
The official information can be found here:
The visitor tracking information that you can get in the Google Analytics reports depends on JavaScript code that you include in your website pages, referred to as the Google Analytics Tracking Code (GATC). Initial releases of the GATC used a JavaScript file called urchin.js.
That script is then discussed in detail in that blog post, and the Google Analytics Help group can also provide some details.
A more detailed list of what that JavaScript collects is available here.
I found the official Google documentation here:
http://code.google.com/apis/analytics/docs/tracking/gaTrackingTroubleshooting.html
I also found this discussion VERY useful:
http://www.google.com/support/forum/p/Google%20Analytics/thread?tid=5f11a529100f1d47&hl=en
It helped me figure out what utmcc actually does.
All the info passes via URL and POST params:

Request parameters:

page    1
utmac   UA-745459-1
utmcc   __utma=52631473.656111131.1231670535.1235325662.1235336522.264;+__utmz=52631473.1235287959.257.8.utmccn=(organic)|utmcsr=google|utmctr=site:domain.com|utmcmd=organic;+
utmcs   windows-1255
utmdt   page title
utmfl   10.0 r12
utmhid  1524858795
utmhn   www.domain.com
utmje   1
utmn    1273285258
utmp    /shakeit/?
utmr    0
utmsc   32-bit
utmsr   1280x800
utmul   en-us
utmwv   1.3

Request headers:

Host: www.google-analytics.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.6) Gecko/2009011913 Firefox/3.0.6 (.NET CLR 3.5.30729)
Accept: image/png,image/*;q=0.8,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://www.hadash-hot.co.il/shakeit/?&page=1
Pragma: no-cache
Cache-Control: no-cache
Look at http://www.google-analytics.com/urchin.js under the function urchinTracker and you'll see what's going on :)
I recommend trying the Google Chrome extension: https://chrome.google.com/extensions/detail/jnkmfdileelhofjcijamephohjechhna
This extension will show debug information for all of the data sent to Google Analytics. It's especially helpful when you're adding new analytics features and want to verify they're working the way you expect.
Related
We set up server-side tagging using the Docker container Google provides in its "manual setup guide".
Everything is working fine, but all requests against the tagging server are answered without any compression: no gzip, no deflate, no br, just plain text.
Is there anything we're missing? The docs provided by Google don't give any hints...
As of 2022, this is not possible. We used a cdn that gives us content compression.
I am assuming that all web browsers send User-Agent, DNT, Accept, Accept-Language, Accept-Encoding, etc. automatically, and that the web developer does not have to do anything to set these headers. I am saying this because www.whatismybrowser.com previously used to show these header values.
If so, which headers are set by the web browser and sent automatically?
OP here. I got the answer from reddit.
One thing you could easily do is create a page like test.php and set it to just:
<?php
// Dump every server/request variable, including the headers the browser sent
header('Content-Type: text/plain');
print_r($_SERVER);
Then visit that page in the different browser and OS combos that you care about and take whatever notes you're looking for.
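If you'd rather not stand up PHP, here's a rough Python equivalent using only the standard library: a tiny server that echoes back whatever headers the client actually sent (the port number is arbitrary):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHeaders(BaseHTTPRequestHandler):
    """Replies with a plain-text dump of the headers the client transmitted."""

    def do_GET(self):
        # self.headers holds exactly what arrived on the wire
        body = "".join(f"{name}: {value}\n"
                       for name, value in self.headers.items())
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

def run(port=8000):
    # Call run(), then visit http://localhost:8000/ in each browser
    HTTPServer(("", port), EchoHeaders).serve_forever()
```

Whatever shows up in the response is what that browser sends automatically; anything missing would have to be set by you (or by JavaScript on the page).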
A website was audited for vulnerabilities and XSS was flagged for many pages which, from my point of view, don't appear to be vulnerable, as I don't display any data captured from the page or the URL (such as the query string).
Acunetix flagged the following URL as XSS by adding some JavaScript code:
http://www.example.com/page-one//?'onmouseover='pU0e(9527)
Report:
GET /page-one//?'onmouseover='pU0e(9527)'bad=' HTTP/1.1
Referer: https://www.example.com/
Connection: keep-alive
Authorization: Basic FXvxdAfafmFub25cfGb=
Accept: */*
Accept-Encoding: gzip,deflate
Host: example.com
So, how could this be vulnerable, or is it possible that it's vulnerable?
Above all, if onmouseover can be injected like this, how would it actually be exploited?
Since you asked for more information, I'll post my response as an answer.
The main question as I see it:
Can there still be an XSS vulnerability from the query string if I don't use any of the parameters in my code?
Well, if they actually aren't used at all, then it should not be possible. However, there are subtle ways that you could be using them that you may have overlooked. (Posting the actual source code would be useful here).
One example would be something like this:
Response.Write("<a href='" +
    HttpContext.Current.Request.Url.AbsoluteUri + "'>share this link!</a>");
This would put the entire URL in the body of the web page. The attacker can make use of the query string even though they aren't mapped to variables because the full URL is written in the response. Keep in mind it could also be in a hidden field.
Be careful writing out values like HttpContext.Current.Request.Url.AbsoluteUri or HttpContext.Current.Request.Url.PathAndQuery.
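The fix is to encode untrusted values before writing them out. The site in question is ASP.NET, but the principle is easiest to show in a short Python sketch (share_link and the payload URL are illustrative, not from the actual site):

```python
import html

def share_link(url: str) -> str:
    # html.escape(..., quote=True) encodes <, >, &, " and ', so an
    # attacker-supplied URL cannot break out of the attribute value
    return "<a href='" + html.escape(url, quote=True) + "'>share this link!</a>"

# Attacker-controlled URL in the same shape as the scanner's payload
evil = "http://www.example.com/page-one//?'onmouseover='pU0e(9527)'bad='"
print(share_link(evil))
# The single quotes come out as &#x27;, so no onmouseover attribute is created
```

In ASP.NET the equivalent is to run the value through an encoding routine (e.g. an anti-XSS library's attribute encoder) before concatenating it into markup.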
Some tips:
Confirm that the scanner is not reporting a false positive by opening the link in a modern browser like Chrome. Check the console for an error about "XSS Auditor" or similar.
Use an antixss library to encode untrusted output before writing to the response.
Read this: https://www.owasp.org/index.php/XSS_(Cross_Site_Scripting)_Prevention_Cheat_Sheet
We've noticed that some users of our website have a problem: if they follow links to the website from an external source (specifically Outlook and MS Word), they arrive at the website in such a way that User.IsAuthenticated is false, even though they are still logged in in other tabs.
After hours of diagnosis, it appears to be because the FormsAuthentication cookie is sometimes not sent when the external link is clicked. If we examine the traffic in Fiddler, we see different headers for links clicked within the website versus those resulting from clicking a link in a Word document or email. There doesn't appear to be anything wrong with the cookie (it has "/" as the path, no domain, and a future expiration date).
Here is the cookie being set:
Set-Cookie: DRYXADMINAUTH2014=<hexdata>; expires=Wed, 01-Jul-2015 23:30:37 GMT; path=/
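As a sanity check on those attributes, the Set-Cookie line can be parsed with Python's standard http.cookies module. Here "deadbeef" stands in for the real <hexdata> value:

```python
from http.cookies import SimpleCookie

# The Set-Cookie header from above, with a stand-in for the real hex value
raw = "DRYXADMINAUTH2014=deadbeef; expires=Wed, 01-Jul-2015 23:30:37 GMT; path=/"

jar = SimpleCookie()
jar.load(raw)

morsel = jar["DRYXADMINAUTH2014"]
# path -> "/", expires -> "Wed, 01-Jul-2015 23:30:37 GMT"
print(morsel["path"], "|", morsel["expires"])
```

The attributes check out, which is consistent with the problem lying on the client side rather than in how the cookie is set.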
Here is a request sent from an internal link:
GET http://domain.com/searchresults/media/?sk=creative HTTP/1.1
Host: domain.com
Cookie: Diary_SessionID=r4krwqqhaoqvt1q0vcdzj5md; DRYXADMINAUTH2014=<hexdata>;
Here is a request sent from an external (Word) link:
GET http://domain.com/searchresults/media/?sk=creative HTTP/1.1
Host: domain.com
Cookie: Diary_SessionID=cpnriieepi4rzdbjtenfpvdb
Note that the .NET FormsAuthentication token is missing from the second request. The problem doesn't seem to be affected by which browser is set as default and happens in both Chrome and Firefox.
Is this normal/expected behaviour, or there a way we can fix this?
Turns out this is a known issue with Microsoft Word, Outlook and other MS Office products: <sigh>
See: Why are cookies unrecognized when a link is clicked from an external source (i.e. Excel, Word, etc...)
Summary: Word tries to open the URL itself (in case it's an Office document) but gets redirected because it doesn't have the authentication cookie. Due to a bug in Word, it then incorrectly opens the redirected URL in the OS's default browser instead of the original URL. If you monitor the "Process" column in Fiddler, it's easy to see the exact behaviour from the linked article occurring.
For Filepicker.io we built "grab from URL", but certain sites aren't happy when no User-Agent header is passed. I could just use a stock browser user agent as suggested in some other answers, but as a good web citizen I wanted to know: is there a more appropriate User-Agent to set for a server requesting another server's data?
It depends on the language you wrote your server in. For example, Python's urllib sets a default of User-Agent: Python-urllib/2.1, but you can just as easily set it to something like User-Agent: filepicker.io/<your-version-here>, or something more language-specific if you'd like.
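For example, with Python's urllib (the product name, version string, and info URL here are made up; use whatever identifies your service):

```python
import urllib.request

# A descriptive User-Agent for server-to-server fetches: product/version
# plus a URL where site owners can learn who is requesting their content.
# (The values below are illustrative, not an official convention.)
req = urllib.request.Request(
    "http://example.com/some-image.png",
    headers={"User-Agent": "filepicker.io/1.0 (+https://www.filepicker.io)"},
)

# urllib normalizes stored header names to "Xxxx-yyyy" capitalization
print(req.get_header("User-agent"))
# filepicker.io/1.0 (+https://www.filepicker.io)
```

The "product/version (+info-URL)" shape mirrors what well-behaved crawlers advertise, so operators of the sites you fetch from can tell who you are.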