Web Site URL for analytics tracking in a Spotify app? - google-analytics

I want to track pageviews and user interactions in a Spotify app.
Spotify says it's possible: https://developer.spotify.com/technologies/apps/guidelines/integration/#usertrackinganalytics
Which »Web Site URL« do I enter when setting up a property for analytics tracking in a Spotify app?
Or am I approaching this wrong and need to do it another way?

It doesn't matter.
You can use a fake one if you like.
The only place GA uses that URL is when creating links back from the reports to your website. Since your app isn't really a website - it's more like an extension, from what I understand - you don't need those links anyway.
If you're not feeling creative, just use:
http://my-spotify-app.com
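For what it's worth, a minimal sketch (TypeScript) of the classic ga.js setup with a made-up URL like that; the property ID is a placeholder, and _setDomainName('none') is an assumption for an app that has no real domain to set cookies on:

    // Classic async ga.js snippet - UA-XXXXXXX-Y is a placeholder property ID.
    // _setDomainName('none') is assumed because a Spotify app has no real cookie domain.
    var _gaq: any[] = (window as any)._gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-Y']);
    _gaq.push(['_setDomainName', 'none']);
    _gaq.push(['_trackPageview', '/my-spotify-app/home']); // a virtual pageview per app view
    (window as any)._gaq = _gaq;

    // Load ga.js asynchronously, as in the standard snippet.
    (function () {
      const ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' === document.location.protocol ? 'https://ssl' : 'http://www')
        + '.google-analytics.com/ga.js';
      const s = document.getElementsByTagName('script')[0];
      s.parentNode!.insertBefore(ga, s);
    })();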

Related

Tracking links within my site

I want to track particular links on my site to see where they come from. For example, I want to know which links on my navigation are being clicked, so if something is not being clicked I could potentially remove it.
I have been using UTMs, which is super easy, but it results in skewed analytics data.
I looked into Google Tag Manager, but I don't want to slow down my website. I can change the site easily, so I'm not sure if this is the best solution.
I found an article dated 2008 that says I can do this:
https://www.example.com/?from=topnav
Is that still valid? Is there a better way? I can't seem to find any information on this, and I assume somebody else must want this information too.
Thank you.
I have been using UTMs, which is super easy, but it results in skewed analytics data.
UTM codes are meant to track inbound traffic. Don't use them to track internal/outbound navigation, as it will seriously mess up your reporting.
I looked into Google Tag Manager, but I don't want to slow down my website.
GTM loads asynchronously, just like GA, so performance-wise they are equivalent.
I found an article dated 2008 that says I can do this:
https://www.example.com/?from=topnav
By default GA will not track link clicks. You can indeed add parameters to URLs and then use those to build custom reports and see which links are being clicked.
Since what you're trying to do is a custom implementation, you won't find a single best answer; it's up to you to implement something that fits your needs. These are some examples, with a small sketch of one event-based approach after the links:
https://analytical42.com/2017/track-internal-links-google-analytics-gtm/
https://www.gravitatedesign.com/blog/can-google-analytics-track-link-clicks/
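As a rough illustration of the event-based approach those articles describe, here's a minimal sketch (TypeScript), assuming Universal Analytics' ga() queue is already loaded and a hypothetical data-nav-id attribute on your navigation links:

    // Send a GA event when a navigation link is clicked, instead of tagging internal
    // links with UTMs. Assumes analytics.js is loaded and links look like
    // <a href="/pricing" data-nav-id="topnav-pricing">.
    document.querySelectorAll<HTMLAnchorElement>('nav a[data-nav-id]').forEach((link) => {
      link.addEventListener('click', () => {
        (window as any).ga('send', 'event', 'Navigation', 'click', link.dataset.navId);
      });
    });

You can then build a custom report on the "Navigation" event category without skewing your acquisition data.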

Google Analytics, iframes & cross-domain

I have GA on every page on one domain (actually not me, but my company, whose programmer needs auditing). Just the default code (Classic version, ga.js), no special accommodations whatsoever that I've seen or know of. Barely any configuration beyond registering the service with the main site...
All the pages are either aspx or static HTML. It's common practice for this guy to embed pages on the site within other pages on the site in iframes, where both the parent (top-level) & child (embedded) pages contain the GA script.
I don't really know much at all about GA and have never worked with it, but I do suspect that this might result in extra hits being counted by GA, and that this may be messing with the metrics. But then I've read about GA using first-party cookies, so by default pages loaded in iframes won't be tracked/counted... I could really use some clarification on this, please.
Then our programmer frames pages from the main site in pages on other sites that we own, which are on different domains. So then there's this cross-domain business, with no segregation of sources, because they really don't care much. So what should be the outcome of that? The external sites' pages don't have the GA code.
However, we're rebuilding one of those other sites - actually I am, for the most part - and the programmer told me to just copy and paste the same exact GA script used on the main site into that one. So, it's a different domain. That wouldn't work as-is, would it? Wouldn't there have to be some sort of special configuration, setting of the domain, something?
I'd really appreciate if someone could tell me more about the scenarios described above. Thanks in advance.
In the Google Analytics developer menu, you can create a new 'profile' for this new site. The analytics will then be tracked for just that one site, not for all. In theory, it is possible to use one GA.js for all your sites, but it kind of kills the whole concept of Google Analytics, so it's not recommended.
You really shouldn't be using iframes anymore, IMO. There are reasons to use them, like embedding tracking code, etc.; I think even GA uses iframes. But generally Google doesn't like them, because a lot of spammers use them to try to fool the Google crawler.
Also, it gets very complicated to understand what is going on within GA.
To answer your question: Each iframe is like an independent webpage completely separated from the other webpage (for security reasons). So when Google or a web browser goes to your website it will do this:
Load your main html document.
Render that page.
See that you have an iframe.
Load that page in the iframe.
Render the iframe.
Now, if you don't have GA installed on the iframe page it will not track the page being loaded.
But if you do put GA in the iframe page, it will record a hit whenever the iframe (or the page containing it) is loaded.
But, remember that one of the main reasons for having GA is to see where your customers are coming from and why. If you have an iframe of another webpage, you really don't know whether that is because a customer is:
A) visiting your website from the page directly,
OR
B) visiting that page through an iframe on another page.
It can get very complicated.
You must generate a new tracker for each domain you are using. Otherwise, what is to stop someone from just copying your GA code and putting it on their webpage?
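For example, with the classic ga.js code the question describes, the second domain would get its own snippet along these lines (a sketch only; the property ID and domain are placeholders, not a full cross-domain setup):

    // Sketch for the second domain - UA-XXXXXXX-2 and other-site.com are placeholders.
    var _gaq: any[] = (window as any)._gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-2']);      // a separate property/profile for this site
    _gaq.push(['_setDomainName', 'other-site.com']); // scope the GA cookie to this domain
    _gaq.push(['_trackPageview']);
    (window as any)._gaq = _gaq;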

Can I track who is linking or manipulating my site's data?

Is it possible to track if someone links to data on my site? Specifically, if my data is used in a site dynamically generated by a developer program? I would like to know if someone is blatantly passing off my site's data as their own. There are obviously ways around directly linking to content, such as content manipulation or even manual manipulation. But if someone were to link (or directly add word for word, or manipulate) my content into their website, is there a way to track it?
Can I avoid someone being able to scrape my website at all, or is everything just up for grabs?
The best and easiest answer is Google Webmaster Tools!
HERE
Actually doing that yourself is very hard - you would need to crawl the web to discover the links that point to your pages. Dynamic content gets linked as well, so Google will find it too.
This tool will let you see external links that point to your site, and you can check them.
As an extra measure, you can monitor requests and traffic to your site and look for IPs that hit the same page over and over again; that can tell you that an external page is dynamically loading content from your web page (a rough sketch of this follows below).
EDIT:
Here is a good article on this subject: link - scroll down and you can see the use of Google Webmaster Tools with some other programs and methods.
Here is a good starting guide to Google Webmaster Tools: link
ENJOY!
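As a rough illustration of the log-monitoring idea above, a small sketch (TypeScript on Node; the access.log file name and common log format are assumptions) that counts how often each IP requests the same page:

    import { readFileSync } from "fs";

    // Count how often each IP requests each path; repeated hits on one page from one IP
    // can hint that another site is pulling your content.
    const lines = readFileSync("access.log", "utf8").split("\n");
    const counts = new Map<string, number>();

    for (const line of lines) {
      // Common log format: IP - - [date] "GET /path HTTP/1.1" status size
      const match = line.match(/^(\S+) .*"(?:GET|POST) (\S+)/);
      if (!match) continue;
      const key = `${match[1]} ${match[2]}`;
      counts.set(key, (counts.get(key) ?? 0) + 1);
    }

    // Print the most repetitive IP/path combinations first.
    [...counts.entries()]
      .sort((a, b) => b[1] - a[1])
      .slice(0, 20)
      .forEach(([key, n]) => console.log(n, key));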

How to get search engines to understand a DB-driven ASP.NET site

All,
This would seem like a fairly basic asp.net question - but in all my years of coding, I've never really thought about it.
Say you have an ASP.NET 2.0 site with only a master page and a default.aspx, and it's a blog that saves all the data into the database. Links on the side are generated automatically. So ... the URL is always just http://www.XXXXX.com/default.aspx.
So, with that being the case, what do you need to do so that ... say Google ... knows about all the different blog entries and links directly to the entries instead of just the base URL?
Is it as simple as changing the form's method to method="get"?
Thanks, L. Lee Saunders
There are at least two solutions:
Search engines understand query strings, so just add the article IDs to the URLs in your anchor tags -- no need to even use a form control.
Use URL rewriting to expose one set of URLs to the outside world (like /article-title/1234/) in your anchor tags, and then modify the URL to be default.aspx when it arrives at your site; the page could then pull the article to be displayed from any number of places, including but not limited to a query string.
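Purely to illustrate the shape of that rewrite (the real thing would be an ASP.NET HttpModule or rewrite rule, so this TypeScript sketch and its names are hypothetical):

    // Map a friendly URL like /article-title/1234/ to the page the site actually serves.
    function rewriteFriendlyUrl(path: string): string | null {
      const match = path.match(/^\/[^/]+\/(\d+)\/?$/);
      if (!match) return null;                      // not an article URL; leave it alone
      return `/default.aspx?postid=${match[1]}`;    // internal URL handled by default.aspx
    }

    console.log(rewriteFriendlyUrl("/this-is-my-post/1234/")); // "/default.aspx?postid=1234"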
You could have a REST web service so that you can just use URLs to navigate the site, and perhaps have a front page with some new posts, so that the spider can navigate the site.
As an example, look at the URLs for SO; it is easy for a spider to navigate this database-driven website.
Create a page that just serves up an XML Sitemap (the data obviously being pulled from your database) and submit the sitemap to Google.
Google will then index any links in your sitemap.
(This assumes that there is some difference between each article - e.g. a querystring key/value.)
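A minimal sketch (TypeScript) of what such a sitemap page might emit; the article IDs are hard-coded here as an assumption, and in ASP.NET this would be a handler or .aspx page writing the XML:

    // Build sitemap XML from article IDs pulled from the database (hard-coded for the sketch).
    function buildSitemap(articleIds: number[]): string {
      const urls = articleIds
        .map((id) => `  <url><loc>http://www.XXXXX.com/default.aspx?postid=${id}</loc></url>`)
        .join("\n");
      return `<?xml version="1.0" encoding="UTF-8"?>\n` +
             `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`;
    }

    console.log(buildSitemap([1234, 1235, 1236]));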
Useful Link(s):
Web Sitemap Generators
Google Sitemap Validator
Google Sitemaps for ASP.NET 2.0 (there are about a gazillion interesting links off the back of this as well).
Some sort of URL rewriting may be an answer.
I wouldn't recommend a postback for your situation; it can get ugly for refreshes, etc. So, yes, change the method to "get".
Then a page like default.aspx?postid=12345 can get translated into /mm/dd/yy/this-is-my-post.aspx.

Using ASP.Net, is there a programmatic way to take a screenshot of the browser content?

I have an ASP.NET application for which, as a desired feature, users would like to be able to take a screenshot. While I know this can be simulated, it would be really great to have a way to take a URL (or the currently rendered page) and turn it into an image which can be stored on the server.
Is this crazy? Is there a way to do it? If so, any references?
I can tell you right now that there is no way to do it from inside the browser, nor should there be. Imagine that your page embeds GMail in an iframe. You could then steal a screenshot of the person's GMail inbox!
This could be made safe by having the browser "black out" all iframes and embeds that would violate cross-domain restrictions.
You could certainly write an extension to do this, but be aware of the security considerations outlined above.
Update: You can use a canvas utility function to get a screenshot of a page on the same origin as your code. There's even a lib to allow you to do this: http://experiments.hertzen.com/jsfeedback/
You can find other possible answers here: Using HTML5/Canvas/JavaScript to take screenshots
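For example, with the html2canvas library behind that link, a same-origin capture looks roughly like this (TypeScript; API details vary by library version, and the upload endpoint is hypothetical):

    import html2canvas from "html2canvas";

    // Render the visible page into a canvas, then turn it into a PNG data URL.
    // Only same-origin content will be captured.
    html2canvas(document.body).then((canvas: HTMLCanvasElement) => {
      const png = canvas.toDataURL("image/png");
      // Hypothetical endpoint - sending it to the server is up to your application.
      fetch("/api/screenshots", { method: "POST", body: png });
    });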
Browsershots has an XML-RPC interface and available source code (in Python).
I used the free assembly UrlScreenshot.dll, which you can download here.
Works nicely!
There is also WebSiteScreenShot but it's not free.
You could try a browser plugin like IE7 Pro for Internet Explorer, which allows you to save a screenshot of the current site to a file on disk. I'm sure there is a comparable plugin for Firefox out there as well.
If you want to do something like you described, you need to call an external process that prints the IE output, as described here.
Why don't you take another approach?
If you have the need that users can view the same content over again, then it sounds like that is a business requirement for your application, and so you should be building it into your application.
Structure the URL so that when the same user (assuming you have sessions and the application shows different things to different users) visits the same URL, they always see the same thing. They can then bookmark the URL locally, or you can even have an application feature that saves it in a user profile.
Part of this would mean making "clean URLs", e.g. site.com/view/whatever-information-needed-here.
If you are dealing with time-based data, where it changes as it gets older, there are a couple of possible approaches.
If your data is not changing on a regular basis, then you could always make the "current" page a dated URL, e.g. site.com/view/2008-10-20 (adding hour/minute/second as appropriate).
If it is refreshing and/or updating more regularly, have the "current" page as site.com/view, but allow specifying the exact time afterwards. In this case, you'd have to have a "link to this page" type function, which would link to the permanent URL with the full date/time. Look to Google Maps for inspiration here: if you scroll across a map, you can always click "link to here" and it will provide a link that includes the GPS coordinates, objects on the map, etc. In that case it's not a very friendly URL, but it does work quite well. :)
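To make the "link to this page" idea concrete, a tiny sketch (TypeScript; the URL shape is made up):

    // Build a permanent URL that encodes the exact timestamp into the path.
    function permalinkFor(view: string, when: Date = new Date()): string {
      const stamp = when.toISOString().slice(0, 19).replace(/:/g, "-"); // e.g. 2008-10-20T14-35-00
      return `https://site.com/${view}/${stamp}`;
    }

    console.log(permalinkFor("view")); // e.g. https://site.com/view/2008-10-20T14-35-00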
