Gviz only showing first part of fusion table on Google Map - google-maps-api-3

I modified one of Geocodezip's examples to use my fusion table. The table is 40+ rows, but only the first 11 are displaying on the map. Everything geocodes OK from within the fusion table, but not here. Can someone please explain why?
http://6tango.com/Map_Examples/fusion_with_geocode.html
[EDIT] I just noticed that Geocodezip's example only has 11 points. Guess I'd better look at his JavaScript more closely.
On another note, is there an easy way to show a block of code here without manually adding spaces to each line? (please excuse my ignorance)
PS - Thank you Geocodezip for all your examples. For a newbie like me they are a God-send!

You are geocoding the entries in the table on the fly. Check for the case when status != google.maps.GeocoderStatus.OK and you will find you are getting OVER_QUERY_LIMIT responses.
I would suggest geocoding the addresses offline (or capturing the coordinates from the points that are displayed) and adding the resulting coordinates in additional columns of the table to avoid the problem.
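For illustration, the relevant check looks roughly like this in a typical Maps API v3 geocode callback (a sketch only; geocoder, address and map are placeholder names, not taken from the linked page):

// Log failed geocodes instead of silently dropping them.
// "geocoder", "address" and "map" are assumed to exist as in the usual
// Maps API v3 examples; the names are placeholders only.
geocoder.geocode({ address: address }, function (results, status) {
  if (status === google.maps.GeocoderStatus.OK) {
    new google.maps.Marker({
      map: map,
      position: results[0].geometry.location
    });
  } else {
    // With 40+ rows geocoded in a tight loop, OVER_QUERY_LIMIT typically
    // starts showing up here after the first ten or so requests.
    console.log('Geocode failed for "' + address + '": ' + status);
  }
});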

Related

Scraping Spotify Top 200 streaming data with R

Novice R user here. I'm looking to scrape a large amount of data on daily streaming volumes for songs on Spotify's Top 200 charts for a research project I am involved with. Basically, I would like to write a script that scrapes all the info for tracks in the top 200 on a given day, such as today's chart, and to have this done for every day over a number of years, across a number of countries. I used some code from a guide that I had followed previously to scrape this data successfully, but it is now not working for me.
I previously followed this guide pretty much word for word. While it originally worked, it now returns an empty tibble. I suspect that the problem may have to do with the fact that Spotify has redeveloped its charts site since my last attempt. The site is different in appearance, but, importantly, the HTML node names appear to be different as well. My hunch is that this is what is causing the issue.
However, I am not at all sure if this is the case. Would appreciate it greatly if I could have some guidance on what I would need to do differently to achieve my aims, and whether it is indeed still possible to scrape these charts.
Cheers

How to only display traffic from particular sources in a Google Analytics dashboard widget

I can't believe this is as difficult as I'm finding it to be, so I must be missing something obvious!
I want to track data from two particular ads, one on TheSixFifty.com and one in the Mountain View Voice email newsletter. I've gotten as far as identifying these two sources in a table:
https://imgur.com/a/ljbeonT
I want to only display those two sources, so I thought a filter would be the way to do that, set up like so:
https://imgur.com/p7lBxnk
But that results in this sad, sad empty table:
https://imgur.com/hOzdOdu
Please tell me what I am doing wrong! Does "containing" not mean what I think it does? Help!
You're right - it is something simple!
Your filter contains an AND statement, so it will only show data where the source contains BOTH mv-voice.com and TheSixFifty.com.
Your filter should look like:
Only show Source Matching RegEx:
(TheSixFifty|mv-voice)\.com
Here's a great intro to Regular Expressions from Robbin Steif's guide; they'll be incredibly useful for any analysis.
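If you want to sanity-check the expression outside of Analytics, a quick test against the two source values shows what the filter will and won't match (this is just an illustration, not part of the GA setup):

// Quick sanity check of the filter regex against example source strings.
var sourceFilter = /(TheSixFifty|mv-voice)\.com/;
console.log(sourceFilter.test('TheSixFifty.com')); // true
console.log(sourceFilter.test('mv-voice.com'));    // true
console.log(sourceFilter.test('google.com'));      // false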

Tableau reacts poorly to latitude and longitude for one particular location

I have been troubleshooting this issue where I click on a data point in the geographic map and NOTHING appears. However, all other data points work as expected.
Troubleshooting steps
I deleted and re-created the Tableau map
I removed the offending data point, and all other data points worked
I renamed the address of the data point; same problem
But then I changed the latitude and longitude of the data point and it worked.
Now when I revert to the correct latitude and longitude of the data point, it doesn't work.
Why on Earth doesn't it work?
Is there a certain way I should format longitude and latitude? This is how I formatted it:
Please please please help. I've been working on this all day.
This is a screenshot of it working, when I select the location from the drop-down
This is a screenshot when I select the data point from the map and it DOESN'T WORK.
Notice how the data at the bottom is BLANK, as if nothing is selected.
But if I select any other data point on the map, it works
Update
Proof of concept is here,
Notice that when you click on Eat at Joes on the map, the data display is blank, but if you select Eat at Joes from the drop-down, then it works
Another update
If I go to the dashboard and do a rectangular select, this is what I get
If I go to the original map and do a rectangular select, I get this. It says 64 marks, 1 row by 1 column
And if I use a quick filter and select Eat at Joes, it displays the data, including the name Eat at Joes
But if I hover over or select 'Eat at Joes' on the map (not the drop-down), the name in the dashboard says 'None' instead of 'Eat at Joes'.
And this happens to all data points that I hover over.
I uploaded latest workbook here
Update after calculated field
I dragged Cal_Loc to Detail and it is an aggregate (I am unable to change it to a Dimension), and it is not appearing in the drop-down of the dashboard panel.
However, the lower left corner of the Maps screen says 5 marks even though I see 4, which is still quite unsettling ...
There are two problems.
The first is that you're using ATTR(). Instead of putting the fields of interest into Tooltip as attributes, put them into Detail as dimensions. Filtering on an attribute is tricky (that * will get you into all sorts of trouble). For filtering, dimensions are usually the way to go.
The second is that you have lat/lng in the map as dimensions. Try changing them to measures. If your dimensions (Location, Type) can uniquely identify every point on the map (and now that you've made them dimensions and not attributes, they can), then you can have the lat/lng averaged.
Your title problem is a known issue with Tableau. They've acknowledged the problem for about two and a half years now, but there's no fix in sight. Behavior when putting dimensions in titles is very inconsistent (a quick search through the Tableau forums reveals a pretty shocking number of people with your exact issue). I couldn't find a general solution to your problem, but here's a hacked-together one specific to your situation.
Make a calculated field:
IIF(COUNTD([Location]) > 1, 'Multiple Locations', ATTR([Location]))
Then replace [Location] in your title with that field. It just checks to see how many locations are present in the partition. If there's just one, it uses ATTR([Location]), which we can safely assume will return the name of a single location and not a "*". Otherwise, it returns "Multiple Locations", which you can obviously adjust to fit your needs.

Google store locator library limit markers in right hand panel

I'd like to limit the number of markers that appear on the map and in the right-hand panel to something like 10 at any zoom level.
How can this be achieved?
The library and examples can be found here:
http://storelocator.googlecode.com/git/index.html
I am following the code example given here:
http://storelocator.googlecode.com/git/examples/panel.html
There is a code reference here:
http://storelocator.googlecode.com/git/reference.html
But it's still not clear to me exactly how I can customise the example I am following so that it only shows a maximum of 10 markers at any one time.
EDIT : Why I want to do this
I sell a product wholesale to many salons. With this map I am trying to show customers which salons they can go to buy the products I supply.
However, in the example given by Google, the full list of salons appears as markers on the map. This is not good because it is then possible for competitors to glean an entire list of salons that they can market competing products to.
The solution I'd like would be to only show a maximum of 10 markers at a time, according to whichever are closest to the inputted address.
For me the example (http://storelocator.googlecode.com/git/examples/panel.html) always shows only up to 10 entries in the panel. There is a hardcoded limit of 10, so it's not possible to change that without modifying store-locator.min.js.
But when you want to display fewer than 10 entries, it is possible via CSS:
/* limit the displayed entries to 5 */
.store-list li:nth-child(n+6){display:none}
When you want to apply a higher limit (or when it should be compatible with IE < 9), edit this part in store-locator.min.js (line 28):
m=e.min(10,c[E]);
(set the 10 to the desired value)
To limit the number of results at all, edit this line in MedicareDataSource.prototype.parse_:
for (var i = 1, row; row = rows[i]; i++)
and set it to
for (var i = 1, row; row = rows[i],i<XXX; i++)
(where XXX is the limit +1, so e.g. setting XXX to 11 will apply a limit of 10)
There are a few general approaches, and the better solution depends a bit on the total number of stores you have and on how hard you want to make it for someone to scrape them.
You could continue to use the static data feed like in this example (which means sending all stores to the browser on load), and then add some logic to only display the closest 10 (such as setting the map to null for all markers that aren't also shown in the panel; a rough sketch of this follows the list below), but this is not a good solution if:
there are lots of stores (more than a thousand or so) since it will be unnecessarily slow to load them all when only displaying a few.
you don't want someone to look at your code and just grab the full CSV you're sending down the wire with all your data.
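As a sketch of that client-side idea only: assuming the geometry library is loaded (libraries=geometry) and you already have an array of markers plus the geocoded customer location, something along these lines would hide all but the closest n markers. The names markers, origin and map below are placeholders, not part of the store locator library.

// Rough sketch: keep only the n closest markers attached to the map.
// "markers" (google.maps.Marker[]), "origin" (google.maps.LatLng) and
// "map" are assumed to exist already; requires the geometry library.
function showClosestMarkers(markers, origin, map, n) {
  markers
    .map(function (marker) {
      return {
        marker: marker,
        distance: google.maps.geometry.spherical.computeDistanceBetween(
            origin, marker.getPosition())
      };
    })
    .sort(function (a, b) { return a.distance - b.distance; })
    .forEach(function (entry, index) {
      // The n closest stay visible; the rest are detached from the map.
      entry.marker.setMap(index < n ? map : null);
    });
}

Note that this only hides markers visually; the full list is still sent to the browser, which is exactly the scraping concern addressed next.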
Given your scraping concern, a better method is probably to implement the store locator using a dynamic data source that only returns the closest N records for a given lat/lng, so you don't expose the entire thing at once. Using Google services you could use Maps Engine, which has an API, and the store locator includes a Google Maps Engine example you could start with. Your security concern here is that if these queries are publicly available for anyone to hit directly, the table is also public, and someone could then query for the full table. So you'd want to put a proxy in between to avoid that type of query hack (although of course someone could just feed you lots of locations to eventually get all your stores if they really wanted to).
Other options (again just looking at Google's stack, although there are lots of alternatives for this kind of thing, like CartoDB and many more) include App Engine's Search API, which also returns the N closest items (but would require some server-side coding, which Maps Engine would not), or even putting the data into a Google spreadsheet and implementing a basic Script -> Web Service, where your script takes the lat/lng and does some basic math to find the closest (sketched roughly below).
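If you go the spreadsheet route, a minimal Apps Script sketch of such a web service might look roughly like this; the sheet name, column layout and function names are assumptions for illustration only, not a finished implementation.

// Rough Apps Script sketch: return the 10 closest stores as JSON.
// Assumes a sheet named 'stores' with columns: name, lat, lng.
function doGet(e) {
  var lat = Number(e.parameter.lat);
  var lng = Number(e.parameter.lng);
  var rows = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName('stores')
      .getDataRange()
      .getValues()
      .slice(1); // skip the header row
  var closest = rows.map(function (row) {
    return { name: row[0], lat: row[1], lng: row[2],
             d: distanceKm(lat, lng, row[1], row[2]) };
  }).sort(function (a, b) { return a.d - b.d; }).slice(0, 10);
  return ContentService.createTextOutput(JSON.stringify(closest))
      .setMimeType(ContentService.MimeType.JSON);
}

// Haversine distance in kilometres (the "basic math to find the closest").
function distanceKm(lat1, lng1, lat2, lng2) {
  var toRad = function (deg) { return deg * Math.PI / 180; };
  var R = 6371;
  var dLat = toRad(lat2 - lat1);
  var dLng = toRad(lng2 - lng1);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.sin(dLng / 2) * Math.sin(dLng / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}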
Again, if you don't love the server-side aspect, then Maps Engine is probably your best bet for a quick start, especially given there's a working sample in the storelocator code.

Scrape all Google search results for a specific name

I think this question has been answered here before, but I could not find the desired topic. I am a newbie in web scraping. I have to develop a script that will take all the Google search results for a specific name. It will then grab the related data against that name, and if more than one result is found, the data will be grouped according to their names.
All I know is that Google has some kind of restriction on scraping. They provide a Custom Search API. I have not used that API yet, but I'm hoping to get all the result links corresponding to a query from it. However, I could not understand what the ideal process would be for scraping the information from those links. Any tutorial link or suggestion is very much appreciated.
You should have provided a bit more about what you have been doing; it does not sound like you even tried to solve it yourself.
Anyway, if you are still on it:
You can scrape Google in two ways: one is allowed, one is not.
a) Use their API; you can get around 2k results a day.
You can up it to around 3k a day for 2000 USD/year. You can up it further by getting in contact with them directly.
You will not be able to get accurate ranking positions from this method. If you only need a lower number of requests and are mainly interested in getting some websites for a given keyword, that's the choice.
Starting point would be here: https://code.google.com/apis/console/
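Once you have a Custom Search Engine and an API key from the console, querying the Custom Search JSON API for a name looks roughly like this; YOUR_API_KEY, YOUR_CX and the example name are placeholders, and this is just an illustration of getting the result links, not a complete solution.

// Rough sketch: fetch result links for a name from the Custom Search JSON API.
// YOUR_API_KEY and YOUR_CX come from the API console / CSE setup.
var url = 'https://www.googleapis.com/customsearch/v1' +
          '?key=YOUR_API_KEY&cx=YOUR_CX' +
          '&q=' + encodeURIComponent('John Smith');
fetch(url)
  .then(function (response) { return response.json(); })
  .then(function (data) {
    (data.items || []).forEach(function (item) {
      console.log(item.title + ' -> ' + item.link);
    });
  });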
b) You can scrape the real search results
That's the only way to get the true ranking positions, for SEO purposes or to track website positions. It also allows you to get a large number of results, if done right.
You can Google for code; the most advanced free (PHP) code I know of is at http://scraping.compunect.com
However, there are other projects and code snippets.
You can start off at 300-500 requests per day, and this can be multiplied by using multiple IPs. Look at the linked article if you want to go that route; it explains things in more detail and is quite accurate.
That said, if you choose route b) you break Google's terms, so either do not accept them or make sure you are not detected. If Google detects you, your script will be banned by IP/captcha. Not getting detected should be a priority.
