Automatically Download Data in R from Website Where a Click Is Required

First time question here! I'm new to R and am trying to have some fun with some NBA data. I want to automatically download shot tracking data and get it into R so I can produce images, run analysis, etc.
I have been unable to find a website that displays all of the necessary data that I could just grab through web scraping so I turned my attention to finding files that have all of the data.
I found a website that has exactly the data I am looking for, but to download it I have to click a download button. In the upper right corner of the chart at the link below there is a "download csv" link that gives me all of the data on every shot taken in the NBA. Can anyone help me figure out how to get this automatically with R instead of manually downloading it each day?
Below is the link.
http://nbasavant.com/shot_search.php?hfST=&hfQ=&hfSZB=&hfSZA=&hfSZR=&ddlYear=2017&txtGameDateGT=&txtGameDateLT=&ddlGameTimeGT_min=&ddlGameTimeGT_sec=&ddlGameTimeLT_min=&ddlGameTimeLT_sec=&ddlShotDistanceGT=&ddlShotDistanceLT=&ddlTeamShooting=&ddlTeamDefense=&hfPT=&ddlGroupBy=player&ddlOrderBy=shots_made_desc&hfGT=0%7C&ddlShotMade=&ddlMin=0#results
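
One approach that may work: use the browser's developer tools (Network tab) to see which URL the "download csv" button actually requests, then fetch that URL directly from R on a schedule. Below is a minimal sketch with httr, assuming a plain CSV endpoint exists; the URL is a placeholder, not the site's real endpoint:

library(httr)

csv_url <- "http://nbasavant.com/path/to/shots.csv"  # placeholder -- copy the real URL from the Network tab
resp <- GET(csv_url)
stop_for_status(resp)

# Parse the response body as CSV into a data frame
shots <- read.csv(text = content(resp, as = "text", encoding = "UTF-8"),
                  stringsAsFactors = FALSE)
head(shots)

Once this works, the script can be run daily with cron or a task scheduler and the data frame saved or analyzed from there.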

Related

Scraping information (report name and refresh time) from a Power BI workspace

I would like to get details on and monitor the refresh time of dashboards and reports in Power BI daily. How do I scrape the table details from the Power BI site (without manual copy and paste, as I have a lot of them)?
For example, I would like to extract the table shown in the image below into a CSV/XLSX file.
I tried using the default Get Data from Web URL option in Power BI, but it doesn't work. :(

How to use URLs extracted from a website as data source for another table in Power BI

I have a situation where I need to extract tables from 13 different links, which have the same structure, and then append them into a single table with all the data. At first I extracted the links from a home page by copying the link from each hyperlink, and then imported the data through the Web connector in Power BI. However, 3 months later, I realized that those links change every quarter, while the link to the home page where they are listed stays the same.
So I did some research and found this video on YouTube (https://www.youtube.com/watch?v=oxglJL0VWOI), which explains how to scrape the links from a website by building a table with the header of each link as one column and the link itself as another. That way, the links are updated automatically whenever I refresh the data.
The thing is that I am having trouble figuring out how to use these links to extract the data automatically, without copying them one by one and importing each through the Power BI Web connector (Web.BrowserContents). Can anyone give me a hint on how to implement this?
Thanks in advance!
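
For illustration, the general pattern (scrape the current list of links from the home page, then load each linked table and append them) can be sketched in R with rvest; this is only a sketch of the approach, not Power Query code, and the home page URL and the filter that keeps the 13 report links are assumptions:

library(rvest)
library(dplyr)

home_url <- "https://example.com/quarterly-reports"    # placeholder for the home page

# Pull every hyperlink off the home page, then keep only the report links
links <- read_html(home_url) %>%
  html_elements("a") %>%
  html_attr("href")
links <- links[grepl("report", links)]                  # placeholder filter

# Read the table from each link and append them into one data frame
all_data <- lapply(links, function(u) {
  read_html(u) %>%
    html_element("table") %>%
    html_table()
}) %>%
  bind_rows()

In Power BI itself, the analogous step is usually to wrap the single-link query in a custom function and invoke it once per row of the scraped links table before combining the results.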

R - Download Data from Webpage

I want to download a CSV file from a webpage, where I have to select the time frame covered by the data as well as the columns I want to download.
The page is the following:
https://www.transtats.bts.gov/DL_SelectFields.asp?Table_ID=258&DB_Short_Name=Air%20Carriers
I wanted to ask how I can download this table for the years 2015 and 2016, with the columns passenger, carrier, origin and dest.
Using the Chrome Developer Tools, I found out that when clicking the "Download" button, a function "TryDownload()" is called in the background, which should be callable using a POST request. However, I don't understand how I can send this request from R, or how to change the default selected columns.
Thank you for your help.
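
The usual approach is to reproduce the request that "TryDownload()" triggers: click "Download" with the Network tab open, then copy the request URL and the submitted form fields into an httr::POST call. A minimal sketch, where the action URL and field names are placeholders you would replace with what the Network tab actually shows:

library(httr)

resp <- POST(
  "https://www.transtats.bts.gov/Download_Table.asp",   # placeholder -- copy the real action URL from the Network tab
  body = list(
    VarList = "PASSENGERS,CARRIER,ORIGIN,DEST",          # placeholder field name and values
    Year    = "2015"                                     # repeat the request with "2016" for the second year
  ),
  encode = "form",
  write_disk("t100_2015.zip", overwrite = TRUE)          # placeholder file name; adjust if the response is a plain CSV
)
stop_for_status(resp)

Changing the selected columns then comes down to changing the corresponding form fields in the body list.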

Is it possible to link data from R forms with Google Drive SDK

I wanted to get some feedback on the plausibility of this project before I grind too many gears.
Try running:
install.packages("webutils")
library(webutils)
demo_rhttpd()
Enter info. into the form and get a text file in the browser window after submitting it.
Now, is it possible to get R code to host this form in Google Drive:
https://support.google.com/drive/answer/2881970?hl=en
If not, why?
and is it possible to get R code to store the submitted form data in a Google Spreadsheet:
https://developers.google.com/google-apps/spreadsheets/
If not, why?
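
On the second question, writing submitted form data to a Google Spreadsheet is feasible from R; the googlesheets4 package can append rows to an existing sheet. A minimal sketch, where the spreadsheet ID and the parsed form fields are placeholders:

library(googlesheets4)

gs4_auth()                                    # interactive OAuth the first time; cache the token for scripted use

ss <- "your-spreadsheet-id"                   # placeholder: ID or URL of an existing Google Sheet

# One row per form submission; the columns are placeholders for the parsed form fields
submission <- data.frame(name = "Jane Doe", email = "jane@example.com",
                         stringsAsFactors = FALSE)

sheet_append(ss, submission)                  # appends the row to the first worksheet

Hosting the form itself in Google Drive is a different matter: Drive serves files rather than running R processes, so the form would still need to run somewhere that can execute R, such as a small Shiny or plumber app.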

Seeing cost data in analytics reports

I've created a Custom Data Uploader script which uploads data to my Google Analytics profile.
I can see it's working and that it's uploading the file; it appears on the "Custom Definitions" tab on the profile page (second picture in the link I attached).
But I can't see the data in the reports.
I looked under Traffic Sources -> Overview, where I thought it should be.
Where can I find this data in the reports?
https://developers.google.com/analytics/devguides/platform/features/cost-data-import
The Traffic Sources > Cost Analysis report should contain your data, but it can take 12 hours for the data on a new feed to show up. However, I've found that subsequent loads are usually much faster.
