Google sheets importxml formula error - google-maps-api-3

I am trying to import some Google Maps data into Google Sheets using the IMPORTXML function.
My URL looks like this:
https://maps.googleapis.com/maps/api/directions/xml?origin=Disneyland&destination=Universal+Studios+Hollywood4&sensor=false&alternatives=false&key=MYGOOGLEAPIKEY
(replace MYGOOGLEAPIKEY with yours or remove the whole key parameter)
In a browser this returns the XML as expected.
However if I use
=IMPORTXML("https://maps.googleapis.com/maps/api/directions/xml?origin=Disneyland&destination=Universal+Studios+Hollywood4&sensor=false&alternatives=false&key=MYGOOGLEAPIKEY","//leg/distance/value");
in Google Sheets, I get an error message pop-up stating:
There was a problem
It looks like your formula has an error. If you don't want to enter a
formula, begin your text with an apostrophe (').
Screenshot of error message
I am unable to figure out what the problem is.
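One likely culprit (an assumption on my part, since the rest of the formula looks well-formed): the trailing semicolon after the closing parenthesis is not valid Google Sheets formula syntax. In locales that use the semicolon as the argument separator it belongs between the arguments, not at the end. A version to try:

```text
=IMPORTXML("https://maps.googleapis.com/maps/api/directions/xml?origin=Disneyland&destination=Universal+Studios+Hollywood4&sensor=false&alternatives=false&key=MYGOOGLEAPIKEY", "//leg/distance/value")
```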

Related

Open CSV data in Tableau

I have had problems uploading the following file to Tableau:
https://www.kaggle.com/datasets/shivamb/netflix-shows/download
When loaded in Tableau it looks like this (see screenshot), but loading it in R gives a different result.
Is it possible to load the file in R and then connect to Tableau via Rserve, or is there a way to load it correctly in Tableau directly?
Looks like a problem within the interpreter.
I can't download the file myself as I don't have a Kaggle account, and it's not clear from your R screenshots what is wrong. However, you could adjust the text file properties to see if you can change how the interpreter works: right-click the object "netflix_titles.csv" in the data model window and select "Text file properties" from the context menu.
Another option would be to try the data interpreter ("Use Data Interpreter").
It looks like Tableau is reading this file as a text file and not as a CSV. Tableau should create a separate column for every comma it sees, but your screenshot shows a single column for the entire first row.
Sometimes, Tableau can correctly read the file if you check the "Use Data Interpreter" checkbox.
If you have trouble making that work, simply open the CSV in Excel and save it as an XLSX. You could even import it into Google Sheets if you don't have Excel.
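The answers above hinge on whether the comma delimiter is being detected. As a quick sanity check on the file itself, Python's stdlib csv.Sniffer does the same kind of dialect guessing as a "data interpreter" (the sample rows below are invented stand-ins for netflix_titles.csv):

```python
import csv
import io

# Invented sample mimicking the first lines of netflix_titles.csv.
sample = (
    "show_id,type,title,director\n"
    "s1,Movie,Dick Johnson Is Dead,Kirsten Johnson\n"
)

# csv.Sniffer guesses the dialect (delimiter, quoting) from a sample.
dialect = csv.Sniffer().sniff(sample)
print(repr(dialect.delimiter))

# Parse with the detected dialect: each comma should yield its own column.
rows = list(csv.reader(io.StringIO(sample), dialect))
print(rows[0])
```

If the sniffer fails on the real file, it likely is malformed, and converting it via Excel as suggested is the pragmatic fix.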

RStudio character encoding

I am trying to view characters from multiple languages in RStudio. What I find unusual is that I can view them in the console, but not in the viewer: UTF-8 encoded characters appear as 'U+3042', 'U+500B', etc. in the viewer.
Is there a way to get the viewer to display the actual characters instead of the escape codes?
Here are a couple of images showing what I mean -
In console: https://ibb.co/T0681H7
In viewer: https://ibb.co/QnxF25c
This is a known issue in RStudio. Feel free to comment/upvote here:
https://github.com/rstudio/rstudio/issues/4193
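Until the issue is fixed, the escape codes can at least be converted back to real characters outside the viewer. A minimal Python sketch (the exact 'U+XXXX' notation handled here is an assumption based on the screenshots):

```python
import re

def decode_unicode_escapes(text: str) -> str:
    """Turn escape codes like 'U+3042' or '<U+3042>' back into real characters."""
    return re.sub(
        r"<?U\+([0-9A-Fa-f]{4,6})>?",
        lambda m: chr(int(m.group(1), 16)),
        text,
    )

print(decode_unicode_escapes("U+3042"))          # hiragana あ
print(decode_unicode_escapes("<U+500B><U+4EBA>"))
```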

RStudio: Save data from Viewer

Due to a stupid mistake and a defective USB stick I lost a bunch of data and I am now trying to recover it.
Some of the data is still displayed in the Viewer tabs when I open RStudio. However, I can only save R scripts and R Markdown files from the Viewer. The displayed data frames are complete: I can sort and filter them in the Viewer, but I cannot find a "save" option. Is there a way to save this displayed data as RData or CSV or something similar?
I would suggest three different approaches, though none of them is guaranteed to work. I have sorted them by my prior expectation of success.
1) You can copy the data frame from the viewer and paste it into an external spreadsheet program to obtain a .csv file, e.g. via the "convert text to columns" button in MS Excel.
2) You can copy and paste the character string into an object that is passed to the text argument of read.table(), or to dput(). Check out the "Copy your data" section of this famous SO question.
3) Finally, you can use Google Chrome's "Inspect Element" function to inspect the HTML code of the object in the viewer. Once you find the table, you can copy, paste, and scrape it with an HTML parser, e.g. using the rvest package. Good luck!
Thanks everybody; there is a way to access the data as RData files, which was kindly explained to me here.
I used the second method and located the files in %localappdata%\RStudio-Desktop\viewer-cache.
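For completeness, approach 3) above can be done with nothing but the standard library. Here is a Python sketch of the idea (the answer suggests rvest in R; the HTML fragment below is invented for illustration):

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collect the cell text of every <tr> row in an HTML table."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

# Invented stand-in for the table copied out of the viewer's HTML.
html = "<table><tr><th>id</th><th>value</th></tr><tr><td>1</td><td>42</td></tr></table>"
scraper = TableScraper()
scraper.feed(html)
print(scraper.rows)
```

The recovered rows can then be written out with the csv module.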

locate invalid character causing error in R xmlToDataFrame()

For background I am very new to R, and have almost no experience with XML files.
I wrote a web scraper using the RSelenium package that downloads XML files for multiple states and years from this website. I then wrote code that reads in each file, appends them into one file, and exports a CSV. My scraper successfully downloads all of the files I need, and the next segment of code can read all but two of the downloaded XML files.
The first file that I am unable to read into an R dataframe can be retrieved by selecting the following options on this page: http://www.slforms.universalservice.org/DRT/Default.aspx
Year=2013
State=PA
Click radio button for "XML Feed"
Click checkbox for "select data points"
Click checkbox for "select all data points"
Click "build data file"
I try to read the resulting XML file into R using xmlToDataFrame:
install.packages("XML")
require("XML")
data_table<-xmlToDataFrame("/users/datafile.xml")
When I do, I get an error:
xmlParseCharRef: invalid xmlChar value 19
Error: 1: xmlParseCharRef: invalid xmlChar value 19
The other examples I've seen of invalid character errors using xmlToDataFrame usually give two coordinates for the problematic character, but since only the value "19" is given, I'm not sure how to locate the problematic character.
Once I do find the invalid character, would there be a way to alter the text of the xml file directly to escape the invalid character, so that xmlToDataFrame will be able to read in the altered file?
It's a bad encoding on this line of XML:
31 to $26,604.98 to remove: the ineligible entity MASTERY CHARTER SCHOOLS 
but the document seems to have other encoding issues as well.
The TSV works fine, so you might think about using that instead.
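To answer the "locate and escape" part of the question: libxml2's "invalid xmlChar value 19" means the document contains the numeric character reference &#19;, which XML 1.0 forbids even in escaped form. A Python sketch (the sample string is invented) that reports where such references sit and strips them so the file parses:

```python
import re

# Matches numeric character references like &#19; or &#x13;
REF = re.compile(r"&#(x[0-9A-Fa-f]+|[0-9]+);")

def is_valid_xml_codepoint(cp):
    # XML 1.0 allows tab, LF, CR, then 0x20-0xD7FF, 0xE000-0xFFFD, 0x10000-0x10FFFF.
    return cp in (0x9, 0xA, 0xD) or 0x20 <= cp <= 0xD7FF \
        or 0xE000 <= cp <= 0xFFFD or 0x10000 <= cp <= 0x10FFFF

def find_bad_refs(text):
    """Return (line, column, reference) for every forbidden character reference."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for m in REF.finditer(line):
            num = m.group(1)
            cp = int(num[1:], 16) if num.startswith("x") else int(num)
            if not is_valid_xml_codepoint(cp):
                hits.append((lineno, m.start(), m.group()))
    return hits

def strip_bad_refs(text):
    """Drop the forbidden references so xmlToDataFrame can read the file."""
    def repl(m):
        num = m.group(1)
        cp = int(num[1:], 16) if num.startswith("x") else int(num)
        return m.group() if is_valid_xml_codepoint(cp) else ""
    return REF.sub(repl, text)

sample = "<amt>31 to $26,604.98&#19; to remove</amt>"
print(find_bad_refs(sample))
print(strip_bad_refs(sample))
```

Run against the whole downloaded file, find_bad_refs gives the line and column coordinates the error message omits, and strip_bad_refs writes a cleaned copy you can feed back to xmlToDataFrame.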

mPDF error when table too large for page

Using the following PHP code:
$pdf = new mPDF('utf-8', 'A4-L'); //have tried several of the formats
$pdf->WriteHTML($content,2);
$pdf->Output();
where $content is a very plain html table
As long as the table fits on one page, the PDF is generated without errors. If I add one or more rows to the HTML, I get the PHP warning:
Message: Invalid argument supplied for foreach()
Filename: mpdf/mpdf.php
Line Number: 11043
repeated many, many times. With mPDF error notices turned on I get
mPDF error: Some data has already been output to browser, can't send PDF file
I tried suppressing the warnings using the code in Can't get rid of PHP notices in mPDF
This sort of worked, but the force-downloaded PDF has the following quirk:
first page: blank
2nd page: table header only
3rd to next to last pages: header and 1 row of data
last page: header + full page of data
This is the third PDF-from-PHP library I've tried to use with my CodeIgniter framework, so I'm extremely frustrated at this point.
PHPExcel (which I'm using to give my client an Excel download option) works great for formatting Excel spreadsheets, but its PDF output is hideous no matter what options I throw at it (mainly, it puts an outline around each cell with huge padding).
ezPDF/PHP-PDF (per http://www.ahowto.net/php/easily-integrate-ezpdf-a-k-a-pdf-php-into-codeigniter-framework) worked great, except I couldn't get it to work with the client's logo, which they were adamant about showing at the top of the PDF.
I guess I'm going to try domPDF next.
