How do you save gl_speech_op output to an object within R?
I successfully ran googleLanguageR to convert an audio file to text on the Google Cloud Platform. I can see the output, but I don't know how to save it to an object within RStudio.
Sample code is below. I am using R Notebook.
library(googleLanguageR)
library(tidyverse)
###let's get Craig Watkins
gl_auth("D:/Admin/Documents/Google API JSON Authenticate/My Project two test-db5d6330925e.json")
watkins <- gl_speech("gs://testtwoibm/craig watkins 2018_05_07_14_08_08.flac",
encoding = c("FLAC"), sampleRateHertz = 44100, languageCode = "en-US",
maxAlternatives = 1L, asynch = TRUE)
## Send to gl_speech_op() for status or finished result
gl_speech_op(watkins)
[Screenshot: RStudio notebook output showing the converted speech-to-text result.]
The easiest way to save the output of any operation to an object in R is to assign it via the assignment operator <-.
In your case, you would simply assign it to an object like this:
transcript <- gl_speech_op(watkins)
One small reminder: this will also work if the asynchronous API request hasn't finished transcribing yet; however, the object will then not contain any information. In your case it will be a list of length 2 with two NULL elements. Once finished, the object will contain both the transcript and the timings.
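If you need to wait for the job to finish, a minimal polling sketch could look like this (the transcript and timings element names are assumed from the package's documented return value):
transcript <- gl_speech_op(watkins)
while (is.null(transcript$transcript)) {
  Sys.sleep(30)  # wait half a minute before asking the API again
  transcript <- gl_speech_op(watkins)
}
transcript$transcript  # the transcribed text
transcript$timings     # the word timings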
If you want the output as text, you can use capture.output:
new_obj = capture.output(gl_speech_op(watkins))
new_obj
I am trying to write an R script (to be run from RStudio by my students) that can accept a phyloseq object (essentially several interconnected tables stored in one object), so I can run the code on that object. Since it seems I cannot accept a phyloseq file directly, I decided to strip the object into the 3 tables stored inside it, like so:
input <- as.data.frame(readline(prompt = "Enter phyloseq object name "))
input.taxtab <- as.data.frame(tax_table(input))  # tax_table is a phyloseq accessor that extracts the taxonomy table
input.otutab <- as.data.frame(input@otu_table)
input.samdat <- as.data.frame(input@sam_data)
However, the obvious issues I see with my code are that the input variable cannot store all the information, and that it currently only seems to accept characters, i.e. it takes the object name literally rather than the object itself.
I have tried to get the data frames separately (by prompting for input 3 times), but that doesn't coerce the file itself either. Here is a snippet of that:
input.taxtab <- as.data.frame(tax_table(readline(prompt = "Enter phyloseq object name ")))
I hope that someone can help me evolve my code to accept an actual file instead of its name.
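One possible direction, sketched here as an untested assumption rather than a confirmed fix: since readline() returns a character string, the object it names has to be looked up in the workspace, e.g. with get():
obj.name <- readline(prompt = "Enter phyloseq object name ")
input <- get(obj.name)  # look up the phyloseq object by its name
input.taxtab <- as.data.frame(tax_table(input))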
I'm having trouble accessing the Energy Information Administration's API through R (https://www.eia.gov/opendata/).
On my office computer, if I try the link in a browser it works, and the data shows up (the full url: https://api.eia.gov/series/?series_id=PET.MCREXUS1.M&api_key=e122a1411ca0ac941eb192ede51feebe&out=json).
I am also successfully connected to Bloomberg's API through R, so R is able to access the network.
Since the API is working and not blocked by my company's firewall, and R is in fact able to connect to the Internet, I have no clue what's going wrong.
The script works fine on my home computer, but at my office computer it is unsuccessful. So I gather it is a network issue, but if somebody could point me in any direction as to what the problem might be I would be grateful (my IT department couldn't help).
library(XML)
api.key = "e122a1411ca0ac941eb192ede51feebe"
series.id = "PET.MCREXUS1.M"
my.url = paste("http://api.eia.gov/series?series_id=", series.id,"&api_key=", api.key, "&out=xml", sep="")
doc = xmlParse(file=my.url, isURL=TRUE) # yields error
Error msg:
Error: 1: No such file or directory
       2: failed to load external entity "http://api.eia.gov/series?series_id=PET.MCREXUS1.M&api_key=e122a1411ca0ac941eb192ede51feebe&out=json"
I tried some other methods like read_xml() from the xml2 package, but this gives a "could not resolve host" error.
To get XML, you need to request XML output in your URL (&out=xml):
my.url = paste("http://api.eia.gov/series?series_id=", series.id,"&api_key=",
api.key, "&out=xml", sep="")
res <- httr::GET(my.url)
xml2::read_xml(res)
Or:
res <- httr::GET(my.url)
XML::xmlParse(httr::content(res, "text"), asText = TRUE)
Otherwise, with the URL as in your post (i.e. &out=json):
res <- httr::GET(my.url)
jsonlite::fromJSON(httr::content(res,"text"))
or, with the XML version of the URL, this:
xml2::read_xml(httr::content(res,"text"))
Please note that this answer simply provides a way to get the data; whether it is in the desired form is opinion-based and up to whoever is processing it.
If it does not have to be XML output, you can also use the new eia package. (Disclaimer: I'm the author.)
Using your example:
remotes::install_github("leonawicz/eia")
library(eia)
x <- eia_series("PET.MCREXUS1.M")
This assumes your key is set globally (e.g., in .Renviron or previously in your R session with eia_set_key). But you can also pass it directly to the function call above by adding key = "yourkeyhere".
The result returned is a tidyverse-style data frame, one row per series ID and including a data list column that contains the data frame for each time series (can be unnested with tidyr::unnest if desired).
Alternatively, if you set the argument tidy = FALSE, it will return the list result of jsonlite::fromJSON without the "tidy" processing.
Finally, if you set tidy = NA, no processing is done at all and you get the original JSON string output for those who intend to pass the raw output to other canned code or software. The package does not provide XML output, however.
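As a quick sketch of those variations (assuming a valid API key is already set):
x <- eia_series("PET.MCREXUS1.M")  # tidy data frame, one row for this series
tidyr::unnest(x, data)  # expand the nested time series observations
x_list <- eia_series("PET.MCREXUS1.M", tidy = FALSE)  # list from jsonlite::fromJSON
x_json <- eia_series("PET.MCREXUS1.M", tidy = NA)  # raw JSON string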
There are more comprehensive examples and vignettes at the eia package website I created.
I'm trying to parse a JSON file. The file has thousands of individual JSON objects, but for testing purposes I am working with only 3 rows of them. I've declared a function in R that uses the rjson package, but when I try to execute the R file in RStudio, I get an error in the console saying that my function is undefined.
I've tried calling the function via the console and executing it via the R script file. I've also read through a couple of tutorials, as I've never worked in R before.
library("rjson")
parseJsonData <- function (fileName)
{
result <- fromJSON(file = fileName)
jsonFrame <- as.data.frame(result)
return(jsonFrame)
}
parseJsonData("testData.json")
I would expect to have a data frame returned and print to the console window.
UPDATE:
It seems to be an issue with the file format, which unfortunately I don't have control over. The file provided is a JSON file, but it is formatted as 10k individual JSON objects, one per line, rather than a list of objects.
For example, this is how the file is formatted inside testData.json:
{"name":"test1", "value":"1"}
{"name":"test2", "value":"2"}
{"name":"test3", "value":"3"}
{"name":"test4", "value":"4"}
{"name":"test5", "value":"5"}
I think I have exhausted the entire internet looking for an example / answer to my query regarding implementing an H2O MOJO model to predict within R Shiny. We have created a bunch of models and wish to predict scores in an R Shiny front end where users enter values. However, with the following code to run the prediction, we get this error:
Warning: Error in checkForRemoteErrors: 6 nodes produced errors; first
error: No method asJSON S3 class: H2OFrame
dataInput <- dfName
dataInput <- toJSON(dataInput)
rawPred <- as.data.frame(h2o.predict_json(model= "folder/mojo_model.zip", json = dataInput, genmodelpath = "folder/h2o-genmodel.jar"))
Can anyone help with some pointers?
Thanks,
Siobhan
This is not a Shiny issue. The error indicates that you're trying to use toJSON() on an H2OFrame (instead of an R data.frame), which will not work because the jsonlite library does not support that.
Instead you can convert the H2OFrame to a data.frame using:
dataInput <- toJSON(as.data.frame(dataInput))
I can't guarantee that toJSON() will generate the correct input for h2o.predict_json() since I have not tried that, so you will have to try it out yourself. Note that the only way this may work is if this is a 1-row data.frame because the h2o.predict_json() function expects a single row of data, encoded as JSON. If you're trying to score multiple records, you'd have to loop over the rows. If for some reason toJSON() doesn't give you the right format, then you can use a function I wrote in this post here to create the JSON string from a data.frame manually.
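A rough, untested sketch of that row-by-row loop, where the paths come from your question and the exact JSON shape h2o.predict_json() expects is an assumption:
library(jsonlite)
df <- as.data.frame(dataInput)
preds <- lapply(seq_len(nrow(df)), function(i) {
  row.json <- toJSON(as.list(df[i, ]), auto_unbox = TRUE)  # one row as a single JSON object
  h2o.predict_json(model = "folder/mojo_model.zip", json = row.json,
                   genmodelpath = "folder/h2o-genmodel.jar")
})
rawPred <- do.call(rbind, lapply(preds, as.data.frame))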
There is a ticket open to create a better version of h2o.predict_json() that will support making predictions from a MOJO on data frames (with multiple rows) without having to convert to JSON first. This will make it so you can avoid dealing with JSON altogether.
An alternative is to use a H2O binary model instead of a MOJO, along with the standard predict() function. The only requirement here is that the model must be loaded into H2O cluster memory.
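A minimal sketch of that alternative, assuming a model saved with h2o.saveModel() and a running cluster:
library(h2o)
h2o.init()
model <- h2o.loadModel("folder/binary_model")  # binary model, not a MOJO
hf <- as.h2o(as.data.frame(dataInput))  # upload the input rows to the cluster
pred <- as.data.frame(h2o.predict(model, hf))  # standard predict, no JSON needed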
The following works now, using the JSON formatting from the first two lines and single quotes around the variable with spaces.
df <- data.frame(V1 = 1, V2 = 1, CMPNY_EL_IND = 1, UW_REGION_NAME = "'LONDON & SE'")
dfstr <- sapply(1:ncol(df), function(i) paste(paste0('\"', names(df)[i], '\"'), df[1,i], sep = ':'))
json <- paste0('{', paste0(dfstr, collapse = ','), '}')
dataPredict <- as.data.frame(h2o.predict_json(model = "D:\\GBM_model_0_CMP.zip", json = json, genmodelpath = "D:\\h2o-genmodel.jar", labels = TRUE))
I am using tableNominal{reporttools} to produce frequency tables. The way I understand it, tableNominal() produces LaTeX code which has to be copied and pasted into a text file and then saved as .tex. But is it possible to simply export the produced table, as can be done with print(xtable(table), file = "path/outfile.tex")?
You may be able to use either latex or latexTranslate from the "Hmisc" package for this purpose. If you have the necessary program infrastructure, the output gets sent to your TeX engine. (You may be able to improve the level of our answers by adding a specific example.)
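For instance, a minimal sketch, assuming a data frame d holds the table you want to export:
library(Hmisc)
latex(d, file = "outfile.tex")  # writes the LaTeX table to outfile.tex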
Looks like that function does not return a character vector, so you need to use a strategy to capture the output from cat(). Using the example in the help page:
capture.output( TN <- tableNominal(vars = vars, weights = weights, group = group,
cap = "Table of nominal variables.", lab = "tab: nominal") ,
file="outfile.tex")