I'm using this amazing package to read and upload data with my Shiny app. It works OK, but when I add a row to the sheet, it neither keeps the encoding sent from the server nor behaves like the data in the previous rows. Spanish names I entered manually are fine, but when I use the app to load data, special Latin characters (UTF-8) are replaced in the sheet.
That data is then not recognized by the app in subsequent sessions.
library(googlesheets)

table <- "Reportes"

saveData <- function(data) {
  # Grab the Google Sheet
  sheet <- gs_title(table)
  # Add the data as a new row
  gs_add_row(sheet, input = data)
}

loadData <- function() {
  # Grab the Google Sheet
  sheet <- gs_title(table)
  # Read the data
  gs_read_csv(sheet)
}
Then I use a button in the UI and an observer in the server to save and reload the data:
observeEvent(input$enviar, {
  exit <- input$enviar
  if (exit == 1) {
    addData <- c(as.character(input$fecha),
                 as.character(input$local),
                 as.character(input$dpto),
                 as.character(input$estado),
                 as.character(input$fsiembra),
                 as.character(input$ref),
                 as.character(loc$lat[loc$Departamento == input$dpto & loc$Localidad == input$local]),
                 as.character(loc$long[loc$Departamento == input$dpto & loc$Localidad == input$local]),
                 as.character(getZafra(input$fecha)))
    saveData(addData)
    d <- loadData()
    reset('fecha')
    reset('dpto')
    reset('local')
    reset('estado')
    reset('fsiembra')
    reset('ref')
    reset('pass')
    disable('enviar')
  }
})
Please... if anyone can help I'd be very happy.
I discovered that I needed to set the encoding on the character vector before uploading.
I used:
Encoding(addData) <- "latin1"
saveData(addData)
and it worked just fine!
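If you prefer to keep the fix inside saveData() rather than applying it at each call site, a minimal sketch (same table and googlesheets setup as above) could look like this:

saveData <- function(data) {
  # mark the vector as latin1 before uploading so accented characters survive
  Encoding(data) <- "latin1"
  # Grab the Google Sheet
  sheet <- gs_title(table)
  # Add the data as a new row
  gs_add_row(sheet, input = data)
}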
I am creating a Shiny application where I allow the user to write the data out to either CSV or KML. However, my code below does not write the features out to the KML file: when I open the KML in Google Earth it shows black dots, and clicking one displays the row index of the original data rather than all the column values for that point. I was using the writeOGR function but it was not writing the file, so I switched to the plotKML package. I want the user to choose where the file is saved (using the filename I specify with a date and timestamp) and to see all features of any given data point displayed in Google Earth.
output$downloadData <- downloadHandler(
  filename = function() {
    paste0("data_", Sys.Date(), input$download_type)
  },
  content = function(file) {
    if (input$download_type == ".csv") {
      write.csv(data, file, row.names = FALSE)
    } else if (input$download_type == ".KML") {
      features <- c("COLUMN_1", "COLUMN_2", "COLUMN_3")  # These are the features I want displayed in Google Earth
      data[features] <- as.character(data[features])
      coordinates(data) <- ~X + Y
      proj4string(data) <- CRS("+proj=longlat +datum=WGS84")
      kml_description(data, caption = "Data",
                      delim.sign = "_", asText = F)
      kml(data, file = file)  # Not sure why this produces points but doesn't display features in Google Earth
      #writeOGR(data, dsn = file, layer="Data", driver = "KML")
    }
  })
Getting a KML that can be adequately read by Google Maps is not yet as easy as it should be.
You may want to try exporting via the sf package and the libkml driver.
sf::st_write(obj = an_sf_object,
dsn = kml_file_path,
driver = "libkml")
(you may need to install libkml on your server)
See also this function from the latlon2map package for a working (albeit less than ideal) implementation.
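For example, here is a rough sketch of how that export might slot into the downloadHandler above; it assumes data is a data frame with X/Y longitude/latitude columns in WGS84, as in the question, and is only meant as a starting point:

library(sf)

output$downloadData <- downloadHandler(
  filename = function() {
    paste0("data_", Sys.Date(), ".kml")
  },
  content = function(file) {
    # convert the data frame to an sf object using the X/Y coordinate columns
    data_sf <- st_as_sf(data, coords = c("X", "Y"), crs = 4326)
    # write KML through the libkml driver (requires a GDAL build with libkml)
    st_write(data_sf, dsn = file, driver = "libkml")
  }
)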
I am following the simple file upload example from the Shiny gallery, but with a slight modification. I need to modify the CSV files locally and see the changes reflected in the UI. However, I believe that this is not possible unless we poll the source for changes.
Therefore I simplify the problem by allowing the file to be re-uploaded. But this also does not work in Shiny: once a file "file1.csv" is uploaded, I cannot upload the same file again. I have to upload a different file "file2.csv" and then the original file "file1.csv" once again.
This is time consuming, and I am wondering if anyone has come across this issue and possibly found a solution for it.
Adding a bit of jQuery to clear the value of the fileInput does the trick.
...
fileInput("csvInput", "Upload CSV file", multiple = TRUE,
accept=c('text/csv',
'text/comma-separated-values,text/plain',
'.csv')),
tags$script('$( "#csvInput" ).on( "click", function() { this.value = null; });'),
...
Actually, it is one of the shortcomings of Shiny's fileInput module: it does nothing if the same file is selected again, and there is no built-in force-upload option.
To force re-uploading, the basic idea is:
1) After each upload, keep the uploaded data in a reactiveValues object as storage.
2) Render a new fileInput to replace the original one.
3) Show an upload-success message under the new fileInput for a better user experience.
In ui.R, use:
uiOutput('fileImport')
In server.R:
# reactive storage
v <- reactiveValues()
# fileInput index
v$fileInputIndex <- 1
updateFileInput <- function (name = NULL){
# update output with a new fileInput module
output$fileImport <- renderUI({
index <- isolate(v$fileInputIndex)
result <- div()
result <- tagAppendChild(
result,
fileInput(paste0('file', index), 'Choose CSV File',accept=c('text/csv','text/comma-separated-values,text/plain','.csv'))
)
# show a message of successful uploading
if(!is.null(name)){
result <- tagAppendChild(
result,
div(name," upload complete")
)
}
result
})
}
dataInputRaw <- reactive({
# equal to `input$file1` when initialized
inFile <- input[[paste0('file', v$fileInputIndex)]]
# TRICKY PART:
# 1. When initialized, `inFile` and `v$data` are both `NULL`
# 2. After each upload, a new `fileInput` is rendered and
#    we want to keep the previously uploaded data.
#    This also prevents recursive creation of new `fileInput`s.
if (is.null(inFile)){
return(v$data)
}
# load as data frame
v$data <- data.frame(read.csv(inFile$datapath))
# if the file was successfully uploaded, increase the index
# then re-render the `fileInput` with a success message
if (!is.null(v$data)){
v$fileInputIndex <- v$fileInputIndex + 1
updateFileInput(name = inFile$name)
}
# return data
v$data
})
# init
updateFileInput()
I have tested this snippet and it works.
I'm using the googlesheets package (CRAN version, but available here: https://github.com/jennybc/googlesheets) to read data from a Google Sheet in R, but would now like to add rows. Unfortunately, every time I use gs_add_row on an existing sheet I get the following error:
Error in gsheets_POST(lf_post_link, XML::toString.XMLNode(new_row)) :
client error: (405) Method Not Allowed
I followed the tutorial on Github to create a sheet and add rows as follows:
library(googlesheets)
library(dplyr)
df.colnames <- c("Project Short Name","Project Start Date","Proj Stuff")
my.df <- data.frame(a = "cannot be empty", b = "cannot be empty", c = "cannot be empty")
colnames(my.df) <- df.colnames
## Create a new workbook populated by this data.frame:
mynewSheet <- gs_new("mynewsheet", input = my.df, trim = TRUE)
## Append Element
mynewSheet <- mynewSheet %>% gs_add_row(input = c("a","b","c"))
mynewKey <- mynewSheet$sheet_key
Rows are added successfully; I even get the cheery message Row successfully appended.
I now provide mynewKey to gs_key, as I would if this were a new sheet I were working with and attempt to add a new row using gs_add_row (Note: before evaluating these lines, I navigate to the Google Sheet and make it public to the web):
myExistingWorkbook <- gs_key(mynewKey, visibility = "public")
## Attempt to gs_add_row
myExistingWorkbook <- myExistingWorkbook %>% gs_add_row(input = c("a","b","c"), ws="Sheet1", verbose = TRUE)
Error in gsheets_POST(lf_post_link, XML::toString.XMLNode(new_row)) :
client error: (405) Method Not Allowed
Things that I have tried:
1) Published the Google Sheet to the web (as per https://github.com/jennybc/googlesheets/issues/126#issuecomment-118751652)
2) Enabled the sheet as editable to the public
Notes
In my actual example, I have an existing Google Sheet with many worksheets within it that I would like to add rows to. I have tried to use a minimal example here to understand my error, I can also provide a link to the specific worksheet that I would like to update as well.
I have raised an issue on the package's github page here, https://github.com/jennybc/googlesheets/issues/168
googlesheets::gs_add_row() and googlesheets::gs_edit_cells() make POST requests to the Sheets API. This requires that the visibility be set to "private".
Above, when you register the Sheet by key, please do so like this:
gs_key(mynewKey, visibility = "private")
If you want this to work even for Sheets you've never visited in the browser, then add lookup = FALSE as well:
gs_key(mynewKey, lookup = FALSE, visibility = "private")
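Putting that together with the append call from the question, the corrected flow would look roughly like this (a sketch only; mynewKey and the worksheet name are taken from the example above):

library(googlesheets)

# register the existing Sheet with private visibility (required for POST requests)
myExistingWorkbook <- gs_key(mynewKey, lookup = FALSE, visibility = "private")

# the append now goes through without the 405 error
myExistingWorkbook <- gs_add_row(myExistingWorkbook, ws = "Sheet1",
                                 input = c("a", "b", "c"), verbose = TRUE)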
Goal
To upload a file in a Shiny app, read the data, name the variables (columns), and do some data analysis before presenting the plot output.
Reference Shiny App
I am using this app from the Shiny gallery as a reference.
What I have tried:
I want to use the uploaded data in many outputs after doing different analyses. So, instead of reading the file inside renderTable or renderPlot, I read it in the server function:
server <- function(input, output) {
  inFile <- reactive({input$file1})
  sdf <- reactive({read.csv(inFile()$datapath, header=F)})
  colnames(sdf()) <- c('Vehicle.ID', 'Time', 'Vehicle.class.no', 'Vehicle.type2',
                       'Vehicle.Length', 'Lane', 'Preceding.Vehicle.ID', 'Spacing','Spacing2', 'State',
                       'svel.mps', 'deltaV.mps', 'sacc', 'lane.change')
}
Error
But when I run this app I get:
shiny::runApp('app-CC')
Listening on http://127.0.0.1:7484
Error in colnames(sdf()) <- c("Vehicle.ID", "Time", "Vehicle.class.no", :
invalid (NULL) left side of assignment
Question
How can I fix this error? I don't want to read the same file again in every render* function. Are there any online examples of shiny apps where a new file is read, column names are defined and then some analysis is done before using the render* functions?
You'd be better off assigning the column names during the read.csv
server <- function(input, output) {
inFile <- reactive({input$file1})
sdf <- reactive({
read.csv(inFile()$datapath, header=F, col.names = c('Vehicle.ID', 'Time', 'Vehicle.class.no', 'Vehicle.type2',
'Vehicle.Length', 'Lane', 'Preceding.Vehicle.ID', 'Spacing','Spacing2', 'State',
'svel.mps', 'deltaV.mps', 'sacc', 'lane.change')
)
})
}
Alternatively I believe you can perform multiple operations in the reactive block as long as you return the final object
server <- function(input, output) {
inFile <- reactive({input$file1})
sdf <- reactive({
dd<-read.csv(inFile()$datapath, header=F)
colnames(dd) <- c('Vehicle.ID', 'Time', 'Vehicle.class.no', 'Vehicle.type2',
'Vehicle.Length', 'Lane', 'Preceding.Vehicle.ID', 'Spacing','Spacing2', 'State',
'svel.mps', 'deltaV.mps', 'sacc', 'lane.change')
dd
})
}
An alternative is to use an actionButton and then check the validity of the files when they are uploaded. Using plenty of observeEvent calls is probably appropriate for firing when something is "triggered," like a file being uploaded or a button being pressed. In addition, to make a chain of changes, it's probably best to have an "update" reactiveValues flag to fire on (which, I admit, is kind of messy, but works with Shiny since it doesn't have a proper callback framework like JS). A rough sketch of that pattern follows below.
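Here is a minimal sketch of that idea; the input IDs file1 and process, and the summary output, are hypothetical and would need to match whatever the actual UI defines:

library(shiny)

server <- function(input, output, session) {
  rv <- reactiveValues(data = NULL, refresh = 0)  # storage plus an "update" flag

  # fires whenever a file is uploaded via fileInput("file1", ...)
  observeEvent(input$file1, {
    rv$data <- read.csv(input$file1$datapath)
  })

  # fires when the actionButton("process", ...) is pressed
  observeEvent(input$process, {
    req(rv$data)                  # only proceed if a file has been read
    rv$refresh <- rv$refresh + 1  # bump the flag so downstream outputs re-run
  })

  output$summary <- renderPrint({
    rv$refresh                    # take a dependency on the flag
    req(rv$data)
    summary(rv$data)
  })
}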
I'm new to R (and programming in general), so apologies if this has been answered elsewhere. I was not able to find an answer via searching, but any help or direction would be great!
I'm trying to make a clickable interface in R where users can click to choose a file that then gets automatically analyzed in R.
Here's what I'm having trouble with:
require(tcltk)

getfile <- function() {
  name <- tclvalue(tkgetOpenFile(filetypes = "{{CSV Files} {.csv}}"))
  if (name == "") return()
  datafile <- read.csv(name, header = T)
}
tt <- tktoplevel()
button.widget <- tkbutton(tt, text = "Select CSV File to be analyzed", command = getfile)
tkpack(button.widget)
# The content of the CSV file is placed in the variable 'datafile'
Yet when I try to execute it and click on a CSV file of interest after the button pops up, nothing happens. By that I mean R gives me the error below when I type datafile.
Error: object 'datafile' not found
Once again, any help is much appreciated. Thanks!
The top level object has an environment in which you can store things, keeping them local to the GUI. Give your callbacks that object as an argument:
# callback that reads a file and stores the DF in the env of the toplevel:
getfile <- function(tt) {
  name <- tclvalue(tkgetOpenFile(filetypes = "{{CSV Files} {.csv}}"))
  if (name == "") return()
  tt$env$datafile <- read.csv(name, header = T)
}
# callback that does something to the data frame (should check for existence first!)
dosomething <- function(tt){
print(summary(tt$env$datafile))
}
# make a dialog with a load button and a do something button:
tt <- tktoplevel()
button.choose <- tkbutton(tt, text = "Select CSV File to be analyzed", command = function(){getfile(tt)})
tkpack(button.choose)
button.dosomething <- tkbutton(tt, text = "Do something", command = function(){dosomething(tt)})
tkpack(button.dosomething)
That should be fairly robust, although I'm not sure whether there's anything already in that environment that you need to be careful not to stomp on; if so, create an environment within that environment and use it for your local storage.
If you want to quit the dialog and save things for the user, prompt for a name and use assign to store them in .GlobalEnv on exit, as sketched below.
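For instance, a minimal sketch of such an exit callback (the fixed name "datafile" and the button label are my own choices; a Tk entry widget could be used to prompt the user for a name instead):

# copy the data frame from the dialog's environment into the user's workspace,
# then close the window
savequit <- function(tt) {
  if (!is.null(tt$env$datafile)) {
    assign("datafile", tt$env$datafile, envir = .GlobalEnv)
  }
  tkdestroy(tt)
}

button.quit <- tkbutton(tt, text = "Save and quit", command = function(){savequit(tt)})
tkpack(button.quit)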