write.csv and read.csv in Shiny App shared on shinyapps.io - r

I have created an app that I want to share on shinyapps.io.
Within the code for the app I use the functions load(), write.csv(), and read.csv(), which read and write files to folders called outputs and data. My app works fine when I run it locally, but when I deploy it I get the error:
cannot open compressed file 'data\Catchments.RData', probable reason 'No such file or directory'
I tried using a folder called www to store these but still had error messages. Is there a way to use these functions when sharing an app on shinyapps.io?

There is no way to rely on local directories for storage on shinyapps.io. An easy fix is to place an upload button inside the app, perform all the manipulations you need, and finally return the result with a download button (a sketch follows the quoted article below). Fetching the data from a remote server is also a good option.
As explained in this article:
"Local vs remote storage
Before diving into the different storage methods, one important distinction to understand is local storage vs remote storage.
Local storage means saving a file on the same machine that is running the Shiny application. Functions like write.csv(), write.table(), and saveRDS() implement local storage because they will save a file on the machine running the app. Local storage is generally faster than remote storage, but it should only be used if you always have access to the machine that saves the files.
Remote storage means saving data on another server, usually a reliable hosted server such as Dropbox, Amazon, or a hosted database. One big advantage of using hosted remote storage solutions is that they are much more reliable and can generally be more trusted to keep your data alive and not corrupted.
When going through the different storage type options below, keep in mind that if your Shiny app is hosted on shinyapps.io, you will have to use a remote storage method for the time being. In the meantime, using local storage is only an option if you’re hosting your own Shiny Server. If you want to host your own server, here is a guide that describes in detail how to set up your own Shiny Server."
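A minimal sketch of that upload/process/download pattern might look like the following. The column name value and the doubling step are placeholders for whatever manipulation your app actually performs:

library(shiny)

ui <- fluidPage(
  fileInput("upload", "Upload a CSV"),            # user supplies the data file
  downloadButton("download", "Download result")   # user retrieves the processed file
)

server <- function(input, output) {
  # Read the uploaded file; no server-side data folder is needed
  data <- reactive({
    req(input$upload)
    read.csv(input$upload$datapath)
  })

  # Placeholder manipulation -- replace with your own processing
  result <- reactive({
    df <- data()
    df$value <- df$value * 2
    df
  })

  # Write the result to the temporary file Shiny provides and hand it back
  output$download <- downloadHandler(
    filename = "result.csv",
    content = function(file) write.csv(result(), file, row.names = FALSE)
  )
}

shinyApp(ui, server)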

Related

Unsure where image_write downloads to in shinyapps.io

I'm attempting to make a public shinyapps.io website, and I'm trying to use image_write to write a file into a local directory.
The following code works in my local RStudio session:
image_write(im.resized, path = paste0(output_file_directory, file_name), format = "jpg")
When I run the code on the shinyapps.io website, the code runs without error, but I'm not sure where it downloads the file to. I know that the output_file_directory part isn't the issue, so I'm a little lost. Any help would be much appreciated!
On shinyapps.io it is not possible to store data permanently, because:
"Shinyapps.io is a popular server for hosting Shiny apps. It is designed to distribute your Shiny app across different servers, which means that if a file is saved during one session on some server, then loading the app again later will probably direct you to a different server where the previously saved file doesn’t exist."
See here:
https://shiny.rstudio.com/articles/persistent-data-storage.html
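If the goal is simply to get the resized image back to the user rather than keep it on the server, one pattern that works on shinyapps.io is to write the image into the temporary path that downloadHandler provides. This is only a sketch: im.resized is the magick object from the question, and the output id download_image is hypothetical.

library(shiny)
library(magick)

# Inside the server function: instead of writing to a fixed directory,
# write the image into the temporary file that Shiny hands to the handler.
output$download_image <- downloadHandler(
  filename = "resized.jpg",
  content = function(file) {
    image_write(im.resized, path = file, format = "jpg")
  }
)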

Is it possible to fetch and use a file from cloud storage when deploying a cloud function?

I have a firebase function that makes use of a SQLite database (read-only) which is currently uploaded along with the function.
The problem is that the db file is quite large and gets uploaded every time the function is changed. Is there a way to fetch this file from cloud storage during the installation process (during firebase deploy) - without hard-coding the URL in the source files?
What you're trying to do is problematic because your code running in Cloud Functions may actually be running on any number of server instances, determined by the load on your project. As such, downloading a file once at the time of deployment isn't going to naturally affect all the instances that may be created or destroyed at any given moment.
It's far better to keep doing what you're doing, and include your extra data during deployment. When a new instance is spun up to handle events for your function, the file will be immediately ready to help service requests.

How can a Shiny app handle read.csv() of an https-hosted CSV file the way a local instance can?

I have a shinyapps.io application running fine, referencing a number of Google Sheets (by using Publish to the web -> CSV). This means that I can read.csv(url) those https://... URLs both locally and in my Shiny app. That's excellent.
However, one of my data partners offered up a csv file at their own https location, and that time the application failed with
"ERROR: cannot open the connection to
'https://www.ncdetect.com/PublicDownload/OverdoseReport.csv'"
I've seen a handful of seemingly related questions elsewhere, though none ended up being helpful.
The (public data) file I can't read.csv() in a Shiny app is here, even though it's downloadable and my local instance can handle it just fine: https://www.ncdetect.com/PublicDownload/OverdoseReport.csv .
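For reference, the pattern the question relies on is just pointing read.csv() at the published URL; a sketch using the URL quoted above (the Google Sheets URL is a placeholder) looks like this:

# Published Google Sheet (File -> Publish to the web -> CSV), read directly:
sheet_url <- "https://docs.google.com/spreadsheets/d/e/<id>/pub?output=csv"  # placeholder URL
sheet_data <- read.csv(sheet_url)

# The partner-hosted file from the question, which works locally but fails
# on shinyapps.io with "cannot open the connection":
overdose <- read.csv("https://www.ncdetect.com/PublicDownload/OverdoseReport.csv")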

FinderSync invalidated on El Capitan

We have an application written in Mono that needs to communicate with a Finder Sync app extension.
Everything was working fine until we tried our app on El Capitan instead of Yosemite.
We use a shared SQLite database to tell what paths are in which state and use NSDistributedNotificationCenter for communication between the two.
The shared SQLite database is outside of the sandboxed environment, so we have put an exception in our entitlements: com.apple.security.temporary-exception.files.home-relative-path.read-write
If we remove this exception from the app extension, the extension works (but obviously we can't read our db)
Then we thought of putting the SQLite DB into memory, but shared in-memory databases aren't possible across multiple processes.
I can't find a way to create an NSFileHandle for an SQLite connection.
We could send all the info over to the app extension, but then it would have to keep it in memory (preferably in SQLite, because we need to do some querying).
Does anyone have pointers on what we could do?
Try looking into the application group container directory; it might work in your case. Basically, it allows you to have a shared folder between the app and its extension.
App group container directories. A sandboxed app can specify an entitlement that gives it access to one or more app group container directories, each of which is shared among all apps with that entitlement.
After some research on a similar problem, I found it's much easier to have a simple TCP server in the main app that responds to the extension with the file status. This way you can easily broadcast file-status changes to all extension instances, etc.

Use of shared database in Windows 8 App

I am developing a set of apps for Windows 8 which require local data storage, and I'm having trouble accessing the data from these apps. As far as I know, each app has its own location to store its data. Is it principally possible to access one file in Windows.Storage from different apps? I am using SQLite as the database and it is necessary to have one common database to share data between the apps.
No, the applications are sandboxed so they cannot access each other's local data storage. If you're willing to have your application initialization process ask the user where to put/find the SQLite database, then the applications could share that same file at a location on the file system that the user has selected via the File Picker.
That means, of course, that the user has to remember where the file is when he first launches the "second" app, since he'll need to browse to the same location where the "first" app deposited it.
If you leverage AccessCache.StorageItemAccessList, though, you can save that file location in your app's local storage, so the next time the user runs either app he doesn't have to go through the File Picker to grant permission again.
