I am trying to process a CSV file from within App Maker. I need to read the contents of the file and process it so that I can insert the correct data into the database. Is this possible?
Is it possible to access a CSV file directly with the CSV storage engine, without going through the trouble of loading it first?
We operate a data warehouse where, during load, CSV files are read into a temp table and their contents are then inserted into the production fact tables. I wonder if we could skip the temp-table detour and, so to speak, "go directly to insert"?
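For what it's worth, the "go directly to insert" idea can be as simple as pointing LOAD DATA at the fact table itself, provided the CSV columns already line up. A minimal Python sketch using the mysql-connector-python driver; the table, column and file names are hypothetical, and local_infile must be enabled on the server:

```python
# Sketch: load a CSV straight into the fact table, skipping the temp table.
# Assumes MySQL with local_infile enabled; names below are hypothetical.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost",
    user="etl",
    password="secret",
    database="warehouse",
    allow_local_infile=True,  # required for LOAD DATA LOCAL INFILE
)
cur = conn.cursor()
cur.execute(
    """
    LOAD DATA LOCAL INFILE '/data/loads/sales_2024.csv'
    INTO TABLE fact_sales
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    IGNORE 1 LINES
    (sale_date, store_id, product_id, quantity, amount)
    """
)
conn.commit()
cur.close()
conn.close()
```

The temp table is really only worth keeping when rows need to be validated or transformed before they reach the fact table.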
I have the following situation/workflow:
The user utilizes Tool A to capture sensor data and save it to a CSV file.
The user takes the CSV and uploads it to an R Shiny application (using fileInput) to process it further.
I would like to get rid of the intermediate CSV and directly open the Shiny application with the data already loaded. I.e. I need a way to transfer the contents of the CSV automatically to the Shiny application.
What could work:
Tool A stores the CSV in a special location that is served by an HTTP server on a fixed endpoint.
The Shiny application requests the file from the known location on startup.
However, this still needs the intermediate CSV file, and it adds complexity by introducing an additional server (or at least an endpoint). Furthermore, it needs additional logic if multiple users are active or multiple files are created at the same time. So it is far from ideal.
Is there any way to get the contents of the CSV directly from Tool A to the Shiny Application? E.g. by mimicking the messages the 'fileInput' widget produces?
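For reference, the "fetch on startup" variant described above could look roughly like the following. The sketch is in Python with requests and pandas; the endpoint URL and session parameter are hypothetical, and an R Shiny app would do the equivalent with read.csv() on the URL inside the server function:

```python
# Sketch: fetch the CSV from a known endpoint at app startup and load it into
# a data frame, instead of asking the user to upload it. The endpoint served
# by Tool A (or a small file server) is hypothetical.
import io
import requests
import pandas as pd

DATA_ENDPOINT = "http://localhost:8000/latest.csv"

def load_sensor_data(session_id: str) -> pd.DataFrame:
    # Pass a session/user identifier so concurrent users get their own file.
    resp = requests.get(DATA_ENDPOINT, params={"session": session_id}, timeout=10)
    resp.raise_for_status()
    return pd.read_csv(io.StringIO(resp.text))

if __name__ == "__main__":
    df = load_sensor_data("demo-user")
    print(df.head())
```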
I am creating my first mobile app in Xamarin.Forms. As a C#/.NET developer, I assumed there would be straightforward ways of working with databases.
I have a pre-loaded database "MyStoryTitles.db3", and I just want to show the contents.
My thinking was to:
store the db3 file in a folder named Data
connect to the file
use list view to show the content
But when I researched, I learned that SQLite cannot be used in such a straightforward way. What I understood from the online articles is that I have to:
store the db3 file as an Asset (for Android)
copy the file prior to opening it (as shown here)
connect to the locally copied db
use list view to show the content
Is this the standard procedure? Or can I store the db3 file as an EmbeddedResource and do the database operations as normal?
I have my Plotly Dash app running in PCF. My app.py runs based on an Excel file which is uploaded to PCF along with app.py, but the Excel feed changes daily, so every day I am uploading the new file to PCF using "cf push". Is it possible to avoid that, e.g. by making PCF read the Excel file from my file system instead of uploading the new Excel file to the PCF cell container every time?
Basically, you need some persistent storage attached to your container so the app can refer to the current file at run time. These are the options that can be explored:
If NFS is enabled at your end, you can mount the file share and pick up the files from that location directly.
Otherwise, you can have another PCF service (kept separate for better management) that pulls the files from your server using SFTP and transfers them to S3, and amend your app to read the file from S3.
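As a rough illustration of the S3 variant, the app can read the feed at request time instead of bundling it with the push. This is a sketch with boto3 and pandas; the bucket and key names are hypothetical, and credentials would come from a bound service or environment variables:

```python
# Sketch: read the daily Excel feed from S3 instead of re-pushing the app.
# Bucket/key names are hypothetical.
import io
import boto3
import pandas as pd

BUCKET = "my-feed-bucket"
KEY = "daily/feed.xlsx"

def load_feed() -> pd.DataFrame:
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    # Requires openpyxl (or another engine) to parse .xlsx content.
    return pd.read_excel(io.BytesIO(obj["Body"].read()))

df = load_feed()
print(df.shape)
```

With this in place, updating the feed is just an S3 upload; no "cf push" is needed.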
I have a working R Shiny app which is hosted on our internal org's Amazon AWS server. Users generate data in that Shiny app using the various widgets provided, and I need to save all the data generated in one session to a file stored in our internal Amazon S3 bucket.
The challenge we are facing is how to save this data when multiple users could be generating data with the Shiny app at the same time, and how to reload it back into the app later if needed.
We just can't lose any data even if two users simultaneously add data using the Shiny App.
Please advise on what our best approach could be.
I did follow the guidance provided here:
https://deanattali.com/blog/shiny-persistent-data-storage/
But is there a way where we don't need to create as many .csv files as there are users accessing the app?
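One pattern that survives concurrent users is to write each save under a unique key and reassemble on read, rather than keeping one long-lived CSV per user that simultaneous sessions would fight over. A minimal Python/boto3 sketch of that pattern; the bucket name is hypothetical, and in R the same idea works with the aws.s3 package (s3write_using / s3read_using):

```python
# Sketch: per-session writes under unique keys so two users saving at the
# same time can never overwrite each other. Bucket name is hypothetical.
import io
import uuid
import boto3
import pandas as pd

BUCKET = "my-shiny-session-data"
s3 = boto3.client("s3")

def save_session_data(df: pd.DataFrame, user_id: str) -> str:
    # Unique per save, grouped per user, so concurrent writes cannot collide.
    key = f"sessions/{user_id}/{uuid.uuid4()}.csv"
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    s3.put_object(Bucket=BUCKET, Key=key, Body=buf.getvalue().encode("utf-8"))
    return key

def load_all_user_data(user_id: str) -> pd.DataFrame:
    # Reassemble everything a user has saved when the app needs to reload it.
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=f"sessions/{user_id}/")
    frames = []
    for item in listing.get("Contents", []):
        obj = s3.get_object(Bucket=BUCKET, Key=item["Key"])
        frames.append(pd.read_csv(io.BytesIO(obj["Body"].read())))
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()
```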