Download File in App Maker - google-app-maker

I have created an app in App Maker, and I want to download certain reports from its Drive tables. For this I am currently creating a spreadsheet using the Drive APIs. I want to add a download feature as well, which allows users to download the spreadsheet to their local machine.
I have done research on Apps Script, which allows users to download files using ContentService; however, I do not have any HTML page from which I can invoke this method. Are there any alternatives?

It seems that you can get the download URL using the DriveApp Apps Script service:
// Server script: getDownloadUrl() returns a URL that forces a download of the file
var downloadUrl = DriveApp.getFileById('FileIdGoesHere').getDownloadUrl();
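To wire this up in App Maker, here is a minimal sketch (the helper name getDownloadUrl and the button handler are assumptions, not part of the original answer): expose the lookup as a server script function and open the returned URL from a client script.
// Server script (assumed helper)
function getDownloadUrl(fileId) {
  return DriveApp.getFileById(fileId).getDownloadUrl();
}

// Client script, e.g. in a button's onClick handler
google.script.run
  .withSuccessHandler(function(url) {
    window.open(url, '_blank'); // the browser starts the download
  })
  .getDownloadUrl('FileIdGoesHere');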

Related

Load audio/images from url, download and save the audio/images for offline mode ionic 5 (angular)

In my case, when the photos or audio files have been loaded from Firebase or played, they should be accessible offline and shouldn't need to be loaded from the URL again.
For that case, try saving the data to SQLite. Data held only in memory does not persist when you close the application, so I suggest storing the downloaded files in SQLite instead.
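As a rough illustration, a minimal sketch in plain JavaScript, assuming the cordova-sqlite-storage plugin; the media table and the cacheFile helper are hypothetical names:
// run after the deviceready event; assumes the cordova-sqlite-storage plugin
var db = window.sqlitePlugin.openDatabase({ name: 'media.db', location: 'default' });
db.executeSql('CREATE TABLE IF NOT EXISTS media (url TEXT PRIMARY KEY, data TEXT)', []);

function cacheFile(url) {
  fetch(url)
    .then(function (res) { return res.blob(); })
    .then(function (blob) {
      var reader = new FileReader();
      reader.onloadend = function () {
        // store the file as a base64 data URL, keyed by its source URL,
        // so it can be read back and played/displayed while offline
        db.executeSql('INSERT OR REPLACE INTO media (url, data) VALUES (?, ?)',
                      [url, reader.result]);
      };
      reader.readAsDataURL(blob);
    });
}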

Flutter Web Display Microsoft Documents Firebase

I am trying to embed MS Documents in a Flutter Web App.
Documents are stored on Firebase Storage. I am using MS Web Viewer to display them in browser.
This works without any problem:
https://view.officeapps.live.com/op/embed.aspx?src=https://file-examples.com/wp-content/uploads/2017/08/file_example_PPT_250kB.ppt
The following two versions where the documents are hosted on Firebase are not working:
https://view.officeapps.live.com/op/embed.aspx?src=https://firebasestorage.googleapis.com/v0/b/tutor-and-learn.appspot.com/o/public%2Ffile_example_PPT_250kB.ppt?alt=media&token=6e293eb9-9f3b-41ab-9969-f936b3c54384
https://view.officeapps.live.com/op/embed.aspx?src=https://storage.googleapis.com/tutor-and-learn.appspot.com/public/file_example_PPT_250kB.ppt?GoogleAccessId=firebase-adminsdk-t47jn%40tutor-and-learn.iam.gserviceaccount.com&Expires=1597309385&Signature=VHbm8U8xlf%2BYybwalAveZtl8FsmEmr6Uml%2BwX%2FR7TOFNqlj%2B8QW1FFSJUNB4qcAzVpEcntLzipT15Zj73B%2FLlSZlQwEU10s5RkJdR5CZeZ6MuF2DUptUbqfnNobdLkizEmwlQ6Bkk4DkDWCd9nRL%2BQ0GLYypBr%2Bxs39bpd8JSuxxACWCjq0Of8qLTBMZQmD%2BgbE8JkMdqvBVOV75A7EQyy1IWqHrRBD7RgVc46IEq4TaO2ZT9h56joJgawqZOt81%2Fkq95YmNWZNOeU9kVRuLpSFsqZru8Ku7aapiFcUXjrjuMWZeC1XCrTK7fwU6A8shNIyHq3bE8RB9a%2BCQnS0llA%3D%3D
Neither via Firebase directly nor via Google Cloud Storage do I get it to work.
The individual links in the above example work without any problems and you can download the file.
https://firebasestorage.googleapis.com/v0/b/tutor-and-learn.appspot.com/o/public%2Ffile_example_PPT_250kB.ppt?alt=media&token=6e293eb9-9f3b-41ab-9969-f936b3c54384
https://storage.googleapis.com/tutor-and-learn.appspot.com/public/file_example_PPT_250kB.ppt?GoogleAccessId=firebase-adminsdk-t47jn%40tutor-and-learn.iam.gserviceaccount.com&Expires=1597309385&Signature=VHbm8U8xlf%2BYybwalAveZtl8FsmEmr6Uml%2BwX%2FR7TOFNqlj%2B8QW1FFSJUNB4qcAzVpEcntLzipT15Zj73B%2FLlSZlQwEU10s5RkJdR5CZeZ6MuF2DUptUbqfnNobdLkizEmwlQ6Bkk4DkDWCd9nRL%2BQ0GLYypBr%2Bxs39bpd8JSuxxACWCjq0Of8qLTBMZQmD%2BgbE8JkMdqvBVOV75A7EQyy1IWqHrRBD7RgVc46IEq4TaO2ZT9h56joJgawqZOt81%2Fkq95YmNWZNOeU9kVRuLpSFsqZru8Ku7aapiFcUXjrjuMWZeC1XCrTK7fwU6A8shNIyHq3bE8RB9a%2BCQnS0llA%3D%3D
I presume the MS Web Viewer cannot cope with the URLs. Is there any way I can adapt or change anything in Firebase to get it to work?
Looking in the Firebase Storage console, the files are listed with the correct type, application/vnd.ms-powerpoint.
URL-encode the signed URL before adding it to https://view.officeapps.live.com/op/embed.aspx?src=. Without encoding, the inner URL's own query parameters (alt, token, and so on) are parsed as parameters of the viewer URL instead of as part of the src value. You can use online tools to encode URLs, like https://www.urlencoder.org/
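For example, in JavaScript (illustrative only; the signed URL is a shortened placeholder):
// percent-encode the signed URL so its query string survives as part of src
var signedUrl = 'https://firebasestorage.googleapis.com/v0/b/your-bucket/o/file.ppt?alt=media&token=...'; // placeholder
var viewerUrl = 'https://view.officeapps.live.com/op/embed.aspx?src=' + encodeURIComponent(signedUrl);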

Accessing files from Google cloud storage in RStudio

I have been trying to create a connection between Google Cloud Storage and an RStudio Server (the one I spun up in Google Cloud), so that I can access the files in R to run some analysis on them.
I have found three different ways to do it on the web, but I don't see much clarity around these ways so far.
Access the file by using the public URL specific to the file [This is not an option for me]
Mount the Google Cloud Storage bucket as a disk on the RStudio server and access it like any other file on the server [I saw someone post about this method but could not find any guides or materials that show how it's done]
Use the googleCloudStorageR package to get full access to the Cloud Storage bucket.
Option 3 looks like the standard way to do it, but I get the following error when I run the gcs_auth() command:
Error in gar_auto_auth(required_scopes, new_user = new_user, no_auto = no_auto, :
  Cannot authenticate - options(googleAuthR.scopes.selected) needs to be set to
  include https://www.googleapis.com/auth/devstorage.full_control or
  https://www.googleapis.com/auth/devstorage.read_write or
  https://www.googleapis.com/auth/cloud-platform
The guide on how to connect is at
https://github.com/cloudyr/googleCloudStorageR
but it says it requires a service-auth.json file to set the environment variables, plus other keys and secret keys, and does not really specify what these are.
If someone could help me understand how this is actually set up, or point me to a good guide on setting up the environment, I would be very grateful.
Thank you.
Before using any Google Cloud services you have to attach your card (enable billing).
So, I am assuming that you have created the account. After creating the account, go to the Console; if you have not created a project, create one, then in the sidebar find APIs & Services > Credentials.
Then:
1) Create a service account key and save the file as JSON; you can only download it once.
2) Create an OAuth 2.0 client ID: give the name of the app, select the type as web application, and download the JSON file.
Now, for Storage, find Storage in the sidebar and click on it.
Create a bucket and give it a name.
I have added a single image to the bucket; you can add one as well for the purposes of this code.
Let's look at how to download this image from Storage; for everything else you can follow the link that you have given.
First, create an environment file named .Renviron and save it in the working directory, so that the JSON files are picked up automatically.
In the .Renviron file, point to the two downloaded JSON files like this:
GCS_AUTH_FILE="serviceaccount.json"
GAR_CLIENT_WEB_JSON="Oauthclient.json"
# R part
library(googleCloudStorageR)
library(googleAuthR)

# set the OAuth client and scopes first, then authenticate
gar_set_client(scopes = c("https://www.googleapis.com/auth/devstorage.read_write",
                          "https://www.googleapis.com/auth/cloud-platform"))
gcs_auth()  # authenticate with the service account file from .Renviron

gcs_get_bucket("your_bucket_name")     # name of the bucket that you have created
gcs_global_bucket("your_bucket_name")  # set it as the global bucket
gcs_get_global_bucket()                # check that the global bucket is set; it should print your bucket name

objects <- gcs_list_objects()          # list the objects in the bucket
names(objects)
gcs_get_object(objects$name[[1]], saveToDisk = "abc.jpeg")  # save the first object to disk
Note: if the JSON files do not get loaded, restart the session using .rs.restartR()
and check with
Sys.getenv("GCS_AUTH_FILE")
Sys.getenv("GAR_CLIENT_WEB_JSON")
# both should print the file names
You probably want the FUSE adapter - this will allow you to mount your GCS bucket as a directory on your server.
Install gcsfuse on the R server.
Create a mnt directory.
Run gcsfuse your-bucket /path/to/mnt
Be aware, though, that read/write performance isn't great via FUSE.
Full documentation:
https://cloud.google.com/storage/docs/gcs-fuse

How can I create and download a zip file using Actionscript

I'm developing an application using the Adobe Flex 4.5 SDK, in which the user should be able to export multiple files bundled in one zip file. I was thinking that I need to take the following steps to perform this task:
Create a temporary folder on the server for the user who requested the download. Since it is an anonymous type of user, I have to read state/session information to identify the user.
Copy all the requested files into the temporary folder on the server.
Zip the copied files.
Download the zip file from the server to the client machine.
I was wondering if anybody knows any best practices or sample code for this task.
Thanks
The ByteArray class has some methods for compressing, but this is more for data transport, not for packaging up multiple files.
I don't like saying things are impossible, but I will say that this should be done on the server side. Depending on your server architecture, I would suggest sending the binary files to a server script which can package the files for you.
A quick Google search for your preferred server-side language and zipping files should give you some sample scripts to get you started.
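As a rough illustration of such a server script, here is a minimal Node.js sketch using Express and the archiver package (the stack, route, and file paths are all assumptions; the original answer names no specific language). Streaming the archive into the response also avoids the temporary folder from the question:
const express = require('express');
const archiver = require('archiver');

const app = express();

app.get('/download-zip', function (req, res) {
  // tell the browser to save the response as a zip file
  res.setHeader('Content-Type', 'application/zip');
  res.setHeader('Content-Disposition', 'attachment; filename="files.zip"');

  const archive = archiver('zip');
  archive.pipe(res); // stream the archive straight into the HTTP response
  archive.file('reports/report1.pdf', { name: 'report1.pdf' }); // hypothetical paths
  archive.file('reports/report2.pdf', { name: 'report2.pdf' });
  archive.finalize();
});

app.listen(3000);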

How to push data from excel to SQL Server?

I have written a simple ASP.NET MVC 2 application that stores data and can dynamically create Excel files using Microsoft's Open XML for Excel.
What is the best way to push changes the user makes in Excel to my database? I know it can be done via file upload, but that is rather obtrusive: the end user has to navigate to my site, select upload, and then select their file.
Is there a way to do one-click publishing from the Excel file using VBA? VBA can interact with the database directly, but this seems dangerous from a data-security standpoint, and it duplicates logic.
Do web services work with the MVC architecture? How do I get a VBA macro-enabled document to send itself to the server?
For anyone out there looking for a fix, I ended up using VBA's InternetExplorer.Application object and interacting with an upload form on my site.
For more info on the upload form check out:
http://haacked.com/archive/2010/07/16/uploading-files-with-aspnetmvc.aspx
For more info on VBA and the InternetExplorer.Application object check out:
www.motobit.com/tips/detpg_uploadvbaie/
You might take a look at SQL Server Integration Services (SSIS) for bulk upload of data into SQL Server. The integration packages, once created, can be run from a normal C# desktop program or from a Windows service.
But you might:
need to make sure this happens in the background, so it will have to be an asynchronous task;
also need to make sure it is properly secured, by not giving direct execute access to any other users.
I'm assuming that this is for a specific user. I've done something very similar to what you are describing before.
Tell the user to save the Excel file in their Dropbox and share the file with you.
Have the server listen for changes to this file and run a server-side routine to import the data, as in the sketch below.
Disclaimer: this is not a secure solution, but it's easy and will get the job done.
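A minimal sketch of that listener in Node.js (the synced path and the importWorkbook helper are hypothetical; any stack with a file watcher works the same way):
const fs = require('fs');

const WATCHED = '/srv/dropbox/shared/report.xlsx'; // hypothetical synced path

function importWorkbook(path) {
  // placeholder: parse the workbook with a spreadsheet library and
  // push the rows into the database
  console.log('re-importing', path);
}

// poll the file's metadata and re-import whenever it has been modified
fs.watchFile(WATCHED, { interval: 5000 }, function (curr, prev) {
  if (curr.mtime > prev.mtime) {
    importWorkbook(WATCHED);
  }
});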
