Is there a way to download from Firebase storage as a file? - firebase

In my application I am attempting to download from Firebase Storage as a File object so that I can use the File.copy method to save the image or video locally as a file. Is there any way to download from Firebase Storage as a generic File object?

Check the documentation here; by following it, you should be able to download the file.
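If it helps, here is a minimal sketch assuming the Firebase Web SDK (v9 modular API): get the download URL, fetch the bytes as a Blob, and wrap them in a File. The storage path and file name below are made-up examples. If you are on Flutter, the firebase_storage plugin also has a writeToFile method that saves a reference directly into a dart:io File, which you could then copy with File.copy.

// Sketch only: assumes an initialized Firebase app and the Web SDK (v9 modular API).
// "uploads/video.mp4" is a placeholder path. Fetching the URL from a browser
// requires CORS to be configured on the bucket.
import { getStorage, ref, getDownloadURL } from "firebase/storage";

async function downloadAsFile(path) {
  const url = await getDownloadURL(ref(getStorage(), path));
  const response = await fetch(url);
  const blob = await response.blob();
  // Wrap the bytes in a File so it can be handled like a locally picked file.
  return new File([blob], path.split("/").pop(), { type: blob.type });
}

// const file = await downloadAsFile("uploads/video.mp4");

Recent versions of the modular Web SDK also expose a getBlob() helper that skips the download-URL round-trip.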

Related

Pass CSV file from GET request to S3 in Airflow

I didn't find an answer for this here, so I thought someone might be able to help:
I'm receiving a CSV file from a GET request.
I want to upload it to S3 (and then continue the pipeline..)
I'm using Airflow on the managed AMAA platform.
When uploading to S3, the script requires a file path for the CSV file.
How can I pass a file path when it's running on the AMAA platform? Is the file even stored anywhere?
Do I need a middleman to store it in between?

How to keep/re-create object metadata during gsutil cp on storage bucket

I would like to sync all of the files in my Google Cloud Storage bucket with the exported files in my Firebase Storage Emulator.
I downloaded all of my cloud files using gsutil to my local machine.
I used BeyondCompare to move all of the new files into the '../storage_export/blobs/' directory.
How do I update/create the JSON metadata in '../storage_export/metadata' to reflect these new files and make them available when I run the emulator and import them in?
Edit:
The gsutil docs mention the following:
when you download data from the cloud, it ends up in a file with no associated metadata, unless you have some way to keep or re-create that metadata.
How would one "keep" or "re-create" that metadata during a gsutil cp download?
You can use gsutil or the SDK to get each object's metadata and then write it out to a JSON file. However, there is currently no native way to import Google Cloud Storage data into the Storage Emulator. But as I stated in my answer to this post, you can study how the emulator registers an object by uploading sample files within the emulator and then running the export: you will see that the emulator requires one object plus one JSON file containing its metadata (a rough sketch of dumping that metadata follows below).
Lastly, as stated in this post, you can add the --export-on-exit option when starting the emulator: download all the data from the real Firebase project, upload everything through the emulator, then kill the emulator and it will write out the export.
Note: This is not a documented feature! Firebase doesn't expose the concept of download tokens in its public SDKs or APIs, so manipulating tokens this way feels a bit "hacky". For your further reference, check this post.
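As a rough illustration of the metadata step mentioned above, here is a minimal Node.js sketch that uses the @google-cloud/storage client to read each object's metadata and write it out as one JSON file per object. The bucket name and output directory are placeholders, and you would still need to reshape the JSON into the exact format the emulator's export uses.

// Sketch only: assumes @google-cloud/storage is installed and that
// application default credentials are configured. "my-bucket" and
// "./metadata" are made-up names.
const { Storage } = require("@google-cloud/storage");
const fs = require("fs/promises");
const path = require("path");

async function dumpMetadata(bucketName, outDir) {
  const [files] = await new Storage().bucket(bucketName).getFiles();
  await fs.mkdir(outDir, { recursive: true });
  for (const file of files) {
    // getMetadata() returns fields such as name, contentType, size and md5Hash.
    const [metadata] = await file.getMetadata();
    const outPath = path.join(outDir, `${file.name.replace(/\//g, "_")}.json`);
    await fs.writeFile(outPath, JSON.stringify(metadata, null, 2));
  }
}

dumpMetadata("my-bucket", "./metadata").catch(console.error);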

How does the InputFile function in R work?

I am developing an R Shiny app, and the app will receive video from the user and upload it to an AWS S3 bucket. I am not clear about how this video is uploaded if I use R Connect to deploy the app. Does it go through HTTPS or HTTP? I know it will be saved to the R Shiny server and then uploaded to the S3 bucket, but is there a way to save the video directly to the S3 bucket?
From my own research, the caveat I have found is that you must use the writeBin() function on the uploaded file, saving it to a temporary directory, before saving it to the aws.s3 bucket.

Does admin.storage.object contain/process multiple files if multiple files are uploaded to Firebase Storage?

I am trying to move files into separate folders in Firebase Storage once they have been uploaded. As it turns out, you cannot achieve this with the JavaScript Web Client SDK for Storage. However, it appears that you can do so with the Admin SDK for Storage using Firebase Functions. So that is what I am trying to do. I understand that you need to first download a file into your Firebase Functions and then re-upload it into a new folder in Storage.
To download a file, I need to pass its reference from the client, and here is where it gets confusing to me. I am currently getting all the uploaded files in the client via the listAll() function, which returns items and prefixes. I am wondering whether I can use either the items or the prefixes to then download the files in Firebase Functions. Alternatively, I can pass the URLs. However, the question is, which method do I use to get and download them in Functions afterwards?
I know of admin.storage.object as explained in https://firebase.google.com/docs/storage/extend-with-functions#trigger_a_function_on_changes. However, does it handle multiple files? In other words, the object, as I understand it, is one file that is uploaded to Storage, and you can use its attributes such as object.bucket or object.name to access more information. However, what if multiple files are uploaded at the same time; does it handle them one by one? Also, if I am passing the references or URLs of the files that need to be downloaded from the client, is admin.storage.object the right choice? Because it seems to simply process all the files uploaded to Storage, instead of getting any references from the client.
Further, there is a description of how to download a file (https://firebase.google.com/docs/storage/extend-with-functions#example_image_transformation) which is this code: await bucket.file(filePath).download({destination: tempFilePath});
I understand that the filepath is basically the name of the file that is already in Storage (ex. /someimage). But what if there are other files with the same name? Might the wrong file be downloaded? And how do I make sure that the filepath is the file that I passed from the client?
Let me know what your thoughts are and whether or not I am heading in the right direction. If you include a code in your answer, please write it in JavaScript for the Web. Thank you.
Here are some points that could help:
In GCP Storage there are technically no folders; GCS emulates a directory structure by using / in object names.
When you set up a Cloud Function triggered by a GCS object change, each object change is a separate event, and each event triggers one invocation of the function (you might have a bucket for unprocessed files that triggers the function, and move them to a different bucket once processed); a rough sketch follows after these points.
You might consider using the REST API to move/copy/rename the objects without having to download them.
As a side note, the question is a little too broad; hopefully these points help clarify things for you.
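To make the last two points above concrete, here is a minimal sketch of a storage-triggered function, assuming the first-generation firebase-functions API and a made-up processed/ prefix. It is invoked once per finalized object and moves that object server-side, without downloading its contents.

// Sketch only: assumes firebase-functions (v1 API) and firebase-admin are installed.
// The "processed/" prefix is a placeholder.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.moveUploadedFile = functions.storage.object().onFinalize(async (object) => {
  // Each finalized object triggers one invocation, so multiple uploads
  // are handled one file at a time.
  const name = object.name; // full object path, e.g. "uploads/img.png"
  if (name.startsWith("processed/")) {
    return; // skip objects this function has already moved
  }
  // move() copies the object within the bucket and deletes the original,
  // so nothing is downloaded into the function.
  const bucket = admin.storage().bucket(object.bucket);
  await bucket.file(name).move(`processed/${name}`);
});

If you only have listAll() results or download URLs on the client, you can instead pass the storage path (reference.fullPath) to a callable function and address the object with bucket.file(path); object names are unique within a bucket, so as long as you pass the full path you will not download the wrong file.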

Flutter - where to put own SQLite .db file?

I have a SQLite .db file that I want to access through sqflite on Flutter. Where in the Flutter project am I supposed to put it so that I can access it both on Android and iOS? How do I make sure that it's shipped with the apk? All examples that I found assume that the db needs to be created from scratch at the first launch.
You can put the db file in your assets folder and declare it in your pubspec.yaml. On startup you can write it out to disk and then open the database from that path.
You can read the asset using rootBundle (from package:flutter/services.dart):

var dbContent = await rootBundle.load('assets/database/mydb.db');

Then write it out to your file system and go from there.
I've found that this problem is related to:
https://stackoverflow.com/a/51387985/3902715
Credits to R. C. Howell
