How many files can I upload at once to Firebase Hosting?

We're thinking about using Firebase Hosting (which is awesome: SSL, control over redirects, an easy CLI tool, etc.) to host our API docs. Our current build generates about 17k files. We did a test upload, and everything worked (pretty cool!). We're curious: is there a limit to the number of files we can deploy to Firebase Hosting?

I'm not aware of a limit on the number of files. But the zipped result (which is what actually gets uploaded) definitely has to be less than 2GB.

Related

How to synchronize files from firebase storage to app's assets folder?

I'm still new to Flutter and Firebase. I understand how to store and retrieve images and display them in the app.
But how do I go about synchronizing files from the app's local assets folder with an assets folder stored in Firebase Storage? My intention is to check the cloud folder for newly uploaded images, such as icons, and download them to the app's local folder. If a file is removed from cloud storage, it should also be removed from the local assets folder, mirroring it.
I need a way to compare the local AssetManifest.json to the one in Firebase Storage. I just need a little direction/algorithm to start with. Thanks for the help!
There is nothing specific built into Cloud Storage's Firebase SDK for this, so you'll have to build it in your own application code.
Using only the Cloud Storage for Firebase API
If you're just using Cloud Storage, you'll have to:
List all files that you're interested in from Storage.
Loop over the files.
Get the metadata for each file and check when the file was last updated.
Compare that to when you wrote the local file.
Download the file if it is new or modified.
This approach will work (see the sketch below), but it requires quite a few calls to the Storage API, because there's no API that returns only the files modified since a specific date/time.
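A minimal sketch of that loop, shown with the Firebase web SDK for illustration (the question is about Flutter, where the firebase_storage plugin exposes equivalent calls); the assets path and the lastWritten bookkeeping are assumptions:

```typescript
import { getStorage, ref, listAll, getMetadata, getBytes } from "firebase/storage";

// Hypothetical bookkeeping: when we last wrote each remote file locally.
const lastWritten = new Map<string, number>();

async function syncAssets(): Promise<void> {
  const storage = getStorage();

  // 1. List all the files we're interested in.
  const { items } = await listAll(ref(storage, "assets"));

  // 2. Loop over the files.
  for (const item of items) {
    // 3. Get the metadata and check when the file was last updated.
    const meta = await getMetadata(item);
    const remoteUpdated = Date.parse(meta.updated);

    // 4. Compare that to when we wrote the local copy.
    if (remoteUpdated > (lastWritten.get(item.fullPath) ?? 0)) {
      // 5. Download the file if it is new or modified.
      const bytes = await getBytes(item);
      // ...persist `bytes` to the local assets folder here...
      lastWritten.set(item.fullPath, Date.now());
    }
  }
}
```

The same list result also lets you mirror deletions: any local file whose path no longer appears in the listed items can be removed.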
Storing metadata in a cloud database
You could also consider storing the metadata in a cloud database, like Firebase's Realtime Database or Cloud Firestore, and then use the query capabilities of that database to retrieve only files that were modified since your device last synchronized.
The recipe then becomes:
Determine when we last synchronized, which is a value you'll want to store in local storage/shared preferences.
Execute a query to the database to determine which files were added/modified since then.
Loop over the query results and...
Download each file that was modified.
Here, only steps 2 and 4 make calls to the external APIs (see the sketch below), so it is likely to be faster and cheaper to execute (but more work for you to write initially).
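A sketch of that recipe with Cloud Firestore, again using the web SDK; the files collection, its path and updatedAt fields, and how you persist the sync marker are all assumptions about how you'd model the metadata:

```typescript
import { getFirestore, collection, query, where, getDocs, Timestamp } from "firebase/firestore";
import { getStorage, ref, getBytes } from "firebase/storage";

async function syncSince(lastSyncMillis: number): Promise<number> {
  // Step 2: a single query returns only the files added/modified since then.
  const q = query(
    collection(getFirestore(), "files"),
    where("updatedAt", ">", Timestamp.fromMillis(lastSyncMillis))
  );
  const snapshot = await getDocs(q);

  // Steps 3-4: loop over the query results and download each modified file.
  const storage = getStorage();
  for (const doc of snapshot.docs) {
    const bytes = await getBytes(ref(storage, doc.data().path as string));
    // ...persist `bytes` locally...
  }

  // Step 1's counterpart: the caller stores this as the new sync marker.
  return Date.now();
}
```

Note this requires your upload path to write a matching document (with path and updatedAt) whenever a file is added or changed, for example from a Cloud Function trigger.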

What's the purpose of .firebase/hosting. ALPHANUM.cache

Today I deployed to Firebase Hosting. After the deployment, I noticed the Firebase CLI creates a file at .firebase/hosting.ALPHANUM.cache, where ALPHANUM is some random-looking alphanumeric value.
Question
What is the purpose of this file?
More specifically, can I add this file to .gitignore?
Or, I should not?
This file is part of a new feature in Firebase Hosting that minimizes the size and time of a hosting deployment by only uploading the files that changed since the last deployment. It's new in CLI version 4.2.0, and you can read about it on GitHub.
As Frank suggested, you should definitely add the .firebase directory to your .gitignore or equivalent file, since it contains information that's not strictly part of your project, and is likely not applicable for everyone sharing and contributing to your project source code.

Accessing Files from Firebase Storage vs Firebase Hosting?

So here is the scenario:
When I access files from Firebase Storage:
I get my file from the storage bucket (.html, .png, .zip, etc.); each file is small, no more than 2 MB.
I store that file in local storage so the app doesn't need to download it again and consume the server's bandwidth.
I use it from local storage every time the app needs it.
When I access files from Firebase Hosting:
I get my file from the nearest Firebase CDN edge (.html, .png, .zip, etc.); again, no more than 2 MB.
I store that file in local storage so the app doesn't need to download it again and consume the server's bandwidth.
I use it from local storage every time the app needs it.
NOTE: I also have a file version.txt in the storage bucket (Firebase Storage). Based on the value in this file, I decide whether to fetch the file in step 1 again or not. This means version.txt is fetched every time.
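For reference, the version.txt check described in that note might look like this with the Firebase web SDK; the file name comes from the question itself, everything else is illustrative:

```typescript
import { getStorage, ref, getBytes } from "firebase/storage";

// Fetch the remote version marker (a tiny file, so this check stays cheap).
async function remoteVersion(): Promise<string> {
  const bytes = await getBytes(ref(getStorage(), "version.txt"));
  return new TextDecoder().decode(bytes).trim();
}

// Re-download the cached assets only when the marker has changed.
async function needsRefresh(localVersion: string): Promise<boolean> {
  return (await remoteVersion()) !== localVersion;
}
```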
Questions:
How can I achieve a similar versioning scheme in Firebase Hosting? I know we deploy folders; can we get their version from the Firebase CDN? If yes, how?
With which method will I hit my usage LIMIT first, given that Firebase is paid beyond the free tier?
Pros of Hosting: It will be faster. Link
PS:
1. My concern is bandwidth and not security.
Currently, I am using the basic (free) plan with its limits.
From the Firebase docs:
The Firebase Realtime Database stores JSON application data, like game state or chat messages, and synchronizes changes instantly across all connected devices.
Firebase Remote Config stores developer-specified key-value pairs to change the behavior and appearance of your app without requiring users to download an update.
Firebase Hosting hosts the HTML, CSS, and JavaScript for your website as well as other developer-provided assets like graphics, fonts, and icons.
Cloud Storage stores files such as images, videos, and audio as well as other user-generated content.
Storage has higher free tier limits, while Hosting might be a little faster. Note that all files on Hosting are publicly accessible, so if you need authentication or authorization, you should use Storage.
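On the versioning question: I'm not aware of a client-side API that reports a Hosting deploy's version, but you can replicate the version.txt pattern by deploying a small marker file alongside your site. A sketch, where the version.json name, its contents, and the your-project.web.app domain are all assumptions:

```typescript
// Fetch a deploy marker from Hosting, bypassing the local HTTP cache so a
// new deploy is noticed immediately (the CDN still serves the response).
async function hostedVersion(): Promise<string> {
  const res = await fetch("https://your-project.web.app/version.json", {
    cache: "no-store",
  });
  const { version } = (await res.json()) as { version: string };
  return version;
}
```

You'd bump the value in version.json as part of each deploy, mirroring how version.txt works on the Storage side.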

Handling Wordpress media files on EC2/S3 with auto scaling

I'm working on a WordPress deployment configuration on Amazon AWS. I have WordPress running on Apache on an Ubuntu EC2 instance. I'm using W3 Total Cache for caching and to serve user-uploaded media files from an S3 bucket. A load balancer distributes traffic to two EC2 instances with auto scaling to handle heavy loads.
The problem is that user-uploaded media files are stored locally in wp-content/uploads/ and then synced to the S3 bucket. This means that the media files are inconsistent between EC2 instances.
Here are the approaches I'm considering:
Use a WordPress plugin to upload the media files directly to S3 without storing them locally. The problem is that the only plugins I've found (this and this) are buggy and poorly maintained. I'd rather not spend hours fixing one of these myself. It's also not clear whether they integrate cleanly with W3 Total Cache (which I also want to use for its other caching tools).
Have a master instance where users access the admin interface and upload media files. All media files would be stored locally on this instance (and synced to S3 via W3 Total Cache). Auto scaling would deploy slave instances with no local file storage.
Make all EC2 instances identical and point wp-content/uploads/ to a separate EBS volume. All instances would share media files.
Use rsync to copy media files between running EC2 instances.
Is there a clear winner? Are there other approaches I should think about?
You might consider looking at something like s3fs (http://code.google.com/p/s3fs/). This allows you to mount your S3 bucket as a volume on your server instances. You could simply have the code that mounts the volume executed on instance start-up.
s3fs can also use local (ephemeral) directories as a cache for the mounted directory to improve performance.
