I'm using Firebase Storage to upload avatars (original.jpg).
In a Node.js background process I resize them with gm and then put them back in my Firebase bucket. These pictures are publicly readable; see my storage rules:
service firebase.storage {
  match /b/xxx.appspot.com/o {
    match /images {
      match /{allImages=**} {
        allow read;
      }
      match /{userId}/avatar/original.jpg {
        allow write: if request.auth.uid == userId;
      }
    }
  }
}
I store the URL of each image in the Firebase database, e.g.:
https://firebasestorage.googleapis.com/v0/b/xxx.appspot.com/o/images%2Fyyy%2Favatar%2Fzzz-small.jpg?alt=media
I retrieved that URL right after the file upload (web SDK) with var downloadURL = uploadTask.snapshot.downloadURL;, stripped off the token parameter since the file is going to be public, and replaced original.jpg with zzz-small.jpg.
It works, pictures are displayed.
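As a sketch, the token-stripping step above can be done with the standard URL API (the helper name is mine, not from the original post):

```javascript
// Hypothetical helper (not part of the Firebase SDK): strip the token
// query parameter from a Firebase Storage download URL, keeping ?alt=media.
function stripToken(downloadURL) {
  const url = new URL(downloadURL);
  url.searchParams.delete('token');
  return url.toString();
}
```

For example, calling it on a URL ending in `?alt=media&token=abc123` returns the same URL ending in `?alt=media`.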
But it's slow even though the picture is a 40x40px JPEG, and it hangs for a while before actually downloading the file; see the Chrome network record:
(screenshot: Chrome network record)
The file is 158 B, yet there's about 1 s of waiting before the actual download, which takes about 3 ms.
Is Firebase Storage supposed to be as fast as a CDN?
Is there any faster way to serve a publicly readable file?
[EDIT]
Got some feedback from Firebase support :
We're definitely aware that many users, such as yourself, are having
issues with slow downloading of file from Europe. We're exploring
potential solutions, but I can't share any details or timelines at
this time. We'll keep your feedback in consideration moving forward
though.
Keep an eye out on our release notes for any further updates.
I forgot to mention I'm in Europe and the storage in the US.
Related
I am struggling to find out how to set a limit on how much storage each user can upload to my app's storage.
I found a Storage.storageLimitInBytes method mentioned online, but I don't see this method even mentioned in the Firebase docs, let alone instructions on how to set it.
In general, how do startups monitor how many times a user uploads images? Would they have a field in the user's document such as amountOfImagesUploaded, and increment that count every time the user uploads an image, so they could see who abuses the storage that way?
Or would I have a similar document that tracks each user's uploads per day, and take action on a user when the count reaches 100 or so?
I would really appreciate your help with this issue that I am facing.
Limits in Cloud Storage for Firebase security rules apply to each file/object separately; they don't apply to an entire operation.
You can limit what a user can upload through Firebase Storage's security rules. For example, this rule (from the linked docs) limits the size of uploaded files:
service firebase.storage {
  match /b/<your-firebase-storage-bucket>/o {
    // Only allow uploads of any image file that's less than 5MB
    match /images/{imageId} {
      allow write: if request.resource.size < 5 * 1024 * 1024
                   && request.resource.contentType.matches('image/.*');
    }
  }
}
But there is currently no way in these rules to limit the number of files a user can upload.
Some options to consider:
If you hardcode the names of the files that the user uploads (which also implies you'll limit the number of files they can upload), and create a folder for each specific user's files, you can determine the sum of all files in a user's folder, and thus limit on that sum.
For example, if you fix the file names and limit the allowed names to be numbered 1..5, the user can only ever have five files in storage:
match /public/{userId}/{imageId} {
  allow write: if imageId.matches("[1-5]\.txt");
}
Alternatively, you can ZIP all files together on the client and then upload the resulting archive. In that case, the security rules can enforce the maximum size of that file.
And of course you can include client-side JavaScript code to check the maximum size of the combined files in both of these cases. A malicious user can bypass this JavaScript easily, but most users aren't malicious and will thank you for saving their bandwidth by preventing an upload that would be rejected anyway.
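A minimal sketch of such a client-side check, assuming a 5 MB combined limit to mirror the rule above (the names are mine, not from the original answer):

```javascript
// Client-side pre-check. As noted above, a malicious user can bypass this
// easily; it only saves honest users a doomed upload.
// MAX_TOTAL_BYTES is an assumed app-level constant.
const MAX_TOTAL_BYTES = 5 * 1024 * 1024;

function filesWithinLimit(files, maxBytes = MAX_TOTAL_BYTES) {
  // Each entry is expected to expose a .size in bytes (like a File object).
  const total = files.reduce((sum, f) => sum + f.size, 0);
  return total <= maxBytes;
}

// e.g. from an <input type="file" multiple> change handler:
// if (!filesWithinLimit([...input.files])) { /* refuse the upload */ }
```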
You can also use an HTTPS Cloud Function as your upload target and only pass the files on to Cloud Storage if they meet your requirements. Alternatively, you can use a Cloud Function that triggers on the user's upload and validates that user's files after the change. For example, you would upload the files through a Cloud Function/server and keep track of the total size each user has uploaded. For that:
1. Upload the image to your server.
2. Check its size and add it to the total size stored in a database.
3. If the user has exceeded 150 GB, return a quota-exceeded error; otherwise upload to Firebase Storage (user -> server -> Firebase Storage).
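A rough sketch of the quota check in step 3; the function name is mine and the limit comes from the 150 GB figure above:

```javascript
// Assumed per-user limit from the answer: 150 GB.
const LIMIT_BYTES = 150 * 1024 * 1024 * 1024;

// Pure check used by the hypothetical upload handler before forwarding
// the file to Cloud Storage.
function wouldExceedQuota(currentTotalBytes, newFileBytes, limit = LIMIT_BYTES) {
  return currentTotalBytes + newFileBytes > limit;
}

// In the server/Cloud Function handler you would: read the user's running
// total from your database, return a quota-exceeded error if this check is
// true, and otherwise upload the file and increment the stored total.
```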
An easier alternative would be to use Cloud Storage triggers, which will run a Cloud Function every time a new file is uploaded. You can check the object size using the metadata and keep adding it up in the database. In this case, you can store the total storage used by a user in custom claims, in bytes.
exports.updateTotalUsage = functions.storage.object().onFinalize(async (object) => {
  // check total storage currently used
  // add size of new object to it
  // update custom claim "size" (total storage in bytes)
})
Then you can write a security rule that checks that the sum of the new object's size and the total storage already used does not exceed 150 GB:
allow write: if request.resource.size + request.auth.token.size < 150 * 1024 * 1024 * 1024;
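One way to fill in the onFinalize skeleton above, assuming files live under a <uid>/... path and the running total is kept in a custom claim named size (both assumptions from this answer, not official Firebase behavior); the firebase-admin calls are shown as comments since this is only a sketch:

```javascript
// Pure helper: the previous claim value may be undefined on a first upload,
// and object.size arrives as a string in the storage event payload.
function newTotal(previousSize, objectSize) {
  return (previousSize || 0) + Number(objectSize);
}

// Inside onFinalize(async (object) => { ... }) you would then roughly do:
//   const uid = object.name.split('/')[0];       // assumes <uid>/... layout
//   const user = await admin.auth().getUser(uid);
//   const claims = user.customClaims || {};
//   await admin.auth().setCustomUserClaims(uid, {
//     ...claims,
//     size: newTotal(claims.size, object.size),
//   });
```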
You can also have a look at this thread if you need per-user storage validation. The solution is a little tricky, but it can be done:
https://medium.com/@felipepastoree/per-user-storage-limit-validation-with-firebase-19ab3341492d
Google Cloud (and the Firebase environment) doesn't know your users; it knows your application, and your application knows the users.
If you want per-user statistics, you have to log that data somewhere and perform sums/aggregations to get your metrics.
A usual way is to use Firestore to store that information and to increment the number of files or the total space used.
An unusual solution is to log each action in Cloud Logging and set up a sink from Cloud Logging to BigQuery, then compute your metrics with aggregations directly in BigQuery (the latency is higher; it all depends on whether you want a sync or async check of those metrics).
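As an illustration of the Firestore route, the per-user sums could look like this; the upload-record shape and field names here are assumptions, not part of the original answer:

```javascript
// Aggregate an in-memory log of uploads into per-user totals. With
// Firestore you would instead increment these counters atomically on each
// upload, e.g. with FieldValue.increment(1) / FieldValue.increment(bytes)
// on a per-user usage document.
function aggregateUsage(uploads) {
  const totals = {};
  for (const { uid, bytes } of uploads) {
    if (!totals[uid]) totals[uid] = { fileCount: 0, totalBytes: 0 };
    totals[uid].fileCount += 1;
    totals[uid].totalBytes += bytes;
  }
  return totals;
}
```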
I would like to know whether an action I perform in my application generates charges for Firebase Storage; that is, whether it generates billing charges no matter how many times I perform it.
I have images stored in storage, and I upload these images to an ImageView using the Glide library, like this:
Uri url = Uri.parse(pictureUser);
Glide.with(MainActivity.this).load(url).into(imageView);
I have a chat application, and for each message in the adapter I want to load the user's photo this way, using Glide. Since there are many loads, I want to know whether every time I run this code snippet it generates Firebase billing charges.
Since you are using this in a chat activity, the same image will probably be loaded many times.
To decrease the number of image downloads, I suggest that instead of:
Glide.with(MainActivity.this).load(url).into(imageView);
you use:
Glide.with(MainActivity.this)
        .load(url)
        .diskCacheStrategy(DiskCacheStrategy.DATA)
        .into(imageView);
This activates the cache: Glide will first look for the image in the cache, and only download it from the network if it isn't there.
I advise you to use Picasso instead of Glide; it works better in offline mode:
Picasso.with(MainActivity.this)
        .load(url)
        .networkPolicy(NetworkPolicy.OFFLINE)
        .into(imageView, new Callback() {
            @Override
            public void onSuccess() {
            }

            @Override
            public void onError() {
                // Try again online if cache failed
                Picasso.with(MainActivity.this)
                        .load(url)
                        .error(R.drawable.user_placeholder_error)
                        .into(imageView);
            }
        });
Any time your code downloads a file, it will be billed as egress from Cloud Storage.
Glide has a disk cache enabled by default, so it should only download a URL once and reuse it from the cache (as long as it remains cached). Cache hits are not billed, since nothing is downloaded from the cloud.
Just trying to figure out something that seemed trivial in Firebase, in Google Cloud.
It seems that if you're making a Node.js app (I'm talking to it through Unity, actually, but it's a desktop application) you can't use firebase-storage for some odd reason; you have to use google-cloud. Even the firebase-admin tools use Cloud Storage for storage here.
Nevertheless, I got it working: I am uploading the files to the Firebase storage bucket. The problem is that in Firebase you could point at a specific file and call storage().ref().child(filelocation).getDownloadURL(), which would generate a unique URL, valid for some set time, that can be used publicly without having to grant read access to all anonymous users.
I did some research, and apparently I need to use something called gsutil to generate my own special URLs, but it's so complicated (I'm a newbie to this whole server stuff) that I don't even know where to start to get this working in my Node server.
Any pointers? I'm really stuck here.
------- If anyone's interested, this is what I'm trying to do, high level -------
I'm sending 3D model data to the Node app from Unity.
The Node app publishes this model on Sketchfab.
Then it puts the model data into my own storage, along with some additional data made specially for my app.
After it's saved to storage, it gets saved to my Firebase DB in my global model database, to be accessed later by users, who get the download URL of this storage file sent back to them in Unity.
I would just download the files into my Node app, but I want to reduce server load; it's supposed to be just a middleman between Unity and Firebase.
(I would've done it straight from Unity, but apparently Firebase isn't for desktop Windows apps.)
Figured it out:
var firebase_admin = require("firebase-admin");
var storage = firebase_admin.storage();
var bucket = storage.bucket();

bucket.file(childSnapshot.val().modelLink).getSignedUrl({
    action: 'read',
    expires: expDate
}, function (err, url) {
    if (err) {
        reject(err);
    } else {
        finalData.ModelDownloadLink = url;
        console.log("Download model DL url: " + url);
        resolve();
    }
});
Anyone know why Firebase Storage is so ridiculously slow compared to Firebase Hosting?
Results
Time to download an image from Firebase Hosting: 16 ms
Time to download the same image from Firebase Storage: 2.23 s (2.22 s is TTFB)
Time to download the same image from Firebase Storage (Asia Pacific region): 1.72 s (1.70 s is TTFB)
(File size: 22.7 kB / JPEG / Firebase Storage has read open to everyone)
This is repeated over and over in tests. Is there any way to speed this up to a decent time, or is Firebase Storage unusable for small files (images/thumbnails)?
For comparison
S3 North California: approximately 500 ms
S3 Asia Pacific: approximately 30 ms
Cloudinary: approximately 20 ms
Extra info:
I am based in Australia.
Exact same files. Always images under 100 kB.
The slowdown is always in the TTFB, according to dev tools.
Hosting URL: https://.firebaseapp.com/images/thumb.jpg
Storage URL: https://firebasestorage.googleapis.com/v0/b/.appspot.com/o/thumb.jpg?alt=media&token=
I found the solution.
If your files are already uploaded to Storage, go to https://console.cloud.google.com/storage/browser?project=your_project, pick your bucket, select all the relevant files, and click Make public (or something similar; I'm not a native English speaker).
To have all newly uploaded files public by default, install the Google Cloud SDK (https://cloud.google.com/sdk/docs/) and run the following command for your bucket from your command line:
gsutil defacl set public-read gs://your_bucket
After that, all my current and new images are available at storage.googleapis.com/my_project.appspot.com/img/image_name.jpg, and the download time is definitely shorter.
Hosting = Storage + CDN, so what you're really seeing is that you're hitting a CDN near you rather than going directly to the GCS or S3 bucket. The same is true with Cloudinary/Imgix. This is why performance is so much better for Hosting than for Storage.
As for TTFB being so different between AWS and GCP: unfortunately, this is a known issue of GCS vs. S3 (see this great blog post with an in-depth performance analysis). I know the team is working to address this problem, but the "stick a CDN in front of it" route will get you a faster solution (provided you don't need to restrict access, or your CDN can authorize requests).
Note: GCP has announced a Sydney region (announcement blog post) to be launched in 2017, which might help you.
In addition to @Ziwi's answer, I think it is also OK to change the rules directly in Firebase:
// Only a user can upload their profile picture, but anyone can view it
service firebase.storage {
  match /b/<bucket>/o {
    match /users/{userId}/profilePicture.png {
      allow read;
      allow write: if request.auth.uid == userId;
    }
  }
}
The source is https://firebase.googleblog.com/2016/07/5-tips-for-firebase-storage.html
I'm developing an app that streams MP3 files stored in Firebase Storage. I have nearly 100 songs, like song1, song2, song3, and so on. When a song is selected, I have to stream that particular song without downloading it. As it stands, I need to write plenty of code because I have to spell out the Firebase Storage URL for each song. The URL looks like:
https://firebasestorage.googleapis.com/......song1.mp3?alt=media&token=8a8a0593-e8bb-40c7-87e0-814d9c8342f3
The alt=media&token= part of the URL varies per song, so I have to specify a unique URL for every song. I'd like a simpler way to play a song from Firebase Storage by referring to its name alone.
Please suggest a way to stream an audio file stored in Firebase Storage using only its name.
You have two choices if you want to get files out of a Firebase Storage bucket:
Use the full download URL, which you can get from a storage reference that points to the file's path in your bucket.
Use a download task (Android or iOS) to fetch the data from the file.
You can't get the file data any other way from within a mobile app. Firebase Storage doesn't support special media-streaming protocols such as RTSP.
I did it using downloadUrl.
val storage = FirebaseStorage.getInstance()
storage.reference.child("songs/song1.mp3").downloadUrl.addOnSuccessListener({
val mediaPlayer = MediaPlayer()
mediaPlayer.setDataSource(it.toString())
mediaPlayer.setOnPreparedListener { player ->
player.start()
}
mediaPlayer.prepareAsync()
})
StorageReference filepath = storage.child("Audio").child(timeStamp + ".mp3");
Uri uri = Uri.fromFile(new File(fileName));
filepath.putFile(uri).addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
    @Override
    public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
        String audio_url = taskSnapshot.getDownloadUrl().toString();
        // Use audio_url to stream the audio file with a media player
    }
});