There is a PWA built with Vue.js, connected to Firebase.
The app has several pages with images; to see all the images on a page you need to scroll down. All images are stored in Firebase Storage.
When you scroll down a page, the images are not displayed at first because the app is still downloading them; they appear after a few seconds.
The task is to avoid downloading the images while scrolling down the page and to display them immediately instead.
At this moment:
all images are uploaded to Firebase Storage with the following metadata (see the upload sketch below): metadata = { cacheControl: 'public, max-age=300000000, s-maxage=300000000' }
once an image has been downloaded, it lands in the app cache. After you have scrolled through all the pages, every image is in the cache, and from then on all images display immediately while scrolling.
When you refresh the page, all images are deleted from the cache.
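For context, the upload is done roughly like this (a minimal sketch using the Firebase v9 modular SDK; the storage path and the file variable are placeholders):

```javascript
// Sketch of the current upload with the cacheControl metadata above.
// 'images/photo.jpg' and `file` are hypothetical placeholders.
import { getStorage, ref, uploadBytes } from 'firebase/storage';

const metadata = {
  cacheControl: 'public, max-age=300000000, s-maxage=300000000',
};

async function uploadImage(file) {
  const storage = getStorage();
  const imageRef = ref(storage, 'images/photo.jpg');
  await uploadBytes(imageRef, file, metadata);
}
```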
To solve this, my suggestion is to download all images into the cache when the PWA is opened, and ideally keep them cached for the next 3 days.
Please help me find a way to download all images into the cache for the next 3 days at the moment the PWA opens.
Or, if my suggestion is not the best approach, please advise a better way to solve the task.
Thank you!
It would be nice if you could share some code with us, especially how your Service Worker (SW) works.
There are multiple points you should consider:
Every SW has a lifecycle, and some are set to clear the cache on the "activate" lifecycle event. Check whether yours is doing that.
One thing you should do is catch all GET requests for images and return them from the cache if they are there, and from the server if not (this should be done in the SW; see the sketch after these points).
You can store them in the cache while the SW is being installed. Just make sure not to save too much at once, because the installation can easily fail then.
Try to save them to the cache in smaller batches, in advance and asynchronously, as you scroll down. You can also write to the cache from your front-end code.
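To illustrate these points, here is a minimal sketch of such a Service Worker. CACHE_NAME and IMAGE_URLS are assumptions; the URL list would come from your own app:

```javascript
// sw.js - minimal sketch of a cache-first SW for images.
const CACHE_NAME = 'image-cache-v1';
const IMAGE_URLS = [
  // hypothetical list of Firebase Storage download URLs to pre-cache
];

self.addEventListener('install', (event) => {
  // Pre-cache images during install. Keep the list small:
  // if any single request fails, the whole install fails.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(IMAGE_URLS))
  );
});

self.addEventListener('activate', (event) => {
  // Delete only caches you no longer use; do NOT clear CACHE_NAME here,
  // or every refresh starts with an empty image cache.
  event.waitUntil(
    caches.keys().then((keys) =>
      Promise.all(
        keys.filter((key) => key !== CACHE_NAME).map((key) => caches.delete(key))
      )
    )
  );
});

self.addEventListener('fetch', (event) => {
  // Cache-first for image GET requests: serve from the cache when present,
  // otherwise fetch from the network and store the response for next time.
  if (event.request.method !== 'GET' || event.request.destination !== 'image') {
    return;
  }
  event.respondWith(
    caches.open(CACHE_NAME).then(async (cache) => {
      const cached = await cache.match(event.request);
      if (cached) return cached;
      const response = await fetch(event.request);
      if (response.ok) cache.put(event.request, response.clone());
      return response;
    })
  );
});
```

And a sketch of warming the cache in small batches from the front end, as suggested in the last point (warmImageCache and the batch size are made up for illustration):

```javascript
// Front-end sketch: warm the image cache in small batches after the app loads.
async function warmImageCache(imageUrls, batchSize = 5) {
  const cache = await caches.open('image-cache-v1');
  for (let i = 0; i < imageUrls.length; i += batchSize) {
    // addAll fetches and stores each URL; awaiting each batch keeps the
    // network from being saturated while the user interacts with the page.
    await cache.addAll(imageUrls.slice(i, i + batchSize));
  }
}
```

As for the 3-day requirement: if you use Workbox, its ExpirationPlugin takes a maxAgeSeconds option (3 * 24 * 60 * 60 for three days), so you don't have to implement expiry by hand.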
I have an offline-first mobile chat app where chat messages can also be images.
E.g. if I have 20 images, I want to download them from their URLs first and then display them from the local files. In the meantime, I want to show a loading placeholder until each image has downloaded. Is there a best practice for doing this? Thanks!
UPDATE: I don't know when an image has finished downloading, because downloads are processed in a queue and the file location is then updated in the local DB, so I can't await the download process.
FFImageLoading has a LoadingPlaceholder property for your Images that supports UriImageSource, FileImageSource, and StreamImageSource.
For more information and examples, check the docs.
I'm having some very frustrating load-time issues with a web application I'm building. I don't have much experience with networking, so my attempts to diagnose the issue aren't getting me very far.
As I said in the description, I'm using Angular 2 to build the app. On the home page there is a full-page background image compressed down to around 314 KB. The background image is applied to a browser-sized div in the home page component's CSS file, and I'm hosting it from an Amazon S3 bucket. I'm using the Angular CLI, so I ran 'ng build' to build the app into the dist/ folder and uploaded it to the S3 bucket. When I went to the bucket's endpoint to test the app, the network request for the background image alone took nearly 14 seconds to complete.
Below is a cropped screenshot from the 'Network' tab in Chrome DevTools. I've noticed two interesting things about the request. The first is that nearly the entire length of the request is spent in the 'stalled' state, represented by the gray bar. I'm not entirely sure what that means or what would be causing it. According to Google's DevTools documentation, it usually means the browser has already hit the maximum number of simultaneous requests it is allowed (6 for Chrome), so the request is waiting for one of those to finish before it starts. However, that doesn't seem to make sense here, because for the entire time the image request is 'stalled', it is the only request that hasn't yet completed. The second interesting thing is that the 'initiator' file for virtually every other request was zone.js, whereas the initiator for the background image request was config.js. I'm not sure whether that is significant, but it seems like it could have something to do with the issue.
I'm fairly new to front-end development and JavaScript in general, so I'm sorry if I'm not providing the right information, but if there is any other info that would help you understand the issue, just let me know and I'll get it for you.
I'm making a gallery site for a client, to be used internally for browsing hi-res images. There's literally over 1 GB of images, each 3–4 MB, so loading the images over the web isn't an option due to load time.
My idea was to store the images on each machine locally, but maintain a central database online so all machines are in sync, and load the images using "file:///C:/images/file.jpg". But apparently browsers don't allow a website to load files from the local computer (for obvious security reasons).
How can I get around this?
Do I have to create a browser plugin myself to get access to the file system?
Alternatively, is there a better way to achieve my goal of (a) a centralized database of images and data, but (b) images stored locally?
Thanks for any advice you can offer.
You can store your images in your centralized database, but it would be worthwhile to also store smaller, resized images, so that the user can browse the small versions and click or hover over one to load the larger version. 3–4 MB isn't that insane for most computers to load, so long as the page isn't trying to load them all at once.
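A sketch of that hover-to-load pattern (the .thumb class and data-full attribute are conventions made up for illustration):

```javascript
// Swap in the hi-res image when the user hovers over a thumbnail.
// Each <img class="thumb"> carries its full-size URL in data-full.
document.querySelectorAll('img.thumb').forEach((img) => {
  img.addEventListener('mouseenter', () => {
    img.src = img.dataset.full; // browser fetches the large image on demand
  });
});
```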
To get access to the file system, you can use your web host's file-access links, or you can use an FTP client, given that you know the FTP username/password.
I am implementing Rackspace Cloud Files for a site. If a user uploads a profile image, we want to store it on Cloud Files in a CDN-enabled container. This works, only it takes a couple of seconds before the file is available on the CDN.
So when you upload your profile image, we store it in the cloud, and when you reload the page the image often isn't available yet, resulting in a broken image.
Has anybody experienced this issue and if so how did you work around it?
Yes, this is a function of how CDNs work.
The CDN (in the case of RackSpace, Akamai) must become aware of your content, so it takes some time for your content to show up on a CDN (usually just a matter of minutes).
I'm currently developing a site where users can upload images to use as avatars. I know this makes me sound a little paranoid, but I was wondering: what if a malicious user uploads an image with incredibly large dimensions in order to eat the server's memory (as a DoS attack)? I already have a limit on the file size that can be uploaded (250 KB), but even that size allows an image with incredibly large dimensions if, for example, it is a single-color JPEG saved with a very low quality setting. Taking into consideration that the image is held in memory as an uncompressed bitmap while being resized, and that even checking the image dimensions means loading it into memory first, I wonder if such DoS attacks actually occur. Have you heard about any attacks that exploited this? Am I too worried?
You should be able to get at the dimensions without loading the entire image bitmap into memory. Maybe you can find out more about the file formats at wotsit.org.
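For example, PNG stores its dimensions at fixed offsets in the IHDR header, so you can read them from the first few bytes without decoding the bitmap. A Node.js sketch for illustration (JPEG takes more work, since you have to walk its segment markers):

```javascript
// Read PNG dimensions from the file header without decoding the image.
const fs = require('fs');

function pngDimensions(path) {
  // PNG layout: 8-byte signature, 4-byte chunk length, 4-byte "IHDR",
  // then width (bytes 16-19) and height (bytes 20-23), big-endian.
  const buf = Buffer.alloc(24);
  const fd = fs.openSync(path, 'r');
  fs.readSync(fd, buf, 0, 24, 0);
  fs.closeSync(fd);
  return { width: buf.readUInt32BE(16), height: buf.readUInt32BE(20) };
}
```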
You have to validate that image files really ARE image files. The issue isn't an attack on your server. The issue is someone uploading an ActiveX control instead of an image. This file then downloads and installs and ruins every Windows machine that does the download.
The threat is not to you. The threat is that you will become a carrier for a virus.
You must validate each file to confirm that it is a real image file. You can check dimensions and whatnot if you want; most image-processing libraries can read the headers off the image and check the dimensions, number of pixels, and so on.
Often, folks make thumbnails from images; you can do that too, once you've opened the image.
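A sketch of that header check (Node.js for illustration; the signatures shown cover JPEG, PNG, and GIF, while real image libraries handle many more formats):

```javascript
// Sniff the file's magic bytes instead of trusting its extension
// or Content-Type, so non-images are rejected before any processing.
const fs = require('fs');

const SIGNATURES = [
  { type: 'jpeg', bytes: [0xff, 0xd8, 0xff] },
  { type: 'png', bytes: [0x89, 0x50, 0x4e, 0x47] },
  { type: 'gif', bytes: [0x47, 0x49, 0x46, 0x38] },
];

function sniffImageType(path) {
  const buf = Buffer.alloc(4);
  const fd = fs.openSync(path, 'r');
  fs.readSync(fd, buf, 0, 4, 0);
  fs.closeSync(fd);
  const match = SIGNATURES.find((sig) =>
    sig.bytes.every((byte, i) => buf[i] === byte)
  );
  return match ? match.type : null; // null means: reject the upload
}
```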
DoS may or may not be an issue; it depends on whether someone decides to target your site.
However, for your site to scale to thousands of concurrent users, you might consider handling the image processing in a separate process.
When the image processing is handled by page code, you run the risk of exhausting memory, CPU, or ASP.NET threads; which one becomes the bottleneck depends on your server configuration.
Possible solution (a sketch of the worker loop follows the list):
User uploads image.
Image is saved to shared directory.
Image path is saved to a queue in database.
Page returns with message "thanks for uploading, your avatar will be ready soon".
A Windows Service* wakes up periodically and checks the database queue.
The service resizes any images waiting in the queue, saves the outputs to a shared directory, and removes them from the queue.
Service updates the database indicating that the user's avatar is ready. So, next time they visit their profile page, they are shown the resized image.
*Ideally, the Windows Service runs on a separate server from the web server, which could be scaled up to meet future demands.
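A sketch of that worker loop (the answer suggests a Windows Service in .NET; this shows the same shape in JavaScript for illustration, and db.takeNextPendingAvatar, db.markAvatarReady, and resizeImage are hypothetical):

```javascript
// Poll the database queue and resize avatars outside the web process.
const { setTimeout: sleep } = require('timers/promises');

async function processQueueForever(db, resizeImage) {
  while (true) {
    const job = await db.takeNextPendingAvatar(); // next queued upload, if any
    if (!job) {
      await sleep(5000); // queue empty: check again in 5 seconds
      continue;
    }
    // The web server never pays the CPU/memory cost of the resize.
    await resizeImage(job.sourcePath, job.outputPath);
    await db.markAvatarReady(job.userId, job.outputPath);
  }
}
```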
Whether this effort is worth it depends on your expected traffic. You could use load testing tools to script and simulate these actions, to see if your site can handle the load.
I think you should simply check the image dimensions. With only a few formats to support, this isn't that hard, and you can then easily filter out overly large images. Sites that let you upload avatars usually require not only a file smaller than a specific size but also dimensions within specific bounds, so checking this is standard practice.