I have a webpage where I can add multiple video streams and display them like a mosaic.
It looks like this: [screenshot of the mosaic omitted]
The videos can be resized, but when they get too small the bitrate drops (the stream adapts to the size of the video) and artefacts start to become visible.
I was wondering if there was any way I could keep the videos at a fixed size (large enough to get a higher bitrate) while scaling them down for the mosaic.
I've tried using the CSS scale feature, but I couldn't achieve the desired outcome. Maybe it's not possible at all, but I'd appreciate any tips you might have (keep in mind that I'm unable to change the stream itself).
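For reference, what I tried was roughly along these lines (a minimal sketch; class names and sizes are made up):

.tile {
  width: 320px;     /* final on-screen size of one mosaic cell */
  height: 180px;
  overflow: hidden;
}
.tile video {
  width: 1280px;    /* fixed "large" layout size the stream adapts to */
  height: 720px;
  transform: scale(0.25);       /* 320 / 1280 */
  transform-origin: top left;
}

Whether this keeps the higher bitrate seems to depend on how the player measures the element: getBoundingClientRect() reflects the transform while offsetWidth does not, so results vary from player to player.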
Thanks!
Related
I have a very large hero image, set as a background-image, that takes up the entire initial viewport size. I want to serve up multiple sizes based on device resolution for improved speed.
Media queries seem like the best approach, but I have some questions about implementing them:
Would it be better to use the width feature or device-width feature? I'm thinking the latter, because the width can change if a user resizes their browser, which could trigger a new background-image to load. With device-width, I can detect the largest possible screen size and load an image whose dimensions can look good if the user resizes their browser to full size.
How do you account for pixel density? I need to make the images twice as big, or in some cases three times as big, so they render properly on devices with high pixel density. That means I can't really cut the image sizes down much, since I have to account for potentially high pixel density. I could write media queries that target specific pixel densities and device-widths together, but that seems like a complicated and suboptimal solution.
I'd love to hear some solutions people have found for this.
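To make it concrete, the kind of rule set I'm imagining looks roughly like this (breakpoints and file names are made up):

body {
  background-image: url(hero-1280.jpg);   /* baseline */
  background-size: cover;
}
@media (min-device-width: 1281px) {
  body { background-image: url(hero-2560.jpg); }   /* big screens */
}
@media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
  body { background-image: url(hero-2560.jpg); }   /* dense screens */
}

which is exactly the density-by-width matrix of queries I'd rather avoid.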
I am looking to make a website dynamically scale its assets (PNG/JPG) and output the appropriate CSS for three resolutions: 540p, 720p, and 1080p.
Currently we have assets created for each resolution, so that's three sets maintained manually. Ideally I want a Jenkins/Hudson job to create the assets (by scaling down from the highest-resolution asset set, maybe using the ImageMagick command line) and then generate the CSS that makes the resolution layout possible.
This is clearly not a new or unique problem, so I am wondering: what is the best approach to take here?
The webpage is intended for low-powered embedded devices, which have limited capability, albeit with HTML5 support. The solution has to be server-side creation of assets and CSS scaling, as we cannot rely on the devices being able to cope with much scaling.
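To sketch what I mean, the job might run something like the following (Node is used purely for illustration; the paths, percentages, and convert arguments are assumptions, and a plain shell script would do just as well):

// Derive the 720p and 540p asset sets from the 1080p masters with ImageMagick.
var execFileSync = require("child_process").execFileSync;
var fs = require("fs");

var scales = { "720": "66.67%", "540": "50%" };   // 1080 -> 720, 1080 -> 540

fs.readdirSync("assets/1080").forEach(function (file) {
  Object.keys(scales).forEach(function (res) {
    execFileSync("convert", [
      "assets/1080/" + file,          // highest-resolution source asset
      "-resize", scales[res],
      "assets/" + res + "/" + file    // scaled output asset
    ]);
  });
});

The same script could then template out the per-resolution CSS from the known dimensions.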
Look forward to your thoughts and replies.
Cheers in advance.
Not a real solution, but rather a work-around which is working well and is easier to maintain:
Replace PNGs with SVGs: take the highest-resolution version and trace it with something like http://vectormagic.com.
For JPGs, take the highest-resolution version but compress it heavily, and use something like ImageOptim. This works well because:
- Browsers are nowadays very good at resizing higher-resolution images down to smaller ones.
- Since the pixel density is so high at full size, the compression artifacts will be much less visible. And if the browser is scaling down, artifacts will be even less visible because of the smoothing the browser applies.
- File sizes will be bigger, but not by much, since you can use higher compression, and compression gets more efficient as the image grows.
Reference: http://www.fngtps.com/2012/reasonable-ways-to-use-high-resolution-images-on-retina-displays/
Although it sounds like a simple question, I have been thinking about this.
How exactly do browsers render images? For an example of what I mean, let's say I have a 1MB image that is 3000px x 1500px. I then decide to put this image into a container that is set to width: 100%, meaning that it will scale down responsively. Will the browser load the entire 1MB and then proceed to scale the image down to fit the container, or will it scale it down first and then load it?
I'm asking because this is pretty much my situation, and if it does the former (loads the 1MB first) then I guess I would have to serve a separate image on mobile devices?
Thanks
Edit: Since people are saying that it'll load the 1MB image first, how would you suggest I serve a large image to the user? Scale it down for mobile and have separate mobile/desktop versions?
The CSS is applied after the image is fully loaded, so in short: it would load the 1MB image first and then apply the dimensions. Also remember that images are stored server-side, so the file has to be transferred in full before anything can be done with it. [request/render flow diagram omitted]
On a side note:
Browsers often render the same page multiple times in a very short period of time, especially after reading image metadata.
The browser (obviously) cannot scale (or do anything with) an image before loading it.
The image is retrieved first via an HTTP request, e.g.
GET /images/myimage.png
and only then are scalings/transformations applied. So if you want different images for different platforms, you should request them as such.
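For example, if the image can live in CSS, a media query makes the browser request a smaller file in the first place (file names and the breakpoint are made up):

.container {
  background-image: url(/images/myimage-3000.png);
}
@media (max-width: 600px) {
  /* small screens request the smaller file instead of the full-size one */
  .container {
    background-image: url(/images/myimage-800.png);
  }
}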
I have a colourful background image that is 2000px x 1500px, and because of the detail I am saving it as a JPEG that renders at 1.1MB. I am using the CSS background property to render the image. Being relatively new to web dev, and working with a client/designer who is not open to a change of design at this point in the process, what should I do to help this image load blazingly fast? I don't know if it makes a difference, but the site is using Joomla 1.5.9. This is something I've always wanted to understand but have had trouble uncovering solutions for... I hope someone can help!!
Thanks everyone!
There's no way to make images magically load faster. Sometimes, though, splitting an image in smaller images allows for surprising size gains; so if it's not out of question for your CSS layout (it typically is, though), you could try this.
Another possibility for you would be to use progressive JPEG images. They are encoded in such a way that enables browsers to display a progressively more precise image as it loads. This means that at first, the image appears blurry (or incomplete) but with full-dimensions, then it progressively gets sharper and better. It's well-suited for images meant to load slowly (or at least, images acknowledged to load slowly).
The best thing you can do is shrink the file size as much as possible, for instance with some type of optimizer such as smush.it. If you created the background image yourself, try saving it as a progressive JPEG first: it loads a lower-resolution version first and then finishes loading. But the best option is to shrink the image's actual width and height by finding a repeating pattern, cropping just that portion out, and using it, as sketched below. Most about.me images above 1200px wide are no larger than 100KB.
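For the cropping idea, something like this is what's meant (file name made up):

body {
  /* tile a small crop of the repeating pattern instead of shipping
     the full 2000px x 1500px image */
  background: url(pattern-crop.jpg) repeat;
}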
You could start fetching the image immediately in the <head> by using
<script>
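// creating a detached Image starts the download right away; the cached
// file is then reused when the page later references the same path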
(new Image()).src = "IMAGE PATH";
</script>
and make sure you compress the image as much as you can with different programs. If you have Photoshop CS5, you can use Save for Web & Devices to strip out all of the extra junk; you can also try Yahoo's smush.it:
http://www.smushit.com/ysmush.it/
Or you could delay showing the site entirely for a few seconds, until you can be reasonably sure that the image has loaded. You could try something like hiding all of the elements and fading them in after a setTimeout, something like:
CSS:
body{
opacity:0;
}
jQuery:
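// after 5 seconds, fade the whole page in over 300ms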
setTimeout(function(){$('body').animate({opacity:'1'},300)},5000);
although that may not be all that practical.
Would loading the background image after the rest of the page has loaded work for you? At least this way visitors will be able to use the rest of your site until the 1.1MB has loaded.
Something along the lines of setting the style attribute of the <body> to be background: url(image.jpg) within an onload function?
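A minimal sketch of that idea (assuming the file is image.jpg):

window.onload = function () {
  // the rest of the page has already rendered; now fetch the heavy background
  document.body.style.background = "url(image.jpg) no-repeat center center";
};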
Delivering the image over a CDN will likely speed things up; the better ones are usually optimized to deliver the proper TCP packet size/MTU to the wide variety of clients out there and have generally done a lot of work in the details of delivering things quickly. Slow connections like cellphones will still hate you for it, but getting things like packet size right can make the most of the bandwidth that is available.
I am attempting to capture a very large image that was made dynamically within the Flash Player (the size of the image is 2400px by 12,000px) and am running into some very serious issues... Let me run down how the image gets to that size in the first place:
The user adds elements to a canvas, and when they are finished the canvas scales up to 2400px wide and ~12,000px tall. The problem arises when I attempt to save the image to the hard drive. Now, I don't know if this will affect the recommended fix, but the rendered image won't actually be saved on the hard drive; it will be sent to a server. I know about the ~4050px limit in Flash Player and was thinking I could get around it by clipping the image with the ImageSnapshot.captureBitmapData() method, keeping the required Rectangle variable below 4000px, and repeating that down the large image until it reaches the end, with the partial images pieced together at a later time. However... as I mentioned, the error comes when it reaches the fileReference.save(pngImage); call... Has anyone else done something like this?
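(To illustrate the slicing idea only, here it is in JavaScript with an HTML5 canvas rather than Flash; the 4000px cap and the upload step are assumptions:)

var STRIP = 4000;   // stay under the per-bitmap size limit
function captureStrips(source, width, height) {
  var strips = [];
  for (var y = 0; y < height; y += STRIP) {
    var h = Math.min(STRIP, height - y);
    var c = document.createElement("canvas");
    c.width = width;
    c.height = h;
    // copy one horizontal strip of the big image into its own canvas
    c.getContext("2d").drawImage(source, 0, y, width, h, 0, 0, width, h);
    strips.push(c.toDataURL("image/png"));   // upload and reassemble server-side
  }
  return strips;
}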
Have you tried whether fileReference.save works at all (e.g. with smaller images, say 100px in height)? It seems the image data will perhaps be transformed to string data, so there might be other limits you're not aware of at the moment. Your uncompressed image data will be around 86MB (2400 x 12,000 pixels at 3 bytes each), so even a PNG file with good compression might be around 10MB in size; at the moment you're trying to save a third of that, but 3MB is still quite large.