I plan to be changing the color of a few hundred thousand divs a second and was wondering what the fastest way to do it was.
What are the best formats in terms of performance? RGB triples? Hex codes? Color words (black, chartreuse)?
I've run this jsPerf, and these are the general results:
Basic color keywords are quite fast, and they're the fastest for Chrome. The extended keyword list is a lot slower in some browsers, though.
hsl is just the worst, except for IE, where it was actually the fastest (but then again, IE). (Apparently this was just a single case; I couldn't reproduce it afterwards.)
#RGB or #RRGGBB are both relatively fast in every browser (#RGB is slightly faster in general)
rgb() is generally slow, in every browser
In general, I think #RGB is the fastest format for every browser (on average).
Hex codes should be the fastest. Even when you write a keyword such as "black", it still has to be parsed and resolved to its hex equivalent, #000000, before it can be used.
I am using Images.jl in Julia. I am trying to convert an image into a graph-like data structure (v,w,c) where
v is a node
w is a neighbor and
c is a cost function
I want to assign an expensive cost to those neighbors which do not have the same color. However, when I load an image, each pixel has the type RGBA{U8}, e.g. RGBA{U8}(1.0,1.0,1.0,1.0). Is there any way to convert this into a number like Int64 or Float?
If all you want to do is penalize adjacent pairs that have different color values (no matter how small the difference), I think img[i,j] != img[i+1,j] should be sufficient, and infinitely more performant than calling colordiff.
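To make the idea concrete, here is a language-agnostic sketch in Python (the question itself uses Julia); the 4-neighbour grid layout, the pixel tuples, and the two cost constants are illustrative assumptions, not part of Images.jl:

```python
# Build (v, w, c) edges for a 2D image, penalizing neighbours whose
# colors differ. Pixels are (R, G, B, A) tuples; the costs are made up.
CHEAP, EXPENSIVE = 1, 1000

def image_to_graph(img):
    rows, cols = len(img), len(img[0])
    edges = []
    for i in range(rows):
        for j in range(cols):
            # Checking only right and down neighbours covers every
            # adjacent pair exactly once.
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < rows and nj < cols:
                    cost = CHEAP if img[i][j] == img[ni][nj] else EXPENSIVE
                    edges.append(((i, j), (ni, nj), cost))
    return edges

# A 2x2 image: top row white, bottom row red.
img = [[(1.0, 1.0, 1.0, 1.0), (1.0, 1.0, 1.0, 1.0)],
       [(1.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 1.0)]]
edges = image_to_graph(img)
```

The direct equality test on whole pixels is the same trick as img[i,j] != img[i+1,j] in Julia: no per-channel unpacking, no colordiff call.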
Images.jl also contains methods, raw and separate, that allow you to "convert" that image into a higher-dimensional array of UInt8. However, for your apparent application this will likely be more of a pain, because you'll have to choose between using a syntax like A[:, i, j] != A[:, i+1, j] (which will allocate memory and have much worse performance) or write out loops and check each color channel manually. Then there's always the slight annoyance of having to special case your code for grayscale and color, wondering what a 3d array really means (is it 3d grayscale or 2d with a color channel?), and wondering whether the color channel is stored as the first or last dimension.
None of these annoyances arise if you just work with the data directly in RGBA format. For a little more background, these are examples of Julia's "immutable" objects, which have at least two advantages. First, they allow you to clearly specify the "meaning" of a certain collection of numbers (in this case, that these 4 numbers represent a color, in a particular colorspace, rather than, say, pressure readings from a sensor); that means you can write code that isn't forced to make assumptions it can't enforce. Second, once you learn how to use them, they make your code much prettier all while providing fantastic performance.
The color types are documented here.
Might I recommend converting each pixel to greyscale, if all you want is a magnitude difference?
See this answer for a how-to:
Converting RGB to grayscale/intensity
This will give you a single value for intensity that you can then use to compare.
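As a sketch of what that intensity value looks like, the standard BT.601 luma weighting (the one the linked answer describes) collapses an RGB triple to a single number; the example pixel values here are assumptions:

```python
def intensity(r, g, b):
    # BT.601 luma weights: green contributes most to perceived brightness.
    return 0.299 * r + 0.587 * g + 0.114 * b

# Compare two pixels by magnitude only: pure red vs. black.
diff = abs(intensity(1.0, 0.0, 0.0) - intensity(0.0, 0.0, 0.0))
```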
Following @daycaster's suggestion, colordiff from Colors.jl can be used.
colordiff takes two colors as arguments. To use it, you should extract the color part of the pixel with color, i.e. colordiff(color(v), color(w)), where v would be a value like RGBA{U8}(0.384,0.0,0.0,1.0).
I'm trying to use tput to set foreground and background colors in my terminal in a device independent way.
If the whole purpose of termcap/terminfo/tput is to become device independent, why are there both versions that explicitly use ANSI controls (setaf/setab) and versions that do not (should not)?
This discussion quotes terminfo(5), which in turn quotes standards explicitly saying that setaf/setab are to be implemented with the ANSI colour ordering and setf/setb with the non-ANSI ordering, respectively.
Why isn't there just setf/setb and they always set the foreground and background colors. I don't care how it's done, that's why I use tput!
Why isn't there just setf/setb and they always set the foreground and background colors
are actually two questions!
The first part, why there are both ANSI and non-ANSI terminal commands, would take too long to explain, and it's unnecessary, as the history is quite well covered on Wikipedia.
The second part could perhaps be freely rephrased to "what's the difference?" or "what can I do about it?".
Difference:
ANSI-type terminals use a different mapping between colour numbers and colours than non-ANSI terminals do. For example, the code for yellow on one is the code for cyan on the other. There are simply two different mapping tables. These are described quite well on Wikipedia.
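The two orderings from terminfo(5) can be written out side by side; note how codes 3 and 6 swap yellow and cyan:

```python
# Colour number -> colour name, per terminfo(5).
ANSI_SETAF = ["black", "red", "green", "yellow",
              "blue", "magenta", "cyan", "white"]   # setaf/setab
LEGACY_SETF = ["black", "blue", "green", "cyan",
               "red", "magenta", "yellow", "white"]  # setf/setb

# Code 3 is yellow on an ANSI terminal but cyan on a non-ANSI one,
# and code 6 is the reverse.
```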
What you can do about it:
Discover which type of terminal you have, and use the corresponding command.
Or modify your termcap.
None of these solutions are fully generic though, unfortunately.
I didn't know about artistic or artwork QR codes before. Looking at some of these codes, they are completely different from the regular standard QR code. How is it possible to create this kind of QR code without losing its value (i.e. the scan result stays the same)?
These are the QR codes that amazed me the most:
http://www.hongkiat.com/blog/qr-code-artworks/
The only thing they have in common is the three corners, and even those differ in style.
So my question is, what are the elements that we should preserve while creating such QR Codes ?
The most important things are:
Dark-on-light
Very nearly square modules
Modest light border
Substantially preserve the three finder patterns
... and the first line of modules around them, which carries format info
... and the bottom-right alignment pattern, is helpful
The rest, the interior, can be substantially obscured and still be readable, certainly with high error correction. But messing with the elements above will tend to make it unreadable much more rapidly.
Does there exist a UTF-8 character for x⁴ (x superscript 4) that works cross-browser/OS?
According to http://en.wikipedia.org/wiki/Unicode_subscripts_and_superscripts
superscript 4 should be at U+2074, but I can't get it to work (on XP).
Works for me, but your mileage may vary by font. http://www.fileformat.info/info/unicode/char/2074/browsertest.htm
If you want reliable cross-browser rendering of more complex maths, you'll need to use MathJax.
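For reference, U+2074 can be emitted as a literal character, a Unicode escape, or an HTML numeric entity; a quick sanity check in Python:

```python
sup4 = "\u2074"        # SUPERSCRIPT FOUR
entity = "x&#8308;"    # HTML decimal entity: 0x2074 == 8308
print("x" + sup4)      # renders as x⁴ in any font that has the glyph
```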
In CSS we can use several different methods to define a color:
Color word: red
Hexadecimal: #FF0000
Red/Green/Blue channels: rgb(255, 0, 0)
Hue/saturation/lightness: hsl(0, 100%, 50%)
I do realize that using named colors is not a good idea, as different browsers have their own idea of what aquamarine looks like.
Ignoring alpha channel and browser support, are there any differences performance-wise between these 4 methods?
If we were trying to squeeze every last bit of optimization out of our CSS, which one would be preferred, if any? Are the color values converted to a specific format internally, or does the performance of it depend on anything else (like which rendering agent or browser is used)?
Looking for a "technical" answer if possible, references appreciated.
If we assume a modern browser making full use of the GPU then the internal color representation will be RGB floats. Ignoring the color name - which is probably just a map to hex anyway - I think that hex and channels will be the fastest. HSB will undoubtedly be the slowest, as the conversion from HSB to RGB does require some work - about 50 lines of C code.
However, I think that for the purpose of CSS, this is a completely irrelevant question. Even for HSB to RGB, the amount of work on one color is totally trivial. By way of support for this, I have several programs - including those running on mobiles - which do color manipulation at a per-pixel level on largish images, where I am doing RGB->HSB->(some manipulation)->RGB. Even performing this operation 100,000 times on an iPad only results in a delay of a couple of seconds - so on this relatively slow platform, I think your typical worst-case conversion can be safely assumed to take less than 0.0001 seconds. And that's being pessimistic.
So just use whatever is easiest to code.
ADDED: to support the "don't worry about this" option. Internally a GPU will manipulate colors as an array of floats, so in C terms:
float color[4];
or something similar. So the only conversion being done for the numeric options is a simple divide by 255.
On the other hand conversion of HSB to RGB takes considerably longer - I'd estimate, from having written code to do it, about 10 to 20 operations. So in crude terms HSB is considerably slower, BUT 20 (or even 20,000) operations on a modern GPU isn't worth worrying about - it's imperceptible.
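The conversion in question is cheap enough to sanity-check with Python's standard-library colorsys module (note its function is hls_to_rgb, with lightness before saturation, and all arguments normalized to 0..1):

```python
import colorsys

# CSS hsl(0, 100%, 50%) -> rgb(255, 0, 0): hue in turns, lightness, saturation.
r, g, b = colorsys.hls_to_rgb(0 / 360, 0.50, 1.00)

# Scale the 0..1 floats back to 0..255 channel values.
rgb = tuple(round(c * 255) for c in (r, g, b))
```

This is exactly the handful of arithmetic operations the answer describes: a few comparisons and multiplications per channel, nothing a GPU or CPU would notice at CSS scale.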
Here are the results including color names, short hex, hex, rgb, rgba, hsl, and hsla. You can run the test yourself here.
I used the same tool from jsperf.com that the others did, and created my own test for different color formats. I then ran the test on IE11, Edge17, FF64 and Chrome71 and gathered all results in a compact Excel spreadsheet.
Top three are green, bottom three are red, best and worst are bold.
I don't know why Chrome favors the named-colors format so strongly, but it made me repeat the test many times with the same and different parameters. The results remained constant.
You cannot get conclusive results that any one format is the absolute best, but my conclusion is as follows.
I will keep using hex over named, lowercase over uppercase and start using short over long hex when possible.
Feel free to update results if they change with new versions of browsers.
Typically, css optimization is all about minimizing the number of bytes going over the wire. The hexadecimal colors tend to be the shortest (in your example, #f00 could be used instead of #ff0000).
I realize this isn't exactly answering the question you've asked but I haven't seen any browser tests which attempt to measure how different color representations affect rendering speed.
I too was curious about this (it's a Friday afternoon). Here's a JSPerf for the various CSS colour methods:
http://jsperf.com/css-color-names-vs-hex-codes/18
Edit: each process has to get down to binary values for r, g, and b. Hex and rgb bytes are already set up for that, so I guess they might actually be roughly the same speed. The rest have to go through a process to reach a hex/rgb value:
#FF0000 = memory values of: 1111 1111 0000 0000 0000 0000
rgb(255,0,0) = memory values of: 1111 1111 0000 0000 0000 0000
Both cases are most likely stored in 3 int variables. So the real question is: which is faster to parse into binary values for these integers, HEX or DEC? I think HEX, but I can't back that up. Either way, the code then just takes the binary values of these variables.
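Both notations decode to the same three integers; a sketch of the two parses in Python (the point here is the decoding itself, not its speed, and the helper names are made up):

```python
def parse_hex(s):
    # "#RRGGBB" -> (r, g, b): each pair of hex digits is one channel.
    s = s.lstrip("#")
    return tuple(int(s[i:i + 2], 16) for i in (0, 2, 4))

def parse_rgb(s):
    # "rgb(r, g, b)" -> (r, g, b): decimal digits per channel.
    inner = s[s.index("(") + 1:s.index(")")]
    return tuple(int(part) for part in inner.split(","))

assert parse_hex("#FF0000") == parse_rgb("rgb(255, 0, 0)") == (255, 0, 0)
```

Hex needs base-16 digit conversion and decimal needs base-10; both are a few shifts/multiplies per channel, which supports the "roughly the same speed" guess above.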