#FFFFFF or "white" in CSS?

Is there a difference between #FFF (#FFFFFF) and white in CSS? Is one better than the other?

All are supported in the major browsers. It comes down to whichever unjustifiable, deep-seated prejudice you personally have for/against hexadecimal/the English language.

They're all guaranteed to be the same. The CSS 3 Color Module (a Proposed Recommendation) defines white as #ffffff. It later says that values like #rgb are converted to #rrggbb:

"The three-digit RGB notation (#rgb) is converted into six-digit form (#rrggbb) by replicating digits, not by adding zeros. For example, #fb0 expands to #ffbb00."

That means that #fff is equivalent to #ffffff (each digit is doubled).

There is no difference. I would imagine browsers take "white" and translate it to "#FFFFFF" behind the scenes. It's just a matter of personal coding style which one you use. I prefer using the hash form because it's easier to read and recognise as a colour.

Technically, there is no real difference. See this list of supported color names by all major browsers. Of course, some will have a preference to one way or the other but for me as long as you keep it consistent it doesn't matter.

I know this question is very old, but I'll try to explain for people who are new to this.
You can shorten a pair of identical hex digits to a single digit, but only when all three pairs repeat. For example, #00FF00 can be written as #0F0: FF collapses to F, and 00 collapses to 0.
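The digit-replication rule from the CSS Color module can be sketched in a few lines (a Python illustration, not part of any browser's actual code):

```python
def expand_shorthand(hex_color):
    """Expand a 3-digit CSS hex color (#rgb) to the 6-digit form
    (#rrggbb) by replicating each digit, as the CSS Color module
    specifies. 6-digit input is passed through unchanged."""
    digits = hex_color.lstrip("#")
    if len(digits) == 3:
        digits = "".join(d * 2 for d in digits)
    return "#" + digits.lower()

print(expand_shorthand("#fff"))  # -> #ffffff
print(expand_shorthand("#fb0"))  # -> #ffbb00
print(expand_shorthand("#0F0"))  # -> #00ff00
```

Note that the expansion replicates digits rather than padding with zeros, which is why #fb0 becomes #ffbb00 and not #f0b000.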

Related

Web colors hexadecimal notation

Usually colors in hexadecimal notation are written as a hash (#) followed by six hexadecimal digits. What color does the value #AAA produce? Are the other characters derived from the existing ones? Are the missing values just assumed?
#AAA is interpreted as #AAAAAA
With a three-digit hex colour, the browser doubles each character (so e.g. #ABC is equivalent to #AABBCC).
The six-digit codes are traditional 24-bit colours, whereas the three-digit shorthand covers the subset where each pair of digits repeats, which includes the 216 "web-safe" colours...
http://en.wikipedia.org/wiki/Web_colors
Nice question.
Short answer: #xyz is read as #xxyyzz, so the specific example means #aaa becomes #aaaaaa, a mid grey (0xAA is 170, about 66.7% of full white).
Wikipedia calls this shorthand hex form, https://en.wikipedia.org/wiki/Web_colors#Shorthand_hexadecimal_form
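As a quick check of the lightness of #aaaaaa, each channel is 0xAA out of a maximum of 0xFF:

```python
# Lightness of #aaaaaa as a fraction of full white (#ffffff).
value = 0xAA              # 170
fraction = value / 0xFF   # 170 / 255
print(f"{fraction:.1%}")  # -> 66.7%
```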
CSS does have a formal specification, but it is a very long read. To experiment with the specific colors that CSS allows, a browser's developer tools work well. jsfiddle is also a possibility: http://jsfiddle.net/mYdb5/
which contains the following simple code (note that a div cannot be self-closing, so it is written as an explicit open/close pair):
Color test:<br/>
<div></div>
div {
background-color: #aaa;
width: 100%;
height: 40px;
}

In ncurses, is there a simple way to use every combination of the 8 standard foreground and background colors?

I've noticed that (at least on my platform) COLOR_PAIRS is 64. I've read that color pair 0 always holds the default foreground and background colors and cannot be changed. With 8 default colors, this means that we can explicitly set every combination of these 8 colors except one. This is a problem for me, as the user may not necessarily have a white-on-black terminal as I do. Another potential problem is that my terminal is transparent, and color pair 0 retains the transparent background, while an explicitly defined pair with a black background does not appear transparent on my terminal.
Is there a way to use all 64 combinations of colors that may not be mentioned by the crappy documentation for ncurses that I keep finding around the net? Or is it safe to set COLOR_PAIRS to 128 before initializing the library to extend the number of color pairs I can use? If I can't find a reasonable solution, I may just use Termbox in my program. I'd like to use ncurses for its wide support and the fact that most Unix-like platforms include it by default, but Termbox has a FAR simpler API.
With ncurses6 (August 2015), the default configuration provides for 256 colors, 32767 color pairs. If you have a current version of ncurses, you can get 64 color pairs easily enough.
Termbox may have a simpler API, but (read the source code) it is less capable, and apparently not under active development (last source-code change 8 months ago).
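One common workaround on older ncurses, where only pairs 1 through COLOR_PAIRS-1 can be defined, is to let the reserved pair 0 stand in for one combination. The sketch below shows only the pair-numbering logic (pure Python; `pair_number` is a hypothetical helper, not an ncurses API), under the assumption that the terminal's defaults happen to be white-on-black:

```python
# Map all 64 (foreground, background) combinations of the 8 standard
# colors onto pair indices 0..63, given that pair 0 is reserved and
# assumed to be white-on-black (COLOR_WHITE=7 on COLOR_BLACK=0).
def pair_number(fg, bg):
    """Return a distinct color-pair index in 0..63 for each
    (fg, bg) combination, each channel in 0..7."""
    if (fg, bg) == (7, 0):
        return 0              # white-on-black: reuse the reserved pair 0
    n = fg * 8 + bg
    if n == 0:
        return 7 * 8 + 0      # (0, 0) would collide with pair 0; take
                              # the slot freed up by (7, 0) instead
    return n

# At startup you would then call init_pair(pair_number(fg, bg), fg, bg)
# for every combination except (7, 0), which pair 0 already covers.
```

This assumes the defaults really are white-on-black; if they might not be, the `use_default_colors`/`assume_default_colors` extensions (where available) are the more robust route, and ncurses 6 removes the pressure entirely with 32767 pairs.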

color code in X/HTML , CSS

In how many ways can we give color information in X/HTML and CSS?
I know some:
Hex
color name
rgba
is there any other method?
And which methods are preferred and which are not? Please explain.
The three ways you mention are the only three.
I don't think a specific method is generally preferred, but as a developer, I like to see hex numbers.
I would avoid color names simply because if you want to know the true value of a color you have to look it up which is an annoyance in my workflow.
Also, hex numbers are the most compact way to describe a color (for most colors), so you might be saving a couple of bytes of bandwidth by using hex. This doesn't really matter, but it's one of the only differences I can think of.
There is one more method: the one you're missing is plain RGB. RGBa includes opacity (that's the 'a', for alpha); it's not the same as RGB. RGB is supported by a wide array of browsers, old and new; RGBa is supported by a narrower but significant set of browsers: http://css-tricks.com/rgba-browser-support/ (IE being the main holdout, as usual).
Which method you use really doesn't matter. I prefer hex from habit, but am starting to use RGB more so that I can start getting used to extending it to RGBa.
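Whichever notation you write, the browser resolves it to the same channel values; a name is just a lookup, and hex is just base-16 RGB. A small illustration (the name table is a hand-picked sample, not the full CSS named-color list):

```python
# A few CSS color names and their hex equivalents (sample only).
NAMED_COLORS = {"white": "#ffffff", "black": "#000000", "red": "#ff0000"}

def to_rgb(color):
    """Resolve a color name or #rrggbb hex string to an (r, g, b) triple."""
    color = NAMED_COLORS.get(color.lower(), color)
    h = color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

print(to_rgb("white"))    # -> (255, 255, 255)
print(to_rgb("#ff0000"))  # -> (255, 0, 0)
```

So `white`, `#fff`, `#ffffff`, and `rgb(255,255,255)` all end up as the same triple; the choice between them is purely stylistic.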

Are there any good reasons for using hex over decimal for RGB colour values in CSS?

rgb(255,255,255) notation has been available since CSS1. But #ffffff seems to be vastly more popular.
Obviously it's slightly more compact. I know that hex is more closely related to the underlying bytes, and understand that there would be advantages in carrying out arithmetic on those values, but this isn't something you're going to do with CSS.
Colour values tend to originate from designers (such as myself) who would never encounter hex notation anywhere else, and who are much more familiar with decimal notation, the main way of specifying colour in the apps they use. In fact, I have met quite a few who didn't realise how a given hex value breaks down into RGB components, and assumed it didn't directly relate to the colour at all, like a Pantone colour system reference (eg PMS432).
So, any reason not to use decimal?
Hex values are easier to copy and paste from your favourite image editor.
RGB values are easier to manipulate with Javascript.
(My favourite Hex colour value is #EDEDED and a site we made for a client involved in motorsport had a background colour of #F1F1F1 :-)
Ed.
It's worth noting that if you want to input an RGBA value, hex notation is not supported; i.e., you can't fake it with #FFFFFFff. As a matter of fact, the alpha value must be a number between 0.0 and 1.0, inclusive. (Check out this page for browser support; as always, IE is leading the pack here. ;) )
HSL and HSLA color support -- which is very designer friendly -- is also provided with a similar syntax to the RGB() style. If a designer were to use both types of color values in the same stylesheet, they might opt for decimal values over hex codes for consistency.
I think it's what you're used to. If you're used to HTML, you'll probably use HEX since it's just been used a lot in HTML. If you're from a design background, using Photoshop/Corel/PaintShopPro etc., then you're likely used to the RGB notation - though, a lot of programs these days incorporate a HEX value field too.
As said, RGBA might be a reason to just go with the RGB notation - consistency.
Though, I think it also depends on the scenario. If you're comfortable with both, you might just switch between them: #fff is a lot easier to type than rgb(255,255,255).
Another question is why people write #fff instead of white (assuming most browsers support this keyword).
It's all a matter of preference and legibility - if you're maintaining a huge CSS file, being able to look at the colour value and know what colour it is, is a really good advantage. Even more advantageous is using something like LESS or Sass to add a kind of programmability to CSS - allowing constants for example. So instead of saying:
#title { color: #abcdef; }
You might instead do the following with LESS:
@base-color: #abcdef;
#title { color: @base-color; }
Maintaining the CSS becomes less of an issue.
If you're worried about the performance of the browser rendering its result, then that could also be another factor in your choice.
So in summary, it boils down to:
Familiarity
Preference
Maintainability
Performance
The main reason is probably compactness, as you mentioned. #ffffff can even be further shortened to the #fff shorthand notation.
Another possible reason is that there's a perceived performance increase by saving the browser the trouble of converting the rgb notation.
Traditionally HTML has always used hex colours, so that has carried forward into CSS. Think <font color="#ffffff">
I always used hex, but today I prefer to set my values as:
rgb(82, 110, 188)
in my css files, so whenever I want to add opacity I just need to rename rgb to rgba and add the opacity value. The advantage is that I don't have to convert the hex value to rgb before being able to add the opacity:
rgba(82, 110, 188, 0.5)
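If your source values are in hex, the conversion is mechanical, so the workflow above can also start from a hex colour. A small helper (illustrative only; `hex_to_rgba` is my own name, not a CSS or library function):

```python
def hex_to_rgba(hex_color, alpha):
    """Convert a #rrggbb hex colour plus an alpha in [0.0, 1.0]
    to a CSS rgba() string."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return f"rgba({r}, {g}, {b}, {alpha})"

print(hex_to_rgba("#526ebc", 0.5))  # -> rgba(82, 110, 188, 0.5)
```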
CSS was invented by software developers, not designers. Software developers live and breathe hex. From my old C64 days, I can still read most hex numbers without thinking. A9, anyone?
Various things will accept a single hex value where they may have different ways of entering three decimal values. There's also the fact that it's always 6 characters (or 3, admittedly, plus the #), which makes it easier to scan down a list of them.
Just a couple of random thoughts to add to the mix...
Probably a touch of speed when the color is interpreted by a browser. Otherwise some people from design background may know how to compose colors from RGB components when they write code, and some others from programming background are probably more inclined to use HEX values.
HEX is most common due to historical reasons.
Before CSS was common in web development, colors were specified within HTML tags and the most commonly used and supported way to specify a color was to use HEX values.
no valid reason, other than personal preference.
Maybe I've done HTML too long, but I find it easier to think in HEX values. A lot of the pre-defined colour palette for HTML maps neatly to HEX values. Using the shortened format also gives you automatic "web-safe" colours, though this is not really an issue in the days of 32-bit colour displays.
I have always preferred HEX in comparison to RGB or HSL simply due to it being easier for me to read while styling.
When it comes to dynamic editing, I would like to use RGB because of the ease of cycling through different colors. This also helps a little when doing CSS gradients.

MeasureString() pads the text on the left and the right

I'm using GDI+ in C++. (This issue might exist in C# too).
I notice that whenever I call Graphics::MeasureString() or Graphics::DrawString(), the string is padded with blank space on the left and right.
For example, if I am using a Courier font (not italic!) and I measure "P" I get 90, but "PP" gives me 150. I would expect a monospace font to give exactly double the width for "PP".
My question is: is this intended or documented behaviour, and how do I disable this?
RectF Rect(0, 0, 32767, 32767);
RectF Bounds1, Bounds2;
graphics->MeasureString(L"P", 1, font, Rect, &Bounds1);
graphics->MeasureString(L"PP", 2, font, Rect, &Bounds2);
REAL margin = Bounds1.Width * 2 - Bounds2.Width;
It's by design, that method doesn't use the actual glyphs to measure the width and so adds a little padding in the case of overhangs.
MSDN suggests using a different method if you need more accuracy:
To obtain metrics suitable for adjacent strings in layout (for example, when implementing formatted text), use the MeasureCharacterRanges method or one of the MeasureString methods that takes a StringFormat, and pass GenericTypographic. Also, ensure the TextRenderingHint for the Graphics is AntiAlias.
It's true that is by design, however the link on the accepted answer is actually not perfect. The issue is the use of floats in all those methods when what you really want to be using is pixels (ints).
The TextRenderer class is meant for this purpose and works with the true sizes. See this link from msdn for a walkthrough of using this.
Passing StringFormat::GenericTypographic() will fix your issue:
graphics->MeasureString(L"PP", 2, font, Rect, StringFormat::GenericTypographic(), &Bounds2);
Apply the same format to DrawString.
Sounds like it might also be connecting to hinting, based on this kb article, Why text appears different when drawn with GDIPlus versus GDI
TextRenderer was great for getting the size of the font. But in the drawing loop, using TextRenderer.DrawText was excruciatingly slow compared to Graphics.DrawString().
Since the width of a string is the problem, you're much better off using a combination of TextRenderer.MeasureText and Graphics.DrawString.