So, when working with fonts:
the default (aka "Regular") font weight is 400
the "bold" font weight is 700
the "light" font weight is 300
But... why? 400 what? Is this a sort of unit? Is there a historical reason behind that?
Not "400 what", just 400. As per the CSS specification, first formalized in https://www.w3.org/TR/CSS1/#font-weight. there are nine levels of font weight, represented as unitless numbers, starting at 100 and ending at 900, in 100 increments.
In addition to this, the spec defines two mappings between numerical value and string value:
the numerical value 400 and the string value normal are the same thing, and
the numerical value 700 and string value bold are the same thing
(Note that while CSS4 will change this to allow any number from 1 to 1000, it will still only officially recognise the string values normal and bold, still mapping to 400 and 700 respectively. See https://drafts.csswg.org/css-fonts-4/#font-weight-prop for more information.)
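To make the equivalence concrete, these pairs of declarations select exactly the same faces:

p      { font-weight: 400; } /* identical to font-weight: normal */
strong { font-weight: 700; } /* identical to font-weight: bold */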
The only formal rule around these weights is that, in a CSS context, a font weight of 400/normal should get you that font family's Regular typeface, and a font weight of 700/bold should get you its Bold typeface. Anything else is left entirely undefined; all you know is that 100, 200, and 300 are probably lighter than 400, that 500 and 600 are probably somewhere between regular and bold, and that 800 and 900 are probably heavier than 700.
All of those are qualified as "probably" because @font-face gets to completely invalidate everything about this. If you use @font-face, you overrule CSS's rules for what those numerical values mean entirely. For example, this rule will give you an ultra-thin font when you set font-weight to 900, because that's what we're telling the browser it must do:
@font-face {
  font-family: MyFont;
  font-weight: 900;
  src: url("./fonts/ultra-thin.woff2") format("woff2");
}
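Any rule that then asks for weight 900 in that family will pick up the ultra-thin file declared above. For instance (using the placeholder MyFont family from the example):

h1 {
  font-family: MyFont, sans-serif;
  font-weight: 900; /* resolves to ultra-thin.woff2, despite the "heavy" number */
}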
Also important to know is that these are the only two official number/string mappings; officially there are no others, and the table of numerical values in https://drafts.csswg.org/css-fonts-3/#font-weight-prop is there purely to illustrate which real CSS values map to the rough names people tend to use for them.
The most important part is that this only applies to CSS. It has nothing to do with actual font-stored values, or things you see in word processing software, etc.
On the reason for no units
Font weights are not given a unit because, as you increase or decrease a font weight, there is no specific, identifiable "thing" that you are "turning up and down." In contrast to font-size, changes in font weight on computers have not traditionally been achieved programmatically: each level of thickness has had to be hand-created by the font designer as a completely separate typeface.
This is because it's much harder to programmatically change font weight than it is to change, say, size. Bolding text doesn't just add thickness to the entire character; it adds contrast by selectively adding more thickness to certain parts of the letter than to others. Notice how, in a typical typeface, certain parts of the letter "a" grow very thick at a higher weight while other parts of the same letter don't grow nearly as much.
This is not easy to do programmatically; it has mostly had to be done by hand to look good. Typographers would create a few different versions of their font, usually three; developers would upload all three styles; and the CSS font-weight property would be set up to switch only between those three, usually at 300, 400, and 700. That's assuming you wanted real bold and italics. Faux styling (faux bold, faux italics, etc.) has been provided by browsers, but the results have generally not been great.
Where things are moving now is towards variable fonts, introduced in 2016. These are fonts that are specifically designed to give developers full control over style attributes such as font weight and italicization. Essentially, font creators create several different variations of their typeface with different levels of weight, and the computer then intelligently fills in the spaces in between so that developers now have potentially infinite degrees of font-weight to use for a given font.
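A sketch of how that looks in CSS (the family and file names here are placeholders, and format("woff2-variations") is one commonly used way to flag a variable font):

@font-face {
  font-family: "MyVariableFont"; /* placeholder name */
  src: url("./fonts/variable.woff2") format("woff2-variations");
  font-weight: 100 900; /* this single file covers the whole weight range */
}

h2 {
  font-family: "MyVariableFont", sans-serif;
  font-weight: 637; /* any value within the declared range now works */
}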
Even with this change, though, it's hard to pin down a specific, scientifically measurable "thing" that makes text more or less heavy. So I doubt that we'll see any non-relative unit introduced for font-weight any time soon.
On the reason for multiples of 100 and default of 400
The system is based on the Linotype numbering system that was developed for the Univers typeface in 1957. The idea was that you would use three digits to specify font weight, width, and italicization, in that order; so 099 would be very light, very wide, and very italicized, while 905 would be very heavy, very compressed, with medium italicization. (This is just an illustration; the actual available values were different.) People used the weight digit far more than the other two, so the second and third digits were used less and less until they became vestigial and people forgot that the numbering system was ever used for anything other than specifying weights. It would indeed make more sense to have a 1-9 scale for font weight at this point, but 100-900 has become conventional.
My guess is that 400 is the default simply because it's right in the middle. I would think that they chose 400 over 500 because people more often want to make their text bolder than to lighten it, so they chose the default value that left more room for that.
Related
I have a LOGFONT structure that I convert to an IDWriteFont using CreateFontFromLOGFONT():
IDWriteFont* dWriteFont = nullptr;
if (FAILED(dWriteGdiInterop->CreateFontFromLOGFONT(&logFont, &dWriteFont))) return;
If the LOGFONT describes a Tw Cen MT Condensed font, I would like the resulting DirectWrite font to have the DWRITE_FONT_STRETCH_CONDENSED attribute assigned.
The LOGFONT has width '0', whatever stretch I choose (condensed, wide, etc.). It seems the stretch can only be deduced from the font name, and DirectWrite's method fails to do so. Is this a bug?
How can I create a DirectWrite font with a certain stretch, based on a LOGFONT structure?
I don't think it's necessarily a bug; for example, dwrite_3.h has a comment for this method saying that only some of the fields are considered: lfFaceName, lfCharSet, lfWeight, lfItalic. No lfWidth there.
You can still try to ask for the condensed one by going through the family, as in the sketch below:
call GetFontFamily() on the font returned from CreateFontFromLOGFONT();
use GetFirstMatchingFont() on this family with the parameters you want.
That should work if the Tw Cen MT family does in fact have a condensed variant, from DirectWrite's point of view.
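A rough sketch of that approach in C++ (error handling trimmed; assumes dWriteFont was obtained as in the question):

// Get the family of the font we already have, then ask that family
// for a condensed face with the same weight and style.
IDWriteFontFamily* family = nullptr;
if (SUCCEEDED(dWriteFont->GetFontFamily(&family)))
{
    IDWriteFont* condensed = nullptr;
    if (SUCCEEDED(family->GetFirstMatchingFont(
            dWriteFont->GetWeight(),        // keep the LOGFONT's weight
            DWRITE_FONT_STRETCH_CONDENSED,  // request the condensed variant
            dWriteFont->GetStyle(),         // keep upright/italic as-is
            &condensed)))
    {
        // Use 'condensed' instead of 'dWriteFont', then release it when done.
        condensed->Release();
    }
    family->Release();
}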
I'm importing the weights 300 and 400 as per the following:
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto+Slab:300,400" media="all">
Yet I can still apply font-weight: 800 and it looks different from font-weight: 400. Why? Where does it get it from?
Reproduction:
https://jsfiddle.net/7164fk3j/
Even when I only import the font-weight 300:
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto+Slab:300" media="all">
Reproduction:
https://jsfiddle.net/7164fk3j/1/
How is this working? Is it just making a bold version of the 300 weight?
Fallback Weights
font-weight uses fallback weights based on the following algorithm:
If the exact weight given is unavailable, then the following heuristic is used to determine the weight actually rendered:
If a weight greater than 500 is given, the closest available heavier weight is used (or, if there is none, the closest available lighter weight).
If a weight less than 400 is given, the closest available lighter weight is used (or, if there is none, the closest available heavier weight).
If a weight of exactly 400 is given, then 500 is used. If 500 is not available, then the heuristic for font weights less than 400 is used.
If a weight of exactly 500 is given, then 400 is used. If 400 is not available, then the heuristic for font weights less than 400 is used.
Source
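Applied to the question's setup, where only the 300 and 400 weights of Roboto Slab are imported, the heuristic works out like this (a hand-worked illustration, not tool output; the browser may then synthesize a heavier look on top of the chosen face, as described below):

/* Available faces: 300 and 400 only */
h1 { font-weight: 800; } /* > 500: no heavier face exists, so the closest lighter one, 400, is used */
h2 { font-weight: 350; } /* < 400: the closest lighter face, 300, is used */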
Synthesis
This explains how the browser maps weights, but where does it get the actual bold version from?
There's a CSS property called font-synthesis which provides control over how/when the browser synthesizes aspects of fonts (weight, styles) that are missing.
The font-synthesis CSS property controls which missing typefaces, bold or italic, may be synthesized by the browser.
This property isn't implemented in many browsers, but its existence suggests that the browser is synthesizing the bold version when it is missing (and that someday this property will give us control over it).
Source
Synthesis Source
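Where the property is supported, something along these lines should stop the faking entirely (a sketch; behavior varies by browser):

h1 {
  font-family: 'Roboto Slab', serif;
  font-weight: 800;
  font-synthesis: none; /* don't fabricate a bold face when none is available */
}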
Putting the two concepts above together, it appears that Chrome will use a weight of 300 as the basis for synthesis, but if you import the 400 version, it will use 400 as the basis and yield a slightly thicker result.
With only 300 weight imported:
With 300 and 400 weights imported:
300, 400, and 800 weights imported:
The question has bothered me for a while.
In CSS, the font-weight can only be a value that is a multiple of 100, from 100 to 900
Example : https://www.google.com/fonts#QuickUsePlace:quickUse/Family:Open+Sans
So why is it so? What is the origin of it being referred to as 100, 200, 300, etc. instead of, for instance, 1, 2, 3, etc.?
Apparently it derives from the Linotype typeface classification system (Linotype).
It uses a three-digit system: the first digit describes font weight, the second font width, and the third position.
There's an interesting article here on some of the history of specifying font weights in print.
If I load a Google font that comes with 400 and 700 weights, but in my CSS I use font-weight: 550, what does the browser do?
According to MDN and W3.org:
If the exact weight given is unavailable, then the following heuristic is used to determine the weight actually rendered:
If a weight greater than 500 is given, the closest available darker weight is used (or, if there is none, the closest available lighter weight).
If a weight less than 400 is given, the closest available lighter weight is used (or, if there is none, the closest available darker weight).
If a weight of exactly 400 is given, then 500 is used. If 500 is not available, then the heuristic for font weights less than 400 is used.
If a weight of exactly 500 is given, then 400 is used. If 400 is not available, then the heuristic for font weights less than 400 is used.
This means that for fonts that provide only normal and bold, 100-500 are normal, and 600-900 are bold.
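So for the 400/700 case in the question, font-weight: 550 is greater than 500 and resolves to the closest available darker weight, 700 (assuming the browser accepts 550 as a value at all; see the answer below about CSS error handling). A hand-worked illustration:

/* Font provides only 400 and 700 */
p   { font-weight: 550; } /* > 500: closest darker available weight is 700 (bold) */
p.a { font-weight: 300; } /* < 400: no lighter face exists, so the closest darker, 400, is used */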
Also:
There is no guarantee that there will be a darker face for each of the 'font-weight' values; for example, some fonts may have only a normal and a bold face, while others may have eight face weights. There is no guarantee on how a UA will map font faces within a family to weight values. The only guarantee is that a face of a given value will be no less dark than the faces of lighter values.
For the font-weight property as defined up to CSS Fonts Level 3, the value 550 is invalid (only multiples of 100 are allowed; Level 4 accepts any number from 1 to 1000). According to CSS error handling rules, a declaration with an invalid value is ignored, i.e. the actual font weight is determined by other CSS rules (or by falling back to defaults), as if the declaration were not there.
There is a similar question here whose answer in essence says:
The height - specifically from the top of the ascenders (e.g., 'h' or 'l' (el)) to the bottom of the descenders (e.g., 'g' or 'y')
This has also been my experience, i.e. in 14px Arial the height of the letter K (its height above the baseline) is about 10px.
The specs say nothing about the calculated font-size so I'm guessing this is browser-specific, but I could not find any reference to it.
(Other questions here and here ask roughly the same thing, but sadly no answer gives a satisfying explanation.)
Is there any documentation on why the font-size seems to be the size "from ascenders to descenders"?
Some background on the subject
Back when letters were created on metal, the em referred to the size of each block that the letter would be engraved on, and that size was determined by the capital M because it usually takes up the most space.
Nowadays, font developers create their fonts on a computer without the limitations of a physical piece of metal, so while the em still exists, it's nothing more than an imaginary boundary in the software and thus prone to being manipulated or disregarded altogether.
Standards
In an OpenType font, the em size is conventionally set to 1000 units; in TrueType fonts, the em size is usually either 1024 or 2048.
The most accurate way to define a font size is with em units; that way the dimension does not refer to a fixed pixel height of the font but scales relative to the font itself. (A font's x-height, by comparison, is determined by the distance between the baseline and the mean line.)
For the record, 1 pt is about 0.351 mm (see the conversions below for the various historical definitions), and 1 px is one "dot" on your screen, defined by the dots per inch (resolution) of the screen. That differs from screen to screen, which makes px the worst way to define a font size.
Unit conversion
This is a pretty good read if, like me, you enjoy literature that makes your eyes bleed: International unification of typographic measurements
1 point (Truchet) | 0.188 mm
1 point (Didot) | 0.376 mm or 1/72 of a French royal inch
1 point (ATA) | 0.3514598 mm or 0.013837 inch
1 point (TeX) | 0.3514598035 mm or 1/72.27 inch
1 point (Postscript) | 0.3527777778 mm or 1/72 inch
1 point (l’Imprimerie nationale, IN) | 0.4 mm
1 pica (ATA) | 4.2175176 mm = 12 points (ATA)
1 pica (TeX) | 4.217517642 mm = 12 points (TeX)
1 pica (Postscript) | 4.233333333 mm = 12 points (Postscript)
1 cicero | 4.531 mm = 12 points (Didot)
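For CSS specifically, the absolute units are pinned to each other regardless of the physical screen: 1in is defined as exactly 96px and 72pt, so for example:

/* Equivalent in CSS: 72pt = 1in = 96px, therefore 12pt = 16px */
p { font-size: 12pt; }
p { font-size: 16px; }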
Resolutions
µm : 10.0 20.0 21.2 40.0 42.3 80.0 84.7 100.0 250.0 254.0
dpi : 2540 1270 1200 635 600 317 300 254 102 100
Standards are only worth so much..
The actual size of one font's glyphs versus another's will always vary depending on:
how the developer designed the glyphs when creating the font;
how the browser renders those characters (no two browsers are exactly the same);
and the resolution and PPI of the screen viewing the font.
Example
As an example of how common it is for font developers to manipulate the geometry: back when Apple created the Zapfino script font, they sized it relative to the largest capitals in the font, as you would expect, but then decided that the lowercase letters looked too small, so a few years later they revised it so that any given point size was about 4x larger than in other fonts.
If you don't have a headache, read some more..
There's some good information on Wikipedia about modern typography and its origins, and other relevant subjects.
Point (typography)
Pixels per inch
Font metrics
Typographic units
And some first-hand experience
If you want to get more first-hand understanding you can download the free font development tool FontForge which is comparable to the non-free FontLab Studio (either of these two being the popular choice among font developers in my experience). Both also have active communities so you can find plenty of support when learning how to use them.
Fontlab Studio
Fontlab Fontographer
FontForge
Fontlab Wikibook
FontForge Documentation
The answer regarding typesetting is excellent, but be aware that CSS diverges from traditional typography in many cases.
In CSS, the font-size determines the height of the "em-box". The em-box is a bounding box that can contain all letters of the font, including ascenders and descenders. Informally, you can think of font-size as the "j"-height, since a lowercase j has both an ascender and a descender and therefore (in most fonts) uses the full em-box height.
This means most letters (like a capital M) will have space above and below them in the em-box. The relative amount of space above and below a capital letter varies between fonts, since some fonts have relatively larger or smaller ascenders and descenders. This is part of what stylistically sets fonts apart from each other.
You ask why font-size includes ascenders and descenders, i.e. why it corresponds to the height of the em-box even though the height of most letters will be less than this. Well, since most text does include letters with ascenders and descenders, the em-box height indicates how much vertical space the text requires (at minimum), which is quite useful!
A caveat: some glyphs may even extend beyond the em-box in some fonts. For example, the letter "Å" often extends above the em-box. This is a stylistic choice by the designer of the font.
I've experimented to pin down exactly what the font-size corresponds to in terms of font-metrics (as shown in the diagram in davidcondrey's answer).
Testing in Safari, Chrome and Firefox on macOS, setting font-size to 100px seems to set the distance between the ascender line and the descender line to 100px.
Unfortunately, there are multiple meanings for the ascender and descender when talking about different font formats, so to disambiguate: in the case of OpenType, we're talking about the 'Typo Ascender' and the 'Typo Descender' from the OS/2 table.
OpenType CSS font-height diagram
An interactive illustration is available on the opentype.js glyph inspector https://opentype.js.org/glyph-inspector.html
When positioning the character (again in terms of OpenType metrics), browsers seem to consider y = 0 to be the Ascender (which is not the same as the 'ascender line' in davidcondrey's diagram; it's the ascender in the hhea table in OpenType). However, if the CSS line-height is not set to normal, the position is offset by some amount, though I think the rules here might be a bit more complex.
I expect there's more that factors in and it may be different between operating systems and browsers, however this is at least a good approximation.
After searching for a satisfying answer to a similar question, I found this graphic to be a lot of help...
http://static.splashnology.com/articles/Choosing-the-Best-Units/font-units-large.png