How can I create a DirectWrite font with a certain stretch, based on a LOGFONT structure?

I have a LOGFONT structure that I convert to an IDWriteFont using CreateFontFromLOGFONT():
IDWriteFont* dWriteFont = nullptr;
if (FAILED(dWriteGdiInterop->CreateFontFromLOGFONT(&logFont, &dWriteFont))) return;
If the LOGFONT describes a Tw Cen MT Condensed font, I would like the resulting DirectWrite font to have the DWRITE_FONT_STRETCH_CONDENSED attribute.
The LOGFONT has lfWidth 0 whatever stretch I choose (condensed, wide, etcetera). It seems the stretch can only be deduced from the font name, and DirectWrite's method fails to do so. Is this a bug?
How can I create a DirectWrite font with a certain stretch, based on a LOGFONT structure?

I don't think it's necessarily a bug; for example, dwrite_3.h has a comment for this method saying that only some of the fields are considered: lfFaceName, lfCharSet, lfWeight, lfItalic. No lfWidth here.
You can still try to ask for the condensed variant by going through the family:
call GetFontFamily() on the font returned from CreateFontFromLOGFONT();
use GetFirstMatchingFont() on this family with the parameters you want (see the sketch below).
That should work if the Tw Cen MT family does in fact have a condensed variant, from DirectWrite's point of view.
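A minimal sketch of those two calls, assuming dWriteFont is the font obtained from the CreateFontFromLOGFONT() call above and with error handling reduced to early returns:
IDWriteFontFamily* fontFamily = nullptr;
if (FAILED(dWriteFont->GetFontFamily(&fontFamily))) return;

IDWriteFont* condensedFont = nullptr;
HRESULT hr = fontFamily->GetFirstMatchingFont(
    dWriteFont->GetWeight(),          // keep the weight derived from the LOGFONT
    DWRITE_FONT_STRETCH_CONDENSED,    // ask for the condensed stretch
    dWriteFont->GetStyle(),           // keep the style (normal/italic/oblique)
    &condensedFont);
fontFamily->Release();

if (SUCCEEDED(hr)) {
    // GetFirstMatchingFont() returns the closest match, so if the family has no
    // condensed face you may still get back a different stretch; check
    // condensedFont->GetStretch() if you need to be sure, and Release() it when done.
}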

Related

Why is the default font weight 400?

So, when working with fonts:
the default (aka "Regular") font weight is 400
the "bold" font weight is 700
the "light" font weight is 300
But... why? 400 what? Is this a sort of unit? Is there a historical reason behind that?
Not "400 what", just 400. As per the CSS specification, first formalized in https://www.w3.org/TR/CSS1/#font-weight. there are nine levels of font weight, represented as unitless numbers, starting at 100 and ending at 900, in 100 increments.
In addition to this, the spec defines two mappings between numerical value and string value:
the numerical value 400 and the string value normal are the same thing, and
the numerical value 700 and string value bold are the same thing
(Note that while CSS4 will change this to allow for the numbers 1-1000 in increments of 1, it will still only officially recognise the string values normal and bold, still mapping to 400 and 700 respectively. See https://drafts.csswg.org/css-fonts-4/#font-weight-prop for more information)
The only formal rule around these weights is that if you're using a font family in a CSS context, a font weight of 400/normal should get you whatever is that font family's Regular typeface, and a font weight of 700/bold should get you whatever is the font family's Bold typeface. Anything else is left entirely undefined, and all you know is that 100, 200, and 300 are probably lighter than 400, that 500 and 600 are probably in between regular and bold, and that 800 and 900 are probably heavier than 700.
All of those are qualified as "probably" because @font-face gets to completely invalidate everything about this. If you use @font-face, you overrule CSS's rules for what those numerical values mean entirely. For example, this rule will give you an ultra-thin font when you set font-weight to 900, because that's what we're telling the browser it must do:
@font-face {
font-family: MyFont;
font-weight: 900;
src: url("./fonts/ultra-thin.woff2") format("woff2");
}
Also important to know is that these are the only two official number/string mappings. Officially there are no other mappings, and the table of numerical values in https://drafts.csswg.org/css-fonts-3/#font-weight-prop is there purely to illustrate which real CSS values map to which rough names people tend to use for them.
The most important part is that this only applies to CSS. It has nothing to do with actual font-stored values, or things you see in word processing software, etc.
On the reason for no units
Font weights are not given a unit because with increasing and decreasing font weights there is no specific, identifiable "thing" that you are "turning up and down." In contrast to font-size, increases and decreases in font weight have not traditionally been achieved programmatically: each level of thickness has had to be hand-created by the font designer as a completely different typeface.
This is because it's much harder to programmatically change font weight than it is to change, say, size. Bolding text doesn't just add thickness to the entire character. It adds contrast by selectively adding more thickness to certain parts of the letter than others. Notice how, in many fonts, certain parts of the letter "a" grow very thick at a higher weight while other parts of the same letter don't grow nearly as much.
This is not easy to do programmatically, and it has mostly had to be done by hand to look good: typographers would create a few different versions of their font (usually three), developers would upload all three styles, and the CSS font-weight property would be set up to switch only between those three, usually at 300, 400, and 700. This assumes you wanted real bold and italics; faux styling (faux bold, faux italics, etc.) has been provided by browsers, but the results have generally not been great.
Where things are moving now is towards variable fonts, introduced in 2016. These are fonts that are specifically designed to give developers full control over style attributes such as font weight and italicization. Essentially, font creators create several different variations of their typeface with different levels of weight, and the computer then intelligently fills in the spaces in between so that developers now have potentially infinite degrees of font-weight to use for a given font.
Even with this change, though, it's hard to pin down a specific, scientifically measurable "thing" that makes text more or less heavy. So I doubt that we'll see any non-relative unit introduced for font-weight any time soon.
On the reason for multiples of 100 and default of 400
The system is based on the Linotype system that was developed for the Univers font in 1957. The idea was that you would use three digits to specify font weight, width, and italicization, in that order. So 099 would be very light, very wide, and very italicized, while 905 would be very heavy, very compressed, with medium italicization. (This is just an example; the actual available values were different.) People had more use for the 'weight' part of the system than the other two digits, so the second two digits were used less and less until they became vestigial and people forgot that the numbering system was used for anything other than specifying weights. It would indeed make more sense to have a 1-9 scale for font weight at this point, but 100-900 has now become conventional.
My guess is that 400 is the default simply because it's right in the middle. I would think they chose 400 over 500 because people more often want to make their text bolder than lighter, so they picked the default that left more room in that direction.

Why is font-weight by hundreds when the only available values are multiples of 100?

The question has bothered me for a while.
In CSS, the font-weight can only be a value that is a multiple of 100, from 100 to 900
Example: https://www.google.com/fonts#QuickUsePlace:quickUse/Family:Open+Sans
So why is it so? What is the origin of it being referred to as 100, 200, 300, etc. instead of, for instance, 1, 2, 3, etc.?
Apparently it derives from the Linotype typeface classification system (Linotype).
Where a three-digit system is used, the first numeral describes font weight, the second numeral describes font width, and the third numeral describes position.
There's an interesting article here on some of the history of specifying font weights in print.

When setting a font-size in CSS, what is the real height of the letters?

There is a similar question here whose answer in essence says:
The height - specifically from the top of the ascenders (e.g., 'h' or 'l' (el)) to the bottom of the descenders (e.g., 'g' or 'y')
This has also been my experience, i.e. in 14px Arial the height of the letter K (the height above the baseline) is about 10px.
The specs say nothing about the calculated font-size, so I'm guessing this is browser-specific, but I could not find any reference to it.
(Other questions here and here ask roughly the same thing, but sadly no answer gives a satisfying explanation.)
Is there any documentation on why the font-size seems to be the size "from ascenders to descenders"?
Some background on the subject
Back when letters were created on metal, the em referred to the size of each block that the letter would be engraved on, and that size was determined by the capital M because it usually takes up the most space.
Nowadays, font developers create their fonts on a computer without the limitations of a physical piece of metal, so while the em still exists, it's nothing more than an imaginary boundary in the software and is thus prone to being manipulated or disregarded altogether.
Standards
In an OpenType font, the em size is supposed to be set to 1000 units, and in TrueType fonts the em size is usually either 1024 or 2048.
The most accurate way to define a font style is to use em; that way, when you define a font-size, the dimension does not refer to the pixel height of the font but to the font's x-height, which is determined by the distance between the baseline and the mean line of the font.
For the record, 1 pt is about 0.35136 mm, and a px is one "dot" on your screen, which is defined by the dots per inch (resolution) of your screen; it is therefore different from screen to screen and is the worst way to define a font size.
Unit conversion
This is a pretty good read if, like me, you enjoy literature that makes your eyes bleed: International unification of typographic measurements
1 point (Truchet) | 0.188 mm
1 point (Didot) | 0.376 mm or 1/72 of a French royal inch
1 point (ATA) | 0.3514598 mm or 0.013837 inch
1 point (TeX) | 0.3514598035 mm or 1/72.27 inch
1 point (Postscript) | 0.3527777778 mm or 1/72 inch
1 point (l’Imprimerie nationale, IN) | 0.4 mm
1 pica (ATA) | 4.2175176 mm = 12 points (ATA)
1 pica (TeX) | 4.217517642 mm = 12 points (TeX)
1 pica (Postscript) | 4.233333333 mm = 12 points (Postscript)
1 cicero | 4.531 mm = 12 points (Didot)
Resolutions
10.0 µm | 2540 dpi
20.0 µm | 1270 dpi
21.2 µm | 1200 dpi
40.0 µm | 635 dpi
42.3 µm | 600 dpi
80.0 µm | 317 dpi
84.7 µm | 300 dpi
100.0 µm | 254 dpi
250.0 µm | 102 dpi
254.0 µm | 100 dpi
Standards are only worth so much..
The actual size of one font's glyphs versus another font's will always vary depending on:
how the developer designed the font glyphs when creating the font,
how the browser renders those characters (no two browsers are going to be exactly the same),
and the resolution and ppi of the screen viewing the font.
Example
As an example of how common it is for font developers to manipulate the geometry: back when Apple created the Zapfino script font, they sized it relative to the largest capitals in the font, as would be expected, but then decided that the lowercase letters looked too small, so a few years later they revised it so that any given point size was about 4x larger than in other fonts.
If you don't have a headache, read some more:
There's some good information on Wikipedia about modern typography, its origins, and other relevant subjects:
Point (typography)
Pixels per inch
Font metrics
Typographic units
And some first-hand experience
If you want to get more first-hand understanding, you can download the free font development tool FontForge, which is comparable to the non-free FontLab Studio (either of these two being the popular choice among font developers, in my experience). Both also have active communities, so you can find plenty of support when learning how to use them.
Fontlab Studio
Fontlab Fontographer
FontForge
Fontlab Wikibook
FontForge Documentation
The answer regarding typesetting is excellent, but be aware that CSS diverges from traditional typography in many cases.
In CSS, the font-size determines the height of the "em-box". The em-box is a bounding box which can contain all letters of the font, including ascenders and descenders. Informally, you can think of font-size as the "j"-height, since a lowercase j has both an ascender and a descender and therefore (in most fonts) uses the full em-box height.
This means most letters (like a capital M) will have space above and below them in the em-box. The relative amount of space above and below a capital letter will vary between fonts, since some fonts have relatively larger or smaller ascenders and descenders. This is part of what stylistically sets fonts apart from each other.
You ask why font-size includes ascenders and descenders, i.e. why it corresponds to the height of the em-box, even though the height of most letters will be less than this. Well, since most texts do include letters with ascenders and descenders, the em-box height indicates how much vertical space the text requires (at minimum), which is quite useful!
A caveat: some glyphs may even extend beyond the em-box in some fonts. For example, the letter "Å" often extends above the em-box. This is a stylistic choice by the designer of the font.
I've experimented to pin down exactly what the font-size corresponds to in terms of font-metrics (as shown in the diagram in davidcondrey's answer).
Testing in Safari, Chrome, and Firefox on macOS, setting font-size to 100px seems to set the distance between the ascender line and the descender line to 100px.
Unfortunately, there's multiple meanings for the ascender and descender when talking about different font formats, so to disambiguate in the case of OpenType, we're talking about the 'Typo Ascender' and the 'Typo Descender' from the OS/2 table.
OpenType CSS font-height diagram
An interactive illustration is available on the opentype.js glyph inspector https://opentype.js.org/glyph-inspector.html
When positioning the character (again in terms of OpenType metrics), browsers seem to consider y = 0 to be the Ascender (which is not the same as the 'ascender line' in davidcondrey's diagram, but rather the ascender from the hhea table in OpenType). However, if the CSS line-height is not set to normal, the position is offset by some amount; I think the rules here might be a bit more complex.
I expect there's more that factors in and it may be different between operating systems and browsers, however this is at least a good approximation.
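To make that concrete, here is a small worked example with hypothetical metrics; unitsPerEm = 1000, typoAscender = 800 and typoDescender = -200 are made-up values, chosen so that ascender minus descender equals the em size, which is what the 100px measurement above implies:
#include <cstdio>

int main() {
    // Hypothetical font metrics (real values vary per font):
    const double unitsPerEm    = 1000.0;  // head.unitsPerEm
    const double typoAscender  = 800.0;   // OS/2 sTypoAscender (assumed)
    const double typoDescender = -200.0;  // OS/2 sTypoDescender (assumed)
    const double fontSizePx    = 100.0;   // CSS font-size

    // Browsers scale font units to CSS pixels as units * fontSize / unitsPerEm.
    double ascenderPx  = typoAscender  * fontSizePx / unitsPerEm;  //  80 px
    double descenderPx = typoDescender * fontSizePx / unitsPerEm;  // -20 px

    // Ascender line to descender line: 80 - (-20) = 100 px, i.e. the font-size.
    std::printf("ascender-to-descender: %.1f px\n", ascenderPx - descenderPx);
    return 0;
}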
After searching for a satisfying answer to a similar question, I found this graphic to be a lot of help...
http://static.splashnology.com/articles/Choosing-the-Best-Units/font-units-large.png

How to determine the number of characters that will fit on the screen in Qt

How do I determine how many characters of a particular font will fit on the screen?
Have a look at QFontMetrics. Using this, you can determine, among other things, the width of a particular string:
QFontMetrics metrics(myFont);
int width = metrics.width(myString); // in Qt 5.11 and later, use metrics.horizontalAdvance(myString)
Is this what you want?
Note: it is not possible to find a fixed number of characters of a particular font that will fit on the screen, since not all fonts are monospaced. The number of characters will depend on the actual characters.
You can also use QFontMetrics::elidedText, passing the available space (remember to reduce it by margins/padding), then call length() on the resulting string; see the sketch below.
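A rough sketch of that elidedText approach, assuming Qt 5.11+ and that availableWidth is the usable pixel width after margins/padding; the helper name charsThatFit is just for illustration, and the count is approximate because dropping the ellipsis frees a little extra width:
#include <QFont>
#include <QFontMetrics>
#include <QString>

int charsThatFit(const QFont &font, const QString &text, int availableWidth)
{
    QFontMetrics metrics(font);
    // Elide the text so it fits in availableWidth, then count what survived.
    QString elided = metrics.elidedText(text, Qt::ElideRight, availableWidth);
    if (elided == text)
        return text.length();   // everything fits
    return elided.length() - 1; // drop the trailing ellipsis character
}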

MeasureString() pads the text on the left and the right

I'm using GDI+ in C++. (This issue might exist in C# too).
I notice that whenever I call Graphics::MeasureString() or Graphics::DrawString(), the string is padded with blank space on the left and right.
For example, if I am using a Courier font, (not italic!) and I measure "P" I get 90, but "PP" gives me 150. I would expect a monospace font to give exactly double the width for "PP".
My question is: is this intended or documented behaviour, and how do I disable this?
RectF Rect(0, 0, 32767, 32767);
RectF Bounds1, Bounds2;
graphics->MeasureString(L"PP", 1, font, Rect, &Bounds1); // measures just the first character, "P"
graphics->MeasureString(L"PP", 2, font, Rect, &Bounds2); // measures "PP"
// Each measurement includes the same fixed padding once, so doubling the
// single-character width and subtracting the two-character width leaves just the padding:
REAL margin = Bounds1.Width * 2 - Bounds2.Width;
It's by design; that method doesn't use the actual glyphs to measure the width and so adds a little padding to allow for overhangs.
MSDN suggests using a different method if you need more accuracy:
To obtain metrics suitable for adjacent strings in layout (for example, when implementing formatted text), use the MeasureCharacterRanges method or one of the MeasureString methods that takes a StringFormat, and pass GenericTypographic. Also, ensure the TextRenderingHint for the Graphics is AntiAlias.
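For reference, a rough sketch of the MeasureCharacterRanges approach quoted above, assuming graphics and font are valid GDI+ objects created elsewhere; the helper name MeasureTight is just for illustration:
#include <windows.h>
#include <gdiplus.h>
using namespace Gdiplus;

RectF MeasureTight(Graphics* graphics, const Font* font, const WCHAR* text, INT length)
{
    graphics->SetTextRenderingHint(TextRenderingHintAntiAlias);

    // Clone the typographic format and tell it which character range to measure.
    StringFormat format(StringFormat::GenericTypographic());
    CharacterRange range(0, length);
    format.SetMeasurableCharacterRanges(1, &range);

    RectF layoutRect(0, 0, 32767, 32767);
    Region region;
    graphics->MeasureCharacterRanges(text, length, font, layoutRect, &format, 1, &region);

    RectF bounds;
    region.GetBounds(&bounds, graphics);  // tight bounds of the measured glyphs
    return bounds;
}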
It's true that this is by design; however, the link in the accepted answer is actually not perfect. The issue is the use of floats in all those methods, when what you really want to be using is pixels (ints).
The TextRenderer class is meant for this purpose and works with the true sizes. See this link from msdn for a walkthrough of using this.
Passing StringFormat::GenericTypographic() will fix your issue:
graphics->MeasureString(L"PP", 2, font, Rect, StringFormat::GenericTypographic(), &Bounds2);
Pass the same StringFormat to DrawString().
It sounds like it might also be connected to hinting, based on this KB article: Why text appears different when drawn with GDIPlus versus GDI
TextRenderer was great for getting the size of the font. But in the drawing loop, using TextRenderer.DrawText was excruciatingly slow compared to graphics.DrawString().
Since the width of a string is the problem, you're much better off using a combination of TextRenderer.MeasureText and graphics.DrawString.
