While delving into CSS units I've encountered a definition of the reference pixel. However, I wasn't able to find a consistent and comprehensive description of its relation to the CSS pixel unit. I've done some research on this matter, yet it's still a little bit unclear to me.
1. Gathered information
1.1 A pixel definition
There are two distinct types/definitions of a pixel:
"Device pixel" — a single physical point on a display.
And:
CSS pixel — a unit most closely matching the reference pixel. [1]
Two parallel concepts sharing the same name certainly don't help to clear up the confusion.
I fully understand the purpose of introducing the second one, but I find its nomenclature misleading. The CSS pixel is classified as an absolute unit and:
"Absolute length units are fixed in relation to each other." [1]
The above statement seems pretty obvious for every unit except for the pixel. Following the W3C specification:
"For a CSS device, these dimensions are either anchored (i) by relating the physical units to their physical measurements, or (ii) by relating the pixel unit to the reference pixel.
(...) Note that if the anchor unit is the pixel unit, the physical units might not match their physical measurements. Alternatively if the anchor unit is a physical unit, the pixel unit might not map to a whole number of device pixels." [1]
Considering the aforementioned quotation I assume that absolute units are not all that absolute, since they may be anchored to the reference pixel.
1.2 The reference pixel
The reference pixel itself is actually an angular measurement [2]:
"The reference pixel is the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. For a nominal arm's length of 28 inches, the visual angle is therefore about 0.0213 degrees." [1]
[Image: the reference pixel illustrated as a visual angle at arm's length.]
Despite defining the reference pixel as a visual angle, we can further read:
"For reading at arm's length, 1px thus corresponds to about 0.26 mm (1/96 inch)."
Leaving inconsistencies aside, we are able to establish a value of the angle:
α = 2 · arctan((0.026 cm / 2) / 71 cm) ≈ 0.02098°
where:
α — a value of the visual angle
Thus the size of the displayed unit equals:
y = 2x · tan(α/2) = 2x · tan(0.01049°)
where:
y — a displayed unit size
x — a reading distance
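As a sanity check, here is a small JavaScript sketch of both formulas (the variable names are mine, and the values are the approximations used above):

// Visual angle of a 0.26 mm (1/96 in) pixel viewed from about 71 cm (28 in)
const pixelSizeMm = 0.26;
const armLengthMm = 711.2;                                    // 28 in
const alphaRad = 2 * Math.atan((pixelSizeMm / 2) / armLengthMm);
console.log(alphaRad * 180 / Math.PI);                        // ≈ 0.0210 degrees

// Size of the displayed unit at an arbitrary reading distance x (both in mm)
const displayedSizeMm = x => 2 * x * Math.tan(alphaRad / 2);
console.log(displayedSizeMm(711.2));                          // ≈ 0.26 mm at arm's length
console.log(displayedSizeMm(430));                            // ≈ 0.16 mm at ~43 cm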
Given the above formula, in order to calculate a unit size we first need to determine the actual reading distance. As it varies among users, it is categorised by a device's DPI.
1.2.1 DPI
For convenience, let's assume that DPI == PPI.
This measurement allows us to guess a display type.
Quick check:
IPhone 6 (4.7", 1334×750): 326 ppi;
Sony Bravia 4K (54.6", 3840×2160): 75 ppi.
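Those PPI figures follow from the resolution and the diagonal size; a minimal sketch of the calculation (assuming a rectangular panel):

// PPI = diagonal resolution in pixels / diagonal size in inches
const ppi = (widthPx, heightPx, diagonalInches) =>
  Math.hypot(widthPx, heightPx) / diagonalInches;

console.log(Math.round(ppi(1334, 750, 4.7)));   // ≈ 326 (iPhone 6)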
So, in general, the bigger the PPI, the closer to the screen a user sits. The table below [3] presents reading distance recommendations for devices with a particular DPI:
| Device               | DPI | Pixel size | Reading distance |
|----------------------|-----|------------|------------------|
| PC's CRT display     | 96  | ~0.2646 mm | ~71 cm           |
| Laptop's LCD display | 125 | 0.2032 mm  | ~55 cm           |
| Tablet               | 160 | ~0.159 mm  | ~43 cm           |
However, it's unclear to me how those distance values were obtained. Is the relation to DPI described with a function or is it just an empirical observation?
1.2.2 Device Pixel Ratio
The introduction of the Retina display complicated the matter even further. Its PPI tends to be roughly twice that of a non-Retina display, while the recommended reading distance remains the same.
Since a CSS pixel size doesn't necessarily correspond to a device pixel size, I assume that the unit size on a Retina display is first translated into a reference pixel size (expressed in device pixels) and then multiplied by the device pixel ratio. Is that correct?
1.2.3 Zooming
While zooming in, the displayed reference pixel size grows [4], so the implied viewing distance grows as well. It's quite counterintuitive, because it means that we are "stepping away" from the screen, not getting closer to it.
2. Questions
To sum up my doubts as questions:
How is a CSS pixel size calculated when the anchor unit is a physical unit?
How can the relation between DPI and reading distance be expressed as a formula?
How is a CSS pixel size calculated for non-standard, high-DPI/PPI devices such as printers and Retina displays?
Also, please correct me if my reasoning is invalid or I am missing something. Thanks for the replies.
3. References
[1] W3C Specification
[2] inamidst.com, Sean B. Palmer's site
[3] Mozilla Hacks
[4] quirksmode.org
I may be wrong, but I don't think it's possible for the CSS pixel to have a physical unit as an anchor.
Based on this article:
The px unit is the magic unit of CSS. It is not related to the current
font and also not related to the absolute units. The px unit is
defined to be small but visible, and such that a horizontal 1px wide
line can be displayed with sharp edges (no anti-aliasing). What is
sharp, small and visible depends on the device and the way it is used:
do you hold it close to your eyes, like a mobile phone, at arms
length, like a computer monitor, or somewhere in between, like a book?
The px is thus not defined as a constant length, but as something that
depends on the type of device and its typical use.
UPDATE: I was wrong. It is possible, just not currently implemented in any browser. Where that is the case, per the spec ("these dimensions are either anchored (i) by relating the physical units to their physical measurements"), 1px will be equal to 1/96th of a physical inch.
As for the relation between DPI and reading distance in the table, it assumes that at 96 DPI the reading distance is ~71 cm (28 in), and that the two values are inversely proportional, meaning a higher DPI results in a smaller reading distance.
From that it's easy to come up with a formula, e.g. for the 125 DPI row:
x = 125/96
y = 71/x ≈ 54.5 cm
where:
x - ratio between the second and the first DPI value
y - reading distance for the second DPI value
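In other words, the reading distance is assumed to be inversely proportional to DPI, anchored at 96 DPI ≈ 71 cm; a small JavaScript sketch of that assumption:

// Reading distance assumed inversely proportional to DPI, anchored at 96 DPI ≈ 71 cm
const readingDistanceCm = dpi => 71 * 96 / dpi;

console.log(readingDistanceCm(96));    // 71     (CRT row)
console.log(readingDistanceCm(125));   // ≈ 54.5 (laptop row, ~55 cm in the table)
console.log(readingDistanceCm(160));   // ≈ 42.6 (tablet row, ~43 cm in the table)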
For higher-resolution devices there is an example given further down in the Mozilla Hacks article:
Let’s take iPhone 4 as the most famous example. It comes with a 326
DPI display. According to our table above, as a smartphone, it’s
typical viewing distance is 16.8 inches and it’s baseline pixel
density is 160 DPI. To create one CSS pixel, Apple chose to set the
device pixel ratio to 2, which effectively makes iOS Safari display
web pages in the same way as it would on a 163 DPI phone.
So that means we have two resolutions: the physical one (PPI) and the CSS one (cssppi). It seems that cssppi is used for calculating the reference pixel size, and device manufacturers then choose how many device pixels they will map to one CSS pixel (I assume this number is equal to the device pixel ratio, but I'm not 100% sure).
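If that interpretation is correct, cssppi is simply the hardware PPI divided by the device pixel ratio; for the iPhone 4 example quoted above:

// Assumed relation: cssppi = hardware PPI / device pixel ratio
const cssPpi = (hardwarePpi, devicePixelRatio) => hardwarePpi / devicePixelRatio;

console.log(cssPpi(326, 2));   // 163, matching the "163 DPI phone" in the Mozilla Hacks quote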
Here's a table comparing pixel ratio, PPI, and cssppi for some common devices: http://mydevice.io/devices/
For more information and references check the following articles:
A List Apart - A Pixel Identity Crisis
Pixels Per Inch Awareness and CSS Px
Physical Size of CSS Units On Smartphones, Tablets & Co
IN THE PAST
1 CSS pixel = 1 pixel on a 96 DPI monitor
(common on old Windows monitors)
TODAY 1 CSS pixel = (???) pixels on a newer high-DPI display (smartphones, newer monitors, TVs, etc.)
There are now two different interpretations of the pixel: the past one and the present one. That's the source of the confusion. But a "CSS pixel" has always been a CSS pixel. It was renamed a "reference pixel" only recently, to help explain how a CSS pixel is recalculated into "device pixels" on newer HD displays. The old CSS pixel was closely associated with the physical units on screens and printers prior to 2011. At least, the W3C recommendation was asking vendors to follow that spec. It was not until the invention of the iPhone and later HD device screens that the "1 CSS pixel = 1/96th of an inch on a screen" interpretation failed and was abandoned.
The solution was to allow manufacturers with HD screens to "reinterpret", as they like, the "CSS pixel" or "reference pixel" into a new "device pixel" only they control. So now, how many CSS pixels should fit into a given physical dimension on a screen, and its effect on viewers of newer digital devices, has changed. The new "device pixel" that tries to follow the 1 CSS pixel = 1/96th of an inch rule (modified by user viewing distance for smaller screens) no longer applies because screen PPI, or "pixel density", has increased.
On most newer HD screens today, the CSS pixel is recreated across multiple device pixels depending upon the size and PPI of the device. A 72 dpi image on an old Windows monitor would look tiny on a modern iPhone screen if it did not do that. But you will notice each phone and screen now has its own pixel density (PPI or DPI), screen resolution (pixel width x height), its own physical viewport size (inches), its own default zoom and font resizing systems, its own layout rules (landscape, etc.), and thus its own rules for how it reinterprets the CSS pixel into device pixels (2, 3, 4, etc.). You ask what is the formula? I say there is no formula. The industry, in my opinion, does not follow any known standard and has not since abandoning W3C standards after 2010.
Below is some history that may help...
HISTORY OF THE CSS PIXEL
Years ago in the Golden Age of Web Development (pre-2010), defining an "inch" (in) in Cascading Style Sheets was correctly correlated to an inch on the screen or in print, so that 96 pixels per inch, viewed by a reader at arm's length, always gave the same experience. That was why and how the 96 dots per inch standard was invented. It was designed to force manufacturers to give users the same visual experience with the almighty digital or CSS pixel.
Starting in the 1990's, these CSS standards fell on the shoulders of computer OS manufacturers and their monitors. When it was done right on old Windows and Mac screens prior to 2011, they mapped 96 DPI correctly. You saw Macs and Windows computer screens correctly laying the right pixels per inch onto the screen, so that the CSS "pixel" would fit consistently into it, regardless of manufacturer, vendor, screen, or device. There were visual differences as far as website displays between Macs and Windows due to their slightly different screen dimensions and pixel densities. But that was based on their views of how large or small visuals should be for their viewers. But that did not change the overall CSS standard.
After 2000, an "inch" of website in a browser looked close to an inch on most screens and worked that way for most viewers. Sure, you could increase resolution, or how many pixels could be crammed into a screen. But that did not change the 96 pixels per inch rule, as it worked on the natural assumption a user might choose larger or small text or DPI from the default. This was the same as zoom back then. But after 2000, the CSS pixel and millions of supporting websites fit perfectly on millions of desktop screens, devices, and printed pages.
Prior to 2007, most screens translated every web page and CSS pixel to the "per inch" expectation. So as developers, we initially tried to fit pixel-based layouts and our favorite font size choices into strict physical layouts using standard 96 dpi Windows monitors and 72 dpi Mac screens. Of course, after 2001, monitors started getting larger and resolution settings higher. But everyone knew an inch was still an inch, and a pixel was still a pixel. We could design website layouts to expand and grow based on that 96 DPI standard.
Macs used 72 DPI, and we always knew Mac people saw things a little differently. As a web developer in 2000, I remember using this system and it worked quite well. But when testing on Macs and Windows, I remember seeing a strange effect: websites looked slightly larger on Mac screens. Because 72 dpi was the lowest common denominator, that is what we used as the default resolution for everything, including sizes of images, web layouts in pixels, etc. That's why there's so much 72 dpi imagery on the web now. But this worked well for over a decade after 2000. CSS pixels worked as expected!
THE IPHONE
After the invention of the iPhone in 2007 the idea of the 96 DPI pixel started to change. Why? The resolution, or number of pixels per inch (PPI), increased dramatically on tiny screens. That meant at, say, 200 PPI, the general resolution of a high-density, small-screen iPhone, a normal website would look tiny and compressed, the text unreadable and the images less than half their expected size as seen on Windows 96 DPI monitors.
But that is the way the CSS pixel was supposed to work. The way CSS addressed these small-screen, high resolution issues was with the "handheld" media query.
I remember building a website using CSS "handheld" and shifting the pixels in my layout, even revealing higher-resolution images for iPhones. This worked initially, except for one simple fact: these new screens and devices REFUSED TO ADOPT THE CSS STANDARDS! Why? They wanted millions of websites to look good out-of-the-box without special CSS adaptations. This is a failure in my book, but c'est la vie.
So, by 2011 the W3C realized their great standards would never display consistently on these new vendor devices, whose makers wanted quick profits, not standards. So they flipped the definition to start at 96 DPI as a "reference" pixel, then allowed vendors to keep creating higher and higher resolution screens but "reinterpret" the CSS pixel to a "device pixel" they control.
After 2011, the W3C stopped expecting common displays, screens, and devices to correctly define what an absolute dimensional unit in CSS (cm, mm, in, etc) should look like on screens. This was because the vendors making HD screens started increasing resolutions so websites got tinier and tinier until the CSS standards broke. That was when the concept of the "device pixel" was born where devices recreate your CSS pixel by cramming multiple pixels into it.
What does this mean? It means every device automatically regenerates your website design and its reference CSS pixel to a new pixel size. In other words, your website becomes larger and more readable on these tiny screens by default. Your 1 pixel may be remapped to 2 pixels. Typical device pixel ratios are 2-4 now. Apple had little choice in how to translate CSS pixels when its high-resolution devices came on the scene. If they didn't adopt a new device pixel interpretation, we would all be looking at very teeny-tiny 72 dpi GIFs on 400+ dpi Retina displays and 4K monitors all over the web. We would have unreadable websites on millions of new modern devices.
You cannot control this. All you can do is avoid strict pixel-based layouts and PPI/DPI-dependent imagery and focus on layout flexibility. That is why I avoid pixels in 2022 and use text-based layouts that change with the user's font size, sizing with rem/em units rather than pixels. My websites now naturally stretch to fill whatever screen, phone, monitor, or TV they are shown on, based on the natural size of fonts and text set by the user.
But this is how and why the "device pixel" was born and why pixels now are very difficult to control. The device pixel simply holds double or triple the website pixels given it, in most cases now. It tries to figure out how many web design CSS pixels are crammed into one inch of their physical HD displays.
Kids today building websites just assume it's always been that way. They have no clue. That's why most experienced, modern web developers now use relative font sizing and avoid pixels as units for anything. Doing so, your layouts can stretch based on either the percentages needed to fill the screens and odd viewports used today, adapt to a user's larger or smaller font sizes, use a user's zoom settings, or just use the default device pixel settings the device uses out of the box. We aren't chained to precise pixel-based layouts any more because of this mess. Pages can stretch to fill what's there. But many kids developing websites still cling to the "CSS pixel" as if it's a god. It's not, guys. It's dead.
I think that is how we got here and why new screen technology destroyed what used to be a very simple and elegant website technology. Today it's a mess! Why? Because vendors hate standards. They never could come together as a tech industry and define it. There's no money in adopting any of it. So this website pixel translation problem will just get worse as newer devices come online over the next 50 years. By then CSS will likely adopt a "hack" that allows us to control how pixels display. In fact, in CSS3 you can already sniff for resolutions, set dimensions based on physical viewport sizes, mobile devices, device pixel settings, zoom, media queries, etc.
But those new ideas do not help fix the problem, in my book. They just layer over more complexity and confusion rather than forcing all screen and device manufacturers to use the same, simple CSS pixel, which should be universal across all screens, past and present.
Related
I want to display "natural size" pictures of goods on a web site.
First, of course, I calculate the size of the picture:
<Height of picture in pixels> * <real height of item in mm>
-----------------------------------------------------------
<height of item on the picture in pixels>
The formula is logically correct and it works fine on a desktop 17'' 4:3 monitor at 1280×1024.
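A minimal sketch of that approach (the selector and the numbers are made up for illustration); it sizes the image in CSS mm, which is exactly what breaks on other devices, as the answers below explain:

// Real-world height, in mm, represented by the whole photo
function photoHeightMm(photoHeightPx, itemRealHeightMm, itemHeightOnPhotoPx) {
  return photoHeightPx * itemRealHeightMm / itemHeightOnPhotoPx;
}

// Example: a 1200 px tall photo of a 150 mm item that spans 600 px of the photo
const img = document.querySelector('img.natural-size');     // hypothetical selector
img.style.height = photoHeightMm(1200, 150, 600) + 'mm';    // 300mm, but only if CSS mm were truly physical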
Every other device shows the wrong size.
I tested:
2 smartphones
22'' 1920×1080 desktop monitor
13.3'' 1366×768 notebook
14'' 1600×900 notebook
None of them has a 4:3 aspect ratio.
What's wrong with that? How can I reach my goal?
I searched the web, but the only workaround I found is to display a piece of A4-size paper and ask the user to zoom the page.
I'd rather ask the user to set up their system instead, but how?
Unfortunately, although methods exist to query the browser for the exact pixels per inch of its display, the browser vendors decided to agree to a convenient lie... all browsers report 96 pixels per inch. Although you can get a browser to report this fact to you, there is no way to get the real pixels per inch.
A famous example is that there is no way through JavaScript, HTML, or CSS to detect the difference between an iPad 2 and an iPad Mini, despite their radically different pixel densities.
Actually I find the approach of using the paper and asking the user to zoom to calibrate the "real displayed size" rather clever. If you make it easy to use, I'm sure they will appreciate this feature!
In theory, by CSS 2.1 specs, the mm unit (and similar units like cm and in) relate to physical units. So if you set an image width in mm units, browsers should scale the image to the given physical measure, with the accuracy allowed by the resolution of the device.
In reality, browsers behave more the way described in the CSS3 Values and Units CR. The section on physical units says that 1in equals 96px by definition, and on high-resolution devices like printers, the inch is the anchor unit, corresponding to real physical inch, whereas on lower-resolution devices like displays, the pixel is the anchor unit. It adds: “Note that if the anchor unit is the pixel unit, the physical units might not match their physical measurements.” (Besides, even in printers, the correspondence between CSS in and a physical inch is not necessarily exact.)
So, mission impossible.
For calibration by the user, I would not use an A4 paper. It’s large, and not everyone has A4 papers at hand, especially in countries with a different standard paper size. A ruler, with both inches and millimeters, would be better. And perhaps you could add a zooming widget to make the zooming easier.
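A rough sketch of such a calibration widget (all element IDs are made up): the user adjusts a slider until an on-screen bar matches 100 mm on a real ruler, and the page derives its own px-per-mm ratio:

// The user drags a slider until the bar is exactly 100 mm long against a physical ruler.
const bar    = document.getElementById('calibration-bar');      // hypothetical elements
const slider = document.getElementById('calibration-slider');
const done   = document.getElementById('calibration-done');

slider.addEventListener('input', () => {
  bar.style.width = slider.value + 'px';
});

done.addEventListener('click', () => {
  const pxPerMm = bar.getBoundingClientRect().width / 100;      // bar represents 100 mm
  console.log('CSS px per mm on this screen:', pxPerMm);
  // From here on, "physical" sizes can be set as (mm * pxPerMm) CSS pixels.
});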
Thanks to all! I combined all of the tips and made a picture:
http://www.clker.com/clipart-258249.html
The user takes one of the common objects and fits it to the picture by zooming.
Considering a CSS pixel is not a device pixel on high-DPI devices, predicting and managing touch target size across these devices has been a headache for me.
For example, say I have a web app with the viewport meta "width=device-width, initial-scale=1.0". On an iPad (9.7 inch screen) in landscape mode the viewport is set to 1024px, and 50px (in CSS) is roughly 1cm in physical size.
However, for a device such as the Nexus 7 (7 inch screen), the viewport would be set to 966px, thus 50px (in CSS) is only about 0.7cm in physical size. (Not to mention a growing list of high-DPI devices that I may never get my hands on.)
Different guidelines vary on the recommended touch target size, but I tend to prefer around 1cm to allow for human error.
Is there a best practice for such a scenario? The idea described in Let's Get Physical (Units) is the closest I found via Google, but it is far from production-ready.
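For reference, the arithmetic behind those two figures, assuming the layout viewport spans the full screen width in landscape (screen widths are my approximations derived from the 4:3 and 16:10 aspect ratios):

// Physical size of N CSS px, given screen width in inches and layout viewport width in CSS px
const cssPxToCm = (cssPx, screenWidthIn, viewportPx) =>
  cssPx * screenWidthIn * 2.54 / viewportPx;

console.log(cssPxToCm(50, 7.76, 1024));   // ≈ 0.96 cm (9.7" 4:3 iPad, landscape)
console.log(cssPxToCm(50, 5.94, 966));    // ≈ 0.78 cm (7" 16:10 Nexus 7, landscape)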
Start using em or percentages instead of px.
This is probably going to be what pushes everyone over the line.
I need to put a div in my site that is exactly 25cm (~10 inches) wide on every display. How can I do it?
You can simply use the cm unit in CSS:
#mydiv { width: 25cm; }
Note that, as others pointed out, the result still depends on the correct reading of the monitor size by the operating system.
See the spec for more information.
How can I do it?
You can't. Update: apparently, you can on many modern systems: check out @Tomas's answer. It seems not to be entirely reliable in all cases, though.
Old answer: You can't. Monitors display different numbers of pixels. The pixel size varies wildly from monitor to monitor.
There are ways to interpolate the pixel size if you know the monitor size. This information is sometimes available to the operating system; however, it is impossible for a web site to get hold of this information.
The only way to go would be to have the user do a calibration. For example, ask the user to hold an A4 piece of paper to the monitor, and use a draggable ruler to determine the area it covers. Using that information, you can then calculate how many pixels you will need to show 25 centimeters.
Update: @Tomas claims in his answer that using CSS cm values works on screen.
This is in fact true on my Windows 7 machine and 23" Plug&Play TFT monitor (1920x1080 pixels): 21cm translates perfectly to the short side of an A4 sheet of paper in Chrome 7, IE6(!), IE7, and Firefox 3.6.
It doesn't seem to be entirely reliable, though: @Yi Jiang can't get it to work on a TFT using Ubuntu Linux; also, older monitors may not send their size information, so it will be impossible for the OS to determine the correct size.
Here's a simple JSFiddle for testing.
You can't. A program can only get the true physical dimensions of a screen by interrogating EDID, as values returned by the Windows API are not reliable. A program can get the true values for resolution (e.g. 1280 x 1024) and screen dpi, but browsers can't do any of this by themselves.
There is a constant confusion between the "physical dpi" of a screen and the "screen dpi". The physical dpi, more properly called pixels per inch, is obtained by dividing the maximum pixel width of the screen by the physical (ruler) width in inches. The pixels per inch are fixed by the manufacturing process. The screen dpi is a number that the user can set via the Control Panel, and its only purpose is to convert a value in inches into a number of pixels. The user-settable screen dpi value has no direct relationship whatever with the physical dpi (pixels per inch) and is just a number with a default value of 96. There is nothing magic about 96, or 120.
Number of screen pixels = number of inches x screen dpi
It's as simple as that.
The reason 21 cm on a 23 inch monitor at 1920 x 1080 "translates" to the width of an A4 sheet (21 cm) is because with a 23 inch diagonal the screen width is 20.05 inches and at 1920 pixels across the pixel density is 95.76 pixels per inch.
With screen dpi default value of 96 then for one inch: pixels = 1 x 96 = 96 pixels
The pixel density of the 23 inch screen is 95.76 pixels per inch which matches the number of pixels you get, when specifying a length of one inch, with the default screen dpi value of 96.
If screen dpi is changed in the Control Panel, or the monitor video resolution is changed, then 21 cm would not match the width of a sheet of A4.
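That calculation can be reproduced directly (assuming a 16:9 panel):

// 23" 16:9 monitor at 1920 x 1080
const diagonalIn = 23;
const widthIn = diagonalIn * 16 / Math.hypot(16, 9);    // ≈ 20.05 in
const physicalPpi = 1920 / widthIn;                     // ≈ 95.8 px per physical inch

// With the "screen dpi" left at its default of 96, one CSS inch asks for 96 px,
// which on this particular panel is almost exactly one physical inch (≈ 25.5 mm).
console.log(widthIn, physicalPpi, (96 / physicalPpi) * 25.4);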
You will need to get hold of the resolution of the display and the dot pitch of the monitor to be able to calculate this.
Given these two values you'll be able to calculate the number of pixels you need.
However, you can't get hold of this information from a web site.
Given that you state it's a <div> in a site, we know you're in a web browser environment.
Sadly for you, the web browser doesn't have any way to find out the screen's DPI. You can find out what the screen resolution is, so you'll know whether the user has 1024x768 or whatever, but you'll never know whether those 1024x768 pixels are being displayed on an iPhone-sized screen or a billboard, or anything in between.
Sorry about that.
I believe the layout engine would need to know three things to make this possible:
Screen resolution
DPI
Physical monitor size
As far as I'm aware, it doesn't know all three.
What is the difference between pt and px in CSS? Which one should I use and why?
px ≠ Pixels
All of these answers seem to be incorrect. Contrary to intuition, in CSS the px is not pixels. At least, not in the simple physical sense.
Read this article from the W3C, EM, PX, PT, CM, IN…, about how px is a "magical" unit invented for CSS. The meaning of px varies by hardware and resolution. (That article is fresh, last updated 2014-10.)
My own way of thinking about it: 1 px is the size of a thin line intended by a designer to be barely visible.
To quote that article:
The px unit is the magic unit of CSS. It is not related to the current font and also not related to the absolute units. The px unit is defined to be small but visible, and such that a horizontal 1px wide line can be displayed with sharp edges (no anti-aliasing). What is sharp, small and visible depends on the device and the way it is used: do you hold it close to your eyes, like a mobile phone, at arms length, like a computer monitor, or somewhere in between, like a book? The px is thus not defined as a constant length, but as something that depends on the type of device and its typical use.
To get an idea of the appearance of a px, imagine a CRT computer monitor from the 1990s: the smallest dot it can display measures about 1/100th of an inch (0.25mm) or a little more. The px unit got its name from those screen pixels.
Nowadays there are devices that could in principle display smaller sharp dots (although you might need a magnifier to see them). But documents from the last century that used px in CSS still look the same, no matter what the device. Printers, especially, can display sharp lines with much smaller details than 1px, but even on printers, a 1px line looks very much the same as it would look on a computer monitor. Devices change, but the px always has the same visual appearance.
That article gives some guidance about using pt vs px vs em, to answer this Question.
Here you've got a very detailed explanation of their differences
http://kyleschaeffer.com/development/css-font-size-em-vs-px-vs-pt-vs/
The gist of it (from the source):
Pixels are fixed-size units that are used in screen media (i.e. to be read on the computer screen). Pixel stands for "picture element" and as you know, one pixel is one little "square" on your screen.
Points are traditionally used in print media (anything that is to be printed on paper, etc.). One point is equal to 1/72 of an inch. Points are much like pixels, in that they are fixed-size units and cannot scale in size.
Have a look at this excellent article at CSS-Tricks:
px – em – % – pt – keyword
Taken from the article:
pt
The final unit of measurement that it is possible to declare font sizes in is point values (pt). Point values are only for print CSS! A point is a unit of measurement used for real-life ink-on-paper typography. 72pts = one inch. One inch = one real-life inch like-on-a-ruler. Not an inch on a screen, which is totally arbitrary based on resolution.
Just like how pixels are dead-accurate on monitors for font-sizing, point sizes are dead-accurate on paper. For the best cross-browser and cross-platform results while printing pages, set up a print stylesheet and size all fonts with point sizes.
For good measure, the reason we don't use point sizes for screen display (other than it being absurd), is that the cross-browser results are drastically different:
px
If you need fine-grained control, sizing fonts in pixel values (px) is an excellent choice (it's my favorite). On a computer screen, it doesn't get any more accurate than a single pixel. With sizing fonts in pixels, you are literally telling browsers to render the letters exactly that number of pixels in height:
Windows, Mac, aliased, anti-aliased, cross-browsers, doesn't matter, a font set at 14px will be 14px tall. But that isn't to say there won't still be some variation. In a quick test below, the results were slightly more consistent than with keywords but not identical:
Due to the nature of pixel values, they do not cascade. If a parent element has an 18px pixel size and the child is 16px, the child will be 16px. However, font-sizing settings can be used in combination. For example, if the parent was set to 16px and the child was set to larger, the child would indeed come out larger than the parent. A quick test showed me this:
"Larger" bumped the 16px of the parent into 20px, a 25% increase.
Pixels have gotten a bad wrap in the past for accessibility and usability concerns. In IE 6 and below, font-sizes set in pixels cannot be resized by the user. That means that us hip young healthy designers can set type in 12px and read it on the screen just fine, but when folks a little longer in the tooth go to bump up the size so they can read it, they are unable to. This is really IE 6's fault, not ours, but we gots what we gots and we have to deal with it.
Setting font-size in pixels is the most accurate (and I find the most satisfying) method, but do take into consideration the number of visitors still using IE 6 on your site and their accessibility needs. We are right on the bleeding edge of not needing to care about this anymore.
A pt is 1/72th of an inch and is a useless measure for anything that is rendered on a device which doesn't calculate the DPI correctly. This makes it a reasonable choice for printing and a dreadful choice for use on screen.
A px is a pixel, which will map on to a screen pixel in most cases.
CSS provides a bunch of other units, and which one you should choose depends on what you are setting the size of.
A pixel is great if you need to size something to match an image, or if you want a thin border.
Percentages are great for font sizes as, if you use them consistently, you get font sizes proportional to the user's preference.
Ems are great when you want an element to size itself based on the font size (so a paragraph might get wider if the font size is larger)
… and so on.
pt is an abbreviation of "point", which historically was used in print typefaces, where size was commonly "measured" in "points"; 1 point is approximately 1/72 of an inch, and thus a 72-point font would be 1 inch in size.
EDIT: Note to clarify
There are approximately 72 (72.272) points in one inch or 2.54 cm. The point was first established by the Milanese typographer, Francesco Torniella da Novara ( c. 1490 – 1589) in his 1517 alphabet, L'Alfabeto. (you can search for various references to those)
px is an abbreviation for "pixel" which is a simple "dot" on either a screen or a dot matrix printer or other printer or device which renders in a dot fashion - as opposed to old typewriters which had a fixed size, solid striker which left an imprint of the character by pressing on a ribbon, thus leaving an image of a fixed size.
Closely related to point are the terms "uppercase" and "lowercase", which historically had to do with the selection of the fixed typographical characters, where the "capital" characters were placed in a box (case) above the non-capitalized characters, which were placed in a box below, and thus the "lower" case.
There were different boxes (cases) for different typographical fonts and sizes, but still an "upper" and a "lower" case for each of those.
Another term is the "pica" which is a measure of one character in the font, thus a pica is 1/6 of an inch or 12 point units of measure (12/72) of measure.
Strictly speaking, on computers the measurement is 4.233mm or 0.166in, whereas the old (American) point is 1/72.27 of an inch and the French one is 4.512mm (0.177in); thus my statement of "approximate" regarding the measurements.
Further, typewriters as used in offices had either an "Elite" or a "Pica" size, where the size was 10 and 12 characters per inch respectively.
Additionally, the "point", prior to standardization was based on the metal typographers "foot" size, the size of the basic footprint of one character, and varied somewhat in size.
Note that a typographical "foot" was originally from a deceased printers actual foot. A typographic foot contains 72 picas or 864 points.
As to CSS use, I prefer to use EM rather than px or pt, thus gaining the advantage of scaling without loss of relative location and size.
EDIT: Just for completeness, you can think of EM (em) as a unit of measure equal to one font height; thus 1em for a 12pt font would be the height of that font and 2em would be twice that height. Note that for a 12px font, 2em is 24 pixels. So 10px is typically 0.63em of a standard font, as "most" browsers use 16px = 1em as the standard font size.
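A small sketch of that arithmetic (assuming the common 16px browser default):

// em <-> px, given the element's computed font size in px
const emToPx = (em, fontSizePx = 16) => em * fontSizePx;
const pxToEm = (px, fontSizePx = 16) => px / fontSizePx;

console.log(emToPx(2, 12));   // 24    (2em of a 12px font)
console.log(pxToEm(10));      // 0.625 (≈ the 0.63em mentioned above)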
Yes, "px" means "pixel"
Now that I said it, I can already hear an army of clairvoyants approaching, with "px has nothing to do with pixels" on their banners. They're so proud of knowing better that they look up every comment containing the original truth and explain in detail that it's false, incorrect, misleading, etc.
And yes they have a point - a very specific point in time, actually, called iPhone 4.
Here's what happened.
The peaceful days
Before Retina displays, one pixel was one pixel. Because that's how it should be, according to human logic. You put a single pixel line on the screen, you magnify the hell out of it, and there you go: it's exactly ONE PIXEL wide. On many hardwares, modern ones included, this is still the case, so it's everything but "incorrect" to say 1px = 1 pixel.
But.
Back then, one day, iPhone 3 was followed by iPhone 4, doubling the resolution in both X and Y, and the developers of Safari worried that all webpages would look ridiculous, especially because many web developers relied on the steady 320x480 resolution. So just creating a 640x960 pixel area would have killed most of the sites. And at this point, someone had the billion-dollar idea to introduce a magical beast: a CSS feature called -webkit-min-device-pixel-ratio - on iPhone 3 it was 1, and on iPhone 4 it was set to 2 by default. Meaning "1 CSS pixel now means 2 screen pixels". A very ugly hack to keep websites looking somewhat intact - it worked at that time, with the very small cost of some images looking a bit blurry, but in the long run it caused this worldwide misunderstanding of poor old px, who actually did nothing wrong.
So then: pt or px?
On screens, use px because on many, many displays it will mean ONE PIXEL. The biggest advantage of using pixels is they look crispy; even if 1px means 2 or 3 physical picture elements, whatever you draw will start at the boundary, not somewhere in between. This is very important. Watch any browser animation that includes text, especially size transitions: when you increase a div to double size, but slowly, you'll see how your browser recalculates its pixels and redraws the font when the animation is done. There's a temporary image of the area which is a little blurry (to make the animation itself smoother), and then, after reaching its final state, a more exact image is calculated. See this CodePen.
1px is always an integer multiple of hardware pixels; that is, unless your operating system is being smart like resizing your whole desktop to sqrt(2) x PI. Or just 125%, yes, hello Windows on tiny laptops. But anyway, with px you have the highest chance to align your things with the physical grid.
What about pt? The funny thing about pt is it's actually translated to px, so it's just a worse way to specify pixel sizes. Here's a calculator. Points (since they come from the print world) start to make sense when you print something, but today there are better alternatives, depending on what you need - so tbh, points are almost never needed.
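The translation it refers to is fixed by the spec: 1in = 96px = 72pt, so converting is pure arithmetic. A tiny helper, for illustration:

// CSS defines 1in = 96px = 72pt
const ptToPx = pt => pt * 96 / 72;    // 1pt = 1.333...px
const pxToPt = px => px * 72 / 96;    // 1px = 0.75pt

console.log(ptToPx(12));   // 16 (a 12pt font computes to 16px)
console.log(pxToPt(16));   // 12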
TL;DR
For screens, use px whenever possible.
Mozilla's documentation from elementFromPoint explains that the coordinates are not in physical pixels, but "CSS pixels". Exactly what are CSS pixels? I was under the impression that a pixel in CSS was the same as the physical pixel.
If this is not the case, how does one convert between physical and CSS pixels?
A pixel is a physical screen pixel as far as any web page can reasonably concern itself.
The wording ‘CSS pixel’ refers to a pixel as defined in CSS 2
the whole number of device pixels that best approximates the reference pixel. It is recommended that the reference pixel be the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. For a nominal arm's length of 28 inches, the visual angle is therefore about 0.0213 degrees.
What this is saying is that a CSS pixel is a device pixel for the normal, simple screen cases. However, for specialist super-high-res screens where the OS scales up its normal dimensions by two, a CSS pixel may be two device pixels wide.
We now also have a ‘zoom’ feature in most browsers, which will naturally change the size of the CSS pixel (along with all other units) so it doesn't match a device pixel.
As I'm sure you know, CSS provides a number of different units for representing length. Some are based on physical, real-world measurements (inches, millimeters, points) while others are relative to something else (em width, percentage).
But pixels are neither of these. Originally, they were (as you assumed) merely the smallest addressable dot on a user's screen. However, this is problematic for a number of reasons:
The renderer may actually be using sub-pixel positioning to avoid rounding errors.
The output device may not have pixels - a ten-dot font on a 600dpi printer is unlikely to reflect what the designer actually wanted.
Similar to printing, pages designed for common, low-resolution displays (72-96dpi) may be unreadable on high-resolution displays.
Modern browsers offer powerful tools to scale / magnify pages.
And so CSS pixels are a useful abstraction: they have well-defined relations to other measurements (at least, within a given browser...) and thus page designers can rely on the results looking reasonably close to their designs even when the browser must change the relationship to actual, device pixels.
To answer your second question, you don't convert between physical and CSS pixels. That would defeat the whole point by destroying an abstraction that the renderer needs in order to operate properly. Gecko does provide a way to determine the relationship, but only to chrome scripts - normal web pages should remain blissfully unaware...
Further reading: Units Patch Landed
Conversion can be done with window.devicePixelRatio, which is now supported by all major browsers.
See https://developer.mozilla.org/en-US/docs/Web/API/Window/devicePixelRatio
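For example, a page can read the current ratio and convert between CSS and device pixels (note that the ratio also reflects page zoom in most browsers):

// Device pixels per CSS pixel for the current display and zoom level
const dpr = window.devicePixelRatio || 1;

const cssToDevicePx = cssPx => cssPx * dpr;
const deviceToCssPx = devicePx => devicePx / dpr;

console.log(cssToDevicePx(100));   // 200 on a 2x "Retina" display, 100 on a classic 1:1 display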
In most cases, yes, CSS pixels are the same as physical pixels. According to the CSS 2.1 Specification:
Pixel units are relative to the resolution of the viewing device, i.e., most often a computer display. If the pixel density of the output device is very different from that of a typical computer display, the user agent should rescale pixel values.
Typically, 1 pixel on the device will be the same as 1 pixel in CSS. On a device with a very high pixel density, you might find that its CSS pixel size is actually 2 physical pixels wide, but I don't know of any devices that do so. Not even the iPhone 4, with its Retina Display, will adjust its CSS pixel size.
As pointed out by Shog9, most browsers' zoom features will adjust the scale of the pixels being displayed, but in most cases, CSS pixels will be the same as the physical device.