The baseline of 96ppi is nominal only. Form factor and intended distance from the screen matter a lot. In the laptop form factor, you’re aiming for more like 110–125ppi as 1×. Apple laptops range from 221–254ppi as 2×.
186ppi is designed for 1.5×, an uncomfortable space that makes perfection difficult-to-impossible, yet seems to have become unreasonably popular, given how poorly everything but Windows tends to handle it. (Microsoft have always had real fractional scaling; Apple doesn’t support it at all, downsampling; X11 is a total mess; Wayland is finally getting decent fractional scaling.)
Apple's HiDPI is "2x scaled" on Retina and >= 4k displays. But you can still pick a virtual resolution that isn't exactly 0.5x your display's native resolution, and it will look great.
For example, my external monitor is 3840x2160, and has a default virtual resolution of "1920x1080", but I run it at "2304x1296". My 14" MBP display has a default virtual resolution of "1512x982", but I run it at "1352x878". Neither looks scaled, and neither has a slow display, weird fonts, or weird graphics. I never even really think about it. In other words, light years beyond the experience on Ubuntu and on Windows.
You omitted the next word in your quote, where I mentioned what Apple does—downsampling.
Your displays are high enough resolution that you may not notice the compromises being made, especially if you don’t get an opportunity to compare it with real fractional rendering, but the compromises are real, and pretty bad at lower resolutions. Pixel-perfect lines are unattainable to you, and that matters a lot in some things. And you might be shocked at how much crisper and better old, subpixel-enabled text rendering is on that same display.
Apple was in the position to do it right, better than anyone else. They decided deliberately to do it badly; they bet big on taking typical resolutions high enough that downsampling isn’t normally needed (though they shipped hardware that always needed such downsampling for some years!), and isn’t so painful when it is needed; and they’ve largely got away with it. I still disagree with them.
As for 1352×878, what on earth is that number for a native 3024×1964 panel!? 3024 ÷ 1352 ≈ 2.237. It’s like they’re gloating about not caring about bad numbers and how terribly inconsistent they’re going to make single-pixel lines.
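To make the inconsistency concrete, here's a rough sketch under the simplifying assumption of a straight 1352-to-3024 mapping (the real pipeline renders at 2x and then resamples, but the unevenness is of the same kind): adjacent one-virtual-pixel lines end up fully covering a varying number of physical pixel columns.

    import math

    # Simplified model: map virtual columns straight onto physical columns
    # at the 1352 -> 3024 scale factor (~2.237). Not the actual macOS
    # pipeline, just an illustration of where the fractions land.
    scale = 3024 / 1352
    for x in range(5):                              # five adjacent 1-virtual-px lines
        left, right = x * scale, (x + 1) * scale
        full = math.floor(right) - math.ceil(left)  # physical columns fully covered
        print(f"virtual column {x}: physical {left:6.2f}-{right:6.2f}, {full} fully covered")

Columns 0 and 4 fully cover two physical columns, columns 1-3 only one, with the remainder smeared across partially covered pixels.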
> Your displays are high enough resolution that you may not notice the compromises being made, especially if you don’t get an opportunity to compare it with real fractional rendering, but the compromises are real, and pretty bad at lower resolutions. Pixel-perfect lines are unattainable to you, and that matters a lot in some things. And you might be shocked at how much crisper and better old, subpixel-enabled text rendering is on that same display.
Do you have a test case where I can see this in action?
Nothing handy, sorry. For comparable results, you’d need to use an old version of Mac OS X. Up to 10.13, I think, if you can ensure subpixel text rendering is active.
Sorry, what I mean is, is there an image or PDF I can bring up that will show me imperfect lines on these displays?
As for rendering of text, there is definitely antialiasing in play. Subpixel rendering is no longer used, but I don't think you need it at these resolutions anyway. I'm not even sure what the subpixel arrangement of my display is (is it neat columns of R -> G -> B, or larger R and B with smaller but more numerous G? At 250-some PPI, the pixels are too small to notice or care!). But I agree that if I were using my old 1920x1200 monitor I would miss it.
Yeah the focus really should be on multipliers. Is it a clean multiple of the typical “normal” DPI resolution for that screen size? You’ve got a great screen. No? It’s a compromise. Simple.
1.5x looks ok mostly (though fractional pixels can cause issues in a few circumstances), but across platforms nothing is handled as well as 2x, 3x, etc is. I have a 1.5x laptop and wish it were either 1x or 2x.
The appropriate display scaling multiplier for this screen is 200% (2x), which is exactly why I regarded it as pretty much clearing even this bar. On Windows at least, you can only alter display scaling in 25% increments (this is also why application designers are asked to use display elements whose pixel dimensions are cleanly divisible by 4), so the closest fit for this laptop's PPI is exactly the 200% preset option.
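A quick illustration of that divisible-by-4 guidance, assuming Windows simply multiplies element sizes by the scale factor: only multiples of 4 land on whole pixels at every 25% preset.

    # Element sizes under the 25% scaling presets; only multiples of 4
    # stay on whole pixels at all of them (assumed straight multiplication).
    for size in (4, 5, 6, 7, 8):
        print(size, [size * f for f in (1.00, 1.25, 1.50, 1.75, 2.00)])
    # 4 -> [4.0, 5.0, 6.0, 7.0, 8.0]
    # 5 -> [5.0, 6.25, 7.5, 8.75, 10.0]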
Using a lower preset than this trades PPI for screen real estate. I don't think that's reasonable to introduce into the equation here. Yes, you match the relative size of display elements by virtue of (potentially!) being closer to the screen, but in turn you put more of the screen into your periphery, just like with a monitor or a TV. I don't think that's a fair comparison at all. An immersive distance (40° hfov) for this display is 37.1 cm (a foot and a bit) - I think that's about as close as people typically sit to their laptops already. This is pretty much the same field of view you'd ideally have at your monitor and TV too, so either you use this same preset on all of them, or we're not comparing apples to apples. Or you just really like to get closer to your laptop specifically, I suppose.
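For what it's worth, the 37.1 cm figure is just trigonometry; a small sketch, assuming a 12.2" 16:9 panel:

    import math

    # Viewing distance that gives a chosen horizontal field of view.
    # Assumes a 12.2" 16:9 panel (the laptop under discussion).
    diag_in = 12.2
    width_cm = diag_in * (16 / math.hypot(16, 9)) * 2.54          # ~27.0 cm
    hfov_deg = 40
    distance_cm = (width_cm / 2) / math.tan(math.radians(hfov_deg / 2))
    print(f"{distance_cm:.1f} cm")                                # ~37.1 cm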
Nah, look at laptop norms for the last decade and it’s clearly targeting 1.5×, not 2×. Even more so given how small it is: you’ll aim for a lower scaling factor because otherwise you can’t fit anything on the screen.
There's PPI and then there's PPD. If they want more PPD (which is the quantity that depends on field of view, and thus on viewing distance and display size), that's fine, but then it's not PPI they should be complaining about.
This might sound like a nitpick but I really don't mean it to be. These are proper well defined concepts and terms, so let's use them.
I wasn't thinking about the difference between PPI and PPD, so thanks for the clarification.
The bottom line is that I work with text (source code) all day long and I would rather read from a display with laser printer quality than one where I can see the pixels like an old dot matrix printer. Some displays are getting close to 300 DPI which is like a laser printer from 35 years ago.
I can definitely appreciate that. I just think it's important that people argue the right thing. It provides insight into the variables and mechanisms at play, and avoids people falsely giving rhetorical checkmates to each other, like I kind of did to you.
The brief version is that if someone has a screen real estate concern, they need to look for the PPI, but if they have a visual quality concern, they need to look for the PPD.
Maybe it will be elucidating if I describe a scenario where you will have low PPI but high PPD at the same time.
Consider a 48" 4K TV (where 4K is really just UHD, so 3840x2160). Such a display will have 91.79 PPI of pixel density, which is below even standard PPI (that being 96 PPI, as mentioned).
Despite this, the visual quality will be generally excellent: at the fairly typical and widely recommended 40° horizontal field of view, you're looking at 3840 / 40 = 96 PPD, well in excess of the original Retina standard (60 PPD), which is really just the 20/20 visual acuity measure. Hope this is insightful.
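In code, using the same flat-average method (horizontal pixels divided by horizontal field of view) as above:

    import math

    # 48" UHD TV from the example: pixel density vs angular density.
    w_px, h_px, diag_in = 3840, 2160, 48
    ppi = math.hypot(w_px, h_px) / diag_in       # ~91.79 PPI
    hfov_deg = 40                                 # recommended viewing angle
    ppd = w_px / hfov_deg                         # 96 PPD (flat average)
    print(f"{ppi:.2f} PPI, {ppd:.0f} PPD at {hfov_deg} deg hfov")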
But nobody knows what baseline PPD is (47) and you can't actually specify a laptop screen in PPD, you can only specify it in PPI. So I think it's reasonable and maybe even preferable to use PPI here.
I can understand finding it reasonable; it's just not getting at the heart of the problem.
It also introduces an element of uncertainty: as you say, you can't specify a laptop screen's PPD since that's dependent on viewing distance. But that's exactly the problem: it's dependent on viewing distance. Some people hunch over and look at their laptops up close and personal, others have it on a stand at a reasonable height and distance. To use PPI is to intentionally mask over this uncertainty, and start using ballpark measures people may or may not agree with without knowing.
To put it in context, for this display, "Retina resolution" (60 PPD), i.e. the 20/20 visual acuity threshold, is passed when viewed from 47.09 cm (18.54 inches, so basically a foot and a half). I don't know about you, but I think this is a very reasonable distance to view your laptop from, even if it's just 12.2" in diagonal. It corresponds to a horizontal field of view of 32°.
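The same flat-average method recovers those numbers; a sketch, where the panel dimensions (1920 pixels across a 12.2" 16:9 screen) are my assumption:

    import math

    # Distance at which the panel reaches 60 PPD (the 20/20 threshold),
    # with PPD taken as horizontal pixels / horizontal field of view.
    # Assumed panel: 1920 px across, 12.2" 16:9.
    w_px, diag_in = 1920, 12.2
    width_cm = diag_in * (16 / math.hypot(16, 9)) * 2.54            # ~27.0 cm
    hfov_deg = w_px / 60                                            # 32 deg for 60 PPD
    distance_cm = (width_cm / 2) / math.tan(math.radians(hfov_deg / 2))
    print(f"{hfov_deg:.0f} deg hfov, {distance_cm:.1f} cm")         # 32 deg, ~47.1 cm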
You could say it masks over the uncertainty in some ways, but it doesn't introduce that uncertainty. Asking for a laptop with 100PPD doesn't even make sense.
> Asking for a laptop with 100PPD doesn't even make sense.
Won't deny, since again, PPD depends on your field of view.
Yes, if you shop for "resolution and diagonal size", you may as well shop for PPI directly. This just doesn't generalize to displays overall (see my other comment with a TV example), as it's not actually the right variable. Wrong method, "right" result.
> The threshold for sharp edges is much finer, and the things we put on computer displays have a lot of sharp edges.
And the cell density is even finer. It was merely an example using a known reference value that lots of people would find excellent; I didn't mean to argue that it's the be-all end-all of vision. It's just 20/20.
PPI doesn't generalize across different types of display but it works pretty well within a category of monitor, laptop, tablet, phone. For TV you probably just assume it's 4K and figure out the size you like.
It's wrong but it's wrong in a way that causes minimal trouble and there's no better option. And if you add viewing distance explicitly, PPI+distance isn't meaningfully worse than PPD+distance, and people will understand PPI+distance better.
Eh, I suppose. Just the criterion of "is it hidpi? yes/no" readily misled GP, for example (i.e. it definitely is, just still "not hidpi enough"), so I felt it would be helpful if the mechanism at play was clarified. Maybe I came off too strong though. Felt it would be clearer to use the correct variable at least, than to try and relativize PPI.
I guess, but even without measuring pixel inches/degrees it feels clearly wrong to me to say that proper 1x on a 12 inch laptop screen is only 960x600. 1280x720 or 1280x800 makes more sense to me, and then there's no confusion because 1920 is a clear 1.5x resolution.
For what it's worth, it's a pretty small diagonal size. Netbooks used to be about this size, and those had exactly such low resolutions on them. Conversely, you'd see 1280x720, but especially 1366x768, more on regular variety laptops (~15"), and if you crunch the numbers for these (using standard ppi), it maps pretty much exactly right. So we've come a long way in Windows/Linux/BSD land, even if there's much more to go.
3840x2160@15.3" for example would be a nice even 3.0x display scale, at 287.96 PPI, and 128 PPD at 30° hfov to match the line pair resolving capability of the human eye [0] rather than the light dot resolving of 60 PPD, although of course still far from the 10x improvement over it via hyperacuity that you linked to earlier.
I accuse those 15 inch laptops of being below the bar. 15 inch should be 1600x900.
If 960xwhatever is okay at 12 inches, then 1366x768 wouldn't even be the baseline resolution for 15 inch laptops, it would be the baseline resolution for 17 inch laptops. That just sounds silly to me.
Assuming the laptop screen is just 20% closer goes a long way here to figuring out a good resolution. And it gives 720p to 12/13 inch laptops at 1x.
Windows' "real fractional scaling" gives me clipped window borders, maximized windows bleeding onto other screens, and fuzzy-looking applications. I'm curious if Apple's downsampling method works better, because I am not impressed with Microsoft's method.
Yes, it does. It always renders internally at 2x, which means that's all applications have to support. Then it downsamples the final framebuffer to the resolution of the display.
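Roughly, with the numbers from the earlier comment (a 3840x2160 panel run at a 2304x1296 "looks like" resolution), the arithmetic works out like this:

    # Sketch of the macOS approach: applications draw into a backing store at
    # 2x the chosen virtual resolution, and the compositor resamples that
    # buffer to the panel's native resolution.
    native = (3840, 2160)
    virtual = (2304, 1296)                         # chosen "looks like" resolution
    backing = (virtual[0] * 2, virtual[1] * 2)     # 4608x2592: what apps render at
    scale = native[0] / backing[0]                 # ~0.833: resample factor to the panel
    print(backing, f"-> resampled by {scale:.3f} to {native}")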