Font smoothing method should be set system-wide, and not be under the control of a remote web page. What does the remote web page know about the kind of monitor the text is being viewed on? It could have a non-standard pixel layout, be rotated 90 degrees, be hiDPI, or be scaled.
"Subpixel" rendering (a.k.a. cleartype) that abuses color fringing to provide higher luminance resolution only works when you have pixels made of three RGB columns. Mobile screens usually use a pentile pattern, so this trick is not just more difficult, it's actually impossible.
Why is that impossible on a Pentile matrix? Fundamentally subpixel antialiasing is just recognizing that the samples for different colour channels are offset from one another, and on that level of generality it doesn’t really matter how those samples are arranged geometrically. If the sample density depends on the channel, well, you’ll have to weight those samples, but still, how is that a problem?
I recognize that fitting that into a scanline rasterizer in a device that doesn’t really have samples arranged in scanlines might be a bit tricky and that there might be less urgency to squeeze the last dregs of resolution out of a ≥300 dpi display than out of a 96 dpi one, but I disagree that this can’t be done.
The reason it works on traditional flat panels is that there is a one-to-one correspondence between the image data and the panel layout. That is not the case with PenTile displays, and there are many different subpixel arrangements, so it is not possible to predict which subpixels on the display will light up for a given subpixel in the image data. For best results you also need fonts designed for subpixel rendering, so even if subpixel rendering were practical on PenTile displays, you would need to carefully tune the fonts for every possible PenTile subpixel arrangement.
Well, getting the hardware (manufacturers) to tell you what the hell it (their stuff) is doing is always a problem and even extremely complicated designs and massive marketing efforts like USB only kind of solve the problem for the most common of the common cases, but that’s kind of a truism.
Re your second point, I originally wanted to say something like “fonts aren’t, renderers are”, but in the meantime I read the Raster Tragedy[1] and it seems like the answer to your question is quite literally yes, hinting bytecode is in fact designed for a subpixel grid with a 1:3 aspect ratio (mostly by virtue of the font designer writing and checking it by feel using one). Shoehorning outline fonts and WYSIWYG layout onto low-resolution (≤ 150 dpi or so at 10 pt) displays is that much of a hack, and now I really, really want it to die, all the complaints about elitist designers notwithstanding. (That I find 200- and 300-dpi tablets and phones much more pleasant to work on is a mere coincidence, I assure you.)
As you mention they're not just offset differently, they're also at different resolutions. In theory you can achieve something similar by rendering at a higher resolution and downsampling with an appropriate filter, but then you're using a completely different algorithm that has nothing to do with the RGB "subpixel" cleartype rendering anymore.
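Roughly what I mean, as a toy sketch (a plain box filter for brevity; a real resampler would use a better kernel, and the factor of 4 is just a hypothetical choice):

    # Toy sketch: grayscale antialiasing via supersampling + box-filter downsampling.
    # It only ever produces one alpha value per pixel, so it makes no assumption about
    # where the colour channels physically sit - unlike RGB-stripe subpixel rendering.
    FACTOR = 4   # hypothetical oversampling factor

    def downsample_row(hires_row, factor=FACTOR):
        out = []
        for x in range(0, len(hires_row) - factor + 1, factor):
            out.append(sum(hires_row[x:x + factor]) / factor)   # average = box filter
        return out   # one coverage/alpha value per final pixel

    print(downsample_row([0, 0, 1, 1, 1, 1, 1, 0]))   # [0.5, 0.75]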
Also modern mobile phones have super high resolution screens so you don't need to bother.
That's clearly the future for desktops too. At some point I would imagine subpixel hinting gets disabled entirely in desktop OSes but we just haven't got there yet.
This is unfortunately true. Fonts without ClearType are ugly af on my 4K 27” display, and it’s very hard to find smaller (higher DPI) displays that support 4K, >60 Hz, and adaptive sync.
I suspect that 8K displays will be “good enough” and 16K will finally make this hack obsolete in a few years, but we’re not there yet.
Beauty is in the eye of the beholder. I personally can't stand any form of ClearType-style rendering -- the color fringing stands out terribly. I vastly prefer a somewhat blurrier but stable grayscale to a red-blue shimmer throughout the text.
Then again, I use a non-AA bitmap font in my editor, so yeah. Get off my lawn!
Eh, that’s fair. I’m colorblind, like 10% of males (and probably >20% of programmers), and I bet this defect in my eyes makes ClearType more bearable. I need that subpixel resolution. Hopefully future displays will satisfy a larger proportion of the population.
I wonder how much of this is due to Windows font rendering currently being so focused on ClearType. At 4K and 27", you don't need subpixel rendering anymore, because your regular pixels are small enough, but you do still need grayscale antialiasing. As far as I can tell, Windows no longer supports grayscale font smoothing — it's subpixel or nothing, and "nothing" still looks bad except for bitmap fonts being displayed at exactly the size they were designed for.
This is one reason why I don't like 4k monitors. The pixel density is just awkward. If I use 1 point = 1 pixel, then the UI is way too small. If I run at 1:2, the UI is too big and the screen is effectively a "retina 1080p". And if I try 1:1.5 then the smoothing artifacts are too visible.
I ended up solving this with a 5k display at 1:1.6. The smoothing artifacts are below my threshold of perception and all looks nice and smooth, while still having a 3200x1800 usable resolution. Downside: cost.
Better yet: HP’s Z27q display. It’s 27 inches and 5K, which is “Retina” to 2560x1440. And I’ve got two of ‘em (…and a 6-head GPU to accommodate their MST cables).
I just wish there was a 16:10 aspect ratio option.
UPDATE: The "real" 5K HP Z27q model (with 30-bit color and 5120x2880 pixels) was released around 2012-2013 and was discontinued by 2016, which is a shame because it's such an amazing monitor. I imagine they discontinued it because 5K operation requires two separate, hefty DP 1.2 cables in MST mode, which (at the time) the majority of computers just couldn't handle (I don't think Apple Macs can use it at all), and Windows' support for high DPI was awful-to-mediocre until Windows 10 came out in 2015 - but they've been fantastic for me. Oh, and it was bloody expensive: each one new was about $1500-2000 IIRC.
It seems HP has re-released the Z27q... but in name only: the current model is called the "Z27q G3", but it's now just a mid-tier 27-inch 2560x1440 IPS display with no special features beyond DisplayPort 1.4 in and out ( https://www.hp.com/us-en/shop/pdp/hp-z27q-g3-qhd-display ). Originally the "q" in Z27q stood for "quad", because the 5K (5120x2880) resolution really was 4x 1280x720 (the OG "high def" before 1920x1080 took over), but now the "q" stands for "QHD", which doesn't mean quad-HD dimensions, just 4x the pixel count (which isn't impressive at all, since pixel count scales with the square of linear resolution). Grumble.
Apple still has their own LG-made 5K monitors, but they don't play nice with Windows, and it's even worse with Apple's horrendously priced 6K monitor - but other than that, there are still very few options left for people wanting or needing a large desktop workspace beyond getting a 40-inch 4K display and running it in 96 dpi mode.
> feel free to use it to fix custom font rendering on Windows
And then you get the exact opposite problem of macOS people trying to force macOS rendering onto other platforms because they prefer it. If you're going to advise people not to mess with these settings, don't make it apply to just your platform. Just don't touch these things unless you're going for some kind of retro feel or are foolish enough to try to do your own font rendering.
Apple's weird "font smoothing" probably has more to do with the way glyphs are artificially altered for aesthetics, anyway. Stick to the platform defaults; it's what your customers are used to. Font rendering is made up of lots of subjective choices and personal preferences because there's no technically "correct" rendering. If one platform does it differently, let it be different; don't force down whatever your preference is because you use an iPhone/Windows tablet/Linux computer.
Funnily enough, Apple disagreed with this author and changed the way their subpixel algorithm worked years ago, so the author's preference clearly wasn't what the majority were expecting.
> Funnily enough, Apple disagreed with this author and changed the way their subpixel algorithm worked years ago, so the author's preference clearly wasn't what the majority were expecting.
Apple only changed it due to the prevalence of “retina” displays. When the pixel density of your screen is high enough, sub-pixel rendering stops making sense (your normal pixels are already small enough). You can just apply normal anti-aliasing and get the same result.
This also has the advantage of removing all the super special font rendering code from your code base: no need to percolate font data through your entire rendering pipeline just so you can do sub-pixel rendering in your final compositing and scaling step. Just push normal pixels and treat everything the same.
All of this doesn’t change the fact that there might be situations where sub-pixel rendering still makes sense. It’s just that macOS no longer supports it.
Yup, I have a 1080p monitor (SyncMaster BX2431) and it looks like shit when connected to my MBA. It kinda cleans up the blurriness if I increase the UI scaling, so it looks less shit, but it still looks shit.
> When the pixel density of your screen is high enough, sub-pixel rendering stops making sense
I wonder if the prevalence of Retina displays across the Apple product line coupled with the prevalence of rotatable displays from iOS devices made them go towards something consistent across the board:
90-degree rotations really don't play well with subpixel rendering, and that's compounded by having to handle four different orientations with a single font and have it display consistently on a single device, so it makes sense to go with simplicity and take the plain antialiasing route; and since you have Retina basically everywhere, well, just flick the switch everywhere as well.
I seem to recall (years ago, so memory may fail me) some blog post explaining how dropping subpixel rendering also allowed for some major cleanup in the Cocoa/CoreSomething rendering code.
> And then you get the exact opposite problem of macOS people trying to force macOS rendering onto other platforms because they prefer it.
This is not a matter of preference but of measurable quality. Fonts on macOS look super smooth, and most Windows laptops these days still ship with 1920x1080 displays which even with Cleartype enabled just looks like shit.
Linux is even worse, compared to a Mac a Linux desktop just looks like straight from the 90s (which is when most stuff regarding rendering seems to have frozen in time).
People using Macs generally value aesthetics, optical polish, UI consistency and elegance, a factor that is sadly lacking in the Windows world (there are ... five different ways of creating UI applications at least, six different applications to look for basic system configuration, ...) and completely absent in the Linux world (as a result of everything relating to Desktop Linux being underfunded and done by volunteer developers without UX design people).
>Linux is even worse, compared to a Mac a Linux desktop just looks like straight from the 90s
What are you smoking? Most modern Linux distros out of the box have the best font rendering of all OSes. Sure, if you install some bare-bones Openbox DE it will look bad, but GNOME and to a lesser extent KDE look night and day better than Windows and macOS.
Been a Mac user for decades, agree that Mac font rendering is preferable to Windows defaults. But also agree here that Linux can be very good too. Out of the box Ubuntu (since at least 16.04 or so) looks quite nice. Good enough that people that don’t like MacOS font rendering complain about it being too Mac-like.
I use both Windows and Linux on a regular basis, and I have to say that font rendering on Linux is almost always much better than on Windows. I frequently notice and am bothered by font rendering on Windows, even after running the ClearType tuning tool on both of my monitors.
I'm one of those people, and I'm not even a Windows user. What I like about them is that on crappy low resolution displays they are sharp. This seems less the case with Win10 though.
Yes, I know about the whole font shape debate in Windows vs Mac rendering. And I don't care one bit.
When I'm using the computer as a tool to read text all day, I want sharpness above all else. I don't care if the letters aren't exactly as their designer wanted them to be.
Have to agree. Font rendering is one of the top reasons I don't favor the Mac experience at all. At least Windows and Linux give me options, and honestly Windows is the only one so far I believe gets the fluctuation in DPIs 'right' by giving me some control over it. I love being able to use a 4K TV as an extra monitor, regardless of size.
Macs? Sorry; Apple says one size fits all these days, and you will likely have to shell out money to get it looking 'right' on an external display (i.e. higher dpi). Normally, this means just embracing the full Apple ecosystem of peripherals, which is only complicated further given Apple appears to have stopped producing external displays years ago.
> Macs? Sorry; Apple says one size fits all these days, and you will likely have to shell out money to get it looking 'right' on an external display (i.e. higher dpi). Normally, this means just embracing the full Apple ecosystem of peripherals, which is only complicated further given Apple appears to have stopped producing external displays years ago.
You can use any 4k screen with a Mac, and it will work well. Before I switched to Linux full-time, I used to use a 24", 4K Dell on my MBP, and it was glorious. The screen wasn't even all that expensive, around €300 3 years ago, if memory serves.
That puts you at roughly 180 dpi, which isn't that far from the 'retina' standard of roughly 220 dpi. But I imagine I can get you sharp text on a lowly 1080p 20 inch display with Windows or Linux at only 110 dpi (or worse) for roughly half the cost at today's prices.
I don't care if they're "exactly as their designer wanted them to be" — I don't want them to look weak and pixelated. I want my screen to appear as close as possible to print, with text equally easy to read. Windows does not provide such an experience.
Point is we both want different things, and doing something "in between" would look terrible for both. A choice has to be made.
The issue is that you cannot obtain rendering on a low resolution grid that is both sharp and close to the shape on print [0]. As such, given this constraint, I prefer sharpness rather than fidelity.
In my personal case, I actually use Linux on a high dpi display, so I can get both.
But when I have to use a low resolution screen, like at work, I would actually prefer a font that is designed with these constraints in mind, such that it looks both good and sharp. For this I like bitmap fonts, but they seem to be few and far between these days. Overly fancy fonts need to be beat into shape with hinting, but then the flow looks all broken. Or else, they're a blurry mess.
Terminus works great for my monospaced needs and Calibri (from Windows) seems to have bitmaps for low size fonts that look fairly pleasant.
--
[0] I'm talking mostly of small-sized text, like for interface widgets. For larger font sizes, it seems easier to get a decent result even with fancier fonts.
… not to be confused with the macOS concept that it has misnamed “font smoothing” which is absolutely nothing to do with font smoothing, but is instead glyph dilation—making fonts thicker than the designers intended them and than all other platforms (and anyone that turns it off in macOS’s settings) will get. I believe this to be a significant factor in many people using fonts at weight 300 at body sizes. See https://news.ycombinator.com/item?id=23553486 for some more details.
Stem darkening is basically necessary to some extent for most renderers, for multiple different reasons depending on the implementation, otherwise glyphs come out far too thin. There's no "right" answer here.
There is a ClearType configuration wizard that allows you to make fonts a bit thicker. I recommend doing it as a counterweight to that, especially on high DPI displays.
Stop trying to fix everything, to be honest. I'm so tired of encountering text inputs in React web apps that don't have system-wide spellchecking because some engineer decided they needed to roll their own input controls.
Thank you. This is the solution. Stop "fixing" my scroll bar. Stop "fixing" my copy/paste behavior. Stop "fixing" my default fonts. Stop "fixing" navigation from element to element using tab. Stop "fixing" what links look like, what happens when I right-click, what happens when I use my mouse wheel, what happens when I resize my browser window. Just stop trying to be clever. Deliver your damn hypertext to my browser, get out of the way, and allow it to render.
Not if a react component is reimplementing a textarea with JavaScript, which I've encountered in some enterprise apps. It's mildly infuriating, and breaks in lots of subtle ways.
Not really. It revolves around textarea not allowing rich inputs: @-mentions, /-commands, block-based content builders, etc. Textarea is laughably basic when it comes to adding interactivity, so the only solutions are to use contentEditable[0][1], which is very difficult to work with, or to roll your own input[0], which is what makes OP angry.
> However, on Mac OS X, when this is reversed and you set light text on a dark background, you get a fairly ugly effect where the text becomes overly bold, spilling out of its lines.
Wait, that's not supposed to happen. Most likely they're setting subpixel values without taking display gamma into account. This should be fixed properly in the rendering pipeline.
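A toy sketch of what "taking gamma into account" means here (my own illustration, using the standard sRGB transfer curve; real pipelines also layer contrast and stem-darkening tweaks on top):

    # Antialiased coverage should be blended in linear light; blending gamma-encoded
    # values directly darkens partially covered edge pixels, and that error affects
    # dark-on-light and light-on-dark text in opposite ways.
    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    def blend_naive(text, bg, coverage):
        # what a non-gamma-aware rasterizer does
        return text * coverage + bg * (1 - coverage)

    def blend_linear(text, bg, coverage):
        # convert to linear light, blend, re-encode
        return linear_to_srgb(srgb_to_linear(text) * coverage +
                              srgb_to_linear(bg) * (1 - coverage))

    # a half-covered edge pixel, white-on-black and black-on-white:
    print(blend_naive(1.0, 0.0, 0.5), blend_linear(1.0, 0.0, 0.5))   # 0.5 vs ~0.735
    print(blend_naive(0.0, 1.0, 0.5), blend_linear(0.0, 1.0, 0.5))   # 0.5 vs ~0.735

The naive blend lands on 0.5 in both polarities, but 0.5 in sRGB is only about 21% of full luminance, so edges come out heavier for dark-on-light text and thinner for light-on-dark text, and any renderer-side fudge tuned to compensate for one polarity will overshoot on the other.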
Well, as long as you use Hi-DPI (or "Retina", as Apple likes to call them) displays over your entire product line, subpixel rendering doesn't have as big of an impact anymore. I also suspect that's what's going on with the designers who are wont to disable subpixel rendering for their sites: they also mostly have Macs with Hi-DPI screens, so they don't see the benefit of subpixel rendering.
This goes out the window the moment you attach a second monitor for most folks. 1080p is very common; if we start talking about achieving retina at that resolution (say 220 DPI, a rough average across modern Macbooks), we're talking 10-inch displays. Even at 4k we're talking a max size of 19 inches or so before things get outside of Apple's ideal DPI.
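Back-of-the-envelope check of those numbers (PPI is just diagonal pixels over diagonal inches):

    from math import hypot

    def ppi(width_px, height_px, diagonal_inches):
        return hypot(width_px, height_px) / diagonal_inches

    print(ppi(1920, 1080, 27))    # ~82  - a common 27" 1080p external monitor
    print(ppi(1920, 1080, 10))    # ~220 - 1080p only hits that density at around 10"
    print(ppi(3840, 2160, 19.5))  # ~226 - 4K crosses 220 PPI just under 20"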
I can't speak for you, but for my work and current eyeglass prescription, 27 inches is my starting point for comfort.
Much as I dislike it, Apple seems to have removed the global option to turn font smoothing off in Big Sur.
Yep, this whole article misses the point that the gamma wasn't being corrected for, so the opacity wasn't linear relative to the requested percentage.
Sadly, the only thing that ever did this correctly was Firefox, and it no longer does it. The proper method needs alpha values maintained for each subpixel in an image, not just each pixel, and it needs a specialized blit routine to composite with the background. This was agreed by most to be too complex, and, as other comments point out, DPI is increasing, so the easiest thing is to just disable subpixel rendering.
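Schematically, the difference between the two pipelines looks something like this (my own toy sketch, gamma handling omitted):

    # Ordinary grayscale antialiasing carries one alpha per pixel and works with any
    # generic blit; the per-subpixel method carries a separate alpha for R, G and B
    # and needs its own compositing path.
    def composite_grayscale(text_rgb, bg_rgb, alpha):
        return tuple(t * alpha + b * (1 - alpha) for t, b in zip(text_rgb, bg_rgb))

    def composite_subpixel(text_rgb, bg_rgb, alpha_rgb):
        return tuple(t * a + b * (1 - a) for t, b, a in zip(text_rgb, bg_rgb, alpha_rgb))

    # black text over a white background, one pixel at the edge of a stem:
    print(composite_grayscale((0, 0, 0), (1, 1, 1), 0.5))               # (0.5, 0.5, 0.5)
    print(composite_subpixel((0, 0, 0), (1, 1, 1), (0.75, 0.5, 0.25)))  # (0.25, 0.5, 0.75)

Carrying that per-channel alpha through every glyph cache, surface and compositor step is exactly the complexity that got dropped.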
Maybe… but it’s been like that for years and years now (a decade, even?) so you can’t blame web designers for disabling subpixel smoothing in that scenario.
I find it undesirable to let individual documents (or applications) change or control fonts and colours in general. I usually restrict font changes by websites in FF, but sometimes notice that some websites still manage to mess up the fonts by adjusting finer rendering options, as described here. It appears to be part of an even more general trend: reinventing and worsening built-in web browser functionality with JS is another common activity, just as non-web software is prone to reinventing system functionality (or settings), and I usually imagine that the intent there is to "fix" things too (or at least to make them look prettier).
I wonder how much these "please stop" blog posts help: they seem intended to raise awareness among developers, maybe to point things out to those who didn't think of them, but IME much of the awkwardness, especially in the UI, comes from non-tech-savvy client/user/management requirements: things must look good enough in a presentation, on their machines, and/or on common machines and systems at the time (though an argument can be made that that's the right priority in many cases).
mobile devices don’t use subpixel rendering due to having to support both vertical and horizontal screen orientations
Sure about that? I would've thought the wildly varying subpixel layouts for the various display technologies and even generations of a particular technology would have a bigger impact. IPS subpixel layout looks nothing like an early PenTile layout which is different from a modern AMOLED layout.
I came to the comments just to ask about this part. Pretty much every desktop OS out there allows you to change the orientation of your monitors, and I haven't noticed any font rendering issues when doing so.
A citation for that statement from the OP would be appreciated.
In the past ten years, Macs (which this writer is mostly concerned with) have all adopted ultra high density “retina” displays, which… well I’m not going to say they render the arguments moot, but the changes are so drastic that they force every statement to be reevaluated.
Also I could be mistaken but I believe as of about four years ago subpixel was dropped from Macs altogether.
What I (on Windows) found interesting, was that I thought the subpixel rendering screenshot looked clearly worse and more blurry… on my FHD Monitor. On a hunch, I dragged it to my WQHD Monitor and indeed, that reversed which of the 2 examples I found to be more blurry.
Your subpixel rendering settings need to align with the pixel layout of your monitor. Trying to smoothen text with a left-to-right BRG subpixel algorithm on a top-to-bottom GBR monitor, for example, will always look ugly.
On Windows you can tune this for ClearType (I don't know the exact command; I usually just type "ClearType" into the start menu) by doing a test similar to what you get at the optometrist's: a set of text samples, and you pick which one looks best and whether things get better or worse.
If you have issues with blurry text, it could also be because of filters your monitors apply; some use some kind of sharpening algorithm. Trying to tune the blurry monitor using some of the available test websites might not be a bad idea in that case.
I always use the links terminal browser test. If the site is usable in my terminal with the `links https://usabilitypost.com/2012/11/05/stop-fixing-font-smooth...` command then it is well done. Chances are, if you are using any type of special font rendering to fix a UI issue you are also doing other things on that page that will cause it to not display properly in links. Also, shout out to the author for having their article actually render well in links!
Arguably much of the “infrastructure” and visual language for GUIs have evolved in the light background/dark text scenario, and “dark mode” has been pretty much improvised in the past couple of years.
I work mostly in the daylight and thus mostly in light color schemes for coding (doubly so when I’m forced to use glossy screens) but I get the appeal of not having a full white spotlight in your face when coding something in bed at 3AM. But: I’d rather most of my screen be dark (eg with terminals) but still have my editor viewport to be light-background, even if it means shrinking it.
They have images comparing subpixel and anti-aliased fonts, but if I understand correctly the subpixel interpolation takes place at a different level of the stack and the required metadata is lost when it's been rasterized... the subpixel example looks way blurrier than the anti-aliased example.
Precisely controlled font rendering and color accuracy over the web are impossible to get exactly right, because what is correct for one OS/GPU/screen combination will not be correct if you change OS or screen (and sometimes GPU).
Still relevant, although in different applications. GTK removed subpixel font rendering in GTK+4 (or more precisely, the changes were made in Pango), which resulted in blurrier fonts. The official explanation is that subpixel rendering is incompatible with transformations. Some work is being done on partial fixes, but the results are not as good as the current font rendering in GTK+3 applications.
https://gitlab.gnome.org/GNOME/gtk/-/issues/3787
I thought subpixel rendering (or anti-aliasing) was not needed at all on modern HiDPI displays? Why do you need to anti-alias anything if you are already rendering glyphs with sub(logical)pixel precision anyway?
Well, the point is that traditional display technology had limited spatial resolution, which is exactly why algorithms like subpixel rendering were created — to improve the visual fidelity of text on an imperfect screen. The algorithms themselves are not without drawbacks — sensitivity to background colors, implementation complexity, caching cost, scalability etc. If the spatial resolution of the display is high enough that the font looks smooth with "normal" rendering, why bother with complex workarounds?
My understanding is that on pre-retina displays, subpixel anti-aliasing was a fairly difficult, complex, and resource intensive technique that resulted in huge improvements in readability, and in the era of retina displays, it is a lot of complication that results in barely perceivable gains.
> But upon closer inspection, antialiased text is always blurrier than subpixel rendered text. This is not a matter of opinion, it’s just how the rendering works.
That is not generally true. It depends on font hinting and grid-fitting. With good grid-fitting, font rendering with subpixel support disabled is sharper, as vertical lines are exactly aligned with pixel boundaries. That is why I disable it on my desktop.
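A toy illustration (made-up numbers): the coverage a vertical stem leaves in each pixel column, unhinted versus snapped to the pixel grid.

    # Fractional edge coverage is what reads as blur; grid-fitting snaps the stem's
    # edges to pixel boundaries so each column is either fully on or fully off.
    def stem_coverage(left, right, columns):
        # coverage of each pixel column by a stem spanning [left, right), in pixel units
        return [max(0.0, min(right, x + 1) - max(left, x)) for x in range(columns)]

    print(stem_coverage(1.25, 2.75, 4))   # [0.0, 0.75, 0.75, 0.0] - unhinted: soft edges
    print(stem_coverage(1.0, 3.0, 4))     # [0.0, 1.0, 1.0, 0.0]   - grid-fitted: hard edges

The price is that stem positions and widths get nudged away from the outline, which is the "flow looks all broken" complaint elsewhere in the thread.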