
> feel free to use it to fix custom font rendering on Windows

And then you get the exact opposite problem: macOS people trying to force macOS rendering onto other platforms because they prefer it. If you're going to advise people not to mess with these settings, don't make the advice apply to just your platform. Just don't touch these things unless you're going for some kind of retro feel or are foolish enough to try to do your own font rendering.

Apple's weird "font smoothing" probably has more to do with the way glyphs are artificially altered for aesthetics, anyway. Stick to the platform defaults; it's what your customers are used to. Font rendering is built on lots of subjective choices and personal preferences, because there's no technically "correct" rendering. If one platform does it differently, let it be different; don't force your own preference on everyone just because you use an iPhone/Windows tablet/Linux computer.

Funnily enough, Apple disagreed with this author and changed the way their subpixel algorithm worked years ago, so the author's preference clearly wasn't what the majority were expecting.




> Funnily enough, Apple disagreed with this author and changed the way their subpixel algorithm worked years ago, so the author's preference clearly wasn't what the majority were expecting.

Apple only changed it due to the prevalence of “retina” displays. When the pixel density of your screen is high enough, sub-pixel rendering stops making sense (your normal pixels are already small enough). You can just apply normal anti-aliasing and get the same result.
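For anyone unfamiliar with the distinction, here's a toy sketch of the difference on a horizontal RGB-stripe panel. The coverage function and the glyph edge position are made up for illustration; this is not any real rasterizer's code:

    # coverage(x): fraction of the glyph inking horizontal position x;
    # the edge at x = 2.35 is an arbitrary example
    def coverage(x):
        return 1.0 if x < 2.35 else max(0.0, 1.0 - (x - 2.35) * 10)

    def grayscale_aa(px):
        # normal anti-aliasing: one alpha per pixel, sampled at the centre
        return coverage(px + 0.5)

    def subpixel_aa(px):
        # three alphas per pixel, one per R/G/B stripe: triple the
        # horizontal resolution, but tied to the panel's physical
        # subpixel layout (and wrong if the panel is rotated 90 degrees)
        return tuple(coverage(px + (i + 0.5) / 3) for i in range(3))

    for px in range(4):
        print(px, grayscale_aa(px), subpixel_aa(px))

Once the pixels themselves are small enough, that extra per-stripe resolution stops being visible, which is why plain anti-aliasing gives the same perceived result on a retina screen.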

This also has the advantage of removing all the super special font rendering code from your code base: no need to percolate font data through your entire rendering pipeline just so you can do sub-pixel rendering in your final compositing and scaling step. Just push normal pixels and treat everything the same.
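A minimal sketch of why that percolation is needed, assuming standard Porter-Duff "over" blending (the function names here are made up for illustration):

    def over(src, alpha, dst):
        # ordinary "over": one scalar alpha per pixel; works for any
        # content, so the compositor needs no idea what it's drawing
        return tuple(s * alpha + d * (1 - alpha) for s, d in zip(src, dst))

    def over_subpixel(src, alphas, dst):
        # sub-pixel text needs a separate alpha per colour channel, so
        # every later compositing/scaling stage has to know the pixels
        # came from text (the "percolate font data" problem)
        return tuple(s * a + d * (1 - a) for s, a, d in zip(src, alphas, dst))

    white, black = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
    print(over(black, 0.5, white))                        # uniform grey edge
    print(over_subpixel(black, (0.2, 0.5, 0.8), white))   # colour-fringed edge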

All of this doesn’t change the fact that there might be situations where sub-pixel rendering still makes sense. It’s just that OS X no longer supports it.


> All of this doesn’t change the fact that there might be situations where sub-pixel rendering still makes sense. It’s just that OS X no longer supports it.

Is that why fonts all look bad when I plug an external monitor into my Mac?


Yup, I have a 1080p monitor (SyncMaster BX2431) and it looks like shit when connected to my MBA. Increasing the UI scaling kinda cleans up the blurriness, so it looks less shit, but it still looks shit.


If that external monitor is less than ~180ppi, like most monitors, then yes.


> When the pixel density of your screen is high enough, sub-pixel rendering stops making sense

I wonder if the prevalence of Retina displays across the Apple product line coupled with the prevalence of rotatable displays from iOS devices made them go towards something consistent across the board:

90-degree rotations really don't play well with subpixel rendering, and having to handle four different orientations with a single font and have it display consistently on a single device doubles down on the problem. It makes sense to go with simplicity and take the antialiasing route, and since you have Retina basically everywhere, well, just flick the switch everywhere as well.

I seem to recall (years ago, so memory may fail me) some blog post explaining how dropping subpixel rendering also allowed for some major cleanup in the Cocoa/CoreSomething rendering code.


> And then you get the exact opposite problem of macOS people trying to force macOS rendering onto other platforms because they prefer it.

This is not a matter of preference but of measurable quality. Fonts on macOS look super smooth, while most Windows laptops these days still ship with 1920x1080 displays which, even with ClearType enabled, just look like shit.

Linux is even worse: compared to a Mac, a Linux desktop looks like it's straight from the 90s (which is when most stuff regarding rendering seems to have frozen in time).

People using Macs generally value aesthetics, optical polish, UI consistency and elegance, factors that are sadly lacking in the Windows world (there are at least five different ways of creating UI applications, and six different places to look for basic system configuration, ...) and completely absent in the Linux world (as a result of everything related to desktop Linux being underfunded and done by volunteer developers without UX design people).


>Linux is even worse: compared to a Mac, a Linux desktop looks like it's straight from the 90s

What are you smoking? Most modern Linux distros have the best out-of-the-box font rendering of all OSes. Sure, if you install some bare-bones Openbox DE it will look bad, but GNOME and, to a lesser extent, KDE look night-and-day better than Windows and macOS.


Been a Mac user for decades, and I agree that Mac font rendering is preferable to the Windows defaults. But I also agree that Linux can be very good too. Out of the box, Ubuntu (since at least 16.04 or so) looks quite nice. Good enough that people who don't like macOS font rendering complain about it being too Mac-like.


I use both Windows and Linux on a regular basis, and I have to say that font rendering on Linux is almost always much better than on Windows. I frequently notice and am bothered by font rendering on Windows, even after running the ClearType tuning tool on both of my monitors.


It's weird when you run into people who prefer the spindly, rickety-looking fonts on Windows.


Tastes and colors, right?

I'm one of those people, and I'm not even a Windows user. What I like about them is that on crappy low resolution displays they are sharp. This seems less the case with Win10 though.

Yes, I know about the whole font shape debate in Windows vs Mac rendering. And I don't care one bit.

When I'm using the computer as a tool to read text all day, I want sharpness above all else. I don't care if the letters aren't exactly as their designer wanted them to be.


Have to agree. Font rendering is one of the top reasons I don't favor the Mac experience at all. At least Windows and Linux give me options, and honestly Windows is the only one so far I believe gets the fluctuation in DPIs 'right' by giving me some control over it. I love being able to use a 4K TV as an extra monitor, regardless of size.

Macs? Sorry; Apple says one size fits all these days, and you will likely have to shell out money to get it looking 'right' on an external display (i.e. higher dpi). Normally, this means just embracing the full Apple ecosystem of peripherals, which is only complicated further given Apple appears to have stopped producing external displays years ago.


> Macs? Sorry; Apple says one size fits all these days, and you will likely have to shell out money to get it looking 'right' on an external display (i.e. higher dpi). Normally, this means just embracing the full Apple ecosystem of peripherals, which is only complicated further given Apple appears to have stopped producing external displays years ago.

You can use any 4k screen with a Mac, and it will work well. Before I switched to Linux full-time, I used to use a 24", 4K Dell on my MBP, and it was glorious. The screen wasn't even all that expensive, around €300 3 years ago, if memory serves.


That puts you at roughly 180 dpi, which isn't that far from the 'retina' standard of roughly 220 dpi. But I imagine I can get you sharp text on a lowly 1080p 20 inch display with Windows or Linux at only 110 dpi (or worse) for roughly half the cost at today's prices.
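For anyone checking the arithmetic, a quick sketch using Python's standard math module:

    from math import hypot

    def ppi(width_px, height_px, diagonal_inches):
        # pixels along the diagonal divided by the diagonal length
        return hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 24)))   # 24" 4K: ~184 ppi
    print(round(ppi(1920, 1080, 20)))   # 20" 1080p: ~110 ppi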


I don't care if they're "exactly as their designer wanted them to be" — I don't want them to look weak and pixelated. I want my screen to appear as close as possible to print, with text equally easy to read. Windows does not provide such an experience.


Point is we both want different things, and doing something "in between" would look terrible for both. A choice has to be made.

The issue is that you cannot obtain rendering on a low resolution grid that is both sharp and close to the shape on print [0]. Given this constraint, I prefer sharpness over fidelity.

In my personal case, I actually use Linux on a high dpi display, so I can get both.

But when I have to use a low resolution screen, like at work, I would actually prefer a font that is designed with these constraints in mind, such that it looks both good and sharp. For this I like bitmap fonts, but they seem to be few and far between these days. Overly fancy fonts need to be beaten into shape with hinting, but then the flow looks all broken; otherwise, they're a blurry mess.
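To make that hinting trade-off concrete, here's a toy sketch with made-up numbers: a single vertical stem, 1.4 pixels wide, starting at x = 1.3. This is an illustration, not any real hinter's logic:

    def unhinted(stem_left, stem_width, px):
        # fractional coverage of pixel px: faithful to the outline,
        # but the edges come out grey/blurry on a low-res grid
        left, right = stem_left, stem_left + stem_width
        return max(0.0, min(px + 1, right) - max(px, left))

    def hinted(stem_left, stem_width, px):
        # snap both edges to whole pixels: crisp, but the stem is no
        # longer where (or as wide as) the designer drew it
        left = round(stem_left)
        right = max(left + 1, round(stem_left + stem_width))
        return 1.0 if left <= px < right else 0.0

    for px in range(4):
        print(px, round(unhinted(1.3, 1.4, px), 2), hinted(1.3, 1.4, px))

Here the hinted stem comes out two full pixels wide instead of 1.4: sharp, but exactly the "flow looks all broken" effect.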

Terminus works great for my monospaced needs, and Calibri (from Windows) seems to have bitmaps at small sizes that look fairly pleasant.

--

[0] I'm talking mostly of small-sized text, like for interface widgets. For larger font sizes, it seems easier to get a decent result even with fancier fonts.



