
Every time I've got a 3200x1800 laptop, I wanna go back to 1920x1080 or 1920x1200... Why? Well, personally, it's not worth going beyond that. There's VRAM being wasted, GPU and CPU cycles, etc., and then the software is not perfect - some of it scales well, some of it won't... I bought my son a Windows 10 laptop with 3200x1800 - lots of Qt applications would not scale properly, the Minecraft launcher too. Maybe it'll be better on Linux or Chrome OS, and surely it's a solved problem on OS X, but I still don't see the point.

The big jump was moving from 1024x768 to 1920x1080. 4K displays are great - but on big screens, not on a laptop.



> Maybe it'll be better on Linux or Chrome OS, and surely it's a solved problem on OS X, but I still don't see the point.

Most of my use of a laptop is for programming, which means a lot of time staring at text. The far crisper text on a Retina/UltraHD/4K/whatever-marketing-term monitor feels much nicer to read for hours on end, for my money. It's a lot more like reading print.

In contrast, I feel like I'm squinting when reading jagged lower-res text on my 1920x1080 external monitor.

Disclaimer: I use OS X, so I don't run into the scaling downsides you see in other OSes; those are significant and would probably turn me off of 4K displays.


Other operating systems tend to snap fonts to the pixel grid whereas OS X does not, so OS X suffers more from low-res displays in general.


It's funny, though, now that Retina is a thing, how much superior OS X's font rendering algorithm is. This was not very clear when fuzzy low-res displays were the norm. Back then, I vastly preferred Microsoft's ClearType approach. Now, it's not even a contest -- Windows does not represent fonts truthfully.


It's pretty clear that OS X has existed for, what, 15 years, most of that time without Retina screens. It didn't retroactively become better; screen resolutions improved to make what was presumably an inferior algorithm in the prior context work better in the modern context.


But it did. OS X (and Mac OS before it) was optimized for fidelity with higher-resolution laser prints. This necessarily caused more anti-aliasing artifacts than pixel-snapped text. It was actually less of a problem with CRT monitors, where individual pixels blended into each other better.

This is not a shallow concern: pixel snapping can cause major issues like line breaks differing between print and display.


Interesting, I would never have considered print publishing. That said:

Was this configurable and what percentage of users were concerned with producing print copy? In what eras?

I would bet that these individuals were always a minority, and the arrangement appears to have been inferior for the majority of users for the majority of the time it was a factor.


This was a Steve Jobs obsession from his first calligraphy class at Reed College. Proper proportional fonts were built into the original Macintosh - this wasn't something he was going to do a market analysis on to decide for him.

However, historically this was less problematic because screens were fuzzier anyway. It was only with LCDs that the anti-aliasing became obvious.


The Macintosh may have had a very small market in the '90s, but one market that it did dominate was print.


Most programs are pretty horrible in 4K with Win10 too.


Really? All of my dev tools support scaling.


I agree, I have had no problems (except for Eclipse, but I vowed to use Eclipse as little as possible well before I upgraded to Win10 + 4K).

IntelliJ, Cygwin, Ubuntu VMs (DPI scaling worked well in VMware at least), VS Code, VS proper, Sublime, IDLE, cmd, cmder, IDA, Tableau. I can't think of a single application (except for Eclipse) that I noticed not scaling well.


And probably none of these is Qt-based. (I'm not blaming Qt specifically; it's a great framework. I'm simply showing why you didn't have the problems.) And of course there are solutions, but I wish things worked a bit more smoothly:

RStudio on Windows (fixed now, and some workarounds) - https://support.rstudio.com/hc/en-us/community/posts/2065421...


I don't use my Windows install for development so I can't comment on that. However, I'm seeing issues with Photoshop and a lot of other programs that end up very small, blurry, or with their images all stretched out.


So Chrome OS & OS X probably solve this somehow, but all my Chromebooks have crouton, and yes, Windows 10's support for 4K displays is not there yet. Most of the apps work, but then there are some that don't (Electronic Arts' Origin client, which is written in Qt, was not scaling properly; same for the Minecraft launcher; there are other examples too). Also, in the "4K" mode I would often see screen tearing in some games (if they can't stick to 30 or 60fps), but that's more due to the 4x pixels that need to be drawn. For that reason I've switched my son's laptop back to 1920x1080 - so it still looks kind of crisp, and things work well.


I think you're missing the parent's point. The benefits of staring at text all day with >30yo eyes also apply to Linux. I use Arch and Fedora on high-DPI displays and I'd sooner give up programming than ever return to ~72dpi.

Parent isn't talking about most apps; they are referring to terminals and IDEs, for which high-DPI is absolutely critical and outweighs the occasional inconvenience of poor scaling outside those apps.


But that's where I disagree, especially when it comes to terminals. I'm so used to seeing the pixels that it doesn't bother me, but even then, hinted anti-aliasing helped quite a lot to solve this. What I'm really missing (and hence my heavily opinionated answer) is that there are a lot of people who might have just started coding on higher-DPI monitors/devices, and to them going back would be terrible.

I started on an Apple ][ :)


I'd hate to go back to the eye-strain inducing fuzzy low-res CRTs we had to tolerate back in the day. I can look at my rMBP screen all day without getting a headache.


> What I'm really missing (and hence my heavily opinionated answer) is that there are a lot of people who might have just started coding on higher-DPI monitors/devices, and to them going back would be terrible.

It has nothing to do with that. I learned to program on a 72 DPI Sony Trinitron screen in the 90s. UltraHD is strictly a less eye-straining experience, "what I grew up on" be damned.


I haven't seen anything approaching 72dpi since 1024x768 stopped being common; most modern displays are north of 100dpi. It's still far too low, though. We've been held back far too long by OS and app makers' inability to adjust to a reasonable resolution.


If you're seeing screen tearing, specifically when you see horizontal movement, then odds are it's the graphics card and drivers... If you're using a beefier graphics chipset, it works better.

I usually set it to 1440p scaling mode, or 2x... tbh, I have terrible vision anyhow, so I find the ultra-high resolution displays on phones don't help me much.


I noticed the one app that displays terribly on my Surface Pro 4 is the Quassel IRC client. Also written in Qt.


But even then I would get a 4K external monitor and save the GPU/CPU cycles and battery life on the laptop. Being hunched over a laptop all day is probably a lot worse than having a lower-res display in the end, so if we are talking ergonomics, better get an external 4K screen.


> Being hunched over a laptop all day is probably a lot worse than having a lower-res display in the end, so if we are talking ergonomics, better get an external 4K screen.

Weird dichotomy. I use both simultaneously. No reason to confine yourself to one screen when 2 will do. And sometimes I code on the go, too.

The hand-wringing over battery life/GPU cycles with a 4K display seems misguided -- I get ~10 hours of battery life on my Retina MacBook Pro. That's no worse than my previous non-Retina version, and the damn thing got thinner, not bulkier.

It doesn't even switch over from the cheapo integrated GPU to the Radeon unless it has to drive more than one 4K display simultaneously. The battery impact of driving the integrated display seems negligible and totally dominated by the backlight.


I think the issue is that Windows laptops with those displays get much worse battery life than with standard HD displays. Apple does a great job of optimizing their software for their hardware, which is why you usually don't see those battery life issues on MacBooks compared to Windows laptops. Have you ever run Windows on your Mac? Talk about a battery drain.


I've done the Windows-on-Mac thing, but I assume half the problem there is that Apple's Windows drivers aren't particularly good (do the fans still just run full blast all the time?).

I'd hope that a dedicated Windows machine would have better drivers, but that's probably putting too much faith in the vast majority of manufacturers.


I ran Windows 7 on a 13" MBP, and then the 13" MBA when it originally came out. I never experienced full-speed fans unless I was playing video games on it, doing something computationally expensive for an extended amount of time, or sitting with it in my lap, on top of a blanket, depriving it of ambient airflow.


People probably generally have both. When you are in the office, use the big screen, but lots of us do a considerable amount of work from client sites, on airplanes, etc...


The crisp display is truly great.

What sucks is that a dual-monitor setup of a low-res monitor and my hiDPI laptop monitor is not that great. I have to upscale my monitor a bit, which makes the monitor look even worse than it already is (1920x1080). The monitor has a lot of room, but I naturally want to leave most windows on the small hiDPI laptop screen.

HiDPI monitors cost waaaay too much for me to get one. So for now, I just have to wait until the price starts going down.


You can get a QNIX QX2710 IPS 1440p for ~$200 on eBay. Do pay attention to connectors - presumably with a laptop you want one of the versions with HDMI or DisplayPort inputs, not the DVI-only variants. It has a PWM backlight; other versions have DC backlights (check out the Crossover 2795QHD), but then you are stepping up closer to $300, at which point you are competing with options like a refurbished Acer XB270HU or XG270HU 144Hz/G-Sync monitor.

28" TN 4K monitors are regularly going for ~$300 nowadays. Again, Crossover makes a few here, the 288K and 289K. If you want IPS, keep an eye on the Dell Outlet for a P2715Q. They tend to run a lot of 35-40% off coupons for monitors, you can get one for under $350 if the timing works out. You will need DP 1.2 to drive 4K@60hz - HDMI 2.0 doesn't have any significant market penetration yet.

The Koreans basically make all the panels that go into monitors, and they take the ones that failed QC for whatever reason and sell them cheaply through channels like eBay. You may have to put up with a few dead pixels, but you can get a great IPS 27" monitor for the price of an average whatever. Don't bother getting the "pixel perfect" warranty unless you are OK paying to mail it back to Korea for replacement.


I wouldn't really recommend the 1440p versions. I had one briefly, but I did not really find it comfortable to look at. If scaling is disabled, text is too tiny, and scaling (probably 125-150%) does not work as well as with higher resolutions. I now have a 28" 4K screen to which a higher scaling factor can be applied, and that looks much better (although still worse than an MBP with a Retina screen). I guess 24" or 27" with 4K would also be good.


Wow, this is great information. Thanks.


I've purchased several of these 1440p Crossover monitors and they're pretty good. It is worth paying the ~$20 more for the variants listed as "pixel perfect" (if you're buying from one of the handful of reputable sellers).

I also have a Yamakasi M280PU which is a 27", 4K panel.


I agree, I can't use 1920x1080 or lower. Plugging my MacBook into a TV is horrible for anything but watching TV from a distance.

I think 2560x1440 is probably the ideal resolution for the time being, looks good enough, no need for a graphics card, cheap monitors, etc.


I wouldn't make a judgement based on a TV, though. I go from a 1080p monitor to the TV and I can't read the text at all on the TV, while I can on the monitor just fine. Even with the TV in 'PC mode' it still does unnecessary post-processing.


I use/recommend an ASUS PB278Q 27" at 2560x1440. It works quite well for coding.


Since I don't have Retina experience beyond cellphones, my 22" 1920x1080 is tolerable - on Windows. On Linux, oh boy...


27" 1440p is 108 ppi, which is the current sweet-spot IMO.

24" 1080p is 92 ppi, 22" 1080p is 100 ppi. So you're a bit behind the curve in terms of ppi, but it's probably not that noticeable. The big thing you're missing is just that 22" isn't that big a screen.

For reference 27" 4K is 163 ppi, and many people find that's too much and need to use some scaling.


To piggyback on this comment, I'm sure I'm not the only one who finds high-density displays less fatiguing. That in itself is worth even more than the performance trade-offs.


Well, for one, Eclipse!!! doesn't scale and is unusable on 4K displays. Tons of other applications don't either... things get goofy and it makes the whole system seem kinda amateurish... And I don't really see the difference with text... maybe you have very small fonts?


For the longest time I refused to wear glasses and learned to read 13x7 bitmap blurs. Even with glasses, I can't read any other font nearly as well, so I'm stuck with my low res screens.

What I really want is a low-res screen as bright as the Retina ones.


I feel the same way - though, it's all about DPI. A 1080p display at 22" is not very good, but a 1080p at 13" is very very nice.

I have a 13" retina Macbook Pro with god knows how many pixels, and I'm left with a choice between full HiDPI where everything is giant, or "scaled" mode where the GPU renders even more pixels and then scales it down. This is because the UI components in OS X are bitmaps, and there is a set of 2x HiDPI bitmaps, and a set of 1x regular ones that are too small to read on a retina display. I use the scaled mode where it renders everything as if I had a giant monitor and scales down the whole screen to a normal size. Since I have the 1st generation 13" retina, it understandably struggles at times with its Intel GPU. It does look great, though.

I recently got a Dell Chromebook 13" with a 1080p IPS display, put Debian on it over Chrome OS, and now it's all but replaced my Macbook for everyday use. Everything is a decent size, and there is no need for the OS to do any scaling or anything weird, so everything works nicely. I mainly got it so that I wouldn't have to lug such an expensive laptop on the subway, but I've really grown attached to it. The DPI is high enough that I can't really discern the pixels, IPS displays tend to look great, and I forgot how much I missed using Linux while I was on OS X. :)


How do you manage with 16GB?


Upgraded it to 256GB - it just has an M.2 slot inside it.


Linux Mint with HiDPI on looks awesome at 3200x1800, IMO way better than at 1920x1080. YMMV

Also, even Windows 7 looks way better than Windows 10 when you enable 200% scaling in Intel drivers.


I recently switched to the 3000x2000 Surface Book. Worrying about capacity being "wasted" is nonsense - how would it be better to not have that capacity? Some software looks bad, but I'm sure that'll be fixed as time goes by. Games look great, text looks even better, particularly if you read a lot.


I also have a Surface Book. The display is stunning when viewing photos and I have no issues browsing the web but I find using IntelliJ a bit difficult at the default 200% scaling. It's all a bit small and adjusting the font sizes is not as easy as it sounds (you get big fonts but everything else remains small). I also tried increasing the OS scaling to 225% or 250% but that introduced other issues (like when watching fullscreen videos you get a border).


I found I adjusted quickly to how Eclipse looks on it FWIW. The icons come out undersized but everything else works correctly, and I've grown to like the result - it feels like more space for the actual code.


I'll take back my previous comment. I just now attempted to customize the two font settings in IntelliJ and wow... it looks absolutely amazing. I changed the editor font to Consolas and increased the size a tad.

When I first got my SB I remember attempting this but I encountered odd glitches like fonts overlapping in the Project view.


Cost of display scanout in terms of power? When nothing's moving, more pixels = higher bandwidth requirements, and the thing has to be clocked higher. (I guess it runs constantly. The new non-scanned LCDs will help I suppose.) You also need more RAM, and that needs refreshing. And any time anything is moving, there's 4x as many bytes being shuffled about, and they have to be shuffled about 4x as quickly, because the display area is the same.
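
To put a rough number on the scanout part, here's a back-of-the-envelope sketch (assuming a 32-bit framebuffer at 60Hz and ignoring panel self-refresh and compression):

  def scanout_gbytes_per_s(w, h, hz=60, bytes_per_px=4):
      # Bytes the display engine has to read out of memory every second.
      return w * h * bytes_per_px * hz / 1e9

  print(scanout_gbytes_per_s(1920, 1080))  # ~0.50 GB/s for 1080p
  print(scanout_gbytes_per_s(3200, 1800))  # ~1.38 GB/s for 3200x1800
  print(scanout_gbytes_per_s(3840, 2160))  # ~1.99 GB/s for 4K

Small next to peak memory bandwidth, but as the comment above says, it never stops.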


I think at least part of the implied tradeoff is resolution vs. battery life. More pixels == more GPU computation == more power required.


AIUI the backlight dominates the power consumption, so more pixels at the same physical size makes very little difference.

But sure, that's a lot more concrete. So are you unhappy with the battery life on this device? Does the laptop the grandparent was talking about have worse battery life than comparable models? Certainly I'm very happy with the battery life of my Surface Book (which is a lot better than that of the laptop it replaced), so to my mind it's not a problem in practice.


more pixels requires more light, even on the same size display... So a 4K display the same size as a 1080p display will require 2-3x the light behind the display... I don't recall the exact amount extra, but it's not insignificant... let alone the processing power to push those extra pixels.


>more pixels requires more light

[citation needed]?


More pixels means more circuitry in front of the backlight blocking light.


But to what degree? Blocking 2% vs. 1% of the light probably isn't noticeable; blocking 50% vs. 25% probably is. This might also be something that improves over time as process improvements reduce the overhead.


Potentially a lot. The iPad 3 requires 2.5x the backlight power of the iPad 2 (7 watts) for the same brightness.


Thanks for quantifying that - the difference is 4.2 watts, which is huge. I wonder if the LED backlighting is getting any more efficient?


3200x1800 is 282 dpi. That's about on par with high-end tablets, worse than good phones (440 dpi or so), and worse than jaggie-free laser-printing (600 dpi, although that's not strictly comparable).

Text on high-DPI laptop screens looks enormously better than on low-DPI ones. And whatever issues Windows has with it, ChromeOS scales everything great.


Well, if you believe Apple [1] (and in this case I do), then it's less about DPI alone and more about the combination of DPI and typical viewing distance. Phones are generally viewed at closer range than tablets, which are themselves viewed at closer range than laptops, which in turn are probably viewed at closer range than workstation monitors.

1: https://en.wikipedia.org/wiki/Retina_Display#Technical_defin...
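
The definition in that link boils down to the old one-arcminute rule of thumb: a display counts as "Retina" roughly when a pixel subtends less than about one arcminute at the typical viewing distance. A quick sketch (the viewing distances below are my assumptions, not Apple's exact figures):

  import math

  def retina_ppi_threshold(viewing_distance_in, arcminutes=1.0):
      # PPI above which one pixel spans less than `arcminutes` of arc.
      pixel_pitch = viewing_distance_in * math.tan(math.radians(arcminutes / 60))
      return 1 / pixel_pitch

  print(round(retina_ppi_threshold(12)))  # phone at ~12":   ~286 ppi
  print(round(retina_ppi_threshold(18)))  # laptop at ~18":  ~191 ppi
  print(round(retina_ppi_threshold(28)))  # desktop at ~28": ~123 ppi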


Sure, at work I have two 1920x1080 24" monitors, and I've used such a configuration for several years (had the exact same config at my previous job). I like this better than the other offering of one huge 4K display (30" or was it 36" - I don't remember).


24" 4K is a really good option though


I agree, but on OSX that is the same screen real estate as 1080p. I personally use 5K monitors so I can get 2560x1440


Yes, but with much sharper text :) 5K at 27" is obviously better, but 3x the price.


You can change the scaling in the control panel though.


Comparing laptops to tablets/phones is a bit silly though.

You hold your phone/tablet much closer to your eyes than your laptop is.

A phone needs to be pretty high dpi. A laptop doesn't.


> Maybe it'll be better on Linux

Nope. Scaling is supported by Gtk3 (but you can only scale by integer - 2x, 3x) and Qt5. A lot of apps are still using Gtk2 and Qt4. It is a mess. I wish I had an FHD display instead of WQHD.


Well, if you use the Gtk3 HiDPI settings, yes. However, if you disable HiDPI, scale it to 1x, and then set the font "Scaling Factor" in gnome-tweak-tool to something else, say, 1.25 (a non-integer!), it works great!


For Gtk3 apps it works reasonably well (it scales text, but not buttons, widgets, etc). There's nothing you can do with Gtk2, Qt4 apps though (other than changing font size).


Can't you just set it to FHD?


Not without having fuzzy text. I explained it once on reddit:

> Is there any difference between FHD monitor and WQHD in FullHD mode?

Yes, there is. WQHD is 2560x1440 pixels. If you set your resolution to half that (in width and height), it is 1280x720 (2560/1280 = 2, 1440/720 = 2). In that case every software pixel is represented by 4 physical pixels (2x2) and we are fine. If you set the resolution to 1920x1080 (FHD): 2560/1920 = 1.33, 1440/1080 = 1.33. And there is no such thing as 1.33 physical pixels! So the image becomes distorted and looks bad.
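
Here's the same arithmetic as a tiny sketch, in case it helps: integer ratios let one logical pixel map onto a whole block of physical pixels, while fractional ratios force interpolation (hence the fuzz).

  from fractions import Fraction

  PANEL_W = 2560  # WQHD panel width in physical pixels

  for logical_w, logical_h in [(1280, 720), (1920, 1080), (2560, 1440)]:
      ratio = Fraction(PANEL_W, logical_w)
      kind = "integer - clean" if ratio.denominator == 1 else "fractional - needs interpolation"
      print(f"{logical_w}x{logical_h}: scale {float(ratio):.2f} ({kind})")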


It doesn't have to look bad.

The rMBP 13" has a physical resolution of 2560x1600 (i.e. 1280x800 x 2). The control panel offers resolutions of 2880x1800 (1440x900 x 2) and 3360x2100 (1680x1050 x 2), which the GPU will scale down to 2560x1600. And these resolutions look good and sharp.

In fact, so good that the final resolution depends only on what physical size / working space you prefer, not on which one looks good and which one doesn't.


Keep in mind that Apple claims they developed custom scaling algorithms to handle these downscale modes, so a PC user doing the same thing using GPU drivers might not get as good results.


I tried 1920x1080 on my 2560x1440 screen. I don't think it looks very good.


That MBP has physical 227 DPI. What's yours? (If it is 2560x1440 at 24", it has only 122 DPI).


> That MBP has physical 227 DPI. What's yours? (If it is 2560x1440 at 24", it has only 122 DPI).

It is a 14 inch WQHD (2560 x 1440) display (Lenovo Thinkpad X1 Yoga)


Good point. What about GPU scaling, though? Don't Nvidia and AMD drivers support that for lower-than-physical resolutions?


Not that I know of. The screen can be scaled with xrandr (on X11), but that significantly decreases performance. There's a chance of making it work on Wayland (the modern replacement for X11) in the future.


Can't you switch to some form of "Retina scaling", e.g. 2x, which would result in an effective Retina resolution of 1600x900?


I got confused trying to find that option. It's a laptop with Intel/Nvidia gfx cards. I've seen something a while back where the gfx card vendor would provide such scaling (i.e. done a bit deeper, hence probably working more as intended than not).

Yes, my concerns are now primarily about Windows 10 (Home Edition, if that matters), since I'm not really bothered by (though I can see) such cases on my Linux installs (where I work).

It bothers my son though, as some of the windows are very small (like, as I said before, EA's Origin client) :)


I use Windows 10; I’ve found something under Settings > System > Display > Advanced display settings > Advanced sizing of text and other items > “set a custom scaling level”. It lets you do 200%, but I haven’t tried it out, sorry.


I'll try this sometime this week. Thanks!


Retina screens on MacBooks look amazing. As for VRAM being wasted, that's utterly irrelevant; there's memory to spare on any machine capable of 3D.

Windows 10 has better High-DPI support, but it's far from flawless. The real problem is the Windows software ecosystem is full of junk that hasn't been updated properly or was never programmed correctly in the first place.


I'm a "moar pixels!" kind of guy. I'm very happy with the latest generation of the Dell XPS 13, which has... lots of pixels and works great with Ubuntu and xfce. I was worried that there might be some issues, but I'm very happy with it.

I'm glad to see we finally broke through that wall of no resolution improvements that lasted several years.


A) Windows doesn't handle modern displays properly and has a huge library of problems relating to desktop scaling etc...

B) What was the context of your use of the computer? I.e. Are you gaming or logo branding etc...


Windows 10 - gaming - and some apps/websites. Another example was the mouse driver (after three deep dives into the settings menu) - now its window looks ridiculous. Out of place :) - first it looked a bit like Win95, or maybe Win2000/XP, then it was smoothed (blurred for some reason), and the size was completely wrong.

That was not a big deal, though. Really, the big deals are the windows that come up very small; I've tried backward compatibility settings for them (individually) and it didn't help.

The only thing that helped, as I've said, was to set it to exactly 2x lower resolution (1920x1200 or was it 1920x1080 - I'm not in front of that computer to say exactly).

Funnily, it now looks a bit blurrier (not sure why, I was expecting integer scaling), but after some adjustments it looks almost good now (not as crisp as before, but good enough).

Also, for some games, no more tearing. Mainly games that don't do real fullscreen (possibly with a custom resolution), but somehow expand the window and remove toolbars, etc.


2560x1440 is the sweet spot for me currently. It's a notch crisper than my old (now secondary) display (27" 2560x1440 vs 24" 1920x1080), but is perfectly usable without any scaling and is relatively inexpensive.

My MacBook Pro (13" 2560x1600) is noticeably nicer looking, but not really worth the additional hardware costs and scaling hassles (Windows 10 is better than previous versions [my Surface Pro 3 is set 150% scaling IIRC], but its far from flawless) IMO.


I was looking for a specific laptop for my daughter that came in both 1920x1080 and 3200x1800 and the former was almost completely sold out everywhere. So I guess a lot of people prefer the lower-res. I felt the same way because if you have two laptops with identical specs but one has 4x as many pixels to push -- that's quite a trade off.


It's also the price difference - 4K displays have their uses, and people appreciate them for different reasons, but yes, I was in the same situation, and I felt ripped off for not buying the cheaper 1920x1200 (or was it 1920x1080) option. That money could've gone toward a bit bigger SSD, or maybe more RAM. He really doesn't need more than that.


In my case, the price difference was actually negligible. I just felt the lower-res display was the better choice given all the variables: weight, battery life, performance, RAM, thickness, etc.


Except for a few webpages/manuals/LaTeX documents, I'd be happy with an orange/black text mode. Yay for symbols.


It's all about DPI.

Eg. 12" notebook with 1080p is great.


You are buying the wrong laptops for the job and then getting upset about them instead of blaming yourself for the poor choices.


The job was for my father to use some CAD tools, and he switched to 1920x1080 (for other reasons) - and kept switching between Windows 7, 9, 10. I thought it would please him, but he just complained :). And then for my son - I just couldn't find the same laptop with a lower resolution (after the experience with my father).

The first was from Fry's, and the deal there (a few years ago) was a laptop with a dedicated GPU (8GB) + 12GB RAM; the current one has only a 4GB GPU, which is enough for gaming, and comes with 16GB RAM.

I myself am using an old Acer Chromebook (the one with PageUp/PageDown buttons), with crouton on it. It was the reason I started this conversation. I love Chromebooks, croutonized or not, but I fail to see the reason for a higher resolution. I might come to my senses :)

The only machine that works correctly with resolutions is the OS X laptop my company gave me, out of the choices of a Chromebook, Linux, or maybe even Windows (rare).




