iPhone 4 to have dense IPS display (appleinsider.com)
42 points by mootymoots on May 29, 2010 | 31 comments



I love my Nexus One's screen (AMOLED). Android 2.2 corrected a lot of color calibration problems present when it was released, so the red/warm cast is mostly gone. The colors seem 'cartoonish' at first, since the contrast ratio is so ridiculously high compared to a normal screen.

You'll use it for a few days, then look at your iPhone (well, if you're a dork like me, you'll have both) and the iPhone will look washed out and crappy. The dark areas on an AMOLED are especially dark -- it looks like the screen when it's off. In other words, the screen on a Nexus One (and HTC HD2, etc.) looks pretty much the same with a full black screen as it does when it's powered off. Very cool.

Here's the downside: it's garbage in daylight. Even if you aren't in direct sun, as soon as you're outside, that vibrant screen turns into an indecipherable grey rectangle. Far worse than a TN or IPS display.

Apparently there are upcoming 'Super AMOLED'-type panels that should work better in daylight, but it's not what you'll get on a device right now, so oh well.

I spend most of my time in the dark, so the daylight thing doesn't bug me too much. The viewing angle is also very good on an AMOLED screen, better even than IPS.

It makes sense that Apple would choose IPS over AMOLED. IPS works in a broader range of conditions. Imagine the howls of anguish if people couldn't use it in daylight. There are lots of Android phones to choose from, so you could always get one without an AMOLED display. But there's only one iPhone.

The condescending tone in this article ("made this way to save money" repeated over and over) is retarded. They come off sounding like fanboys. At 250+ dots per inch, you can't see the difference between an irregular grid and a regular grid. Seriously.
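
For what it's worth, here's the back-of-the-envelope version of that claim, assuming the usual ~1 arcminute rule of thumb for visual acuity (my assumption, not a figure from the article):

  import math

  # Rough sketch: the DPI at which one dot subtends ~1 arcminute, the
  # rule-of-thumb limit of human acuity. Distances are illustrative.
  def acuity_dpi(distance_inches, arcmin=1.0):
      dot_size = distance_inches * math.tan(math.radians(arcmin / 60))
      return 1 / dot_size

  print(round(acuity_dpi(12)))  # ~286 DPI at 12 inches
  print(round(acuity_dpi(18)))  # ~191 DPI at 18 inches

So the cutoff lands in the mid-200s at typical phone viewing distances.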


HTC HD2 and the very similar Evo 4G have a standard screen, not AMOLED. If you happen to be allergic to one or the other, Android's fragmentation lets you choose.

Also, don't fall into their rhetorical trap. The irregular grid may well be better, and the difference may well be visible at the current iPhone resolution and be in AMOLED's favor. This may, or may not, become less true as the next iPhone doubles height and width, but I'm not going to assume anything based on AppleInsider's take on the matter.

I don't think anyone's linked yet to the very in-depth Ars Technica article that looked into this screen technology:

http://arstechnica.com/gadgets/news/2010/03/secrets-of-the-n...

spoiler: the 800x480 RGBG screen could be considered equivalent to a 653x392 RGB one, and regarding the iPhone:

> The Nexus One screen remains better than the iPhone screen for text reproduction because the overall resolution is much higher, even taking into account the factors I describe. So if the iPhone is your measuring stick, the N1 screen really rocks. Overall, the N1 display is beautiful and vivid with dark blacks and incredible photo reproduction.
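
The 653x392 figure is just subpixel bookkeeping; a quick sanity check, assuming two subpixels per RGBG pixel and three per RGB pixel:

  import math

  # An 800x480 RGBG (PenTile) panel has 2 subpixels per pixel.
  pentile_subpixels = 800 * 480 * 2        # 768,000
  # An RGB panel with the same subpixel budget:
  rgb_pixels = pentile_subpixels / 3       # 256,000
  # Scale both dimensions equally to hit that pixel count.
  scale = math.sqrt(rgb_pixels / (800 * 480))
  print(round(800 * scale), round(480 * scale))  # 653 392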

So this screen (and other standard screens with 800x480 pixels, such as the Droid he mentions in the article) had leapfrogged the iPhone. A new iPhone will be out soon that, it appears, will leapfrog these screens. Welcome to technology.


Oops, you're right, the other phones are not AMOLED. I had assumed they were, for some reason.

I don't doubt the resolution on the iPhone HD/4G will be better than the Nexus One's. I just don't think that the non-rectangular pixel grid on the Nexus One is a huge drawback at that DPI.

This is going to seem like a really silly conversation in 10 years, when technology has moved on.


Coming from an iPhone 3G, I hate my Nexus One's screen (ATT/2.1): it's far, far too blue. DisplayMate tested the Nexus One's white as being 8870K. I find the display to be fatiguing and useless for viewing photos. I've noticed this problem isn't Nexus One-specific, either, as a friend with an HTC Incredible has the same terrible color calibration.

I'll be dumping the Nexus One as soon as the new iPhone comes out. The display is one of many issues I have with it.


> "The Nexus One's screen uses a "PenTile" grid, reportedly to reduce costs, which packs smaller green pixel components between red and blue elements. This irregular arrangement of subpixel elements results in the Nexus One providing a less accurate display of lines on the screen."

Is that really accurate? Our eyes are less sensitive to green than red and blue, so it seems like the Nexus One grid is logical. Also, anecdotally, my Nexus One display actually seems clearer and more vibrant than my iPhone 3G's.

It'd be cool to have a more technical description of how the two approaches compare.

Edit: I just took pictures of my iPhone 3G, Nexus One, MBP and Samsung LCD TV. Each is a close-up of the Google logo on the display.

The Samsung one is especially interesting. It does make for a very clear, bright display, actually.

http://axod.net/IPhone3G_Google.jpg

http://axod.net/NexusOne_Google.jpg

http://axod.net/MacBookPro_Google.jpg

http://axod.net/SamsungTV_Google.jpg


Eyes are more sensitive to green; that's why PenTile RGBG screens like the Nexus One's have twice as many green subpixels as red or blue ones.

Also, as anyone who follows the Apple blog scene knows, the author Mr. Dilger is either a callous manipulator who has hit on a lucrative pro-Apple-troll writing style, or a very scary individual indeed.


Oops. Thanks. I knew it was one or the other.

The article here seemed to just blindly dismiss it as an "irregular pattern", suggesting it's like that purely for some cost reason - why would it be cheaper to create an irregular pattern rather than a regular one? :/


(OT Snark: So, regardless of which way round it is, it's still logical right?)


Now I'm confused. If eyes are more sensitive to green, wouldn't fewer green elements be needed to have the same effect? :-) (Or is it that they're more sensitive to the degrees of luminosity rather than the luminosity per-se?)


The green subpixels in a Pentile display are (1) smaller and (2) more numerous. #1 is OK because to get a given perceived brightness you need less actual energy in green than in red or blue. #2 is good because accuracy in green matters more than accuracy in red or (especially) blue.

What's not so reasonable is that AIUI these displays' resolution is usually quoted as if there were the same number of pixels as green subpixels, which would only be reasonable if the human eye simply couldn't see details in red and blue at all. Which is almost true for blue, but not for red.
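
To put rough numbers on "matters more": standard luma weightings (BT.601 here, an illustrative choice; nothing says these panels use exactly these numbers) give green more than half of perceived brightness and blue barely a tenth:

  # ITU-R BT.601 luma weights; illustrative, not specific to these panels.
  R_W, G_W, B_W = 0.299, 0.587, 0.114

  def luma(r, g, b):
      # Perceived brightness of an RGB triple in the 0..1 range.
      return R_W * r + G_W * g + B_W * b

  print(luma(0, 1, 0))  # pure green: 0.587
  print(luma(1, 0, 0))  # pure red:   0.299
  print(luma(0, 0, 1))  # pure blue:  0.114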


You may be interested to know that nearly all digital cameras' advertised resolution is the total number of what you call "subpixels."


Yup. Which, note, is not the same thing as the way PenTile resolutions are generally cited; it's worse.

On the other hand, what you actually get out of the camera (either directly or via whatever rawfile conversion tool you use) is an image with the advertised resolution's worth of RGB pixels. I suppose the equivalent of that would be a display that has as many RGB pixels as it claims, but that suffers from artefacts such that what a given pixel actually displays depends on its neighbours.

It's only just occurred to me that this practice of quoting the number of photosites as the resolution of a camera's sensor may help explain the (otherwise entirely indefensible) tendency of camera-makers to specify the resolution of the display on the back of the camera by giving its total number of subpixels. So, e.g., Nikon say that the D3000 has a "large 3-inch, 230k-dot, high-resolution LCD monitor". "230k-dot" means 320x240 pixels = 76,800 pixels = 230,400 subpixels. Bah!
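
The conversion, for anyone checking the arithmetic:

  # "230k-dot" counts subpixels; divide by 3 to get RGB pixels.
  dots = 230_400
  pixels = dots // 3           # 76,800
  print(pixels == 320 * 240)   # True: a 320x240 display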


Not coincidentally, the color camera's Bayer filter and the PenTile matrix use an awfully similar RGBG arrangement.

Each "pixel" you get RAW off the camera is only R, G, or B. The "full" resolution is synthesized in "demosaicing." In other words, two-thirds of your digital camera's image is faked up. People just don't notice much because a) the error from faking it looks a lot like defocus, b) the cameras often try to cover up their suckiness with denoising algorithms, and c) most people get a JPEG out of the camera anyway... which typically encodes the chroma (color) at a lower resolution and throws out a bunch of the fine (high-frequency) detail.
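
If you want to see what "synthesized" means concretely, here's a minimal bilinear demosaic of an RGGB Bayer mosaic -- the textbook baseline, not what any particular camera actually ships:

  import numpy as np
  from scipy.ndimage import convolve

  def demosaic_bilinear(mosaic):
      # `mosaic` is a float HxW array in RGGB layout; we fill in the
      # two-thirds of missing samples by averaging available neighbours.
      H, W = mosaic.shape
      r_mask = np.zeros((H, W)); r_mask[0::2, 0::2] = 1
      b_mask = np.zeros((H, W)); b_mask[1::2, 1::2] = 1
      g_mask = 1 - r_mask - b_mask

      # At a site that has the sample, the kernel returns it unchanged;
      # elsewhere it averages the neighbours that do have it.
      k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
      k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

      out = np.empty((H, W, 3))
      out[..., 0] = convolve(mosaic * r_mask, k_rb)
      out[..., 1] = convolve(mosaic * g_mask, k_g)
      out[..., 2] = convolve(mosaic * b_mask, k_rb)
      return out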

The really weird part is that the PenTile matrix should be capable of a higher effective resolution when displaying color images from digital cameras (except those from Foveon X3, 3CCD, or even more uncommon setups). Its subpixels are arranged the same way as the data is, which saves you from transforming the image to and from the intermediate RGB format.


"the error from faking it looks a lot like defocus"

Mosaic cameras tend to have an optical antialiasing filter that blurs the image to reduce moiré effects.


I've read this over and over again, and I can't tell if you're agreeing or disagreeing with me!

Defocus spreads a point of light over more than one point in the image... as does the "antialiasing filter."


There are actually multiple potential sources of error here. [EDIT for disambiguation: I mean error in reconstructing the image at full RGB resolution, not error in what anyone has been saying in comments here.]

1. The antialiasing filter. This behaves very much like a defocusing error.

2. Demosaicing. Exactly what this does depends on the algorithms in the camera. It doesn't necessarily look much like defocus. (For instance, on some cameras I've worked with -- not consumer ones, FWIW -- you get demosaicing artefacts that produce a sort of tartan effect.)

If the antialiasing filter is strong enough that it destroys all information in the image above the Nyquist limit for the R and B photosites, then demosaicing can in principle be perfect -- i.e., lose no information that wasn't already lost by the antialiasing filter. But (a) that ceases to be true in the presence of noise in the sensor, and of course there usually is some, and (b) an antialiasing filter that strong throws away information because then you're not really using all your green photosites. So in practice, sometimes you'll get ugly high-frequency demosaicing artefacts. Presumably designing a good demosaicing algorithm is largely about making this happen as seldom as possible for real-world images.
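
Concretely: R and B are sampled at twice the photosite pitch in each axis, so their Nyquist limit is half the sensor's. A quick illustration (arbitrary units, my numbers):

  def nyquist(sample_pitch):
      # Highest representable spatial frequency: 1 / (2 * pitch).
      return 1 / (2 * sample_pitch)

  pitch = 1.0                # photosite pitch
  print(nyquist(pitch))      # 0.5  -- a channel sampled at every site
  print(nyquist(2 * pitch))  # 0.25 -- R and B: half the detail
  # Green sits in between: its checkerboard (quincunx) sampling makes
  # the limit direction-dependent.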


I don't think I was agreeing or disagreeing! Supplementing?

Mosaic cameras have an optical blurring device, the antialiasing filter, between the lens and the image sensor. It throws away information, so the image will always have residual blur. Even if the demosaicing algorithm does something a bit silly, it will still look like blur.


I can't comment on the author, but I can and will comment on PenTile OLED resolution claims:

http://www.nouvoyance.com/files/pdf/CV%20Application%20Note%...

Please download the above white paper on PenTile OLED resolution. Resolution is found by counting the number of monochromatic, desaturated (black-and-white, or otherwise not fully saturated colors) lines and spaces:

NOT "pixels" or "subpixels"


See also night vision goggles?


I compared an HTC Desire (same screen as the Nexus One) with a 3GS side by side. While colors 'pop' more on the Desire, they seem to me rather artificial and cartoonish in photos (faces).

Text characters' edges are slightly irregular under a magnifying lens, but the higher resolution compensates. It becomes obvious when compared with my HD2 (same resolution), though, so I guess it will look crappy next to the 4G.


Some info on the pentile matrix:

http://en.wikipedia.org/wiki/PenTile


The AMOLED pixel layout on the Nexus One is actually the same as that employed in many professional video codecs, just in case anyone was wondering where it originated. There's a lot of research backing up its viability as a balanced-looking display arrangement.


It's nice to see Apple get behind IPS to the extent that you don't even have to dig into the Tech Specs page to be told that the iPad uses it ( http://www.apple.com/ipad/ )

IPS monitors on the desktop are a treat to use, especially if they're calibrated. Not all TFTs are created equal, far from it.


Absolutely. I'm still waiting for a good, affordable IPS and LED-backlit desktop LCD to replace my aging Dell 20", and my hope is that Apple will help push this technology back into the mainstream consciousness as a "thing to have", and that companies will start mass-producing them to gain a competitive edge.


Hear hear. Hopefully this is the beginning of the end for those awful TN displays so common on laptops...

Mine is so bad I can't even look at photos together with somebody else...


I don't care about the iPhone. But when do I get dense IPS displays for desktops? Feels like we've been stuck at 100 PPI forever.

24" 300 PPI IPS droool


Now that there are 3 different iPhone resolutions, are we going to keep hearing about how bad fragmentation is?


What’s the third one? The iPad? That’s fragmentation you want – iPhone apps make no sense on that device.

Other than that, doubling the horizontal and vertical number of pixels (four times the pixels) is actually quite clever and will probably not lead to any fragmentation. Since the display’s size won’t change in any significant way, old iPhone apps will just look the same on the new device (every pixel will just be displayed four times, no need for ugly interpolation) – everything that’s a vector and not pixels will just look better without any effort on the developers’ part. The old iPhones presumably won’t have any problem with apps that come with higher resolution bitmaps.
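
Pixel doubling really is that trivial; a sketch, with NumPy standing in for whatever the compositor actually does (the 640x960 figure is the rumored new resolution):

  import numpy as np

  def pixel_double(img):
      # Nearest-neighbour 2x upscale: each pixel becomes a 2x2 block.
      return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

  # An old 320x480 frame fills a 640x960 panel exactly, no interpolation:
  frame = np.zeros((480, 320, 3), dtype=np.uint8)
  assert pixel_double(frame).shape == (960, 640, 3)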

Since the screen size doesn’t change, it also doesn’t make any sense to add additional elements which would then be too small on older devices – the size of our fingers and the resolution of our eyes are the limiting factors here, not the resolution of the screen.

All of this also means that an increase in resolution is no longer a big leap forward (i.e., one that allows you to display more stuff); it “merely” makes text, videos and photos look much nicer. It’s quite cool that we finally reached that point.


Palm did the same trick when they switched from 160x160 screens to 320x320, back in the day.


320 PPI is getting to be pretty sweet. Hook me up with a 30" version, along with one more doubling of PPI, and it's going to be amazing.


Looks like Gruber was right on the money again. From April: http://daringfireball.net/2010/04/why_960_by_640



