Rant: HDTV Has Ruined the LCD Display Market: Or, I want my pixels and DPI now!

Pete Brown - 22 April 2010

Ok, that's it. I've had it. I want my pixels, damn-it!

For a while, screen resolution was steadily going up on our desktop displays. The trend was good, as I've always wanted the largest monitor with the highest DPI I could afford. I mean, I used to have one of the first hulking 17" CRTs on my desk. I later upgraded to a 21-inch job so huge that if you didn't stick it in a corner, it took up the whole desk. It was flat-screen, though, and full of pixels. It cost me around $1100 at the time.

LCD

Then we switched over to LCD displays. At first they were all 4:3, but later most became 16:9. Still, the resolutions were slowly but steadily going up. I figured it would be only a matter of time before massive displays with super high DPI became available. Now mind you, I don't want super high DPI to fit more info on screen; I want super high DPI so I can get extra smooth text and screen elements. Windows supports high DPI displays; we just don't really have any.

If the DPI on screens were high enough, we wouldn't need antialiasing or font smoothing or any of the other hacks that get us around low DPI issues. Think about printers: once the DPI got high enough, you could no longer tell there were individual dots making up the letters. That happens at about 600dpi (or 1200dpi if you're really paying attention). Heck, many folks look at 300dpi and think that's smooth. Our screens? Mostly still stuck at 96dpi.
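
To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python, just to show the arithmetic; it assumes a nominal diagonal and a 16:10 aspect ratio):

    import math

    def pixels_needed(diag_in, aspect_w, aspect_h, target_dpi):
        # Panel width and height in inches from the diagonal and aspect ratio,
        # then the pixel counts required to hit the target DPI.
        diag_units = math.hypot(aspect_w, aspect_h)
        width_in = diag_in * aspect_w / diag_units
        height_in = diag_in * aspect_h / diag_units
        return round(width_in * target_dpi), round(height_in * target_dpi)

    print(pixels_needed(24, 16, 10, 96))   # (1954, 1221): roughly what ships today
    print(pixels_needed(24, 16, 10, 300))  # (6106, 3816): laser-printer-smooth territory

That second line is about ten times the pixels of a 1920x1200 panel, which is a decent summary of how far behind we are.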

My own primary screen (until I pick up the 30" later this year) is an older HP 2335 running at 1920x1200. It has a "TCO 03" sticker on it, to give you an idea of its age.

Then comes HDTV, and a backslide on resolution and DPI

Now, suddenly, the market is all about displays at 1920x1080 "1080p HD" resolution. Wait!! What happened to all those wonderful pixels at the bottom? 120 pixels may not seem like much, but it is when you work on software that needs every last inch of your screen (development, design, video editing, etc.). The brand new display to the left of my primary one is actually an inch larger at the same resolution, so its real DPI is slightly lower.

At the rate we were going for a while, we should have had twice or three times the DPI on a 24 or 23 inch screen. But nooo.

Now all the panel manufacturers are hitting the consumer sweet spot of HDTV resolution. There's more money to be made in TV than in supporting alpha geeks with super high res displays. The highest resolution I've seen is on a 30" display, and those still cost a buttload ($1200 to $1800 each) because of the lack of competition. Even the laptop manufacturers are backsliding to these piddly netbook-class panel resolutions.

Average DPI is going down because TVs sell based largely on size. You end up with a 60" screen running at the same resolution as new 23" computer displays. They have big fat pixels the size of thumbtacks (ok, not quite that large).

We know we can build higher DPI panels. Look at laptops running that same 1920x1200 resolution on a 15" screen. Heck, look at phones doing 800x480 on a 4.1" OLED display. We have the technology. It can be done.
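
Run those examples through the same sort of napkin math (a sketch again, using nominal diagonal sizes):

    import math

    def dpi(px_w, px_h, diag_in):
        # Pixel density from the resolution and the diagonal size.
        return math.hypot(px_w, px_h) / diag_in

    print(round(dpi(1920, 1200, 24)))  # ~94: a typical 24" desktop panel
    print(round(dpi(1920, 1200, 15)))  # ~151: those high-res 15" laptop panels
    print(round(dpi(800, 480, 4.1)))   # ~228: a 4.1" WVGA phone screen

The phone panel is already well past double desktop density, so the limitation clearly isn't the technology.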

Will OLED change things?

I understand that OLED is both easier and more efficient to manufacture. Hey, and OLED displays are more power efficient too, especially due to reduced backlighting requirements. [Correction: eliminated backlighting requirements, not reduced]

With mobile devices moving to OLED, will we see them on our desktops? If they're easier to manufacture and don't suffer from the same rate of dead pixel problems, will that make the panel companies branch out into much larger displays? Or will they make them bigger, but at the same sucky 1920x1080 resolution?

Since OLED is done with plastics, can I just get one big honkin' panoramic display that wraps around me at my desk? Or am I going to have to wait for some new HD++ resolution to hit TV-land…or worse…have a 3d display on my PC?

I want my pixels. I want my DPI, and I want it in the form of a big honking display for my computer.

Update 2010-04-26: Here's a very timely XKCD on this very topic:

[xkcd comic]


162 comments for “Rant: HDTV Has Ruined the LCD Display Market: Or, I want my pixels and DPI now!”

  1. Damien Guard says:
    A comparison with printers isn't very conclusive - it's not just their DPI that makes text look smoother, but also the non-rectangular nature of the dots combined with the slight smudge of the toner or bleed of the ink.

    I'm willing to bet anti-aliased text on a 300dpi LCD screen will still look better than bi-level rendered text on the same device.

    [)amien
  2. Pete says:
    @Damien

    But don't you think there is a "magical" DPI level where anti-aliasing is no longer required? I'm pretty sure that if you get a screen to 600dpi or 1200dpi, there's no need for it. Of course, a screen hitting DPI like that would be amazing regardless :)

    Printing: with a laser, the roundish dots smooth out a bit, but they're pretty consistently opaque from center to edge, so I'm not sure it's quite the same effect. Similar enough that I get your point, though.
  3. WOmbat says:
    Given that I'm hunting a new display, I have to agree with a lot of what you said. As far as I am concerned, a 24" display should be 1920x1200, not 1920x1080. I know that it's all about economies of scale, and 1920x1080 panels can be used for TVs as well as monitors, but that doesn't assuage my displeasure.

    I have two 30" 2560x1600 displays, and they are a delight to use (albeit horribly expensive!). I am tempted to put them both on the same machine (one is on my Photoshop box - it's a colour-calibrated Eizo CG301W; the other is a Dell 3008WFP on my primary development box). I think I might enjoy a pair of 30" panels, possibly in portrait orientation :-)

    I have used 2 x 24" displays before, and it's OK (I'd like smaller bezels), and I've used 4 x 19" displays in portrait orientation (way too many bezels!). I want a huge desktop.

    Not so convinced of the need to increase dpi - I want a lot more pixels before I'm willing to make them too much smaller. 30" panels run around 0.25mm dot pitch, which is about as small as I want to go until I can have panels up around 8000 pixels wide. At 16:10 aspect ratio, that would be 8000x5000 - might require a fairly serious graphics card to drive it, too.
  4. int19h says:
    @Damien: you'd lose that bet. I've seen WinMo devices running at ~260dpi (that's _honest_ 480x640 on a 2.8" screen - not RGBG as in e.g. Nexus One, but proper RGB pixels each). They ship with antialiasing disabled by default, because you can't really tell the difference when working with it.

    For example, take a look at the following screenshot:

    http://www.smartphone.ua/img/arts/1682/1682_36.jpg

    This is a screenshot directly from a Glofish X800. As you can see, there's no AA. But when you're holding it in your hand, you can't see the jaggies even if you stare at the screen very hard.
  5. Scote says:
    Yea for noticing this!

    I'm shopping for a new rig, and every common monitor, from a 17" MBP panel to a 25 inch Samsung, has the same horizontal pixel count :-0 That is just crazy. And then there is the huge price jump from the 24" class to the 30" class, where you finally get more pixels.

    I guess I'll hold out for a few months and see if Apple releases a new 2560 horizontal pixel monitor as the rumors claim.

    Not sure if it is true that CCFL gives a 100% gamut vs. 72% for LED, as is claimed for the new 30" Dell, though.
  6. robin says:
    i bet you wouldn't even shut up if you ended up in the situation where the display's size combined with its resolution resulted in you just not being able to make out individual pixels comfortably. at what point would you cry about the 11px verdana craze? i have sharper eyesight than what you americans refer to as "20/20 vision", and i find 17" "lugtops" with 1920*1080/1200 resolution a complete idiocy and utter annoyance.
  7. Cory says:
    I think you should better explain that high DPI makes things more legible, and how people who think things are too small are doing it wrong -- a great majority of people still get confused by this.

    I'm with ya, though! I've been waiting for high-DPI displays for a long time. And how about scRGB support with a wider gamut? :)
  8. Sebastian Gomez [almafuerte] says:
    Dude, you use windows!

    I mean, you actually get paid to promote the worst thing that has ever happened to technology (namely, microsoft). That is our real issue, not the lack of pixels in displays. The problem here is that windows is a piece of crap, and the only way microsoft has to try and fix it is by throwing hardware at the problem. I can do things on GNU/Linux in an ARM processor that you can't do on windows in a quad core intel. Microsoft software looks shitty on ANY display. No vector icons, outdated (and ugly) bitmap-based widgets, no real GL desktop, windows is still 2D (with a few shitty direct3d-accelerated effects), while the rest of the world (OSX, GNU/Linux) have FULL OpenGL graphic servers.

    The problem is not the lack of pixels in your display. It is the lack of quality in your software.
  9. MechR says:
    I don't mind DPI so much, but absolutely despise the shift toward 16:9. It *might* be better for movie-watching, but it's worse for everything *else* people do on computers. Hell, it's not even better for movie-watching because 720p has to be upscaled to fit 1366x768.
  10. Pete says:
    @Sebastian

    You're letting your dislike of a company cloud your reasoning.

    I'm looking around, and I don't see another operating system that diverges from this path. Windows, Linux, OS X - they're all based on the same core ideas. Heck, even Linux desktops had "start" menus for the longest time.

    FWIW, I get paid to help developers be productive, not to promote Windows. Any promotion of Windows that I do do is because I like the product.

    So, nice try, but you need to go back and do a little research if you plan to have a discourse on this topic in a way that rises above trolling.
  11. Daniel says:
    Point size is a physical measure of size (1/72 of an inch, traditionally gauged against a lower case 'x' or similar).

    Windows (pre-Vista) was pretty bad at handling non-96dpi displays, hence the "my fonts are too small" cries from people using high DPI monitors (and their lack of clue to change the fonts - although it's a pain that you should have to).

    "Proper" OSen ask the monitor for its physical size and then compute font sizes in pixels based on this and the desired font size (in points).

    Typed from a 15" display @ 1920x1200 - it looks great because I have more pixels to display the same physical sized text :)

    Alas my new one is only 1680x1050 as buying anything higher in 15" is apparently impossible :(
  12. Pete says:
    @Robin (thanks @Cory)

    You need to understand DPI. Rudeness of your comment aside, what you need to consider is that the higher the DPI, the crisper the rendering. So, you can have your letters appear at the same physical size, but they have a better form because they are made up of more pixels.

    Think of printing, to stretch the analogy a bit further. A 9-pin dot matrix printer can make text that is 1/8 of an inch high, and so can a 600dpi laser printer. Which one is going to have better character formation? The one with the much higher DPI. They're both 1/8" high, but one looks better and is easier on the eyes because it is made up of more elements.
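
    In code, the relationship is about this simple (a minimal sketch; a point is defined as 1/72 of an inch):

        def glyph_pixels(point_size, screen_dpi):
            # A point is 1/72 of an inch: same physical size on screen,
            # but more pixels per glyph as the DPI rises.
            return point_size / 72.0 * screen_dpi

        print(glyph_pixels(12, 96))   # 16.0 px tall at today's typical 96dpi
        print(glyph_pixels(12, 192))  # 32.0 px tall at 192dpi: same size, twice the detail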

    Pete

  13. Kisai says:
    I always hated the fact that I could never find another 1920x1200 15" Laptop... I'm like you have got to be freaking kidding me, I don't want a 17" laptop with LESS pixels.

    Same with the desktop. I waited and waited and waited until someone released a 1920x1200 display at 24" or LESS. Wound up with the Samsung 245BW at about $700 at the time. (The other option was a BenQ model that became really popular because it had HDMI, DVI, and VGA in it, but nobody would carry them.)

    If I buy something larger, I want more pixels. Otherwise you have to be sitting several feet away, and that's just not fun unless you're playing a game.
  14. Greg 'groggy' Lehey says:
    Agreed entirely that the screen resolutions are ridiculously low. But you're blaming the wrong cause. Screen resolutions ground to a halt about 10 years ago. In 1996 I ran a monitor at 1600x1200. In 2000 I got two monitors that I ran at 2048x1536. Since then, nothing affordable has had a higher resolution. When I finally changed to LCD screens I had to drop resolution to do so.

    But look at sales figures. The average man in the street didn't even buy the highest resolution he could get until HDTV came along. If anything, HDTV is raising the resolution for him. The real problem is HTML and the design of both web browsers and web pages. Try displaying a web page on a simulated 4096x3072 (300 dpi) display. In many cases the text overflows, overlaps and sometimes disappears. Buttons disappear outside non-resizeable windows without scroll bars. The page becomes illegible and unusable. The big problem is that so many things in HTML assume a specific relationship between point size and pixel count. Until that gets fixed, most people won't buy higher-resolution screens, so they won't appear.
  15. Wyatt says:
    Yes! Finally, someone who agrees with me!

    At the risk of someone making an awkward, inappropriate joke, I don't much care about the size of the monitor. People at my old office looked at me like I was an idiot when I said I wanted the extra one inch of monitor for the extra resolution. It's not size I was after, but workspace!
  16. Pete says:
    @Greg

    Good, umm, point. I know so many sites (including this one, no doubt; I'll need to check my CSS) that use pixel sizes for fonts.

    So maybe HD has raised the low end, but it seems to me there are far fewer good choices on the high end, and the high-end is pretty fixed in resolution. Someone with access to market data for the last decade or so should plot the number of available displays per resolution per year and see what it looks like. My guess is a convergence toward the middle.

    For folks asking for vector apps: WPF and Silverlight both address this (as does Moonlight on Linux). Get more apps written in those and we can actually use higher DPI correctly, without the pre-Vista DPI mess we had (which is due to GDI and apps written in it that don't bother to check DPI).

    That said, keep in mind that without the DPI increase, vector won't look any better than bitmap.

    Pete
  17. Rob says:
    I don't mean to pick on Windows, but I have had serious problems running Windows on high res displays. (Some other operating systems have the same types of problems to more or less of an extent).

    The main problem is this: When you have a display that is very high res for its size, (i.e. the DPI is high), everything shrinks. In particular, the fonts become microscopic. (Yeah it's great that OS X has giant icons, etc., but it's the fonts that matter more!)

    So what do you do? Well, you can of course increase the font size! The problem is, this doesn't work properly. Many, many Windows applications (including built-in ones) make assumptions that X lines of text can fit in Y pixels. In fact, you can even get some bugs to show up just by using the same font size but a different font (e.g. set your locale to Japanese, and occasionally some English dialogs will show up clipped or cut off). If you double the font size, you are in for all sorts of problems, and many things become unusable. If you don't believe me, try it: double your font size globally in Windows and use it for a week. Yeah, it will be big if you are on a low resolution display, but there is the more basic problem that applications don't adapt to it properly, and so it's not usable. This is a particular problem on some of the netbooks that have small screens but normal (or higher!) resolutions, like the Sony Type P.

    You can of course lower the resolution of the screen instead, but then you are wasting that expensive hardware you paid for.

    What higher resolution does work well for in Windows is just having more real-estate in the case of a larger display.
  18. Jeff says:
    I have low vision, and pixel density is very important to me. I have to use a lot of external magnification to see. When my Gateway 22" Sony-based CRT monitor bit the dust last year, I began searching for a replacement. So far nothing has come close. The best I have found is an Apple iMac 22. I keep hoping someone will produce a higher-density display.

    Jeff
  19. Steve Prior says:
    Yes!!! I completely agree. I'm typing this comment on a Thinkpad T43P with a BEAUTIFUL 15" 1600x1200 IPS screen and you just can't buy those anymore. Not only has the resolution gone down, but the displays aren't IPS quality either. The requirements of a TV screen are quite different from a computer - vertical pixels are much more important when you're mostly reading vertical documents and want to see as much of the page as possible.
  20. Jermain Anderson says:
    I can't imagine this guy knows much about displays. His remarks on OLED are completely off: not only are they harder to manufacture, but they don't have a backlight at all. Their life span is significantly lower than current displays, and they are used only on mobile devices because they have such problems at large sizes. The business of displays is obviously lost on you. That is how I know this guy is not in the industry: he doesn't know why his computer displays are matched to TV resolutions. To save money on smaller sized displays, most companies use the same panels to produce small televisions as they do to produce larger computer monitors.
  21. talli says:
    OLED is not yet cheaper to manufacture, but there is potential that it may be. In particular, check this link out:

    http://www.technologyreview.com/computing/24520/?a=f
  22. VulcanTourist says:
    How about having an auto-pivoting display, with a driver that senses the layout of the content and automatically rotates the display to make optimum use of its physical resolution and pixel pitch?
  23. Pete says:
    If you haven't seen it (or ended up here from it) there's a thread on /. with some pretty insightful comments (as well as a good number of folks who don't get the concept of DPI), but still, lots of good info.

    http://hardware.slashdot.org/story/10/04/23/0012218/HDTV-Has-Ruined-the-LCD-Display-Market

    Pete
  25. Pete says:
    @Jermain

    Thanks for all the assumptions.

    The OLED info I received from folks in the industry. I have no first-hand experience in manufacturing (nor, do I suspect, you). That said, I could certainly be wrong.

    As to not getting why panels are sized to TV resolutions: that was the whole premise of this article :)

    Pete
  26. Legonaut says:
    Just wanted to point out that single-link DVI bandwidth maxes out at 3.96 Gbps (1920x1200 @ 60 Hz) (http://en.wikipedia.org/wiki/Dual-link_DVI). I'm not an EE, but I'd bet that there's some premium for the components necessary for a monitor to exceed that.

    Exceeding 2560x1600 @ 60 Hz will require a non-DVI interface to your graphics card, as well as increased pixel densities from the glass manufacturers. Not sure if that'll be some HDMI variant or something else altogether.
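
    In rough numbers (a sketch only; I'm assuming 24 bits per pixel and around 12% blanking overhead):

        def link_gbps(px_w, px_h, hz, bpp=24, blanking=0.12):
            # Raw video bandwidth: pixels per frame * refresh rate * bits per pixel,
            # padded by an assumed blanking-interval overhead.
            return px_w * px_h * hz * bpp * (1 + blanking) / 1e9

        print(round(link_gbps(1920, 1200, 60), 2))  # ~3.72: right at single-link DVI's 3.96 ceiling
        print(round(link_gbps(2560, 1600, 60), 2))  # ~6.61: dual-link DVI territory
        print(round(link_gbps(3840, 2400, 60), 2))  # ~14.86: beyond dual-link entirely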

    I've got four LCD panels on my system (including two of the BenQs @Kisai mentioned), and I long for the day I can lose the bezels. :-)
  27. Richard says:
    I bought a T60p thinkpad about 4 years ago, with a fantastic 15", 1600x1200 display. It's the best I've ever used. The machine is getting quite venerable, but I keep it because there is no decent screen around now. All laptops come with "shortscreens" now, which are just painful to use.

    Desktop monitors aren't quite so bad, because one can buy a 24" shortscreen as a substitute for a 20" fullscreen and still get 1200 pixels vertically. But still, why does the DVD tail have to wag the computer dog?

    For that matter, 16:10 is lousy for TV and appalling for home movies. There's wasted detail on the sides, and people get truncated at top and bottom. The obvious solution is to have 4:3 screens, and, in the rare case of watching video, black bands at the edges.
  28. Pete says:
    @Rob

    This actually got much better starting with Vista. If we could get more people to move to Windows 7, we wouldn't have those problems ;)

    Seriously, give it a try. Win7 is a good OS, better than XP IMHO.

    Pete
  29. Matthew says:
    @pete
    Excellent article. I agree it is odd that we have slowed down in such a way. Seems to me that the only choice you have (especially in the laptop market) is whether or not the screen is glossy.

    @robin
    I have one of those 17 inch "lugtops"; however, I did not get it for the screen. I got it because I can have 2 internal hard drives in RAID 1 + 12GB of RAM. The 17 inch screen is nicer to look at than a 9" netbook, but ultimately that was not a factor in the decision. Keep in mind that for different use cases there are different solutions. No one solution will meet all requirements.

    @sebastian
    Just a minor correction, the worst thing to ever happen to technology was not Microsoft... It was fanboyism.

    -matt
  30. mas says:
    Your argument (which is very valid by the way) kinda reminds me of the reason American car makers went after creating and marketing SUVs: more markup and higher volume based on the perceived safety(!) of a bigger car (or the bigger surface area of an HDTV with the same old technology as my old 19" 1600x1200 monitor), compared to safety features that buyers cannot see (higher, denser DPI) and that cost a lot to do R&D for.
  31. Burnstreet says:
    I wholeheartedly agree with you. I'd like to swap the 2 CRTs I have running at 1940x1536 for flat screens, but I haven't found a worthy replacement yet :(

    Actually, there was one such display: the IBM T220/T221 http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors
    But they stopped making it in 2005, and it's nearly impossible to get one (or two, for that matter).
  32. Pete says:
    @Mark

    You must be thinking of something else.

    ClearType is a font smoothing/subpixel rendering technology. Most modern operating systems use a font smoothing algorithm to make up for the lack of pixels.

    http://en.wikipedia.org/wiki/ClearType

    Pete
  33. mark says:
    Also, the entire UI of Windows isn't as resolution-independent as OS X's, so the whole windowing system of Windows requires yet another re-write to come even close to OS X.

    Also, since the memory footprint of Aero is disastrous next to OS X's, Aqua is quite snappy on a 1GHz G4 vs. Aero being a slug on a quad Core 2/i7 Intel @ 3GHz with an ATI 5870 (really, it's fugly and slow on a 30" display).
  34. Viaken says:
    Good luck. A year or so ago, I was doing home PC repair and support. The majority of the houses I went to had 20" or 22" displays (laptop or LCD) running at 1024x768. When I tried to set them to native resolution, they'd complain: "Why's everything so small? I don't like it. It's hard to see." Turning the font size up didn't help. "Just set it back the way it was." The mother of one of my friends runs her new Vista laptop at 800x600 so she can see. I spent considerable time adjusting font sizes and things, but she complained about the icons and web pages and on and on. It just wasn't worth the fight, in the end.

    Most people just don't give a rat's, so there's a lot of inertia for us to overcome in order to get this thing moving again, if it ever will. We may evermore find ourselves paying a premium for decent displays.
  35. Pete says:
    @mark

    ClearType in Vista and Win7 does a really good job. If you're talking about ClearType on XP, I agree, it wasn't very good.

    Also, I don't get why you say OSX is more resolution independent. You can't even change the system font size (as pointed out on /.)

    @Viaken

    The same problem as in so many other places in life, like quality of movies, for example. *sigh*
  36. LeoP says:
    >i have sharper eyesight than what you americans refer
    >to as "20/20 vision", and i find 17" "lugtops" with 1920*1080
    >/1200 resolution a complete idiocy and utter annoyance.

    You have a point: 1920*1080 should be 15", not 17"!
  37. pddc says:
    My jaw dropped when I saw a 4K TV for the first time at NAB this year. Of course, we're far from getting this. There would be no interest, as transporting this much data is still not feasible for networks or home delivery (it would only fit about 1 hour on Blu-ray).

    The 3840x2160 panel was flawless.

    Too bad all of NAB was focused on 3D stuff. Panasonic had the perfect definition of what 3D means (DDD: Dollars, Dollars, Dollars), and that is the only point for the industry. After selling all the "HD" 1080p gear they could, they have to sell the next thing, and of course 3D is easier than higher res.

    Don't want to start a rant of my own. Fully agree we need more DPI.
  38. Sebastian says:
    Don't cover it up. It doesn't matter what machine you used to code, you coded in BASIC. It doesn't matter if it was BASIC on a Talent or BASIC on a C64; it's still BASIC. And anyone that has ever touched BASIC is already too screwed up to learn any real language.

    Real men wrote 6502 assembly on their C64s.

    //My first approach to programming was assembly for a 68k.


    Anyway, I guess you have your reasons for liking Windows. I'm sure you have checked out the source code, found it to be elegant and well written, and made many contributions to it. Oh, right, you don't even have access to it. And the one time some of it leaked, it turned out to be a huge mess of spaghetti code. Enjoy.
  39. Troy says:
    I went from a laptop with a 15 inch 16:10 1280x800 screen to a 16 inch 16:9 1366x768 one. I agree, I want more pixels: at least 1920x1080 for my 16 inch screen, but the more the better. I am using Windows 7, and I have found that unless you're using a large screen, the Snap feature is quite useless. But at the same time I want multi-touch screens. So what I really want is a 300dpi 16 inch multi-touch screen.
  40. TS says:
    Thank You! I've been on a rampage the last couple days trying to find a wide variety of laptops without the extremely annoying 1366x768. I had higher resolution laptops in the 90's!

    768 pixels! on a 16" laptop screen! WTF! Who is designing laptops these days?
  41. chriscut says:
    Obviously they are keeping the sizes at 1080p because it's cheaper! I really hate the new smaller 16:9 screens, which are all you can find at the stores today. Some higher priced screens are 1920x1200, but I need more. I guess LCD manufacturers figure that if you want more pixels, you'll shell out dough for a second screen.
  42. Joe says:
    Well, as much as I hate you for being an MS person, I AGREE. What the H. is going on? There isn't any way to deal with it either, because you'd have to redesign the entire laptop chassis and boards, so you can't hack together anything, and as a small company you can't even design something to sell because you can't get the pieces any more. Trust me. I've tried. I sell these things.

    My next laptop is going to be a used Lenovo T60. I'll be selling systems I'm not buying, even if I use the software they come with and the essence of them (services/support). That T60, despite being years old, will be a minor upgrade for me. Fortunately I don't need anything more! The systems then were fast, and the MS Windows operating systems have only slowed things down. The problem comes with the RAM for me, though. Not so much a speed concern, as I probably don't utilize even 1GB on GNU/Linux. I'm running out of RAM only in the sense that I keep coming up with new uses for it. At one time it was a RAM drive for things like web browsing (until I switched to a non-MS browser, one that was designed properly), and now it is other stuff where I could use a few GB of RAM. I mean, even though it'll be a long time before I really, absolutely must have more RAM than a T60 can handle and my OS/applications demand it, the time will come. And sooner than I care to think. How long can I hold out - 4-5 years maybe?
  43. Max E. says:
    Couldn't agree more.

    Widescreens... more like shortscreens, amirite? Unfortunately, 16:9 screens are cheaper to manufacture. That is the real reason they have become "popular," there are no alternatives.
  44. James says:
    Very good point, and I agree entirely. Things would be nicer if I could hook up a higher end screen and have it work like the lower sized screen, just with more resolution. I myself am probably looking at a pair of Dell UltraSharps, because they're the closest thing to a high end display that I can afford a pair of.
  45. Bob says:
    Utter crap. It's all about resolution vs. distance vs. "angular acuity". Your eye is a sensor and has a built-in "resolution" (not quite true, but roughly analogous).

    When shopping for a plasma panel I had to choose between 720p and 1080i. Based on a few calculators floating around the net, at the viewing distance in my lounge (10') the human eye could not detect the difference between the two on a 50" panel.

    Being a skeptical person, I eventually went to the store and stood 10' away from the panels to compare, and it turned out to be true. There was no way I could tell the difference between 720p and 1080i at that distance. (What was noticeable, however, was the poorer color reproduction of plasma over LCD, especially on continuous shades, which exhibited "banding" artifacts.)
  46. Happy says:
    Still lugging around my lovely Sony Trinitron G500 display (repaired twice); even though my spine hates me for it.
    Can run 2560x1920 in Windows (equal opportunity whine: KDE desktop on Ubuntu maxes out @ 1856x1392).
    In younger years I just loved to be able to "see everything" - I still aspire to that, but crisper, larger text is also getting to be a priority these days.

    Since you can't actually "increase" DPI, I have tried faking it (ie: pretend my 2560@96 is 1280@192) and found that all OS in my possession (Vista, OS X, Linux of various flavours) fail to cope adequately with a display of that pitch.

    In the end, I settled for just running all the system fonts at twice the size; many stupid programs (again, on all the OSes in question) fail to adequately enquire of the runtime environment the particulars of their display situation. You can't blame this on the OS vendor; it's a dumb programmer mistake (like, how many of you out there even bother to test at 150% font scaling? I thought so).

    A previous poster mentioned the actual "dot clock rate" (hey, I'm old) of display interconnect hardware probably has something to do with it - that's a certainty. OLED has the promise of perhaps integrating memory and display controller hardware into the substrate (pluggable display tiles anyone?).

    Laser based displays also have promise (no pixel defects, just angular accuracy to be concerned about).

    Increased LCD dot pitch is going to be a hard thing to fix given that (as we have read) the predominant consumer of display panels (for the next few years, anyway) will be the upgrade and replacement market for TV.

    Nirvana? How about say, 5 meters high, 10 wide, coupla hundred dots per cm? (not holding my breath)
  47. Pete says:
    @Bob

    "Utter crap"? Did you read anything above, or just comment based on what you assumed the article to be about?

    I'm talking about panels used for computer displays. I'm sitting just shy of 2' away (eyes to panel). I can certainly tell the difference.
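
    For what it's worth, the angular-acuity math you cite actually supports this at desktop distances (a rough sketch, using the usual one-arcminute figure for 20/20 vision):

        import math

        def pixel_visible_within_in(dpi):
            # Distance (in inches) at which one pixel subtends one arcminute;
            # inside this distance a 20/20 eye can pick out individual pixels.
            return (1.0 / dpi) / math.tan(math.radians(1 / 60))

        print(round(pixel_visible_within_in(96)))   # ~36: 96dpi pixels are visible out to ~3 feet
        print(round(pixel_visible_within_in(300)))  # ~11: 300dpi only breaks down inside about a foot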

    Pete
  48. Carlos Mundi says:
    Amen. I have wondered for a couple years when enough of us would wake up. Just try finding a 1600x1200 display. I scrounge whatever I can. We are not commercially relevant. But if some of us could come up with a technology which convinced Hollywood that they need the extra pixels, then their marketing droids would do the work for us. Subvert the dominant paradigm!
  49. bob_super says:
    I'll second pddc. I saw 50-inch 4k screens at NAB and IBC, and I sooo wish I could afford one. Not the highest DPI, but I had to be under 2 feet away to see the pixels (vs. about 1m for the 37" HD I type this on). Actually, I wish I could afford the UHDTV demo screen (8k x 4k, 100 inch: 4 of the 4k 50s with tiny bezels) that NHK has been showing off at the shows, but even I realize that I couldn't keep it well fed.
    Just hoping that NHK will get their way soon with UHDTV. It's standardized and it's ready on the engineering side, but the industry can make more money for less investment pushing 3D (as was pointed out earlier). Our best hope is that crappy 3D fails fast enough to bring UHDTV to the front (4k is DCI rez; 8k will be a harder sell).
  51. rndtc says:
    Hey Pete,
    Sounds like a comment I read over at theinquirer.net:light-peak-unlikely-usb3-killer.
    I completely agree. Although I didn't read all the comments, I might reiterate some of my observations:
    -DisplayPort (1.2 dual-link) has a max res. of 3840×2400@60Hz×30bpp.
    -I've also seen comments that Light Peak supports 7680x4320. Can't comment on the refresh and colour depth, and the new factor, 3D, requires double the bandwidth. Seems like it should be more, like x4.
    -Those commenting on the T221: it's my yard stick, and it's almost 10 years old; I'd buy one if they put a DisplayPort on it. I think the last refresh required 2 dual-link DVI-Ds. Cool, but...
    -ATI now supports a max res of 8000x8000ish over an Eyefinity set-up, but put a Light Peak on the card...
    -Those commenting on the MS Phone 7 display: that's nice DPI.
    -There is also laser TV to consider, or variants: increase the scan (to an area) to increase the DPI.
    -Similar to laser TV: using multiple projectors overlaying the output, with DLP tweaks to make it work.
    -The Dell 2711 is a step in the right direction; I hope we'll see a 16:10 30inch'ish at the same DPI in the current generation.
    -If quad-HD is going to be the new std., let's hope we see one below 56 inch; 32 would be good.
    -Ideally, I'd have a 30inch@400dpi running at 200MHz and 30bpp.
    -The biggest problem, though, is the panel makers' pricing on the 30s and IPSs; it ain't Moore's Law.
    -And finally, let's just say there is obviously room for more competition in the panel manufacturing sphere. IBM, get the AMLCD happening again, and let's hope a few new panel manufacturers and technologies come online soon.
  52. Mark Butler says:
    Why can't browsers allow the HTML page or HTTP header to specify the pixels per inch the page was designed for and scale pixel measurements by some rational factor so that it looks good on a high dpi display?
  53. Hugh says:
    As a Linux user, I'm embarrassed by the Anti-MS comments. They are off-topic and rude.

    Apparently we can all agree that WinXP made some unfortunate choices that still hold back high-res deployment. WinXP is dead -- move on. WinXP gave a bad set of expectations to a whole lot of people: high-res means unreadable.

    I've had a 30" 2560x1600 display on my (Linux) desktop for years. I'm ready to move up, but those options are not there. Bill Buxton (Microsoft Research visionary) promised me that it would get better. Years ago (perhaps 5 years ago). I'm still waiting. (This is not an attack on MS, just a demonstration that predicting the future is not reliable.)

    Those who whine that 30" high-res LCDs are too expensive and all cheap displays are 1920x1080 (or worse) don't get it. The 30" is not mass produced and won't be if we early-adopters don't buy enough of them. I've done my bit -- please do yours.

  54. Kevin says:
    I see what the author is saying, and it's probably a bit true. On the other hand, I wonder if current monitors, which are priced decently at 2048x1152 or even bigger, haven't just hit a spot where it's too expensive to produce anything larger. Like not too long ago a 1TB hard drive was nicely priced... then 1.5TB... now the 2TB drives are starting to become cheaper. It may just take a little time for bigger resolution monitors to be produced. Your point about 1920x1080 being the new sweet spot is true, though. In fact, I just bought a new 1920x1080 24" computer monitor (with RGB/DVI/HDMI). I may rarely use it with a 1080p PS3 / Blu-ray player, but I still used that resolution as a criterion for my purchase. I had a 1280x1024 (5:4) before, so this is almost like two of those side by side.

    Below this is off topic -- I just had to respond to Sebastian's rant about BASIC ruining programming.
    @Sebastian,

    Your comments about BASIC ruining coders' minds... You couldn't be more wrong. That's like saying you played a trumpet in the 6th grade, so now you will never be able to learn the piano. In the 70s and early 80s, that was all there was (except for assembler, of course). If a person has what it takes to be a real coder, they easily went from BASIC to other languages as they were developed. I used BASIC back when I was 8 or 9 years old, on a computer my Dad built in an old plastic file cabinet. This was before a person could buy a computer (like the TRS-80 from Radio Shack). I loaded a 4K or 8K version of BASIC from paper tape at 10 chars/sec. I eventually got a TRS-80 Color Computer, which eventually had a C compiler, but I didn't switch from BASIC to C until I was in high school. I took Pascal in college (what a waste for me at that point, but it was required for my CS degree at the time). Of course I took C classes in college, and became quite good at it. After I got a nice job using C after graduating, I learned C++. Some people say learning C++ after learning C is a good thing, since you understand what is really going on at a lower level. Perhaps learning BASIC first helps one appreciate the differences in other, better laid out languages? My point is, learning BASIC at 8 or 9 years old had absolutely no detrimental effect on me whatsoever. Sure, it used GOTO and line numbers. But it was all we had. You move on, and learn new things. I used to tutor others in C at my college. You can tell when people aren't cut out to be programmers, and you wonder why they are in these classes; perhaps this is really where you were coming from, and you were just blaming their bad skills on the language they first learned? These people couldn't understand pointers or dereferencing them... the list of things they couldn't grasp is endless. These are the types that took BASIC classes and thought they'd be programmers. It doesn't matter if they had taken BASIC, C, or C++; these people just weren't coders. I'm hoping that's where you are confused. BASIC wasn't their problem.
  56. Steve Kasian says:
    "[Correction: eliminated backlighting requirements, not reduced]"

    CORRECTION AGAIN, Mr. Micro$oft guy (I guess this would explain why Microsoft products blow so hard - their corporate leadership is made up of dolts!) - REDUCED, not eliminated. You were correct before you started thinking too hard. Stop thinking so hard, Mr. Micro$oft man - that was Bill Gates' problem!

    OLED displays ARE BACKLIT by LEDs instead of fluorescent bulbs. Each pixel has its own LED backlight, which is synched with that corresponding pixel. The LED's light emission is adjusted based on the brightness of its pixel. It shuts off completely whenever the pixel goes black in order to create a true pitch black pixel, eliminating the awful gamma issues that plague all LCD monitors due to the fluorescent bulb constantly backlighting the entire screen, black pixels included.

    And you're a "Tech Guru"? Oh God, please help us - please wield your sword from Heaven and put Micro$ux out of business once and for all... so that we may finally have a computer renaissance! I find it quite ironic that one of Micro$oft's "Tech Gurus" is complaining about an industry stifling innovation in HD monitor technology development while one of the main cornerstones of Micro$oft's brand is "stifling innovation". Haha - I'm glad you're pissed. Now you know how the rest of us feel about your company's crappy offerings in the monopolized computer operating system marketplace. DEAL WITH IT! We all have to.
  57. Andrew J. Winks says:
    The other way to do the calculation is to ask "how large must the screen be to contain the same number of pixels as a printed page?"

    To match 600 dpi on an A4 printed page, based upon 96 pixels per inch on the screen: a screen height of 6 feet!
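
    The arithmetic, as a quick sketch:

        a4_height_in = 297 / 25.4               # A4 is 210mm x 297mm
        px_at_600dpi = a4_height_in * 600       # ~7016 rows of print detail per page
        screen_height_in = px_at_600dpi / 96    # shown pixel-for-pixel on a 96ppi screen
        print(round(screen_height_in / 12, 1))  # ~6.1 feet of screen height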

    It is my contention (along with my co-author [*] Helmke Hennig) that until pixel densities go right up, or truly massive numbers of pixels are available, paperless review of documents will remain a myth.

    I am even prepared to offer a compromise - give me a screen twice the height of the sheet of paper. I'll put it just out of arm's reach for a perfect read, and the makers need only cram in 300 pixels per inch.

    And that is only to match 600 dpi printing, where one must frankly not examine the page from too close up. For top-class visual quality, like that offered by a fine art book, the equivalent of 1200 dpi is required.


    [*] "Practical Writing for Practically Everybody"
  58. Jeroenw says:
    Ah yes, the HDTV craze: the follow-up to the widescreen 16:9 mania that brought you things like widescreen TomTom devices and even phones in 16:9.

    These days it is all about HDTV. There's only one good thing coming from that, and that's something like a standard rez again. Back in the day you had 800x600 and 1024x768 and that was it, and everyone knew where they stood. Now it is things like 2567x1858.
  59. Mike J says:
    On operating systems with decent window management, I find the current 2560x1600 screens to be more than enough. Maybe your problem is poor window management. Try a different OS...
  60. Anthony Harmon says:
    After 8 years of proselytizing my Macs to the semi-ignorant Windows drones, using PCs for 13 years and escaping the shackles of..... whatever...

    Maybe, just MAYBE, there aren't any 600PPI screens because there aren't any GPUs that can handle them yet....

    Just my 2 cents worth....
  61. Ben Reser says:
    @Pete, you can change the font size on the Mac, Apple just chooses not to expose it in their system preferences.

    TinkerTool will let you adjust it and all it does is alter the system configuration that Apple chooses not to expose themselves:
    http://www.bresink.com/osx/TinkerTool.html

    Apple tends to only provide configuration controls for the things they think users will find most useful and lets the power users get more options by installing 3rd party utilities.
  62. Anthony Harmon says:
    Lest I forget, what about the memory bandwidth needed to push that many pixels through the system board and the front-side bus? Can PCIe handle that many more times the workload a video card would need for that increase in PPI (NOT DPI, sorry)?

    Comparing TVs to computer screens is an apples-oranges prospect at best....
  63. Conor Flanagan says:
    Wasn't the AA vs. higher dpi question something Ed Catmull researched in the 70s, where he came out in favour of AA?

    Now, I've only read about this in the course of doing a Fine Art dissertation, so there could be many things wrong with what I'm saying, but he is the father of the z-buffer, which is almost universally applied in computing.

    There were colleagues of his who supported higher dpi, but they couldn't remove the jaggies. Was this maybe a limitation of hardware in the 70s? Or is 600dpi still a bit of a wish, even today?
  64. an onymous says:
    Mobile phones have pretty good dpi. Just look at the HTC Touch Pro with 640x480 on just a few inches with >250dpi IIRC. Maybe you should get a few of those and glue them together ;D
  65. Albantar says:
    Basically we're being hit by a wave of "good enough" products. The bleeding edge is >10MP DSLR cameras, but almost everybody buys <5MP "quickshot" photo cams because they're easier to use, a hell of a lot cheaper, and provide "good enough" quality. Similarly, computer screens around 100dpi, between 17" and 21", are "good enough" for 99% of users.

    Also, in LCD production, there are a lot of pixel defects. The more pixels you've got, the higher the chance that there's a broken pixel somewhere. At 300dpi instead of 100dpi, you've actually got 9x the pixels for the same screen size, so roughly 9x the defect rate. Customers aren't willing to pay the price for it.

    I'm using a Lenovo T61p laptop with a 15.4" screen running at 1920x1200. Next to it I've got an older HP 19" screen running at 1280x1024; it's bigger than the laptop screen, but lower resolution. I find this a pretty good setup. If I have an application that doesn't properly handle the relatively high dpi of my laptop screen, I drag it out to my secondary screen to "zoom in".

    But basically, everywhere you look, it's the "good enough" that hits home. Why spend hundreds or thousands of dollars for the bleeding edge if you can get "good enough" for 5 to 10 times less?
  66. Max says:
    Totally agree. Finally hitting 24" with 1920x1200 four years ago was a big delight, but now I want more: more screen real estate and more dpi. 27 or 30 inch is the way to go, but frankly the price differential for the increase in rez and dpi is not there!

    I want 28 inch with 120, or even 150 dpi now. 200dpi would be even better.

    And btw: I am a writer and spend my time on the computer editing text, reading PDFs, etc. Higher dpi = bliss. More pixels = easier reading & editing documents side by side!
  67. Glen Turner says:
    When I first saw a mobile phone display at 300dpi, I thought "great, in a few years' time I'll be able to buy a desktop screen with lovely characters." Still waiting for desktop screens to approach those of my old Dell D600 (150dpi).

    But I don't really understand your fixation on the aspect ratio of the screen. From what I've seen of the multiple VDUs on programmers' and graphic artists' desks, what they really want is 5m x 1m.
  68. Daniel P says:
    I would like hexagonal pixels. This would remove most of the annoying Moiré patterns.

    But I agree that screens should be made consistent with book and letter reading: the same resolution (600-1200 dpi) and a size corresponding to at least 2 vertical A4 (or Legal) paper pages.

    Playing devil's advocate: higher pixel density increases power consumption. I guess this might explain the trend toward lower pixel counts on laptops.

  69. Alex Crow says:
    @Steve Kasian

    You are wrong, very wrong, so your attempt to flame is a failure. OLEDs are not backlit; the pixel elements themselves are "Organic Light Emitting Diodes". There is no backlight, as the panel itself is emissive. You are getting confused with dynamically backlit LCDs (which are largely crap), which have far fewer than one LED per pixel (more like 1 LED per 1,000 pixels!).
  70. Peter says:
    I had 1600x1200 in 15" over a decade ago. The highest resolution was the R50, with 2048x1536 in 15", back in 2004. Since then, resolutions have steadily dropped. Today, the highest resolution is 1920x1200. Apple, once the king of graphic design and typography, only ships systems with very low DPI displays. I will never buy a Mac because the displays all suck. I'm on Thinkpads and Dells (the R6500 and W500 both have 1920x1200, which, while no better than my 6-year-old XPS, is the best on the market). The MacBook Pro 15" is stuck at 1440x900 (which is fewer pixels than 1280x1024, for those used to non-wide-screen).

    Most applications can scale smoothly to higher resolution. Hit ctrl-+ and ctrl-- in Firefox. It'll scale everything -- text, pictures, etc. Pictures have to be upscaled, but text stays smooth.

    Lack of high resolution computers is completely ridiculous to me, but I don't see it getting fixed.
  71. Mark F says:
    mas says:
    Thursday, April 22, 2010 at 10:53:55 PM

    "Your argument (which is very valid by the way) kinda reminds me of the reason American car makers went after creating and marketing SUVs. More markup and higher volume based on the perceived safety(!) of a bigger car ..."

    Perceived safety?
    Nothing perceived about it. Basic physics. All the auto crashworthiness data is compiled like vs. like. That is, sub-compact vs. sub-compact, mid-size vs. mid-size, etc. The results are then ranked, giving the best in each class.

    You think a 'bigger' SUV is only 'perceived' safer?

    If you mean heavier (more mass) - you couldn't be more wrong.
    Think of a pool table with pool balls, like vs like, roll two from opposite sides into each other. Now try bowling ball vs pool ball and try a head on collision.

    Tell you what, you get into your sub-compact "A" class auto - say a "Smart Fortwo" ( curb weight 1,808 lbs.) as they are known in the US, and I'll get into my 2001 HUMMER H1 4dr Wagon SUV (curb weight 7,154 lbs.).

    In fact, I'll put a few thousand pounds of cargo in the back just to get the weight up to a nice even 10,000 lbs.

    Now, you and me, 25 mph each head on. You are nothing more than a speed bump.

    Try googling around and see the results when they crashed "A" class cars into the smallest regular sedan made by the same manufacture (Smart vs Mercedes "C" class, etc.) - not a pretty sight.
  72. Mike says:
    AMEN + infinity ;)

    I just bought a system with an extra monitor - the good one? 1920x1200; the cheap one? 1080p. Price difference? ENORMOUS.

    LCD Makers, we want the pixels without the premium. Heck, 16:10 was here first! Some of us like to do work as well as watch movies.

    Now, how to get a good 30" on the cheap....
  73. Andy says:
    If HDTV has caused a backslide in resolution, then why has the trend in laptop displays been to backslide AWAY from HD (1080p) resolutions?

    It used to be extremely easy to find high resolution laptop displays: 1600x1200 in a 15" screen prior to the advent of widescreen displays, 1920x1200 in a 15" widescreen display. Now it's next to impossible - very few models offer anything higher than 1366x768 until you reach the 17" size.

    You know why, Pete? It's not because of HDTV. It's because of Microsoft. Windows 2000 and XP had AWFUL support for displays that offered anything higher than 96 DPI. If you increased the font size settings in Control Panel to compensate and give you readable text, you'd often get completely wacky layout problems in half your applications (text getting hidden because it went past the edge of a window, half the buttons in a dialog box unclickable because they were pushed past the edge of a window, and other weird things like that).

    The end result was a backlash against high-DPI displays from consumers that PC manufacturers are going to remember for a very long time, even if Vista and W7 may have fixed some of the problems.
  74. amanous says:
    The trend towards lower DPI in screens is not related to the HDTV market, IMHO; it started much earlier.

    High resolution displays were never popular because operating system GUIs did not cope very well with them. The hard-coded value of 96dpi made everything appear "normal" on a 96dpi screen, small on a 120dpi screen, and unreadable at higher res. It is not accidental that "geeks" were the only audience for high res displays: they were the only target group who could configure the screen properly, such that a 12pt font is 12pt and not 12px (which is tiny on a display with small pixels).

    Unless modern operating systems actually give a damn about the values returned by the DDC in modern displays and fix the DPI automatically, this market will never expand beyond the circle of computer enthusiasts and people working with graphics. As it is, the trend is almost dead now. Even the laptop market, which brought true high res displays (like 1920x1200 in 15"), is now drawing back to 1680x1050 at max in 15".
  75. Ian says:
    I thought the situation was bad when they advertised only the diagonal size of monitors, and not the resolution. But now things are getting worse, with resolution actually dropping! And the shape of the display is becoming less appropriate for just about every task except watching a film - which I never do on a computer. This is the market for the gullible, who are used to crap quality and don't realise that editing a Word document on a 1080-high widescreen monitor means loads more scrolling. Especially if they use MS Office with its fat ribbon at the top and empty space at the sides.
  76. Scott says:
    I agree 100%.

    I love 16x10 resolutions. 16x9 STINKS!!!!!!

    What I wonder is: computer 16x10 resolutions were around long before HDTV was. So why did HDTV go 16x9? Why didn't they go with the already set standard of 16x10? The whole "wide screen shows more" pitch is a bunch of B.S. Sure, making the screen wide is nice, but not at the expense of height!

    Yet another "corporate decision" made without the input of consumers. They have again force-fed the public options that are less than optimal.
  77. Scott says:
    BTW, I'd also like to see reflective screens go away. It sucks to have to put up with seeing myself or other things in the room while working or playing on my laptop.
  78. Gaston DASSIEU-BLANCHETsays:
    Hi Pete,

    Nice post, but I don't agree with you. I'm still one of those guys who chose a lower DPI screen over a higher DPI one. Yes, LOWER is better for me. Why? You rightly pointed out that windows supports setting the fonts to use higher DPI. However, what you failed to mention is that (at least until XP, have not tried newer versions yet), everything else will look plain ugly when you set the DPI it to something other than the default.

    The reason is simple: pretty much everything is based on bitmaps and not on vectors. If you change your DPI, your Start menu will look disproportionate, your taskbar icons will show strange pixelation, etc., etc.

    So, I stick with the default Windows DPI setting, which prevents me from choosing high DPI screens (otherwise I'd become blind in 6 months by trying to read the minuscule fonts).

    The day I get a "vector based" (as opposed to "pixel based") operating system that can smoothly scale all my applications to the zoom level I want without making things look disproportionate / pixelated / ugly, then, like you, I'll prefer higher DPI screens.

    Cheers
  79. george says:
    Pete,

    I think it's all about cost. I'm sure you know that monitors are a commodity item; a company makes up for the lower margin by selling in volume. Most people just want "good enough".

    So the question for us developers/geeks is: what would be the perfect monitor size and DPI?

    I looked at a Dell UltraSharp U2711 (2560x1440 @ 27 inches) priced at $1050. That is a 3.6 megapixel display at 109 dpi - around $291 per million pixels.

    So if we want 300dpi, the monitor becomes a 27.9 megapixel display. That is about 7.5 times the number of pixels.

    Now, I know that more pixels don't come at a linear price increase. So I looked at the other Dell UltraSharps. Using the other monitors' prices per million pixels, I came up with a linear equation:
    Cost Per MegaPixel = 75.52 * MegaPixels + 51

    This equation is terrible at modeling the price increase of pixels, since I really only used 3 monitors. It would be better with more, but this is just a forum post.

    So, using that equation, the 300dpi 27" monitor would come to
    Cost per MegaPixel = 75.52 * 27.9 + 51 = 2158.00

    That is per megapixel, so the dream monitor would cost around $60,000. I believe that is a worst-case cost. If I use the cheapest cost per megapixel from the UltraSharp sample ($234 per megapixel), the price would be $6528.
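
    Here is that back-of-the-envelope math as a quick Python sketch, in case anyone wants to play with the numbers (the prices, the 109 dpi figure, and the fitted line all come from my three-monitor sample above, so treat it as illustrative only):

        # Back-of-the-envelope cost model for a 300 dpi 27" display.
        # The fitted line (75.52 * MP + 51) comes from only 3 Dell UltraSharp
        # price points, so this shows a trend, not a prediction.

        def megapixels(width_px, height_px):
            return width_px * height_px / 1e6

        u2711_mp = megapixels(2560, 1440)         # ~3.7 MP at 109 dpi, $1050
        dream_mp = u2711_mp * (300 / 109) ** 2    # ~27.9 MP at 300 dpi

        cost_per_mp = 75.52 * dream_mp + 51       # fitted model (worst case)
        worst_case = cost_per_mp * dream_mp       # ~$60,000
        best_case = 234 * dream_mp                # cheapest observed $/MP: ~$6,500

        print(f"{dream_mp:.1f} MP -> worst ${worst_case:,.0f}, best ${best_case:,.0f}")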

    So even if manufacturers keep the same price per megapixel (which they won't), $6500 is still a lot of money. Now, I'm not saying manufacturers are going to charge $60,000 either; that number is just there to show the trend.

    I believe that before we get the large, very high DPI displays you want, a breakthrough will need to be found, and it will have to be a manufacturing breakthrough to keep costs down.

    George


    PS: I'm sure people can find flaws in my math and critical thinking in this post. Please school me; it's the only way I learn.
  80. Coyote says:
    The reason monitors have moved to HDTV resolutions is all about cost and demand.

    Even in smaller sizes, OEMs have a higher demand for TVs, so it makes sense to go with 1366x768 vs. 1280x720, or 1920x1080 vs 1920x1200.

    As for why we don't see higher DPI? A few factors:

    - Higher pixel density = more TFT drivers = much higher cost
    - Affordable video cards and display interfaces can't drive them yet
    - Digital video (HDTV) standards don't support them
    - For large displays, smaller pixels = lower pixel-to-interpixel gap ratio = less light output

    Granted, these things only hold for the consumer market. You want higher DPI? You can get it, but it will cost ya $16K... Look up displays by Barco.
  81. Sean says:
    So Pete,

    I'm here from Slashdot, where they (surprisingly respectfully!) referred to you as a Microsoft Evangelist. So, maybe you can explain something to me.

    We've got a technology that can do this. It's big and heavy, but it's also mature, cheap, and very robust. It's called CRT. Where has Windows support for it gone? Sure, sure, it's power-hungry, but not much more so than similarly sized LCDs (considering the backlighting requirements). And it's not great for the environment, but that's even more reason to encourage the people who have them to keep them, and not toss them out because newer versions of Windows don't support EXACTLY what makes them better than LCD.

    It's almost like the removal of the feature was designed to get every last holdout in CRT-land to join the new LCD reso- er, revolution.

    They've axed resolution changing. Windows 7 picks one for me, and I get to live with it, including the default 75Hz refresh rate. So I can't take advantage of a CRT's superior ability to change resolutions.

    I can't use the "advanced" options in the resolution settings, so I can't take advantage of color profiles or an increased refresh rate - two more marks against CRT. It seems like, to an uninformed person, you could just say, "See, you're not missing anything. The pixels are too small to see, the screen flickers and the color is bad," when that's just not true of the technology. This is like vinyl -> CD. It's supposedly the "market's decision," but I just don't see that from where I sit.

    And where I sit is in front of my 24" widescreen CRT, still running XP.
  82. Gaston DASSIEU-BLANCHET says:
    Happy said:
    > like, how many of you out there even bother to test on 150% font scaling?

    The correct answer is: you shouldn't have to. Can you imagine how complex it would be to write applications if you had to add a bunch of IFs in your code to check the DPI and lay out accordingly? That's completely insane! That's what we have operating systems for: to abstract away such low-level details as the screen DPI. Unfortunately, many OSs don't do this, and Windows is one of them.
  83. Andy says:
    "Unless all modern operating systems actually give a damn about the values returned by the DDC in modern displays and fix the DPI automatically"

    DDC isn't the be-all and end-all, since it doesn't take into account viewing distance.

    Fonts that are small on a high-DPI display do make sense if the display is intended to be viewed from very close.

    Similarly, if you let DDC calculate the DPI of a 46" 1080p HDTV, you'll get awful results: while the DPI of these is VERY low, they are intended to be viewed from far away, so a 96-120 DPI setting is still appropriate for text displayed on them.

    Fonts should probably be calculated to achieve a particular targeted percentage of screen size, since viewing distance is usually proportional to screen size.
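
    A rough way to see this is to compare pixels per degree of visual angle instead of raw DPI. A quick Python sketch (the viewing distances below are just typical guesses):

        import math

        def pixels_per_degree(dpi, viewing_distance_in):
            # Inches subtended by one degree of visual angle at this distance.
            inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
            return dpi * inches_per_degree

        # A ~96 DPI desktop monitor at 24" vs. a ~48 DPI 46" 1080p HDTV at 10 feet:
        print(pixels_per_degree(96, 24))    # ~40 px/degree
        print(pixels_per_degree(48, 120))   # ~100 px/degree -- sharper to the eye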
  84. jilocasin says:
    @Rob:
    You wrote:
    "The main problem is this: When you have a display that is very high res for its size, (i.e. the DPI is high), everything shrinks. "

    I actually _liked_ that. I wish Linux and the newer Windows OSes did that as well. The last thing I want on a 27" screen is 2" icons. I _want_ everything to shrink on a higher dpi display: the Start menu, the "My Computer" icon - they should all get _smaller_. What's the point of an additional 10" of screen size when a simple confirmation dialog takes up a third of your screen?

    Is there a way to make 'modern OSes' act like older Windows OSes? Bigger screen/higher dpi == smaller everything?

  85. Pete says:
    @Sean

    Yeah, most of the people on /. were pretty cool about it. Some good conversation going on there. A few trolls and a bit of bile, but less than I had expected. FWIW, I'm not an evangelist (I don't get paid to actually push a technology), but that's probably too fine a distinction unless you're actually looking at my commitments vs an evangelist's. I get paid to help developers be productive with our developer tools. My concentration is in Silverlight, WPF (both vector-based technologies), Windows Phone, Surface and a bit on gaming.

    I'm not aware of the removal of support for CRTs. You sure that's not really something that was changed in the drivers?

    I can easily change screen resolution on my LCD.

    When I click on "Advanced Settings" in the display settings in Win7, I get the ability to change the properties of the monitor, set the refresh rate, set the color depth etc. I also get a tab that links me to the NVidia Control Panel where I can really tweak stuff.

    If I had to guess, I'd say that you need to see if you can find a generic CRT profile that will work with your monitor. Windows can't auto-determine its capabilities, so I assume (again, without your setup in front of me) it's restricting you to some lowest common denominator of capabilities. At the least, on the "Monitor" tab in advanced settings, you should be able to uncheck the "Hide modes that this monitor cannot display" option.

    Hope that helps.

    Pete
  86. Dan says:
    I had this exact argument with a friend of mine. He purchased a 23" 1920x1080 display for really cheap and suggested I do the same (I think it was a Samsung from Sam's Club or something). I told him it was no good, and a week later spent 500 bucks on a 24" Dell (with many more features, including PIP/PBP and every input under the sun). But it was 1920x1200, which is the best you'll find without buying an Apple Cinema 30" or Dell 30" for $1200-1800.

    I am with you on the DPI, as I am a developer and have my terminal font size set at 6. I could go smaller, except it becomes unreadable for the reasons you list.

    GIVE ME MY PIXELS!
  87. Andrew Hume says:
    I agree on the DPI. For a few months in the mid 90s, I used an exotic grayscale monitor that was a true 300 dpi. It is hard to grok how gorgeous and readable it was (although it raised a lot of compatibility issues with existing software).
  88. Chaibacca says:
    Hugh wisely stated:

    > As a Linux user, I'm embarrassed by the Anti-MS comments. They are off-topic and rude.

    As another *NIX user, I couldn't agree more. Fanboyism is just as much a problem as any other form of extremism; it creates a bad rap for any community.

    In this particular forum, though, I will beg any *NIX user who has come here to go off on Windows or Microsoft to PLEASE SHUT UP!! This resolution regression problem is affecting everyone. It has had me absolutely fuming for the past year as I've searched for a replacement for my ageing XPS-Gen2, and it kills me to see the issue get clouded by any idiot trying to blame the problem on software just because it suits their agenda.
  89. Pete says:
    For folks who say Windows is the reason, I'll only concede a small amount of agreement with you. Windows XP and previous versions didn't handle high DPI well. Of course, LCD panels were not quite as ubiquitous back then. XP is almost 10 years old now. 10 years old! Plus, it's not just the OS that's responsible for that; folks had to build apps that scaled well, and at the time, we simply didn't have great tools for doing that. (WPF and Silverlight make that much easier.)

    If it were really just Windows, however, why hasn't Apple - who controls the hardware/software pipeline from front to back, and even now manufactures their own chips - pushed to get special high DPI panels for their own displays? DPI on the Mac (desktop or laptop) is typically only a little higher than on typical PCs.

    Maybe we should have a funeral for Windows XP like folks did with IE6? I'd be happy to see folks on Windows 7: fewer viruses, better security, better DPI support, and an all-around improved OS.

    Heck, imagine if we all judged Linux by distros that were 10 years old? I've used those. They're no more modern :)

    Pete
  90. Happy says:
    I wrote:
    > like, how many of you out there even bother to test on 150% font scaling?

    Gaston wrote:
    >The correct answer is: you shouldn't have to [etc]

    Yup, we'd all like to live in that perfect place :-)

    (I have a copy of Display PostScript, circa 1992, around here somewhere... it sure is pretty)


    @Andy: you've got a good point there with viewing distance vs pixel spacing. It reminds me of other UI disasters, such as people who write interfaces for set-top boxes, media players, etc. that use tiny text (~40 lines of text per 1080-line screen) for displays meant to be read from several meters away...

    @Andrew - those displays you mention were highly groovy - I saw one for a high end page layout system... monochrome I think because there's no shadow mask on a monochrome CRT?
  91. Stuart says:
    Fonts were originally designed to be crisp and legible at normal monitor resolutions. Tahoma is an excellent example.

    When ClearType was invented someone at Microsoft thought it would be a good idea to switch to fonts that COULD NOT be displayed well with low dpi, like Segoe, so that they could show off their ClearType technology.

    Unfortunately, ClearType is not that good... the multi-colored fringing and blurriness look absolutely terrible. The fact is, things are MUCH crisper and more legible if you disable ClearType and use a font that is designed for limited resolution, like Tahoma.

    I think that this is the primary reason why people feel like they need higher resolution on their LCDs.

    I think that monitor resolution is close to the perceptible threshold, but not quite there. It is really only noticeable for text. When looking at images or video, I don't think higher resolution would be noticeable.

    Also, if you increased screen resolution, then you'd need to increase the size of media content, and you'd have to compress it less, or else it would be pointless. When using 3D graphics apps, you would have to render more pixels, which would slow down games etc...and it's just not necessary.
  92. Seerak says:
    @Legonaut:

    Why assume that every pixel needs to be refreshed per display cycle (60Hz or whatever)? That assumption is a leftover from the CRT days, when pixels were non-persistent and *had* to be refreshed.

    LCDs are persistent, however; they are set-and-forget. If, instead of sending every pixel to the screen over and over again, the video card simply sent instructions on which pixels to change for the next display cycle -- a real screen PostScript, of sorts -- the bandwidth requirements would get way, way smaller.

    There is no reason why the display shouldn't contain the framebuffer and related display hardware, with the connection to the CPU being a simple data connection. Don't those little USB screens work like this?

    The only applications needing to refresh every pixel would be movies and games, and that's not a problem since the human ability to resolve drops precipitously when the information is moving (cf. motion blur). Games and video could easily remain at HD resolution and would be just fine.
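
    Here's a toy sketch of the idea in Python (purely illustrative; a real link would batch and compress these updates): diff two frames and send only the tiles that changed.

        TILE = 16  # tile edge, in pixels

        def dirty_tiles(prev, curr, width, height):
            """Yield (x, y, tile) for TILE x TILE regions that changed between
            frames; only these cross the wire, instead of the whole frame."""
            for ty in range(0, height, TILE):
                for tx in range(0, width, TILE):
                    rows = range(ty, min(ty + TILE, height))
                    new = [curr[y][tx:tx + TILE] for y in rows]
                    old = [prev[y][tx:tx + TILE] for y in rows]
                    if new != old:
                        yield tx, ty, new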
  93. Shadders says:
    What a lively debate!

    And the problem raised by Pete is spot on: why are monitors permanently stuck at 96 DPI? As Pete and many others have pointed out, this affects all operating systems with a GUI.

    Desktop publishing would most certainly benefit from increased DPI on displays. It’s a shame that the concept can be a bit hard to grasp because setting your OS to say 120DPI on a 96 DPI monitor just makes everything bigger. That’s not the purpose; it’s to keep everything the same size on a high DPI monitor as it is on a low one.

    Though I agree that reliance on bitmaps for things like icons means that toolbars and the like don’t scale well with a higher DPI unless the designer had the foresight to include high DPI versions of the bitmaps, clearly using vector graphics is the way forward.

    On Windows, some inroads are being made: Windows Presentation Foundation makes it possible for vectors to be used as icons, though before V4 they did not look too good, because WPF renders with sub-pixel positioning. I believe that V4 can be forced to render to pixel boundaries, making for a cleaner on-screen image. Interestingly, this would probably be less of an issue on a high DPI monitor. Please note that I'm not saying other operating systems can't do this; it's just that it's been over ten years since I last used a Mac or one of the flavours of Linux, so it would not be right for me to comment on what they can or can't do.

    Good article.
  94. marbien says:
    The reason there are hardly any high-DPI displays is because of OS X. OS X has very poor support for high-DPI displays, even in Snow Leopard. You can upscale user interface elements, but the rendering is ugly.

    Graphic designers - primarily Mac users - are some of the only people who would spend extra for high DPI. But their preferred OS has lousy support for it, so you end up with hardly anyone willing to pay more for DPI. Once the Mac adds high-DPI support, high-DPI panels will be manufactured to sell as companions to Mac Pros, and possibly even as an option in more expensive iMacs. That will provide enough demand for the same suppliers to sell high DPI LCDs to the PC market as well.
  95. Chaibacca says:
    I don't think the only market for higher resolutions is graphic design. At least, I certainly can't see how that would explain a regression in commonly-available native resolutions. Most of the software that everyone uses is consuming more and more horizontal lines just for the interface, leaving less for the working space. Take the ribbon menu system, for example: it was created just as the industry-standard display went from 800 horizontal lines (back) to 768.

    And all the while, manufacturers are getting away with describing these as "hi-def" displays. It's only high-def in TV/movie land. And I can't believe that too many people are buying laptops just to watch hi-def movies; otherwise, more of them would come with Blu-ray drives. But I would swear there were far more laptops available a year ago with Blu-ray drives than there are now. We seem to be regressing in that area too! In fact, it's not just Blu-ray drives: more and more machines aren't being sold with any optical drive at all. Even high-end machines, like Dell's latest Adamo, are completely leaving them out.
  96. Don says:
    I don't know how to pressure panel mfgs and graphics card makers into making higher res parts. Too many in the general public don't understand and don't care about high quality imaging, so there is little incentive to make such things. It can be done, though - as evidenced by the super-high-res IBM display, accompanied by its special high-end graphics card, about 8 years ago. It was 3840x2400 pixels @ 22" - roughly 204dpi. Unfortunately, it (and others like it) appears to no longer be made: http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors#IBM_T221


    @ marbien: There is no problem with Mac OS X and high DPI. It's the panel mfgs and graphics cards that are the problem. OS X can scale to any res you want. Apple was one of the 1st, if not the first, to ship a 30" 2560x1600 pixel screen. It wasn't until dual-link DVI became available that they could do it; no standard graphics card could push that many pixels. Apple still sells that 30" screen, and I'd bet they'd love to push it even higher if they could find someone willing to make the panel. Apple also ships the 27" iMac with a 2560x1440 screen - higher DPI than the 30" (because it's 3" smaller), but 16:9 instead of 16:10. I dare you to find many, if any, competitors shipping 27" screens at that res. Apple is also pushing developers toward resolution-independent design. No, the real problem is waiting for screens & GPUs.
  97. Bumpy says:

    Having just spent a lot of time finding a replacement for my loved but aged 21" CRT monitor, and settling on a 24" 1920x1200 flat panel, I can wholeheartedly agree with Pete on this - and I rarely agree with anyone from M$. Even so, I'm disappointed by the flaming; it really detracts from good discussion of a problem we all face.

    This post runs a bit long - didn't intend to write as much. The key points are made in bullets a-d below, all else is elaboration.

    I'm not very knowledgeable here, but I think the following forces are at play.

    The big force is Convergence of Entertainment and Computing which is driving:
    a) Encryption of the full signal path from media to screen
    b) Use of historical film aspect ratio (16x9) [backward compatibility for content]
    c) Two markets compete for same manufacturing capacity and the "least common" market is prevailing (1080p)
    d) MUCH larger "upstream" chain that needs to support higher resolutions to deliver consumer benefit

    In vain hopes of eliminating piracy (or at least eliminating it at the level of individual consumers sharing content), the content industry has mandated encryption of the full signal path. This has greatly reduced connection bandwidth (DVI/HDMI compared to VGA). As pointed out by a prior poster, it will take time to get encrypted connectors up to the bandwidth needed for much higher resolutions. This is just the smallest most localized bottleneck.

    At a much larger scale, content creation and distribution is now geared to 1080p. Upgrading to the next standard resolution will require upgrading everything in the content chain (camera, editing, storage, network, optical media if it still exists, or an alternative like SD cards - physical distribution won't end anytime soon - etc.). This is a HUGE series of change costs, which will have a long cycle time relative to the PC industry's progress over the past 20 years. Everything in the chain has to meet price points that retain profitability or consumer acceptability. In the past, all that needed upgrading "upstream" was graphics cards - and even then, the software support (per many posters) wasn't there, and it undermined the high resolution experience for most consumers.

    Going forward, the task of getting the upstream capabilities in place to meet consumer expectations is immensely more complex. Resolutions are likely to move in lockstep from one standard (1080p) to the next (2p or 4p or some other catchy oversimplified name). I think 16x10 may stay an option as the extra vertical pixels allow for some editing controls on-screen while viewing 16x9 content at native resolution during the production chain (which even impacts consumers like me who edit home videos to send to friends and family - so consumer demand *may* exist). Other than a minor variant like that I think we'll see far less variety of resolutions in future.

    That said, before we take the next step up in resolution, we're going to pause to double frame rates to accommodate alternate-eye images (3D) - which will eat one doubling of connector bandwidth; storage needed per minute of video (likely less than 2x, as related images compress heavily, but that adds CPU cost), memory, CPU, and GPU load all rise; etc. Once 1080p 3D is established, we may see a big step up in resolution *if* the 3D effect can be significantly improved, such that consumers who just went through HD and then 3D will cough up the cash again. Alternately, the industries may prefer to creep up slowly, to fragment the market and keep consumers spending for many decades.

    Convergence is now a reality, for good and ill - sadly, resolution is on the 'for ill' side of the equation, and as much as techies are used to being impatient and getting what they want from an industry that can move light and fast, it seems likely we're going to have to learn to be patient.

    In short I think this rant is railing against the wind - I hope I'm wrong, maybe one of you can convince me?


  98. Not the hobbyist says:
    A relevant issue.
    All the laptops on the market are made for the average buyer, who has no idea what to do with them.
    How I wish there were higher DPIs and resolutions than what is available on laptops today!

  99. Pixy Misa says:
    I recently bought a Sony notebook with a 15" 1920x1080 screen. I was worried about how well applications would deal with the higher DPI, but so far everything has worked really well, and the screen is just amazing.

    Based on that experience I've ordered the Dell 27" 2560x1440 monitor. Now I just need to free up the desk space for it...

    That old 3840x2400 monitor would have been perfect, but I never had a spare $10,000 to spend on it.
  100. rndtc says:
    Pete,
    The guy with the 21" CRT and the Win7 resolution/refresh issue: I bet he was using BNC connectors (Sony G500: it's a limitation of the BNC standard). I had the same issue; after remembering that and swapping in a VGA cable, the problem went away.

    Feature request: I see great value in being able to have a different UI DPI for each monitor, e.g. when using a 24" and a smaller panel side-by-side (big PPI difference). I'd like the interface to scale automatically, so moving an app from one display to the other won't shrink or grow it. So: DPI and display-size awareness, plus being able to scale it all in the current way.

    Also, there seems to be a bit of a "niche market" theme to some of these comments. All I can say to those users is: if you can't see any difference or advantage between standard-def and hi-def TV (or, for another comparison, dot matrix vs. laser printers), perhaps it's time for glasses. We sit closer to displays than to TVs; that justifies higher PPI.

    IMHO, high res is not niche; there are just very few people willing to pay top dollar for 5-year-old tech like a 30" IPS. And if there is no significant price reduction... Display tech is not that different from CPU tech, so why haven't we seen Moore's Law on displays? Bit off topic, but I wonder: if Dell wins that case against the panel manufacturers, are they going to reimburse everyone...

    High resolution/PPI displays are the future; the problem is that we should have them now...

  101. Addi says:
    These Dell 30" displays are quite nice. You need a video card with a dual-link DVI output to use one, but it does 4 megapixels. It sure seems that you need a pretty rockin' video card to drive that, so perhaps there's a bit of a chicken-and-egg problem.

    http://accessories.us.dell.com/sna/products/Displays/productdetail.aspx?c=us&l=en&s=bsd&cs=04&sku=223-4890

    Btw, I completely agree with you that this is frustrating. It's just one of those things where market forces push people towards where the easy money can be made rather than towards real progress.

    -- Addi
  102. Gaston DASSIEU-BLANCHET says:
    Pete said:

    > XP is almost 10 years old now. 10 years old!
    Well, I hate to have to point this out, and it's slightly off topic, but the main reason a lot of people are still using a 10-year old OS is because the next version (Vista) was clearly not good enough. Hopefully this will change with 7 (which I have yet to find the time to try).

    > At the time, we simply didn't have great tools for doing that...
    You are right. It is unfortunate that we only now have tools allowing us to do this; IMHO this problem should have been addressed much, much earlier, and at the MFC level, so that when you change the DPI all applications get proportionally resized without any "blurring". No application-specific code should be required. I remember many years ago, a little bored one weekend, I extended some of the Java Swing classes (JButton, JTextBox, etc.) to draw at different sizes based on a global DPI setting, without any pixelation (I used vector-based drawing). It looked very nice, and application code did not need to be changed at all. As far as I can judge from screenshots, Windows 7 does a poor job of "scaling" legacy apps (they look blurred). I insist that DPI should be transparent to developers; we should only care about it in specific cases, like when coding image-editing software and the like.

    > If it were really just Windows, however, why wouldn't Apple...
    Yes, I agree with you that Windows is not the only OS with this shortcoming. A lot of work on this point remains to be done by most OS makers.

    Regards,
  103. Pete says:
    @Gaston

    I've worked at Microsoft only since October, so my experience with Vista actually pre-dated joining. (I ran Windows 7 from one of the betas onward on one of my machines.)

    I must have had a good driver set or something, because I never had problems with Vista. Yes, UAC was annoying at first (and I'm glad we tuned that down in Win7), and while not crippling, it did get some foul words from me during installs. My system was stable, despite being overclocked and using (at the time) a combination of some pretty recent hardware and some dodgy old stuff carried along as baggage over several system builds.

    I also ran it on an IBM Thinkpad laptop and had no problems.

    Then, with SP1 of Vista, performance got better, which is always a good thing.

    Now, maybe because I'm not much of a gamer anymore, I didn't suffer on that end, but for constant programming / development, writing, graphics design, it was fine.

    Vista was pretty hit and miss. If you had crappy drivers (Microsoft's fault? not really) then you probably had a crappy experience. I don't think it quite deserves the reputation it got, though. Much of that reputation was based on word of mouth, passed along from person to person (and let's be honest, hater to hater) without any actual experience with the OS. I'm not discounting the legit complaints here, just pointing out that I think it snowballed in a strange way.

    I'm nowhere near enough of a fanboy to put up with a crap OS when I have stuff to do. I'm pretty picky. But Vista was fine for me. That said, perf on Win7 is much faster, and I prefer some of the new UI changes, as well as the toned-down UAC.

    Pete
  104. Bob says:
    I had the opportunity to see a DCI-resolution 153" monitor at CES 2010. That is what we should have in our living rooms! (At least until the Super Hi-Vision 8k stuff is available :-)

    [What will we use for a source? 5x-speed 120GB Blu-ray? 250Mbps "PurpleRay" discs? :-) And what single medium can drive an 8k panel without looking like compressed-crap cable-TV/DBS channels? Fun times ahead.]

    It would be nice to have an affordable option for computer monitors beyond 1920x1200. Until then, there are always the big multi-monitor stands. Erect a set of six monitors and pretend you're a rich Wall Street trader.
  105. Jeff says:
    I fully agree with this rant. I have been thinking the same thing ever since LCD monitors became the standard for computers. I remember about 10-12 years ago I had a 17" CRT that would do 1600x1200 pretty well, and I could even push it up to 2048x1536. That was 10-12 years ago, and I have yet to see a monitor with higher DPI. Anyway, I just ordered a new monitor and went with the highest DPI I could find: a 23" with a native resolution of 2048x1152. Unfortunately, it does not appear to be made any more, and the manufacturer (Samsung) only has 1920x1080 monitors to replace it.

    From what I have been reading, there are people trying to get 4k and 8k resolutions standardized for the next generation of HD displays and discs, so hopefully that will push computer displays higher.
  106. No Telling says:
    Your rant is correct and spot on.

    I'm running a Dell C810 with the (at the time) top end 15" screen at 1600x1200. That's 126DPI. It's about the best screen I have for clarity of image.

    It was manufactured in ... wait for it ...

    Fall of 2001.

    The difference in image clarity on this screen is so much better than 96DPI that I can't imagine 300DPI on a screen. Really.

    The only thing lacking on this screen compared to more modern ones is comparable contrast and brightness.
  107. retepvosnul says:
    Really, how useful would it be? Even after doubling the resolution and quadrupling the graphics throughput needed to feed the display, you'd still need AA in 2D and 3D rendering situations, making the datapath and the data processing 4 times more intensive with practically no gain at all.

    There will simply always be situations where motion, characters, or other graphics start to display pattern effects if not treated with simple and proven "tricks" like anti-aliasing.

    We would be a whole lot happier if things were kept simple, with resolution ratios that kept devices, apps, and media compatible with each other. Then we could all enjoy the same, perfect resolution with quite capable AA and still connect that extra monitor. This will remain true for decades to come (until the monitor itself is replaced with... holograms?)
  108. Sam says:
    The direct source of this issue is the failure of Microsoft (and Apple) to design truly resolution-independent operating systems.

    Despite piss-poor attempts at DPI scaling and font-size settings (particularly in Windows 2000/XP), the reality is that even with Windows 7 it is very difficult to run a high-density display! I have a 1920x1200 17" laptop, and friends with 1920x1200 15" laptops, and it is a nightmare trying to get readable text. If you crank the DPI scaling up, many legacy applications either ignore the setting altogether or actually become impossible to use, as user interface elements are hidden from view because of fixed-size elements in their code.
    (I think what happens is that they have regions of their UI defined with fixed pixel sizes, and when you scale up the DPI or increase the OS font size, some of the content within these regions grows larger than the region itself and gets clipped off.)
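
    (A toy illustration of that theory, with made-up numbers: text is sized in points, so its pixel height grows with the DPI setting, while a hard-coded region does not.)

        # Why hard-coded pixel layouts clip at high DPI (made-up numbers).
        REGION_HEIGHT_PX = 16              # UI region authored against 96 DPI

        def text_height_px(point_size, dpi):
            return point_size * dpi / 72   # 1 point = 1/72 inch

        for dpi in (96, 120, 144):
            h = text_height_px(9, dpi)     # a 9 pt label
            print(dpi, "DPI ->", h, "px:",
                  "fits" if h <= REGION_HEIGHT_PX else "CLIPPED")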

    In an alternate universe where Microsoft designed Windows properly, you'd have average people buying high-density displays, and when they got home, they'd easily be able to adjust one simple setting that would seamlessly increase the size of ALL text and icon elements throughout the OS and applications. Maybe in Windows 8??????????
  109. Pete says:
    @Sam

    Not sure how that's the operating systems' fault, except that none of them used vector elements for all display. Considering processing power when those operating systems were first written, I think that is an understandable omission.

    In Windows 7 (and Vista), the operating system asks the applications if they are DPI aware. If they say yes, the OS does no scaling. If they say no, the operating system handles the scaling. Unfortunately, many apps lie - again, not an OS problem, but an app problem.
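
    As a side note, an app can declare that awareness from code as well as from its manifest. Here's a minimal sketch in Python via ctypes (SetProcessDPIAware and GetDeviceCaps are the real Win32 calls; the rest is just illustration):

        import ctypes

        # Declare this process DPI-aware (Vista and later), so Windows won't
        # bitmap-stretch its windows. Must be called before any UI is created.
        ctypes.windll.user32.SetProcessDPIAware()

        # Read the effective DPI so the app can scale its own layout.
        LOGPIXELSX = 88  # GetDeviceCaps index for horizontal DPI
        hdc = ctypes.windll.user32.GetDC(None)
        dpi = ctypes.windll.gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
        ctypes.windll.user32.ReleaseDC(None, hdc)

        print("Scale factor vs. 96 DPI:", dpi / 96.0)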

    Also, in beta long before that, and released in 2006, Microsoft has a technology that enables folks to create applications from vector elements that scale correctly: WPF. In 2007, Microsoft introduced Silverlight, a companion technology that also supports truly scalable vector UI.

    Microsoft had assumed, some time back, that 200dpi screens were on the horizon, so they did a lot of work in Vista and Win7 to make sure that would work. Unfortunately, the screens never arrived.

    Also, from comments above, it sounds like IE8 is the only browser that handles DPI well; most of the others choke. Many web apps/pages also choke due to settings in pixels, not points (not an OS problem, but a design problem)

    Pete
  110. Sam says:
    @Pete

    Thank you for responding to my post. This has always been an issue that has pissed me off to no end, so it is great that people are talking about it again.

    Anyways, after writing my first comment, I decided to mess around with DPI scaling again in Windows 7 on my high-density laptop. After playing with it for a few hours and opening many applications, I do have to say that it does indeed work well for native *Windows* elements and programs, but the biggest problem continues to be applications (older and newer) that are not DPI aware and end up being scaled by the OS. This treats text elements as if they were bitmaps, so the scaled text ends up really blurry --- which is hardly a solution to small text.

    I do agree with you that much of the problem is/was created by 3rd party application developers; but at the same time Microsoft should have dealt with the high-DPI display issue far better in Windows XP.

    And while I agree that a fully resolution-independent, vector-based (native) interface may not have been possible in 1998, it should have been created for Windows 7.

    And yes, WPF is an enormous step forward from GDI+, and I hope Windows 8 natively uses .NET4+ for all development.


  111. hazydave says:
    It's not so much about DPI. At every stop along the way in the PC industry, there's been a commodity resolution. When the IBM PS/2 first hit the market, it was 640x480. Every commodity-priced display was 640x480, independent of size. Sure, you could buy off the commodity curve for a price, and get higher resolution.

    No different today. In 2007, I bought two MVA monitors at 1920x1200, $400 each. A great price, I thought, but hardly PC-industry commodity pricing. Today, you can still get 1920x1200, only you have to pay $300 or more, and you probably have a hard time finding MVA (best for video, good for graphics, not so good for gaming). So I can blame the rise of the gaming industry for harming what I value in a monitor just as much as HDTV.

    In fact, it was the rise of HDTV that gave us 1920x1080 panels as the new sweet spot in the PC commodity market... $150 or so for one of these, if you shop around - half the price of a 1920x1200 monitor. You can't really blame HDTV for that... without HDTV, we might find the commodity sweet spot still at 1680x1050.

    This is also an ironic twist. If you recall the development of HDTV, the PC industry wanted 16:10 displays, the film industry wanted 2:1 (16:8), and so they compromised on 16:9. And the PC industry went right on making 16:10 its standard anyway. Now it's that same agreement pushing them back to what they originally agreed to.

    Until there's big demand for larger screens, they're going to be expensive... just as it always has been. You might not have done the math, but that 19" 1680x1050 screen you could get cheap a few years ago has a higher DPI than your 24" 1920x1200... you actually lowered your DPI voluntarily with the 24" screen, and were just happy with that until 1920x1080 came along. So buy the 22" model, and you'll have a higher DPI. Thing is, it's not about DPI; it's really about the pixels.

    300dpi isn't something to laugh at... that's the standard for professional photo printing. Don't make this too confusing, though, because for photos and monitors the accurate measurement is ppi -- pixels per inch. Every 24-bit pixel in a photo or on a screen is what we're talking about here. Printers don't compare, because they print in only 6 to 10 colors, and use their increasing dpi ratings to make a better 300ppi print, not to actually increase the ppi rating of that print.
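
    Doing that math yourself is a one-liner, by the way (Python; the sizes are the advertised diagonals):

        import math

        def ppi(h_px, v_px, diagonal_in):
            """Pixels per inch from a resolution and an advertised diagonal."""
            return math.hypot(h_px, v_px) / diagonal_in

        print(ppi(1680, 1050, 19))   # ~104 PPI -- the cheap 19" widescreen
        print(ppi(1920, 1200, 24))   # ~94 PPI  -- the 24" has fatter pixels
        print(ppi(1920, 1200, 22))   # ~103 PPI -- the 22" model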
  112. polossatik says:
    I'm wanting to replace an age-old Dell Latitude C610, which has a very nice no-glare 15" 1400x1050 screen.
    A screen like that (4:3 - if it needs to be 16:9, ok...), combined with an up-to-date laptop with decent battery life... I'd shell out the $$$ right away...
    It's already so hard to find any non-mirror screens, let alone something with a decent *vertical* resolution....
    Is there no panel maker who actually makes stuff like that any more? Is that the problem?
  113. Hogar.ME says:
    Pete,

    As far as I am concerned, 1920 pixels _horizontally_ is plenty for 90% of users, especially on laptops. Vertical resolution is another kettle of tasty aquatic animals altogether.

    What I would like to see are different aspects made available:

    1. 1920 x 1080 (16:9) for movie freaks
    2. 1920 x 1200 (16:10) movies, games, general use
    3. 1920 x 1440 (4:3) and 1920 x 1536 (5:4) for office work, graphic design, DTP (I am old too ;) etc

    If these options were available, I think most of the discussion here would be moot.

    Greetings from Montenegro

  115. rndtc says:
    Is the RGB colour model also holding us back? I've been thinking about why large pixels look so bad (on a display); for me, it's because my eye starts to pick up the individual colours of each sub-pixel, especially on the edges of text using ClearType, for example. That got me thinking...

    Is it time to move past the whole concept of an RGB pixel composed of 3 sub-pixels, i.e., a red, a green, and a blue sub-pixel?

    I would like to see a new generation of LCDs with full-spectrum pixels (and high PPI), rather than the traditional RGB colour-mixing technique. Rather than having individual RGB sub-pixels, each pixel could generate any colour required. This would also free us from the rather limited colour gamut that can be produced by mixing RGB.

    For anyone who has used a true monochrome LCD (they don't have sub-pixels): like that, but with full-spectrum pixels.

    In conditions where you are far away from the display, like a TV, sub-pixels are sufficient - to that end, I'm interested in Sharp's quad-pixel RGBY (adding yellow to the mix) - but for a display that you read from, full-spectrum pixels are on my wish list (in conjunction with high PPI / low dot pitch)!

  116. george2 says:
    DPI was the single most important criterion I used when deciding which tablet to buy. I do a lot of e-reading (including fiction), and it's simply more comfortable with a small pixel pitch. In the case of eReaders, I do think that e-ink helps: physically, the way adjacent blobs of ink/toner interact, as in printed documents, makes text less jagged without blurring it. Regarding the responsibility of the OS, I see no reason why the device driver for a display shouldn't include its diagonal size, so the OS can then decide on appropriate scaling.
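
    As a sketch of what I mean (EDID really does carry the panel's physical dimensions; the helper below is hypothetical):

        def ui_scale(h_px, width_mm):
            """Derive a UI scale factor from the physical width a display
            reports (e.g. via EDID), instead of assuming 96 DPI."""
            dpi = h_px / (width_mm / 25.4)
            return dpi / 96.0          # 96 DPI = 1.0x, the legacy baseline

        # A 15.4" 1920x1200 laptop panel is ~331 mm wide: ~1.5x scale.
        print(round(ui_scale(1920, 331), 2))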
  117. rndtc says:
    Hmmmm. resuna, if you're implying that science often follows fiction, or that the technobabble (at least in TNG) was usually penned by a research scientist who worked for NASA (IMMSM), or perhaps that Trek inspired many children to pursue science as a career and that they are diligently working on creating a future for all of us - then, no.
    For the completely unimaginative (and uninformed), I suppose the concepts in Star Trek might seem magical; for me, they're just a bit dated (it is 20 years old).

    For the more mature: 'my idea' for a full-spectrum light source/pixel centres around an LED/optical-waveguide-like light source capable of producing a range of colours (390 to 750 nm). The exact implementation is unimportant to the discussion, but the concept of a single pixel that can emit a range of wavelengths is relatively simple, and there are numerous examples of such emitters in the real world (you're probably using one right now, i.e. fibre optics). The challenge is applying the tech in displays... and improving the range and accuracy of current dielectric waveguides, and size, cost, etc.

    Eventually, I'd see the technology able to produce a range from infra-red through to ultraviolet (where both safe and practical), and multiple wavelengths per pixel; imagine looking at a picture of a landscape with the sun in frame and being able to feel it. Akin to a subwoofer for audio: you can't hear the bottom range, but you can feel it.

    Not to bring us back to Pete's original comments too much, but I also thought it worth mentioning that OLED has probably contributed to the slow progress a bit - i.e., why work on improving current techniques if OLED is just around the corner? LED-backlit LCD became a bit of a stopgap when progress on OLED slowed, but now we're in a situation where OLED is still not as mature as it needs to be, so we need a stopgap for the original stopgap, and progress gets held back again.

    For those with cash to burn (a mountain of it), there is always Barco: they have a 10MP display in mono and 6MP in colour at 30 inches, and QuadHDs at 56 inches.
  118. Irkfi says:
    Why there are no QuadHD computer monitors at 3840x2160 or 3840x2400 is beyond me; there are definitely no technical problems in making them. Such monitors could have sizes between 34-40" and would be the ultimate for desktops. Productivity and satisfaction would definitely increase. One obvious application for such monitors is digital photography.

    Can Microsoft persuade manufacturers to go ahead with the QuadHD monitors?

    A really long shot would be high-res monitors which are NOT FLAT; they would surround the user. With a non-flat monitor, the viewing distance would be constant across the screen, which would enhance comfort and the feeling of immersion. But non-flat monitors are not possible with current LCD technology; one has to wait for OLED made on a flexible substrate.
  119. c$34 says:
    I agree, HDTV sucks. I would rather spend the money on a truly high resolution monitor, and watch TV on that (which I already do - thank you, SageTV!)
  120. Jim Duggan says:
    You forget that TVs aren't generally viewed from 18" away. If people were putting their 50" LCDs on their desks and browsing the 'net, then we'd definitely see better resolutions, but you have to understand that these panels are usually used for watching movies and TV shows. Not many people are hooking their computers up to HD panels. Manufacturers are in it to make money.
  121. Pete says:
    @Jim

    I get that, but I think we crossed in the mail somewhere.

    I was talking about how manufacturers have settled on 1080p "HD" resolution as the sweet spot for all computer displays, including 20/21/22" jobs that I wouldn't want to watch from across the room :)

    Basically, they've decided that laptop displays and standalone monitors are best used to watch HD video, and any other uses are secondary.

    Pete
  122. como says:
    You're forgetting: it's the baby boomers who pay the bills.

    Who has all the money? The old people. What do they want? Lower resolution for their failing eyesight. As a professional IT guy, one of the biggest complaints I hear from my overage, overburdened-with-money clients is that their new fancy "high def" displays have too many pixels, so text is too hard to read.

    Yes, you can go through and change all of the settings to make things bigger, but oftentimes I'll just run someone's fancy LCD a few steps lower in res. It looks like hell, but to them it's sharper.

    In another 30 years, when they're all dead (and we're fucking it up for the youngins), we'll be getting our high res LCDs, our gay marriage laws, and all the other things these old coots don't want.
  123. Frankie says:
    "I always hated the fact that I could never find another 1920x1200 15" Laptop... I'm like you have got to be freaking kidding me, I don't want a 17" laptop with LESS pixels. "

    This is my least favorite thing about the latest crop of laptops. My Dell XPS 15" does 1920x1200 and I love it. I much prefer it over any of the new 17" ones; even most of the "powerful gaming" ones don't have that high a screen resolution. I just don't get it.

    What people seem to miss here is that with a higher screen resolution, things like toolbars, ribbons, and other mostly-useless user interface components take up less of the screen. Photoshop and Microsoft Office eat way less of the usable real estate, so you are left with more space to do work. You can always zoom in on the contents of whatever you are working with if you can't read it at the high resolution.

    I don't like that the manufacturers are deciding that we should all stay at 1920x1080.
  124. yoda says:
    Great article. Couldn't agree more.

    I have been using a 1920 x 1200 monitor for years.
    I use it as a computer monitor, not a TV.

    To replace it, I want a 2560 x 1440 LED 30 inch monitor now!

    Are you listening, HP, ASUS, Samsung, LG, BenQ, etc.?
  125. DanaG says:
    I have a 1920x1200 15.4" laptop (147 DPI), and it's the only computer I've ever had that I can read all day with very little eyestrain.

    To get the same DPI at about 20 inches, an LCD would have to be 2560x1600 (150 DPI). Now, that may be hard to achieve, but I don't want a 20-inch panel! I just want a second 15.4" 1920x1200 panel!

    I mean, when HP has a 17" IPS panel that's 1920x1200, with 30-bit color depth, why the hell can't they stick that thing in a desktop enclosure and sell it? I can virtually guarantee they'd make a boatload of money from it!
    http://forum.notebookreview.com/hp-compaq-voodoo-pc/502871-see-why-you-should-lust-after-dc2-equipped-hp-laptop.html

    Also, why do desktop LCDs have such wide bezels, when a laptop panel's metal frame is only like 1/2 inch wide?

    And how the hell is LOSING 120 rows of pixels an "improvement" (1200->1080)? I like space under my movies for a task bar or subtitles, thank you very much. I'd love to take whoever's forcing the market to 16:9, and chop their ceilings down to 5'10", or however tall they are -- look, no wasted space!
  126. Rick says:
    Well, most people don't care and just buy whatever the latest and greatest technology is. There are still people who argue that records have better audio than tapes or CDs. But records have been outmoded, so who knows for sure?
  127. Prosper says:
    The display technology might be there, but the graphics processing power to run it is nowhere close. A 25" display at 600dpi would need graphics hardware (and memory) to drive 51M pixels, all getting refreshed 60 times a second. When was the last time you saw a 600dpi printer that could fire out 60 ledger-sized pages per second? At 32 bits per pixel, that's 200 megs of video RAM just to hold a static image.

    There would be over 20 times as many pixels on this display as on a 1920x1200 unit - so video RAM, bandwidth, and processing power all need to handle 20 times as much complexity. That is currently out of the realm of our technological capabilities.
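
    Run the numbers (a quick Python sketch; the 51M-pixel figure is my estimate from above):

        pixels = 51e6          # 25" @ 600dpi estimate from above
        bytes_per_px = 4       # 32-bit color
        refresh_hz = 60

        framebuffer_mb = pixels * bytes_per_px / 2**20
        raw_gbit_s = pixels * bytes_per_px * 8 * refresh_hz / 1e9

        print(round(framebuffer_mb), "MB per frame")  # ~195 MB
        print(round(raw_gbit_s), "Gbit/s at 60 Hz")   # ~98 Gbit/s raw;
                                                      # dual-link DVI is ~8 Gbit/s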
  128. Pete says:
    @Prosper

    Comparing to a printer isn't valid; it's a totally different process.

    IMAX digital projectors do between 27 million and 104 million pixels, according to Wikipedia. If you add in 3D, double that, as they push 2x as many frames.

    My inexpensive single video card can drive two 30" displays and possibly a 24" display, pushing around 8-10 million pixels.

    ATI Eyefinity can do something like 25M pixels on a single card (7680x3200).

    The tech would definitely be high-end, but I believe we can do it already. High-end tech comes way down in price when it goes mainstream.

    Pete
  129. orange says:
    Yeah, we all would like higher DPI. Manufacturers too.
    Still, CRT monitors didn't have higher DPI than LCDs, and a beam of light can be made smaller than a TFT pixel.

    Pixels have borders; the smaller the pixel, the worse the pixel-to-border ratio, so you get a lousy picture.
    So, smartass, build a 1200 DPI LCD yourself and become a millionaire.
  130. Pete says:
    @orange

    Read up on your tech. CRTs were also limited by the screening technologies they used, as well as how the phosphors were placed.

    And don't be mean just because you're hiding behind a keyboard and anonymous info.

    Pete
  131. Tad says:
    It's even worse this year. The best laptop display I can find is 1600x900 in a 17" laptop. My five-year-old Dell has a 15.4" 1680x1050 display. I expected to have something resembling 1920x1200 in a 17" laptop by now.
  132. David says:
    Totally agree. The situation is appalling. My 2006 XPS is going nowhere until I get the option of buying a laptop with the same or greater resolution.

    Even more staggering is that desktop monitors of 24" and greater typically offer less resolution than my 17" laptop!! (And I can see everything on the screen perfectly well, incidentally...)

    There is SIMPLY NO WAY that I'm giving up 1920x1200. NOT GOING TO HAPPEN.
  133. Greedy Boy says:
    Help me plz... these 1080p HDTVs suck, they are not impressive. I need a super HDTV: 1200dpi resolution, 42 inch, ultra bright, super fast response... plz help me, I need Alien Technology.
  134. Pete says:
    @Greedy Boy

    You completely missed the point. I was talking about how HDTV resolution has constrained panel makers who would normally have built higher res displays for computers. Even Apple had to go way out of their way to get their high res "retina" displays manufactured specifically for them.

    Pete
  135. Chas says:
    Well, it's been almost 3 years (now Feb 2013) since this was first posted and it doesn't look like it's going to get any better any time soon. Still can't find much that isn't 1080p. So I'm still using an Apple Cinema Display that's almost 10 years old but is running at 1920x1200.

    What's really disappointing is that I haven't found much discourse out there other than this post.
  136. Adam says:
    I totally agree with this rant. The whole 1080p movement for TVs makes sense in a way, given that when people watch TV, they won't be watching anything above that resolution anyway. However, the moment you want to use any screen as a computer monitor, it's completely absurd to be limited to this resolution. Hell, I found an old 19" CRT from about 2002 that did "Full HD" (in fact, because it has an aspect ratio of 4:3, it's actually higher than that: 1920x1440). 11 years on and no production improvement... zzz. Personally, I blame Blu-ray. Our screen res shouldn't be held back by the standard-issue res of disc movies.
  137. Pete says:
    On the plus side, 4k is coming.

    I now have an inexpensive 39" 4k display. It has its issues (such as being limited to a 30Hz refresh rate due to HDMI, and having some lag, also due to HDMI), but when the panels come up in popularity and down in price, we'll see many more 4k options. There are a couple of 4k computer monitors out there, but they are in the $5,000+ range. Not too long ago, they were in the $20,000+ range.

    Consumer 4k is equivalent to four 1920x1080 displays. You get a real choice between a large display that fits more at a normal DPI, or a smaller display with higher DPI and crisper graphics.

    Pete
