Mid-range LCDs demystified
[QUOTE="elvesbane, post: 673661, member: 20606"]
And my response to all of those...

1) While the author of that particular article might have used prints to reach his conclusion, the whole point of it was to find the resolution of the human eye, not "what resolution of printed material is best". Besides, you can't blame him for using printed material, seeing as most displays available for purchase come nowhere near the level of detail (DPI, PPI, whatever) that a modern printer can achieve.

2) I admit your point is valid, but does anyone have any clue what the resolution of the eye 'at rest' actually is when reading text / watching movies / viewing pictures / gaming? Besides, moving from a 22" at 1680x1050 to a 21.5" at 1920x1080 takes the DPI from 90 to 102 - an increase of 12 DPI, or 13.3% - still well below the very commonly cited 300 DPI limit for human visual acuity.

3) Ignoring some information in favour of other information doesn't really help your case... it's just childish. Clarkvision mentions a MASSIVE 530 DPI as the upper limit at 20 inches, or 1.67 ft. I'll go out on a limb and assume that most people sit 2.5-3.5 ft away from their monitors (I'm at the 2.9 ft mark). Common sense tells me that distance isn't so different from Clarkvision's test distance as to render the result (and the conclusion from my previous post) irrelevant.

4) Isn't that statement in MY favour? Yes, the capabilities of graphics cards (and the demands on them) are growing rapidly - and one thing I've noticed is that everyone seems to be focusing on HD and beyond-HD resolutions for gaming. While HD gaming has been hyped for a very long time, you have to admit that even mainstream cards manage HD resolutions well enough (please don't cite Crysis as a counterexample; Crysis is weird). In any case, future graphics cards will almost certainly focus more and more on gaming at higher resolutions, not on staying in the same place. Why cap your resolution if it's likely you'll get one or more powerful graphics cards during the lifetime of the monitor?

5) According to tvcalculator.com, when watching movies at 16:9 (or wider), the 21.5" 16:9 has greater usable area than the 22" - albeit by a small margin: 197.33 sq. in. vs 195.56 sq. in. for a 16:9 source. This means that...
a) The area is greater when watching movies.
b) Pixel density is greater on the 21.5", and 1080p is shown at native resolution rather than downscaled as it would be at 1680x1050.

And yeah, it's true that I sound like I'm trying to justify my purchase. But the point is that gaming (plus movie watching) is quite possible at high or highest settings, as long as the person in question is willing to spend on a graphics card that costs about as much as the monitor. E.g. BenQ E2200HD = 11.8k, ATI Radeon 4870 = 12.8k (saw the listing in dealer's paradise! sounds nice, eh? :) ). As for me, I just finished playing Mirror's Edge at max everything (<sarcasm>except AA - only 4x - sigh...</sarcasm>) at full HD on a paltry Core 192 GTX 260. That's an AAA title, on a card whose value is less than the monitor's. :hap2:
[/QUOTE]
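Since points 2 and 5 above lean on specific numbers, here's a quick sanity check of them (mine, not from the post): PPI is the diagonal resolution divided by the diagonal size, and 16:9 content on a 16:10 panel is letterboxed, using the full width but only width × 9/16 of the height. tvcalculator.com's exact area figures differ by a fraction of a square inch, presumably due to rounding conventions:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def panel_size(diagonal_in, aspect_w, aspect_h):
    """(width, height) in inches of a panel with the given aspect ratio."""
    d = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / d, diagonal_in * aspect_h / d

def area_16_9(diagonal_in, aspect_w, aspect_h):
    """Screen area (sq in) actually used by 16:9 content;
    letterboxed when the panel is taller than 16:9 (e.g. 16:10)."""
    w, h = panel_size(diagonal_in, aspect_w, aspect_h)
    content_h = min(h, w * 9 / 16)  # content can't be taller than the panel
    return w * content_h

print(round(ppi(1680, 1050, 22.0)))        # ~90 PPI on the 22" 16:10
print(round(ppi(1920, 1080, 21.5)))        # ~102 PPI on the 21.5" 16:9
print(round(area_16_9(21.5, 16, 9), 1))    # ~197.5 sq in of 16:9 content
print(round(area_16_9(22.0, 16, 10), 1))   # ~195.8 sq in (letterboxed)
```

So the 90 → 102 PPI jump checks out, and the 21.5" really does show slightly more 16:9 picture than the 22", despite the smaller diagonal.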