Nice to see there's a good videophile blog in town -- the world has really needed one since audioholics got whiny.
I have one of those Shields connected to a trash TV, and now that I'm running Plex I'm far less satisfied with it than with the Xbox One on the other TV. It just has so many little annoyances: the controller seems to discharge itself between every use, the Android interface only almost makes sense, etc. It sits next to a used Blu-ray player so I can play discs.
I am thinking about replacing it with a used PS4, or maybe a PS5, so I can get a great Plex client that also plays discs; a used PS4 costs less than a new Shield. Even though I'm disinvesting from Xbox in favor of Steam, I'm sure there are still some PS4 exclusives I'd like to play. And it's late enough in the lifecycle that a PS5 doesn't seem crazy to me either; it would be future-proof in terms of supporting a future high-end TV.
> So how bright do we want/need our displays to be? How many bits? For me, even a 500 nits TV in a dark room is enough viewing from about 10' away.
The advantage of high peak brightness (e.g., >1000 nits) is brighter small regions such as specular highlights. 500 nits full-field is eye-searing.
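To put rough numbers on the "how many bits" question, here's a quick Python sketch of the PQ (SMPTE ST 2084) inverse EOTF used by HDR10 -- my own illustration using the published constants, not anything from the article:

    # PQ (SMPTE ST 2084) inverse EOTF: absolute luminance -> normalized signal.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        """Map 0-10000 nits to a 0-1 PQ signal value."""
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    for nits in (100, 500, 1000, 10000):
        n = pq_encode(nits)
        # 10-bit code shown as a full-range approximation
        print(f"{nits:>5} nits -> PQ {n:.3f} (~code {round(n * 1023)} of 1023)")

Running it shows roughly half the code values sit below 100 nits and everything above 1000 nits fits in the top quarter of the range, which is why 10 bits plus PQ can cover specular highlights without starving the lows.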
500 nits is extremely bright. The standard for professional imaging has been 100 candelas per square meter (nits) for decades. In fact, when you go to a theater, you are typically looking at 30 to 50 nits.
Modern high dynamic range imaging can enhance specular highlights when and where necessary. That said, due to observer adaptation this tends to be largely pointless in most environments. As someone mentioned, the viewing environment can be far more critical than the screen parameters: even a 100 nit screen can feel blinding in a dark environment before your eyes adapt.
Something most people don't realize is that the quality of the blacks or lowlights in an image is a perceptual effect, not an absolute measurement; it depends on the adaptation state of the human visual system. This means that a super-bright screen in a reflective environment will "pollute" your black-level perception, with the net effect of collapsing the range of the image (everything darker than a certain perceptual point will seem black).
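A quick way to see the collapse: fold the room light reflected off the screen into both the white and black levels and compute the effective contrast ratio. This is my own back-of-the-envelope sketch with made-up reflection numbers, not measured data:

    # Effective (ambient) contrast: reflected room light adds to everything
    # the screen emits, lifting black proportionally far more than white.
    def effective_contrast(white_nits, black_nits, reflected_nits):
        return (white_nits + reflected_nits) / (black_nits + reflected_nits)

    # Hypothetical OLED-like panel: 500 nit white, 0.005 nit black.
    for reflected in (0.0, 0.5, 5.0):  # dark room, dim room, lit room
        cr = effective_contrast(500, 0.005, reflected)
        print(f"{reflected:>3} nits reflected -> contrast ~{cr:,.0f}:1")

Half a nit of reflected light already drops a 100,000:1 panel to about 1,000:1, and everything near black merges into a single perceptual level.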
As a matter of course, I cut the brightness of all of my computer monitors by at least 50%. I am convinced that a huge part of the visual fatigue people complain about when working long hours comes from staring at a light bulb (the screen) pounding them at 500 nits all day. There is no doubt that has negative consequences.
Source: Among other things, I studied Color Science at the Rochester Institute of Technology.
A properly light-controlled, dark ambient room does a lot more for highlights than higher peak brightness does.
When is someone going to properly reverse-engineer Dolby Vision and make a DV-to-HDR10+ translator?
mpv has been able to do this for years.
One note missing from the YCbCr color space explanation is that, while HDMI does support YCbCr, the spec requires all video sources to also offer an RGB stream. Thus it is not possible to use HDMI for a device that can only output YCbCr.
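The conversion itself is cheap, which is presumably why the spec can mandate it. Here's a minimal sketch of the standard BT.709 limited-range YCbCr-to-RGB math -- my own illustration, not anything from the HDMI spec:

    # BT.709 limited-range 8-bit YCbCr -> RGB: the conversion a YCbCr-native
    # source performs so it can also offer the mandatory RGB stream.
    def ycbcr709_to_rgb(y: int, cb: int, cr: int) -> tuple[int, int, int]:
        yl, cbl, crl = y - 16, cb - 128, cr - 128  # remove range offsets
        r = 1.164 * yl + 1.793 * crl
        g = 1.164 * yl - 0.213 * cbl - 0.533 * crl
        b = 1.164 * yl + 2.112 * cbl
        clamp = lambda v: max(0, min(255, round(v)))
        return clamp(r), clamp(g), clamp(b)

    print(ycbcr709_to_rgb(235, 128, 128))  # video white -> (255, 255, 255)
    print(ycbcr709_to_rgb(16, 128, 128))   # video black -> (0, 0, 0)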
You can, you just won't get the HDMI compliance sticker.