David Mills

Mid-Summer Aerial Scenery is SUPPOSED to be "washed out."


7 minutes ago, Bobsk8 said:

HDR is a different encoding, and if the monitor is not programmed to read it, nothing will happen. It's like speaking Chinese to someone who only understands English.

Indeed it is.


Jim Barrett

Licensed Airframe & Powerplant Mechanic, Avionics, Electrical & Air Data Systems Specialist. Qualified on: Falcon 900, CRJ-200, Dornier 328-100, Hawker 850XP and 1000, Lear 35, 45, 55 and 60, Gulfstream IV and 550, Embraer 135, Beech Premiere and 400A, MD-80.


Oh my! Why would someone make a lame excuse like a thesis for such an obvious nasty intrusion on our eyes? Holy krap! Save it for your watering eyes!

Edited by AboveAtlantic


It's bad and I would like it fixed, but I don't feel it's bad enough to keep me from flying.

1 hour ago, JRBarrett said:

Interestingly, until recently I did not understand that HDR is a monitor function and will not work on a non-HDR monitor.

In the wider sense of HDR, this is not true. Lots of people seem to be confused over what HDR actually is, and the term is being muddied somewhat by the marketing of HDR monitors, although the clue as to what it really is lies in the name: high dynamic range.

The gist of what HDR actually is, is as follows: imagine you spend an afternoon in the pub, and then you go outside into the bright sunlight at 4pm on a summer's day. Initially, your eyes will have adjusted to the light levels inside the pub, courtesy of your pupils getting larger to allow more light in, so when you walk outside into the bright sunlight, the concrete of the pavement looks glaringly bright and you cannot see the details of its texture until your pupils shrink down to limit the amount of light coming in. When they do, you can see the details on the concrete pavement, but if you now look at areas in strong shadow, you won't be able to make out details in those areas, since your pupils have shrunk to accommodate the strong light coming off the sunlit pavement.

You can see this effect in MSFS (on a normal monitor): when you move your view up from the panel to the outside view ahead, for a second or two the view ahead has really bright clouds which don't have much detail, but then the sim replicates your eyes adjusting and sorts the exposure out. Or to put it simply, your eyes cannot handle a high dynamic range of light levels all at once.

This is the same as what you do with a camera when you stop down the aperture to photograph bright things, or open it up to photograph dark things. The most famous example of this is the Apollo 11 photographs on the Moon, which had to be taken with the lens stopped down a lot because of how bright the Lunar surface is and how much light it bounces around. This is why you cannot see stars in those pictures: the lens was stopped down too far for the light from the stars to expose on film which was set up not to blow out the images of the astronauts and the Lunar surface. You'll find the same thing happens with your camera if you try to photograph a starry sky, unless you have very sensitive film with an emulsion speed designed for the purpose, or you take a very long exposure.

So this was, and is, a problem for photographers, since it makes it difficult to get a detailed photograph of, say, buildings against a bright sky on a sunny day (or the panel in a plane and the sky through the window at the same time). If you set your camera up to get a great shot of the bright sky with a fast shutter speed, the darker buildings will be underexposed and show no detail; if you try it the other way around and set up with a longer exposure for a great shot of the buildings, the sky will be overexposed. To solve this problem, pioneer photographers in the mid-19th century came up with the technique of taking two photographs of the same subject, one long exposure and one short exposure, which they would combine into a montage of the best, most detailed areas of each shot to make one really detailed picture. This is why HDR shots look 'super real': they show what your eye could never see in reality, because your pupils can adjust for one light level or the other, but of course they can't do both at the same time. And since there were HDR photographs around in 1850, this tells you that HDR has nothing to do with having a monitor capable of displaying it, because there were no monitors of any kind at all back in 1850.

So what is HDR then, in modern terms? Modern digital cameras use either a CMOS sensor or a CCD plate (sometimes both). The first of these only used one plate for the red, green and blue (RGB) colours, so whilst RGB can display about 17 million different colour tones (not all people can see these, incidentally: the average male can see about 12 million different colours, the average female nearer to 17 million), there was no way for these early cameras to take an HDR shot unless you did what those 19th-century photographers did and took two shots at different aperture settings, then montaged the pictures together. However, more modern cameras with an HDR setting can split their R, G and B sensors so that each one functions like the single sensor of the older digital cameras and takes a full RGB image, i.e. all three colour sensors take a photograph with a gamut range of about 17 million colours, but at three different exposure settings. The camera then combines these three exposures into one picture and bob's yer uncle, you get an HDR image in real time. This is what an HDR computer game is replicating.
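The multi-exposure merging described above can be sketched in a few lines of code. This is a toy illustration only, not any camera's actual algorithm: it takes one short and one long exposure of the same scene and, per pixel, favours whichever exposure captured that pixel closest to mid-grey (all function names and values here are made up for illustration):

```python
import math

def well_exposedness(p, target=0.5, sigma=0.2):
    """Weight a pixel (0..1) by how close it is to mid-grey, i.e. well exposed."""
    return math.exp(-((p - target) ** 2) / (2 * sigma ** 2))

def fuse_exposures(short_exp, long_exp):
    """Per-pixel weighted average of two exposures of the same scene."""
    fused = []
    for s, l in zip(short_exp, long_exp):
        ws, wl = well_exposedness(s), well_exposedness(l)
        total = ws + wl
        fused.append((ws * s + wl * l) / total if total else (s + l) / 2)
    return fused

# Pixel 0 is bright sky: usable in the short exposure, blown out in the long.
# Pixel 1 is a dark building: crushed in the short exposure, usable in the long.
short = [0.55, 0.02]
long_ = [0.99, 0.45]
print(fuse_exposures(short, long_))  # each pixel lands near its well-exposed value
```

The fused result keeps the sky detail from the short exposure and the shadow detail from the long one, which is exactly the 'super real' look of an HDR montage.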

So it simply is not true that HDR is a monitor function; that's what the marketing people who are trying to get you to buy an HDR monitor are saying. Any monitor - even a black and white one from years ago - could display an HDR image, albeit in black and white; it would still be showing the detail from the multiple exposures, just not in colour. An old colour CRT monitor from years ago can cheerfully display an HDR image in colour, because HDR is not about the gamut range, it's about combining exposure levels. Below is an HDR image from 1856. You would never see this in reality with your own eye - there is simply too much difference in the light levels between the sky, sea and land for your pupils to adjust for all that detail in one go - but replicated in a photograph, you can see all that detail just fine, and on any kind of monitor:

[Image: Gustave Le Gray, "Brig upon the Water", 1856]

Now, it is true that there are some HDR modes in software which are specifically tailored to the way monitors sold as HDR monitors function and decode those signals. But as noted, conceptually HDR itself is simply a big range of exposure, beyond what the human eye could normally see 'all in one go'.

Edited by Chock

Alan Bradbury

Check out my youtube flight sim videos: Here


Just to add to what Chock said, as related more to a direct video-device thing...

When it's all purely digital, there are bit-depth complications which can cause issues. The original HDR standards started in photography, and technically any HDR image can be tone-mapped to SDR regardless, but with a loss in bit depth. The color standards are often separate from HDR itself, which is more about the EOTF encoding (the light output of each tonal range), but some HDR standards assume a certain minimum color spec, usually at least DCI-P3, which is not the absolute widest color gamut but sits between the old Rec 709 colors and the even wider Rec 2020.

So basically the HDR standard is separate from the color standard and applies more to the GAMMA, but for commercial reasons they are packaged together: they were for the most part meant to go together, and an HDR signal is assumed to use a slightly wider color gamut along with the wider output range. You can also do the reverse and only use the wider color gamut without the more sophisticated gamma.

The problem is the HDR standards can get complicated and messy, so it's not always easy even for the production remaster company to strictly adhere to such standards.
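To illustrate the EOTF idea mentioned above: HDR10 uses the SMPTE ST 2084 "PQ" curve, which maps a normalized 0-1 code value to an absolute luminance in nits. A minimal sketch follows; the constants come from the published standard, but this is illustrative code, not a production implementation:

```python
# SMPTE ST 2084 (PQ) EOTF: decode a normalized code value in [0, 1]
# to an absolute luminance in nits (cd/m^2). Constants per the standard.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(v):
    """Map a PQ code value v in [0, 1] to luminance in nits."""
    p = v ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0 nits (black)
print(pq_eotf(0.5))  # ~92 nits, roughly SDR reference-white territory
print(pq_eotf(1.0))  # 10000 nits, PQ's absolute peak
```

Note how much of the code range PQ spends on the dark-to-100-nit region: half of the available code values cover only the first ~92 nits, with the rest reserved for highlights, which is the "more shades of grey where the eye needs them" idea.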

Edited by Alpine Scenery

AMD 5800x | Nvidia 3080 (12gb) | 64gb ram


The real issue with standard dynamic range is that our eyes are capable of distinguishing more than 256 shades of light and dark. Unless the lighting is extreme (like standing on a bright beach peering into a dark change room), our eyes are more than capable of coping with dark and light and getting details in both, though our ability to do this deteriorates as we get older (the reason young men wear sunglasses for fashion but old men wear them because they need them).

With only 256 "shades of grey" (to mimic the book title) you are forced to lose some information. You can compress the bottom end and have dark, featureless shadows, or compress the top end and have washed-out bright parts, or (more commonly) limit how dark and light you can go so that 256 shades more or less covers the reduced range of "sort of white" through to "kind of darkish".

Now whilst HDR screens do often also give you a wider color gamut (often 10-bit or 12-bit), that is not the actual HDR bit. What makes them HDR compared to conventional screens is that individual, quite small parts of the screen can go much darker and much lighter than you get with a conventional screen (which usually has a single, maybe 300-nit backlight for the entire screen), so you can end up with one part of the screen almost totally black and somewhere else lit at 1000 nits. An old-school SDR screen cannot physically do this: its backlight is not bright enough, it is never completely off either, and it changes the backlighting for the entire screen at once. The myth that genuine HDR10 screens are just SDR screens with a bit of marketing spin is just that, an internet conspiracy.

To take advantage of this wider "dynamic range", the video signal has to have a lot more than 256 shades of grey; an HDR signal is not just more colors, it has more shades of grey. The end result, if done well, is blacker blacks and whiter whites and more detail in both, all on the screen at the same time.

NOW - one of the issues with an application creating an HDR signal with a much wider color gamut and much wider dynamic range is converting that signal back down to an SDR format for use on old-school hardware. With only "256 shades of grey" you absolutely MUST throw away some of your information, as you only have 256 levels to work with in SDR. If you do it badly, you lose detail in the whites and they wash out, or depth in the blacks, which become murky. Sound familiar?
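The information loss when going back down to SDR can be shown with a toy example. A hard clip makes two distinct HDR highlights land on the same 8-bit value (washed out), while a Reinhard-style compression curve (a common tone-mapping operator, used here purely as an example, with made-up parameter values) keeps them distinct:

```python
# Toy illustration of HDR-to-SDR tone mapping with 8-bit output.
# sdr_peak is an assumed 100-nit SDR reference white, for illustration.

def clip_to_sdr(nits, sdr_peak=100.0):
    """Naive clip: anything brighter than the SDR peak washes out."""
    level = min(nits, sdr_peak) / sdr_peak  # normalize to [0, 1]
    return round(level * 255)               # quantize to 8 bits

def reinhard_to_sdr(nits, sdr_peak=100.0):
    """Reinhard-style compression: smoothly maps [0, inf) into [0, 1)."""
    level = nits / (nits + sdr_peak)
    return round(level * 255)

# Two clearly different HDR highlights, 400 and 1000 nits:
print(clip_to_sdr(400), clip_to_sdr(1000))          # 255 255 - both wash out
print(reinhard_to_sdr(400), reinhard_to_sdr(1000))  # 204 232 - still distinct
```

Done badly (the clip), the two highlights become indistinguishable white; done well (the curve), some highlight separation survives, at the cost of compressing the rest of the range.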

Edited by Glenn Fitzpatrick


Intrascene contrast ratio is separate from HDR, because contrast ratio varies from one display to the next and cannot be pinned down by a standard. So the standards can only use peak nits (white level) as the output factor in the EOTF, or HDR gamma.

So as the peak white rises, the black floor also rises, so depending on the display's intrascene contrast ability, you will still get two different contrast ratios from the same image displayed on two separate TVs, even though it's an HDR image and it's a standard.

I could go on forever, but I'll stop there.

 

Edited by Alpine Scenery

AMD 5800x | Nvidia 3080 (12gb) | 64gb ram

