1 upvotes, 2 direct replies (showing 2)
View submission: Ask Anything Wednesday - Engineering, Mathematics, Computer Science
Thanks! I didn’t realise human perception was so key to the analysis.
Comment by Indemnity4 at 25/07/2024 at 02:00 UTC*
2 upvotes, 0 direct replies
A great example of *biology versus displays* is high-visibility and fluorescent clothing. Think of the white lines on a road at night, or the clothing worn by contractors.
To your eyes it appears hyper-intense, but take a photo with a camera and the image looks dull and muted.
As well as *hue* (the dominant wavelength of the light), we also detect *chroma/saturation* (how vivid versus washed-out a colour looks) and overall *brightness* (like turning the brightness up or down on your screen).
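If it helps to see hue, saturation and brightness as three separate numbers, here's a tiny Python sketch (my own illustration using the standard-library `colorsys` module, not anything from the thread):

```python
import colorsys

# A saturated orange, a washed-out (low-saturation) orange, and a dim orange,
# all as RGB values in the 0..1 range. These colours are made up for the demo.
colors = {
    "saturated orange": (1.0, 0.5, 0.0),
    "washed-out orange": (1.0, 0.8, 0.6),
    "dim orange": (0.4, 0.2, 0.0),
}

for name, (r, g, b) in colors.items():
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Hue comes back as a fraction of the colour wheel (0..1);
    # saturation and value (brightness) also run 0..1.
    print(f"{name:18s} hue={h:.2f} saturation={s:.2f} brightness={v:.2f}")
```

All three print roughly the same hue (~0.08, i.e. orange), but very different saturation and brightness, which is the point: they are independent attributes.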
We also have white balance. The best example of this is trying to photograph or film a person with black skin. Quite frequently a dark-skinned actor's skin tone will change between different movies. Sometimes they appear so dark you cannot make out any facial features; other times they appear much fairer-toned and get accused of makeup or skin lightening. It's up to the director of photography to adjust the white balance of a scene, and historically, they suck at that job.
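For a rough idea of what "adjusting the white balance" means numerically, here's a minimal sketch of the common grey-world approach (my own illustration; real cinema colour grading is far more involved): scale the red and blue channels so their averages match the green channel's average.

```python
def gray_world_white_balance(pixels):
    """Very rough white balance: assume the scene averages to neutral grey,
    then scale red and blue so their means match the green mean.

    `pixels` is a list of (r, g, b) tuples with values in 0..1.
    """
    n = len(pixels)
    mean_r = sum(p[0] for p in pixels) / n
    mean_g = sum(p[1] for p in pixels) / n
    mean_b = sum(p[2] for p in pixels) / n

    gain_r = mean_g / mean_r
    gain_b = mean_g / mean_b

    # Apply the per-channel gains and clamp back into the 0..1 range.
    return [
        (min(r * gain_r, 1.0), g, min(b * gain_b, 1.0))
        for r, g, b in pixels
    ]

# A scene lit with a warm (reddish) light: the red channel reads too high.
warm_scene = [(0.9, 0.6, 0.4), (0.5, 0.3, 0.2), (0.8, 0.55, 0.35)]
print(gray_world_white_balance(warm_scene))
```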
Comment by chilidoggo at 24/07/2024 at 21:42 UTC
2 upvotes, 0 direct replies
It's more than just perception; there is a hard-science explanation as well.
When you look at how light is usually detected (biologically *and* electronically), you'll find that a sensor is usually only good at receiving light at one optimum wavelength, with a bell curve of sensitivity around it. The mechanics are that the "sensor" converts the photon's energy into an electrical impulse, but less efficiently the further the wavelength is from that target. Think of it like a radio signal getting more static as you tune away from the station.
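To make the bell-curve idea concrete, here's a tiny Python toy model (my own sketch, with made-up peak and width): a Gaussian sensitivity that is 1.0 at the sensor's peak wavelength and falls off the further the light is from it.

```python
import math

def sensitivity(wavelength_nm, peak_nm, width_nm=50.0):
    """Bell-curve (Gaussian) response: 1.0 at the peak wavelength,
    dropping off the further the incoming light is from that peak."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width_nm ** 2))

PEAK = 550.0  # a green-ish sensor (peak and width are arbitrary demo numbers)

for wavelength in (550, 575, 600, 650, 700):
    # Like tuning a radio away from the station: the signal fades with distance.
    print(f"{wavelength} nm -> {sensitivity(wavelength, PEAK):.2f}")
```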
If you have one type of sensor, you end up seeing the world only in black and white, because that electrical impulse can only go from 0 to 100%; there's no extra information it can transmit. If you plotted this black-and-white vision on your chart, you would only get a single point that just gets brighter and darker.
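Continuing the same toy model (again my own sketch): with a single sensor, a dim light at the peak wavelength and a brighter light off-peak produce exactly the same signal, so the only thing you can report is "how bright", i.e. greyscale.

```python
import math

def sensitivity(wavelength_nm, peak_nm, width_nm=50.0):
    """Gaussian bell-curve sensitivity around the sensor's peak wavelength."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width_nm ** 2))

def response(intensity, wavelength_nm, peak_nm):
    """One sensor's signal: intensity scaled by how well it 'hears' that wavelength."""
    return intensity * sensitivity(wavelength_nm, peak_nm)

PEAK = 550.0  # a single green-ish sensor

# A dim light sitting right at the peak...
dim_at_peak = response(intensity=math.exp(-0.5), wavelength_nm=550.0, peak_nm=PEAK)
# ...and a brighter light 50 nm away give the identical signal.
bright_off_peak = response(intensity=1.0, wavelength_nm=600.0, peak_nm=PEAK)
print(round(dim_at_peak, 4), round(bright_off_peak, 4))  # same number: no colour info
```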
If you have two sensors that are tuned to different wavelengths and have some overlap in their curves, you'll be able to recognize when a certain wavelength is weak in one sensor and strong in the other. On the parabola chart, you end up with a line between two points on the parabola, along which you get a sequence of colors between the two wavelengths (plus everything can get brighter and darker overall). This is how colorblind people see the world, and why there are different types of colorblindness: you can be missing any one of the three sensor types, or have one shifted so you only get a smaller triangle.
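Extending the toy model to two overlapping sensors (peaks and widths still made up): the *ratio* of the two signals pins down where the wavelength sits between the two peaks, independent of how bright the light is.

```python
import math

def sensitivity(wavelength_nm, peak_nm, width_nm=50.0):
    """Gaussian bell-curve sensitivity around the sensor's peak wavelength."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width_nm ** 2))

def responses(intensity, wavelength_nm, peaks):
    """Signals from several sensors viewing the same monochromatic light."""
    return [intensity * sensitivity(wavelength_nm, p) for p in peaks]

PEAKS = (530.0, 600.0)  # two sensors with overlapping curves

for intensity, wavelength in [(1.0, 560.0), (0.3, 560.0), (1.0, 590.0)]:
    s1, s2 = responses(intensity, wavelength, PEAKS)
    # The ratio depends only on the wavelength, not on brightness, so the dim
    # and bright 560 nm lights register as the same "colour".
    print(f"{wavelength:.0f} nm, intensity {intensity:.1f}: ratio s1/s2 = {s1 / s2:.3f}")
```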
In reality, we usually have three types of light-catching cells (cones), which is why it's a triangle. If we were to "redesign" the human race, maybe we would recalibrate the wavelengths they're tuned to so we could distinguish every single wavelength, and hell, even see into the UV and IR spectrums.
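And with three overlapping sensors you can divide out the overall brightness by normalising the three signals to sum to 1; the two independent numbers that remain are exactly the 2-D "triangle" chart being discussed. A rough sketch with made-up stand-ins for the S/M/L cone peaks:

```python
import math

def sensitivity(wavelength_nm, peak_nm, width_nm=50.0):
    """Gaussian bell-curve sensitivity around the sensor's peak wavelength."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width_nm ** 2))

# Rough stand-ins for the S, M and L cone peak sensitivities (nm); not exact values.
CONE_PEAKS = (445.0, 540.0, 565.0)

def chromaticity(intensity, wavelength_nm):
    """Normalise the three cone signals so they sum to 1.
    Brightness cancels out; only two independent numbers remain."""
    s, m, l = (intensity * sensitivity(wavelength_nm, p) for p in CONE_PEAKS)
    total = s + m + l
    return tuple(round(x / total, 3) for x in (s, m, l))

# The same wavelength at very different brightnesses lands on the same point...
print(chromaticity(1.0, 500.0))
print(chromaticity(0.2, 500.0))
# ...while a different wavelength lands somewhere else inside the triangle.
print(chromaticity(1.0, 620.0))
```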