What we learnt in basic science can help us improve our photographs

Basic knowledge of the science of light will help you get the best results from your camera. Here are some simplified explanations of how the science you learned in the classroom affects what we do with our cameras.

There’s a lot more to light than meets the eye. Okay, that’s an awful pun, but it is both metaphorically and literally correct. Not only is this subject far more complicated than everyday experience suggests, but more light hits your eye than your retina can convert into nerve impulses and send to the brain. And even less of the light landing on your camera’s sensor gets turned into an electronic signal for the camera’s processor.

In other words, our eyes can’t see all the light there is; much of it is invisible to us. Even so, they see more of it than our cameras do.

Take Control of Your Dynamic Range

When it’s bright, our eyes can see the brilliant highlights in the sky at the same time as the dark shadows beneath objects. Your camera, with a single exposure, can’t see as wide a range of tones. Modern sensor technology, however, has improved dramatically and, as you can tell from the above image, it can now capture both the highlights and the details in dark areas. It wasn’t many years ago that I would have had to bracket the exposures of that scene and combine them into a high dynamic range (HDR) image to be able to see the top of the lighthouse while pointing the camera directly at the sunrise.

As we age, we gradually lose the ability to perceive as wide a range of tones. Even so, our eyes can distinguish somewhere between 18 and 20 stops, depending on how it is measured. Cameras at the top of their range manage 12-15 stops, although one sensor released this year claims 24.6 stops. Don’t worry too much about that, though; your camera will still take fabulous pictures.
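If you want to put rough numbers on those stops, each stop represents a doubling of the light, so a dynamic range of n stops corresponds to a brightest-to-darkest ratio of 2ⁿ. Here is a minimal Python sketch of that arithmetic; the figures are only illustrative and not measurements from any particular camera.

```python
import math

def contrast_ratio(stops):
    """Each stop doubles the light, so n stops spans a 2**n brightness ratio."""
    return 2 ** stops

def stops_from_ratio(ratio):
    """Going the other way: dynamic range in stops is log2 of the ratio."""
    return math.log2(ratio)

print(contrast_ratio(14))        # a ~14-stop camera: roughly 16,000:1
print(contrast_ratio(20))        # ~20 stops (eye-like): roughly 1,000,000:1
print(stops_from_ratio(16384))   # 14.0
```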

It is important to understand the role of visible light in our lives

We can only see a tiny fraction of the electromagnetic spectrum. I find it quite incredible that we can differentiate the colors of the spectrum within a range that’s only about 320 nanometers wide, distinguishing between all seven colors that make up white light as well as their combinations.

It is fortunate for us that most of the photons that reach our planet’s surface fall in the band from 380 to 700 nanometers. Our planet sits in the Goldilocks Zone, at just the right distance from the right type of star. Our atmosphere also has a layer of ozone that blocks much of the ultraviolet light, which would otherwise cause sunburn and other harm. If we were exposed to more energetic radiation, such as far UV and gamma rays, we would be cooked. At the other end of the spectrum, if our vision relied on longer wavelengths, we would need much larger eyes, and threading needles would become difficult.

What is the impact of light bending on our images?

As the earth spins and dawn arrives, we see the sun appear before it is physically above the horizon. That’s because, just as when it passes through a prism or raindrops, light bends when it enters the atmosphere. This bending, or refraction, is what lets us see a little way beyond the horizon. It’s the same effect you see when you put a spoon into a glass of water and it appears to bend.

Seven different colors make up white light: red, orange, yellow, green, blue, indigo, and violet. They are the colors of the rainbow and, of course, the cover of the Pink Floyd album The Dark Side of the Moon.

Each color travels with a unique wavelength. White light is broken up into its component colors because the different wavelengths are bent by different amounts. Red light has the longest wavelength and is refracted the least; violet light, with the shortest wavelength, is slowed the most and refracted the most. This splitting of the light is known as dispersion. Diamond, as you might expect, has an exceptionally high refractive index and strong dispersion, which is what makes it sparkle.
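To make that concrete, here is a small Snell’s-law sketch. The refractive indices are approximate values for a common crown glass (BK7), chosen only to show that violet bends slightly more than red; they are not figures for any particular photographic lens.

```python
import math

def refraction_angle_deg(incidence_deg, n_glass, n_air=1.0):
    """Snell's law: n_air * sin(theta_in) = n_glass * sin(theta_out)."""
    theta_in = math.radians(incidence_deg)
    theta_out = math.asin(n_air * math.sin(theta_in) / n_glass)
    return math.degrees(theta_out)

# Approximate indices for BK7 crown glass (illustrative values only)
n_red = 1.514     # ~656 nm
n_violet = 1.526  # ~436 nm

print(refraction_angle_deg(45, n_red))     # ~27.9 degrees: red is bent less
print(refraction_angle_deg(45, n_violet))  # ~27.6 degrees: violet is bent more
```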

In photography, this splitting of light is generally undesirable; rainbows are the last thing we want from our lenses. It shows up as color fringing around the edges of high-contrast areas, a fault known as chromatic aberration that is common in cheap lenses. A perfect lens would have no such aberrations: all wavelengths would converge at a single point on the sensor. To get close to that ideal, lens manufacturers combine several glass elements that work together inside the lens. Lens manufacturing technology is improving constantly, and modern professional lenses have virtually no visible aberrations.

Light can also be bent by diffraction when it strikes an edge. Just as water ripples bend around an obstacle in their path, light bends around an obstruction.

Why we usually avoid very small apertures

Look at your own shadow on a sunny day and it will be easy to see the difference between its darkest part and its lighter edges. The lighter edges are caused by light bending around you. The darker core of the shadow is known as the umbra, and the lighter fringe is the penumbra. You will also notice that shadows become sharper the farther you are from the light source, which is worth keeping in mind when you use studio lighting and flash.

The same light-bending happens when photons bounce around and bend past the blades of a lens aperture. The smaller the aperture, the greater the bouncing and bending, because the proportion of diffracted to un-diffracted light is higher at small apertures.

That is why photographers avoid very small apertures: the higher proportion of diffracted light makes the image less sharp.
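A common rule of thumb, not something specific to this article, is that the diffraction blur spot (the Airy disk) has a diameter of roughly 2.44 × wavelength × f-number. Once that spot grows much larger than a pixel, stopping down further costs sharpness. A rough sketch, with the pixel size used only as an assumed example:

```python
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Approximate Airy disk diameter (to the first dark ring), in microns: 2.44 * lambda * N."""
    return 2.44 * (wavelength_nm / 1000) * f_number

PIXEL_PITCH_UM = 4.0  # assumed pixel size of a typical modern sensor

for f in (4, 8, 16, 22):
    blur = airy_disk_diameter_um(f)
    print(f"f/{f}: blur spot ~{blur:.1f} µm vs {PIXEL_PITCH_UM} µm pixels")
# By f/16 the blur spot (~21 µm) dwarfs the pixels, so fine detail softens.
```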

Why the sky is blue

This bouncing, properly called scattering, happens not only when light hits boundaries but also when it encounters particles and molecules in the air. Blue light has a shorter wavelength and is therefore scattered more readily than red, which is why the sky is blue during the day.
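The effect at work here is Rayleigh scattering, in which the amount of scattering grows roughly in proportion to 1/λ⁴. That fourth power is why the blue end of the spectrum dominates the sky; here is a quick comparison, with wavelengths chosen purely for illustration:

```python
def relative_scattering(wavelength_nm, reference_nm=550.0):
    """Rayleigh scattering strength scales as 1 / wavelength^4."""
    return (reference_nm / wavelength_nm) ** 4

blue_nm, red_nm = 450, 650
ratio = relative_scattering(blue_nm) / relative_scattering(red_nm)
print(ratio)  # ~4.4: blue light is scattered roughly four times more than red
```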

When we look toward the horizon, the sky appears whiter. That light has passed through far more atmosphere, so it has been scattered many more times, which dilutes the blueness. Looking obliquely through the atmosphere also means the extra particles scatter the other colors too, and light reflected from the planet’s surface adds to the mix.

All these factors mix the different wavelengths back together, resulting in a whiter light. Over the ocean, however, blue light reflected from the water can tint the sky near the horizon blue.

CPL Filters

This scattered light is polarized: instead of vibrating in random directions, the waves oscillate in a particular plane, and the effect is strongest in the parts of the sky at 90° to the sun. If you attach a CPL (circular polarizing) filter to your lens, you can cut down that polarized light, which makes the sky look darker.

Polarizing filters are also great for removing reflections from the water’s surface, because that reflected light is polarized too; cutting it out lets you see more clearly what lies beneath. Similarly, removing the glare from damp leaves in autumn brings out their rich colors.

Physics Behind Those Glorious Sunsets

As the sun gets lower in the sky, its light must travel through more air before reaching you, and the air at low altitude is dusty and filled with water vapor. The blue light is scattered away even more strongly, leaving us with only the warm reds and oranges.

Warm colors are not warm at all

“Warm colors” is a psychological term. We think of reds, oranges, and yellows as warm and blues as cool, but in physics it’s the reverse. Imagine a blacksmith heating a piece of metal: it starts out glowing red, then turns yellow, and shifts toward a bluish white as it gets hotter still. The gas torch a welder uses is hotter yet, hot enough to melt steel, and it burns with a blue flame.
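This is the idea behind the color temperature scale photographers use, measured in kelvins: the hotter a glowing object, the shorter the wavelength at which it emits most strongly (Wien’s displacement law). A small sketch with rounded, illustrative temperatures:

```python
WIEN_CONSTANT_NM_K = 2.898e6  # Wien's displacement constant, in nm·K

def peak_wavelength_nm(temperature_k):
    """Peak emission wavelength of an ideal black body at this temperature."""
    return WIEN_CONSTANT_NM_K / temperature_k

print(peak_wavelength_nm(1800))  # dull-red hot metal: peak ~1600 nm (infrared)
print(peak_wavelength_nm(3200))  # tungsten studio lamp: ~900 nm, warm-looking light
print(peak_wavelength_nm(5500))  # "daylight" white balance: ~530 nm, mid-visible
```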

There’s a reason your camera uses red, green, and blue

It is something I get asked a lot: why do computer screens and photographic sensors use only red, green, and blue rather than the full run of colors such as red, orange, yellow, green, and blue? The answer is more about engineering than science. From an engineering standpoint, combining just these three colors is enough to produce white, and mixing them in different proportions approximates every other hue. Handling all seven colors of the spectrum separately on a computer screen or camera sensor would be expensive and complex.

As with everything else in photography, there are compromises. Cameras and computer screens that rely on the primary colors red, green, and blue cannot reproduce the full range of colors our eyes can see.

Even that isn’t as simple as it first seems, because the range of reproducible colors varies from device to device. First there is the virtual version, the most accurate: it’s what your camera records when shooting raw and what your editing software understands your image to be. Then there is a more restricted version, the one that appears on your monitor, or in-camera if you’re shooting JPEGs. The version you can print is different again.

To keep all these versions consistent, we use color management, which defines the maximum and minimum values of red, green, and blue a device can reproduce. Whole books have been written about this, and there’s far too much to cover in this article. The short version: set your screen, your printer, and your camera (if you’re not shooting raw) to the same color space.

sRGB is the most common color space. Adobe RGB is a wider profile with more available colors and used to be the standard for high-end printing. ProPhoto RGB offers a wider range still, wider than most screens and printers can actually reproduce. Things have changed, though, and the printers I use now rely on color profiles created for each specific printer and paper type; these profiles provide the best color accuracy.
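In practice, moving an image between color spaces is done through ICC profiles, and most editing software handles it for you when you export. As a hedged illustration only, here is how it might look with Pillow’s ImageCms module in Python; the file names and the Adobe RGB profile path are placeholders, not anything referenced in this article.

```python
from PIL import Image, ImageCms

# Hypothetical image tagged with an Adobe RGB profile
img = Image.open("photo_adobergb.jpg")

# Placeholder path: point this at a real Adobe RGB (1998) .icc file on your system
adobe_rgb = ImageCms.getOpenProfile("AdobeRGB1998.icc")
srgb = ImageCms.createProfile("sRGB")

# Remap the colors into sRGB so the image displays predictably on the web
converted = ImageCms.profileToProfile(img, adobe_rgb, srgb, outputMode="RGB")
converted.save("photo_srgb.jpg")
```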

For most photographers, it is enough to remember to work in a color space no larger than the gamut their output device can actually handle.

How to tell the difference between additive and subtractive color

When projected light is mixed, the primary colors are red, green, and blue. Mixed in pairs, they produce the secondary colors cyan, magenta, and yellow; red and green light, for example, make yellow. This is additive color, and a screen’s color range is determined by mixing different amounts of those colored lights.

When we print, the inks subtract, or remove, colors from white light: each ink reflects some wavelengths while absorbing others. Mixing cyan, magenta, and yellow inks in various proportions, together with black, gives us a different range of colors. That range is what color management calls a gamut.
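One way to see how the two systems relate: additive RGB and subtractive CMY are, in an idealized model, complements of one another. Ignoring real-world ink behavior and the separate black channel, a toy sketch looks like this:

```python
def mix_light(*colors):
    """Additive mixing: light from each source adds, clipped to 1.0 per channel."""
    return tuple(min(1.0, sum(c[i] for c in colors)) for i in range(3))

def rgb_to_cmy(r, g, b):
    """Idealized subtractive model: each ink removes its complementary light (0-1 scale)."""
    return (1 - r, 1 - g, 1 - b)

red, green = (1, 0, 0), (0, 1, 0)
print(mix_light(red, green))  # (1, 1, 0): red + green light appears yellow
print(rgb_to_cmy(1, 1, 0))    # (0, 0, 1): printing that yellow needs only yellow ink
```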

By working in a single color space, we ensure that we only use the colors where the two gamuts overlap. If we tried to display or print colors beyond what our screens and printers are capable of, we would see strange results.

You Can’t Control the Color of Everything

In a similar vein, your monitor has controls to adjust, at a minimum, brightness and contrast, and so does everyone else’s. Calibrating your screen is an important step in making sure the colors and tones of your prints match what you see on the display. Of course, when you share an image online, most other people won’t have calibrated screens, so your photos may appear brighter or darker, more or less saturated, and different in contrast on their devices. You can’t change that. Calibrating your monitor still matters, though, if you print your photographs and want accurate results, or if you share photos with others who have calibrated their displays.

Click here to read more

This article, of course, only scratches the surface of these topics, and there is plenty more to learn about each one I’ve covered. There’s an abundance of information in other Fstoppers articles, some of which go into far greater detail on the subjects I’ve briefly touched on here.
