It’s bad enough knowing that the sun’s ultraviolet rays are aging us, and that the protection science has so far delivered in the form of sunscreens is barely adequate. Turning up the heat by naming a whole other part of the light spectrum, infrared, as a deadlier culprit seems like piling on. Yet that’s what happened a few years ago, when we were told to worry more about IR than UV. Just as I started to receive emails asking which sunscreens protect against IR, new research suggested that IR isn’t as harmful as we were led to believe. With spring here and the days growing longer and warmer, it seems a good time to shed some light on infrared safety.
The initial alarm was fuelled by just how much infrared there is: IR accounts for roughly 50 percent of a person’s light exposure, compared with just 7 percent from UVA and UVB. But before you retreat to a basement, let’s understand what IR is. Infrared is one band of the electromagnetic spectrum, which also includes gamma rays, x-rays, UV rays, visible light, microwaves and radio waves.
Infrared is conventionally divided into three bands. The longer wavelengths, IRB and IRC, are said to be absorbed at or near the epidermal layer. IRA, or near infrared, sounds a lot scarier, as it is thought to penetrate deep into the dermis, where new skin cells are formed and nutrients are delivered to the skin. What we were told was that this deep penetration disrupts collagen equilibrium and attacks the mitochondria in our cells, leading to premature skin aging.
But does it? If there were really a threat from 50 percent of our light exposure, wouldn’t the consequences be dramatic? Isn’t it more likely that we have evolved some kind of resistance? As I hunted around for answers, I found that IR might even be beneficial. Today, you can even book an appointment in an infrared sauna marketed as promoting skin health.
In a 2016 study evaluating the need for IR protection in sunscreens, researchers looked at the IRA exposure of steel and glass workers who work in extreme heat and compared this to typical sun exposure. They determined that the IRA levels were similar and found no notable skin damage in the workers. They concluded that IR protection was not needed in sunscreens.
One reason for the discrepancies in these kinds of findings is that the studies which found IR to be damaging used artificial NIR light sources that were not representative of solar irradiance, such as one by Pienza et al in 2014. A very thorough article in the Journal of Photochemistry and Photobiology reviews the studies that have been conducted and shows that the artificial heat-inducing devices used did indeed damage collagen, but at intensities far higher than would be received from normal exposure to the sun. Normal solar irradiance is approximately 100 mW/cm², while one Korean study that concluded IR caused wrinkles used levels of 1200 mW/cm² for up to 15 weeks.
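To put those numbers in perspective, here is a quick back-of-the-envelope comparison in Python. The irradiance values are the ones quoted above; the variable names are my own, chosen for illustration:

```python
# Irradiance values cited in the text, in mW/cm^2.
solar_irradiance = 100    # approximate natural solar irradiance at the skin
study_irradiance = 1200   # artificial NIR source used in the Korean study

# How many times more intense was the experimental source than the sun?
ratio = study_irradiance / solar_irradiance
print(f"The study's light source was {ratio:.0f}x natural solar intensity.")
```

In other words, the device that produced wrinkles delivered roughly twelve times the intensity of ordinary sunlight, which is the crux of the argument that those findings do not translate to everyday exposure.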
On the other hand, there is good research showing that IR is beneficial. For example, therapies using IR can promote the proliferation of specific cells, increase the expression of anti-inflammatory cytokines and suppress the synthesis of pro-inflammatory mediators. And to top it all off, there seems to be evidence that exposure to IR actually helps our bodies protect against the deleterious effects of UV light. For me, all this suggests that IR scaremongering has just been keeping us in the dark. I’ll be sticking to my regular sunscreens and as many topical antioxidants as my skin can bear to protect against the environmental stressors known to age us.