Look to Your Aunts, Uncles and Parents for Clues to Your Longevity

Your chances of inheriting genes linked to longevity are highest if you come from a family with many long-lived members, researchers say.

And that includes aunts and uncles, not just parents.

Using databases at the University of Utah and in the Dutch province of Zeeland, investigators analyzed the genealogies of nearly 315,000 people from over 20,000 families dating back to 1740.

“We observed . . . the more long-lived relatives you have, the lower your hazard of dying at any point in life,” said study lead author Niels van den Berg. He is a doctoral student in molecular epidemiology at Leiden University in the Netherlands.

“For example, someone whose parents are both ‘top survivors’ has a 31 percent lower hazard of dying than someone of the same age without such parents,” van den Berg said in a University of Utah news release.

“Top survivors” are the longest-lived 10 percent of a group of people born within a given time period.

“Moreover, that person’s hazard of dying is reduced, even if the parents themselves did not live to be extremely old but aunts and uncles were among the top survivors,” van den Berg said.

“In long-lived families, parents can therefore pass on longevity genes to their children, even if external factors prohibited them from reaching the top survivors,” he explained.

The findings reinforce the idea that “there really are longevity genes to be discovered in humans,” van den Berg said.

The study was published online in the journal Nature Communications.

Researchers have long searched for genes associated with longevity, but those genes have been much more difficult to pinpoint than genes for disease, said study co-author Eline Slagboom, a professor of molecular epidemiology at Leiden University.

“This research has led us to be far stricter in selecting the people in whom you have to look for those genes,” Slagboom said.

“If you investigate a random group of people aged over 100, however exceptional they may be, it’s highly likely that many of them do not in fact belong to a family in which longevity is heritable,” Slagboom said. “Their age is probably a matter of chance, the result of a healthy lifestyle or healthy circumstances, for example during childhood, and isn’t therefore reflected in their DNA.”

Source: HealthDay


Your Heart Needs at Least 6 Hours of Sleep Per Night to Stay Healthy

Six hours: That’s the minimum amount of sleep per night you need to help your heart stay healthy, new research suggests.

The study found that chronic lack of sleep and poor sleep quality raise the odds of fatty plaque accumulation in arteries — a condition known as atherosclerosis, which increases the odds of heart attack and stroke.

There are many ways to fight heart disease, including “pharmaceuticals, physical activity and diet,” said lead researcher Jose Ordovas. “But this study emphasizes we have to include sleep as one of the weapons we use to fight heart disease — a factor we are compromising every day.”

Ordovas is an investigator at the National Center for Cardiovascular Research in Madrid, Spain.

In the new research, his team used coronary ultrasound and CT scans to track the artery health of nearly 4,000 Spanish adults. The study participants, average age 46, did not have heart disease at the beginning of the study.

The study couldn’t prove cause and effect, but people who slept less than six hours a night were 27 percent more likely to have body-wide atherosclerosis than those who slept seven to eight hours a night, Ordovas and his colleagues reported.

Too much sleep wasn’t great for the heart, either. The study also found that women who slept more than eight hours a night had an increased risk of atherosclerosis.

Participants with “poor-quality” sleep — frequent awakenings or difficulty getting to sleep — were also 34 percent more likely to have atherosclerosis, compared to those with good-quality sleep.

The study was published in the Journal of the American College of Cardiology.

“This is the first study to show that objectively measured sleep is independently associated with atherosclerosis throughout the body, not just in the heart,” Ordovas said in a journal news release. He also directs nutrition and genomics research at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University in Boston.

People who had short and poor-quality sleep also tended to consume higher levels of caffeine and alcohol, Ordovas noted.

“Many people think alcohol is a good inducer of sleep, but there’s a rebound effect,” he said. “If you drink alcohol, you may wake up after a short period of sleep and have a hard time getting back to sleep. And if you do get back to sleep, it’s often a poor-quality sleep.”

Two U.S. experts agreed that sleep is a key component of cardiovascular health.

While a direct cause-and-effect relationship between sleep and heart health remains unclear, “targeting one’s sleep habits is finally getting recognized in the medical world as an important factor to improve heart disease,” said Dr. Eugenia Gianos. She directs women’s heart health at Lenox Hill Hospital in New York City.

Gianos reasoned that behaviors in a person’s waking hours may explain the sleep-heart connection. That’s “because patients with good sleep hygiene have the energy to be physically active, make healthy food choices and handle stress better,” she said.

Dr. Thomas Kilkenny directs sleep medicine at Staten Island University Hospital, also in New York City. The new study “opens a door to further investigations to hopefully demonstrate the cause and effect between poor sleep quality and the generation of atherosclerosis disease,” he said.

“In the meantime, physicians should constantly evaluate their patients to identify sleeping disorders and stress to their patients the need to maintain at least six to eight hours of sleep per night,” Kilkenny said.

Source: HealthDay


New Portion Size Guide Tells You How Much You Should Actually Be Eating

James Rogers wrote . . . . . . . . .

Nutritionists have launched a brand-new portion size guide to tackle overeating.

The British Nutrition Foundation’s (BNF) guide spells out how much of each sort of food we should be eating.

The guide covers starchy carbohydrates, protein, dairy, fruit and vegetables, and oils and spreads.

The aim of the guide is to revolutionise our eating and tackle the obesity crisis.

It sets out the foods we should be eating, and in what portions, for a healthy diet.

Women should be eating 2,000 calories a day – and men 2,500.

According to the guide, the correct portion size for pasta is two hands cupped together.

A finger and thumb, meanwhile, is the right thickness of spaghetti.

The right amount of cheese, more worryingly for cheese lovers, is a mere two thumbs.

The suggested single portion of a grilled chicken breast, a cooked salmon fillet or a cooked steak is “about half the size of your hand”.

A baked potato should be “about the size of your fist”.

The BNF survey suggested that when it comes to pasta, we eat around 230g on average, weighed after cooking.

And that’s without any sauces or sides.

Researchers found that 10% of the people questioned eat 350g.

That’s around 500 calories from the pasta alone, while the BNF’s recommended portion is 180g.
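For a rough sense of how those gram figures translate into calories, here is a minimal sketch in Python; it assumes an energy density of about 1.4 kcal per gram of plain cooked pasta, an illustrative figure rather than one taken from the BNF guide.

    # Rough conversion of cooked-pasta portion sizes to calories.
    # ASSUMPTION: ~1.4 kcal per gram of plain cooked pasta (illustrative
    # value; the BNF guide does not state this energy density).
    KCAL_PER_GRAM_COOKED_PASTA = 1.4

    portions_g = {
        "BNF recommended portion": 180,
        "average portion reported in the survey": 230,
        "largest portions reported (top 10%)": 350,
    }

    for label, grams in portions_g.items():
        kcal = grams * KCAL_PER_GRAM_COOKED_PASTA
        print(f"{label}: {grams} g is roughly {kcal:.0f} kcal")

At that assumed density, 350g works out to roughly 490 calories, consistent with the “around 500 calories” figure above, while the recommended 180g portion comes to roughly 250 calories.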

A portion of fruit or vegetables – of which we should eat at least five a day – could be two plums, two satsumas, seven strawberries, three heaped serving spoons of peas or carrots, one medium tomato or three sticks of celery.

But it’s not all bad news.

If you do fancy snacks, you’re still allowed them – but you are told to keep them small.

They should be around 100 to 150 calories, and not too frequent.

Examples included a small chocolate biscuit bar, a small multipack bag of crisps, four small squares of chocolate (20g) or a mini muffin.

Bridget Benelam, nutrition communications manager at the BNF, said: “More often than not, portion size is not something people give much thought to.

“The amount we put on our plate typically depends on the portion sizes we are used to consuming, how hungry we feel and how much is offered as a helping at a restaurant table or in a packet/ready meal.

“Nonetheless, in order to maintain a healthy weight we should ensure that our diets contain the right balance of foods, in sensible amounts.

“This isn’t just about eating less; it’s also about eating differently.”

Louis Levy, head of nutrition sciences at Public Health England, said: “The Eatwell Guide, the nation’s healthy eating model, shows the proportion of foods that should be consumed from each food group for a healthy balanced diet.

“With the exception of fruit and vegetables, fish and red and processed meat, the government does not provide guidance on specific food portion sizes as there is no evidence to make recommendations at a population level.”

Source: Birmingham Live


Read also at British Nutrition Foundation:

Find your balance, get portion wise! . . . . .

More Evidence Marijuana May Damage the Teen Brain

Dennis Thompson wrote . . . . . . . . .

Smoking just a couple of joints may cause significant changes in a teenager’s brain structure, a new study has found.

Brain scans show that some adolescents who’ve tried marijuana just a couple of times exhibit significant increases in the volume of their gray matter.

These changes were associated with an increased risk of anxiety and with poorer performance on thinking and memory tests.

“It is important to understand why some people may be more vulnerable to brain effects of cannabis at even the earliest stages of use, as it might give us some insight into why some people transition to substance misuse while others do not,” said lead researcher Catherine Orr. She is a lecturer at Swinburne University of Technology in Melbourne, Australia.

“Also, if we can identify some of the factors that place people at greater risk of these brain effects, we need to let people know what they are so that they can make informed decisions about their substance use,” Orr continued.

However, these findings are inconsistent with earlier studies that have found no significant long-term changes in brain structure or deficits in memory, attention or other brain function that can be attributed to pot use, said Paul Armentano, deputy director of NORML, an advocacy group for reform of marijuana laws.

“The notion that even low-level exposure to cannabis results in significant brain changes is a finding that is largely out of step with decades worth of available science,” Armentano said. “Therefore, these findings ought to be regarded with caution.”

Most studies involving the effects of pot on the brain focus on heavy marijuana users. These researchers wanted to focus instead on what might happen as teens experiment with marijuana.

To that end, they gathered brain scan data obtained as part of a large research program investigating brain development and mental health in teens.

The researchers examined brain imaging of 46 kids, aged 14 years, from Ireland, England, France and Germany, who reported trying pot once or twice. They also looked at the teens’ scores on cognitive and mental health tests.

The teens’ brains showed greater gray matter volume in brain areas more affected by pot, when compared with kids who’d never toked, the study authors said.

“The regions of the brain that showed the volume effects map onto the parts of the brain that are rich in cannabinoid receptors, suggesting that the effects we observe may be a result of these receptors being stimulated by cannabis exposure,” Orr said.

Regions most affected by weed were the amygdala, which is involved in processing fear and other emotions, and the hippocampus, which is involved with memory and reasoning, the researchers said.

The findings were published in the Journal of Neuroscience.

Senior study author Hugh Garavan said, “You’re changing your brain with just one or two joints.” Garavan is a professor of psychiatry with the University of Vermont.

“Most people would likely assume that one or two joints would have no impact on the brain,” he added in a university news release.

Researchers can’t say whether these changes in the structure of the brain are permanent, Orr said. There are a lot of things that influence brain development in teens that can’t be ruled out by the data at hand.

“The imaging technology we have does not let us disentangle what differences in the adult brain may be a result of smoking pot once or twice as a 14-year-old from what differences are due to studying a second language or playing video games as a teen,” Orr said.

Yasmin Hurd, director of the Addiction Institute at Mount Sinai, in New York City, said one would expect some things to return to normal if a teen tries marijuana a couple of times and then stops.

“I would be very surprised if just a few exposures to marijuana would cause irreparable damage,” Hurd said.

On the other hand, even temporary changes in brain structure might make a person more predisposed to emotional or cognitive problems later in life, Hurd added.

Orr suggested that “if they may then use drugs later in life or are exposed to excessive stresses later in life, they’re much more vulnerable. This indicates that any drug use leaves a trace in the brain. Whether that trace has long-term consequences for subsequent disorders, that’s something that really needs to be researched.”

Source: HealthDay


Opinion: Is Sunscreen the New Margarine?

Rowan Jacobsen wrote . . . . . . . . .

These are dark days for supplements. Although they are a $30-plus billion market in the United States alone, vitamin A, vitamin C, vitamin E, selenium, beta-carotene, glucosamine, chondroitin, and fish oil have now flopped in study after study.

If there was one supplement that seemed sure to survive the rigorous tests, it was vitamin D. People with low levels of vitamin D in their blood have significantly higher rates of virtually every disease and disorder you can think of: cancer, diabetes, obesity, osteoporosis, heart attack, stroke, depression, cognitive impairment, autoimmune conditions, and more. The vitamin is required for calcium absorption and is thus essential for bone health, but as evidence mounted that lower levels of vitamin D were associated with so many diseases, health experts began suspecting that it was involved in many other biological processes as well.

And they believed that most of us weren’t getting enough of it. This made sense. Vitamin D is a hormone manufactured by the skin with the help of sunlight. It’s difficult to obtain in sufficient quantities through diet. When our ancestors lived outdoors in tropical regions and ran around half naked, this wasn’t a problem. We produced all the vitamin D we needed from the sun.

But today most of us have indoor jobs, and when we do go outside, we’ve been taught to protect ourselves from dangerous UV rays, which can cause skin cancer. Sunscreen also blocks our skin from making vitamin D, but that’s OK, says the American Academy of Dermatology, which takes a zero-tolerance stance on sun exposure: “You need to protect your skin from the sun every day, even when it’s cloudy,” it advises on its website. Better to slather on sunblock, we’ve all been told, and compensate with vitamin D pills.

Yet vitamin D supplementation has failed spectacularly in clinical trials. Five years ago, researchers were already warning that it showed zero benefit, and the evidence has only grown stronger. In November, one of the largest and most rigorous trials of the vitamin ever conducted—in which 25,871 participants received high doses for five years—found no impact on cancer, heart disease, or stroke.

How did we get it so wrong? How could people with low vitamin D levels clearly suffer higher rates of so many diseases and yet not be helped by supplementation?

As it turns out, a rogue band of researchers has had an explanation all along. And if they’re right, it means that once again we have been epically misled.

These rebels argue that what made the people with high vitamin D levels so healthy was not the vitamin itself. That was just a marker. Their vitamin D levels were high because they were getting plenty of exposure to the thing that was really responsible for their good health—that big orange ball shining down from above.

One of the leaders of this rebellion is a mild-mannered dermatologist at the University of Edinburgh named Richard Weller. For years, Weller swallowed the party line about the destructive nature of the sun’s rays. “I’m not by nature a rebel,” he insisted when I called him up this fall. “I was always the good boy that toed the line at school. This pathway is one which came from following the data rather than a desire to overturn apple carts.”

Weller’s doubts began around 2010, when he was researching nitric oxide, a molecule produced in the body that dilates blood vessels and lowers blood pressure. He discovered a previously unknown biological pathway by which the skin uses sunlight to make nitric oxide.

It was already well established that rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months. Weller put two and two together and had what he calls his “eureka moment”: Could exposing skin to sunlight lower blood pressure?

Sure enough, when he exposed volunteers to the equivalent of 30 minutes of summer sunlight without sunscreen, their nitric oxide levels went up and their blood pressure went down. Because of its connection to heart disease and strokes, high blood pressure is the leading cause of premature death and disease in the world, and the reduction was of a magnitude large enough to prevent millions of deaths on a global level.

Wouldn’t all those rays also raise rates of skin cancer? Yes, but skin cancer kills surprisingly few people: less than 3 per 100,000 in the U.S. each year. For every person who dies of skin cancer, more than 100 die from cardiovascular diseases.

People don’t realize this because several different diseases are lumped together under the term “skin cancer.” The most common by far are basal-cell carcinomas and squamous-cell carcinomas, which are almost never fatal. In fact, says Weller, “When I diagnose a basal-cell skin cancer in a patient, the first thing I say is congratulations, because you’re walking out of my office with a longer life expectancy than when you walked in.” That’s probably because people who get carcinomas, which are strongly linked to sun exposure, tend to be healthy types that are outside getting plenty of exercise and sunlight.

Melanoma, the deadly type of skin cancer, is much rarer, accounting for only 1 to 3 percent of new skin cancers. And perplexingly, outdoor workers have half the melanoma rate of indoor workers. Tanned people have lower rates in general. “The risk factor for melanoma appears to be intermittent sunshine and sunburn, especially when you’re young,” says Weller. “But there’s evidence that long-term sun exposure associates with less melanoma.”

These are pretty radical words in the established dermatological community. “We do know that melanoma is deadly,” says Yale’s David Leffell, one of the leading dermatologists in the country, “and we know that the vast majority of cases are due to sun exposure. So certainly people need to be cautious.”

Still, Weller kept finding evidence that didn’t fit the official story. Some of the best came from Pelle Lindqvist, a senior research fellow in obstetrics and gynecology at Sweden’s Karolinska Institute, home of the Nobel Prize in Physiology or Medicine. Lindqvist tracked the sunbathing habits of nearly 30,000 women in Sweden over 20 years. Originally, he was studying blood clots, which he found occurred less frequently in women who spent more time in the sun—and less frequently during the summer. Lindqvist looked at diabetes next. Sure enough, the sun worshippers had much lower rates. Melanoma? True, the sun worshippers had a higher incidence of it—but they were eight times less likely to die from it.

So Lindqvist decided to look at overall mortality rates, and the results were shocking. Over the 20 years of the study, sun avoiders were twice as likely to die as sun worshippers.

There are not many daily lifestyle choices that double your risk of dying. In a 2016 study published in the Journal of Internal Medicine, Lindqvist’s team put it in perspective: “Avoidance of sun exposure is a risk factor of a similar magnitude as smoking, in terms of life expectancy.”

The idea that slavish application of SPF 50 might be as bad for you as Marlboro 100s generated a flurry of short news items, but the idea was so weird that it didn’t break through the deadly-sun paradigm. Some doctors, in fact, found it quite dangerous.

“I don’t argue with their data,” says David Fisher, chair of the dermatology department at Massachusetts General Hospital. “But I do disagree with the implications.” The risks of skin cancer, he believes, far outweigh the benefits of sun exposure. “Somebody might take these conclusions to mean that the skin-cancer risk is worth it to lower all-cause mortality or to get a benefit in blood pressure,” he says. “I strongly disagree with that.” It is not worth it, he says, unless all other options for lowering blood pressure are exhausted. Instead he recommends vitamin D pills and hypertension drugs as safer approaches.

Weller’s largest study yet is due to be published later in 2019. For three years, his team tracked the blood pressure of 340,000 people in 2,000 spots around the U.S., adjusting for variables such as age and skin type. The results clearly showed that the reason people in sunnier climes have lower blood pressure is as simple as light hitting skin.

When I spoke with Weller, I made the mistake of characterizing this notion as counterintuitive. “It’s entirely intuitive,” he responded. “Homo sapiens have been around for 200,000 years. Until the industrial revolution, we lived outside. How did we get through the Neolithic Era without sunscreen? Actually, perfectly well. What’s counterintuitive is that dermatologists run around saying, ‘Don’t go outside, you might die.’”

When you spend much of your day treating patients with terrible melanomas, it’s natural to focus on preventing them, but you need to keep the big picture in mind. Orthopedic surgeons, after all, don’t advise their patients to avoid exercise in order to reduce the risk of knee injuries.

Meanwhile, that big picture just keeps getting more interesting. Vitamin D now looks like the tip of the solar iceberg. Sunlight triggers the release of a number of other important compounds in the body, not only nitric oxide but also serotonin and endorphins. It reduces the risk of prostate, breast, colorectal, and pancreatic cancers. It improves circadian rhythms. It reduces inflammation and dampens autoimmune responses. It improves virtually every mental condition you can think of. And it’s free.

These seem like benefits everyone should be able to take advantage of. But not all people process sunlight the same way. And the current U.S. sun-exposure guidelines were written for the whitest people on earth.

Every year, Richard Weller spends time working in a skin hospital in Addis Ababa, Ethiopia. Not only is Addis Ababa near the equator, it also sits above 7,500 feet, so it receives massive UV radiation. Despite that, says Weller, “I have not seen a skin cancer. And yet Africans in Britain and America are told to avoid the sun.”

All early humans evolved outdoors beneath a tropical sun. Like air, water, and food, sunlight was one of our key inputs. Humans also evolved a way to protect our skin from receiving too much radiation—melanin, a natural sunscreen. Our dark-skinned African ancestors produced so much melanin that they never had to worry about the sun.

As humans migrated farther from the tropics and faced months of light shortages each winter, they evolved to produce less melanin when the sun was weak, absorbing all the sun they could possibly get. They also began producing much more of a protein that stores vitamin D for later use. In spring, as the sun strengthened, they’d gradually build up a sun-blocking tan. Sunburn was probably a rarity until modern times, when we began spending most of our time indoors. Suddenly, pasty office workers were hitting the beach in summer and getting zapped. That’s a recipe for melanoma.

People of color rarely get melanoma. The rate is 26 per 100,000 in Caucasians, 5 per 100,000 in Hispanics, and 1 per 100,000 in African Americans. On the rare occasion when African Americans do get melanoma, it’s particularly lethal—but it’s mostly a kind that occurs on the palms, soles, or under the nails and is not caused by sun exposure.

At the same time, African Americans suffer high rates of diabetes, heart disease, stroke, internal cancers, and other diseases that seem to improve in the presence of sunlight, of which they may well not be getting enough. Because of their genetically higher levels of melanin, they require more sun exposure to produce compounds like vitamin D, and they are less able to store that vitamin for darker days. They have much to gain from the sun and little to fear.

And yet they are being told a very different story, misled into believing that sunscreen can prevent their melanomas, which Weller finds exasperating. “The cosmetic industry is now trying to push sunscreen at dark-skinned people,” he says. “At dermatology meetings, you get people standing up and saying, ‘We have to adapt products for this market.’ Well, no we don’t. This is a marketing ploy.”

When I asked the American Academy of Dermatology for clarification on its position on dark-skinned people and the sun, it pointed me back to the official line on its website: “The American Academy of Dermatology recommends that all people, regardless of skin color, protect themselves from the sun’s harmful ultraviolet rays by seeking shade, wearing protective clothing, and using a broad-spectrum, water-resistant sunscreen with an SPF of 30 or higher.”

This seemed to me a little boilerplate, and I wondered whether the official guidelines hadn’t yet caught up to current thinking. So I asked David Leffell, at Yale. “I think that sun-protection advice,” he told me, “has always been directed at those most at risk”—people with fair skin or a family history of skin cancer. “While it is true that people with olive skin are at less risk, we do see an increasing number of people with that type of skin getting skin cancer. But skin cancer… is very rare in African Americans… and although they represent a spectrum of pigmentation, [they] are not at as much risk.”

Still, David Fisher at Mass General didn’t think that changed the equation. “There’s a pharmacopoeia of drugs that are extremely effective at lowering blood pressure,” he said. “So to draw the conclusion that people should expose themselves to an elevated skin-cancer risk, including potentially fatal cancer, when there are so many alternative treatments for hypertension, is problematic.”

Am I willing to entertain the notion that current guidelines are inadvertently advocating a lifestyle that is killing us?

I am, because it’s happened before.

In the 1970s, as nutritionists began to see signs that people whose diets were high in saturated fat and cholesterol also had high rates of cardiovascular disease, they told us to avoid butter and choose margarine, which is made by bubbling hydrogen gas through vegetable oils to turn them into solid trans fats.

From its inception in the mid-1800s, margarine had always been considered creepy, a freakish substitute for people who couldn’t afford real butter. By the late 1800s, several midwestern dairy states had banned it outright, while others, including Vermont and New Hampshire, passed laws requiring that it be dyed pink so it could never pass itself off as butter. Yet somehow margarine became the thing we spread on toast for decades, a reminder that even the weirdest product can become mainstream with enough industry muscle.

Eventually, better science revealed that the trans fats created by the hydrogenation process were far worse for our arteries than the natural fats in butter. In 1994, Harvard researchers estimated that 30,000 people per year were dying unnecessarily thanks to trans fats. Yet they weren’t banned in the U.S. until 2015.

Might the same dynamic be playing out with sunscreen, which was also remarkably sketchy in its early days? One of the first sunscreens, Red Vet Pet (for Red Veterinary Petrolatum) was a thick red petroleum jelly invented in 1944 to protect soldiers in the South Pacific; it must have been eerily reminiscent of pink margarine. Only after Coppertone bought the rights and reformulated Red Vet Pet to suit the needs of the new midcentury tanning culture did sunscreen take off.

However, like margarine, early sunscreen formulations were disastrous, shielding users from the UVB rays that cause sunburn but not the UVA rays that cause skin cancer. Even today, SPF ratings refer only to UVB rays, so many users may be absorbing far more UVA radiation than they realize. Meanwhile, many common sunscreen ingredients have been found to be hormone disruptors that can be detected in users’ blood and breast milk. The worst offender, oxybenzone, also mutates the DNA of corals and is believed to be killing coral reefs. Hawaii and the western Pacific nation of Palau have already banned it, to take effect in 2021 and 2020 respectively, and other governments are expected to follow.

The industry is now scrambling to move away from oxybenzone, embracing opaque, even neon, mineral-based formulations, a fashion statement reminiscent of the old Red Vet Pet. But with its long track record of pushing products that later turn out to be unhealthy, I remain skeptical of industry assurances that it finally has everything figured out. We are always being told to replace something natural with some artificial pill or product that is going to improve our health, and it almost always turns out to be a mistake because we didn’t know enough. Multivitamins can’t replace fruits and vegetables, and vitamin D supplements are clearly no substitute for natural sunlight.

Old beliefs don’t die easily, and I can understand if you remain skeptical of old Sol. Why trust one journalist and a handful of rogue researchers against the august opinions of so many professionals?

Here’s why: many experts in the rest of the world have already come around to the benefits of sunlight. Sunny Australia changed its tune back in 2005. Cancer Council Australia’s official-position paper (endorsed by the Australasian College of Dermatologists) states, “Ultraviolet radiation from the sun has both beneficial and harmful effects on human health…. A balance is required between excessive sun exposure which increases the risk of skin cancer and enough sun exposure to maintain adequate vitamin D levels…. It should be noted that the benefits of sun exposure may extend beyond the production of vitamin D. Other possible beneficial effects of sun exposure… include reduction in blood pressure, suppression of autoimmune disease, and improvements in mood.”

Australia’s official advice? When the UV index is below 3 (which is true for most of the continental U.S. in the winter), “Sun protection is not recommended unless near snow or other reflective surfaces. To support vitamin D production, spend some time outdoors in the middle of the day with some skin uncovered.” Even in high summer, Australia recommends a few minutes of sun a day.

New Zealand signed on to similar recommendations, and the British Association of Dermatologists went even further in a statement, directly contradicting the position of its American counterpart: “Enjoying the sun safely, while taking care not to burn, can help to provide the benefits of vitamin D without unduly raising the risk of skin cancer.”

Leffell, the Yale dermatologist, recommends what he calls a “sensible” approach. “I have always advised my patients that they don’t need to crawl under a rock but should use common sense and be conscious of cumulative sun exposure and sunburns in particular,” he told me.

This does not mean breaking out the baby oil or cultivating a burnished tan. All the experts agree that sunburns—especially those suffered during childhood and adolescence—are particularly bad.

Ultimately, it’s your call. Each person’s needs vary so much with season, latitude, skin color, personal history, philosophy, and so much else that it’s impossible to provide a one-size-fits-all recommendation. The Dminder app, which uses factors such as age, weight, and amount of exposed skin to track the amount of sunlight you need for vitamin D production, might be one place to start. Trading your sunscreen for a shirt and a broad-brimmed hat is another. Both have superior safety records.

As for me, I’ve made my choice. A world of healthy outdoor adventure beckons—if not half naked, then reasonably close. Starting today, I’m stepping into the light.

Source: Outside


Read also at The Truth About Cancer:

Sunscreen and the Lies We’ve Been Told About the Real Causes of Skin Cancer . . . . .