Vitamin A Linked to Lower Odds of Common Skin Cancer

By Serena Gordon

Wondering if you can do more than slap on some sunscreen to prevent skin cancer? A new study suggests that getting more vitamin A may help.

The study of around 125,000 Americans found that people with the highest vitamin A intake had a roughly 15% lower risk of squamous cell skin cancer. Most of the vitamin A they consumed came from foods.

“These findings just add another reason to have a healthy diet with fruits and vegetables. Vitamin A from plant sources is safe,” said Eunyoung Cho, the study’s senior author. She is an associate professor of dermatology and epidemiology at Brown University.

Healthy food sources of vitamin A include sweet potato, cantaloupe, carrots, black-eyed peas, sweet red peppers, broccoli, spinach, dairy foods, fish and meat, especially liver, according to the U.S. National Institutes of Health (NIH).

Vitamin A is a fat-soluble vitamin, meaning it can accumulate in the body's fat cells. Taken in large amounts, such as the doses found in some supplements, it can build up to unsafe levels, according to the NIH. Adults shouldn't consume more than 10,000 international units (IU) of preformed vitamin A daily, the NIH said.

Cho said too much preformed vitamin A (typically from supplements and some animal foods) increases the risk of osteoporosis and hip fractures.

Squamous cell carcinoma is a common type of skin cancer. Over a lifetime, as many as 11% of Americans will have squamous cell skin cancer, the researchers said. It tends to occur in areas exposed to a lot of sunlight, such as the face and head.

The study included data from more than 75,000 women who took part in the Nurses’ Health Study and almost 50,000 men in the Health Professionals Follow-up Study. Participants’ average age was in the early 50s.

Study volunteers provided information on their average diet and supplement use.

People with higher levels of vitamin A tended to be older. They also exercised more and were less likely to consume alcohol or caffeine. Women with higher levels of vitamin A were more likely to use postmenopausal hormones.

Nearly 4,000 people ended up with squamous cell skin cancer during more than 25 years of follow-up, the findings showed.

Average vitamin A intake was around 7,000 IU a day for the lowest-intake group in both studies; the highest group consumed more than 21,000 IU a day. Most of this came from dietary sources, the study authors said.
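As a back-of-the-envelope illustration (not a calculation from the study), note that the NIH ceiling applies only to preformed vitamin A, so a high total intake that comes mostly from plant carotenoids can still sit under the limit. The split between sources in the sketch below is invented for the example.

```python
# Illustrative only: the splits between plant and preformed vitamin A below
# are invented for the example, not taken from the study.
NIH_PREFORMED_LIMIT_IU = 10_000   # NIH upper limit for preformed vitamin A per day

def preformed_within_limit(total_iu: float, preformed_fraction: float) -> bool:
    """Check whether the preformed share of total vitamin A intake stays under the NIH limit."""
    return total_iu * preformed_fraction <= NIH_PREFORMED_LIMIT_IU

# Highest-intake group in the study: more than 21,000 IU/day in total.
# With a hypothetical 30% coming from preformed (animal/supplement) sources:
print(preformed_within_limit(21_000, 0.30))   # True  -> 6,300 IU preformed, under the limit
print(preformed_within_limit(21_000, 0.60))   # False -> 12,600 IU preformed, over the limit
```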

The researchers noted that increasing use of vitamin A supplements didn’t appear to lower the risk of squamous cell skin cancer.

Vitamin A seemed to be even more protective for people with numerous moles and those who had a blistering sunburn in childhood or adolescence.

The study wasn’t designed to prove a cause-and-effect link, but Cho said that vitamin A works to keep skin cells healthy, and that may be why it’s linked to a lower risk of squamous cell cancers.

But, she added, even if you have a healthy diet full of vitamin A, you still need sunscreen when you’re outside.

Dr. Desiree Ratner, a spokesperson for the Skin Cancer Foundation, agreed that sun protection measures are still crucial for keeping your skin in good shape.

“While this study seems promising, it should not change any current sun protection behaviors or recommendations. The group involved in this study did show a slight decrease in squamous cell carcinoma incidence and it was only after each study participant ingested vitamin A in excess,” said Ratner, a dermatologist in New York City.

In addition, vitamin A didn’t prevent squamous cell skin cancers entirely, she noted. The study also didn’t look at vitamin A’s effect on other forms of skin cancer, such as basal cell carcinoma and melanoma.

“The best way to protect yourself against skin cancer is to develop a complete sun protection strategy that includes seeking shade, wearing protective clothing and daily sunscreen application,” Ratner advised.

The study was published online in JAMA Dermatology.

Source: HealthDay


Opinion: Is Sunscreen the New Margarine?

By Rowan Jacobsen

These are dark days for supplements. Although they are a $30-plus billion market in the United States alone, vitamin A, vitamin C, vitamin E, selenium, beta-carotene, glucosamine, chondroitin, and fish oil have now flopped in study after study.

If there was one supplement that seemed sure to survive the rigorous tests, it was vitamin D. People with low levels of vitamin D in their blood have significantly higher rates of virtually every disease and disorder you can think of: cancer, diabetes, obesity, osteoporosis, heart attack, stroke, depression, cognitive impairment, autoimmune conditions, and more. The vitamin is required for calcium absorption and is thus essential for bone health, but as evidence mounted that lower levels of vitamin D were associated with so many diseases, health experts began suspecting that it was involved in many other biological processes as well.

And they believed that most of us weren’t getting enough of it. This made sense. Vitamin D is a hormone manufactured by the skin with the help of sunlight. It’s difficult to obtain in sufficient quantities through diet. When our ancestors lived outdoors in tropical regions and ran around half naked, this wasn’t a problem. We produced all the vitamin D we needed from the sun.

But today most of us have indoor jobs, and when we do go outside, we’ve been taught to protect ourselves from dangerous UV rays, which can cause skin cancer. Sunscreen also blocks our skin from making vitamin D, but that’s OK, says the American Academy of Dermatology, which takes a zero-tolerance stance on sun exposure: “You need to protect your skin from the sun every day, even when it’s cloudy,” it advises on its website. Better to slather on sunblock, we’ve all been told, and compensate with vitamin D pills.

Yet vitamin D supplementation has failed spectacularly in clinical trials. Five years ago, researchers were already warning that it showed zero benefit, and the evidence has only grown stronger. In November, one of the largest and most rigorous trials of the vitamin ever conducted—in which 25,871 participants received high doses for five years—found no impact on cancer, heart disease, or stroke.

How did we get it so wrong? How could people with low vitamin D levels clearly suffer higher rates of so many diseases and yet not be helped by supplementation?

As it turns out, a rogue band of researchers has had an explanation all along. And if they’re right, it means that once again we have been epically misled.

These rebels argue that what made the people with high vitamin D levels so healthy was not the vitamin itself. That was just a marker. Their vitamin D levels were high because they were getting plenty of exposure to the thing that was really responsible for their good health—that big orange ball shining down from above.

One of the leaders of this rebellion is a mild-mannered dermatologist at the University of Edinburgh named Richard Weller. For years, Weller swallowed the party line about the destructive nature of the sun’s rays. “I’m not by nature a rebel,” he insisted when I called him up this fall. “I was always the good boy that toed the line at school. This pathway is one which came from following the data rather than a desire to overturn apple carts.”

Weller’s doubts began around 2010, when he was researching nitric oxide, a molecule produced in the body that dilates blood vessels and lowers blood pressure. He discovered a previously unknown biological pathway by which the skin uses sunlight to make nitric oxide.

It was already well established that rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months. Weller put two and two together and had what he calls his “eureka moment”: Could exposing skin to sunlight lower blood pressure?

Sure enough, when he exposed volunteers to the equivalent of 30 minutes of summer sunlight without sunscreen, their nitric oxide levels went up and their blood pressure went down. Because of its connection to heart disease and strokes, high blood pressure is the leading cause of premature death and disease in the world, and the reduction was of a magnitude large enough to prevent millions of deaths on a global level.

Wouldn’t all those rays also raise rates of skin cancer? Yes, but skin cancer kills surprisingly few people: less than 3 per 100,000 in the U.S. each year. For every person who dies of skin cancer, more than 100 die from cardiovascular diseases.

People don’t realize this because several different diseases are lumped together under the term “skin cancer.” The most common by far are basal-cell carcinomas and squamous-cell carcinomas, which are almost never fatal. In fact, says Weller, “When I diagnose a basal-cell skin cancer in a patient, the first thing I say is congratulations, because you’re walking out of my office with a longer life expectancy than when you walked in.” That’s probably because people who get carcinomas, which are strongly linked to sun exposure, tend to be healthy types who are outside getting plenty of exercise and sunlight.

Melanoma, the deadly type of skin cancer, is much rarer, accounting for only 1 to 3 percent of new skin cancers. And perplexingly, outdoor workers have half the melanoma rate of indoor workers. Tanned people have lower rates in general. “The risk factor for melanoma appears to be intermittent sunshine and sunburn, especially when you’re young,” says Weller. “But there’s evidence that long-term sun exposure associates with less melanoma.”

These are pretty radical words in the established dermatological community. “We do know that melanoma is deadly,” says Yale’s David Leffell, one of the leading dermatologists in the country, “and we know that the vast majority of cases are due to sun exposure. So certainly people need to be cautious.”

Still, Weller kept finding evidence that didn’t fit the official story. Some of the best came from Pelle Lindqvist, a senior research fellow in obstetrics and gynecology at Sweden’s Karolinska Institute, home of the Nobel Prize in Physiology or Medicine. Lindqvist tracked the sunbathing habits of nearly 30,000 women in Sweden over 20 years. Originally, he was studying blood clots, which he found occurred less frequently in women who spent more time in the sun—and less frequently during the summer. Lindqvist looked at diabetes next. Sure enough, the sun worshippers had much lower rates. Melanoma? True, the sun worshippers had a higher incidence of it—but they were eight times less likely to die from it.

So Lindqvist decided to look at overall mortality rates, and the results were shocking. Over the 20 years of the study, sun avoiders were twice as likely to die as sun worshippers.

There are not many daily lifestyle choices that double your risk of dying. In a 2016 study published in the Journal of Internal Medicine, Lindqvist’s team put it in perspective: “Avoidance of sun exposure is a risk factor of a similar magnitude as smoking, in terms of life expectancy.”

The idea that slavish application of SPF 50 might be as bad for you as Marlboro 100s generated a flurry of short news items, but the idea was so weird that it didn’t break through the deadly-sun paradigm. Some doctors, in fact, found it quite dangerous.

“I don’t argue with their data,” says David Fisher, chair of the dermatology department at Massachusetts General Hospital. “But I do disagree with the implications.” The risks of skin cancer, he believes, far outweigh the benefits of sun exposure. “Somebody might take these conclusions to mean that the skin-cancer risk is worth it to lower all-cause mortality or to get a benefit in blood pressure,” he says. “I strongly disagree with that.” It is not worth it, he says, unless all other options for lowering blood pressure are exhausted. Instead he recommends vitamin D pills and hypertension drugs as safer approaches.

Weller’s largest study yet is due to be published later in 2019. For three years, his team tracked the blood pressure of 340,000 people in 2,000 spots around the U.S., adjusting for variables such as age and skin type. The results clearly showed that the reason people in sunnier climes have lower blood pressure is as simple as light hitting skin.

When I spoke with Weller, I made the mistake of characterizing this notion as counterintuitive. “It’s entirely intuitive,” he responded. “Homo sapiens have been around for 200,000 years. Until the industrial revolution, we lived outside. How did we get through the Neolithic Era without sunscreen? Actually, perfectly well. What’s counterintuitive is that dermatologists run around saying, ‘Don’t go outside, you might die.’”

When you spend much of your day treating patients with terrible melanomas, it’s natural to focus on preventing them, but you need to keep the big picture in mind. Orthopedic surgeons, after all, don’t advise their patients to avoid exercise in order to reduce the risk of knee injuries.

Meanwhile, that big picture just keeps getting more interesting. Vitamin D now looks like the tip of the solar iceberg. Sunlight triggers the release of a number of other important compounds in the body, not only nitric oxide but also serotonin and endorphins. It reduces the risk of prostate, breast, colorectal, and pancreatic cancers. It improves circadian rhythms. It reduces inflammation and dampens autoimmune responses. It improves virtually every mental condition you can think of. And it’s free.

These seem like benefits everyone should be able to take advantage of. But not all people process sunlight the same way. And the current U.S. sun-exposure guidelines were written for the whitest people on earth.

Every year, Richard Weller spends time working in a skin hospital in Addis Ababa, Ethiopia. Not only is Addis Ababa near the equator, it also sits above 7,500 feet, so it receives massive UV radiation. Despite that, says Weller, “I have not seen a skin cancer. And yet Africans in Britain and America are told to avoid the sun.”

All early humans evolved outdoors beneath a tropical sun. Like air, water, and food, sunlight was one of our key inputs. Humans also evolved a way to protect our skin from receiving too much radiation—melanin, a natural sunscreen. Our dark-skinned African ancestors produced so much melanin that they never had to worry about the sun.

As humans migrated farther from the tropics and faced months of light shortages each winter, they evolved to produce less melanin when the sun was weak, absorbing all the sun they could possibly get. They also began producing much more of a protein that stores vitamin D for later use. In spring, as the sun strengthened, they’d gradually build up a sun-blocking tan. Sunburn was probably a rarity until modern times, when we began spending most of our time indoors. Suddenly, pasty office workers were hitting the beach in summer and getting zapped. That’s a recipe for melanoma.

People of color rarely get melanoma. The rate is 26 per 100,000 in Caucasians, 5 per 100,000 in Hispanics, and 1 per 100,000 in African Americans. On the rare occasion when African Americans do get melanoma, it’s particularly lethal—but it’s mostly a kind that occurs on the palms, soles, or under the nails and is not caused by sun exposure.

At the same time, African Americans suffer high rates of diabetes, heart disease, stroke, internal cancers, and other diseases that seem to improve in the presence of sunlight, of which they may well not be getting enough. Because of their genetically higher levels of melanin, they require more sun exposure to produce compounds like vitamin D, and they are less able to store that vitamin for darker days. They have much to gain from the sun and little to fear.

And yet they are being told a very different story, misled into believing that sunscreen can prevent their melanomas, which Weller finds exasperating. “The cosmetic industry is now trying to push sunscreen at dark-skinned people,” he says. “At dermatology meetings, you get people standing up and saying, ‘We have to adapt products for this market.’ Well, no we don’t. This is a marketing ploy.”

When I asked the American Academy of Dermatology for clarification on its position on dark-skinned people and the sun, it pointed me back to the official line on its website: “The American Academy of Dermatology recommends that all people, regardless of skin color, protect themselves from the sun’s harmful ultraviolet rays by seeking shade, wearing protective clothing, and using a broad-spectrum, water-resistant sunscreen with an SPF of 30 or higher.”

This seemed to me a little boilerplate, and I wondered whether the official guidelines hadn’t yet caught up to current thinking. So I asked David Leffell, at Yale. “I think that sun-protection advice,” he told me, “has always been directed at those most at risk”—people with fair skin or a family history of skin cancer. “While it is true that people with olive skin are at less risk, we do see an increasing number of people with that type of skin getting skin cancer. But skin cancer… is very rare in African Americans… and although they represent a spectrum of pigmentation, [they] are not at as much risk.”

Still, David Fisher at Mass General didn’t think that changed the equation. “There’s a pharmacopoeia of drugs that are extremely effective at lowering blood pressure,” he said. “So to draw the conclusion that people should expose themselves to an elevated skin-cancer risk, including potentially fatal cancer, when there are so many alternative treatments for hypertension, is problematic.”

Am I willing to entertain the notion that current guidelines are inadvertently advocating a lifestyle that is killing us?

I am, because it’s happened before.

In the 1970s, as nutritionists began to see signs that people whose diets were high in saturated fat and cholesterol also had high rates of cardiovascular disease, they told us to avoid butter and choose margarine, which is made by bubbling hydrogen gas through vegetable oils to turn them into solid trans fats.

From its inception in the mid-1800s, margarine had always been viewed with suspicion, a freakish substitute for people who couldn’t afford real butter. By the late 1800s, several midwestern dairy states had banned it outright, while others, including Vermont and New Hampshire, passed laws requiring that it be dyed pink so it could never pass itself off as butter. Yet somehow margarine became the thing we spread on toast for decades, a reminder that even the weirdest product can become mainstream with enough industry muscle.

Eventually, better science revealed that the trans fats created by the hydrogenation process were far worse for our arteries than the natural fats in butter. In 1994, Harvard researchers estimated that 30,000 people per year were dying unnecessarily thanks to trans fats. Yet they weren’t banned in the U.S. until 2015.

Might the same dynamic be playing out with sunscreen, which was also remarkably sketchy in its early days? One of the first sunscreens, Red Vet Pet (for Red Veterinary Petrolatum) was a thick red petroleum jelly invented in 1944 to protect soldiers in the South Pacific; it must have been eerily reminiscent of pink margarine. Only after Coppertone bought the rights and reformulated Red Vet Pet to suit the needs of the new midcentury tanning culture did sunscreen take off.

However, like margarine, early sunscreen formulations were disastrous, shielding users from the UVB rays that cause sunburn but not the UVA rays that cause skin cancer. Even today, SPF ratings refer only to UVB rays, so many users may be absorbing far more UVA radiation than they realize. Meanwhile, many common sunscreen ingredients have been found to be hormone disruptors that can be detected in users’ blood and breast milk. The worst offender, oxybenzone, also mutates the DNA of corals and is believed to be killing coral reefs. Hawaii and the western Pacific nation of Palau have already banned it, to take effect in 2021 and 2020 respectively, and other governments are expected to follow.

The industry is now scrambling to move away from oxybenzone, embracing opaque, even neon, mineral-based formulations, a fashion statement reminiscent of the old Red Vet Pet. But with its long track record of pushing products that later turn out to be unhealthy, I remain skeptical of industry assurances that it finally has everything figured out. We are always being told to replace something natural with some artificial pill or product that is going to improve our health, and it almost always turns out to be a mistake because we didn’t know enough. Multivitamins can’t replace fruits and vegetables, and vitamin D supplements are clearly no substitute for natural sunlight.

Old beliefs don’t die easily, and I can understand if you remain skeptical of old Sol. Why trust one journalist and a handful of rogue researchers against the august opinions of so many professionals?

Here’s why: many experts in the rest of the world have already come around to the benefits of sunlight. Sunny Australia changed its tune back in 2005. Cancer Council Australia’s official-position paper (endorsed by the Australasian College of Dermatologists) states, “Ultraviolet radiation from the sun has both beneficial and harmful effects on human health…. A balance is required between excessive sun exposure which increases the risk of skin cancer and enough sun exposure to maintain adequate vitamin D levels…. It should be noted that the benefits of sun exposure may extend beyond the production of vitamin D. Other possible beneficial effects of sun exposure… include reduction in blood pressure, suppression of autoimmune disease, and improvements in mood.”

Australia’s official advice? When the UV index is below 3 (which is true for most of the continental U.S. in the winter), “Sun protection is not recommended unless near snow or other reflective surfaces. To support vitamin D production, spend some time outdoors in the middle of the day with some skin uncovered.” Even in high summer, Australia recommends a few minutes of sun a day.

New Zealand signed on to similar recommendations, and the British Association of Dermatologists went even further in a statement, directly contradicting the position of its American counterpart: “Enjoying the sun safely, while taking care not to burn, can help to provide the benefits of vitamin D without unduly raising the risk of skin cancer.”

Leffell, the Yale dermatologist, recommends what he calls a “sensible” approach. “I have always advised my patients that they don’t need to crawl under a rock but should use common sense and be conscious of cumulative sun exposure and sunburns in particular,” he told me.

This does not mean breaking out the baby oil or cultivating a burnished tan. All the experts agree that sunburns—especially those suffered during childhood and adolescence—are particularly bad.

Ultimately, it’s your call. Each person’s needs vary so much with season, latitude, skin color, personal history, philosophy, and so much else that it’s impossible to provide a one-size-fits-all recommendation. The Dminder app, which uses factors such as age, weight, and amount of exposed skin to track the amount of sunlight you need for vitamin D production, might be one place to start. Trading your sunscreen for a shirt and a broad-brimmed hat is another. Both have superior safety records.

As for me, I’ve made my choice. A world of healthy outdoor adventure beckons—if not half naked, then reasonably close. Starting today, I’m stepping into the light.

Source: Outside



Study Links Widely-used Drug Azathioprine to Skin Cancers

By Roddy Isles

A drug used to treat inflammatory bowel disease, arthritis and vasculitis, as well as to prevent organ rejection in transplant patients, has been identified as an important contributor to skin cancer development in a research study carried out at the University of Dundee, Queen Mary University of London and the Wellcome Sanger Institute.

The research, published in Nature Communications, identified a “strong case for an association” between the drug azathioprine and the mutational signature found in cases of cutaneous squamous cell carcinoma (cSCC), a common form of skin cancer.

It was already known that use of azathioprine leads to increased photosensitivity to UVA light, probably contributing to development of skin cancers. This new study finds that use of azathioprine leaves a molecular fingerprint in skin cancers, further implicating it in cSCC development.

Charlotte Proby, Professor of Dermatology in the School of Medicine at Dundee, said, “We recommend all physicians give appropriate advice on UVA avoidance including year-round sun protection for their patients on azathioprine.”

Professor Proby and colleagues said they were not necessarily advocating withdrawal of azathioprine.

“As with all medications the risks must be balanced against the benefits, particularly with the need to treat potentially life-threatening diseases with an effective drug,” she said.

“It is important that sun protection, skin surveillance and early diagnosis/lesion removal are part of the routine management of patients on azathioprine.”

cSCC is a common skin cancer, with more than 40,000 new cases diagnosed annually in the UK and significant health economic implications.

Sophia Lowes, from Cancer Research UK, said, “It’s important to protect your skin from the sun when it’s strong, especially if you burn easily or are taking medications which make you more sun-sensitive. The most effective protection is to spend time in the shade and cover up with a hat, long-sleeved top and sunglasses. For the bits you can’t cover, use sunscreen with at least 4 stars and SPF 15 or higher for protection against both UVA and UVB rays.”

Importantly, this new study also reveals the molecular landscape of cSCC and highlights potential targets that may be developed for future therapeutic approaches to manage cSCC.

Different carcinogens leave a different “mutational signature” in a cancer. By studying these signatures, researchers can start to determine what the causes of a cancer are.

The researchers in the School of Medicine at Dundee, in collaboration with the Wellcome Sanger Institute and Queen Mary University of London, were able to carry out mutational signature analysis of cSCC tumours from 37 patients, many of whom had been on azathioprine. They found a new mutational signature, Signature 32, which correlated with time on azathioprine therapy.
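As a rough illustration of what mutational signature analysis involves (not the pipeline used in this study), a tumour's mutation spectrum is tallied across substitution contexts and compared against reference signatures. The spectra and signature names below are toy values invented for the sketch.

```python
# A minimal, illustrative sketch of mutational-signature comparison.
# The tumour spectrum and reference signatures are made-up toy numbers,
# not data from the Dundee/Sanger/QMUL study.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two mutation spectra."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Real analyses use 96 trinucleotide contexts (e.g. A[C>T]G). For readability
# this toy example uses 6 bins, one per base-substitution class
# (C>A, C>G, C>T, T>A, T>C, T>G).
tumour_spectrum = [12, 8, 150, 5, 30, 9]            # hypothetical cSCC tumour
reference_signatures = {
    "UV-like (C>T heavy)": [2, 1, 90, 1, 4, 2],     # toy stand-in
    "Azathioprine-like":   [5, 3, 70, 2, 15, 5],    # toy stand-in
    "Flat background":     [1, 1, 1, 1, 1, 1],
}

for name, sig in reference_signatures.items():
    print(f"{name:22s} similarity = {cosine_similarity(tumour_spectrum, sig):.3f}")
```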

Source: University of Dundee



Artificial Intelligence Better Than Dermatologists at Catching Skin Cancers

By Amy Norton

A computer can beat even highly experienced dermatologists in spotting deadly melanomas, researchers report.

The study is the latest to test the idea that “artificial intelligence” can improve medical diagnoses.

Typically, it works like this: Researchers develop an algorithm using “deep learning” — where the computer system essentially mimics the brain’s neural networks. It’s exposed to a large number of images — of breast tumors, for example — and it teaches itself to recognize key features.
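For readers curious what such an algorithm looks like in code, here is a minimal sketch of a convolutional image classifier in PyTorch. It is not the network used in the study, which was far larger and trained on real dermoscopic images; it only illustrates the idea of a model learning to map pixels to a benign-versus-melanoma prediction.

```python
# A minimal sketch of the kind of image classifier the article calls
# "deep learning". NOT the study's algorithm; purely illustrative.
import torch
import torch.nn as nn

class TinyLesionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 112 -> 56
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 2),           # two classes: benign mole vs. melanoma
        )

    def forward(self, x):
        return self.head(self.features(x))

model = TinyLesionClassifier()
dummy_batch = torch.randn(4, 3, 224, 224)         # 4 fake RGB dermoscopy images
logits = model(dummy_batch)
print(logits.shape)                               # torch.Size([4, 2])
```

In practice the training step, not shown here, is where the network "teaches itself" the key features: it sees many labeled images and adjusts its weights to reduce its classification errors.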

The new study pitted a well-honed computer algorithm against 58 dermatologists, to see whether machine or humans were better at differentiating melanomas from moles.

It turned out the algorithm was usually more accurate. It missed fewer melanomas, and was less likely to misdiagnose a benign mole as cancer.

That does not mean computers will someday be diagnosing skin cancer, said lead researcher Dr. Holger Haenssle, of the University of Heidelberg in Germany.

“I don’t think physicians will be replaced,” Haenssle said.

Instead, he explained, doctors could use artificial intelligence (AI) as a tool.

“In the future, AI may help physicians focus on the most suspicious skin lesions,” Haenssle said.

A patient might, for instance, undergo whole-body photography (a technology that’s already available), then have those images “interpreted” by a computer algorithm.

“In the next step,” Haenssle explained, “the physician may examine only those lesions labeled as ‘suspicious’ by the computer.”

Doctors already do skin exams with the help of a technology called dermoscopy — where a hand-held device is used to light and magnify the skin. Haenssle said AI could again be used to help analyze those images.

Dr. Mary Stevenson is an assistant professor of dermatology at NYU Langone Medical Center in New York City.

She agreed that the technology is not going to replace doctors, but could serve as an “aid.”

There are still questions to be answered, according to Stevenson, who was not involved in the research. For one, she said, this study focused only on differentiating melanoma from benign moles — and there is more to skin cancer diagnosis than that.

For the study, Haenssle’s team recruited 58 dermatologists from 17 countries. Over half had more than five years of experience and were considered “expert” level.

First, the doctors examined 100 dermoscopic images of either melanomas or harmless moles.

Four weeks later, they viewed the same images again, this time with more information about the patients, such as their age and the position of the lesion on the body. That more closely reflected what doctors work with in the "real world."

In the first phase, the doctors accurately caught melanomas nearly 87 percent of the time, on average; they correctly identified moles about 71 percent of the time.

The computer, however, did better: When it was tuned to have the same level of accuracy as doctors in detecting benign moles, the computer caught 95 percent of melanomas.

The doctors boosted their accuracy when they also had information about the patients. They caught 89 percent of melanomas, and accurately identified benign moles about 76 percent of the time.

The computer still outperformed them, though: At that same level of accuracy for catching melanoma, the computer correctly diagnosed about 83 percent of moles.
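The comparison works by fixing one error rate and reading the other off the algorithm's ROC curve: hold specificity (accuracy on benign moles) at the doctors' level and see what sensitivity (melanomas caught) the model achieves there. A minimal sketch with made-up scores, assuming scikit-learn is available, shows the idea; none of the numbers below come from the study.

```python
# Illustrative only: random synthetic scores, not the study's data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# 1 = melanoma, 0 = benign mole; scores are the model's melanoma probability.
y_true = np.concatenate([np.ones(100), np.zeros(300)])
scores = np.concatenate([rng.normal(0.8, 0.15, 100), rng.normal(0.3, 0.2, 300)])

fpr, tpr, _ = roc_curve(y_true, scores)
target_specificity = 0.71                  # e.g. the dermatologists' phase-one average
idx = np.argmin(np.abs((1 - fpr) - target_specificity))
print(f"At ~{(1 - fpr[idx]):.0%} specificity, sensitivity = {tpr[idx]:.0%}")
```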

Haenssle said that in some parts of Germany, doctors are already using the algorithm tested in this study — in software sold by the company FotoFinder Systems GmbH. He has received fees from the company and others that market devices for skin cancer screening.

For now, traditional skin exams remain the standard of care.

Stevenson said she suggests people get one head-to-toe exam to inspect the skin for suspicious growths — and then talk to their doctor about how to follow up.

“I also recommend getting in front of a mirror once a month to do a self-exam,” Stevenson said.

The point is to spot any changes in the size, shape or color of a mole or other dark spot on the skin. According to Stevenson, some warning signs of melanoma include asymmetry in a growth, as well as irregular borders, uneven coloring and a large diameter (larger than a pencil eraser).

“When melanoma is caught early,” Stevenson said, “it is highly curable.”

Source: HealthDay

Can You Recognize the Signs of Skin Cancer?

With skin cancer the most common type of cancer in the United States, you should learn to spot its early signs, a cancer doctor says.

“Early detection is key. When detected early, most skin cancers may be effectively treated and are often curable,” said Dr. Jeffrey Farma, a surgical oncologist at Fox Chase Cancer Center in Philadelphia.

“Individuals play an important role in early detection,” Farma said in a center news release. “By being familiar with your own skin markings, like moles, freckles and blemishes, you’re likely to notice any changes.”

His recommendation: Have your skin checked yearly by a physician or dermatologist, and check your own skin for signs of skin cancer by using a mirror every month.

Using the ABCDE rule of skin cancer can help identify potential problems, including the most deadly form of skin cancer, melanoma, he said. (A simple checklist sketch of the rule appears after the list below.)

A for Asymmetry. Melanoma lesions are often not symmetrical in shape, while benign moles are usually symmetrical.

B for Border. Benign moles usually have smooth, even borders, while melanoma lesions usually have irregular borders that are difficult to define.

C for Color. A mole with more than one color (blue, black, brown, tan, etc.) or the uneven distribution of color can sometimes be a warning sign of melanoma. Benign moles are usually a single shade of brown or tan.

D for Diameter. Melanoma lesions are often more than 6 millimeters in diameter, about the size of a pencil eraser.

E for Evolution. The evolution of your moles is important. Knowing what is normal for you could save your life.
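Expressed as a simple checklist (purely illustrative, not a diagnostic tool), the rule might look like the sketch below; the field names and thresholds are just a plain reading of the five points above.

```python
# A purely illustrative checklist version of the ABCDE rule described above.
# This is not a diagnostic tool; any concerning change should go to a doctor.
from dataclasses import dataclass

@dataclass
class LesionObservation:
    asymmetric: bool          # A: halves don't match
    irregular_border: bool    # B: ragged or poorly defined edge
    multiple_colors: bool     # C: more than one shade or uneven color
    diameter_mm: float        # D: size in millimetres
    evolving: bool            # E: recent change in size, shape, or color

def abcde_flags(obs: LesionObservation) -> list[str]:
    """Return which ABCDE warning signs are present for a lesion."""
    flags = []
    if obs.asymmetric:
        flags.append("A: asymmetry")
    if obs.irregular_border:
        flags.append("B: irregular border")
    if obs.multiple_colors:
        flags.append("C: uneven color")
    if obs.diameter_mm > 6:
        flags.append("D: diameter > 6 mm")
    if obs.evolving:
        flags.append("E: evolving")
    return flags

example = LesionObservation(True, False, True, 7.5, True)
print(abcde_flags(example))   # ['A: asymmetry', 'C: uneven color', 'D: diameter > 6 mm', 'E: evolving']
```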

“If a mole or marking has gone through recent changes in color and/or size, bring it to the attention of your doctor immediately so he or she can determine the cause,” Farma said. “Remember that skin cancer affects people of all skin tones, no matter what their complexion.”

Source: HealthDay

