Weight Gain and Loss May Worsen Dementia Risk in Older People

Older people who experience significant weight gain or weight loss could be raising their risk of developing dementia, suggests a study from Korea published today in the online journal BMJ Open.

Dementia is an important health problem, especially with increasing life expectancy and an ageing population. In 2015, an estimated 46.8 million people worldwide were living with dementia.

Meanwhile, the global prevalence of obesity, which is closely related to cardiometabolic diseases, has increased by more than 100% over the past four decades.

There is existing evidence of a possible association between cardiometabolic risk factors (such as high blood pressure, cholesterol and blood sugar levels) and dementia. However, the association between body mass index (BMI) in late-life and dementia risk remains unclear.

Therefore, a team of researchers from the Republic of Korea set out to investigate the association between BMI changes over a two-year period and dementia in an elderly Korean population.

They examined 67,219 participants aged 60-79 years who underwent BMI measurement in 2002-2003 and 2004-2005 as part of the National Health Insurance Service-Health Screening Cohort in the country.

At the start of the study period, characteristics were measured including BMI, socioeconomic status and cardiometabolic risk factors.

The difference between BMI at the start of the study period and at the next health screening (2004-2005) was used to calculate the change in BMI.

The incidence of dementia was then monitored for an average of 5.3 years, from 2008 to 2013.

During this follow-up period, 4,887 men and 6,685 women developed dementia.

Results showed that there appeared to be a significant association between late-life BMI changes and dementia in both sexes.

Rapid weight change – a 10% or higher increase or decrease in BMI over a two-year period – was associated with a higher risk of dementia compared with people whose BMI remained stable.
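
To make that threshold concrete, here is a minimal illustrative sketch (the heights, weights and helper functions are assumptions for the example, not data from the study) showing how BMI is calculated and how a 10% change between two screenings would be flagged:

```python
# Illustrative sketch only: the study classed a two-year BMI change of 10%
# or more (in either direction) as rapid weight change. The numbers below
# are made-up examples, not study data.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def is_rapid_change(bmi_start: float, bmi_followup: float, threshold: float = 0.10) -> bool:
    """True if BMI rose or fell by at least `threshold` (10% by default)."""
    return abs(bmi_followup - bmi_start) / bmi_start >= threshold

# Example: a 1.65 m person whose weight falls from 70 kg to 62 kg over two years.
start = bmi(70, 1.65)      # about 25.7
followup = bmi(62, 1.65)   # about 22.8, roughly an 11% drop
print(is_rapid_change(start, followup))  # True -> would count as rapid weight change
```

Because height is essentially fixed in later life, a 10% change in BMI corresponds to roughly a 10% change in body weight.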

However, the BMI at the start of the period was not associated with dementia incidence in either sex, with the exception of low body weight in men.

After breaking down the figures based on BMI at the start of the study period, the researchers found a similar association between BMI change and dementia in the normal weight subgroup, but the pattern of this association varied in other BMI ranges.

Cardiometabolic conditions including pre-existing hypertension, congestive heart failure, diabetes and high fasting blood sugar were significant risk factors for dementia.

In particular, patients with high fasting blood sugar had a 1.6-fold higher risk of developing dementia than individuals whose fasting blood sugar was normal or only borderline high.

In addition, unhealthy lifestyle habits such as smoking, frequent drinking and less physical activity in late life were also associated with dementia.

This is an observational study, so it can’t establish cause, and the researchers point to some limitations, including uncertainty around the accuracy of the definition of dementia and reliance on people’s self-reported lifestyle habits, which may not be accurate.

However, the study drew on a large dataset and identified various modifiable risk factors for dementia in late life.

As such, the researchers conclude: “Both weight gain and weight loss may be significant risk factors associated with dementia. This study revealed that severe weight gain, uncontrolled diabetes, smoking and less physical activity in late-life had a detrimental effect on dementia development.

“Our results suggest that continuous weight control, disease management and the maintenance of a healthy lifestyle are beneficial in the prevention of dementia, even in later life.”

Source: EurekAlert!


Want to Stay Trim? Don’t Eat in the Evening, Study Finds

Alan Mozes wrote . . . . . . . . .

Maybe you rush around with work and activities during the day, then settle in for a large, relaxing meal in the evening. But new research says the later in the day you eat, the more weight you’re likely to pack on.

That’s the takeaway from a week-long study involving 31 overweight and obese patients, mostly women.

“We evaluated meal and sleep timing in patients with overweight/obesity at the beginning of a weight loss trial, before participants started the intervention,” said lead author Dr. Adnin Zaman, an endocrinology fellow at the University of Colorado School of Medicine.

Her team found that “eating later into the day was associated with a higher body mass index (BMI) and greater body fat.” BMI is an index of weight relative to height that is commonly used to estimate body fatness.

For the study, participants were enrolled in a weight-loss trial comparing daily calorie limits with time-restricted feeding – that is, eating only during certain hours of the day.

Ninety percent of the participants were women. Their average age was 36.

A week before the study, they were outfitted with electronic devices to monitor their activity and sleep. They also were asked to snap cellphone photos of everything they ate. The photos were time-stamped using an app called MealLogger.

Zaman and colleagues did not define which hours would amount to “late-day eating” and did not track calories or nutritional values.

The team did note that participants who ate later in the day also went to bed later, though all averaged seven hours of sleep a night.

The participants’ food consumption spanned 11 hours a day, with the last nosh typically clocked around 8 p.m. Those who ate later tended to have a higher BMI and body fat, the study found.

Though most participants were women, Zaman said the findings may “also apply to men.”

But, she added, the study was purely observational and more research is needed to understand why late-day eating might lead to obesity.

Her team is already exploring whether eating earlier in the day, when people tend to be more active, might help prevent obesity.

“Future studies are also needed where these methods are applied to people with normal BMIs, and compared to a population with overweight/obesity,” Zaman said.

Lona Sandon is program director with the Department of Clinical Nutrition at the University of Texas Southwestern Medical Center. She got a sneak peek of the findings and was not surprised.

Sandon has her own theories about why late-day eating might lead to weight gain.

“When you eat more of your food calories earlier in the day, they may be more likely used for energy and less likely stored as fat due to different levels of hormones,” she said. You may also feel more satisfied with fewer calories.

“Eating later in the day, more so at night, seems to be linked to storing more body fat due to hormone differences at this time of day,” Sandon added.

Her advice: Eat breakfast and enjoy a hearty lunch.

“If you are skipping breakfast, having a light lunch and finding yourself eating late into the night because you have barely eaten all day, simply cutting back at night is not going to work,” Sandon said. “Making the lunch meal the largest meal of the day, with at least a little something for breakfast, has worked for some of my clients to be able to cut back at night or be satisfied with a light dinner.”

The findings are slated for presentation on Saturday at a meeting of the Endocrine Society, in New Orleans. Research presented at meetings is considered preliminary until published in a peer-reviewed journal.

Source: HealthDay

Study: Energy Cost of Sitting Versus Standing for Sedentary Workers

Office employees who opt to stand when working are likely to be burning only fractionally more calories than their seated colleagues, according to new research from the University of Bath’s Department for Health.

The study, published in the journal Medicine & Science in Sports & Exercise, reveals that the ‘benefits’ of standing over sitting equate to little more than 9 calories an hour – the equivalent of just one stalk of celery. In fact, purely from a weight-gain perspective, a person who opted to stand would need to do so for nearly the entire day to burn off the calories in just one cup of coffee.

Testing sitting vs standing

Prolonged sitting has become a major health concern, targeted through government policy and the growing use of height-adjustable workstations and wearable technologies that encourage standing. Yet despite these interventions, which have the potential to influence energy balance, remarkably little had been known about the true energy cost of sitting versus standing naturally.

For this study, which involved researchers at Bath and Westmont College (US), the team tested the resting metabolic rates of 46 healthy men and women. Participants were then asked to lie down, sit or stand, and measurements of their expired gases were taken to assess how many calories they burned in each position.

With only marginal gains in calories expended observed, the study questions the usefulness of standing as a strategy for weight loss and for the treatment of obesity.

Professor James Betts of the University of Bath’s Department for Health explains: “The biomechanics of standing means that more muscles are used to support a greater proportion of the body weight in an upright position, so should cost more energy than when sitting.

“Past research has shown this by comparing sitting and standing when completely motionless. Other research has also explored the energy costs of various daily activities that can be completed whether or not seated but also allow people to walk around, so may not tell us about the simple difference between sitting versus standing per se.

“In the real world, people also do not usually have their bodily movements restricted but instead spontaneously fidget to remain comfortable, so we saw an opportunity to understand the fundamental difference between sitting and standing naturally.”

Collaborator Gregg Afman, Professor of Kinesiology at Westmont College (US), said: “We found an energy cost increase of 0.65 kJ per minute from sitting to standing naturally, which equates to a 12% difference. However, current interventions to reduce prolonged sitting, like standing desks or wearable technologies, only increase standing by a maximum of two hours per day. This limited time-frame would cause a person to expend less than 20 kcals more each day.”
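
As a rough check of those figures, here is a back-of-the-envelope sketch (the 0.65 kJ per minute value and the two-hour assumption come from the quote above; the 4.184 kJ per kcal conversion factor is the standard one, and everything else is an illustrative assumption):

```python
# Back-of-the-envelope check of the quoted figures; not code from the study.
KJ_PER_KCAL = 4.184            # standard conversion factor

extra_kj_per_min = 0.65        # quoted extra cost of standing versus sitting
standing_min_per_day = 120     # the "maximum of two hours per day" cited above

extra_kcal_per_day = extra_kj_per_min * standing_min_per_day / KJ_PER_KCAL
print(f"Extra energy from two hours of standing: {extra_kcal_per_day:.1f} kcal")
# -> about 18.6 kcal, i.e. "less than 20 kcals more each day"

extra_kcal_per_hour = extra_kj_per_min * 60 / KJ_PER_KCAL
print(f"Extra energy per hour of standing: {extra_kcal_per_hour:.1f} kcal")
# -> about 9 kcal, matching the "one stalk of celery" figure earlier, and
#    roughly 180 kcal over the 20 hours mentioned in the quote that follows.
```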

Unlikely to influence waist lines

Dr Javier Gonzalez, who was also involved in the study from the University of Bath, added: “The very small increase in energy cost of standing compared to sitting that we observed suggests that replacing time spent sitting with time spent standing is unlikely to influence our waist lines in any meaningful way.

“To put this difference in context, it would require an additional 20 hours of standing time, on average, to burn off a medium latte. Many people are becoming aware of the negative health effects of prolonged sitting, and so may opt for standing desks. These people should be aware that whilst there are still some health benefits to standing more, they should not expect to see drastic changes in their body weight. In order to lose body weight, people should focus on increasing physical activity and focus on their diet too.”

Source: University of Bath

The Case for Bread

Carrie Dennett wrote . . . . . . . . .

Is it the dietary devil it’s made out to be?

If bread is the staff of life, why does it elicit so much fear and loathing? Bread’s reputation has taken a double hit from the low-carbohydrate and gluten-free trends, blamed for everything from weight gain to celiac disease. Despite its long and revered history, today many people feel virtuous when they avoid the bread basket—or guilty when they eat a sandwich. Have we failed bread … or has modern bread failed us?

Bread’s history dates back to the Fertile Crescent, where the world’s first urban cultures developed. Archeologists have found starch from barley and possibly ancient wheat embedded in a grindstone at a Paleolithic site in Israel dating back approximately 23,000 years, along with evidence of a simple ovenlike hearth, which suggests that the flour made from the grain was baked. Both modern and traditional forms of wheat contain essential amino acids, minerals, and vitamins, as well as phytochemicals and fiber. Where modern wheat differs is that it contains fewer mineral micronutrients, largely because it’s been bred for white flour—and white flour only—says Stephen Jones, PhD, a wheat breeder and director of The Bread Lab at Washington State University’s Mount Vernon Northwestern Washington Research & Extension Center.

Jones says wheat is one of the most nutrient-dense foods on the planet, but what we do to wheat turns it into one of the least nutrient-dense foods on the planet. The bran and germ are stripped away, and what’s left behind is primarily starch and gluten. “There has been zero effort to increase easy micronutrients like iron, zinc, and selenium,” Jones says of modern wheat breeding, adding that how wheat is processed creates more problems. “Industrial plastic-wrapped bread can have over 25 ingredients. Bread needs four.”

The Gluten Myth

It’s a myth repeated so often that many people take it as truth: Modern bread wheat contains more gluten and is responsible for the increased prevalence of celiac disease and nonceliac wheat sensitivity (NCWS), and heirloom and ancient wheats are the answer. Those ancient wheats are einkorn (Triticum monococcum), a diploid wheat (two complete sets of chromosomes) with an AA genome, and emmer (Triticum dicoccoides), a tetraploid wheat with an AABB genome. Emmer evolved from the spontaneous hybridization of einkorn and wild grass. Common wheat, which refers to the hexaploid species with AABBDD genomes, is about 9,000 years old, the result of hybridization between emmer and wild “goat grass” (Triticum tauschii). It’s the D genomes that contain most of the components that play a role in celiac disease.

Heirloom wheats, also referred to as heritage wheats or landraces, are generally older, open-pollinated, genetically diverse varieties of common wheat, the results of natural evolution and adaptation that were saved by farmers and passed on. Wheat that has adapted to one part of the country likely won’t perform well in another—say, the South vs the Great Plains or Arizona vs the Pacific Northwest. In addition to differences in their adaptation to different environments, common wheat species also vary in their composition of bioactive components, including gluten.

Modern wheat debuted in the 1950s with a semidwarf wheat that wouldn’t tip over from the weight of the large seed heads fostered by nitrogen-rich synthetic fertilizers. While wheat grows in 42 states, commodity wheat is grown in wheat belts in the western and plains states on 2,000- to 5,000-acre farms. While this wheat scores high marks for uniformity and high yield in ideal environments, it doesn’t prioritize taste or nutrition—nor does it lend itself well to regional farming.

But what about gluten content? Wheat contains many proteins—the main types being gluten, globulin, and albumin—any of which have the potential to cause an immune reaction. Within the gluten group of proteins are glutenins and gliadins. Gliadins are more likely to be a causal factor in celiac disease and some types of wheat allergy, but modern wheat hasn’t been bred for higher gliadin content. It’s been bred to encourage high–molecular weight glutenins, proteins that are essential for bread baking quality but carry low risk of causing celiac disease, wheat allergy, or NCWS, says Lisa Kissing Kucek, PhD, a plant breeder at Cornell University in Ithaca, New York.

“The gliadins are the types of proteins that are more likely to be reactive. The glutenins are more important for baking quality,” Kucek says, adding that pastry baking calls for flour with more gliadins that have extensibility, or stretch, as opposed to glutenins, which offer the elasticity needed for bread baking. “Modern wheat breeders have been very good at increasing the types of glutenins that make good bread.”

As lead author of the 2015 article, “A Grounded Guide to Gluten: How Modern Genotypes and Processing Impact Wheat Sensitivity,” Kucek examined the relative immunoreactivity of ancient, heritage, and modern wheats. “We looked at hundreds of research papers to see what we could find about what the difference truly is, and we found there is a very tiny difference between modern and heritage wheat for most sensitivities, especially celiac and wheat allergies,” Kucek says. “Depending on what type of sensitivity people have, heritage wheats are not going to be the answer most of the time.”

Kucek points out that even if a wheat variety is known to be particularly low in immunoreactive proteins, it would be difficult to find that wheat—or bread baked from it—in the store. “Different varieties are grown in different regions, and many flours are blended in the mill. Getting a variety-specific flour is difficult.” Plus, a variety that’s better for someone with wheat allergy might not be better for someone with celiac disease. “I wish we had more data to say that these are the varieties that are better for this disease or that disease, but it would be a labeling nightmare for the industry.”

Rather than looking back, Kucek says the best hope is looking forward—by identifying wheat genotypes that aren’t immunoreactive and using them to guide future breeding. “There are efforts, at least with the low-hanging fruit, such as anaphylaxis,” Kucek says. “The same thing is being researched for celiac disease, so when breeders develop varieties for different regions, they can screen for these genotypes.”

Beyond Wheat Genetics

What Kucek and her coauthors did find was that a larger contributor to immunoreactive compounds is how the wheat is processed—from farm to mill. The nutrient composition of wheat, as with other crops, depends on the environment it’s planted in and how it’s grown. Higher application of nitrogen fertilizers leads to higher protein content overall, but it also specifically boosts gluten—and gliadins.

Traditional methods such as sprouting and fermenting are largely missing from industrial bread. Sprouting grains, which actually soaks them just short of the sprouting point, activates enzymes in the grain that can break down difficult-to-digest proteins. “So does the sourdough fermentation process, which has been used for thousands of years and really changed in the 1920s and ’30s,” Kucek says. “That said, for people with celiac disease, even fermented and sprouted grains are not going to help them.” However, she says these techniques reduce exposure to immunoreactive compounds, which may reduce new cases of celiac disease in genetically predisposed populations.

Wheats developed before 1870 were grown for stone milling, which crushes the germ (the fat- and vitamin B-rich embryo) into the endosperm, distributing both oils and flavor. Although much of the flour is sifted to create white flour from the starchy, protein-rich endosperm, some bran (the outer, fiber-rich skin) and germ remain. “Much less than 1% is stone milled today,” Jones says. “It is increasing but slowly. A lot of people are stone milling commodity wheat.”

With the Industrial Revolution, stone mills gave way to steel roller mills, which more precisely separate the bran, germ, and endosperm. The removal of the germ, along with the heat generated by the rollers, removes much of wheat’s nutrients, one reason why most white roller-milled flour is enriched with iron, folic acid, thiamin, riboflavin, and niacin. The remaining micronutrients lost during milling—accounting for 60% to 90% of total nutrients—aren’t replaced by fortification. Roller milling meant that the white flour desired by both home and commercial bakers could be produced efficiently. It also paved the way for the industrialization of bread baking. In 1890, 90% of households baked their own bread. By 1930, 90% of households bought industrial white bread.

Modern Baking

How did we get from hand-formed rustic bread made of whole wheat flour, water, salt, and maybe yeast to, as Jones puts it, “Whipped-into-a-frenzy dough that will become a fast-food hamburger bun”? One reason is that industrial baking needs standardized flour that works predictably in large volumes in mechanized assembly lines, which translates to white flour with high protein content and low mineral content. Why the low mineral content? Minerals are found in the bran, and less bran means more endosperm—and more white flour per seed.

Unfortunately, white flour has fewer enzymes available to help break down the gluten, because most of those enzymes are in the bran. But wheat that’s bred for white flour and industrial baking isn’t optimal for whole wheat bread and natural fermentation because whole wheat dough has to be strong enough to carry the bran and the germ. As a result, many artisan bakers still use mostly white flour. Industrial bakers, on the other hand, strengthen whole wheat dough in a way that may have some unintended health effects.

“Whole grain bread started becoming more popular in the 1970s, but people didn’t want dense bread; they wanted their fluffy bread,” Kucek says. “The thing is with whole grain bread, you have the bran, and that bran can act like little razor blades.” This disrupts gluten development and loaf volume. “It’s tough to get the fluffy bread that people are used to for their sandwiches. Instead of being careful as with French processes or long fermentation, we started adding gluten after the fact to get the gluten structure that people have come to expect from good bread.”

Visit the bread aisle in any grocery store and start picking up loaves of whole grain bread. On the ingredient list you’ll see “wheat gluten” or “vital wheat gluten”—gluten separated from wheat flour by washing away the starch granules. This is true of both multigrain breads—which include flours from grains that don’t contain gluten of their own—and whole wheat breads, even brands perceived to be more healthful. Unless a store stocks artisan breads, odds are good that every loaf will have added gluten.

One problem with adding gluten after the fact is that we don’t know the concentration we’re getting, Kucek says. Vital gluten intake may have tripled since the late 1970s, and consuming isolated gluten could create problems for some individuals. “There are enzymes within the wheat kernel that are important for helping us break down a number of compounds in wheat, including fructans and various gluten compounds,” Kucek says.

“When we artificially separate the gluten and add it after the fact, we don’t have these enzymes to help us process that. We also don’t have that long fermentation to break down the glutens and fructans.” Fructans are polysaccharides formed from fructose molecules.

“We can have whole wheat bread without adding wheat gluten if we will accept denser bread,” Kucek says. However, there is also a wave of innovation in “postmodern” wheat breeding, selecting for and improving traits that will make it easier and more affordable for bakers to create slow-fermented whole wheat bread that’s acceptable to consumers—no added gluten needed. Breeders such as Jones aren’t using heirlooms wholesale. Instead, they’re looking at what favorable traits—flavor, nutrition, high yield, and disease and pest resistance—heirlooms carry that may be adapted to a modern context.

Jones has been breeding wheat since he was an agronomy undergraduate in the late 1970s, growing five acres on a student farm. In the 1990s, after earning his PhD in genetics, he became a chief wheat breeder at Washington State University, unhappily improving commodity wheat that was designed for industrial milling. Today at The Bread Lab, he works with his graduate students to breed wheat and other grains to be used regionally on small farms in the coastal West and other areas of the country. What it means to “improve” wheat has shifted significantly over the years. “Then, it was ‘How much white flour can we get per acre?’ Yield of white flour is all that mattered,” he says. “Now it is high-yielding wheats that are full of iron and zinc and taste great and perform well in whole wheat uses.”

Even though Jones often talks about terroir and flavor, two “foodie” words, one of his overarching goals is to help make good bread accessible, affordable, and regional. “Affordable is part of a mature food system,” he says. “Twelve-dollar loaves of bread don’t help anybody out.” That mature system, he says, has no place for ancient and heirloom wheats, because they yield too little. “We have wheat that is nutritious, non-GMO, nonpatented, tastes great, and works well in whole wheat products that will yield 10 times the old wheats. This brings the price point down. We are not interested in boutique wheats,” Jones says.

Kucek agrees. “There can be a lot of nostalgic excitement about those old varieties, and they do serve a huge purpose in terms of biodiversity and flavor, but we’ve moved on from that,” she says. “There’s a reason we’re not growing einkorn as much, and heritage varieties often don’t have as much disease resistance.”

Bread and Body Weight

If modern wheat itself can’t account for the rise in cases of celiac disease and NCWS, what about weight gain? Is bread culpable, despite its role in many traditional diets, including the Mediterranean diet? In the Spanish arm of the PREDIMED study, researchers tracked bread consumption at baseline and each year for four years in 2,213 participants at high risk of CVD. Their findings suggest that bread consumption isn’t associated with clinically relevant increases in weight and abdominal fat, although whole grain bread appeared to have an advantage over white bread for preventing weight gain. Researchers from the Spanish SUN study found similar results.

In Norway, data from the HUNT study suggest that lower intake of bread, especially whole grain bread, was associated with central adiposity. A 2012 review published in Nutrition Reviews looked at evidence from 38 epidemiologic studies and found that dietary patterns that include whole grain bread don’t contribute to weight gain, and that even white bread at worst shows a “possible relationship” with excess abdominal fat.

Bottom Line

When asked why bread isn’t the devil, Jones says simply, “Bread is who we are.” So how can dietitians help patients, clients, and consumers make the best possible choice?

“The best way to go about it if you don’t have celiac disease but have it in the family or are worried about it is to go for long fermentation, avoid vital wheat gluten, and go for sprouted grains as much as possible,” Kucek says. “It is very hard to avoid vital wheat gluten if you are shopping for bread in the grocery store.”

Source: Today’s Dietitian

A 3-Pronged Plan to Cut Type 2 Diabetes Risk

Len Canter wrote . . . . . . . . .

The type 2 diabetes tide remains unchecked in the United States, as does pre-diabetes — having a blood sugar level higher than normal, but not high enough for a diabetes diagnosis.

A U.S. Centers for Disease Control and Prevention report found that about 30 million Americans — roughly 10 percent of the population — have type 2 diabetes. What’s more, over 80 million have pre-diabetes, which, if not treated, often leads to diabetes within five years.

That puts these people at risk of heart and blood vessel disease, nerve damage, and kidney and eye damage, among other health threats.

While type 2 diabetes is more common among certain ethnic and racial groups — including American Indians, Alaska Natives, African Americans and people of Hispanic descent — no one is immune. And though you can’t change your heritage, you can change diabetes risk factors like carrying too much weight and being sedentary.

A multi-year U.S. National Institutes of Health study of more than 3,000 overweight or obese adults with blood sugar at pre-diabetes levels found that lifestyle changes can have a profound effect. Achieving a 7 percent weight loss and doing 150 minutes of moderate intensity activity every week lowered the rate of type 2 diabetes by 58 percent, compared to people who didn’t make these changes.
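
To make those targets concrete, here is a small illustrative sketch (the 200-pound starting weight and the five-day activity split are assumptions for the example, not figures from the study):

```python
# Illustrative only: turns the trial's targets (7% weight loss and
# 150 minutes of moderate activity per week) into personal numbers.
# The starting weight and the number of active days are made-up examples.

def weekly_targets(weight_lb: float, active_days: int = 5):
    weight_loss_goal_lb = weight_lb * 0.07       # 7% of starting body weight
    minutes_per_active_day = 150 / active_days   # spread 150 minutes over the week
    return weight_loss_goal_lb, minutes_per_active_day

loss_lb, minutes = weekly_targets(200)
print(f"Aim to lose about {loss_lb:.0f} lb and be active about {minutes:.0f} minutes on each of 5 days")
# -> Aim to lose about 14 lb and be active about 30 minutes on each of 5 days
```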

What you eat counts, too. Focusing on fruits and vegetables, whole grains and low-fat dairy cuts type 2 diabetes risk. By contrast, eating a lot of refined grains (like baked goods made with white flour and white rice), processed meats and added sugars increases it. So aim to rebalance your diet.

Finally, rethink your drink. One very simple step is to cut out sugar-sweetened beverages. Sodas and similar drinks may lead not only to type 2 diabetes, but also to overweight and more belly fat in particular.

Source: HealthDay


Read also at CDC:

National Diabetes Prevention Program . . . . .

