Is It Safe to Eat the Colourful Glitter on Top of the Cupcake?

Caroline Weinberg wrote . . . . . . . .

Before Tide Pods inexplicably captured America’s imagination, edible glitter enjoyed a few moments of Instagram fame in 2017 — peaking with a latte topped with a liberal sprinkle of glitter that caught diners’ eyes in November. Since then, other restaurants have added the ingredient to their own menus, resulting in colorful dishes like countless glitter lattes, glittery strawberry jelly, “sparkly” iced tarts, glitter smoothies, and even glittery gravy, which one London pub served alongside its Christmas pie.

This week, glittery food hits the big time. Mardi Gras 2018 inspired glitter-topped hot chocolate. Los Angeles-based Astro Doughnuts just announced a glittery gold doughnut to celebrate the Oscars. And for Valentine’s Day, burger chain Shake Shack will unleash a glittery pink milkshake in select cities; dubbed the “Love Shack,” it’s a Valentine’s Day-themed strawberry milkshake topped with whipped cream and glitter.

But as the glitter trend gains steam, the FDA cautions that all that glitters is not edible, and some environmental scientists are trying to get us to give up glitter altogether. So what’s the deal with glitter in food?

Why are people eating glitter?

Like raccoons, people like shiny things. Researchers have found evidence that this preference starts in infancy, with some suggesting that it’s tied to our “innate need for water.” Unflavored edible glitter, often sold at craft stores, adds no flavor to a dish — it’s a purely aesthetic addition for those times when plain coffee or a cupcake with dull icing just doesn’t seem exciting enough. But not everyone is happy with the trend, and some people have complained that certain glitters give food an unappealing gritty texture.

Is this the same glitter I used in arts and crafts?

No. Or at least it shouldn’t be. There are two forms of glitter you’ll find topping cakes and drinks: edible and non-toxic. The classification comes from the Food and Drug Administration (FDA), the U.S. agency that regulates, among other things, which products are considered safe for human consumption.

Edible products are cleared for human consumption in the U.S. and are required to carry an ingredient list. Non-toxic products won’t kill you, but they’re not considered food and aren’t subject to the same rigorous testing as products designed for consumption. Play-Doh, for example, is non-toxic, but no one would recommend eating it as a snack.

“Consumers should carefully check the label of decorative products they consider for use on foods,” says FDA spokesperson Dr. Marianna Naum. “Most edible glitters and dusts also state ‘edible’ on the label. If the label simply says ‘non-toxic’ or ‘for decorative purposes only’ and does not include an ingredients list, the product should not be used directly on foods.”

Concerns over edible glitter consumption first emerged in 2012, thanks to an episode of the cultishly adored reality program The Great British Bake Off: one contestant sprinkled glitter atop her cupcakes but admitted she wasn’t sure whether the product was edible. The episode quickly made glitter one of the top 10 food safety concerns in Britain.

What’s the glitter on my food made of?

Ingredients in edible glitter commonly include “sugar, acacia (gum arabic), maltodextrin, cornstarch, and color additives specifically approved for food use, including mica-based pearlescent pigments and FD&C colors such as FD&C Blue No. 1.” Barring any food allergies, it can be sprinkled liberally on or in your food, should you be so inclined.

Non-toxic or “food contact” glitter, which is often used on cakes, is technically safe to consume in small quantities, but that doesn’t mean you should be using it as an everyday garnish. The FDA issued an advisory statement about glitter in 2016, noting it had recently become “aware that some non-edible decorative glitters and dusts are promoted for use on foods.”

According to the FDA, there is no difference between this non-toxic decorative food glitter and the glitter you poured over construction paper as a child; non-toxic glitter can be made of plastic. This glitter is sometimes labeled for “display” only, with fine print explaining that it is not intended to be eaten and should be removed from foodstuffs prior to consumption — a challenging task when it has been applied directly to icing or whipped cream.

Should I be wary of glitter on food?

Eating small amounts of non-toxic glitter on food will not kill you, so there’s no need to panic if you accidentally consume something meant to be decorative. But people with gastrointestinal disorders that make it hard to digest small, hard items such as seeds may want to be particularly careful. “Non-toxic glitter may not kill you, but don’t eat it,” says Dr. Zhaoping Li, professor of medicine and chief of the Division of Clinical Nutrition at UCLA. “At least not regularly or large quantities.”

So you can feel free to cover your coffee, cakes, steak, fish, and other food products with edible glitter — if you can find it. It’s far more difficult to find a bottle of edible glitter in a store than the non-toxic version. If you’re eating at a professional bakery, you can ask what type of glitter is used, but employees may not know offhand: When asked, staff at one New York City bakery took 9 minutes to confirm (the answer was a gelatin-based edible glitter).

But Li still cautions against going overboard with the edible sparkly food. “Our body can only take care of it if we only consume things like glitter foods once in a while,” she says, “in small amounts.”

Source: Eater


Trans Fat Is (Almost) Out of Your Food. Here’s What’s Going In

Deena Shanker wrote . . . . . . .

Partially hydrogenated oils (PHOs), the main source of artificial trans fat and an invisible mainstay of the American food industry for decades, are finally being pushed out in favor of healthier alternatives.

The change was a long time coming. Evidence of the dangers of trans fat, which raises the risk of heart disease and stroke, solidified with a major study published in 1990 and grew stronger with each successive round of research, forcing food makers to start looking for alternatives. From 2006 to 2008, according to one estimate, the amount of PHOs in food in North America was cut in half. By 2015, the Grocery Manufacturers Association said trans fat had been reduced by 85 percent. That year, the U.S. Food and Drug Administration announced that food makers had three years to completely remove the oils from their products.

PHOs come in many forms and serve a variety of hidden functions. They could be in the deep fryer at a national fast-food chain or in your favorite packaged baked goods—you know, the ones that taste as fresh today as when they were purchased three years ago. They’ve shown up in creamers, cereal bars and microwave popcorn. Replacing them requires a mix of liquid oils and solid fats, along with collaboration among oil producers, fast-food chains and packaged food makers.

Consumers likely won’t be able to tell the difference when they sample the new generation of PHO-free products. In fact, many have already been eating them for years. Nutritionally, all are an improvement over standard partially hydrogenated soybean oil. Producers tout their high content of “good fats” (monounsaturated and polyunsaturated fats) and low levels of “bad fats” (trans and saturated fats). Oleic acid, an omega-9 fatty acid, makes the oils both more stable and more healthful. (The high oleic acid content of olive oil is one reason it is considered the gold standard of healthy oils.) Yet these oils fall short in at least one respect: because they are all liquid, baking with them requires the addition of other solid fats. In these cases, palm oil, an ingredient associated with negative environmental impacts, is often the solid fat of choice.

Non-GMO Omega-9 canola oil, from Dow Chemical Co.’s Dow AgroSciences LLC, has even lower levels of saturated fats than olive oil and about the same level of omega-9s. “This oil has the whole package,” said Dave Dzisiak, the global business leader of oils and grains at the company. The oil also has a cleaner, light taste, he said. It’s currently being used by major national and regional foodservice chains, as well as by snack makers, according to the company.

DuPont Co.’s DuPont Pioneer subsidiary makes similar claims about its trans fat-free oil, Plenish. Through genetic modification, the company lowered the amount of saturated fat in standard soybean oil by 20 percent and raised the omega-9 fatty acid content to rival that of olive oil. Plenish is also currently being used in packaged goods, according to DuPont. The company believes its soy oil holds a competitive advantage over canola because soy is so entrenched in the American diet. “There’s a pretty strong belief that the U.S. consumer in particular has developed a preference for soy,” said Russ Sanders, director of food at DuPont. “The flavor of the food comes through more than the flavor of the oil.”

Monsanto Co. is banking on the same preference as it prepares to launch its own soybean oil, Vistive Gold. Like its competitors, Vistive Gold has a high omega-9 count, low saturated fat content and better stability than standard soybean oil. It is also a product of genetic engineering. The company declined to name any specific customers, but a spokesperson said it has been working closely with the food industry to develop an oil producers will want to use.

At the moment, high-oleic canola oils are more popular, largely because they have been around in commercial quantities much longer than the new soybean oils, said Robert Collete, president of the Institute of Shortening and Edible Oils. (High-oleic sunflower oils have also been available for some time in lower quantities.) But the soybean industry is fighting back, blending different oils to create Qualisoy, another high-oleic soybean oil, as a potential competitor.

The nutritional profile of the new soybean oils is also more favorable than that of the new canola oils, said J. Thomas Brenna, professor of Human Nutrition at the Dell Medical School at the University of Texas-Austin. While omega-9 fatty acids are important, omega-3s and omega-6s—found in the aforementioned polyunsaturated fats—are as well, and they need to be in the right balance with each other. But, Brenna said, the omega-3s in the canola oil were reduced below an ideal level to make room for the omega-9s. That makes it less healthy than the soybean oils, which keep their polyunsaturated fat ratios more aligned with olive oil.

Of course, all of these options are still far better than the partially hydrogenated oils of yore. “That is bad stuff,” said Brenna.

For people hoping that the new, trans fat-free world means they can eat as many French fries and potato chips as they want, Keri Gans, a New York-based registered dietitian, has some bad news. “Just because a company has switched their oil to a healthier variety,” she said, “doesn’t mean the product becomes good for you.”

The old rules of diet still apply. “Watch the amount of French fries you eat,” Gans warned.

Source: Bloomberg

Study Reveals Mechanism of Walnuts’ Ability to Decrease Hunger

Packed with nutrients linked to better health, walnuts are also thought to discourage overeating by promoting feelings of fullness. Now, in a new brain imaging study, researchers at Beth Israel Deaconess Medical Center (BIDMC) have demonstrated that consuming walnuts activates an area in the brain associated with regulating hunger and cravings. The findings, published online in the journal Diabetes, Obesity and Metabolism, reveal for the first time the neurocognitive impact these nuts have on the brain.

“We don’t often think about how what we eat impacts the activity in our brain,” said the study’s first author Olivia M Farr, PhD, an instructor in medicine in the Division of Endocrinology, Diabetes and Metabolism at BIDMC. “We know people report feeling fuller after eating walnuts, but it was pretty surprising to see evidence of activity changing in the brain related to food cues, and by extension what people were eating and how hungry they feel.”

To determine exactly how walnuts quell cravings, Farr and colleagues, in a study led by Christos Mantzoros, MD, DSc, PhD hc mult, director of the Human Nutrition Unit at Beth Israel Deaconess Medical Center and professor of medicine at Harvard Medical School, used functional magnetic resonance imaging (fMRI) to observe how consuming walnuts changes activity in the brain.

The scientists recruited 10 volunteers with obesity to live in BIDMC’s Clinical Research Center (CRC) for two five-day sessions. The controlled environment of the CRC allowed the researchers to keep tabs on the volunteers’ exact nutritional intake, rather than depend on volunteers’ often unreliable food records – a drawback to many observational nutrition studies.

During one five-day session, volunteers consumed daily smoothies containing 48 grams of walnuts – the serving recommended by the American Diabetes Association (ADA) dietary guidelines. During their other stay in the CRC, they received a walnut-free but nutritionally comparable placebo smoothie, flavored to taste exactly the same as the walnut-containing smoothie. The order of the two sessions was random, meaning some participants would consume the walnuts first and others would consume the placebo first. Neither the volunteers nor the researchers knew during which session they consumed the nutty smoothie.

As in previous observational studies, participants reported feeling less hungry during the week they consumed walnut-containing smoothies than during the week they were given the placebo smoothies. fMRI tests administered on the fifth day of the experiment gave Farr, Mantzoros and the team a clear picture as to why.

While in the machine, study participants were shown images of desirable foods like hamburgers and desserts, neutral objects like flowers and rocks, and less desirable foods like vegetables.

When participants were shown pictures of highly desirable foods, fMRI imaging revealed increased activity in a part of the brain called the right insula after participants had consumed the five-day walnut-rich diet compared to when they had not.

“This is a powerful measure,” said Mantzoros. “We know there’s no ambiguity in terms of study results. When participants eat walnuts, this part of their brain lights up, and we know that’s connected with what they are telling us about feeling less hungry or more full.”

This area of the insula is likely involved in cognitive control and salience, meaning that participants were paying more attention to food choices and selecting the less desirable or healthier options over the highly desirable or less healthy options. Farr and Mantzoros next plan to test different amounts, or dosages, of walnuts to see whether more nuts will lead to more brain activation or if the effect plateaus after a certain amount. This experiment will also allow researchers to test other compounds for their effect on this system.

Similar studies could reveal how other foods and compounds, such as naturally-occurring hormones, impact the appetite-control centers in the brain. Future research could eventually lead to new treatments for obesity.

“From a strategic point of view, we now have a good tool to look into people’s brains – and we have a biological readout,” said Mantzoros. “We plan to use it to understand why people respond differently to food in the environment and, ultimately, to develop new medications to make it easier for people to keep their weight down.”

Source: Beth Israel Deaconess Medical Center

The Nitty-gritty behind How Onions Make You Cry

Adding onions to a recipe can make a meal taste rich and savory, but cutting up the onion can be brutal. Onions release a compound called lachrymatory factor (LF), which makes the eyes sting and water. Scientists know that a certain enzyme causes this irritating compound to form, but precisely how it helps LF form in the onion has remained an open question. Now, one group reports in ACS Chemical Biology that they have the answer.

According to the National Onion Association, the average American consumes 20 pounds of onions each year. When an onion is cut, a natural defense mechanism springs into action, producing LF. This kind of compound is rare — only four known natural types exist. An enzyme in the onion known as lachrymatory factor synthase (LFS) spurs production of LF. “Tearless” onions, sold exclusively in Japan for a hefty price, don’t make LFS, so they also don’t produce the irritant LF. But scientists have been at a loss to explain exactly how LFS helps LF form, because LF is extremely reactive and evaporates or breaks down easily. Marcin Golczak and colleagues wanted to take a different approach to solve this mystery once and for all.

The team determined the crystal structure of LFS and analyzed it. With the crystal structure, they could finally see the architecture of the enzyme as a whole and its active site as it bound to another compound. By combining this information with known information about similar proteins, the group developed a detailed chemical mechanism that could explain the precise steps involved in LF synthesis—and hence, why people wind up crying when they chop an onion.

Source: American Chemical Society

Does Sugar Make You Sad? New Science Suggests So

Anika Knüppel wrote . . . . . . . .

The thought of a cupcake, skillfully frosted with fluffy vanilla icing, may put a smile on your face, but research suggests that, in the long term, a sweet tooth may turn that smile into a frown – but not for the reasons you think. In a new study, published in Scientific Reports, my colleagues and I found a link between a diet high in sugar and common mental disorders.

The World Health Organisation recommends that people reduce their daily intake of added sugars (that is, all sugar, excluding the sugar that is naturally found in fruit, vegetables and milk) to less than 5% of their total energy intake. However, people in the UK consume double – in the US, triple – that amount of sugar. Three-quarters of these added sugars come from sweet food and beverages, such as cakes and soft drinks. The rest come from other processed foods, such as ketchup.
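For a rough sense of what those percentages mean in grams, here is a small back-of-the-envelope sketch in Python (the 2,000 kcal reference intake and the figure of roughly 4 kcal per gram of sugar are illustrative assumptions, not numbers from the article):

# Rough conversion of the WHO guideline into grams of added sugar per day.
# Assumptions (illustrative only): a 2,000 kcal reference intake and ~4 kcal per gram of sugar.
REFERENCE_INTAKE_KCAL = 2000
KCAL_PER_GRAM_SUGAR = 4

who_limit_g = 0.05 * REFERENCE_INTAKE_KCAL / KCAL_PER_GRAM_SUGAR  # ~25 g/day
uk_estimate_g = 2 * who_limit_g   # roughly double the recommendation, ~50 g/day
us_estimate_g = 3 * who_limit_g   # roughly triple the recommendation, ~75 g/day

print(f"WHO limit ~{who_limit_g:.0f} g/day; UK ~{uk_estimate_g:.0f} g/day; US ~{us_estimate_g:.0f} g/day")

By this rough arithmetic, the recommendation works out to around 25 g of added sugar a day, with typical UK and US intakes in the neighborhood of 50 g and 75 g respectively.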

At the same time, one in six people worldwide suffers from a common mental disorder, such as a mood or anxiety disorder. Could there be a link between high sugar consumption and common mental disorders?

Earlier research, published in 2002, examined the link between depression and sugar consumption in six countries. The researchers, from Baylor College in the US, found that higher rates of refined sugar consumption were associated with higher rates of depression.

Since then, a handful of studies have investigated the link between added sugar consumption and subsequent depression. In 2011, researchers in Spain found that when they grouped participants based on their commercial baked food consumption, those who ate the most baked food had a 38% increased chance of developing depression compared with those in the group with the lowest intake. The association remained even after accounting for health consciousness and employment status.

In 2014, researchers studied the association between sweetened beverage consumption and depression in a large US group. They found that sugar-sweetened and artificially sweetened drinks (diet drinks) could increase a person’s risk of developing depression. And, more recently, a 2015 study, including nearly 70,000 women, found higher chances of depression in those with a high added sugar intake, but not in those with a high intake of naturally occurring sugars, such as those found in fruit.

Trying to explain the link

We are still not sure what causes depression, but some researchers believe that biological changes are at the root of it. Some of these changes could be influenced by sugar and sweet taste. For example, a study in rats found that diets high in sugar and fat can reduce a protein called BDNF that influences the growth and development of nerve cells in the brain. This protein is thought to be involved in the development of depression and anxiety.

Another possible biological cause is inflammation. High sugar diets can increase inflammation – a protective reaction of the body, normally directed against microorganisms or foreign substances. While common signs of inflammation, such as redness, are far from a mood disorder, the symptoms that keep us in bed with a cold are much closer, such as low energy and being unable to concentrate. Ongoing research suggests that mood disorders could be linked with inflammation, at least in some cases.

Dopamine is another possible culprit. A study using rats earned headlines for suggesting sweet foods could be as addictive as cocaine. This might be due to effects on dopamine, a brain chemical involved in the reward system. Dopamine is also thought to influence mood. And addiction is itself associated with a higher risk of developing a mood disorder.

Finally, sugar intake could be associated with other factors, such as obesity, which itself is related to mood.

But these associations could also reflect a reverse phenomenon: low mood could make people change their diet. Sweet foods could be used to soothe bad feelings by providing a short-term mood boost. And low mood and anxiety could make simple tasks, such as grocery shopping or cooking, so difficult and exhausting for the sufferer that they might start to avoid them. Instead, they might opt for junk food, takeaways and ready meals – all of which have a high sugar content.

What our study adds to the debate

For our latest study, my colleagues and I put the reverse association idea to the test. We used sugar intake from sweet food and drinks to predict new and recurrent mood disorders in a group of British civil servants. We also investigated whether having a mood disorder would make people more inclined to choose sweet foods and drinks.

We found that men without a mood disorder who consumed over 67g of sugar a day had a 23% increased risk of suffering from a mood disorder five years later, compared with those who ate less than 40g. This effect was independent of the men’s socioeconomic status, physical activity, drinking, smoking, other eating habits, body fatness and physical health.

We also found that men and women with a mood disorder and a high intake of sugar from sweet food and drinks were at higher risk of becoming depressed again five years later, compared with those who consumed less sugar. But this association was partly explained by their overall diet.

We found no evidence for a potential reverse effect: participants did not change their sugar intake after suffering from mood disorders.

Despite our findings, a number of questions remain about whether sugar makes us sad, whether it affects men more than women, and whether it is sweetness, rather than sugar itself, that explains the observed associations. What is certain, though, is that sugar is associated with a number of health problems, including tooth decay, type 2 diabetes and obesity. So cutting down on sugar is probably a good idea, regardless of whether it causes mood disorders or not.

Source: The Conversation

