History of Nutella, the Chocolate-Hazelnut Spread

Emily Mangini wrote . . . . . . . . .

Nutella’s squat, oddly shaped jar has become a culinary icon across the globe, thanks to the addictively rich and creamy chocolate-and-hazelnut spread housed within. But, while the marriage of chocolate and hazelnut may seem as natural as that of salt and pepper or bread and butter, its origin story isn’t nearly so simple. It begins with the spread’s progenitor, the chocolate-hazelnut treat called gianduia (also spelled “gianduja”).

The tale of gianduia’s birth is often splashed across product labels and woven into pop history accounts of related products, including Nutella. In large part, that’s because it’s a compelling story—one of wartime desperation, economic strife, and the triumph of one industry’s ingenuity. It starts in Turin, Italy, at the turn of the 19th century, and it’s also almost certainly rife with untruths.

What most historians can agree on is that by the early 1800s, Turin had long held the distinction of being Europe’s chocolate capital, its cacao-based products renowned as delicacies across the continent. But by 1806, its prominence was poised to collapse. Napoleon Bonaparte and his French Grande Armée were on the move, conquering Europe in the name of social enlightenment. Tensions between France and Britain had come to a head, culminating in a series of naval blockades and trade embargoes. In late fall, Napoleon enacted the Continental System, a sweeping blockade that halted all trade between the island kingdom and any country under the emperor’s thumb, including the patchwork of kingdoms and city-states that would soon be unified under the name “Italy.”

In the case of Turin, one particular change transformed its coveted chocolate industry. Britain, a dominant force in maritime trade, was a major vein in the flow of cacao between Mesoamerica and Europe; under the blockade, Turin found its main cacao source cut off.

From here, gianduia’s origin myth gets a bit more complicated. Many claim that, unable to exploit Britain’s access to cacao beans, Turin’s chocolatiers needed a quick solution to supplement their supply and stay in business. The surrounding area of Piedmont, with its abundant hazelnut trees, proved to be just the ticket. When ground up, the hazelnuts took on the texture of cocoa powder, meaning that the nuts could be used to stretch what cocoa was available into a thick, ganache-like confection. In this version of the story, Turin’s chocolatiers buoyed the local industry, harnessing their resourcefulness to create a brilliant new product—one that has persisted in popularity through the centuries.

As attractive as this narrative may be, there are reasons to call it into question. Some point out that at the time, chocolate was consumed in liquid form rather than in thick pastes or solid bars. Others argue that Turin chocolatiers would have lacked the powerful technology required to grind enough hazelnuts to make gianduia a cost-effective product on a large scale, let alone single-handedly save an entire industry.

While it’s true that chocolate was first introduced to North America and Europe as a Mesoamerican medicinal beverage, and that the cacao press—the machine that made solid chocolate readily available—wasn’t invented until 1828, there’s ample evidence that so-called “eating chocolate” was established in Europe by the mid-17th century. In The True History of Chocolate, Michael and Sophie Coe point to instances of culinary experimentation with chocolate in Italy that date back to the 1680s, and records of “eating chocolate” from 18th-century France. The Marquis de Sade, known for his love of sweets, wrote to his wife from prison in the late 18th century, imploring her to send care packages filled with chocolate treats: “…half pound boxes of chocolate pastilles, large chocolate biscuits, vanilla pastilles au chocolat, and chocolat en tablettes d’ordinaire [chocolate bars].”

But just because chocolate was available in more than liquid form doesn’t mean that the Continental System resulted in the creation of gianduia, particularly given that virtually no primary sources link the two. More significant is the other oft-cited rebuttal to the legend—the unlikelihood that technology available at the time could churn out enough of the new confection to save the Turin chocolatiers from the effects of their dwindling cacao supply.

If gianduia wasn’t born out of necessity, then what was the catalyst for its creation? “My take on Turin and the whole kingdom of Savoy is that it was entirely under the sway of France in the 18th and 19th centuries,” says Ken Albala, historian and director of the food studies program at the University of the Pacific in California. “I wouldn’t be surprised if you find the combination [of chocolate and hazelnuts] in France before Italy.” This influence makes sense, given Napoleon’s conquest of the region, and suggests that gianduia was produced at a slow and gradually increasing rate, at least in its early years. It’s likely that chocolatiers quietly released the chocolate-hazelnut blend, and that its growth in popularity was more of a slow boil than the explosion of success suggested by the prevailing narrative.

But, of course, a story that credits an invading force for a chocolate-confection-turned-regional-gem is not nearly as stirring as one that frames the chocolatiers as ingenious victors, who persevered in their trade in spite of the odds against them. And the motivation to reshape gianduia’s narrative only grew with time.

From Puppet to Candy: Gianduia Gets a Name

By the mid-19th century, Italy was in the throes of the Risorgimento, the contentious, decades-long fight to unify the peninsula’s states into a single kingdom. Italian nationalism was reaching a fever pitch, and revolutionary movements erupted across the soon-to-be nation. In Piedmont, which had seen an anti-Austrian insurrection in 1821, the atmosphere was uniquely ripe for patriotic myth-building. And it took the form of a character named Gianduia, a wine-guzzling, tricorn-hat-wearing, womanizing peasant.

Over the course of the 19th century, Gianduia had evolved from a traditional masked character in the Italian commedia dell’arte to a puppet, and then a pervasive political cartoon. His form was paraded across newspapers as a symbol of Turin, a jovial peasant mascot of sorts who represented the Piedmontese capital.

It was at the 1865 Turin Carnival, just four years after Italy’s official unification, that Gianduia’s name first became associated with the chocolate-hazelnut confection. There, candies said to resemble Gianduia’s tricorn hat were distributed at the Carnival festivities, possibly by someone dressed as the character. Though a number of chocolate companies, most notably Caffarel, claim to have invented these confections, no evidence exists to verify their claims. What is more broadly agreed upon is that the chocolate-and-hazelnut sweets took on the name gianduiotti at roughly this point in time. Naming the candy for the city’s most ubiquitous representative cemented it as a Turinese—and now, following unification, an Italian—creation. Gianduia has since become synonymous with the combination of chocolate and hazelnut, and variations on the name are used to refer to chocolates, spreads, and other confections.

War Strikes Again, and Nutella Is Born

After 90 years of producing their treats in relative peace, the chocolatiers of Turin faced a new period of uncertainty with the onset of World War II. As with Napoleon’s blockade, the war brought with it food rationing, and the supply of cocoa was once again drastically limited. In 1946, Piedmontese pastry chef Pietro Ferrero, inspired by gianduiotti and his chocolatier forefathers, created a thick paste using hazelnuts, sugar, and what little cocoa was available. He shaped the paste into a loaf and named it “Giandujot.” But though its low proportion of expensive cocoa arose out of the cost-consciousness of the war years, Giandujot, so dense and thick that it had to be cut with a knife, was still too pricey for a mass audience.

In 1951, Ferrero revolutionized the industry with the first spreadable version of his sweet loaf: “La Supercrema.” According to a BBC interview with Ferrero’s grandson, Giovanni Ferrero, the spreadability of La Supercrema meant that “a small amount went a very long way, helping to break down the perception that chocolate was, as Giovanni puts it, ‘only for very special occasions and celebrations like Christmas and Easter.'”

The availability and affordability of La Supercrema turned the chocolate-hazelnut spread into a household staple throughout Italy. In 1961, Ferrero’s son, Michele, once again adjusted the recipe, adding palm oil and scaling it up for mass production. The new spread was rebranded as Nutella, and went on to become a common breakfast and snack item throughout Europe, touching down first in Asia and then the United States in the early 1980s. Nutella’s world domination would surely have turned Napoleon green with envy.

It’s rare that a jar of anything can embody two centuries of social, political, and historical change. But mixed with a touch of food lore under that white lid are Napoleon’s bravado (possibly, at least); the ingenuity of the old Turinese chocolatiers; and the creativity of their descendant Ferrero. Creamy, nutty, and sweet, Nutella and its chocolate-hazelnut brethren are war, progress, and industrialization. Each spoonful snuck from the jar, every dollop that drips from the folds of a warm crepe, pays homage to the events that shaped its journey. And that’s how it should be, because without those moments of strife and stress, our cupboards wouldn’t be the same.

Source: Serious Eats

Why Haven’t We Been Able to Cure Cancer?

Kent Sepkowitz wrote . . . . . . . . .

Depending on who is speaking, the war against cancer that President Richard Nixon declared nearly half a century ago has either been a soaring triumph of innovation and doggedness or a colossal failure, featuring lunkhead decisions, bottomless greed, and annoyed experts hurrying from here to there.

On the positive side are stories, seemingly every day, of breakthroughs and miracle drugs, of triumphant against-all-odds cures featuring the latest treatments, be they based on molecular targets or tricks to stoke the immune system. And national trends seem promising: Cancer mortality has decreased from about 200 deaths per 100,000 people in the 1990s to roughly 163 per 100,000 in the 2010s. Pretty good, right?

Not so fast, say the doubters. After all the time, money, and scientific talent poured into the problem, this progress doesn’t amount to all that much. And plenty of the criticism comes from high up in the medical hierarchy. Twenty-six years into the war, a harsh assessment titled “Cancer Undefeated” was published in the New England Journal of Medicine, declaring it open season on any claims of victory, and the criticism has been steady ever since. Recently, Clifton Leaf echoed this dour perspective in his 2013 book, The Truth in Small Doses: Why We’re Losing the War on Cancer — and How to Win It, while the poet Anne Boyer recounted her own cancer experience (and profound disappointment in modern care) this year in The Undying.

Enter Azra Raza, a prominent cancer specialist at Columbia University. Although she doesn’t consider herself a pessimist, her new book, The First Cell: And the Human Costs of Pursuing Cancer to the Last, argues that we have wasted precious time and zillions of dollars barking up the wrong scientific tree. We are using wrongheaded experimental models (animals, cells, and the entire 20th-century repertoire of discovery), and we are giving federal grants to all the wrong ideas.

Most importantly, she argues that current cancer research is looking at the wrong end of the problem—late-stage disease, when the cancer is large and perhaps has already spread, when patients are sick and failing, when even the most wonderful new wonder drug is unlikely to work.

Better to find the cancers sooner, when the tumor burden—the actual number of cancer cells—is still small. Then therapies have a better chance of being effective: The lift is not so heavy, with a lower risk of genetic mutations that confer drug resistance or spotty penetration of medications into bulky growths. This approach—or better yet, attacking the disease when the cells show only an early itch to cause trouble—would be cheaper, less toxic, and decidedly more effective, she writes.

It’s a pretty compelling argument, one with a long history and public support. In 2016, Vice President Joe Biden endorsed the approach when he issued the Cancer Moonshot report, a national assessment of current cancer research and goals for the future. “We’re talking about prevention and early detection,” he said. “I’m convinced we can get answers and come up with game-changing treatments and get them to people who need them.”

Raza sets out to demonstrate her point and sharpen her criticism by presenting a series of patients she has treated through the years. We meet several people with difficult cancers but a lot of spunk. Each chapter leads us, not so gently, to their death. Of particular poignance is the story, woven throughout the book, of her husband, oncologist Harvey Preisler, who died of an aggressive lymphoma in 2002.

These clinical stories are recounted in vivid, precise detail, and carry a grim moral: Implicitly and often explicitly, Raza makes it clear that, in her view, a more intelligent and better organized research program and a more honest self-appraisal by the community of cancer scientists might have saved lives. “How many more Omars and Andrews will it take?” she laments, referring to two of her patients who died, diagnosed late with no good options for cure.

An experienced researcher herself, Raza knows well that real research is anything but organized. Rather it is a muddy scrum where no one really knows who is driving the pile, where motion might be from pushing or from pushing back, where real steps forward are rare and missteps plenty. Ideas are simple; humans and their biology are not.

And nowhere is the gap between our hopes and the stubbornness of reality wider than in the field of early cancer detection, the “first cell” of the book’s title. Science has been working on early detection since the Pap smear was introduced almost a century ago.

Somewhat late in the book, Raza describes the work of some of today’s leaders in the field of early diagnostics. She praises Sanjiv Sam Gambhir, chairman of the Canary Center at Stanford for Cancer Early Detection, for his work in using radiologic scans to see the first signals of cancer. She also highlights the great success of colonoscopy screening in reducing mortality from colon cancer. And she describes the enormous promise of DNA detection in the bloodstream.

Yet she avoids deep discussion of the vast amount of snake oil oozing through the field of early detection, such as the notoriously inaccurate scientific work of Elizabeth Holmes and Theranos, with their claims of a simple finger-prick diagnosis of all your worldly woes. Nor does she take on the many problems created by early detection, including the uncertainty in how best to manage unclear results. Instead, after 15 pages or so, she is back to her old tune, describing a young woman named Zaineb with a lethal cancer caught late, in a section titled, “And How Many Zainebs?”

In the end, there is a strong current of mea culpa defensiveness running through Raza’s persuasive if repetitive case for early detection; she essentially issues a blanket apology to the American public for how badly our cancer programs have failed us.

But while there is surely much to dislike about the American health care system and the medical profession as well, the fact that cancer remains an often-fatal disease isn’t merely a result of bad-faith governance or corporate avarice or individual narcissism, though there is plenty of each. Rather, we’re probably stuck where we are for a simple if overwhelming reason: As Raza herself views it, cancer is simply an impossible problem for current science to fix.

Source: Slate

Study: New Findings on Postmenopausal Hormone Replacement Therapy

Amy Norton wrote . . . . . . . . .

The ongoing debate about postmenopausal hormone therapy and breast cancer risk may have become even muddier: A large, new study suggests that two different types of hormone therapy have opposite effects on women’s long-term risk of the disease.

The researchers found that combined hormone replacement therapy (HRT) — with estrogen and progestin — increases the risk of breast cancer, with effects that last for years after women discontinue the therapy.

On the other hand, women who take estrogen alone appear to have an equally long-lasting decrease in their breast cancer risk.

The findings come from a long-term follow-up of the Women’s Health Initiative (WHI) — a major U.S. government-funded project begun in the 1990s that tested the health effects of hormone replacement therapy. One trial randomly assigned over 16,000 women aged 50 to 79 to take either combined HRT or placebo pills. The other involved close to 11,000 women the same age who were given either estrogen therapy alone or placebos.

Before the WHI, doctors had thought that menopausal hormone therapy — which helps control hot flashes — had other health benefits, including a lower risk of heart disease.

But the initial findings from the WHI made waves when they instead uncovered higher disease risks: Combined HRT raised women’s odds of developing heart disease, stroke, blood clots and breast cancer.

The picture was different with estrogen-only therapy: It raised the risk of blood clots and stroke, but did not increase heart risks. In addition, it seemed to lower the odds of developing breast cancer.

But only certain women can take estrogen-only therapy, namely, those who’ve had a hysterectomy, since using estrogen by itself raises the risk of uterine cancer.

As if that weren’t complicated enough, things have gotten murkier over the years. A number of observational studies — which followed women in the “real world” who opted for hormone therapy or not — have found that estrogen-only therapy is associated with a higher breast cancer risk.

Enter these latest findings from the WHI. They show that for years after stopping combined HRT, women continue to face an increased risk of breast cancer. Meanwhile, the reduced risk seen with estrogen-only therapy also continued.

“So, who’s right? This big clinical trial or those large observational studies?” asked Dr. Rowan Chlebowski, the lead researcher on the new analysis.

Unfortunately, there is no clear answer, according to Chlebowski, chief of medical oncology at Harbor-UCLA Medical Center, in Los Angeles.

He is to present the findings Friday at the annual San Antonio Breast Cancer Symposium. Such research is considered preliminary until published in a peer-reviewed journal.

“Overall,” Chlebowski said, “this information suggests that combined HRT is a little worse than we’d thought, and estrogen alone is probably a little safer than we’d thought.”

Trial participants on combined HRT typically used it for about five years. Over 18 years of follow-up, those women were 29% more likely than women given placebos to develop breast cancer.

Women on estrogen-only therapy typically used it for seven years. Over 16 years, they were 23% less likely than the placebo group to be diagnosed with breast cancer, the findings showed.

So what does it all mean? Given the overall body of evidence, experts have long advised women against using hormone therapy to prevent any disease.

And that advice still stands, Chlebowski said.

“You should not use hormone therapy to lower chronic disease risks,” he said. “If your menopausal hot flashes are bad enough that you want to try hormone therapy, talk with your doctor about the benefits and risks to you.”

Susan Brown is senior director of education and patient support at the nonprofit Susan G. Komen. The new findings give women another piece of information “to make informed decisions about their health,” she agreed.

“Large population studies are needed to understand the complex impact menopausal hormone therapy has on breast cancer risk and incidence,” Brown said. “We’re encouraged to see the results of research like this.”

Source: HealthDay


Study: Brushing Your Teeth May Be Good for Your Heart

The study included more than 161,000 South Korean adults, ages 40 to 79, with no history of heart failure or the heart rhythm disorder atrial fibrillation.

Between 2003 and 2004, participants had a routine medical exam and were asked about a wide range of lifestyle habits, including how often they brushed their teeth.

During a median follow-up of 10.5 years, 3% of participants developed a-fib and 4.9% developed heart failure. (Median means half were followed for less time, half for more.)

Those who brushed their teeth three or more times a day had a 10% lower risk of a-fib and a 12% lower risk of heart failure during the follow-up.

The reduced risk was independent of age, sex, wealth, exercise, alcohol use, body fat and conditions such as high blood pressure, according to the study published in the European Journal of Preventive Cardiology.

Researchers didn’t investigate how regular brushing might reduce heart disease risk. But previous studies have suggested that poor oral hygiene allows bacteria to enter the bloodstream, causing inflammation that increases the odds of heart disease.

The study was conducted in one country and was observational, so it does not prove a direct link between regular brushing and reduced heart risk, said senior author Dr. Tae-Jin Song, of the Department of Neurology at Ewha Womans University in Seoul.

But he added: “We studied a large group over a long period, which adds strength to our findings.”

An editorial accompanying the study said it is “certainly too early” to recommend tooth brushing to prevent a-fib and heart failure.

“While the role of inflammation in the occurrence of cardiovascular disease is becoming more and more evident, intervention studies are needed to define strategies of public health importance,” the editorial said.

Source: HealthDay

Only Eat Oysters in Months with an ‘r’? Rule of Thumb Is at Least 4,000 Years Old

Halle Marchese, Mary-Lou Watkinson and Natalie van Hoose wrote . . . . . . . . .

Foodie tradition dictates only eating wild oysters in months with the letter “r” – from September to April – to avoid watery shellfish, or worse, a nasty bout of food poisoning. Now, a new study suggests people have been following this practice for at least 4,000 years.

An analysis of a large shell ring off Georgia’s coast revealed that the ancient inhabitants of St. Catherines Island limited their oyster harvest to the non-summer months.

How can scientists know when islanders were collecting oysters? By measuring parasitic snails.

Snails known as impressed odostomes, Boonea impressa, are common parasites of oysters, latching onto a shell and inserting a stylus to slurp the soft insides. Because the snail has a predictable 12-month life cycle, its length at death offers a reliable estimate of when the oyster host died, allowing Florida Museum of Natural History researchers Nicole Cannarozzi and Michal Kowalewski to use it as a tiny seasonal clock for when people collected and ate oysters in the past.

Stowaways on discarded oyster shells, the snails offer new insights into an old question about the shell rings that dot the coasts of Florida, Georgia, South Carolina and Mississippi.

“People have been debating the purpose of these shell rings for a very long time,” said Cannarozzi, the study’s lead author and Florida Museum environmental archaeology collection manager. “Were they everyday food waste heaps? Temporary communal feasting sites? Or perhaps a combination? Understanding the seasonality of the rings sheds new light on their function.”

Cannarozzi and Kowalewski, Thompson Chair of Invertebrate Paleontology, analyzed oysters and snails from a 230-foot-wide, 4,300-year-old shell ring on St. Catherines Island and compared them with live oysters and snails. They found that island inhabitants were primarily harvesting oysters during late fall, winter and spring, which also suggested the presence of people on the island tapered off during the summer.

The seasonality of the shell ring may be one of the earliest records of sustainable harvesting, Cannarozzi said. Oysters in the Southeast spawn from May to October, and avoiding oyster collection in the summer may help replenish their numbers.

“It’s important to look at how oysters have lived in their environment over time, especially because they are on the decline worldwide,” she said. “This type of data can give us good information about their ecology, how other organisms interact with them, the health of oyster populations and, on a grander scale, the health of coastal ecosystems.”

Cannarozzi said using impressed odostomes to gauge what time of year oysters were harvested offers an independent way to assess ancient patterns of oyster gathering. This approach can complement other archaeological methods, including stable isotope analysis and examining shell growth rings.

Kowalewski said the method could be applied to other marine invertebrate studies if the “timepiece” organism’s life cycle meets several key requirements.

“If you have species with a lifespan of one year or less, consistent growth patterns and predictable spawning behavior, you could potentially use them as clocks as well,” he said. “We might be able to use this type of strategy to reconstruct population dynamics or the natural history of various species, especially those that are extinct.”

Cannarozzi and Kowalewski emphasized the importance of interdisciplinary collaboration in addressing longstanding research questions in new ways. Their project combined paleontology, the study of fossils and other biological remains, with archaeology, which emphasizes human history. Cannarozzi’s specialization – environmental archaeology – also explores the close connections between humans and their natural resources.

“People have affected the distributions, life cycles and numbers of organisms over time,” Cannarozzi said. “Understanding how people in the past interacted with and influenced their environment can inform our conservation efforts today.”

The researchers published their findings in PLOS ONE.

Source: Florida Museum