Low Blood Pressure Linked to High Mortality in Older Adults

International blood pressure guidelines may require review, according to new research that found a link between low blood pressure and higher mortality rates.

A large-scale study led by the University of Exeter, published in Age and Ageing and funded by the NIHR, analysed 415,980 electronic medical records of older adults in England.

The research was conducted after some countries changed their blood pressure guidelines to encourage clinicians to take measures to reduce blood pressure, in a bid to improve health outcomes. UK blood pressure guidelines remain within parameters considered safe for all. However, previous research has not considered the impact of such targets on frail older adults, who are often omitted from trials.

The team found that people aged 75 or over with low blood pressure (below 130/80) had increased mortality rates during follow-up, compared with those with normal blood pressure. This was especially pronounced in ‘frail’ individuals, who had a 62 per cent increased risk of death during the ten-year follow-up.

Although high blood pressure increased the risk of cardiovascular incidents, such as heart attacks, it was not linked to higher mortality in frail adults over 75. People aged 85 and over who had raised blood pressure actually had reduced mortality rates compared with those with lower blood pressure, regardless of whether they were frail.

Jane Masoli, a geriatrician and NIHR Doctoral Research Fellow, who led the study as part of her PhD at the University of Exeter, said: “Internationally, guidelines are moving towards tight blood pressure targets, but our findings indicate that this may not be appropriate in frail older adults. We need more research to ascertain whether aggressive blood pressure control is safe in older adults, and then for which patient groups there may be benefit, so we can move towards more personalised blood pressure management in older adults.”

She added: “We know that treating blood pressure helps to prevent strokes and heart attacks and we would not advise anyone to stop taking their medications unless guided by their doctor.”

Source: University of Exeter



Recipes from the Garden of Contentment – a Chinese Gastronomic Guide from 1792

Susan Jung wrote . . . . . . . . .

Recipes from the Garden of Contentment (2018) is the first bilingual (Chinese and English) edition of Suiyuan Shidan (1792), a work on gastronomy by Qing dynasty poet and scholar Yuan Mei. However, its translator, Sean J.S. Chen, is neither a scholar of Classical Chinese nor a chef at a high-end Chinese restaurant looking for inspiration; rather, his field is science and engineering.

A “research scientist and algorithms developer for computer-assisted minimally invasive surgery”, the Chinese-Canadian Chen started translating Yuan’s work after failing to find an existing translation, and published his efforts on his blog, Way of the Eating.

Translating the book wasn’t easy, Chen writes. “Classical Chinese is a written language of its own, quite different from modern written Chinese that is used today in daily life. For the untrained reader, Classical Chinese appears as a discontinuous mask of characters glommed together on a grid typically without any punctuation to guide the reader. Reading through the Suiyuan Shidan in Classical Chinese brought back those feelings of inadequacy I felt while grinding through the Middle English version of The Canterbury Tales in university.”

It wasn’t just the translations that troubled Chen – he also found that there were different versions of Suiyuan Shidan. For an accurate translation, he needed access to the original text, and found two copies of the 1792 volume – one at the Harvard-Yenching Library, the other at Princeton University Library.

Yuan wasn’t a cook – his household staff included a chef (and several concubines). But as a food lover, he had strong opinions about recipes, as well as the preparation of ingredients. In the chapter “Essential Knowledge”, he states, “It is better to use more of an expensive ingredient in a dish and less of the inexpensive ones. If too much of an ingredient is pan-fried or stir-fried at the same time, there would be insufficient heat to cook them through; meats done this way are especially tough […] If one asks, ‘What if there isn’t enough to eat?’ I say, ‘If you’re not full after you’ve finished what’s there, just cook some more.’”

The chapter titled “Objectionables” is especially entertaining, and Chen’s annotations are just as opinionated. “What are ‘meals for the ears’?” reads the original text. “Meals for the ears exist only for bolstering name and reputation. By boasting the names of expensive and coveted ingredients, flaunting one’s wealth to esteemed guests, such meals tease one’s ears but confer no satisfaction to one’s tongue.”

Chen adds, “Sadly, dishes for the ears, or ‘ear meals’, are a mainstay of gastronomy, be it Eastern or Western cuisine. Foie gras is fantastic, but if a restaurant serves it too thin (less than five millimetres thick) just to be able to mention it in a dish, that’s an ear meal. White truffle oil (usually containing no truffle shavings whatsoever) in your pasta? Ear meal. ‘Kobe beef’ hamburgers? Yet another ear meal.”

The recipes are brief, leaving out a lot of detail. The recipe for mutton soup, for instance, reads, “Take some cooked mutton and cut it into small pieces, about the size of dice. Braise the meat in chicken broth. Add diced bamboo shoots, diced shiitake mushrooms, diced mountain yam, and then braise until done.” The recipe for radish cooked in lard reads, “Stir-fry the radishes in rendered lard, then add dried shrimp and braise them until completely done. When one is about to plate the dish, add chopped green onions. The radishes should be translucent and red like amber.”

Other recipes include roasted suckling pig, red cooked pork, white cut chicken, smoked eggs, eight treasure tofu and homestyle pan-fried fish.

Source: SCMP

History: SARS Fast Facts

Here’s some background information about SARS, severe acute respiratory syndrome. Since 2004, there have been no known cases of SARS reported anywhere in the world.

General Information:

SARS is an acute viral respiratory illness brought on by a coronavirus.

Symptoms include fever, cough, severe headache, dizziness and other flu-like complaints.

The illness presents as an atypical pneumonia that does not respond to standard treatments.

There were 8,098 confirmed cases of SARS in 29 countries from November 2002 to July 2003, with 774 deaths.
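Taken together, those totals correspond to a case fatality rate of roughly one in ten, a back-of-the-envelope calculation from the figures above:

\[
\frac{774}{8098} \approx 0.096 \approx 9.6\%
\]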

Timeline:

November 16, 2002 – What will become known as SARS is first reported in Foshan, China.

November 2002-February 2003 – Five people die and more than 300 are reported ill with SARS in Guangdong province, China.

February 15-22, 2003 – Liu Jianlun develops SARS symptoms on a trip from Huang Xingchu in Guangdong province to visit family in Hong Kong. He is considered patient zero, the index case from whom the disease spread internationally. He infects people at his hotel and members of his family. He is hospitalized and dies, as does one member of his family.

March 15, 2003 – The World Health Organization (WHO) issues an emergency travel advisory about the illness, calling it a “global threat.”

March 27, 2003 – Hong Kong officials quarantine more than 1,000 people, and schools close in Singapore.

March 29, 2003 – Dr. Carlo Urbani, the WHO physician who first identified SARS as a new disease, dies from the virus in Bangkok.

April 1, 2003 – An American Airlines flight from Tokyo is quarantined at Mineta San Jose Airport. Three passengers are transported to an area hospital for evaluation of SARS and later released.

April 4, 2003 – By executive order, President George W. Bush has SARS added to the list of communicable diseases for which a person can be quarantined.

April 14, 2003 – Working independently, American and Canadian scientists announce they have sequenced the genome of the coronavirus thought to be the cause of SARS.

April 20, 2003 – China cancels a weeklong national holiday celebration as Beijing’s SARS cases rise from 37 to 339 in less than a week. A wholesale vegetable market in Singapore closes, and all 2,400 people who work there are quarantined.

April 22, 2003 – The CDC issues a health alert for travelers in Toronto.

April 23, 2003 – Travel warnings and advisories for Shanxi province, Beijing and Toronto are increased, and those for Hong Kong and Guangdong province are extended.

April 28, 2003 – WHO lists Vietnam as the first nation to contain the SARS outbreak.

April 29, 2003 – WHO announces it will lift its SARS advisory against travel to Toronto.

May 14, 2003 – WHO removes Canada from its list of countries where local transmission of the disease is occurring.

May 23, 2003 – WHO removes its travel warnings against Hong Kong and the province of Guangdong in southern China.

May 28, 2003 – Russia confirms its first case of SARS, in a town bordering China.

May 29, 2003 – Canada has 29 active cases of SARS and more than 7,000 under home quarantine.

June 17, 2003 – The first major conference on SARS opens in Malaysia, with more than 1,000 scientists and clinicians in attendance. WHO lifts the travel advisory to Taiwan.

June 24, 2003 – WHO lifts its SARS travel advisory on Beijing.

July 2, 2003 – WHO removes Toronto from its list of areas with recent local transmission of SARS.

July 5, 2003 – WHO announces containment of SARS.

December 17, 2003 – The Taiwanese Department of Health reports a case of SARS.

January 5, 2004 – Civet cats are linked through genetic testing to the outbreak of SARS, and the Chinese Health Ministry orders the killing of thousands of the mammals. A man in Guangdong province in China has a confirmed case of SARS.

April 23, 2004 – The Chinese Health Ministry reports two confirmed cases of SARS, one in the eastern province of Anhui and the other in the capital, Beijing. Two other possible cases are being investigated.

April 25, 2004 – The Chinese Health Ministry identifies two new cases of SARS in Beijing.

April 29, 2004 – China’s Ministry of Health reports two new confirmed SARS cases in Beijing, bringing the total number of possible or confirmed cases there to nine.

April 30, 2004 – China’s Ministry of Health confirms that a woman who died the previous week in Anhui province had SARS, the first death related to the illness that year.

May 18, 2004 – The last reported outbreak of SARS is contained in China.

October 5, 2012 – The CDC’s Select Agent Program declares SARS to be a select agent, “a bacterium, virus or toxin that has the potential to pose a severe threat to public health and safety.”

December 2017 – Chinese researchers locate a population of bats in a Yunnan province cave infected with SARS-related coronaviruses. The newly discovered strains contain the genetic building blocks of the strain that triggered the SARS outbreak.

Source: CNN



History of Nutella, the Chocolate-Hazelnut Spread

Emily Mangini wrote . . . . . . . . .

Nutella’s squat, oddly shaped jar has become a culinary icon across the globe, thanks to the addictively rich and creamy chocolate-and-hazelnut spread housed within. But, while the marriage of chocolate and hazelnut may seem as natural as that of salt and pepper or bread and butter, its origin story isn’t nearly so simple. It begins with the spread’s progenitor, the chocolate-hazelnut treat called gianduia (also spelled “gianduja”).

The tale of gianduia’s birth is often splashed across product labels and woven into pop history accounts of related products, including Nutella. In large part, that’s because it’s a compelling story—one of wartime desperation, economic strife, and the triumph of one industry’s ingenuity. It starts in Turin, Italy, at the turn of the 19th century, and it’s also almost certainly rife with untruths.

What most historians can agree on is that by the early 1800s, Turin had long held the distinction of being Europe’s chocolate capital, its cacao-based products renowned as delicacies across the continent. But by 1806, its prominence was poised to collapse. Napoleon Bonaparte and his French Grande Armée were on the move, conquering Europe in the name of social enlightenment. Tensions between France and Britain had come to a head, culminating in a series of naval blockades and trade embargoes. In late fall, Napoleon enacted the Continental System, a sweeping blockade that halted all trade between the island kingdom and any country under the emperor’s thumb, including the patchwork of kingdoms and city-states that would soon be unified under the name “Italy.”

In the case of Turin, one particular change transformed its coveted chocolate industry. Britain, a dominant force in maritime trade, was a major vein in the flow of cacao between Mesoamerica and Europe; under the blockade, Turin found its main cacao source cut off.

From here, gianduia’s origin myth gets a bit more complicated. Many claim that, unable to exploit Britain’s access to cacao beans, Turin’s chocolatiers needed a quick solution to supplement their supply and stay in business. The surrounding area of Piedmont, with its abundant hazelnut trees, proved to be just the ticket. When ground up, the hazelnuts took on the texture of cocoa powder, meaning that the nuts could be used to stretch what cocoa was available into a thick, ganache-like confection. In this version of the story, Turin’s chocolatiers buoyed the local industry, harnessing their resourcefulness to create a brilliant new product—one that has persisted in popularity through the centuries.

As attractive as this narrative may be, there are reasons to call it into question. Some point out that at the time, chocolate was consumed in liquid form rather than in thick pastes or solid bars. Others argue that Turin chocolatiers would have lacked the powerful technology required to grind enough hazelnuts to make gianduia a cost-effective product on a large scale, let alone single-handedly save an entire industry.

While it’s true that chocolate was first introduced to North America and Europe as a Mesoamerican medicinal beverage, and that the cacao press—the machine that made solid chocolate readily available—wasn’t invented until 1828, there’s ample evidence that so-called “eating chocolate” was established in Europe by the mid-17th century. In The True History of Chocolate, Michael and Sophie Coe point to instances of culinary experimentation with chocolate in Italy that date back to the 1680s, and records of “eating chocolate” from 18th-century France. The Marquis de Sade, known for his love of sweets, wrote to his wife from prison in the early 1800s, imploring her to send care packages filled with chocolate treats: “…half pound boxes of chocolate pastilles, large chocolate biscuits, vanilla pastilles au chocolat, and chocolat en tablettes d’ordinaire [chocolate bars].”

But just because chocolate was available in more than just liquid form doesn’t mean that the Continental System resulted in the creation of gianduia, particularly given that virtually no primary sources link the two. More significant is the other oft-cited rebuttal to the legend—the unlikelihood that technology available at the time could churn out enough of the new confection to save the Turin chocolatiers from the effects of their dwindling cacao supply.

If gianduia wasn’t born out of necessity, then what was the catalyst for its creation? “My take on Turin and the whole kingdom of Savoy is that it was entirely under the sway of France in the 18th and 19th centuries,” says Ken Albala, historian and director of the food studies program at the University of the Pacific in California. “I wouldn’t be surprised if you find the combination [of chocolate and hazelnuts] in France before Italy.” This influence makes sense, given Napoleon’s conquest of the region, and suggests that gianduia was produced at a slow and gradually increasing rate, at least in its early years. It’s likely that chocolatiers quietly released the chocolate-hazelnut blend, and that its growth in popularity was more of a slow boil than the explosion of success suggested by the prevailing narrative.

But, of course, a story that credits an invading force for a chocolate-confection-turned-regional-gem is not nearly as stirring as one that frames the chocolatiers as ingenious victors, who persevered in their trade in spite of the odds against them. And the motivation to reshape gianduia’s narrative only grew with time.

From Puppet to Candy: Gianduia Gets a Name

By the mid-19th century, Italy was in the throes of the Risorgimento, the contentious, decades-long fight to unify the peninsula’s states into a single kingdom. Italian nationalism was reaching a fever pitch, and revolutionary movements erupted across the soon-to-be nation. In Piedmont, which had seen an anti-Austrian insurrection in 1821, the atmosphere was uniquely ripe for patriotic myth-building. And it took the form of a character named Gianduia, a wine-guzzling, tricorn-hat-wearing, womanizing peasant.

Over the course of the 19th century, Gianduia had evolved from a traditional masked character in the Italian commedia dell’arte to a puppet, and then a pervasive political cartoon. His form was paraded across newspapers as a symbol of Turin, a jovial peasant mascot of sorts who represented the Piedmontese capital.

It was at the 1865 Turin Carnival, just four years after Italy’s official unification, that Gianduia’s name first became associated with the chocolate-hazelnut confection. There, candies said to resemble Gianduia’s tricorn hat were distributed at the Carnival festivities, possibly by someone dressed as the character. Though a number of chocolate companies, most notably Caffarel, claim to have invented these confections, no evidence exists to verify their claims. What is more broadly agreed upon is that the chocolate-and-hazelnut sweets took on the name gianduiotti at roughly this point in time. Naming the candy for the city’s most ubiquitous representative cemented it as a Turinese—and now, following unification, an Italian—creation. Gianduia has since become synonymous with the combination of chocolate and hazelnut, and variations on the name are used to refer to chocolates, spreads, and other confections.

War Strikes Again, and Nutella Is Born

After 90 years of producing their treats in relative peace, the chocolatiers of Turin faced a new period of uncertainty with the onset of World War II. As with Napoleon’s blockade, the war brought with it food rationing, and the supply of cocoa was once again drastically limited. In 1946, Piedmontese pastry chef Pietro Ferrero, inspired by gianduiotti and his chocolatier forefathers, created a thick paste using hazelnuts, sugar, and what little cocoa was available. He shaped the paste into a loaf and named it “Giandujot.” But though its low proportion of expensive cocoa arose out of the cost-consciousness of the war years, Giandujot, so dense and thick that it had to be cut with a knife, was still too pricey for a mass audience.

In 1951, Ferrero revolutionized the industry with the first spreadable version of his sweet loaf: “La Supercrema.” According to a BBC interview with Ferrero’s grandson, Giovanni Ferrero, the spreadability of La Supercrema meant that “a small amount went a very long way, helping to break down the perception that chocolate was, as Giovanni puts it, ‘only for very special occasions and celebrations like Christmas and Easter.'”

The availability and affordability of La Supercrema turned the chocolate-hazelnut spread into a household staple throughout Italy. In 1961, Ferrero’s son, Michele, once again adjusted the recipe, adding palm oil and scaling it up for mass production. The new spread was rebranded as Nutella, and went on to become a common breakfast and snack item throughout Europe, touching down first in Asia and then the United States in the early 1980s. Nutella’s world domination would surely have turned Napoleon green with envy.

It’s rare that a jar of anything can embody two centuries of social, political, and historical change. But mixed with a touch of food lore under that white lid are Napoleon’s bravado (possibly, at least); the ingenuity of the old Turinese chocolatiers; and the creativity of their descendant Ferrero. Creamy, nutty, and sweet, Nutella and its chocolate-hazelnut brethren are war, progress, and industrialization. Each spoonful snuck from the jar, every dollop that drips from the folds of a warm crepe, pays homage to the events that shaped its journey. And that’s how it should be, because without those moments of strife and stress, our cupboards wouldn’t be the same.

Source: Serious Eats

Why Haven’t We Been Able to Cure Cancer?

Kent Sepkowitz wrote . . . . . . . . .

Depending on who is speaking, the war against cancer that President Richard Nixon declared nearly half a century ago has either been a soaring triumph of innovation and doggedness or a colossal failure, featuring lunkhead decisions, bottomless greed, and annoyed experts hurrying from here to there.

On the positive side are stories, seemingly every day, of breakthroughs and miracle drugs, of triumphant against-all-odds cures featuring the latest treatments, be they based on molecular targets or tricks to stoke the immune system. And national trends seem promising: Cancer mortality has decreased from about 200 deaths per 100,000 people in the 1990s to roughly 163 per 100,000 in the 2010s. Pretty good, right?
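In relative terms, the decline quoted above works out to a drop of a little under a fifth over roughly two decades:

\[
\frac{200 - 163}{200} \approx 0.185 \approx 18.5\%
\]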

Not so fast, say the doubters. After all the time, money, and scientific talent poured into the problem, this progress doesn’t amount to all that much. And plenty of the criticism comes from high up in the medical hierarchy. Twenty-six years into the war, a harsh assessment titled “Cancer Undefeated” was published in the New England Journal of Medicine, declaring it open season on any claims of victory, and the criticism has been steady ever since. Recently, Clifton Leaf echoed this dour perspective in his 2013 book, The Truth in Small Doses: Why We’re Losing the War on Cancer — and How to Win It, while the poet Anne Boyer recounted her own cancer experience (and profound disappointment in modern care) this year in The Undying.

Enter Azra Raza, a prominent cancer specialist at Columbia University. Although she doesn’t consider herself a pessimist, her new book, The First Cell: And the Human Costs of Pursuing Cancer to the Last, argues that we have wasted precious time and zillions of dollars barking up the wrong scientific tree. We are using wrongheaded experimental models (animals, cells, and the entire 20th-century repertoire of discovery), and we are giving federal grants to all the wrong ideas.

Most importantly, she argues that current cancer research is looking at the wrong end of the problem—late-stage disease, when the cancer is large and perhaps has already spread, when patients are sick and failing, when even the most wonderful new wonder drug is unlikely to work.

Better to find the cancers sooner, when the tumor burden—the actual number of cancer cells—is still small. Then therapies have a better chance of being effective: The lift is not so heavy, with a lower risk of genetic mutations that confer drug resistance or spotty penetration of medications into bulky growths. This approach—or better yet, attacking the disease when the cells show only an early itch to cause trouble—would be cheaper, less toxic, and decidedly more effective, she writes.

It’s a pretty compelling argument, one with a long history and public support. In 2016, Vice President Joe Biden endorsed the approach when he issued the Cancer Moonshot report, a national assessment of current cancer research and goals for the future. “We’re talking about prevention and early detection,” he said. “I’m convinced we can get answers and come up with game-changing treatments and get them to people who need them.”

Raza sets out to demonstrate her point and sharpen her criticism by presenting a series of patients she has treated through the years. We meet several people with difficult cancers but a lot of spunk. Each chapter leads us, not so gently, to their death. Of particular poignance is the story, woven throughout the book, of her husband, oncologist Harvey Preisler, who died of an aggressive lymphoma in 2002.

These clinical stories are recounted in vivid, precise detail, and carry a grim moral: Implicitly and often explicitly, Raza makes it clear that, in her view, a more intelligent and better organized research program and a more honest self-appraisal by the community of cancer scientists might have saved lives. “How many more Omars and Andrews will it take?” she laments, referring to two of her patients who died, diagnosed late with no good options for cure.

An experienced researcher herself, Raza knows well that real research is anything but organized. Rather it is a muddy scrum where no one really knows who is driving the pile, where motion might be from pushing or from pushing back, where real steps forward are rare and missteps plenty. Ideas are simple; humans and their biology are not.

And nowhere is the gap between our hopes and the stubbornness of reality wider than in the field of early cancer detection, the “first cell” of the book’s title. Science has been working on early detection since the Pap smear was introduced almost a century ago.

Somewhat late in the book, Raza describes the work of some of today’s leaders in the field of early diagnostics. She praises Sanjiv Sam Gambhir, chairman of the Canary Center at Stanford for Cancer Early Detection, for his work in using radiologic scans to see the first signals of cancer. She also highlights the great success of colonoscopy screening in reducing mortality from colon cancer. And she describes the enormous promise of DNA detection in the bloodstream.

Yet she avoids deep discussion of the vast amount of snake oil oozing through the field of early detection, such as the notoriously inaccurate scientific work of Elizabeth Holmes and Theranos, with their claims of a simple finger-prick diagnosis of all your worldly woes. Nor does she take on the many problems created by early detection, including the uncertainty in how best to manage unclear results. Instead, after 15 pages or so, she is back to her old tune, describing a young woman named Zaineb with a lethal cancer caught late, in a section titled, “And How Many Zainebs?”

In the end, there is a strong current of mea culpa defensiveness running through Raza’s persuasive if repetitive case for early detection; she essentially issues a blanket apology to the American public for how badly our cancer programs have failed us.

But while there is surely much to dislike about the American health care system and the medical profession as well, the fact that cancer remains an often-fatal disease isn’t merely a result of bad-faith governance or corporate avarice or individual narcissism, though there is plenty of each. Rather, we’re probably stuck where we are for a simple if overwhelming reason: As Raza herself views it, cancer is simply an impossible problem for current science to fix.

Source: Slate