Long Read: The Lie of “Expired” Food and the Disastrous Truth of America’s Food Waste Problem

Alissa Wilkinson wrote . . . . . . . . .

Maybe you know the routine. Every so often, I go through my refrigerator, check labels on the items, and throw out anything that’s a month, or a week, or maybe a few days past the date on the label. I might stop to sniff, but for my whole adult life, I’ve figured that the problem was obvious — my jam or almond milk or package of shredded Italian cheese blend had “expired” — and the fix was simple: Into the garbage it goes.

This habit is so ingrained that when I think about eating food that’s gone past its date, I get a little queasy. I’ve only had food poisoning once or twice in my life, always from restaurants, but the idea is still there in my head: past the date, food will make me sick. You’ll probably never catch me dumpster-diving.

I know, on some intellectual level, that throwing away food is probably wrong. The statistics are damning. Forty percent of food produced in America heads to the landfill or is otherwise wasted. That adds up. Every year, the average American family throws out somewhere between $1,365 and $2,275 worth of food, according to a landmark 2013 study co-authored by the Harvard Food Law and Policy Clinic and the Natural Resources Defense Council. It’s a huge economic loss for food growers and retailers, who often have to ditch weirdly shaped produce or overstocked food that didn’t sell.

Environmentally it’s bad, too. The study found that 25 percent of fresh water in the US goes toward producing food that goes uneaten, and 21 percent of input to our landfills is food, which represents a per-capita increase of 50 percent since 1974. Right now, landfills are piled high with wasted food, most of which was perfectly fine to eat — and some of which still is.

On top of this, I know that in the same country that throws away so much food, about 42 million people could be living with food insecurity and hunger. Yet state-level regulations often make it difficult to donate past-date food to food banks and other services.

America has a food waste problem. But I’ve rarely been clear on how that translates to how I actually treat the food in my refrigerator. Because what can you do, right? When the date says it’s done, it’s done, right?

Apparently, very wrong. Researchers have found that “expiration” dates — which rarely correspond to food actually expiring or spoiling — are mostly well-intentioned, but haphazard and confusing. Put another way, they’re not expiration dates at all. And the broader public’s misunderstanding about them is a major contributor to every single one of the factors I named above: wasted food, wasted revenue, wasted household income, and food insecurity.

If you’ve been throwing out food based on the freshness label, though, you’re not alone. It’s a widespread practice. Chef, journalist, and cookbook writer Tamar Adler, author of An Everlasting Meal: Cooking with Economy and Grace, explains: “In the absence of culinary information, people assume that any information they’ve been given must be the most important information.” A big part of the problem is that most of us don’t really believe we’re capable of determining if a food is good for us.

“It’s really hard to imagine you’re supposed to trust your own nose and mouth,” Adler said. “Add that to convenience culture and rapacious late-stage capitalism and, well, we’re fucked.”

The good news is that the problem wouldn’t be all that hard to fix, in the abstract. The bad news is that solving the broader system around it takes time, education, and a shift in our consumption habits. But the incentives for virtually everyone involved are high — and a good place to start is by figuring out what those labels actually mean and how to interact with them.

Everything you assume about date labels is probably wrong

There are two vital facts to know about date labels on foods in the US: They’re not standardized, and they have almost nothing to do with food safety.

Date labels first started appearing in the decades following World War II, as American consumers increasingly moved away from shopping at small grocery stores and farms and toward supermarkets, with their rows of packaged and curated options. At first, manufacturers printed a date code on cans and packages for the benefit of the grocer, so they’d have a guideline for when to rotate their stock. The label was not designed for consumers. But since shoppers wanted to buy the freshest food on the shelf, savvy folks started publishing booklets that gave a guide for deciphering the codes.

Eventually, producers — seeing that shoppers actually wanted to know what those secret dates were — started including more clearly readable dates on the packages, with month, day, and year. They saw it as a marketing boon; it was a way to attract consumers and signify that your food was fresh and flavorful. Consumers loved it, and the so-called “open date” labels became common. But there was little consistency about them.

And while the federal government made some attempts beginning in the 1970s to enact legislation that would standardize what those labels mean across the country, those attempts failed. (The exception is infant formula, for which there are strict federal guidelines.) Instead, the burden fell on state (and sometimes local) legislatures, which passed laws that varied wildly, often relying on voluntary industry standards. One state might never require labels; another may mandate that the freshness label on milk have a date of 21 days after bottling; a third may set the same date at 14 days. (In my home state of New York, there are laws about labels, but the standards don’t mention dates at all — though certainly many manufacturers still put date labels on their products, and various municipalities at times set their own guidelines.) State-to-state discrepancies can be costly for manufacturers, who have to come up with ways to produce multiple labels for multiple regions. But it’s also baffling to consumers.

The labels are inconsistent, too. What the label actually indicates varies from producer to producer. So you might have a “best by” label on one product, a “sell by” label on another, and a “best if used before” label on a third. Those have different meanings, but the average consumer may not immediately realize that, or even notice there’s a difference.

Furthermore, those dates might not even be consistent across brands of the same food product — peanut butter, say, or strawberry jam. That’s partly because they’re not really meant to indicate when a food is safest. Most packaged foods are perfectly fine for weeks or months past the date. Canned and frozen goods last for years. That package of chips you forgot about that’s a month out of date isn’t going to kill you — the chips just might be a tiny bit less crunchy than you’d like. (The huge exception is foods like deli meats and deli salads, which won’t be reheated before they’re consumed and can pick up listeria in the production process — but that’s the exception, not the rule.) You can check the freshness of an egg by placing it in a glass of water (if it sinks, it’s still good). Properly pasteurized milk, which is free of pathogens, should be fine if it tastes and smells fine. But many of us, with the best of intentions, just look at what the label says and throw out what’s old.

Is this a scam?

When I first realized that date labeling wasn’t linked directly to scientifically backed safety standards but to a more subjective, voluntary, and nebulous standard of “freshness,” I wondered if it was … well, kind of a scam. After all, customers don’t benefit from throwing out foods; grocers lose money; farmers miss out on possible sources of revenue. The only people who could benefit are the producers, and I could imagine an unscrupulous manufacturer shortening the date on their food so that people will sigh, throw out a half-eaten package that has “expired,” and go buy some more.

I asked Emily Broad Leib, the director of the Harvard Law School Food Law and Policy Clinic and lead author of the 2013 study, about this. She laughed and said I’m not the only one to wonder if we’re just getting played.

But, she said, manufacturers would say “there is a legitimate reason on their part, which is that they want you to eat things when they taste the absolute best.” The methods by which they determine that date can vary; a big manufacturer might run a focus group with consumers to determine the date, while a small producer may just hazard a guesstimate. But importantly, the freshness date almost never corresponds to the food’s safety — to whether or not it could make you sick.

Suppose you buy a particular brand of yogurt, Broad Leib says, and it waits around till it’s slightly past its peak. You might decide you don’t like this brand of yogurt, and buy a different one next time. The dates are, in part, a way of “protecting the brand,” she said. Their biggest incentive is to make sure you eat the food when it tastes the way they think it should.

But that doesn’t mean that the way we buy and eat food bears none of the blame, and producers don’t have to be insidious to be part of the issue. The fact that so many of us read a “best by” label as actually saying “bad after” is partly a public education problem, and it’s one that manufacturers haven’t worked too hard to combat. “It’s in the general interest of anybody trying to sell anything to continue to perpetuate the illusion that our foods are going bad all the time,” Adler said. “We could buy half as much food.”

Adler noted that our penchant for buying more than we need and then throwing out food that’s gone slightly past its peak is rooted, at its core, in a consumer mindset. “The only way that makes sense is if your cultural value is unfettered growth and profit at all costs,” she said. “There’s no other way that it makes sense to just throw stuff out.”

In fact, she said, it’s in direct contrast to what most food cultures practice around the world. “The whole idea that mold and bacteria are to be avoided at all costs is not only antithetical to good cooking, but it’s literally not practiced” in most cultures. Salami and cheese and pickles and sauerkraut and all kinds of food come from the natural process of aging — “in most cuisines of the world, there’s not as great a distinction between new food and old food; they’re just ingredients that you’d use differently,” she said. Those traditions certainly have been retained in regions where Americans still make kimchi and half-sours and farm cheese. But we’ve absorbed over time the idea that those natural processes are bad and will make us sick. Instead, we rely on companies to tell us what food is good for us and when to get rid of it.

Adler says part of the problem may also lie with our burgeoning “food as status performance” culture, in which particular foods trend on social media, or food media coaxes us to keep buying new ingredients to make something we saw in a picture or on TikTok. “That doesn’t do a great service for anybody trying to cook what they have,” she said. “If they don’t have the ingredients for the viral thing, then whatever they do have is just going to sit there, while they go get the other ingredients.”

Our shopping culture is also at fault

The problem is bigger than individual consumers. Some states bar grocery stores from donating or selling out-of-date foods to food banks and other services designed to help those living with food insecurity. The thinking is reasonable, even altruistic: Why would we give sub-par food to the “poor”? If I wouldn’t eat “expired” food, why would I give it to others? Distributors fear legal threats if someone eats past-dated food and becomes ill (something that has rarely happened, but it’s still a looming threat).

That’s exacerbated by the way Americans shop. Think about it: How often do you see a shelf or bin or freezer in a grocery store that isn’t fully stocked to the brim? Supermarkets stock more food than they can sell, and that’s on purpose. Broad Leib told me that it’s common practice for supermarkets to plan for “shrink” — food they expect to be wasted. Shoppers in the US look askance at a shelf that isn’t fully stocked, or at a few potatoes left in the bin. “On the consumer side, you can understand,” she said. “You want to go to a store and have them have everything you want. And if you went in and they didn’t have what you want, then you’d go somewhere else.” We may not even realize it, but we’ve trained ourselves to see full crates of beets and shelves of salad dressing as a sign that the store is good, and therefore the food in it is good. Abundance indicates quality.

But that mindset naturally, even inevitably, leads to waste. In many places, if you can’t sell all your milk by the sell-by date, you have to dump it. Consumers don’t want to buy a box of Cheez-Its that only has a week left on it. Beef that “expires” in two days is not going to fly off the shelves. And if you can’t sell all your carrots, some of those carrots are going to start getting a little bendy. And many grocery stores will only sell produce that’s up to a certain aesthetic standard — no weird-looking apples or sweet potatoes from outer space, everything the same shape and size. Furthermore, if a manufacturer changes the label on their cookie packages, all the old packages will probably just be discarded to maintain uniformity.

“Most of the decisions that are made about most of the foods that we eat are made for reasons that have nothing to do with the food’s deliciousness or its healthiness or anything intrinsic to the food,” Adler said. “The leaves on vegetables wilt before the stalk on the vegetable, so it’s much easier for grocery stores to cut off the leaves at some point in processing. Otherwise you have to be sprinkling and trimming them all the time.” So the perfectly edible leaves of some vegetables may get lost in the process as well, while they could have been used to feed people.

Some businesses have cropped up to try to fix this larger-scale problem, like Misfits Market and Imperfect Foods. They form relationships with producers to rescue aesthetically “ugly” food — or at least, food we’ve been trained to think is ugly or too small or too large — and sell it to customers. They also buy food that’s approaching its label date and resell it to customers, hoping to cut down on food waste and change the way people eat. “It’s all about breaking down misconceptions,” Imperfect Foods’ associate creative director, Reilly Brock, told me by phone. “Food is not Cinderella. It’s not going to turn back into a pumpkin by midnight if it reaches the date on the label.”

But across the country, the standard practice for your average American consumer still stands. Make a big trip to the grocery store to buy your food from the glossy displays. When food expires, throw it out. Meanwhile, farmers are plowing ugly produce back into the ground or letting it rot in the field, and stores are chucking food that’s near or past its date into the garbage because there’s nowhere else they can send it.

Can we change this?

Why doesn’t the government just fix the problem?

The follow-up data to the 2013 Harvard study found that standardizing the date labeling system across the country — rather than leaving it to more local governments to address in a scattershot fashion — could be incredibly beneficial to the economy and to consumers. Enacting standardized legislation, it estimates, could deliver an economic benefit of about $1.8 billion to the US. What’s more, an estimated 398,000 tons of food waste would be diverted to actually feed people, instead of sitting in landfills.

But fixing it has proven harder. Since the 1970s, Congress has periodically introduced legislation to modernize and standardize the system, in various forms. But, as Broad Leib told me, it can be an uphill battle. “The last administration and Congress were fairly deregulatory,” she pointed out. In the years since the 2013 study, many states have passed laws to try to standardize their own dates, even if they don’t align with other states. While Broad Leib and her colleagues argue that businesses (particularly national ones) would benefit from trying to meet one federal standard rather than different standards in different states, the philosophical differences can still be tough to surmount. “When you’re in a government that’s deregulatory, even for a good regulation, they say, ‘Let industry handle it. They have a voluntary standard, and we don’t need to step in.’”

Furthermore, Congress just moves slowly. “They don’t have a lot of stand-alone small bills,” she said. “So the best hope that this has of getting enacted is hitching itself to a moving train. A lot of our work has been in saying, ‘Here are other bills that are moving along’” — like the US Farm Bill, or the Child Nutrition Act — “and here’s why date labeling fits in with them.”

Quite a bit has happened in the years since Broad Leib and her colleagues first published their study. Seeing the problem, two major associations (the Consumer Brands Association and the Food Marketing Institute) put together a working group to design a standard date label that would work for both businesses and consumers. “They came up with a ‘best if used by’ label for a quality date and ‘use by’ for a safety date,” Broad Leib told me. “And they got a bunch of their members to sign on to voluntarily shift to using those dates.” In other words, if a food won’t decrease in safety but might decrease in quality, the manufacturer would use the “best if used by” label; if it might become unsafe to eat, they’d use the “use by” label. That system corresponds roughly to a standard used in many other countries.

This could make it easier for the federal government to act, she says. “If Congress wanted to act, or the FDA or USDA wanted to act, it would be very easy to say, ‘Here’s what the standard label should be. We have some data on what works for consumers. And we know that these work for industry.’” But otherwise, she calls the new label standard more of a “halfway solution,” since the label still will only appear on some products.

It’s more than laws. The culture needs to change.

And until there’s a better solution, the best thing we can do is try to educate ourselves and change the way we shop for food.

Broad Leib says there would be three big components to improving the system as it stands. First, the adoption of standard labels that indicate either a freshness date or a risk date would help.

But the second part is just as important: We need a public health program to educate people about what’s safe to eat. The UK has done a series of campaigns toward that end, with the slogan “Look, Smell, Taste, Don’t Waste,” in which it partnered with industry to help people understand when to keep their food and when to toss it.

The third component would be changing the way we allow food to be donated and distributed through food banks and other means. That requires a shift in how we think. If everyone is eating food past its “freshness” date — understanding that the food is perfectly safe but may not be at its absolute peak condition — then there will be less hesitancy about giving that food away, and less fear about the possibility of facing legal repercussions. That could have a huge impact on hunger and food insecurity in the US. “If everyone acknowledges that those foods are fine to eat, and everyone’s eating them, it’s not like, ‘Past-dated food is only for people who can’t afford food,’” Broad Leib said. “No, we should all be eating that.”

But that means we each need to rethink how we interact with food. We need to start trusting our senses to tell us if food is edible. “Use your sense organs,” Adler said. “We have them so that we can figure out whether things in the world are going to kill us, so we can make sure we’re not going to poison ourselves and die — and it’s even worth doing when you suspect something is bad, because feeling your body’s response is so reassuring.”

We need to ask for more clear labels, advocate for better legislation, and talk to one another about what labels really mean. And we need to move closer to food again, thinking of it less as a packaged consumer product and more as something natural that nourishes us as humans.

And in my case, that means I’m going to start sniffing what’s in my refrigerator before I chuck it — and maybe even turning it into lunch.

Source: Vox

Chart of the Day: Should COVID-19 Vaccination be Mandatory?

Source: Statista

SARS-CoV-2 Viral Variants — Tackling a Moving Target

John R. Mascola, Barney S. Graham, Anthony S. Fauci wrote . . . . . . . . .

In this issue of JAMA, Zhang and colleagues report the emergence of a novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variant in Southern California that accounted for 44% (37 of 85) of samples collected and studied in January 2021. The terminology of viral variation can be confusing because the media and even scientific communications often use the terms variant, strain, and lineage interchangeably. The terminology reflects the basic replication biology of RNA viruses that results in the introduction of mutations throughout the viral genome. When specific mutations, or sets of mutations, are selected through numerous rounds of viral replication, a new variant can emerge. If the sequence variation produces a virus with distinctly different phenotypic characteristics, the variant is co-termed a strain. When through genetic sequencing and phylogenetic analysis a new variant is detected as a distinct branch on a phylogenetic tree, a new lineage is born.

New variants become predominant through a process of evolutionary selection that is not well understood. Once identified, several questions arise regarding the potential clinical consequences of a new variant: Is it more readily transmitted; is it more virulent or pathogenic; and can it evade immunity induced by vaccination or prior infection? For these reasons, new viral variants are studied, leading to the terms variant under investigation or variant of concern.

To communicate effectively about new SARS-CoV-2 variants, a common nomenclature is needed, which like the virus, is evolving. Fortunately, the World Health Organization (WHO) is working on a systematic nomenclature that does not require a geographic reference, since viral variants can spread rapidly and globally. Currently, the terminology is overlapping, as reflected in the report by Zhang et al.1 This new variant (CAL.20C) is termed lineage 20C/S:452R in Nextstrain nomenclature,2 referring to the parent clade 20C and spike alteration 452R. Similarly, using a distinct PANGO nomenclature,3 this variant derives from lineage B (B.1.429 and B.1.427). While alterations in any viral genes can have implications for pathogenesis, those arising in the spike protein that mediates viral entry into host cells and is a key target of vaccines and monoclonal antibodies are of particular interest. The new variant, identified in California and termed 20C/S:452R, has 3 amino acid changes in the spike protein, represented using the single-letter amino acid nomenclature: S13I, W152C, and L452R. To interpret this new set of alterations, it is useful to review what is known about recent variants that have become predominant in other regions of the world.

During the early phase of the SARS-CoV-2 pandemic, there were only modest levels of genetic evolution; however, more recent information indicates that even a single amino acid substitution can have biological implications. Starting in April 2020, the original SARS-CoV-2 strain was replaced in many regions of the world by a variant called D614G, which was subsequently shown to increase the efficiency of viral replication in humans and was more transmissible in animal models.4-6 The D614G strain appears to position its receptor binding domain to interact more efficiently with the ACE2 receptor, and it is associated with higher nasopharyngeal viral RNA loads, which may explain its rise to dominance.

In October 2020, sequencing analysis in the UK detected an emerging variant, later termed B.1.1.7 or 20I/501Y.V1, which is now present and rapidly spreading in many countries.7 B.1.1.7 contains 8 mutations in the spike protein and maintains the D614G mutation. One of these, N501Y, appears to further increase the spike protein interaction with the ACE2 receptor. Epidemiological studies indicate that the B.1.1.7/20I/501Y.V1 strain is 30% to 80% more effectively transmitted and results in higher nasopharyngeal viral loads than the wild-type strain of SARS-CoV-2. Also of concern are retrospective observational studies suggesting an approximately 30% increased risk of death associated with this variant.8

Another notable variant, 20H/501Y.V2 or B.1.351, was first identified in South Africa, where it has rapidly become the predominant strain.9 Cases attributed to this strain have been detected in multiple countries outside of South Africa, including recent cases in the US. B.1.351 shares the D614G and N501Y mutations with B.1.1.7; thus, it is thought to also have a high potential for transmission. There are no data yet to suggest an increased risk of death due to this variant. Importantly, this constellation of mutations—9 total in the spike protein—adds yet another dimension of concern. B.1.351 strains are less effectively neutralized by convalescent plasma from patients with coronavirus disease 2019 (COVID-19) and by sera from those vaccinated with several vaccines in development.10-12 The decrement in neutralization can be more than 10-fold with convalescent plasma and averages 5- to 6-fold less with sera from vaccinated individuals. Fortunately, neutralization titers induced by vaccination are high, and even with a 6-fold decrease, serum can still effectively neutralize the virus.
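The fold-reduction arithmetic above can be made concrete with a short sketch. Note that the baseline titer and protective threshold below are hypothetical assumptions chosen for illustration — the article reports fold-changes, not absolute titer values or cutoffs:

```python
# Illustrative arithmetic only. A neutralization titer of 1:1200 and a
# protective cutoff of 1:100 are hypothetical numbers; the point is that
# a high starting titer can absorb a 6-fold antigenic reduction and
# still clear a plausible threshold, while a low one cannot.

def residual_titer(baseline_titer: float, fold_reduction: float) -> float:
    """Titer remaining after an antigenic fold-reduction in neutralization."""
    return baseline_titer / fold_reduction

baseline = 1200.0   # hypothetical post-vaccination titer (1:1200)
threshold = 100.0   # hypothetical protective cutoff (1:100)

for fold in (1, 6, 10):
    remaining = residual_titer(baseline, fold)
    status = "above" if remaining >= threshold else "below"
    print(f"{fold:>2}-fold reduction -> titer 1:{remaining:.0f} ({status} threshold)")
```

Under these assumed numbers, even the 10-fold drop reported for convalescent plasma leaves the serum above the illustrative cutoff — which is the qualitative argument the authors make for why vaccines may retain protection against severe disease.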

Nonetheless, these data are concerning because they indicate that viral variation can result in antigenic changes that alter antibody-mediated immunity. This is highlighted by in vitro studies showing the B.1.351 strain to be partially or fully resistant to neutralization by certain monoclonal antibodies, including some authorized for therapeutic use in the US.12 The prevalent strains in the US appear to remain sensitive to therapeutic monoclonal antibodies; however, recent evolutionary history raises the concern that the virus could be only a few mutations away from more substantive resistance.

COVID-19 vaccine development has been an extraordinary success; however, it is unclear how effective these vaccines will be against the new variants. The interim data from 2 randomized placebo-controlled vaccine studies, the Ad26.COV2.S vaccine from Janssen and a recombinant protein vaccine from Novavax, offer some insight. The Janssen study included sites in the US, Brazil, and South Africa with efficacy against COVID-19 at 72%, 66%, and 57%, respectively.13 Novavax reported efficacy from studies in the UK and South Africa with overall efficacy of 89% and 60%, respectively.14 Viral sequence data from infected patients showed that the B.1.351 strain was responsible for the majority of infections in South Africa. Lower vaccine efficacy in the South Africa cohort could be related to antigenic variation or to geographic or population differences. Despite the reduced efficacy, the Ad26.COV2.S vaccine was 85% effective overall in preventing severe COVID-19, and protection was similar in all regions.

These data suggest that current vaccines could retain the ability to prevent hospitalizations and deaths, even in the face of decreased overall efficacy due to antigenic variation. It is unclear whether changes in vaccine composition will be needed to effectively control the COVID-19 pandemic; however, it is prudent to be prepared. Some companies have indicated plans to manufacture and test vaccines based on emerging variants, and such studies will provide important information on the potential to broaden the immune response.

The recognition of a novel emergent variant, 20C/S:452R, in the most populous US state necessitates further investigation into its implications for enhanced transmission. In particular, the L452R mutation in the spike protein could affect the binding of certain therapeutic monoclonal antibodies. The emergence of this and other new variants is likely to be a common occurrence until the spread of this virus is reduced. This emphasizes the importance of a global approach to surveillance, tracking, and vaccine deployment. The approach should be systematic and include in vitro assessment of sensitivity to neutralization by monoclonal antibodies and vaccinee sera, vaccine protection of animals against challenge with new strains, and field data defining viral sequences from breakthrough infections in vaccinees. The infrastructure and process used for tracking and updating influenza vaccines could be used to inform that effort. Finally, SARS-CoV-2 will be with the global population for some time and has clearly shown its tendency toward rapid antigenic variation, providing a “wake-up call” that a sustained effort to develop a pan-SARS-CoV-2 vaccine is warranted.

Source: JAMA Network

The Restaurant Before and After COVID-19

Aaron Timms wrote . . . . . . . . .

Have you eaten here before? Well, have you? Since I can tell from your hesitation that you haven’t, I can’t help but wonder: what are you doing here?

Generous as I am, I will guide you. Tonight’s meal will take the form of a series of liquid and solid comestibles served in vessels made sometimes of ceramic, sometimes of wood, sometimes of glass. Ingredients are all seasonal and humanely produced, in the modern tradition, and our chef is an explosive asshole of moderate talent whose best ideas have been stolen from less well-connected cooks with inferior financial backing, as is also traditional. Dishes will appear as they’re prepared, according to an ancient and mysterious protocol known as “making food to order in a restaurant.” Service will be obsequious to the point of viscosity; the waiters will laugh at even your most painful attempts at conversation, mostly as a means of securing a large tip, their only balm for employment in a service economy that regards them as less than human. Despite this it will be impossible to understand the descriptions of any of the food they place on your table, which will emerge less as words than as a series of tuneful sighs.

Dishes will include an amuse-bouche of insipid “soup” in an espresso cup; a signature dish that’s been on the menu for too long; an innovative “take” on an “ethnic” foodstuff spuriously linked to the chef’s bio; a palate cleanser that’s unintentionally the most satisfying part of the meal; and a dessert that’s designed to be witty but just tastes like dessert. Some dishes will arrive on moving vehicles. Others will include strobe lighting. Allergies will be tolerated, but only with contempt. You will be charged for drinking water. Nothing will meet the expectations you have created by feverishly consuming online reviews in the days leading up to this meal. Any questions before we start?

WHAT WAS THE RESTAURANT? To put the question in the past tense implies that it’s no longer possible to ask what the restaurant is. In time that may come to seem a ridiculous position; in many parts of the world outside the United States, where restaurants are holding firm in the face of the coronavirus, it already seems moot. But for now, anyone walking the center of any major city in this country would find it difficult to dispute that the American restaurant as we once knew it is an artifact of history. The US hospitality industry lost almost 5.5 million jobs in a single month at the start of the pandemic; 2 million more people are currently unemployed than were pre-COVID. In New York, over a thousand restaurants have perished since March. This massacre has disproportionately affected workers of color, who made up more than three-quarters of the city’s pre-COVID restaurant labor force, and the working poor, which is the only way to describe the vast mass of people employed in an industry with an average salary of $33,700. Restaurants have been brought to their knees at precisely the moment when the nourishment of the country is most in peril: since the start of the pandemic the number of Americans facing food insecurity has climbed from thirty-seven million to fifty-four million.

This year of death and hunger closes a period in which the restaurant, aided by social media and its mimetic logic of aspirational consumption, enjoyed an imperial phase of growth and influence. Neither inhospitable margins nor a famously high rate of failure for new businesses had, writ large, held the restaurant back. In the decade before COVID hit, the net number of new restaurants and bars in New York alone increased by more than 7,000, to 23,650, twice the rate at which businesses citywide expanded. Since the turn of the millennium the public’s appetite for news about the restaurant industry—for openings, reviews, recipes, and anything that took us closer to the chefs and their journeys into the culinary unknown—grew even more quickly, and a powerful new institution, the “food media,” was salivated into existence. Competition quickly emerged as the food media’s dominant narrative mode, touching off an explosion in competitive cooking shows, awards, and new systems of rating and ranking that turned food into a sorting mechanism for the consumer as well. Food functioned as both a prestige destination for the idle lifestyle dollar and a kind of prize. Restaurants during this era were more than simply a growing industry; for many they became a totem, a lodestar of in-group identification, a shorthand for cultural savvy and openness to experience. The person who frequented the right restaurants was living their best life; the restaurant agnostic was a cultural heathen, left behind by the great hungry tide of progress.

Among those restaurants still standing, or huddling by the feeble warmth of their outdoor heat lamps, forbearance rules the day, since survival through the pandemic is both an exercise of endurance and a cry for financial mercy—mercy that, whether in the form of further state assistance or rent forgiveness, increasingly looks as though it will arrive too late, or never. The shells that pass for restaurants today, grimly subsisting on trickles of revenue from sidewalk seating in the hope that salvation-by-vaccine might arrive by the coming spring, look nothing like the teeming, raucous places that used to line many communities’ streets. Their interiors empty or rearranged at quarter capacity, these half-restaurants present 2020 as a conclusive hypovolemic shock to an urban patient that was already, in many cases, losing blood fast. In the laughter of today’s outdoor brunchers, gamely throwing back mimosas in pustular sidewalk bubbles and military-style dining tents hastily assembled over street gutters—some of these structures so secure they’re more indoor than the indoors could ever be—there’s a bleak determination to prove that despite the cold, despite everything, the show must go on. But it’s difficult to avoid the uncomfortable sense that these people are simply holdouts, and that the restaurant as we have come to know it will not survive the many months—six? twelve? who knows?—that the pandemic still has to run. In a country such as this one, so reluctant to redistribute in any direction other than upwards, the pandemic represents not a state of exception but a state of extension, an acceleration of the economy’s guiding inhumanities. What may be the restaurant’s final chapter was already prefigured by the excesses of its recent past. The story is not uniformly pretty.

The first restaurants date to mid-18th-century Paris, but it wasn’t until the early 19th century that the modern restaurant, born of an alliance between capital, power, and the press, was set on its course. In the aftermath of the Revolution, French culinarians—responding to directives from Napoleonic state censors to suppress coverage of politics and instead promote articles about “pleasure”—remapped France and Paris as geographies of edible abundance. Charles Louis Cadet de Gassicourt’s 1809 Course in Gastronomy presented a “gastronomic map of France” that redrew the country as a place of indulgence, not political commitment: cattle, ducks, and hare now populated the Vendée and Versailles, rather than royalists and counter-revolutionaries. The restaurant—at that point a still-new institution whose primary points of distinction from the inns, taverns, coffee houses, and tables d’hôte that also offered food to the public were individual seating and individual choice in each diner’s selection of dishes off a menu—figured prominently in this depoliticizing initiative. In a Paris awash with “new fortunes,” Alexandre Balthazar Laurent Grimod de la Reynière promised that his Gourmands’ Almanac, the first volume of which appeared in 1803, would guide the epicurean middle classes through the “labyrinth” of new food shops, restaurants, and caterers that had emerged in the previous three decades, while specifically avoiding all discussion of governance: “For the last 15 years we have spoken about politics too much, and things have only started going well for France since we left the trouble of government to true statesmen,” he wrote.

[ . . . . . . . ]

Read more at n+1 . . . . .

Japanese Mayor: Men Should Do Shopping Since Women are ‘Indecisive and Take Forever’

The mayor of Osaka, Japan, has come under fire for suggesting that men should do the grocery shopping during the coronavirus outbreak because women are indecisive and “take a long time”.

Japan is under a state of emergency over the pandemic, and residents in some areas have been asked to shop less frequently and only send one family member out to get supplies to limit contact.

Osaka Mayor Ichiro Matsui told reporters on Thursday (Apr 23) that men should be entrusted with grocery runs because women “take a long time as they browse around and hesitate about this and that”, Kyodo news agency reported.

“Men can snap up things they are told (to buy) and go, so I think it’s good that they go shopping, avoiding human contact,” the 56-year-old added.

When challenged by a reporter, he acknowledged that his remarks might be viewed as out of touch, but said they were true in his own family.

But online he was roundly condemned, with one Twitter user accusing him of being “disrespectful to women and men”.

Another dubbed his comment “full of prejudice against women”, adding “there are indecisive men and nimble and sharp women”.

“Does he think (shoppers) like to take time?” added a third. “They are thinking about menus and prices.”

But there was some support for the mayor.

“That’s right. Elderly women, in particular, are always chatting away, unconcerned about shopping,” wrote one user.

Despite its highly educated female population, Japan ranked 121 out of 153 countries in the World Economic Forum’s 2020 gender gap index, primarily because of its poor showing in political representation.

Traditional gender roles are still deeply rooted in Japanese society and women are often still expected to take primary responsibility for childcare and domestic chores, even while holding down professional jobs.

Source: CNA


Read also at BBC:

Coronavirus: Malaysian men in shopping muddle amid lockdown . . . . .