Explainer: What is a Second Wave of a Pandemic, and Has It Arrived in the U.S.?

Julie Steenhuysen wrote . . . . . . . . .

Infectious disease experts, economists and politicians have raised concerns about a second wave of coronavirus infections in the United States that could worsen in the coming months.

But some, including Dr. Anthony Fauci, the U.S. government’s top infectious disease expert, said it is too soon to discuss a second wave when the United States has never emerged from a first wave in which more than 120,000 people have died and more than 2.3 million Americans have had confirmed infections with the novel coronavirus.

Here is an explanation of what is meant by a second wave.


In infectious disease parlance, waves of infection describe the curve of an outbreak, reflecting a rise and fall in the number of cases. With viral infections such as influenza or the common cold, cases typically crest in the cold winter months and recede as warmer weather reappears.

Fears about a second wave of COVID-19, the respiratory disease caused by the coronavirus, stem in part from the trajectory of the 1918-1919 Spanish flu pandemic that infected 500 million people worldwide and killed an estimated 20 to 50 million people. The virus first appeared in the spring of 1918 but appears to have mutated when it surged again in the fall, making for a deadlier second wave.

“It came back roaring and was much worse,” epidemiologist Dr. William Hanage of Harvard University’s T.H. Chan School of Public Health said.

Epidemiologists said there is no formal definition of a second wave, but they know it when they see it.

“It’s often quite clear. You’ll see a rise involving a second group of people after infections in a first group have diminished,” epidemiologist Dr. Jessica Justman of Columbia University’s Mailman School of Public Health said.

U.S. COVID-19 cases spiked in March and April and then edged downward in response to social-distancing policies aimed at slowing the transmission of the virus from person to person. But unlike several countries in Europe and Asia, the United States never experienced a dramatic drop in cases marking the clear end of a first wave. There is now a plateau of about 20,000 U.S. cases daily.

“You can’t talk about a second wave in the summer because we’re still in the first wave. We want to get that first wave down. Then we’ll see if we can keep it there,” Fauci told the Washington Post last week.

The easing of social-distancing mandates in numerous U.S. states in recent weeks, as businesses have reopened, has caused infections to accelerate.


To many epidemiologists, it is a matter of semantics.

“Do you want to call it an extension of the first wave or a second wave superimposed on the first? You could argue it either way,” Justman said.

Dr. Eric Toner, a senior scientist at the Johns Hopkins Center for Health Security, said he does not find “waves” to be an especially useful term in describing a pandemic.

“When you’re underwater, it’s hard to tell how many waves are passing over your head,” Toner said.

Toner said current increases in U.S. cases have less to do with the virus and more to do with people’s behavior.

“The virus isn’t going away and coming back. The virus is still here. It’s up in some places and down in others,” Toner said.


Vice President Mike Pence last week wrote an opinion piece in the Wall Street Journal trying to ease concerns over a second wave of U.S. cases. White House economic adviser Larry Kudlow said on Monday that a “second wave” is not coming.

Dr. Theo Vos of the University of Washington’s Institute for Health Metrics and Evaluation called those assurances “wishful thinking.”

Based on global models, his group has predicted that the coronavirus will surge in the fall as colder temperatures arrive in the United States.

“It’s likely to start picking up in October,” Vos said, with increased cases hitting in November, December and January.

Source: Reuters

Blackened Catfish


2 teaspoons paprika
1 teaspoon cayenne pepper
1 teaspoon garlic powder
1 teaspoon dried thyme
teaspoon salt
1 teaspoon finely grated lemon zest
1 tablespoon olive oil
4 catfish fillets, about 4 oz each
1 tablespoon lemon juice


  1. In a large shallow dish, combine the paprika, cayenne, garlic powder, thyme, salt and lemon zest. Add the catfish fillets and turn to coat both sides, pressing the herb-and-spice rub into the fish.
  2. In a large nonstick frying pan over medium-high heat, heat the oil.
  3. Add the fish and cook for 4 minutes. Pour the lemon juice over the fillets, turn and cook until the fish just separates when pressed with a fork, about 4 minutes more.
  4. To serve, divide among 4 individual plates.

Makes 4 servings.

Source: Cooking for Healthy Living

The History of Popcorn: How One Grain Became a Staple Snack

Michelle Delgado wrote . . . . . . . . .

Though their big screens are dark, the smell of popcorn, hot and freshly popped, still wafts out of some movie theaters. Closed because of COVID-19, theaters large and small are trying to stay afloat on the sale of popcorn and other snacks. “We had our doors closed and no income coming in,” Dave Loomos, co-owner of the 92-year-old Pickwick Theater in Park Ridge, Illinois, told a local radio station earlier this month. “We decided to do curbside popcorn pick-up to see how it would go, and we’ve been doing that for the past couple of weeks and it seems like it’s well-received…”

When popcorn was first sold inside movie theaters, almost 100 years ago, it actually helped buoy the business, which was flailing at the time as the country entered the Great Depression. Always an affordable treat, popcorn today is tinged with nostalgia. For many Americans, the aroma alone triggers happy memories of going to the movies, of waiting in line to see a new release with friends and family.

With movie nights happening at home now, this April, popcorn flew off grocery store shelves, resulting in sales that were more than 30 percent higher than the previous year’s, according to data from Nielsen. But this isn’t the first time Americans fell in love with popcorn—and it won’t be the last.

The First Popped Corn

Long before boxes of Pop Secret lined grocery store shelves, corn began as a wild grass called teosinte in southwestern Mexico, according to research compiled by Mexico’s National Institute of Anthropology and History. Corn was probably cultivated as a domesticated crop around 9,000 years ago, but it wasn’t until 2012 that archaeologists unearthed the first evidence of popcorn in Peru: 6,700-year-old corn cobs studded with puffed kernels.

Thanks to its versatility, nutrition, and possibly the fact that dried kernels were popped and “easily consumable with the simplest of technologies: fire,” according to Michael Blake in Maize for the Gods, there’s evidence that the nimble grain was grown and consumed all over Mesoamerica, South America, and North America.

“If tribes didn’t grow the corn, they perhaps traded that corn,” says Lois Frank, a New Mexico-based chef, author, historian, and expert in Native American foodways, who explains that a vast network of trade routes once criss-crossed the continents. Though corn wasn’t the only foodstuff that was traded, it—including the popped variety—was an essential part of the cuisine of many of these early cultures.

Early popcorn probably resembled parched corn, which is made by cooking dried kernels, often in a frying pan. (Because parched corn typically uses kernels with lower water content, curbing its ability to pop, it’s considered a predecessor of CornNuts.) “Parched corn is much crunchier,” Frank says. “We know that in the early Southwest, there was popcorn—it just wasn’t a Jiffy Pop that you’d put in your microwave.”

The fluffy popcorn we know and love today is, in part, the result of thousands of years of careful cultivation of a few different strains of corn by those early tribes. Modern processing techniques ensure its dramatic cooking process: Corn for popping is grown, cured on the stalk, picked, and then dried until each kernel contains around 14 percent moisture, according to the USDA. When exposed to heat, that moisture expands, causing the kernel to burst into the final product. (For more on the science of popped corn, see this guide to making the best popcorn at home.)

From Farms to Fairgrounds

Early American settlers adopted corn, including popcorn, and learned to grow and cultivate it, ensuring it stayed in the diet of hundreds of thousands of people for the next several centuries. In the mid-1800s, the steel plow—which could cut through tough vegetation—transformed Midwestern agriculture. In Nebraska, Iowa, and Indiana, corn—especially the poppable variety—became such an important cash crop that it was dubbed “prairie gold.” By 1917, the region had so deeply embraced this nickname that it inspired poetry: Members of the Iowa Press and Authors’ Club collaborated to produce Prairie Gold, a volume of poems and stories that celebrated the region’s corn production.

Popcorn has long been popped in pots over a flame, but the late 19th century brought a flurry of popcorn innovation. In 1875, a Kentucky resident named Frederick J. Myers patented a corn-popping device that added a stay-cool handle. But popcorn’s real rise wouldn’t come until sellers could easily carry popping machines around with them. That happened in Chicago in 1885, when Charles Cretors invented a lightweight electric machine that popped corn in oil, allowing vendors to easily move along with crowds in search of a better profit. Eight years later, Cretors improved the model by adding a contraption that would butter and salt the popcorn, too. The first commercial popcorn brands also got their start around the same time, when Iowa’s Albert Dickinson Co., which sold kernels under the names Big Buster and Little Buster, came onto the scene in the 1880s.

Subsequent patents provide a glimpse of the popcorn problems inventors sought to solve, both decorative and gustatory. In 1892, James T. Woods of Utah applied to patent a machine that coated freshly popped corn in a sugar syrup that would help preserve the snack. The coating separated the kernels so they could be boxed or packaged without getting soggy or dusty. Around then, two brothers began to experiment with new ways to flavor popcorn. Originally from Germany, Frederick and Lewis Rueckheim sold small batches of popcorn they made with a handheld popper. In 1896, they developed a recipe that stuck: Cracker Jack, a combination of crunchy popcorn and salty peanuts coated in molasses.

The 20th century brought more popcorn patents, each aiming to improve the product or refine the tools of the trade. As the century progressed, individual vendors and commercial entities alike would build on this foundation, using technology to solidify popcorn’s status as a ubiquitous, familiar snack food that could be marketed to the masses. But back then, popcorn vendors relied on crowds at street fairs, festivals, and sporting events for all of their sales. No one expected Hollywood to change popcorn’s course forever.

Popcorn Goes to the Movies

Between 1920 and 1930, an initial wave of 20,000 movie theaters opened across America, with attendance reaching 25 million weekly movie-goers in 1925. Enterprising snack vendors took note: Those who normally camped out at sporting events or festivals began to set up shop outside of movie theaters, drawing the ire of the venues’ owners. “Many movie theaters had carpeted their lobbies with valuable rugs to emulate the grand theater lobbies,” Andrew Smith writes in Popped Culture: A Social History of Popcorn in America. In an effort to avoid sticky, greasy spills, most theaters banned snacks and soda outright.

But this ban would soon be overturned. In the late 1920s, sound—dialogue, music, and sound effects—came to the movies, and the industry experienced an enormous boom. Weekly movie-going soared to 90 million people in 1930, ushering in the Golden Age of cinema, thanks in part to the fact that illiterate Americans could finally enjoy movies, too. Unfortunately, the shift to sound caused growing pains for the industry. Small community or rural theaters shuttered, unable to afford the new technology. Movie theaters that did survive “redefined the evening from one of champagne to one of popcorn and soda,” according to sociologist Richard Butsch.

Pressure mounted as the Great Depression set in. As millions of Americans lost any sense of financial security, popcorn became their go-to “affordable luxury” at 10 cents per bag, writes Smith. Desperate to stay afloat, movie theaters finally caved and began renting portions of their lobbies to popcorn and snack vendors.

Depression-era stories of wealth amassed through popcorn sales began to flourish; they seem at least partially rooted in fact. Smith writes of an Oklahoma farmer who bought back three farms with popcorn money, and of a Dallas chain that earned $190,000 from popcorn in some locations while its snack-free locations went broke. One Kansas City vendor, Julia Braden, earned an annual income worth nearly $230,000 today after she successfully negotiated with the local theater to let her sell popcorn to movie goers.

Theaters eventually began to offer their own refreshments, marrying concessions and movie tickets once and for all. They were even willing to take losses on tickets to boost attendance, encouraging guests to spend their money on the more profitable concessions. That legacy continues today: Theaters sell popcorn at a markup between 800 and 1,500 percent, since distributors claim a substantial cut of ticket sales. As popcorn became a fixture in movie theater lobbies, its aroma became inextricably tied to the movies.

Out of the Movie Theater and into the Microwave

With popcorn sales ensured by Hollywood, the big business of popcorn moved on to targeting a home audience—particularly after Americans began watching television during the 1940s.

Though the first microwave was invented in 1946, the appliance didn’t become commonplace in American kitchens until the 1980s—a match made in heaven for popcorn, which popped just as well in microwavable packaging as it did on the stove. The microwave’s arrival coincided with a fitness boom, making popcorn the perfect relatively healthy snack for diet-conscious consumers. The first microwave popcorn was released in 1981; it contained perishable butter and required refrigeration. Another version, by Pillsbury, came frozen.

It was an undeniable hit: Within two years, microwave popcorn was available nationally and brought in $53 million in sales, according to a New York Times report. By 1984, a shelf-stable version hit stores*, and sales climbed even higher. Americans bought $250 million worth of popcorn in 1986, setting off an all-out battle between snack food companies that attempted to corner the market.

*Cultured dairy products—including butter—get their distinct flavor from two chemicals, diacetyl and acetoin. These compounds are synthesized and recombined for that “natural butter flavoring” that’s stabilized, infused into oils, and dispensed at movie theaters or used to flavor microwave popcorn.

Unfortunately for Nabisco and General Mills, one agricultural scientist had already become an unlikely popcorn king among men: Orville Redenbacher, a skinny, bespectacled man from Indiana with an immaculate suit, bow tie, and swoop of silver hair. Redenbacher was a Purdue-educated farmer who became famous for tinkering with hybrid varieties of corn. In 1965, Redenbacher and his research partner, Charlie Bowman, successfully created a kernel that would expand twice as much as the yellow corn Americans were familiar with. They called their hybrid “snowflake,” for its shape and ability to expand to up to 40 times its original size.

In 1991, Redenbacher spent his 85th birthday taping an episode of The Late Show with David Letterman. A slightly awkward guest, he touched his thick, plastic-framed glasses bashfully as the studio audience clapped. “In 1970, I hired a big firm in Chicago to come up with a name. They came up with the name ‘Orville Redenbacher’—which is the same identical name my mother thought of, 85 years ago,” Redenbacher joked, pulling out an old favorite quip. “And they charged $13,000 for the idea.”

Letterman’s assessment that Redenbacher was responsible for transforming the popcorn industry still holds true. The “snowflake” hybrid Redenbacher and Bowman developed accounted for 45 percent of the total microwave popcorn market at the time of Redenbacher’s death in 1995.

Pre-Popped Popcorn is on the Rise

Though pre-popped popcorn failed to impress movie-goers in the 1930s, today, pre-popped snacks are on the rise. During the 2000s, people began to eye microwaved popcorn with suspicion. A 2008 study found that diacetyl, a chemical used in artificial butter flavoring, was linked to Alzheimer’s and to lung damage in industrial settings; inhaled diacetyl was behind the respiratory condition dubbed “popcorn lung” contracted by microwave popcorn factory workers. Microwavable bags were also lined with perfluorooctanoic acid (PFOA), a chemical with its own health concerns. (More recently, popcorn lung has been linked to e-cigarettes.)

In 2013, AdAge reported that consumers were also growing tired of waiting for popcorn to pop. Microwave popcorn’s growth was minuscule, compared to nearly 12 percent growth—or nearly $672 million in sales—among pre-popped popcorns like Smart Food and Skinny Pop. The trend suggested that consumers wanted popcorn that was ready to eat, not a snack they had to tend to. Plus, as any college student knows, microwave popcorn has a tendency to burn, setting off fire alarms when unattended. Just this month, a brand was recalled when some of its bags began to ignite in the microwave.

Regardless of the reasons, ready-to-eat popcorn seems here to stay. In 2018, one marketing agency reported that Americans were ready to be more adventurous with their popcorn. Instead of traditional butter or salt, consumers craved popcorn that was cheesy, chocolatey, or studded with mix-ins like nutritional yeast.

Still, if you’re willing to wait a few minutes, it’s cheap and easy to make popcorn the old school way—from kernels that last months in the pantry. It’s comforting to know that with a glug of oil and a few minutes on a hot stove, freshly made popcorn—plus a new release on Netflix or Hulu—is always within reach, whether you’re stuck at home for weeks on end or not.

Source: Serious Eats

Nutrition a Key Ingredient for Cognitive Health of Midlife and Older Canadians

A new study, investigating factors associated with verbal fluency among a large sample of anglophone Canadians aged 45-85, found that individuals who consumed more vegetables and fruits and more nuts and pulses (such as lentils and beans) scored higher on tests of verbal fluency.

“These findings are consistent with other research that has found a Mediterranean diet high in fruits, vegetables, nuts, and legumes is protective against cognitive decline,” reported co-author Dr. Karen Davison, a nutrition informatics research program director at Kwantlen Polytechnic University, in British Columbia and a North American Primary Care Research Fellow. “Every increase in average daily fruit and vegetable intake was linked to higher verbal fluency scores, but the best outcomes were found among those who consumed at least 6 servings a day.”

Verbal fluency is an important measure of cognitive function. To test it, subjects are asked to list as many words from a given category as they can in one minute. This measures language and executive function and can be used to detect cognitive impairment.

Adults who have an insufficient appetite, face challenges in preparing food, or consume low-quality diets may be at risk of malnourishment, and grip strength can be used to assess under-nutrition. Those in the study who had poor grip strength and/or high nutritional risk scores also had lower verbal fluency.

“Previous research has also indicated that measures of under-nutrition are associated with cognitive decline,” said co-author Zahraa Saab, a recent Master of Public Health graduate of the University of Toronto.

The researchers investigated the relationship between other factors and cognitive health, as well, including immigrant status, age, blood pressure, obesity, and body fat.

Immigration status

Anglophone immigrants who had lived in Canada at least 20 years had higher verbal fluency scores than their Canadian-born peers. The researchers suspect that this protective effect may be partially due to better cognitive reserve among immigrants.

“Our earlier research on a big British cohort of individuals born in 1946 found that those who emigrated from the United Kingdom had, on average, 5 points higher IQ than their peers who remained in the UK,” says senior author, Esme Fuller-Thomson, professor at University of Toronto’s Factor-Inwentash Faculty of Social Work (FIFSW) and director of the Institute for Life Course & Aging. “We purposively restricted the current study to those whose mother tongue was English, so we could investigate the association between immigrant status and verbal fluency, independent of bilingualism.”

Previous research suggests that those who are bilingual have a lower incidence and delayed onset of dementia. Most of the studies finding a ‘bilingualism advantage’ have, unfortunately, neglected to account for immigration status.

“Our findings suggest that this is an important omission, because even immigrants whose mother tongue is English had significantly higher verbal fluency scores than anglophones born in Canada. Thus, the ‘bilingualism advantage’ may be at least partially attributable to the “healthy immigrant effect,” said Fuller-Thomson, who is also cross-appointed to U of T’s Department of Family and Community Medicine and the Faculty of Nursing.

Age & Education

“Consistent with other studies, those younger in age had better cognitive functioning scores when compared to older participants,” says co-author Hongmei Tong, assistant professor of Social Work at MacEwan University in Edmonton.

The association between cognitive impairment and advanced age may be mediated or moderated by cognitive reserve factors such as high educational levels, which are protective against cognitive decline.

“Respondents who were aged 75-85 with a high school degree had verbal fluency scores comparable to individuals a decade younger who had not completed high school,” says co-author Vanessa Taler, associate professor of psychology, University of Ottawa.

Blood Pressure, Obesity & Body Fat

Adults with stage 2 hypertension had lower verbal fluency scores.

“Our findings underline the importance of managing blood pressure for brain health in mid-life and beyond,” says co-author Shen (Lamson) Lin, a doctoral student at the FIFSW.

Both obesity and higher percent body fat were associated with worse verbal fluency scores.

“Obesity has been linked in other research to inflammation and to greater insulin resistance, both of which have been associated with cognitive decline,” says co-author Karen Kobayashi, professor in the Department of Sociology and a research fellow at the Institute on Aging & Lifelong Health at the University of Victoria.

The study team analyzed data from the baseline Canadian Longitudinal Study on Aging, which included 8,574 anglophone participants aged 45-85, of whom 1,126 were immigrants who had arrived in Canada 20 or more years earlier. All participants were living in the community and were free from dementia. Two verbal fluency tests were examined: the Controlled Oral Word Association Test (COWAT) and the Animal Fluency (AF) task. The article was published this month in The Journal of Nutrition, Health & Aging.

“The team’s findings suggest that it may be beneficial to design policies and health care practices to reduce nutrition risk, improve diet quality, and address obesity and hypertension among midlife and older citizens in order to improve these potentially modifiable risk factors for lower verbal fluency scores,” adds Dr. Fuller-Thomson. “The good news is that the higher levels of education obtained by baby boomers and subsequent birth cohorts may mitigate some of the cognitive decline often observed in previous generations of older adults.”

Source: EurekAlert!

Yes, Bad Sleep Does Make People Grumpy

Not getting enough sleep can kill your mood the morning after, Norwegian researchers report.

“Not in the sense that we have more negative feelings, like being down or depressed,” said lead author Ingvild Saksvik-Lehouillier of the Norwegian University of Science and Technology in Trondheim. “But participants in our study experienced a flattening of emotions when they slept less than normal. They felt less joy, enthusiasm, attention and fulfillment.”

For the study, 59 volunteers spent seven nights in their own beds and slept as long as they usually do. Next, they slept two hours less than normal for three nights.

On several mornings, participants were shown more than 300 pictures on a computer screen over 14 minutes. They were asked to tap the space bar whenever an image did not contain an “X” — a test of accuracy and responsiveness.

The reaction time was faster after the participants had been sleep deprived, “but the error rate went up,” said Saksvik-Lehouillier, an associate professor of psychology. “It seems that we react more quickly to compensate for lower concentration. Then there’ll be more mistakes.”

The takeaway: It may be smart to avoid activities that require a high level of accuracy after a night of short sleep.

Other research has found that sleep deprivation may have the same effect on driving, for instance, as alcohol does.

Participants did better and better each day they took the test after sleeping normally but scored worse each day after less sleep, the study found.

The study volunteers also answered questions about their emotions — both positive and negative.

The researchers didn’t find clear differences when it came to negative emotions, but positive feelings scored worse after just one night of reduced sleep and dropped even more after three nights.

“We already know that fewer positive emotions have a major impact on mental health. We also know that poor sleep is included in virtually all mental health diagnoses,” Saksvik-Lehouillier said in a university news release.

But not everyone needs to sleep seven or eight hours a night, she said.

“The most important thing is how you feel,” Saksvik-Lehouillier said. “If you’re in a good mood and alert when you get up, those are indications that your sleep habits are working for you.”

The findings were published recently in the journal Sleep.

Source: HealthDay