Video: 3 Most Useful Kitchen Gadgets – Are They Worth It?

Do you really need a slow cooker, rice cooker, sous vide immersion circulator, and a pressure cooker all sitting around your kitchen taking up counter space? That depends on how you intend to use them, and on the chemistry of how food gets cooked.

Watch video at YouTube (3:51 minutes) . . . . .


A Pasta Dish with Smoked Trout, Eggplant and Cheese

Ingredients

1 medium eggplant, cut into 8 thick slices
olive oil
12 oz fresh lasagna sheets
6-1/2 oz smoked trout or salmon
6-1/2 oz goat cheese, cut into thick slices
1 cup sun-dried tomatoes, chopped

Sauce

3-1/2 tablespoons butter
2 tablespoons roughly chopped fresh parsley
1 cup chicken stock
3 tablespoons lime juice
cracked black pepper

Method

  1. Lightly brush the eggplant with olive oil and broil until golden brown on both sides.
  2. Cut the lasagna sheets into 12 (3-inch) squares.
  3. Cook the pasta squares in batches in a large pot of rapidly boiling water until al dente. Drain well.
  4. Place 1 sheet of lasagna on the center of each plate. Top with a folded slice of smoked trout, a slice of eggplant and a slice of goat cheese.
  5. Top with another sheet of lasagna and layer as before. Finish with a curl of smoked trout, crumbled goat cheese and a sprinkling of sun-dried tomatoes.
  6. To make the sauce, heat the butter, parsley, stock, lime juice and pepper in a pan. Drizzle over the stacks and serve immediately.

Makes 4 servings.

Source: Food Style – Pasta

Video: Why Sourdough Bread is Healthy

Watch video at Vimeo (1:50 minutes) . . . . .

The Vindication of Cheese, Butter, and Full-Fat Milk

James Hamblin wrote . . . . . . . . .

As a young child I missed a question on a psychological test: “What comes in a bottle?”

The answer was supposed to be milk. I said beer.

Milk almost always came in cartons and plastic jugs, so I was right. But this isn’t about rehashing old grudges. I barely even think about it anymore! The point is that the test was a relic of a time before me, when milk did come in bottles. It arrived on doorsteps each morning, by the hand of some vanishing man. And just as such a world was alien to me as a kid, the current generation of small children might miss a similar question: “Where does milk come from?”

Many would likely answer almonds or beans or oats.

Indeed, the already booming nut-milk industry is projected to grow another 50 percent by 2020. Much of this is driven by beliefs about health, with ads claiming “dairy free” as a virtue that resonates for nebulous reasons—many stemming from an earlier scare over saturated fat—among consumers lactose intolerant and tolerant alike. The dairy industry is now scrambling to market milk to Millennial families, as the quintessential American-heartland beverage once thought of as necessary for all aspiring, strong-boned children has become widely seen as something to be avoided.

It all happened quickly. In the 1990s, during the original “Got Milk?” campaign, it was plausible to look at a magazine, see supermodels with dairy-milk mustaches, and think little of it. Now many people would cry foul. With nut milks dominating the luxury café-grocery scenes frequented by celebrities, an image like that would surely elicit cries of disingenuousness: There’s no way you actually drink cow’s milk! And if you do, it’s probably skim or 2-percent milk, which leave no such thick mustache!

Difficult as it may be for Millennials to imagine, the average American in the 1970s drank about 30 gallons of milk a year. That’s now down to 18 gallons, according to the Department of Agriculture. And just as it appears that the long arc of American beverage consumption could bend fully away from the udder, new evidence is making it more apparent that the perceived health risks of dairy fats (which are mostly saturated) are less clear than many previously believed.

A new study this week in The American Journal of Clinical Nutrition is relevant to an ongoing vindication process for saturated fats, which turned many people away from dairy products such as whole milk, cheese, and butter in the 1980s and ’90s. An analysis of 2,907 adults found that people with higher and lower levels of dairy fats in their blood had the same rate of death during a 22-year period.

The implication is that it didn’t matter if people drank whole or skim or 2-percent milk, ate butter versus margarine, etc. The researchers concluded that dairy-fat consumption later in life “does not significantly influence total mortality.”

“I think the big news here is that even though there is this conventional wisdom that whole-fat dairy is bad for heart disease, we didn’t find that,” says Marcia de Oliveira Otto, the lead researcher of the study and an assistant professor of epidemiology, human genetics, and environmental science at the University of Texas School of Public Health. “And it’s not only us. A number of recent studies have found the same thing.”

Hers adds to the findings of prior studies that also found that limiting saturated fat is not a beneficial guideline. While much similar research has used self-reported data on how much people eat—a notoriously unreliable metric, especially for years-long studies—the current study is noteworthy for actually measuring the dairy-fat levels in the participants’ blood.

A drawback to this method, though, is that the source of the fats is unclear, so no distinction can be made between cheese, milk, yogurt, butter, etc. The people with low levels of dairy fats in their blood weren’t necessarily dairy free, but they may have been consuming low-fat dairy. All that can be said is that there was no association between dairy fats generally and mortality.

The researchers also found that certain saturated fatty acids may have specific benefits for some people. High levels of heptadecanoic acid, for example, were associated with lower rates of stroke.

De Oliveira Otto believes that this evidence is not itself a reason to eat more or less dairy. But she said it could encourage people to give priority to whole-fat dairy products over those that may be lower in fat but higher in sugar, which may be added to make up for a lack of taste or texture. She points to the classic example of chocolate milk, the low-fat varieties of which are still given to schoolchildren under the misguided belief that they are a “health food.”

The latest federal Dietary Guidelines for Americans, which guide school lunches and other programs, still recommend “fat-free or low-fat dairy.” These guidelines are issued by the U.S. Department of Agriculture, so they have long been biased toward recommending dairy consumption in a country that is rich in dairy-production infrastructure. Veganism is not encouraged given a national interest in continuing to consume the dairy the country produces. But promoting low-fat and fat-free dairy over whole milk has no such economic defense.

The takeaway is that, from a personal-health perspective, dairy products are at best fine and reasonable things to eat, and avoiding butter and cheese is less important than once believed. While the narrative that cheese and butter are dangerous is changing, it also remains true that dairy isn’t necessary for children or adults. A diet rich in high-fiber plants has more than enough protein and micronutrients to make up for a lack of dairy—and the vitamin D that’s added to milk can just as well be added to other foods, taken as a supplement, or siphoned from the sun.

Every new study that tells us more about the complexities of human nutrition and stymies efforts to fit nutrients into simple good-bad binaries should make it easier to direct our concerns productively. This study is another incremental addition to an ever-expanding body of knowledge, the point of which is that we should worry less about the harmful effects of single nutrients and more about the harms done by producing food. At this point, the clearest drawbacks to consuming animal products are not nutritional but environmental, with animal agriculture contributing to antibiotic resistance, deforestation, and climate change. While there is room for debate over the ideal amounts of saturated fat in human blood, the need to move toward an environmentally sustainable food system is unambiguous.

Source: The Atlantic

Why Alzheimer’s May Be Tougher to Spot in Women

Serena Gordon wrote . . . . . . . .

If your memory starts slipping, your gender may play a role in whether or not you are diagnosed with Alzheimer’s disease, a new study suggests.

How?

Women excel in a skill called verbal memory — the ability to learn and remember verbal information such as stories or grocery lists. At the moment, tests to detect Alzheimer’s disease rely heavily on measuring this skill, the study authors explained, which means some women may appear normal when they already have the memory-draining disease.

“About 10 percent of women originally diagnosed as normal were shown to meet the criteria for Alzheimer’s disease,” said study author Pauline Maki, a professor of psychiatry and psychology at the University of Illinois at Chicago.

“Conversely, we found that there were about 10 percent of men who were reclassified as not having Alzheimer’s disease. These findings suggest it’s important to pay attention to sex differences to improve the diagnostic accuracy for women and for men,” Maki said.

Alzheimer’s disease is a type of dementia that causes difficulties with memory, thinking and behavior. Right now, almost 6 million Americans have Alzheimer’s. By 2050, that number is expected to hit 14 million, according to the Alzheimer’s Association.

Maki said that women have a lifelong advantage in verbal memory, and that’s due to the hormone estrogen. She said when younger women must have their ovaries removed, causing a sharp decline in estrogen, there’s also a sharp decline in verbal memory.

Verbal memory is also affected during menopause, when estrogen levels drop naturally. Maki said the brain typically learns to compensate and actually produces its own estrogen, which helps to preserve verbal memory.

“That verbal memory reserve gives women an advantage in maintaining memory functioning despite having Alzheimer’s pathology in the brain,” she said, adding that the advantage can quickly turn into a disadvantage as the disease progresses.

“Women transition so much more quickly from preclinical Alzheimer’s disease to the dementia stage because they’re diagnosed later,” Maki noted.

The new study included almost 800 women. Their average age was 73 years and their average education was 15 years. Almost 950 men were also part of the study. They were slightly older, with an average age of 74, and had an average education of 16 years.

The researchers looked at the standard cut-offs in tests to diagnose Alzheimer’s disease, and compared those to gender-specific cut-offs to see if they could more accurately classify men and women with or without the disease.

And, in fact, the investigators found they did. About 1 in 10 women and 1 in 10 men were improperly diagnosed before the sex-specific test cut-offs were used.

Heather Snyder, senior director of medical and scientific operations for the Alzheimer’s Association, said, “This research suggests that using these types of tests [verbal memory] for women can mask underlying changes in biology. This might mean that you don’t detect changes as early as you might for a male.”

Snyder noted that it’s important to get the diagnosis as early as possible. It gives people a chance to think about the care they want and to take care of financial and family concerns.

She said an early diagnosis also allows people to take advantage of the medicines currently available, and to have a chance at participating in clinical trials.

The study is scheduled for presentation Monday at the Alzheimer’s Association annual meeting, in Chicago. Findings presented at meetings are typically viewed as preliminary unless published in a peer-reviewed journal.

Source: HealthDay

