How the US Military Helped Invent Cheetos

Cheese purists the world over exalt their mummified milk. Their silken Goudas and savory Emmentalers. Their fetid fetas and squeaky queso frescos. Their moldy Roqueforts and runny Camemberts. These disks of rotted dairy are the pinnacle of thousands of years of experimentation that began when a herdsman carrying a ruminant’s stomach brimming with milk found that by journey’s end, he had a bag full of curds and whey.

Modern cheese making is a little more complicated, but the same principles apply. Fresh milk is allowed to ferment with either wild or cultured bacteria. Then, when the bacteria have raised the acidity enough, rennet—enzymes from calves’ stomachs, now largely replaced with laboratory‑produced enzymes—is added. This coagulates the caseins, which make up about 80 percent of the total milk protein, so that they form a gel.

Then there’s a lot of manipulation—cutting, stirring, and heating—that removes fluid, or whey, leaving behind solid curds. The curds are put into molds, salted or brined, and pressed, which expels more whey and turns the cheese into a solid mass. Mold may be added, either at the beginning or later in the process. Then, depending on the variety, the cheeses are matured for anywhere from two weeks to two years, allowing enzymes, both those from microbes and those from the rennet, to turn fats and proteins into tasty new substances.

Cheese is one of the bedrocks on which the Western diet is founded—a long‑term storage method for excess milk, especially when cool storerooms and caves were available. But the food didn’t fare so well during summer or in hot climates. With heat, animal fat softens or even liquefies, oozing out and creating an oily and unappealing mess.

In the early twentieth century, dairymen on either side of the Atlantic—the Swiss duo Walter Gerber and Fritz Stettler in 1911 and James Kraft in 1916—hit on and patented a solution to the seasonal sweats: emulsifying salts. These salts disperse the water‑phobic caseins by exchanging sodium for calcium, which permits the now smaller particles to be diffused and suspended in liquid. Melting traditional cheeses and mixing them with the emulsifying salts resulted in a cheese‑like product that withstood high temperatures and protracted storage.

Even better, this new food could be made and sold very cheaply, because it could be produced, at least in part, from the rinds and irregular bits left over from cutting wheels of cheese into bricks. Melting the ingredients also pasteurized them, inactivating the live bacteria and enzymes and contributing to a longer shelf life.

The army placed its first order for processed cheese—which at the beginning came in only one flavor: white—during World War I, buying twenty‑five million quarter‑pound tins from Kraft. This single act probably established Kraft’s century‑long (and still going strong) food industry hegemony. By the time World War II rolled around, the military was a raving cheeseaholic, consuming the dairy product by itself, on sandwiches, or as sauces for vegetables, potatoes, and pasta.

In 1944 alone, the Quartermaster Corps bought more than one hundred million pounds from Kraft’s parent company, National Dairy Products Corporation (which finally itself took the Kraft name in 1969), as well as five hundred thousand pounds of cheese spread (bacon bits optional) to accompany the K and some of the C rations. During the war, the company’s sales almost doubled. But it still wasn’t enough. The military was hungry for new ways to store, ship, and eat cheese.

At the beginning of the war, the army had embarked on a dehydration‑and‑compression spree—by removing heavy water and reducing food’s volume, more could be packed into a single shipment, always an advantage when there are millions of mouths to feed. All foodstuffs except meat were run through the drying chambers and squashed into bricks—fruits and vegetables, flour, potatoes, eggs, and cheese.

As would become its historic pattern, the military funded or supported a variety of efforts, some of which were destined to die a quiet death and others that would garner glory, becoming wartime staples and the basis for future consumer products. Cheese dehydration research was conducted by the Quartermaster Corps’ Subsistence Research Laboratory, through the USDA laboratories, at various universities, including the University of California at Davis, and by industry, notably Kraft.

Unless a food has a strong and flexible internal structure—think cellulose, the long chains of sugar molecules that give plant cells their rigidity—it crumbles when it dries out, breaking into particles food technologists call fines. One can imagine the first experiment in drying and pressing a proud block of Wisconsin cheddar: cheese dust. This ruled out eating reconstituted cheese out of hand in slices or chunks. But for cooking, the granular form would be an advantage.

The first real cheese powder was developed in 1943 by George Sanders, a USDA dairy scientist. (Even before the war began, USDA’s research facilities had been enlisted to work toward military goals, exhorted by Secretary of Agriculture Henry Wallace “to consider their possible contributions to national needs as the defense program approaches the stage of ‘maximum effort.’” This relationship continues to this day; the USDA has collaborated with the Quartermaster Corps and later the Natick Center on topics as varied as chemical testing, fungi collection and classification, potatoes, dairy, and, from 1980 on, operation of the army’s radiation food sterilization program.)

Until then, it had been “considered impossible to dehydrate natural, fat‑containing cheese,” because the heat melted the fat, which then separated out. Sanders’s innovation was to divide the process into two steps. In the first, the cheese, shredded or grated, was dried at a low temperature; this hardened the surface proteins of the particles, forming a protective barrier around the lipids. Once sufficient water had been evaporated, the cheese was ground and dehydrated at a higher temperature. The final step was to form it into what the patent describes as cakes. A 1943 war bond ad unveiled the product to the public with a picture of a bare‑chested soldier feeding a second soldier bundled up in a parka with a cake of cheese on a pointy stick:

For jungle or ski troops—a new kind of cheese! . . . But they should taste the same—and taste good—wherever they’re eaten. That has meant many headaches for the Army Quartermaster Corps and the food processors who supply them. . . . For emergency use in arctic and tropics, National Dairy laboratories developed a dehydrated, compressed cheese that keeps well anywhere and takes less shipping weight and space.

In the summer of 1945, Little Boy and Fat Man were detonated in Japan, ending the war and leaving the Quartermaster Corps with warehouses full of food as well as an elaborate manufacturing and distribution system still churning out goods for millions of troops. This would take years to redirect or dismantle. Fearful of the effect of the sudden withdrawal of its huge wartime contracts, the government propped up dairy businesses first by buying their excess product and then, in some cases, by selling it back to them at lower prices. (The Commodity Credit Corporation, created during the Great Depression and still in existence, would later distribute these surpluses to welfare recipients and the elderly—the storied “government cheese.”) A temporary federal agency, the Surplus Property Administration, sold off at bargain‑basement prices the food the Quartermaster Corps had amassed.

Who doesn’t love something they get for free or at a third of the original cost? But what could one do with football fields full of potato flakes, a cave stuffed with dried eggs (the army’s strange storage location for one hundred million pounds of the stuff), or a mountain of dehydrated cheese?

Well, there was one group always interested in lowering the cost of finicky fresh ingredients: the grocery manufacturers, businesses such as Swift, Quaker Oats, General Foods, General Mills, Libby’s, Borden, McCormick, Colgate‑Palmolive, Gerber, Scott Paper, Kellogg’s, Pillsbury, and Kraft. (The strength of the companies that produced the packaged goods that lined the nation’s nascent supermarkets, many with deep military ties, only grew over the next century, as did that of their trade group, the Grocery Manufacturers Association, today the food industry’s most powerful lobbying organization.)

Perhaps instead of real cheese, the food corporations could mix in the cheap powder to add flavor. Not only would they save outright on the cost of ingredients, they’d pay a lot less to ship and store it—after all, that was the army’s primary purpose in developing dehydrated cheese in the first place. These ration conversions inspired a flood of fledgling products, particularly in the new and growing categories of convenience and snack foods.

In 1948 the Frito Company (it merged with H. W. Lay & Company in 1961 to become Frito‑Lay, Inc.) debuted the country’s first cheesy snack food, made with the same Wisconsin cheddar the army used for its dehydrated products. Frito Company founder Charles Doolin had been a military supplier, even building a facility in San Diego, where there is a naval base, to service his contracts.

According to his daughter Kaleta Doolin, “During the war, tins of chips were sent overseas to be served in mess halls and sold in PXs. This venture helped put the company over the top as a nationwide business.” Afterward, new plants were opened in Dallas, Los Angeles, and Salt Lake City, where soon cornmeal and water were being extruded, puffed, fried in oil, and coated with finger‑licking, orange dehydrated cheese. Cheetos!

Excerpted from Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat.

Source: WIRED


Peanut Butter and Jelly Sandwich – The Combination Is Delicious and Original

Ken Albala wrote . . . . . .

While the peanut butter and jelly sandwich eventually became a staple of elementary school cafeterias, it actually has upper-crust origins.

In the late-19th century, at elegant ladies’ luncheons, a popular snack was small, crustless tea sandwiches with butter and cucumber, cold cuts or cheese. Around this time, health food advocates like John Harvey Kellogg started promoting peanut products as a replacement for animal-based foods (butter included). So for a vegetarian option at these luncheons, peanut butter simply replaced regular butter.

One of the earliest known recipes that suggested including jelly with peanut butter appeared in a 1901 issue of the Boston Cooking School Magazine.

“For variety,” author Julia Davis Chandler wrote, “some day try making little sandwiches, or bread fingers, of three very thin layers of bread and two of filling, one of peanut paste, whatever brand you prefer, and currant or crabapple jelly for the other. The combination is delicious, and so far as I know original.”

The sandwich moved from garden parties to lunchboxes in the 1920s, when peanut butter started to be mass produced with hydrogenated vegetable oil and sugar. Marketers of the Skippy brand targeted children as a potential new audience, and thus the association with school lunches was forged.

The classic version of the sandwich is made with soft, sliced white bread, creamy or chunky peanut butter and jelly. Outside of the United States, the peanut butter and jelly sandwich is rare – much of the world views the combination as repulsive.

These days, many try to avoid white bread and hydrogenated fats. Nonetheless, the sandwich has a nostalgic appeal for many Americans, and recipes for high-end versions – with freshly ground peanuts, artisanal bread or unusual jams – now circulate on the web.

Source: The Conversation

What Did 17th Century Food Taste Like?

From Res Obscura . . . . . . .

As the official portraitist for the Spanish monarchy at the height of its glory, Diego Velázquez painted queens, emperors, and gods. But one of his most famous paintings is a window into a much humbler world. A woman is frying eggs in hot oil, ready to scoop them out with a simple wooden spoon. Behind her, a servant boy carries a half-full jug of wine and a melon tied up in a loop of twine.

This painting is the type of thing historians love: a profoundly talented artist with a knack for realism, choosing the type of subject matter that is so normal that it rarely gets preserved. (The same is true today—how many contemporary painters choose to depict taquerias or bagel shops?) Scholars suspect that Velázquez’s own family members may have served as models in his early paintings. It’s possible that the woman in this painting numbered among them, since she also appears in a religious painting he produced in the same year.

But this post is not about Velázquez. It’s not even about art history. It’s about food.

What can we learn about how people ate in the seventeenth century? And even if we can piece together historical recipes, can we ever really know what their food tasted like?

This might seem like a relatively unimportant question. For one thing, the senses of other people are always going to be, at some level, unknowable, because they are so deeply subjective. Not only can I not know what Velázquez’s fried eggs tasted like four hundred years ago, I arguably can’t know what my neighbor’s taste like today. And why does the question matter, anyway? A very clear case can be made for the importance of the history of medicine and disease, or the histories of slavery, global commerce, warfare, and social change.

By comparison, the taste of food doesn’t seem to have the same stature. Fried eggs don’t change the course of history.

But taste does change history.

One example, chosen at random: the Mexican chili peppers hiding in the bottom edges of both paintings.

The pepper family (genus Capsicum) is native to the Americas, and it was still a relatively new arrival in the cuisines of Asia, Africa, and Europe when Velázquez was alive. Since he was a non-elite person born in 1599, we can guess that his grandparents would not have been familiar with the taste of peppers and that his parents still thought of them as an exotic plant from across the seas. Even the name he, and we, apply to the plant was a foreign import: the word ‘chili’ is from Nahuatl, the language of the Aztecs. So are ‘avocado’ (Nahuatl ahuacatl), ‘tomato’ (tomatl), and ‘chocolate’ (chocolatl).

The taste for these foods was a significant factor in the series of global ecological movements between the Old and New Worlds that historians call the Columbian Exchange. Any time we eat kimchi, or kung pao chicken, or pasta with red sauce, we are eating foods that are direct results of the Columbian Exchange.

Someone really needs to make a better map of the Columbian Exchange. The best I could find, from a public-domain resource for teachers at UT Austin, doesn’t come close to capturing the full range of exchanges.

But we’re also eating modern foods. That’s not to say that there aren’t older correlates to these dishes—there undoubtedly are. But food has changed since the early modern period. Globalization of food crops has transformed the flavors of regional cuisines. Meanwhile, factory farming has led to a homogenization of some of the varietals available to us, while also creating a huge variety of new strains and hybrids.

One example: I didn’t realize until recently that broccoli, Brussels sprouts, cauliflower, kale, cabbage, and collard greens are all technically the same species, Brassica oleracea. The substantial differences between these sub-species are all due to patient intervention by human farmers over millennia. Many of these changes are surprisingly recent. Early versions of cauliflower may have been mentioned by Pliny and medieval Muslim botanists, but as late as 1600, a French author was writing that cauli-fiori “as the Italians call it” was “still rather rare in France.” Likewise, Brussels sprouts don’t appear to have become widely cultivated until the Renaissance.

[ . . . . . . ]

Read more . . . . .

Tuna Salad Sandwich – A Taste of Home for Working Women

Megan Elias wrote . . . . . . .

The tuna salad sandwich originated from an impulse to conserve, only to become a symbol of excess.

In the 19th century – before the era of supermarkets and cheap groceries – most Americans avoided wasting food. Scraps of chicken, ham or fish from supper would be mixed with mayonnaise and served on lettuce for lunch. Leftovers of celery, pickles and olives – served as supper “relishes” – would also be folded into the mix.

The versions of these salads that incorporated fish tended to use salmon, white fish or trout. Most Americans didn’t cook (or even know of) tuna.

Around the end of the 19th century, middle-class women began to spend more time in public, patronizing department stores, lectures and museums. Since social conventions kept these women out of the saloons where men ate, lunch restaurants opened up to cater to this new clientele. They offered women exactly the kind of foods they had served each other at home: salads. While salads made at home often were composed of leftovers, those at lunch restaurants were made from scratch. Fish and shellfish salads were typical fare.

When further social and economic changes brought women into the public as office and department store workers, they found fish salads waiting for them at the affordable lunch counters patronized by busy urban workers. Unlike the ladies’ lunch, the office lunch hour had time limits. So lunch counters came up with the idea of offering the salads between two pieces of bread, which sped up table turnover and encouraged patrons to get lunch to go.

When canned tuna was introduced in the early 20th century, lunch counters and home cooks could skip the step of cooking a fish and go straight to the salad. But there was a downside: The immense popularity of canned tuna led to the growth of a global industry that has severely depleted stocks and led to the unintended slaughter of millions of dolphins. A clever way to use dinner scraps has become a global crisis of conscience and capitalism.

I like mine on toasted rye.

Source: The Conversation

Donuts And Apple Cider: An Autumn Marriage Made By Autos And Automation

Deena Prichep wrote . . . . . .

When baker Julie Richardson was growing up in Vermont, autumn Saturdays had a particular rhythm. First, soccer practice. And then, to the apple orchard for some cider and donuts.

“I would sit there and watch that machine — watch the doughnuts plop into the hot oil, go down the conveyor belt and plop out the other end,” she says.

For many New Englanders — and for people across the country who grew up near apple orchards — it’s just not fall without cider and donuts. It’s a combination that makes culinary sense: When cider is used to make the dough, Richardson notes, the acidity helps yield a tender crumb. And a cold, crisp glass of cider helps wash down the deliciously oily donuts, hot from the fryer — the autumnal version of milk and cookies.

It’s a tasty pairing for sure. And one that evokes earlier times.

“We put this rural gloss on it,” says food writer and culinary historian Michael Krondl. “Farm stands have been around forever, and cider donuts harken back to New England fall and changing leaves. Yeeeeeah,” he laughs. Krondl’s audible skepticism comes because he thinks this sepia-tinged tradition arose because of some distinctly new-world trends — namely automation and automobility.

But first, to acknowledge the actual old-world component: Yes, both cider and donuts have long histories. Krondl notes that in early America, hard cider was “one of the primary beverages, prior to Prohibition — especially in apple-growing areas like New England and the Upper Midwest.” It was cheap, common and easier to make than beer (not to mention handy when you’ve got a bumper crop of fruit that would otherwise go bad).

As for donuts: You can find mentions of fried cakes in the Bible, and pretty much every culture has their own beloved take on batter or dough hitting hot fat (not surprising, given that heating a cauldron of oil is a bit easier than rigging up an oven). Something resembling modern donuts has probably been a part of American history since the early Dutch settlers, and the treat got a big boost during World War I, thanks to the tasty outreach of the Salvation Army. But back stateside, donuts weren’t as widespread as they are today — namely because making them was a fairly labor-intensive process.

Until Adolph Levitt came along.

Sally Levitt Steinberg, Adolph’s granddaughter, tells the almost mythical origin story of how her grandfather met an engineer on a Midwestern train, and the two of them came up with the prototype for a donut-making machine. After many failures, Levitt finally succeeded in 1921, setting up the machine in the window of his bakery in Harlem, N.Y. Instead of having to roll out the dough, cut the donuts and fry them in a pot, bakers could just set up this machine, which plopped perfect circles of batter into hot oil, then fried and flipped them at just the right time. It’s the exact contraption that later hypnotized Richardson — and so many others.

The public recognized the delicious importance of the invention. Steinberg says her grandfather took the machine out for a demonstration in Times Square, and it stopped traffic all over the city.

“It just exploded,” says Steinberg. “Then he realized that the machine was going to last forever, and the money wouldn’t. So then he got into the rest of the aspects of the business — mixes, shops, selling donuts in supermarkets.”

And, as Krondl notes, at the same time that Levitt’s donut machine was taking over, another phenomenon was happening — the rise of the automobile.

“It’s the collision of the automobile, automation and advertising,” says Krondl. “You’ve got these machines in every donut shop in America, and the Doughnut Corporation of America [Levitt’s company] is controlling them. And you begin to have these farm stands, particularly near urban areas, where people can go on a Sunday drive. People would do that in the early days of the automobiles — excursions.”

Drivers and passengers would get hungry on these drives. Krondl asserts that the newly perfected automatic donut machine (with its accompanying easy-to-use commercial mixes) was a perfect way to feed them at their destination. And the donuts went perfectly with the cider these farm stands were already pressing.

“Like McDonald’s,” Krondl says, “farm stands are a function of mobility and the highway system.”

A hundred years ago, Adolph Levitt wasn’t thinking about “automobility” or Sunday drivers — he was just thinking of returning GIs’ appetite for tasty fried treats, and how to turn a fortuitous encounter with an engineer into the solution for a market need. But Krondl maintains that the result, coming at the time it did, married cider and donuts together forever. And we should all be grateful.

“At a certain point, you couldn’t have a farm stand without a donut machine,” says Krondl. “Which I totally support.”

Source: npr