From Trash to Treasure: The History of Barbecued Ribs

Robert Moss wrote . . . . . . . . .

Pork ribs are a staple of American barbecue. Memphis is famous for its dry-rubbed version, and rib tips are a staple at Chicago’s South Side barbecue joints. Even down in beef-centric Texas, pork ribs are in high demand, constituting one-third of the state’s “holy trinity,” along with brisket and sausage.

But that hasn’t always been the case. Historically speaking, ribs are relative newcomers to the pits. Nonetheless, many writers have erroneously assumed that the antebellum South was their likely place of origin. This explanation by Meathead Goldwyn of AmazingRibs.com strikes the standard chords that have led some people to assume barbecued ribs were a product of that time: “In the pre-Civil War South, Masters got to eat the best cuts of meat. They ate the tenderloin from along the pig’s back, ‘high on the hog’ (yes, that’s where the expression came from), while the slaves got the tougher, more gristle-riddled cuts.”

But no one was putting slabs of ribs on barbecue pits back in the 19th century. Instead, barbecued ribs are an early 20th-century innovation, one driven not by the distribution of pig parts on a plantation but by the rise of industrial meatpacking, mechanical refrigeration, and commercial barbecue stands. And our barbecue menus are richer (and our fingers stickier) as a result.

The Whole Hog

It’s easy to forget how dramatically mechanical refrigeration and railroad transport changed the way Americans eat—especially when it comes to meat. Fresh meat from larger livestock like pigs and cows wasn’t available year-round before the Civil War, because there was no way to keep it from spoiling. Farmers had to wait until the first cold winter weeks to slaughter their pigs; it needed to be cold enough—below 40°F—for the carcass to cool quickly and not spoil, but also not so cold that the meat would freeze.

A hog killing on a 19th-century farm was a laborious but celebratory event, with the whole family and plenty of neighbors and friends pitching in. Almost every part of the pig was put to good use. The blood was reserved for puddings and the fat rendered into lard in giant kettles. Smaller scraps of meat and fat were ground into sausages, and the heads and feet were boiled to make “souse meat” or rendered into a thick, savory stew—hash and rice, South Carolina’s traditional barbecue side dish, evolved from these hog-killing stews.

The carcasses were then allowed to chill overnight and the next morning were cut into hams, shoulders, and “middlings” (side meat or bacon), which were taken to the smokehouse and preserved by curing and smoking. The parts left behind—the chine (backbone), the tenderloins, the chitterlings (intestines), and the ribs—were eaten over the next few days.

Those traditional hog-killing dinners featured fresh roasted spare ribs and chine served with bread, potatoes, applesauce, and cabbage or greens. And that meal might well be the only fresh pork a farm family enjoyed all year. They couldn’t have a hog killing during the summer—especially not in the South—for the meat would spoil in the sweltering heat long before they finished all the butchering, lard rendering, and sausage-making.

There was one exception to this, though. At big events, where the entire community gathered, farmers could take a few pigs to a shady grove where a barbecue pit awaited, slaughter them and remove the entrails right on the spot, and put the whole animals on the pit to cook. And that’s exactly what a 19th-century barbecue entailed.

Barbecue originated not as a way of “making do” with lesser cuts, but rather as a method of whole-animal cookery—one usually staged for a large crowd. I’ve been unable to find any accounts that describe enslaved people (or anyone else, for that matter) cooking ribs or other individual cuts on a barbecue pit. Plenty of primary sources, however, describe or illustrate whole carcasses of pigs, goats, lambs, and even cows being cooked over a bed of coals in pits dug in the ground. When people in the 19th century ate barbecued ribs, they pulled the meat from a whole pig that was already cooked.

The Meat Packers’ Cast Offs

This doesn’t mean that no one ate spare ribs in the 19th century—they just weren’t barbecuing them. As the century advanced, ribs became available in greater and greater quantities, provided you lived in the right place—namely, a city like Indianapolis or Louisville, where hogs were being packed and processed to ship around the country.

Industrial pork packing arose in the early decades of the century, driven first by improved river navigation and then by the expansion of railroads. Cincinnati, blessed with a prime position on the Ohio River and close to burgeoning cornfields and hog farms, emerged as “Porkopolis,” the largest pork-producing city in the world at the time.

By 1836, Cincinnati’s four largest slaughterhouses were collectively killing and butchering some 2,600 hogs in a single day, producing between 200 and 500 barrels of pork along with 200 kegs of lard. In these early days, the tools and procedures used to slaughter a hog in a commercial setting were not so different from those of a rural hog killing; it was just conducted on a much larger scale, with each step—dispatching the pig with a blow from a hammer, scalding the carcass in boiling water, scraping the hair away—performed by a different worker, on an assembly line of sorts.

Barrels were essential to the pork trade. With no means of refrigerated transport, packers had to preserve the meat before shipping, but they didn’t want to waste weeks slow-smoking it like farm families did. Instead, they packed the hams and shoulders in barrels, filled in the gaps with chines, hocks, and jowls, then poured in a sweet and salty “pickle” made from rock salt and brown sugar boiled in water.

The spareribs didn’t fit in the barrels, and packers found themselves with literal tons of unwanted racks on their hands. “It is said that during the hog-killing season in Cincinnati,” the New Orleans Times-Picayune reported in 1844, “any keeper of a boarding-house, by sending a basket to the butcher’s, can have it filled with the finest and most delicious spare ribs, and ‘free gratis for nothing’ at that.”

But even the city’s boarding houses couldn’t eat up the supply. In the early days, one account recalled, “cart loads upon cart loads of spare-ribs” were “drawn to the water’s edge and emptied into the Ohio to get rid of them.”

That started to change in the 1870s, when artificial ice-making and then mechanical refrigeration transformed meat packing from a seasonal to a year-round business. Now packers could hang onto spareribs and sell them to retailers as a low-cost cut.

Recipes for spare ribs appear in cookbooks and newspapers with greater frequency in the closing decades of the 19th century. Many advised cutting the ribs into three-bone pieces and parboiling them before seasoning and finishing on a hot gridiron over coals in a kitchen fireplace. Others called for roasting them in an oven over a bed of sauerkraut and serving with applesauce, mashed potatoes, and mustard.

In 1895, the Ottawa Herald (that’s Ottawa, Kansas) contemplated options for Thanksgiving menus and noted, “Turkey and cranberries may cost more than spare ribs and turnips, but a good, well seasoned spare rib baked brown and crisp beats any turkey that ever flapped his wings.” But pork ribs weren’t destined to displace the gobbler on the traditional Thanksgiving menu. Instead, they helped transform the way Americans ate their barbecue.

The Rise of the Rib Shack

Before the 20th century, barbecue wasn’t a commercial product. It was served at occasional, large-scale gatherings where whole animals were cooked outdoors on open pits. These events were typically provided free of charge as part of community Fourth of July celebrations or political campaigns.

As the country urbanized, though, entrepreneurial cooks started selling slow-smoked meats on city street corners and in courthouse squares. Often these were farmers who slaughtered one or two of their own pigs, cooked them on a pit, and took the meat into town to sell over the weekend. The first barbecue stands were informal operations—just a tent or temporary shed—but over time they evolved into permanent restaurants, and their operators began offering a regular slate of meats. They increasingly bought those meats from local packing houses instead of raising the animals themselves, and many restaurateurs started buying individual cuts like shoulders and hams instead of whole pigs.

Those local packers had plenty of spare ribs on hand, too, which they were happy to unload for cheap. The historical record doesn’t pinpoint any particular region where barbecued ribs were introduced, nor any particular type of operation. In a matter of a few years, spare ribs could be found all over the country at barbecue stands, cafés, and take-out butcher shops—anywhere that had a barbecue pit and smoked meats to sell to the public.

In the 1920s, A.R. Hubbard’s Cafe in Houston offered barbecued ribs alongside dinners and short orders. Clegg’s Hotel and Cafe in Greensboro, North Carolina, featured “barbecued spare ribs with sweet potatoes” for its 75-cent Special Sunday Dinner. Rasmussen’s in Davenport, Iowa, offered “Tennessee Style Barbecue Ribs,” which it touted as “inexpensive—with a fine appetizing taste.”

Rasmussen’s reference to “Tennessee style” is tantalizing, but I’ve found no other evidence to indicate that rib-cooking was more common in Tennessee than anywhere else. In fact, a surprising number of the stands selling spare ribs were found in Iowa—which, perhaps not coincidentally, was prime hog-producing territory.

One notable rib fan was the famed New York Yankee slugger Babe Ruth. The Yankees swept the St. Louis Cardinals in four games in the 1928 World Series, and the night after the final game, as the Yankees’ east-bound train rolled into Mattoon, Illinois, the Babe entertained his teammates and reporters with “50 pounds of barbecued spare ribs and an amber-color fluid which foamed suspiciously on being poured into serving glasses.” (This was in the midst of Prohibition, we should note.)

But you didn’t have to be a star athlete to relish a platter of ribs. In large cities—particularly those with a sizable African-American community—ribs emerged as a late-night staple for the nightclub crowd, as club owners set up small pits behind their establishments and cooked a few racks to sell to hungry revelers. In 1928 the movie editor for the Detroit Times returned from a visit to the East Side to report that “barbecue spare ribs in the doorway emporiums of the black belt” were also drawing in lots of white customers. “Served with a spicy sauce, the ribs are thirst-provoking; and nearby beer spots get a brisk play as a resort, color lines being ignored.”

Ribs were a hit among the late-night crowd in Memphis, too. The city’s rib pioneer was John Mills, who in the late 1920s opened a barbecue stand on 4th Street, just around the corner from the famous nightlife district on Beale. He cooked his ribs on a charcoal-fired brick pit in the alley out back and mopped them with a peppery hot sauce. Two decades before Charlie Vergos started selling his now-legendary dry-rubbed ribs at The Rendezvous, Mills was drawing a steady crowd of musicians and celebrities like Kate Smith and Bing Crosby, who always stopped by for ribs when they were in town.

The Golden Age of Ribs

By the 1930s, barbecued ribs could be found at thousands of barbecue stands, nightclubs, and cafes across the country. In the years just after World War II, ribs crossed over to the menus at high-end restaurants, as well. In 1948, the syndicated food columnist Ida Bailey Allen noted, “People pay fancy prices to nibble at barbecued spare ribs in a swanky restaurant,” bemused that a once-humble cut had gone uptown.

Ribs were in high demand for backyard barbecuing, too, as that form of home entertainment surged in the post-War years. In 1955, the New York Times declared, “This increasingly popular cut of meat inevitably will claim the attention of almost every outdoor cook during the summer season ahead.” A century before, packinghouses literally couldn’t give ribs away, but now, the Times reported, “their price is in the luxury bracket.” Since a pound of ribs served only one diner, effectively “the meat costs more than a sirloin steak or prime rib roast, both of which yield two to three servings per pound.”

This same period witnessed the emergence of the so-called St. Louis-style rib. This wasn’t a method of cooking, but rather of cutting the meat to gussy up its presentation. On a full rack of spare ribs, there is a line where each of the long bones ends and a short length of cartilage and fat begins. Butchers in St. Louis took to slicing away the tips (also called the “brisket” or “collar”) and removing the short, pointed end of the rack just past the 13th bone. The result was a long, squared-off slab that let diners chew the meat straight off the long bones without worrying about all the cartilage and fat on the ends.

The first mention I’ve found of trimming ribs this way appeared in 1947 in the St. Louis Post-Dispatch. It describes the rib-cooking method of Adolph Feiler, the chef at the decidedly swanky Forest Park Hotel, who barbecued ribs on a charcoal rotisserie with electric-powered spits, swabbing the meat at frequent intervals with a tomato-based sauce. A photo shows Roscoe Duncan, Feiler’s “first cook,” preparing the ribs by removing the tips with a cleaver. “This job is usually done by the butcher,” the article noted.

St. Louis’s local meat packers embraced the cut to differentiate their products from those of the national packing houses. In 1995, Elaine Viets of the St. Louis Post-Dispatch interviewed retired local butcher Robert F. Eggleston, who recalled that in the post-War era there were 15 to 20 meat-packing establishments around St. Louis. “The major packers cut the spare ribs from the carcass and sold them that way,” Eggleston told Viets. “They left on a big hunk of bone and gristle we butchers called the collar. . . The St. Louis packers took off about half that collar. It cost consumers a little more, but it was a better value. Rib lovers bought it. That was the St. Louis cut rib.”

The method took off across the country, and by the early 1950s, butchers from California to Mississippi were advertising “St. Louis Style” ribs as a premium product. In Brownsville, Texas, in 1951, regular spares sold for 39 cents a pound, while St. Louis style ran for 45 cents. An ad in the Rockville, Illinois, Morning Star described the cut as “Center Strips of Ribs Only / The Brisket Is Removed” and declared them “Perfect for Bar-B-Quing.”

Ironically, this innovation left packers with a new unwanted cut on their hands: the rib tip—that long strip of cartilage, gristle, and meat that had been carved away to pretty up the slab. Once again, barbecue joints came to the rescue. High-end hotels and swanky nightclubs might roast prime St. Louis cuts on motorized rotisseries, but barbecue cooks started buying up the tips and putting them on their old-school pits, letting the magic of smoke and time transform them into something delicious.

Rib tips are now a staple of St. Louis’s traditional barbecue restaurants alongside pork snoots—an even more undervalued part of the hog. In Chicago, which by the turn of the 20th century had eclipsed Cincinnati as America’s hog-packing capital, rib tips were adopted at legendary South Side joints like Lem’s and Argia B’s in the 1950s and 1960s and are now an essential part of the city’s signature style. Connoisseurs know they have to gnaw their way around a little gristle to get to the good stuff, but they swear the meat is tastier and worth the extra effort.

That’s a much better use of leftover pig parts than dumping them in the Ohio River.

Source: Serious Eats

Glucose and The Brain: Improving Mental Performance

Glucose is a type of sugar which the brain depends on for fuel. Studies show that dips in glucose availability can have a negative impact on attention, memory and learning, and that administering glucose can enhance these aspects of cognitive function. The brain also uses up more glucose during challenging mental tasks. Therefore, it may be especially important to keep blood glucose levels at an optimum level for good cognitive function. Consuming regular meals may help to achieve this.

Glucose as fuel

Glucose is a type of sugar which comes predominantly from starchy foods (bread, rice, pasta and potatoes) as well as fruits, juices, honey, jams and table sugar. The body can break down the digestible carbohydrates in these foods into glucose, which is transported in the bloodstream to the brain and other organs for energy. The body tightly regulates blood glucose levels; this is known as glucose homeostasis. A process called gluconeogenesis also allows the body to make its own glucose from the building blocks of protein and fat. Glucose can be stored in the form of glycogen in the liver and, to a somewhat lesser extent, in the muscles. Glycogen forms an energy reserve that can be quickly mobilised to meet a sudden need for glucose, such as during physical exercise; when glucose intake from food is insufficient (during fasting, for example), the body can also obtain glucose by breaking down its glycogen stores. Liver glycogen is nearly depleted 12 to 18 hours after eating (after an overnight fast, for example), after which the body relies more on energy from breaking down fats.

The energy needs of the brain

The human brain is made up of a dense network of neurons, or nerve cells, which are constantly active — even during sleep. To obtain the energy needed to sustain this activity, the brain depends on a continuous supply of glucose from the bloodstream. A healthy diet should provide 45-60% of total energy from carbohydrates.1 A normal-weight adult requires 200 g of glucose per day, two-thirds of which (about 130 g) is needed specifically by the brain.
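
To make these figures concrete, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative and not from the source: the 2,000 kcal daily intake and the standard conversion of roughly 4 kcal per gram of carbohydrate are assumptions added here, while the 45-60% range and the 200 g daily glucose figure come from the paragraph above.

# Illustrative arithmetic only; the 2,000 kcal intake and the 4 kcal/g
# carbohydrate conversion are assumptions, not figures from the article.
KCAL_PER_GRAM_CARB = 4

def carb_range_grams(daily_kcal, low=0.45, high=0.60):
    """Grams of carbohydrate that would supply 45-60% of daily energy."""
    return (daily_kcal * low / KCAL_PER_GRAM_CARB,
            daily_kcal * high / KCAL_PER_GRAM_CARB)

low_g, high_g = carb_range_grams(2000)
daily_glucose_g = 200                    # approximate adult requirement cited above
brain_share_g = daily_glucose_g * 2 / 3  # roughly 130 g for the brain

print(f"45-60% of 2,000 kcal is about {low_g:.0f}-{high_g:.0f} g of carbohydrate")
print(f"The brain's share of {daily_glucose_g} g of glucose is about {brain_share_g:.0f} g")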

The brain competes with the rest of the body for glucose when levels dip very low — such as during starvation. By tightly controlling its share of glucose under these conditions, the brain can maintain its high level of activity. It does this through two main mechanisms: first, by drawing glucose directly from the blood when its cells are low on energy; and second, by limiting the amount of glucose available to the rest of the body so that there is more available to the brain.2,3 These mechanisms are essential for survival. Unlike muscles (including the heart) and the liver, the brain cannot use fatty acids directly for fuel.

Glucose and mental performance

Despite this sophisticated regulation, short-term dips in glucose availability do occur in certain brain areas. These may impair various cognitive functions such as attention, memory, and learning.4

Studies on glucose have demonstrated how administering this sugar can improve cognitive functioning — in particular, short-term memory and attention.4 Most of these studies give participants a set amount of glucose as a drink. A study by Sünram-Lea and colleagues found that a glucose drink significantly improved long-term verbal memory and long-term spatial memory in young adults. The effect was similar whether the drink was consumed after an overnight fast, a two-hour fast post-breakfast, or a two-hour fast post-lunch.5 Similarly, Riby and colleagues found glucose enhanced memory.6

More demanding mental tasks appear to respond better to glucose than simpler ones. This may be because the brain’s uptake of glucose increases under conditions of mild stress, which include challenging mental tasks.4

Given that the brain is sensitive to short-term drops in blood glucose levels, and appears to respond positively to rises in these levels, it may be beneficial to maintain adequate blood sugar levels in order to maintain cognitive function.4 Eating regular meals may help to achieve this. In particular, studies in children and adolescents have shown that eating breakfast can help to improve mental performance by boosting ability in memory- and attention-related tasks.7

Conclusion

The brain is a highly active organ that relies on glucose for fuel. Glucose comes either directly from carbohydrate-containing foods and drinks, or is produced by the body from non-carbohydrate sources. Keeping blood sugar levels at an optimal level appears to be helpful for maintaining good cognitive function, particularly for more mentally demanding tasks. Consuming regular meals may be a useful way of achieving this.

Source: eufic


How Raw Fish and Vinegared Rice Became a World Favourite

Bernice Chan and Alkira Reinfrank wrote . . . . . . . . .

The Japanese dish of sushi is pretty much ubiquitous around the world. From nigiri, with its slice of raw fish on a mound of rice, to the maki roll wrapped in nori, or seaweed, sushi looks deceptively simple to make.

But there is so much more to sushi than meets the eye: the quality rests on the cut of the fish, its freshness and provenance, the origin of the rice, how it was prepared and seasoned, and the kind of vinegar and sugar used.

However, the sushi we know today tastes and looks very different to how it did centuries ago. First of all, the rice in the original “sushi” was not intended to be eaten. Mixed with salt, it was used to preserve the fish and then thrown out.

And sushi’s origins aren’t even Japanese, says Nobu Hong Kong executive sushi chef Kazunari Araki, who has more than 20 years of sushi-making experience.

The combination of rice and fish, he explains, originated in the third century along the Mekong River in Southeast Asia, where countries such as Thailand, Vietnam, Myanmar, Laos and Cambodia are now situated.

“The people who lived around the river would catch a lot of fish, and because the climate is so hot they had to find a way to keep the fish [from rotting]. People in the area were also making rice, so they found a way to keep the fish [fresh] by using a rice and salt [mixture],” Araki explains.

After the fish were cleaned and gutted, they were covered in the salt and rice mixture in buckets for several months, or much longer, to preserve the meat. Before they ate the fish, the rice was discarded because it was too salty to consume.

By the 12th century, this method of fermenting fish had travelled from the Mekong to China, and then on to Japan, where it was called narezushi. However, in the 16th century, in the Edo period, Araki says, vinegar replaced salt in the preservation process, which was a major step forward in the development of sushi. It also gave birth to the name sushi – which translates to “vinegared rice”.

“With vinegar, you only need to marinate the fish for a few hours or overnight, so that shortened the time to eat the fish compared to six months or a year,” Araki says.

That led to the fish portions becoming smaller in the 18th and 19th centuries, from a whole fish to slices as large as the circumference of the hand. Araki says the next major development came in the Meiji era, in the 1900s, when ice machines were developed.

“Ice means you can keep fish fresh. You don’t have to marinate it. You just cut it and keep it on the ice. Whenever you make the rice, you cut the fish, place it on top of the rice and then eat it. You don’t have to marinate with soy sauce because the fish is fresh. Just dip it into soy sauce. This is the modern way to eat nigiri.”

If you want to try narezushi, Shiga prefecture to the east of the ancient capital, Kyoto, is the place where it is still prepared. But Araki warns it’s not for everyone.

“When something is fermented it goes bad. So you can imagine the fermented fish smells. The fish tastes salty and smells, but you can still taste the fish. When you have it with sake, it’s really good.”

So how did sushi go from being a traditional Japanese food to one loved by diners in the West? To understand its globalisation, you need to look to how the cuisine made its way to the US.

American academic James Farrer has been studying the Japanese food phenomenon for 12 years and says there have been multiple “booms” in Japanese food in the West, all characterised by showmanship, performance and exoticism.

The first wave of globalisation struck in the 1930s with sukiyaki, the hotpot-style dish with sliced beef and vegetables. “The very first Japanese restaurants in the United States would have been sukiyaki … It was popular because it was so exotic. It was served by women in kimonos … and it was associated with the idea of the geisha,” says Farrer, an associate professor of sociology at Sophia University in Tokyo.

However, this craze faded in the post-World War II period.

The next boom to capture Americans’ imagination was teppanyaki in the 1970s. “It was easy for Western people to like, because it’s basically meat … served in an exotic way by a chef who’s cooking on a metal plate in front of you,” says Farrer, who has lived in Asia for 30 years.

It was a theatrical dining experience far removed from the traditional TV dinners most Americans were used to at the time.

The next wave – the biggest to date, Farrer says – was sushi in the 1970s and 1980s. “Sushi was much more radical because it involved teaching people in the West … to eat raw fish, which was not a standard part of the diet in almost any other place [in the world].”

The sushi boom occurred as Japan was becoming a global economic powerhouse. Unlike Chinese food, Japanese food was introduced to America by “rich migrants” and was eaten by wealthy Japanese businessmen, giving it an air of luxury.

Sushi first appeared in Los Angeles in the 1970s, “which was kind of the centre of global pop culture” at the time. “[From there] you had Hollywood stars, celebrity chefs and other opinion leaders … starting to embrace this new culture of sushi and popularising it,” Farrer says.

The raw fish phenomenon even made its way into films, including the 1985 John Hughes classic The Breakfast Club, in which bad-boy Bender rears in disgust when Molly Ringwald’s rich “princess” character pulls out a sushi lunch to eat during detention. “You won’t accept a guy’s tongue in your mouth and you’re gonna eat that?” he asks provocatively.

“If you were eating sushi [in the early days] you were doing something exotic. You were doing something sensual. There was a sort of sexual element to sushi and it really fed into the 1980s of self-indulgence,” Farrer says.

It was also a happy coincidence that sushi made a splash just as the US West Coast was in the midst of a fitness and weight-loss craze, and the macrobiotic diet was on everybody’s mind.

“In this period, you had this notion that Western cuisine was too fatty and oily and used too much butter and sauces … [whereas] Japanese food had the properties of being low fat. It was light and focused on the ingredients.”

Slowly the sushi craze migrated from the high-class fringes of the West Coast to the mainstream. To help ease diners into the idea of eating raw fish and seaweed, it started to take on more “American” characteristics, with sushi masters adding “substitute” ingredients such as “avocado and cream cheese”.

A by-product of this cultural collision is the California roll, with ingredients never used in sushi in Japan – cucumber, crab sticks and avocado. This invention also looked different to traditional maki, being turned “inside out”, with the nori hidden under a layer of rice.

“Most Americans didn’t like it … Seaweed, black in colour, [made people question] ‘Is it edible? It’s weird’,” Araki says. “[So in the California roll] they didn’t see the seaweed on the outside, so they could eat it.”

While there is much debate over who invented the California roll – some say it was Hidekazu Tojo in Vancouver; others insist it was Ken Seusa in Los Angeles – the roll is just one North American sushi adaptation that has gone on to influence the Japanese favourite’s global popularity.

“Japanese food in America had a huge impact on how Japanese food was popularised around the world,” Farrer says.

In 1996, Araki opened his own restaurant in Boston and for two years made what he describes as “crazy creative rolls”, adding ingredients such as avocado, wagyu, and colourful egg roe. He recalls he even made an Italian roll with meat, dried tomatoes, tomato sauce and mozzarella.

“Basically you can put whatever you want [in the roll] as long as it doesn’t fall apart,” he explains.

From America, sushi eventually made its way around the world over the following four decades, but, Farrer says, “by far the most popular market for Japanese food now is Asia. Asia is surpassing Europe and the United States as the place where new Japanese restaurants are opening.”

Araki has since shunned his crazy rolls and returned to traditional Japanese sushi. Having worked for Nobu Matsuhisa for 12 years, he follows the celebrity chef’s philosophy of making food with kokoro (心), which in Japanese means “spirit”.

“When you use kokoro to make food, then the customers can feel it,” he says. “I’m 59 years old now and I try to understand how to use kokoro to serve people.”

Source: SCMP

How Almonds Went From Deadly To Delicious

Susie Neilson wrote . . . . . . . . .

St. Basil’s Hexaemeron, a Christian text from around the fourth century, contains a curious botanical instruction: Pierce an almond tree in the trunk near its roots, stick a “fat plug of pine” into its center — and its almond seeds will undergo a remarkable change.

“Thus the … bitter almonds … lose the acidity of their juice, and become delicious fruits,” the text reads. “Let not the sinner then despair of himself. … If agriculture can change the juices of plants, the efforts of the soul to arrive at virtue, can certainly triumph over all infirmities.” The cause of this change, scientists later theorized, was stress: Jamming pine wood into the almond tree’s core may have halted production of the toxins.

We don’t need pine wood to turn almonds sweet anymore. Most almonds produced today are naturally tasty and safe to eat. Back then, though, many were bitter and poisonous. Even today, consuming 50 — or fewer — wild, bitter almonds could potentially kill an adult, and just a handful contain enough cyanide to be lethal to a child.

Over time, farmers have bred domesticated almond trees to produce mostly sweet seeds. But wild almonds helped us out — and now we know just how they went from deadly to delicious. A study published this week in the journal Science sequenced the almond genome and shows that a single genetic mutation “turned off” the ability to make the toxic compound thousands of years ago — a key step before humans could domesticate almonds.

The bitterness and toxicity of wild almonds come from a compound called amygdalin. When ingested, this compound breaks down into several chemicals, including benzaldehyde, which tastes bitter, and cyanide, a deadly poison. Wild, bitter almond seeds serve as amygdalin storehouses, keeping predators away with their nasty taste and poisonous effect.

But at some point thousands of years ago, a mutation occurred in a wild almond. This mutation inhibits the production of amygdalin almost completely. Sweet almonds still have trace amounts of amygdalin but not enough, by any reasonable measure, to produce dangerous amounts of cyanide.

“Wild almonds are bitter and lethal, even in tiny amounts, because [they have] this amygdalin,” says study co-author Stefano Pavan, a professor in agricultural genetics and plant breeding at the University of Bari in Italy. (Pavan’s primary co-author was Raquel Sánchez-Pérez, a senior biochemistry researcher at CEBAS-CSIC, an agricultural research center in Spain.) “This mutation is very important because it’s the mutation that allowed almond domestication.”

Sometime after the almond mutation occurred, according to the researchers, humans discovered this sweet variant. When exactly this happened, though, is still unknown. Almond trees are widely believed to be among the world’s first domesticated trees. Archaeological evidence of cultivated almonds dates back to 3,000 B.C. But some geneticists think that humans probably started cultivating sweet mutated almonds much earlier than that, around 12,000 years ago.

What we do know: Once humans started encountering these new, tasty almonds, we embraced them with gusto. From Greece to California, we planted almond trees in droves and picked our trees carefully for the “sweet” allele — which is dominant over the “bitter” allele anyway. Over time, domesticated almonds lost almost all of their amygdalin.

Today, many people have never even heard of poisonous almonds, much less come across one in the wild — though some folks still eat bitter almonds in small doses. In Tunisia, for instance, people still make orgeat syrup with bitter almonds.

Dianne Velasco, a postdoctoral researcher in plant genetics at the University of California, Davis, whose work focuses on almonds and peaches, says that the research could potentially be put to use “very quickly” in helping plant breeders raise almonds more efficiently.

She says that right now, the earliest that almond breeders can assess the bitterness of their almond varieties is when their trees ripen and produce almonds, at three to five years of age. Knowing what mutation causes bitterness, she says, could potentially allow breeders to select the sweet varieties before they plant them. “This cuts into how much land usage [breeders] need, as well as cost,” she says.

Source: npr

Food History: Peanut Butter

Did you know that Canada has a peanut butter claim to fame? Marcellus Gilmore Edson of Montreal was the first person to patent modern peanut butter, originally intended for peanut candy. His patent, issued in 1884 by the United States government, covered the preparation of a peanut paste from milled roasted peanuts as an intermediate step in producing the finished product we now know as peanut butter.

Since peanut butter made its debut more than a century ago, it has proven to be a popular snack item. In fact, recent research shows that peanut consumption in Canada continues to ride high, indicating that peanut products are a staple in Canadian homes.

Source: Peanut Bureau of Canada