Video: Cotton Candy Burrito

Sugar Sugar in Sarnia, Canada serves the combination of cotton candy and ice cream in the form of a burrito.

Watch video at YouTube (1:33 minutes) . . . . .

Chinese-style Stir-fried Egg White with Dried Scallop

Ingredients

3/4 oz conpoy (dried scallop)
6 big egg whites
1 egg yolk

Seasoning

4 Tbsp evaporated milk
1 Tbsp cornstarch
dash of sesame oil
pinch of ground white pepper
1/2 tsp salt

Ginger Vinegar
1/2 Tbsp finely diced ginger
2 Tbsp Zhejiang red vinegar (浙醋)

Method

  1. Wash dried scallop. Soak in water for 2 hours. Steam with 1 slice ginger and 1/2 tsp wine for 1/2 hour. Tear into fine shreds when cold.
  2. Whisk egg whites to mix. Add seasoning and dried scallop. Whisk again to combine.
  3. Heat a wok with 1/2 cup oil. When oil is warm, whisk egg whites again and pour into the oil. Stir-fry over low heat until cooked. Remove and drain off oil. Transfer to the serving plate. Put egg yolk on top and serve with ginger vinegar on the side.
  4. Before eating, drizzle some vinegar on top of the dish. Mix egg white with egg yolk and enjoy.

Source: Hong Kong magazine

Scientists Are Turning People’s Food Photos Into Recipes

Laurel Dalrymple wrote . . . . . . . .

When someone posts a photo of food on social media, do you get cranky? Is it because you just don’t care what other people are eating? Or is it because they’re enjoying an herb-and-garlic crusted halibut at a seaside restaurant while you sit at your computer with a slice of two-day-old pizza?

Maybe you’d like to have what they’re having, but don’t know how to make it. If only there were a way to get their recipe without commenting on the photo.

Researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) would like that for you, too. That’s why they’re creating an artificial neural network — a computer system modeled after the human brain — to examine those photos and break them down into recipes.

The growth of the Internet has made it possible to collect and publish several large-scale datasets, allowing for great advances in the field of artificial intelligence (AI), says Javier Marin, a postdoctoral research associate at CSAIL and co-author of a paper published this July at the Conference on Computer Vision and Pattern Recognition in Honolulu.

“However, when it comes to food, there was not any large-scale dataset available in the research community until now,” Marin says. “There was a clear need to better understand people’s eating habits and dietary preferences.”

To do this, researchers have been feeding the computer pairs of photos and their corresponding recipes — about 800,000 of them, drawn from a dataset called Recipe1M. The AI network chews on all of that for a while, learning patterns and connections between the ingredients in the recipes and the photos of food.

“What we’ve developed is a novel machine learning model that powers an app. The demo that you see is just a pretty interface to that model,” says Nicholas Hynes, an MIT graduate student at CSAIL who also co-authored the paper.

You, too, can try out this interface, called Pic2Recipe. To use it, just upload your food photo. The computer will analyze it and retrieve a recipe from a collection of test recipes that best matches your image.
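
The mechanics behind this kind of matching can be sketched in miniature. The following is an assumption about how such a retrieval system works in general, not the authors' actual code: images and recipes are mapped into a shared embedding space, and the system returns the recipe whose embedding is most similar to the query image's. The recipe names and embedding vectors here are hand-made stand-ins for a trained encoder's output.

```python
from math import sqrt

# Toy stand-in embeddings: one unit-length vector per recipe.
# A real system would learn these from photo/recipe pairs.
recipe_names = ["sugar cookies", "pea soup", "ham sandwich"]
recipe_embeddings = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

def best_match(image_embedding):
    """Return the recipe nearest to the image in the shared embedding space."""
    scores = [cosine(image_embedding, e) for e in recipe_embeddings]
    return recipe_names[scores.index(max(scores))]

# A query photo whose (hypothetical) embedding lies close to "pea soup".
print(best_match([0.1, 0.9, 0.2]))  # → pea soup
```

The key design point, which the article's later quotes echo, is that the match is made holistically: nothing here checks off individual ingredients, the query simply lands near one recipe in the shared space.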

It usually works pretty well, although it sometimes misses an ingredient or two. Take, for example, this video, in which the MIT team uploads a photo of sugar cookies.

“The app took the image, figured out what was in it and how it was prepared, and gave us the recipe that it thinks was most likely to have produced the image,” says Hynes.

Pic2Recipe did correctly identify eight out of the 11 ingredients. And it did accurately find a recipe for sugar cookies. Alas, it missed the icing.

But the program doesn’t need to visually recognize every ingredient in the photo to find an accurate recipe.

“Just like a human, it can infer the presence of invisible, homogenized or obscured ingredients using context. For instance, if I see a green colored soup, it probably contains peas — and most definitely salt!” says Hynes. “When the model finds the best match, it’s really taking a holistic view of the entire image or the entire recipe. That’s part of why the model is interesting: It learns a lot about recipes in a very unstructured way.”

But as with every new technology, there are some kinks to work out.

The current model sometimes has trouble making fine distinctions between similar recipes, Hynes says. “For instance, it may detect a ham sandwich as pastrami or not recognize that brioche contains milk and egg. We’re still actively improving the vision portion of the model.”

Another issue, Hynes says, is that the current model has no explicit knowledge of basic concepts like flavor and texture. “Without this, it might replace one ingredient with another because they’re used in similar contexts, but, doing so would significantly alter this dish,” Hynes says. “For example, there are two very similar Korean fermented ingredients called gochujang and doenjang, but the former is spicy and sweet while the latter is savory and salty.”

There are other refinements to be made, such as how to recognize an ingredient as diced, chopped or sliced. Or how to tell the difference between different types of mushrooms or tomatoes.

And when a reporter at The Verge tried the demo, photos of ramen and potato chips turned up no matches. How could the program miss such basics?

“This is simply explained by not having recipes for those foods in the dataset,” Hynes says. “For things like ramen and potato chips, people generally don’t post recipes for things that come out of a bag.”

In the future, the MIT researchers want to do more than just let you have what they’re having. They are seeking insight into health and eating habits.

“Determining the ingredients — and therefore how healthy they are — of images posted in a specific region, we could see how health habits change through time,” says Marin.

Hynes would like to take the technology a step further, and is working on a way to automatically link from an image or ingredient list to nutrition information.

“Using it to improve people’s health is definitely big; when I go to community/potluck dinners, it always astonishes me how people don’t pay attention to preparation and how it relates to plausible serving sizes,” he says.

Hynes also can see how aspiring cooks might appreciate a system that takes a restaurant item and tells them how to make it. “Even everyday people with dietary restrictions — gluten free, vegan, sparse pantry — would appreciate a tool that could minimally modify a complicated dish like Beef Wellington so that it fits the constraints.”

And why stop there? These are MIT scientists, after all, collaborating with researchers from the Qatar Computing Research Institute and the Polytechnic University of Catalonia in Spain.

“In the far future, one might envision a robo-chef that fully understands food and does the cooking for you!” Hynes says.

Source: npr

Scientists Unveil their Star Trek-like ‘Tricorder’ to Track Human Health

Dennis Thompson wrote . . . . . . .

Average folks may one day be able to use a Star Trek-inspired home medical device to diagnose a dozen different ailments and track five major vital signs, all without needing to draw blood or visit a doctor’s office.

Engineers developed the DxtER device as part of a competition to create a modern-day version of the “tricorder” that Dr. Leonard “Bones” McCoy waved over patients on the starship Enterprise to diagnose illnesses.

The DxtER combines an array of different sensors with intelligent diagnostic software in a package weighing less than 5 pounds, said technology design expert Philip Charron, a member of the team that created the device.

A person using the DxtER at home can test themselves for anemia, urinary tract infections, diabetes, atrial fibrillation (irregular heartbeat), sleep apnea, COPD (chronic obstructive pulmonary disease), pneumonia, ear infection, whooping cough, high blood pressure, mononucleosis and increased white blood cell counts, Charron said.

“A regular consumer can sit down with our device, answer some questions, and it will come back with a diagnosis,” Charron said.

The DxtER also can monitor five vital signs continuously: blood pressure, heart rate, body temperature, respiratory rate and blood oxygen levels.

All of the sensors are noninvasive. No blood needs to be drawn and nothing needs to be inserted into the body, he explained.

Charron said that his group, Final Frontier Medical Devices, received a $2.6 million award in April for developing the DxtER and demonstrating an accuracy rate higher than 70 percent.

Final Frontier was to present the DxtER this week at the annual meeting of the American Association for Clinical Chemistry (AACC) in San Diego. Findings presented at meetings are typically viewed as preliminary until they’ve been published in a peer-reviewed journal.

“Now it’s up to the peers in the scientific community to see if it works the way it’s supposed to,” said AACC President Michael Bennett, director of the Palmieri Metabolic Laboratory at Children’s Hospital of Philadelphia.

Charron said the DxtER combines:

  • A chest sensor for monitoring heart rate, respiration and temperature.
  • A wrist and hand sensor that checks blood pressure, blood sugar levels, hemoglobin, white blood cell counts and blood oxygen levels.
  • A digital stethoscope that listens to sounds from breathing.
  • A spirometer, which is a device to measure the air capacity of the lungs when people breathe into the machine.

All of these devices interact with diagnostic software running on a tablet, Charron noted.

“You apply some of these devices to your body, and then answer questions on the tablet about your symptoms, just like you would with a doctor in a doctor’s office,” Charron said.

The competition, called the Qualcomm Tricorder XPRIZE, kicked off in 2012 and involved 300 different engineering teams.

“It really was a five-year mission, just like in Star Trek,” Charron said.

The Final Frontier team put about a half million dollars out of their own pockets into the endeavor, which ate up as many as 24,000 hours of combined effort, according to Charron.

The team tested its diagnostic software against an approved set of anonymous patient charts, to see if it would accurately diagnose illnesses when provided the proper vital signs and body chemistry readings, Charron said.
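
A chart-based evaluation like the one described can be reduced to a simple sketch. This is a hypothetical illustration of the scoring step only (the chart data and predictions below are made-up placeholders, not real results): compare the software's predicted diagnosis for each anonymized chart against the known diagnosis and report overall accuracy, the same kind of figure behind the 70 percent threshold mentioned above.

```python
def accuracy(predictions, truths):
    """Fraction of charts where the predicted diagnosis matches the known one."""
    assert len(predictions) == len(truths)
    correct = sum(p == t for p, t in zip(predictions, truths))
    return correct / len(truths)

# Placeholder results for ten hypothetical charts.
truths      = ["anemia", "UTI", "diabetes", "pneumonia", "COPD",
               "anemia", "UTI", "sleep apnea", "diabetes", "COPD"]
predictions = ["anemia", "UTI", "diabetes", "pneumonia", "COPD",
               "anemia", "ear infection", "sleep apnea", "diabetes", "mono"]

score = accuracy(predictions, truths)
print(f"accuracy = {score:.0%}")  # 8 of 10 correct → 80%
```

A real validation would of course be far more involved (per-condition sensitivity and specificity, not just a single accuracy number), but the underlying comparison against labeled charts is the same.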

For the contest, researchers handed the device to patients with diagnosed medical conditions. The patients had to be able to use the device on themselves, unaided, to accurately diagnose their specific illness.

The point was to develop a home health care kit that could help families figure out basic illnesses, preventing unnecessary trips to the emergency room or doctor’s office, Bennett said.

Because of this, it was built to be affordable. The DxtER should cost between $200 and $400 retail, Charron said, although insurers might chip in to bring the cost down even lower if the device proves accurate enough to cut down on visits to the hospital or doctor.

Alongside home health care, the DxtER also could prove of great benefit for disaster relief, in refugee camps, as part of military medicine, and even as a means of diagnosing health problems during space travel, Charron suggested.

The DxtER now will proceed to clinical trials. The device’s accuracy will be tested against tried-and-true equipment now being used in hospitals, and the results will help researchers better hone the diagnostic software, Charron said.

James Nichols is medical director of clinical chemistry at Vanderbilt University School of Medicine. He said, “As with any new device, we need to see what its performance is. We want it to be reproducible and be accurate and compare well with traditional known tests. When we see how it actually performs once it’s put in the hands of laypeople, then we’ll know a little better what still needs to be worked out with the device.”

As part of development, Nichols added, engineers also need to consider how the data gathered by DxtER will be uploaded and shared with a person’s primary care doctor.

“The data is not of much use in the device itself,” Nichols said. “It needs to connect with the person’s medical record.”

Source: HealthDay

