Artificial Intelligence Might Help Spot, Evaluate Prostate Cancer

Amy Norton wrote . . . . . . . . .

In another step toward using artificial intelligence in medicine, a new study shows that computers can be trained to match human experts in judging the severity of prostate tumors.

Researchers found that their artificial intelligence system was “near perfect” in determining whether prostate tissue contained cancer cells. And it was on par with 23 “world-leading” pathologists in judging the severity of prostate tumors.

No one is suggesting computers should replace doctors. But some researchers do think AI technology could improve the accuracy and efficiency of medical diagnoses.

Typically, it works like this: Researchers develop an algorithm using “deep learning” — where a computer system mimics the brain’s neural networks. It’s exposed to a large number of images — digital mammograms, for example — and it teaches itself to recognize key features, such as signs of a tumor.
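As a toy illustration of that train-on-labeled-images loop, the sketch below fits a single-layer logistic classifier to synthetic images in which the "tumor" class has a bright central blob. It is a stand-in for intuition only; real diagnostic systems use deep convolutional networks and vastly more data, and nothing here comes from the studies described in this article.

```python
import numpy as np

# Toy sketch of supervised image training: a single-layer logistic
# classifier learns to separate synthetic "tumor" images (bright
# central blob) from "healthy" ones (pure noise). Real systems use
# deep networks, but the loop -- show labeled images, nudge weights
# to reduce error -- is the same basic idea.
rng = np.random.default_rng(0)

def make_images(n, size=8):
    healthy = rng.normal(0.0, 1.0, (n, size, size))
    tumor = rng.normal(0.0, 1.0, (n, size, size))
    tumor[:, 3:5, 3:5] += 3.0          # bright blob = the "key feature"
    X = np.concatenate([healthy, tumor]).reshape(2 * n, -1)
    y = np.concatenate([np.zeros(n), np.ones(n)])
    return X, y

def train(X, y, lr=0.1, epochs=200):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability
        grad = p - y                             # logistic-loss gradient
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

X_train, y_train = make_images(200)
X_test, y_test = make_images(100)
w, b = train(X_train, y_train)
pred = (X_test @ w + b) > 0
accuracy = (pred == y_test).mean()
print(f"test accuracy: {accuracy:.2f}")
```

The learned weights end up concentrated on the blob pixels, which is the miniature analogue of a network "teaching itself to recognize key features."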

Earlier this month, researchers reported on an AI system that appeared to best radiologists in interpreting screening mammograms. Other studies have found that AI can outperform doctors in distinguishing harmless moles from skin cancer, and detecting breast tumor cells in lymph node samples.

The new study looked at whether it’s possible to train an AI system to detect and “grade” prostate cancer in biopsied tissue samples. Normally, that’s the work of clinical pathologists — specialists who examine tissue under the microscope to help diagnose disease and judge how serious or advanced it is.

It’s painstaking work and, to a certain degree, subjective, according to study leader Martin Eklund, a senior researcher at the Karolinska Institute in Sweden.

Then there’s the workload. In the United States alone, more than 1 million men undergo a prostate biopsy each year — producing more than 10 million tissue samples to be examined, Eklund’s team noted.

To create their AI system, the researchers digitized more than 8,000 prostate tissue samples from Swedish men ages 50 to 69, creating high-resolution images. They then exposed the system to roughly 6,600 images — training it to learn the difference between cancerous and noncancerous tissue.

Next came the test phase. The AI system was asked to distinguish benign tissue from cancer in the remaining samples, plus around 300 from men who’d had biopsies at Karolinska. The AI results, the researchers reported, were almost always in agreement with the original pathologist’s assessment.
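Agreement between an AI system and a pathologist is typically summarized with a chance-corrected statistic such as Cohen's kappa. The sketch below computes it from made-up benign/cancer labels; the labels and the resulting value are illustrative, not the study's.

```python
from collections import Counter

# Cohen's kappa: raw agreement between two raters, corrected for the
# agreement you'd expect by chance given each rater's label frequencies.
def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # chance agreement: both raters happen to pick the same label
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented labels for illustration only.
ai_labels = ["benign", "cancer", "cancer", "benign", "cancer", "benign"]
pathologist_labels = ["benign", "cancer", "cancer", "benign", "benign", "benign"]

kappa = cohens_kappa(ai_labels, pathologist_labels)
print(f"kappa = {kappa:.2f}")
```

A kappa of 1.0 means perfect agreement; 0 means no better than chance.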

And when it came to grading the severity of prostate tumors with what’s called a Gleason score, the AI system was comparable to the judgment of 23 leading pathologists from around the world.

Much work, however, remains. A next step, Eklund said, is to see how the AI system performs across different labs and different pathology scanners, which are used to create digital images.

But one day, he said, AI could be used in a number of ways — including as a “safety net” to make sure a pathologist didn’t miss a cancer. It might also improve efficiency by prioritizing suspicious biopsies that pathologists should examine sooner.

Studies like this are a necessary step toward incorporating AI into medical practice, said Dr. Matthew Hanna, a pathologist at Memorial Sloan Kettering Cancer Center in New York City.

But, he stressed, “there’s still a long road ahead.”

Hanna, who was not involved in the study, is also a spokesperson for the College of American Pathologists.

Like Eklund, he said that any AI system would have to be validated across different centers, and different pathology scanners. And ultimately, Hanna said, studies will need to show that such technology can be used effectively in pathologists’ real-world practice.

There are practical realities, too. At the moment, Hanna pointed out, only a relative minority of pathology labs use digital systems in patient care. That’s key because for any AI algorithm to work, there have to be digital images to analyze. Most often, pathologists still study tissue using the classic approach — glass slides and a microscope.

What’s clear is that machines won’t be replacing humans — at least in the foreseeable future.

“This technology is coming,” Hanna said. “But as opposed to replacing doctors, it will transform how they deliver care — hopefully for the better.”

The study was reported online in The Lancet Oncology.

Source: HealthDay


Today’s Comic


Researchers Produce First Laser Ultrasound Images of Humans

Jennifer Chu wrote . . . . . . . . .

For most people, getting an ultrasound is a relatively easy procedure: As a technician gently presses a probe against a patient’s skin, sound waves generated by the probe travel through the skin, bouncing off muscle, fat, and other soft tissues before reflecting back to the probe, which detects and translates the waves into an image of what lies beneath.

Conventional ultrasound doesn’t expose patients to harmful radiation as X-ray and CT scanners do, and it’s generally noninvasive. But it does require contact with a patient’s body, and as such, may be limiting in situations where clinicians might want to image patients who don’t tolerate the probe well, such as babies, burn victims, or other patients with sensitive skin. Furthermore, ultrasound probe contact induces significant image variability, which is a major challenge in modern ultrasound imaging.

Now, MIT engineers have come up with an alternative to conventional ultrasound that doesn’t require contact with the body to see inside a patient. The new laser ultrasound technique leverages an eye- and skin-safe laser system to remotely image the inside of a person. When trained on a patient’s skin, one laser remotely generates sound waves that bounce through the body. A second laser remotely detects the reflected waves, which researchers then translate into an image similar to conventional ultrasound.

In a paper published today by Nature in the journal Light: Science and Applications, the team reports generating the first laser ultrasound images in humans. The researchers scanned the forearms of several volunteers and observed common tissue features such as muscle, fat, and bone, down to about 6 centimeters below the skin. These images, comparable to conventional ultrasound, were produced using remote lasers focused on a volunteer from half a meter away.

“We’re at the beginning of what we could do with laser ultrasound,” says Brian W. Anthony, a principal research scientist in MIT’s Department of Mechanical Engineering and Institute for Medical Engineering and Science (IMES), a senior author on the paper. “Imagine we get to a point where we can do everything ultrasound can do now, but at a distance. This gives you a whole new way of seeing organs inside the body and determining properties of deep tissue, without making contact with the patient.”

Early concepts for noncontact laser ultrasound for medical imaging originated from a Lincoln Laboratory program established by Rob Haupt of the Active Optical Systems Group and Chuck Wynn of the Advanced Capabilities and Technologies Group, who are co-authors on the new paper along with Matthew Johnson. From there, the research grew via collaboration with Anthony and his students, Xiang (Shawn) Zhang, who is now an MIT postdoc and is the paper’s first author, and recent doctoral graduate Jonathan Fincke, who is also a co-author. The project combined the Lincoln Laboratory researchers’ expertise in laser and optical systems with the Anthony group’s experience with advanced ultrasound systems and medical image reconstruction.

Yelling into a canyon — with a flashlight

In recent years, researchers have explored laser-based methods in ultrasound excitation in a field known as photoacoustics. Instead of directly sending sound waves into the body, the idea is to send in light, in the form of a pulsed laser tuned at a particular wavelength, that penetrates the skin and is absorbed by blood vessels.

The blood vessels rapidly expand and relax — instantly heated by a laser pulse then rapidly cooled by the body back to their original size — only to be struck again by another light pulse. The resulting mechanical vibrations generate sound waves that travel back up, where they can be detected by transducers placed on the skin and translated into a photoacoustic image.

While photoacoustics uses lasers to remotely probe internal structures, the technique still requires a detector in direct contact with the body in order to pick up the sound waves. What’s more, light can only travel a short distance into the skin before fading away. As a result, other researchers have used photoacoustics to image blood vessels just beneath the skin, but not much deeper.

Since sound waves travel further into the body than light, Zhang, Anthony, and their colleagues looked for a way to convert a laser beam’s light into sound waves at the surface of the skin, in order to image deeper in the body.

Based on their research, the team selected 1,550-nanometer lasers, a wavelength which is highly absorbed by water (and is eye- and skin-safe with a large safety margin). As skin is essentially composed of water, the team reasoned that it should efficiently absorb this light, and heat up and expand in response. As it oscillates back to its normal state, the skin itself should produce sound waves that propagate through the body.

The researchers tested this idea with a laser setup, using one pulsed laser set at 1,550 nanometers to generate sound waves, and a second continuous laser, tuned to the same wavelength, to remotely detect reflected sound waves. This second laser is a sensitive motion detector that measures vibrations on the skin surface caused by the sound waves bouncing off muscle, fat, and other tissues. Skin surface motion, generated by the reflected sound waves, causes a change in the laser’s frequency, which can be measured. By mechanically scanning the lasers over the body, scientists can acquire data at different locations and generate an image of the region.
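Scanning the lasers point by point and timing the returning echoes is what turns raw vibration data into an image. The arithmetic at the heart of any such depth mapping is simple; the sketch below uses a typical average sound speed for soft tissue and an illustrative echo delay, not values from the paper.

```python
# Minimal sketch of the timing-to-depth mapping behind any ultrasound
# image, laser-generated or otherwise. The delay value below is
# illustrative, not a measurement from the study.
SPEED_OF_SOUND = 1540.0  # m/s, a typical average for soft tissue

def echo_depth_m(echo_delay_s: float) -> float:
    """Convert a round-trip echo delay into reflector depth.
    The wave travels down and back, so depth is half the path."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving 26 microseconds after the excitation pulse
# corresponds to a reflector about 2 cm below the surface.
depth = echo_depth_m(26e-6)
print(f"{depth * 100:.1f} cm")  # -> 2.0 cm
```

Repeating this at each scan position is what builds up the 2D picture of muscle, fat, and bone boundaries.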

“It’s like we’re constantly yelling into the Grand Canyon while walking along the wall and listening at different locations,” Anthony says. “That then gives you enough data to figure out the geometry of all the things inside that the waves bounced against — and the yelling is done with a flashlight.”

In-home imaging

The researchers first used the new setup to image metal objects embedded in a gelatin mold roughly resembling skin’s water content. They imaged the same gelatin using a commercial ultrasound probe and found both images were encouragingly similar. They moved on to image excised animal tissue — in this case, pig skin — where they found laser ultrasound could distinguish subtler features, such as the boundary between muscle, fat, and bone.

Finally, the team carried out the first laser ultrasound experiments in humans, using a protocol that was approved by the MIT Committee on the Use of Humans as Experimental Subjects. After scanning the forearms of several healthy volunteers, the researchers produced the first fully noncontact laser ultrasound images of a human. The fat, muscle, and tissue boundaries are clearly visible and comparable to images generated using commercial, contact-based ultrasound probes.

The researchers plan to improve their technique, and they are looking for ways to boost the system’s performance to resolve fine features in the tissue. They are also looking to hone the detection laser’s capabilities. Further down the road, they hope to miniaturize the laser setup, so that laser ultrasound might one day be deployed as a portable device.

“I can imagine a scenario where you’re able to do this in the home,” Anthony says. “When I get up in the morning, I can get an image of my thyroid or arteries, and can have in-home physiological imaging inside of my body. You could imagine deploying this in the ambient environment to get an understanding of your internal state.”

Source: MIT News

The 2019 Kitchen Technology Year in Review

Michael Wolf wrote . . . . . . . . .

2019 was an action-packed year in the world of food tech. Among other things, we saw an explosion of new products that promise to change what we eat, rapid change in food delivery models, and something of a slow-motion food robot uprising.

The consumer kitchen also saw significant change, even if things didn’t move as fast as some would hope. As we close out the year, I thought I’d take a look back at the past twelve months in the future kitchen.

It’s An Instant Pot and Air Fryer World and We’re Just Living In It

Here’s an experiment: Next time you’re making cocktail-party conversation, ask someone about their most recent cooking gadget purchase for the home. Chances are it’s either an Instant Pot or an air fryer.

Regardless of how the two products perform relative to one another, the big takeaway is that the Instant Pot/pressure cooker and air fryer represent the two breakaway categories in countertop cooking over the past few years, and that trend continued strong in 2019.

Why? Because both products give consumers lots of cooking power to create a variety of meals at a low entry price point. Add in the large and vibrant online recipe communities around both product categories, and you can see why both only became more popular in 2019.

Next-Gen Cooking Concepts See Mixed Results

Outside of pressure cookers and air fryers, 2019 was a decidedly mixed bag of results for next-gen countertop cooking concepts. June and Tovala both plugged along selling their second generation ovens, Suvie started shipping its four-chamber cooking robot and Brava’s “cook with light” oven tech sold to Middleby. But unlike the air fryer and Instant Pot, none of these new products have gone viral.

Why?

First, most of these products are fairly expensive, often coming in at $300 or above. That’s probably too high to convince enough consumers to take a chance on a new product in a category they don’t know much about.

Second, consumers need to better understand these new products. While I don’t expect Thermomix to replicate in North America the direct-sales success it has found in Europe, there’s a reason such a premium-priced product has succeeded there: the company has made consumer education and evangelism core to its business model.

Finally, the market has yet to see a product with just the right mix of new technology and high-value user-focused features that supercharges consumer interest. That said, there are some new products, like Anova’s steam oven or the Miele Dialog’s solid-state cooking (I’m told most big appliance makers are working on a similar product), that could potentially capture the imagination of consumers.

Large Appliance-Makers Continue to Dabble in Innovation

So here’s what some of the big appliance brands with resources did in 2019: Whirlpool came out of the gate fast with a lineup of new smart cooking appliances at CES 2019, including a pretty cool modular smart oven concept in the SmartOven+. Electrolux launched a new Drop-powered blender and partnered with Smarter to add machine vision and connected commerce features to its smart fridge camera platform. Turkish appliance giant Arcelik debuted a combo cooking and washing product concept under the Grundig brand.

Overall though, it wasn’t a big year for appliance-makers on the innovation front. Many of us waited for these companies to launch some of their more promising technologies, like Miele’s Dialog or BSH Appliances’ Pai interface, but neither effort seemed to move forward much, at least in any public way, in 2019.

A Sputtering Consumer Sous Vide Market

It was a bad year for those who make sous vide gear. In mid-year we learned that ChefSteps, maker of the Joule sous vide circulator, would be laying off a significant portion of its team after running into money problems. And, just a little over a week ago, Nomiku, one of the first consumer sous vide startups, announced it would be shutting its doors.

Why did the consumer sous vide market lose steam? My guess is that the primary reason is that sous vide is just too slow as an everyday or multiple-times-per-week cooking method. While some, like Nomiku, wanted to position sous vide as a replacement for the microwave, it just isn’t convenient enough and requires too many steps for culinary average joes accustomed to the push-button cooking of the microwave. The reality is that, over time, many sous vide circulators ended up stuck in the kitchen drawer.

Software Powers The Meal

At Smart Kitchen Summit 2017, Jon Jenkins said we will all someday “eat software” as it becomes more important in how we create food in the kitchen. Evidence of this was everywhere in 2019, from companies rolling out new software features to do things like cook plant-based meat, to companies like Thermomix and Instant Brands betting big on software for the future.

We also saw kitchen-centric software players like SideChef, Drop and Innit load up on more partnerships with appliance and food brands to better tie together the meal journey, while Samsung NEXT acquired Whisk, a digital recipe and shopping commerce platform.

In short, it’s fairly obvious that an evolved software strategy is becoming table stakes for any kitchen appliance brand that wants to survive.

Amazon Continues Its Push Into Kitchen

If there’s been one takeaway from watching Amazon over the past few years, it’s that the company sees food and the kitchen as an important strategic battleground. This past year did nothing to dispel this belief, as the company introduced its own smart oven and continued to file weird food-related patents. Amazon also pushed forward with new delivery concepts for the home that bring together the different parts of the Amazon portfolio (voice ordering, smart home, grocery and more).

Grocery Delivery Space Race

Amazon wasn’t the only one looking to connect the smart home to grocery delivery this year. Walmart also debuted a new in-fridge delivery service called InHome. Meanwhile both companies and big grocery conglomerates like Kroger continue to invest in robotics and home delivery.

The reason for this growing interest in innovative home delivery concepts is pretty simple: more and more consumers are shopping online for groceries. Big platform players like Amazon and Google see a massive new opportunity, while established grocery players are forced to innovate to play defense.

No One Has Recreated The Success of the Keurig Model (Yet)

While much of the early focus for new kitchen startups has been on copying the Keurig model of pairing a piece of kitchen hardware with a robust consumer consumables business, unfortunately none has really been able to emulate the model for food products. There’s been no shortage of cocktail-making robots, coffee machines, 3D food printers, chai tea makers and other startups attracted to the concept of recurring revenue that food sales bring, but as we’ve seen, it’s hard to emulate the pod-model approach.

Some, like Tovala, look to have had some limited success in pairing cooking hardware with food delivery, while others like Brava, Nomiku and ChefSteps weren’t able to create sustainable models. Genie and Kitchenmate are making a go of it in the office environment, while Level couldn’t and had to shut its doors earlier this year.

I expect kitchen hardware entrepreneurs to continue trying to pair food sales with their products, but it will be tough sailing unless a company lands on a very compelling, easy-to-use solution that makes cooking truly turnkey.

Cooking Media: A Peloton For The Kitchen Emerges

Forget Buzzfeed Tasty quick-play cooking videos. In 2019, we saw the emergence of other players providing deeper, more personalized cooking guidance that emulates what Peloton or Mirror have done with home-fitness instruction. Food Network made the biggest splash with its Food Network Kitchen concept while others like FET Kitchen are creating their own hardware platforms.

For Buzzfeed’s part, they haven’t given up on Tasty quite yet. Instead, they partnered with Amazon to push their recipes onto the Echo Show, complete with step-by-step guidance. The combo is essentially a fairly quick and easy guided-cooking product.

Food Waste Reduction Comes Into Focus — Everywhere But The Home

If any place is lacking in innovation when it comes to reducing the amount of food we throw away, it’s the consumer kitchen. Sure, some startups are trying to rethink how we approach cooking by helping us work with the food we have, while others are rethinking the idea of food storage, but innovation in home food waste reduction is lacking compared to what we’re seeing on the restaurant and CPG fronts. We hope this changes in 2020.

True Home Cooking Robots Remained A Futuristic Vision in 2019

While single-function taskers like the Rotimatic did significant volume, and others like Suvie positioned themselves as a “cooking robot” for the home, the reality is we saw no significant progress toward a true multifunction consumer cooking robot. Companies like Sony see the opportunity to create a true home cooking robot, but for now food robots remain primarily the domain of restaurants, grocery and delivery.

Source: The Spoon

First Public Taste Test of Cultured Fish Maw in Hong Kong

Catherine Lamb wrote . . . . . . . . .

For many Western consumers, “fish maw” is an unfamiliar foodstuff. However, in China and surrounding regions, the ingredient, which is technically the dried swim bladder of large fish like sturgeon, is considered a delicacy. As a result, it’s both extremely expensive and a driver of severe overfishing. There’s even a black market for the stuff.

In Hong Kong, startup Avant Meats is working on a more sustainable way to feed the hunger for fish maw: growing it outside the animal. The company got one step closer to that goal last month, when it held the first public taste test of its cultured fish maw at the Future Food Summit at Asia Society Hong Kong.

The fish maw, grown from cells from a croaker fish, was embedded in a potato ball which was then deep-fried. Obviously we didn’t get to taste it ourselves, but in a video sent to The Spoon, taste testers noted the ball’s chewy, gelatinous texture, a hallmark of fish maw. Texture is one of the biggest hurdles for cell-based meat, so if Avant Meats has indeed nailed it, that could serve the company well as it heads to market.

When I spoke with Avant Meats co-founder and CEO Carrie Chan back in March, she explained that they had decided to focus on fish maw as their first product because of its simple composition, which allows them to speed up R&D, scale quickly, and come to market at a lower price point. Another reason they chose to focus on fish maw is because of its popularity with consumers in China and Hong Kong, their initial target demographic. However, according to a press release sent to The Spoon, their next product will be a fish filet that is intended for both Eastern and Western menus.

This year has been a busy one for cultured meat companies in Asia. Back in March, Shiok Meats debuted its cell-based shrimp in the startup’s home country of Singapore, and Japan-based Integriculture recently did a taste test of cultured foie gras.

American companies like Memphis Meats, JUST, and Wild Type have also done several tastings of their own cell-based products, some on significantly larger scales. However, since cell-based (cultivated?) meat will likely debut in Asia, it’s exciting to see the increase in cultured meat and seafood activity in the area — especially for products developed specifically to appeal to Asian palates.

Avant Meats has raised an undisclosed pre-seed round and has a team of four in its Hong Kong HQ. They’re hoping to reach pilot production by late 2022/early 2023.

Source: The Spoon

Wearable Sweat Sensor Detects Gout-Causing Compounds

There are numerous things to dislike about going to the doctor: Paying a copay, sitting in the waiting room, out-of-date magazines, sick people coughing without covering their mouths. For many, though, the worst thing about a doctor’s visit is getting stuck with a needle. Blood tests are a tried-and-true way of evaluating what is going on with your body, but the discomfort is unavoidable. Or maybe not, say Caltech scientists.

In a new paper published in Nature Biotechnology, researchers led by Wei Gao, assistant professor of medical engineering, describe a mass-producible wearable sensor that can monitor levels of metabolites and nutrients in a person’s blood by analyzing their sweat. Previously developed sweat sensors mostly target compounds that appear in high concentrations, such as electrolytes, glucose, and lactate. Gao’s sweat sensor is more sensitive than current devices and can detect sweat compounds of much lower concentrations, in addition to being easier to manufacture, the researchers say.

The development of such sensors would allow doctors to continuously monitor the condition of patients with illnesses like cardiovascular disease, diabetes, or kidney disease, all of which result in abnormal levels of nutrients or metabolites in the bloodstream. Patients would benefit from having their physician better informed of their condition, while also avoiding invasive and painful encounters with hypodermic needles.

“Such wearable sweat sensors have the potential to rapidly, continuously, and noninvasively capture changes in health at molecular levels,” Gao says. “They could enable personalized monitoring, early diagnosis, and timely intervention.”

Gao’s work is focused on developing devices based on microfluidics, a name for technologies that manipulate tiny amounts of liquids, usually through channels less than a quarter of a millimeter in width. Microfluidics are ideal for an application of this sort because they minimize the influence of sweat evaporation and skin contamination on the sensing accuracy. As freshly supplied sweat flows through the microchannels, the device can make more accurate measurements of sweat and can capture temporal changes in concentrations.

Until now, Gao and his colleagues say, microfluidic-based wearable sensors were mostly fabricated with lithography-evaporation, a complicated and expensive process. His team instead opted to make their biosensors out of graphene, a sheet-like form of carbon. Both the graphene-based sensors and the tiny microfluidic channels are created by engraving plastic sheets with a carbon dioxide laser, a device that is now so common that it is available to home hobbyists.

The research team opted to have their sensor measure respiratory rate, heart rate, and levels of uric acid and tyrosine. Tyrosine was chosen because it can be an indicator of metabolic disorders, liver disease, eating disorders, and neuropsychiatric conditions. Uric acid was chosen because, at elevated levels, it is associated with gout, a painful joint condition that is on the rise globally. Gout occurs when high levels of uric acid in the body begin crystallizing in the joints, particularly those of the feet, causing irritation and inflammation.

To see how well the sensors performed, the researchers ran a series of tests with healthy individuals and patients. To check sweat tyrosine levels, which are influenced by a person’s physical fitness, they used two groups of people: trained athletes and individuals of average fitness. As expected, the sensors showed lower levels of tyrosine in the sweat of the athletes. To check uric acid levels, they took a group of healthy individuals and monitored their sweat while they were fasting as well as after they ate a meal rich in purines, compounds in food that are metabolized into uric acid. The sensor showed uric acid levels rising after the meal. Gao’s team also performed a similar test with gout patients. Their uric acid levels, the sensor showed, were much higher than those of healthy people.

To check the accuracy of the sensors, the researchers also drew blood samples from the gout patients and healthy subjects. The sensors’ measurements of uric acid levels strongly correlated with levels of the compound in the blood.
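The kind of validation described here boils down to computing a correlation between paired sensor and blood measurements. A minimal sketch follows, with invented readings rather than study data:

```python
from math import sqrt

# Pearson correlation between paired sweat-sensor and blood-assay
# readings. All numbers are invented for illustration; the study's
# actual data and units are not reproduced here.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

sweat_uric_acid = [22.0, 35.5, 41.2, 55.8, 63.1, 78.4]  # sensor signal (a.u.)
blood_uric_acid = [3.1, 4.4, 5.0, 6.3, 7.1, 8.6]        # lab assay (mg/dL)

r = pearson_r(sweat_uric_acid, blood_uric_acid)
print(f"Pearson r = {r:.3f}")
```

An r close to 1 indicates the noninvasive sweat readings track the blood values, which is what makes the sensor a plausible substitute for a needle stick.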

Gao says the high sensitivity of the sensors, along with the ease with which they can be manufactured, means they could eventually be used by patients at home to monitor conditions like gout, diabetes, and cardiovascular diseases. Having accurate real-time information about their health could even allow a patient to adjust their own medication levels and diet as required.

“Considering that abnormal circulating nutrients and metabolites are related to a number of health conditions, the information collected from such wearable sensors will be invaluable for both research and medical treatment,” Gao says.

Source: California Institute of Technology