The Power of Advanced NGS Technology in Routine Pathogen Testing of Food

Stephanie Pollard wrote . . . . . . . . .

The food industry is beginning to transition into an era of big data and analytics unlike anything the industry has ever experienced. However, while the evolution of big data brings excitement and the buzz of new possibilities, it also comes coupled with an element of confusion due to the lack of tools for interpretation and lack of practical applications of the newly available information.

As we step into this new era and begin to embrace these changes, we need to invest time to educate ourselves on the possibilities before us, then make informed and action-oriented decisions on how to best use big data to move food safety and quality into the next generation.

Stephanie Pollard will be presenting “The Power of Advanced NGS Technology in Routine Pathogen Testing” at the 2018 Food Safety Consortium | November 13–15.

One of the big questions for big data and analytics in the food safety industry is the exact origin of this new data. Next Generation Sequencing (NGS) is one new and disruptive technology that will contribute significantly to a data explosion in our industry.

NGS-based platforms offer visibility that was previously impossible with PCR and other technologies. These platforms generate millions of sequences simultaneously, enabling greater resolution into the microbial ecology of food and environmental surfaces.

This represents a seismic shift in the food safety world. It changes the age-old food microbiology question from “Is this specific microbe in my sample?” to “What is the microbial makeup of my sample?”

Traditionally, microbiologists have relied on culture-based technologies to measure the microbial composition of foods and inform risk management decisions. While these techniques have been well studied and are standard practices in food safety and quality measures, they only address a small piece of a much bigger microbial puzzle. NGS-based systems allow more complete visibility into this puzzle, enabling more informed risk management decisions.

With these advances, one practical application of NGS in existing food safety management systems is in routine pathogen testing. Routine pathogen testing is a form of risk assessment that typically gives a binary presence/absence result for a target pathogen.

NGS-based platforms can enhance this output by generating more than the standard binary result through a tunable resolution approach. NGS-based platforms can be designed to be as broad, or as specific, as desired to best fit the needs of the end user.

Imagine using an NGS-based platform for your routine pathogen testing needs, but instead of limiting the information you gather to yes/no answers for a target pathogen, you also obtain additional pertinent information, including serotype and/or strain identification, resident/transient designation, predictive shelf-life analysis, microbiome analysis, and predictive risk assessment.

By integrating an NGS-based platform into routine pathogen testing, one can begin to build a microbial database of the production facility, which can be used to distinguish resident pathogens and/or spoilage microbes from transient ones. This information can be used to monitor and improve existing or new sanitation practices as well as provide valuable information on ingredient quality and safety.

This data can also feed directly into supplier quality assurance programs and enable more informed decisions regarding building partnerships with suppliers who offer superior products.

Similarly, by analyzing the microbiome of a food matrix, food producers can identify the presence of spoilage microbes to inform more accurate shelf-life predictions, as well as evaluate the efficacy of interventions designed to keep those microbes from proliferating in a product (e.g., modified packaging strategies, storage conditions, or processing parameters).

Envision a technology that enables all of the aforementioned possibilities while requiring minimal disruption to integrate into existing food safety management systems. NGS-based platforms offer answers to traditional pathogen testing needs for presence/absence information, all the while providing a vast amount of additional information. Envision a future in which we step outside of our age-old approach of assessing the safety of the food that we eat via testing for the presence of a specific pathogen. Envision a future in which we raise our standards for safety and focus on finding whatever is there, without having to know in advance what to look for.

Every year we learn of new findings that challenge our previously limited view of which pathogens can survive and proliferate on certain food products where they had been overlooked (e.g., Listeria on melons). Advanced NGS technologies allow us to break free of those associations and focus on truly assessing the safety and quality of our products by providing a deeper understanding of the molecular makeup of our food.

Source: Food Safety Tech


Cute Automated Delivery Robot Unveiled

Annie Palmer wrote . . . . . . . . .

Your next Starbucks latte might be delivered by an adorable roving robot.

Postmates, the food and grocery delivery company, has debuted its new autonomous delivery robot, named ‘Serve.’

The four-wheeled rover closely resembles a brightly colored cooler, except it has huge, saucer-shaped eyes and an array of cameras meant to help it navigate the streets.

For now, Postmates will dispatch the first Serve robots in Los Angeles, before rolling out more robots in several cities across the U.S. over the next year.

The firm says Serve will replace human deliverymen when items only need to be delivered short distances, such as a couple of blocks around the corner.

When a user places an order, a Serve might be dispatched and show up at their door.

From there, they’ll either enter a code on the device’s touchscreen or use their phone to unlock the latch at the top of the device.

They simply reach in, grab their food item and then Serve is on its way.

Serve is ‘all-electric,’ can carry a payload of up to 50lbs and can travel up to 30 miles on a single charge.

The robot uses a combination of cameras and LIDAR technology to get around town.

‘Using Lidar and the most advanced sensors of any automated delivery rover, Serve creates a virtual picture of the world in real time,’ Postmates explained.

‘An interactive touch screen is a part of how Serve communicates.’

It also moves at ‘walking speed’ so as to not get in anyone’s way while rolling down the sidewalk.

There are several flashing LED strips along the sides of the device that act as turn signals, flashing when it changes direction.

A touchscreen on top of the device acts as a way for customers to interact with it.

Serve is meant to appear human-like, with huge eyes that blink, and it will even play music, like a pseudo ice-cream truck.

The display is equipped with video chat software in case of an emergency, while a ‘Help’ button on the device can also be triggered if a user has a question.

Even though it’s semi-autonomous, a human is always monitoring the device from a remote control room and is able to take over operating Serve at any time using a game controller.

Postmates created Serve with the community and regulators in mind. It faces stricter regulations in cities like San Francisco, where tech companies must obtain a permit before they can test robots on the streets.

The firm describes Serve as a ‘respectful member of the community’ that yields to pedestrians and won’t hog the sidewalk.

‘Serve safely walks alongside pedestrians, navigates around fire hydrants, and respects our sidewalks,’ Postmates said.

The firm said it hopes Serve will help cut costs and save time on deliveries.

And in the future, Postmates envisions even more capabilities for the roving robot.

‘It could patrol the neighborhood,’ Bastian Lehmann, Postmates’ co-founder and CEO, told Wired.

‘Or you could use it for evil things, like it could write parking tickets.’

Source: Daily Mail

Video: Tasting the World’s First Test-Tube Steak

Israel-based Aleph Farms says it has created the world’s first steak grown in a lab. We got a taste.

The race is on to create lab-grown meat products. Still, little is known about their safety and potential impact. In this episode of Moving Upstream, WSJ’s Jason Bellini visits entrepreneurs, scientists, and ranchers to understand how it’s made, and gets a first taste of steak grown from cultured cells.

Watch video at The Wall Street Journal (9:58 minutes) . . . . .

Read also:

Aleph Farms Puts a Steak in the Ground, Unveils New Cell-Based Cut of Meat . . . . .

World’s Smallest Wearable Device Warns of UV Exposure, Enables Precision Phototherapy

Marla Paul wrote . . . . . . . . .

The world’s smallest wearable, battery-free device has been developed by Northwestern Medicine and Northwestern’s McCormick School of Engineering scientists to measure exposure to light across multiple wavelengths, from the ultraviolet (UV) to visible and even infrared parts of the solar spectrum. It can record up to three separate wavelengths of light at one time.

The device’s underlying physics and extensions of the platform to a broad array of clinical applications are reported in a study to be published Dec. 5 in Science Translational Medicine. These foundational concepts form the basis of consumer devices launched in November to alert consumers to their UVA exposure, enabling them to take action to protect their skin from sun damage.

When the solar-powered, virtually indestructible device was mounted on human study participants, it recorded multiple forms of light exposure during outdoor activities, even in the water. The device monitored therapeutic UV light in clinical phototherapy booths for psoriasis and atopic dermatitis as well as blue light phototherapy for newborns with jaundice in the neonatal intensive care unit. It also demonstrated the ability to measure white light exposure for seasonal affective disorder.

As such, it enables precision phototherapy for these health conditions, and it can monitor, separately and accurately, UVB and UVA exposure for people at high risk for melanoma, a deadly form of skin cancer. For recreational users, the sensor can help warn of impending sunburn.

The device was designed by a team of researchers in the group of John Rogers, the Louis Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering in the McCormick School of Engineering and a professor of neurological surgery at Northwestern University Feinberg School of Medicine.

“From the standpoint of the user, it couldn’t be easier to use – it’s always on yet never needs to be recharged,” Rogers said. “It weighs as much as a raindrop, has a diameter smaller than that of an M&M and the thickness of a credit card. You can mount it on your hat or glue it to your sunglasses or watch.”

It’s also rugged, waterproof and doesn’t need a battery. “There are no switches or interfaces to wear out, and it is completely sealed in a thin layer of transparent plastic,” Rogers said. “It interacts wirelessly with your phone. We think it will last forever.”

Rogers tried to break it. His students dunked devices in boiling water and in a simulated washing machine. They still worked.

Northwestern scientists are particularly excited about the device’s use for measuring the entire UV spectrum and accumulating total daily exposure.

“There is a critical need for technologies that can accurately measure and promote safe UV exposure at a personalized level in natural environments,” said co-senior author Dr. Steve Xu, instructor in dermatology at Feinberg and a Northwestern Medicine dermatologist.

“We hope people with information about their UV exposure will develop healthier habits when out in the sun,” Xu said. “UV light is ubiquitous and carcinogenic. Skin cancer is the most common type of cancer worldwide. Right now, people don’t know how much UV light they are actually getting. This device helps you maintain an awareness and for skin cancer survivors, could also keep their dermatologists informed.”

Light wavelengths interact with the skin and body in different ways, the scientists said.

“Being able to split out and separately measure exposure to different wavelengths of light is really important,” Rogers said. “UVB is the shortest wavelength and the most dangerous in terms of developing cancer. A single photon of UVB light is 1,000 times more erythrogenic, or redness inducing, compared to a single photon of UVA.”

In addition, the intensity of the biological effect of light changes constantly depending on weather patterns, time and space.

“If you’re out in the sun at noon in the Caribbean, that sunlight energy is very different than noon on the same day in Chicago.” – dermatologist Steve Xu

Skin cancer is reaching epidemic proportions in the U.S. Basal cell carcinoma and squamous cell carcinoma of the skin account for more than 5.4 million cases per year at a cost of $8.1 billion yearly. In 2018, there will be an estimated 178,000 new cases of melanoma, causing 9,000 deaths. Every hour, one person dies of melanoma.

First accurate dosing of phototherapy

Currently, the amount of light patients actually receive from phototherapy is not measured. “We know that the lamps for phototherapy are not uniform in their output — a sensor like this can help target problem areas of the skin that aren’t getting better,” said Xu. Doctors don’t know how much blue light a jaundiced newborn is actually absorbing or how much white light a patient with seasonal affective disorder gets from a light box. The new device will measure this for the first time and allow doctors to optimize the therapy by adjusting the position of the patient or the light source.

Because the device operates in an “always on” mode, its measurements are more precise and accurate than any other light dosimeter now available, the scientists said. Current dosimeters only sample light intensity briefly at set time intervals and assume that the light intensity at times between those measurements is constant, which is not necessarily the case, especially in active, outdoor use scenarios. They are also clunky, heavy and expensive.
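The advantage of an always-on, integrating sensor over an interval-sampling dosimeter can be illustrated with a small sketch. The intensity profile and all numbers below are invented for demonstration and are not the device's actual algorithm or values:

```python
# Illustrative sketch, not the device's firmware: why an always-on,
# integrating sensor out-performs a dosimeter that samples at intervals.
# The intensity profile and all numbers are made up for demonstration.

def intensity(t):
    """Hypothetical sunlight intensity (arbitrary units): alternating
    one-minute stretches of direct sun (1.0) and shade (0.1)."""
    return 1.0 if (t // 60) % 2 == 0 else 0.1

def integrated_dose(t_end, dt=1):
    """Always-on sensor: accumulate intensity continuously (here, every second)."""
    return sum(intensity(t) * dt for t in range(0, t_end, dt))

def sampled_dose(t_end, interval=120):
    """Interval-sampling dosimeter: reads intensity at each sample time and
    assumes it stays constant until the next sample."""
    return sum(intensity(t) * min(interval, t_end - t)
               for t in range(0, t_end, interval))

t_end = 600  # ten minutes
print(integrated_dose(t_end))  # ~330 (the true accumulated dose)
print(sampled_dose(t_end))     # 600.0 (every 120 s sample happens to hit direct sun)
```

Here the interval sampler reads only during the sunny minutes, because its 120-second period aligns with the one-minute sun/shade cycle (aliasing), nearly doubling the estimated dose; a sensor that integrates continuously has no such blind spots.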

How the tiny sensor works

Light passes through a window in the sensor and strikes a millimeter-scale semiconductor photodetector. This device produces a minute electrical current with a magnitude proportional to the intensity of the light. This current passes to an electronic component called a capacitor where the associated charge is captured and stored. A communication chip embedded in the sensor reads the voltage across this capacitor and passes the result digitally and wirelessly to the user’s smartphone. At the same time, it discharges the capacitor, thereby resetting the device.

Multiple detectors and capacitors allow measurements of UVB and UVA exposure separately. The device communicates with the user’s phone to access weather and global UV index information (the amount of light coming through the clouds). By combining this information, the user can infer how much time they have been in the direct sun and out of shade. The phone can then send an alert if they have been in the sun too long and need to duck into the shade.
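The integrate-and-reset readout described above can be sketched as a toy model. All component values and intensities below are hypothetical, chosen only to make the arithmetic visible; this is not the actual device design:

```python
# Toy model of the readout chain described above: photocurrent proportional
# to light intensity charges a capacitor; the communication chip reads the
# voltage (V = Q/C), reports it, and discharges the capacitor to reset it.
# Responsivity and capacitance values are made up for illustration.

class ToyChannel:
    def __init__(self, responsivity=1e-9, capacitance=1e-4):
        self.responsivity = responsivity  # photocurrent per unit intensity (hypothetical)
        self.capacitance = capacitance    # farads (hypothetical)
        self.charge = 0.0                 # coulombs accumulated since the last read

    def expose(self, intensity, seconds):
        # Photodetector: current I = responsivity * intensity; charge Q += I * t.
        self.charge += self.responsivity * intensity * seconds

    def read_and_reset(self):
        # Chip reads the capacitor voltage, then discharges the capacitor.
        volts = self.charge / self.capacitance
        self.charge = 0.0
        return volts

# Separate channels mimic the separate UVA/UVB detector-capacitor pairs.
uva, uvb = ToyChannel(), ToyChannel()
uva.expose(intensity=250.0, seconds=600)  # ten minutes of hypothetical UVA
uvb.expose(intensity=25.0, seconds=600)
print(uva.read_and_reset())  # voltage proportional to accumulated UVA dose
print(uvb.read_and_reset())
```

Because the capacitor accumulates charge whether or not the chip is reading, no exposure between readouts is lost, which is the basis of the “always on” behaviour described earlier.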

Called “My Skin Track UV”, the UVA version of the platform is now commercially available.

Source: Northwestern University


Revolutionary Technology Pinpoints Biopsies to Detect Prostate Cancer

A team of engineers and medical researchers found that the technology enabled surgeons to pick up clinically relevant cancers that were missed when using current visual detection methods. The best approach would be to use both techniques in tandem, according to the findings published today in European Urology.

The software is deployed via a system called SmartTarget®.

The advent of MRI-targeted biopsies, where MRI scans are used to show surgeons where a tumour lies before they conduct a biopsy (tissue sample), has improved detection rates from 50% to close to 90% over the last five years.

Now, the SmartTarget system has further enhanced this technique by allowing a 3D model of the prostate and cancer to be created for each patient from their MRI scans using advanced image processing and machine learning algorithms.

During a biopsy, this model is fused with ultrasound images to highlight the area of concern, which otherwise does not appear in the ultrasound images, helping to guide the surgeon while conducting the procedure.

Until last year, when MRI targeting was introduced, the established way to test for prostate cancer involved taking a biopsy from the prostate without knowing where in the prostate a tumour was likely to be, resulting in close to half of life-threatening cancers being missed.

“Prostate cancer detection has been improving at a very fast rate in recent years, and this technology pushes the science forward even further, enabling clinicians to pick up prostate cancer quickly so that patients can access the right treatment early enough,” said co-senior author Professor Hashim Ahmed, who began the research in UCL Medicine before moving to Imperial College London.

For the present study, 129 people with suspected prostate cancer underwent two biopsies – one using the SmartTarget system, and one where surgeons could only visually review the MRI scans. Funded by the UK Department of Health and Social Care and Wellcome Health Innovation Challenge Fund, the study was conducted at UCLH.

The two strategies combined detected 93 clinically significant prostate cancers, with each of them picking up 80 of these cancers; each missed 13 that the other method picked up.
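The combined total follows from the reported overlap: since each strategy found 80 cancers and each missed 13 found by the other, 67 cancers were detected by both methods. A quick count, using only the figures above:

```python
# Checking the reported overlap: each strategy detected 80 clinically
# significant cancers, and each missed 13 that the other picked up,
# so 80 - 13 = 67 cancers were found by both methods.
per_method = 80
missed_by_other = 13

found_by_both = per_method - missed_by_other          # 67
total_distinct = found_by_both + 2 * missed_by_other  # 67 + 13 + 13

print(total_distinct)  # 93, matching the combined total reported in the study
```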

The researchers say that surgeons’ visual review of MRI scans should be used in tandem with SmartTarget as using this technique enables surgeons to learn to make subtle adjustments such as adapting to the movement of the patient and the prostate as the needle is inserted.

“We developed the SmartTarget system to equip surgeons with vital information about the size, shape and location of prostate tumours during a biopsy that is otherwise invisible on ultrasound images,” explained co-senior author Dr Dean Barratt (UCL Medical Physics & Biomedical Engineering and UCL Centre for Medical Image Computing), who invented and led the development of the SmartTarget system.

“The software provides them with a clear target. As MRI-targeted biopsies require a very high degree of expertise and experience, we hope that the imagery displayed by SmartTarget will help to bring high accuracy prostate cancer diagnosis to a much wider range of patients and hospitals.”

The researchers say the new methods could reduce the number of biopsies needed, as well as the unnecessary surgeries caused by overdiagnosis of less harmful cancers.

The SmartTarget software has been commercialised by SmartTarget Ltd, a company spun out by UCL’s commercialisation company UCL Business PLC (UCLB), and the system is already in use by several hospitals in the UK and USA.

The inter-disciplinary study brought together engineers, urologists and radiologists, supported by the UCL Translational Research Office in project management and navigating the translational and regulatory pathway involved in taking the project from the lab bench to the operating room.

“There has been much discussion and speculation in the media recently on the degree to which computers and artificial intelligence will be integrated into clinical care. Studies such as this one are extremely important as they provide valuable evidence on the performance of a new technology in the clinical setting,” said co-senior author Professor Mark Emberton (Dean, UCL Medical Sciences).

“With this study we now have hard data showing that SmartTarget is as good as a group of experts in targeting tumours in the prostate, and have a glimpse of how clinicians and computers will be working together in the future for the good of the patient.”

Source: University College London