Newswise — HOUSTON – (June 28, 2023) – Researchers from Rice University and the University of Maryland have developed groundbreaking technology in the field of optical imaging. This new technology, called NeuWS (short for "neural wavefront shaping"), has the potential to revolutionize imaging capabilities by enabling cameras to see through challenging environments such as fog, smoke, heavy rain, murky water, skin, bone, and other media that scatter light and obscure objects.

The ability to image through scattering media has long been regarded as the "holy grail problem" of optical imaging. Scattering limits the effective use of light in many scenarios that demand high spatial resolution. By effectively undoing the effects of scattering, the researchers aim to significantly extend the reach of imaging technology.

NeuWS, the core technique developed by Ashok Veeraraghavan's lab at Rice University in collaboration with Christopher Metzler's research group at the University of Maryland, employs neural wavefront shaping to tackle this challenge. The technology holds immense promise in overcoming scattering-related obstacles in imaging applications.

For instance, autonomous vehicles face significant challenges in capturing high-quality images under adverse weather conditions. Biologists imaging deep tissue in vivo run into similar issues because skin and other tissue layers scatter light, and underwater photographers are limited in imaging distant subjects because water scatters light. By undoing the effects of scattering, NeuWS could improve imaging in each of these scenarios, from clearer machine vision in bad weather to non-invasive deep-tissue visualization and sharper images of distant underwater scenes.

In essence, scattering is the underlying technical problem in these and numerous other circumstances. The NeuWS technology opens up exciting possibilities for overcoming scattering challenges and expanding the frontiers of imaging technology in various fields.

According to Veeraraghavan, this development represents a significant advance on the scattering problem, and it holds the potential for practical implementation. While much work remains before prototypes can be built for specific application domains, the demonstrated approach could carry over to each of those domains.

The underlying concept of NeuWS revolves around the understanding that light waves possess two key properties: magnitude and phase. Magnitude represents the energy carried by the wave at a specific location, while phase indicates the oscillation state of the wave at that location. Overcoming scattering requires measuring the phase, but directly measuring it is challenging due to the high-frequency nature of optical light.
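To make the magnitude/phase distinction concrete, a light field at a point is commonly modeled as a complex number. The snippet below is an illustrative sketch (not from the paper) showing why a camera sensor, which records only intensity, loses the phase:

```python
import numpy as np

# A monochromatic light field at a point can be modeled as a complex number:
# magnitude = energy carried by the wave, phase = its oscillation state.
amplitude = 2.0          # magnitude of the wave
phase = np.pi / 3        # oscillation state, in radians

field = amplitude * np.exp(1j * phase)  # complex field representation

# Both properties are recoverable from the complex value:
print(np.abs(field))     # magnitude -> 2.0
print(np.angle(field))   # phase     -> ~1.047 (pi/3)

# But a sensor records only intensity |field|**2, so phase is lost:
intensity = np.abs(field) ** 2          # 4.0 -- carries no phase information
```

Any field with the same magnitude but a different phase would produce exactly the same intensity reading, which is why phase must be recovered indirectly.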

To overcome this obstacle, the researchers employ a measurement technique using "wavefronts," which are single measurements containing both phase and intensity information. They then employ sophisticated backend processing to rapidly extract phase information from several hundred wavefront measurements per second.

Metzler, who is an assistant professor of computer science at the University of Maryland and an alumnus of Rice University, explained that the technical challenge lies in devising a method to swiftly measure phase information. His involvement in the earlier wavefront-processing technology called WISH, developed at Rice University and published in 2020, positioned him well for this latest endeavor.

Veeraraghavan noted that their earlier technology, WISH, addressed a similar problem but made the assumption that the environment was static and predictable. However, in the real world, conditions are constantly changing.

With NeuWS, their goal is not only to undo the effects of scattering but to do so rapidly enough that the scattering medium itself remains unchanged during the measurement process.

Instead of directly measuring the oscillation state, they measure its correlation with known wavefronts. By interfering a known wavefront with the unknown wavefront and measuring the resulting interference pattern, they determine the correlation between the two wavefronts.

Metzler illustrated this concept using the example of looking at the North Star through a cloudy haze at night. If they know the expected appearance of the North Star and observe how it becomes blurred in a particular way, they can deduce how everything else will be blurred.

Veeraraghavan clarified that this is not a mere comparison but a correlation-based approach. By measuring at least three such correlations, they can accurately reconstruct the unknown wavefront.
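The "at least three correlations" idea resembles classic three-step phase-shifting interferometry. The sketch below is a minimal simulation of that general principle, not the authors' implementation: a unit-magnitude unknown wavefront is interfered with three known reference wavefronts, and its phase is recovered from the three resulting intensity patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown wavefront: a random phase at each pixel, unit magnitude for simplicity.
phase_true = rng.uniform(-np.pi, np.pi, size=(8, 8))
unknown = np.exp(1j * phase_true)

# Interfere with three known reference wavefronts (phase shifts 0, 120, 240 deg)
# and record only the intensity of each interference pattern.
shifts = [0, 2 * np.pi / 3, 4 * np.pi / 3]
I = [np.abs(unknown + np.exp(1j * t)) ** 2 for t in shifts]

# Each intensity is A + B*cos(phase_true - shift); three samples pin down the
# phase exactly (standard three-step phase-shifting formula).
phase_est = np.arctan2(np.sqrt(3) * (I[1] - I[2]), 2 * I[0] - I[1] - I[2])

print(np.allclose(phase_est, phase_true))  # True
```

Each intensity image alone is ambiguous, but the three together determine the unknown phase at every pixel, which is the sense in which three correlations suffice.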

The researchers demonstrated the practicality of their technology by utilizing state-of-the-art spatial light modulators capable of performing several hundred measurements per minute. Veeraraghavan, Metzler, and their team successfully captured videos of moving objects that were originally obscured by scattering media, using the modulator and their computational approach.

Haiyun Guo, a Ph.D. student in Veeraraghavan's research group and one of the lead authors of the study, emphasized that this achievement serves as a proof-of-principle for the real-time correction of light scattering using their technology.

In one series of experiments, the researchers placed scattering media between an overhead camera and a spinning microscope slide containing printed images of an owl or a turtle. The media included onion skin, slides coated with nail polish, slices of chicken breast tissue, and light-diffusing films. In each case, NeuWS corrected for the light scattering and the researchers produced clear videos of the spinning figures.

Metzler explained that they developed algorithms capable of continuously estimating both the scattering and the scene. This computational approach, utilizing neural representation, enables efficient and rapid correction of scattering, making the entire process possible.

NeuWS rapidly modulates light from incoming wavefronts to generate several slightly altered phase measurements. These altered phases are then fed into a neural network comprising 16,000 parameters, which swiftly computes the correlations needed to recover the wavefront's original phase information.
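As a rough illustration of what a 16,000-parameter network looks like (the layer widths and sinusoidal activation below are assumptions for the sketch, not the authors' architecture), a small coordinate network mapping pixel locations to phase values can represent an entire phase map with only a few thousand scalars:

```python
import numpy as np

rng = np.random.default_rng(1)

# Coordinate network: maps a pixel location (x, y) to a phase estimate.
# Layer widths are illustrative; only the parameter count is in the ballpark
# of the ~16,000 mentioned above.
sizes = [2, 125, 125, 1]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def phase_net(xy):
    h = xy
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.sin(h @ W + b)            # sinusoidal activation (SIREN-style)
    return h @ weights[-1] + biases[-1]  # linear output: phase at (x, y)

n_params = sum(W.size for W in weights) + sum(b.size for b in biases)
print(n_params)  # 16251 -- a compact stand-in for a full-resolution phase map
```

A network this small can be re-fit quickly as new measurements arrive, which is consistent with the speed advantage described in the quotes below.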

"The inclusion of neural networks enables faster processing by reducing the number of measurements needed," explained Veeraraghavan.

Metzler highlighted the significant advantage of this approach, stating, "This reduction in measurements is actually the key benefit. With fewer measurements, we require much less time for capturing. It is this efficiency that allows us to capture videos rather than just still frames."

The research was supported by the Air Force Office of Scientific Research (FA9550-22-1-0208), the National Science Foundation (1652633, 1730574, 1648451) and the National Institutes of Health (DE032051), and partial funding for open access was provided by the University of Maryland Libraries’ Open Access Publishing Fund.

-30-

Journal Link: Science Advances