Newswise — Four scientists at the Department of Energy’s SLAC National Accelerator Laboratory will receive Early Career Research Program awards for research that’s developing new ways to study fundamental particles with machine learning and to probe nanoscale objects and quantum materials with powerful X-ray laser beams.

Tais Gorkhover, Michael Kagan, Kazuhiro Terao and Josh Turner are among 84 recipients selected from a large pool of university and national laboratory applicants, the DOE Office of Science announced today. They will each receive $500,000 per year for five years in support of their work. 

The grant program is designed to bolster the nation’s scientific workforce by providing support to exceptional researchers during the crucial early career years, when many scientists do their most formative work, the DOE announcement said.

“It’s absolutely thrilling to have four of our best and brightest young scientists achieve this recognition,” said JoAnne Hewett, SLAC’s chief research officer and associate lab director for fundamental physics. “Our winners span a broad set of science topics, and this reflects our depth of talent across the lab.”

SLAC Director Chi-Chang Kao added, “Our Early Career Award winners are representative of the many extraordinary young scientists at SLAC whose careers we support and nurture, and whose work is vital to the future of the lab. It’s great to see them get this recognition.”

Tais Gorkhover: X-ray Holograms of Nanoparticles

Tais Gorkhover came to SLAC in 2014 from the Technical University of Berlin on a Peter Paul Ewald fellowship from the Volkswagen Foundation, and in 2016 she was named a Panofsky Fellow, an award that offers generous research support to exceptional and promising young scientists and gives them a role in shaping the direction of research at the lab.

As an investigator with the Stanford PULSE Institute, Gorkhover has been working on new ways to make high-resolution 3-D images of free-floating nanoparticles with the lab’s Linac Coherent Light Source (LCLS) X-ray laser. These particles play a role in combustion, climate and catalysis, among other things, but they are fragile and short-lived, which makes them hard to observe in their natural environments.

In March, she and her colleagues reported that they had developed a new technique called in-flight holography and used it to make the first X-ray holograms of nanosized viruses that had not been formed into crystals or attached to any surface. What’s more, the new technique greatly simplifies the interpretation of the LCLS data that goes into making the hologram, so what used to take thousands of steps now takes only two.

With the Early Career Award, she says, she’ll be working on ways to capture short-lived changes in nanoscale objects by making two holograms in rapid succession and comparing them to see what changed in between.

“You can get only so much resolution out of an image, even with holography,” she says. “But in principle this technique could be sensitive to changes that take place in femtoseconds, or millionths of a billionth of a second, and that are even smaller than the details visible in either of the holograms.”
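For a rough sense of the mechanics, the sketch below shows the general idea in miniature. It is not PULSE analysis code; it simply assumes a Fourier-transform-holography-style reconstruction, in which a single inverse FFT of a recorded hologram recovers an image of the object, echoing the “only two steps” description above, and it takes the difference of two such reconstructions to highlight what changed between pulses.

```python
# Rough illustration only, not PULSE analysis code. It assumes a Fourier-transform-
# holography-style reconstruction, where a single inverse FFT of the recorded
# hologram recovers a real-space image of the object; subtracting two such
# reconstructions then highlights what changed between the two pulses.
import numpy as np

def reconstruct(hologram: np.ndarray) -> np.ndarray:
    """Recover a real-space image from a far-field hologram (one inverse FFT)."""
    return np.abs(np.fft.fftshift(np.fft.ifft2(hologram)))

rng = np.random.default_rng(0)
hologram_first = rng.random((128, 128))                           # stand-in for pulse 1
hologram_second = hologram_first + 0.01 * rng.random((128, 128))  # sample changed slightly

difference = reconstruct(hologram_second) - reconstruct(hologram_first)
print("largest change between the two reconstructions:", np.max(np.abs(difference)))
```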

Some of these experiments will take place at LCLS and others will employ table-top lasers, using a method called high harmonic generation to shift the energies and frequencies of laser light into higher ranges, Gorkhover says. If the team succeeds in developing a table-top version, high-resolution holographic imaging of changes in nanoscale objects will become available to many more researchers.

Michael Kagan: Digging Deeper into the Higgs Boson

Michael Kagan came to SLAC from Harvard University in 2012. As a postdoctoral researcher, he helped to build and test a new detector for the ATLAS experiment, one of two experiments at Europe’s Large Hadron Collider where the Higgs boson was discovered in 2012.

Like Gorkhover, he was named a Panofsky Fellow in 2016, and he continues to work on the SLAC ATLAS team while based at CERN, the particle physics center in Geneva that’s home to the LHC.

Even though the discovery of the Higgs boson filled in the last missing piece of the Standard Model that describes the fundamental particles and forces, there’s still a lot we don’t know about the Higgs, Kagan says. The particle can only be detected indirectly, by analyzing the particles it produces when it decays.

Only about one in a billion particle collisions at the LHC produces a Higgs, he notes. And the two types of collisions he is interested in – those that produce relativistic Higgs bosons traveling at nearly the speed of light or produce two Higgs particles at once – are even more rare, occurring about once in a trillion collisions.

Kagan is leading a group that develops tools to identify the signatures of bottom quarks in the ATLAS detector data. He’s also an ATLAS experiment leader in developing and applying machine learning tools to the torrent of LHC data, with the goal of winnowing out and studying these rare events in which Higgs bosons decay into two bottom quarks.

“We look for machine learning algorithms that may have been built for something different, like analyzing text or images, but that may be adaptable to our data and for what we want to do,” Kagan says. “When something doesn’t exist for what we need to do, we also develop algorithms of our own.”
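One well-known example of that kind of borrowing is sketched below. It is a generic illustration, not ATLAS software: a jet of particles is treated as a small two-dimensional “image” of calorimeter energy deposits and classified with a convolutional network of the sort originally built for recognizing ordinary pictures. The image size and network layout here are arbitrary placeholders.

```python
# Generic illustration only (not ATLAS software): classify a jet represented as a
# 16x16 "image" of calorimeter energy deposits with a small convolutional network,
# the same kind of model originally built for ordinary image recognition.
import torch
import torch.nn as nn

class JetImageClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # learn local energy patterns
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 16x16 -> 8x8
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 8x8 -> 4x4
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, 2),                     # two scores: b-jet vs. not
        )

    def forward(self, x):
        return self.net(x)

model = JetImageClassifier()
fake_jets = torch.rand(4, 1, 16, 16)   # four made-up jet images
scores = model(fake_jets)
print(scores.shape)                    # torch.Size([4, 2])
```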

Kagan plans to return to SLAC late next year, where he’ll participate in a major upgrade to the ATLAS pixel detector system; a significant fraction of it will be assembled by SLAC. This is the system that’s most vital for identifying bottom quarks coming out of Higgs decays.

“I’m very excited – I’m ecstatic!” he says. “All these projects I’m working on are very much in development. The use of machine learning in high-energy physics has just started to be explored in the last few years and we’re just starting to see how powerful these tools will be. This gives me an opportunity to push the boundaries and to assemble the team and resources that will allow us to do that.”

Kazuhiro Terao: Going After Neutrinos with Machine Learning

Kazu Terao came to SLAC last year from Columbia University, where he did postdoctoral research in neutrino physics. As an associate staff scientist in the lab’s neutrino group, he is helping to prepare for research at the future Deep Underground Neutrino Experiment (DUNE) at the Long-Baseline Neutrino Facility (LBNF).

Terao has the daunting task of finding better ways to analyze huge amounts of data coming from the DUNE detector. Reconstructing that data – tracing it back to identify the events that produced it – is one of SLAC’s major contributions to the project.

“The detector for the current MicroBooNE experiment can record up to 8 million pixels of information from each incoming neutrino, and the DUNE detector is 400 times bigger,” Terao says. “Some of those pixels contain information about the neutrino and some do not. From the ones that do contain neutrino information we want to find out just two things: the neutrino’s energy and its type, or ‘flavor’.”

Terao is the organizer of the DeepLearnPhysics research collaboration, a group of physicists who are looking for ways to use machine learning techniques – deep neural networks, in particular – to analyze data from particle physics experiments like MicroBooNE and DUNE.

His approach goes beyond simple applications of machine learning that pick out signals against a noisy background but may not tell you how they reach their conclusions. That’s too much of a black box for particle physicists, he says; they want to see each logical step along the way to make sure they understand the data completely.

“I am constructing a chain of machine learning algorithms where each one extracts individual physics features from the data, such as where a neutrino appeared in the detector or what particles were produced as a result,” Terao says. “You have a hierarchy of these algorithms in which the whole chain makes sense and comes to a logical conclusion, and you can actually see the local information that you would have used to arrive at the conclusion for yourself.”
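The toy sketch below illustrates that idea in miniature. It is not DUNE or MicroBooNE code, and every stage, name and number in it is a hypothetical placeholder; the point it tries to capture is that each stage writes an interpretable intermediate result, such as a vertex or a list of particle candidates, onto the event record before the final energy and flavor are estimated.

```python
# Illustrative toy only: a chain of analysis stages in the spirit of Terao's
# description, where each stage exposes an interpretable intermediate result
# before the final energy/flavor estimate. All names and numbers are placeholders.
from dataclasses import dataclass, field


@dataclass
class EventState:
    pixels: list                                     # raw detector pixels (placeholder)
    vertex: tuple = None                             # filled in by the vertex-finding stage
    particles: list = field(default_factory=list)    # filled in by particle identification
    energy: float = None                             # final reconstructed neutrino energy
    flavor: str = None                               # final flavor hypothesis


def find_vertex(state: EventState) -> EventState:
    # Stage 1: locate where the neutrino interacted.
    # A real implementation might use a neural network here.
    state.vertex = (0.0, 0.0, 0.0)                   # placeholder output a physicist can inspect
    return state


def identify_particles(state: EventState) -> EventState:
    # Stage 2: list the particles produced at the vertex.
    state.particles = ["muon", "proton"]             # placeholder output
    return state


def estimate_energy_and_flavor(state: EventState) -> EventState:
    # Stage 3: combine the intermediates into the two quantities the
    # analysis ultimately wants: the neutrino's energy and flavor.
    state.energy = 1.2                               # GeV, placeholder
    state.flavor = "muon neutrino" if "muon" in state.particles else "electron neutrino"
    return state


def run_chain(pixels):
    state = EventState(pixels=pixels)
    for stage in (find_vertex, identify_particles, estimate_energy_and_flavor):
        state = stage(state)
        # Every stage writes its result back onto the event record, so each
        # link in the chain can be checked, not just the final answer.
        print(stage.__name__, "->", state.vertex, state.particles, state.energy, state.flavor)
    return state


if __name__ == "__main__":
    run_chain(pixels=[0] * 16)                       # tiny dummy event
```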

Part of Terao’s project involves applying these techniques at the Short-Baseline Neutrino Program at the DOE’s Fermi National Accelerator Laboratory.

Another goal is to make the best use of modern hardware components such as graphics processing units, or GPUs, which run thousands of computer cores in parallel to rapidly process data. SLAC and Stanford also have a very strong team developing field-programmable gate arrays, or FPGAs, Terao says. These are integrated circuits that contain programmable “logic blocks,” and the plan is to harness them for machine learning, too.

“We’ll be working together to figure out the best ways to implement FPGAs to do the things we want to do. That’s very exciting,” he says. “The fact that DOE is supporting the software development needed to apply machine learning in high-energy physics means a lot to me.”

Josh Turner: Probing Quantum Materials with X-rays

Joshua Turner is a staff scientist at LCLS. His research focuses on quantum materials, whose surprising properties arise out of the collaborative behavior of their electrons. Among the best-known examples are the unconventional superconductors, which have the potential to transform society by carrying electricity with zero loss at close to room temperature.

Turner came to the lab in 2010 after doing postdoctoral research at Stony Brook University. Most of his work at SLAC has involved studying ultrafast phenomena using LCLS. In recent studies, he has been using pairs of closely spaced X-ray laser pulses to observe tiny fluctuations in things like the charges and spins of atoms within a material. These fluctuations are thought to play a key role in phenomena like magnetism and superconductivity. By understanding them better, scientists hope to eventually control them at an atomic level so they can deliberately design materials with exactly the qualities they want.

SLAC researchers have developed several ways to create pairs of X-rays that are separated by a small, precise interval, Turner says. The “two-bucket” method produces pulses that are billionths of a second apart; a new “split-and-delay” technique creates pulses separated by trillionths of a second; and the “fresh slice” method generates pulses with just millionths of a billionth of a second between them.
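The sketch below gives a generic flavor of how such pulse pairs can be used. It is not LCLS analysis code; it assumes a photon-correlation-style comparison in which the scattering patterns from the two pulses are correlated with each other, so that changes in the sample between pulses show up as a lower correlation at a given pulse separation.

```python
# Illustrative sketch only, not LCLS analysis code. The assumed idea, in the spirit
# of X-ray photon correlation measurements: if the sample fluctuates between the
# two pulses, their scattering patterns become less correlated, so the correlation
# versus pulse separation reveals how fast the fluctuations are.
import numpy as np

def correlation(pattern_a: np.ndarray, pattern_b: np.ndarray) -> float:
    """Normalized correlation between two scattering patterns (1 = identical)."""
    a = pattern_a - pattern_a.mean()
    b = pattern_b - pattern_b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(1)
pattern_first = rng.random((256, 256))                               # stand-in speckle pattern
pattern_second = 0.8 * pattern_first + 0.2 * rng.random((256, 256))  # partly decorrelated

print("correlation between the pulse pair:", round(correlation(pattern_first, pattern_second), 3))
```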

Turner plans to employ all of these two-pulse methods to study the physics of quantum materials, with a focus on unraveling the mystery of how superconductivity is related to fluctuations in patterns of electron spin and charge within a material. He’ll be working closely with SIMES, the Stanford Institute for Materials and Energy Sciences at SLAC, to tie his research into the broader theory, modeling and experimental work underway in the field of quantum materials. 

In a study published last summer, Turner co-led a team that used the two-bucket technique to look at skyrmions – complex, vortex-like structures in a material’s magnetic structure that might be exploited to improve data storage – with billionth-of-a-second resolution, 1,000 times better than had been possible before. This huge leap in resolution will allow researchers to measure fluctuations in all sorts of materials that were previously out of reach.

“It is an incredible honor to win this award. I am so thankful to be given this opportunity to focus my work on fluctuations in quantum materials,” Turner says. “This area of research is very exciting. I still can’t believe I will get paid to work on it!”


SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, Calif., SLAC is operated by Stanford University for the U.S. Department of Energy's Office of Science. For more information, please visit slac.stanford.edu.

SLAC National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.