Scientists are preparing for the increased brightness and resolution of next-generation light sources with a computing technique that reduces the amount of manual calculation needed to reconstruct images.
The Department of Energy is supporting the development of both conventional exascale supercomputers and quantum computers. Each provides benefits that could transform scientific research.
The National Science Foundation (NSF) has renewed funding for OpenTopography, a science gateway that provides a broad user community with online access to high-resolution topography data and processing tools for Earth science, advancing research and education in areas ranging from earthquake geology to ecology and hydrology.
Los Alamos National Laboratory computer scientists have developed a new artificial intelligence (AI) system that may be able to identify malicious code that hijacks supercomputers to mine for cryptocurrency such as Bitcoin and Monero.
Lawrence Livermore National Laboratory (LLNL) and artificial intelligence computer company Cerebras Systems have integrated the world’s largest computer chip into the National Nuclear Security Administration’s (NNSA’s) Lassen system, upgrading the top-tier supercomputer with cutting-edge AI technology.
Hadrons are elusive superstars of the subatomic world, making up almost all visible matter, and British theoretical physicist Antoni Woss has worked diligently with colleagues at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility to get to know them better. Now, Woss’ doctoral thesis on spinning hadrons has earned him the 2019 Jefferson Science Associates Thesis Prize.
A series of simulations of neutron star mergers using multiple supercomputers, including Comet at the San Diego Supercomputer Center (SDSC) at UC San Diego, suggests that when the stars' masses are different enough, the result is far noisier. The models predicted an electromagnetic ‘bang’ that is absent when the merging stars' masses are similar, according to researchers.
Researchers at the University of Delaware have gained new understanding of the virus that causes hepatitis B and the “spiky ball” that encloses the virus’s genetic blueprint. They examined how the capsid—a protein shell that protects the blueprint and also drives the delivery of it to infect a host cell—assembles itself. Scientists believe that the capsid is an important target in developing drugs to treat hepatitis B, a life-threatening and incurable infection that afflicts more than 250 million people worldwide.
A Lawrence Livermore National Laboratory (LLNL) team has published new supercomputer simulations of a magnitude 7.0 earthquake on the Hayward Fault. This work represents the highest-resolution ground-motion simulations ever produced of an earthquake of this scale.
Virginia Institute of Marine Science (VIMS) researchers used supercomputer simulations to examine impacts of both regional and global changes affecting the Chesapeake Bay. They discovered that historical increases in fertilizers and atmospheric carbon dioxide concentrations have forced the bay to behave increasingly like a small sea on a continental shelf rather than a traditional estuary.
Researchers at the McKelvey School of Engineering at Washington University in St. Louis have developed a new algorithm for solving a common class of problems -- known as linear inverse problems -- by breaking them down into smaller tasks, each of which can be solved in parallel on standard computers.
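As a hedged illustration of the divide-and-conquer idea (this is not the McKelvey team's published algorithm, and the function names are invented), a linear inverse problem whose matrix has block structure separates into independent least-squares subproblems that a standard machine can solve concurrently:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def solve_block(args):
    # Solve one independent least-squares subproblem A_blk x = b_blk.
    A_blk, b_blk = args
    x_blk, *_ = np.linalg.lstsq(A_blk, b_blk, rcond=None)
    return x_blk

def solve_parallel(blocks):
    # The subproblems share no data, so they can run concurrently;
    # NumPy releases the GIL inside lstsq, letting threads overlap.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(solve_block, blocks))
```

Real problems rarely decompose this cleanly; methods in this family typically add a coordination step (e.g., a consensus update) to stitch the partial solutions together.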
Computational scientific research is no longer one-size-fits-all. The massive datasets created by today’s cutting-edge instruments and experiments — telescopes, particle accelerators, sensor networks and molecular simulations — aren’t best processed and analyzed by a single type of machine.
A team led by Dan Jacobson of the Department of Energy’s Oak Ridge National Laboratory used the Summit supercomputer at ORNL to analyze genes from cells in the lung fluid of nine COVID-19 patients compared with 40 control patients.
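As a hedged sketch (not the ORNL team's actual pipeline), a patients-versus-controls gene comparison like this is often summarized per gene by a fold change and a Welch t statistic; all data and shapes below are synthetic illustrations:

```python
import numpy as np

def gene_stats(patients, controls):
    # patients: (n_genes, n_patients) expression matrix;
    # controls: (n_genes, n_controls) expression matrix.
    # Returns the log2 fold change and Welch t statistic per gene.
    mp, mc = patients.mean(axis=1), controls.mean(axis=1)
    vp = patients.var(axis=1, ddof=1) / patients.shape[1]
    vc = controls.var(axis=1, ddof=1) / controls.shape[1]
    log2fc = np.log2(mp / mc)
    t = (mp - mc) / np.sqrt(vp + vc)
    return log2fc, t
```

A gene four times more expressed in patients than controls would show a log2 fold change near 2, with the t statistic indicating how well the group variances support that difference.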
Scientists at the Department of Energy’s Oak Ridge National Laboratory used neutron scattering and supercomputing to better understand how an organic solvent and water work together to break down plant biomass, creating a pathway to significantly improve the production of renewable biofuels and bioproducts.
Researchers at the University of New Hampshire used SDSC's Comet supercomputer to validate a model using a machine learning technique called Dynamic Time Lag Regression (DTLR) to help predict the solar wind arrival near the Earth’s orbit from physical parameters of the Sun.
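A minimal sketch in the spirit of time lag regression, assuming a fixed lag (the UNH model's lag is dynamic, and these function names are invented for illustration): estimate the delay at which the solar driver signal best predicts the response, then fit a regression at that lag:

```python
import numpy as np

def best_lag(x, y, max_lag):
    # Return the lag (in samples) at which x correlates best with y.
    lags = range(1, max_lag + 1)
    corrs = [np.corrcoef(x[:-k], y[k:])[0, 1] for k in lags]
    return 1 + int(np.argmax(corrs))

def lagged_fit(x, y, lag):
    # Ordinary least-squares fit of y[t] ~ a * x[t - lag] + b.
    a, b = np.polyfit(x[:-lag], y[lag:], 1)
    return a, b
```

DTLR generalizes this by letting the lag itself vary with the state of the Sun, which is what makes solar wind arrival times hard to predict with a single fixed delay.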
Geoscientists at The University of Texas at Dallas recently used supercomputers to analyze massive amounts of earthquake data to generate high-resolution, 3D images of the dynamic geological processes taking place far below the Earth’s surface.
A team used the Summit supercomputer to simulate transition metal systems—such as copper bound to molecules of nitrogen, dihydrogen, or water—and correctly predicted the amount of energy required to break apart dozens of molecular systems, paving the way for a greater understanding of these materials.
The NSF has awarded the San Diego Supercomputer Center (SDSC) at UC San Diego a $5 million grant to develop a high-performance resource for conducting artificial intelligence (AI) research across a wide swath of science and engineering domains.
Researchers have performed the first room temperature X-ray measurements on the SARS-CoV-2 main protease—the enzyme that enables the virus to reproduce. It marks an important first step in the ultimate goal of building a comprehensive 3D model of the enzymatic protein that will be used to advance supercomputing simulations aimed at finding drug inhibitors to block the virus’s replication mechanism and help end the COVID-19 pandemic.
Researchers from Harvard University and the University of Texas Medical Branch at Galveston recently used the Comet supercomputer at the San Diego Supercomputer Center (SDSC) at the University of California San Diego to uncover the novel ways in which DNA prepares itself for repair.
Researchers have demonstrated that an advanced computer code could help design stellarators that confine the heat essential to plasma fusion more effectively.
The Sherlock Division of the San Diego Supercomputer Center (SDSC) at the University of California San Diego has broadened its secure Cloud solutions portfolio to offer Skylab, an innovative customer-owned Cloud platform solution that provides a self-standing, compliant environment for secure workloads in the Amazon Web Services (AWS) Cloud.
To meet the needs of tomorrow’s supercomputers, the National Nuclear Security Administration’s (NNSA’s) Lawrence Livermore National Laboratory (LLNL) has broken ground on its Exascale Computing Facility Modernization (ECFM) project, which will substantially upgrade the mechanical and electrical capabilities of the Livermore Computing Center.
Because of silicon’s relatively high cost, hybrid organic-inorganic perovskites (HOIPs) have emerged as a lower-cost and highly efficient option for solar power, according to a recent study by Georgia Institute of Technology (Georgia Tech) researchers.
The Sherlock Division of the San Diego Supercomputer Center at the University of California San Diego has expanded its multi-Cloud solution, Sherlock Cloud, to include the Google Cloud Platform (GCP).
After 20 years at UC San Diego, Larry Smarr will step down as the director of the California Institute for Telecommunications and Information Technology (Calit2) and retire as a distinguished professor from the Jacobs School of Engineering’s Computer Science and Engineering Department at the end of this month.
A team at Stanford University used the OLCF’s Summit supercomputer to compare simulations of a G protein-coupled receptor with different molecules attached to gain an understanding of how to minimize or eliminate side effects in drugs that target these receptors.
Largely unaffected by the pandemic, the Daya Bay reactor neutrino experiment in Shenzhen, China, has continued to pump data to remote supercomputers for analyses.
Sandia anticipates being one of the first DOE labs to receive Fujitsu's newest A64FX processor, a Japanese Arm-based processor optimized for high-performance computing. Arm-based processors are widely used in small electronic devices such as cell phones.
Climate scientists from the IBS Center for Climate Physics have discovered that, contrary to previously held beliefs, Neanderthal extinction was caused neither by abrupt glacial climate shifts nor by interbreeding with Homo sapiens.
Penn State researchers will need the power of supercomputers not just to investigate possible treatments and therapies for the novel coronavirus, but also to explore ways to help the world recover socially, economically and psychologically.
A team of scientists led by Abhishek Singharoy at Arizona State University used the Summit supercomputer at the Oak Ridge Leadership Computing Facility to simulate the structure of a possible drug target for the bacterium that causes rabbit fever.
The Argonne Leadership Computing Facility recently hosted a workshop to help researchers advance code development efforts for Argonne’s upcoming exascale system, Aurora.
Researchers from the Colorado School of Mines have been using multiple supercomputers to study certain characteristics of zirconia. The team recently published their findings in the Journal of the European Ceramic Society.
During an internship at Brookhaven National Laboratory, Juliette Stecenko is using modern supercomputers and quantum computing platforms to perform astronomy simulations that may help us better understand where we came from.
David Richardson’s job is literally to make sure the light stays on. But it’s not just any light – it’s a very special X-ray light that could play a crucial role in an eventual treatment for COVID-19. Richardson is an operator at Lawrence Berkeley National Laboratory’s synchrotron light source facility, the Advanced Light Source (ALS), and is one of a handful of workers providing essential services to scientists working on COVID-19-related research.
University of Texas at Austin researchers recently simulated the catalytic mechanism and atomic structure of nickel-doped graphene using Comet at the San Diego Supercomputer Center (SDSC) and Stampede2 at the Texas Advanced Computing Center. The simulations showed how the catalyst converts carbon dioxide into carbon monoxide, an important feedstock for chemical engineering.
Using ORNL’s now-decommissioned Titan supercomputer, a team of researchers estimated the combined consequences of many different extreme climate events at the county level, a unique approach that provided unprecedented regional and national climate projections that identified the areas most likely to face climate-related challenges.
For two decades, physicists have been trying to reconcile a gap between theoretical and experimental data on a particle called the muon. A new study, powered by Argonne's supercomputer Mira, sharpens one piece of the puzzle.
Two decades ago, an experiment at Brookhaven National Laboratory pinpointed a mysterious mismatch between established particle physics theory and actual lab measurements. A multi-institutional research team (including Brookhaven, Columbia University, and the universities of Connecticut, Nagoya and Regensburg, RIKEN) have used Argonne National Laboratory’s Mira supercomputer to help narrow down the possible explanations for the discrepancy, delivering a newly precise theoretical calculation that refines one piece of this very complex puzzle.
University of Alabama in Huntsville (UAH) professor of biological science Dr. Jerome Baudry is collaborating with Hewlett Packard Enterprise (HPE) to use HPE’s Cray Sentinel supercomputer to search for natural products that are effective against the COVID-19 virus.
Quantum machine learning, an emerging field that combines machine learning and quantum physics, is the focus of research to discover possible treatments for COVID-19, according to Penn State researchers led by Swaroop Ghosh, the Joseph R. and Janice M. Monkowski Career Development Assistant Professor of Electrical Engineering and Computer Science and Engineering. The researchers believe that this method could be faster and more economical than the current methods used for drug discovery.
For an experiment that will generate big data at unprecedented rates, physicists led the design, development, mass production and delivery of an upgrade of novel particle detectors and state-of-the-art electronics.
Scientists and engineers at Fermilab and Brookhaven are uniting with other organizations in the Open Science Grid to help fight COVID-19 by dedicating considerable computational power to researchers studying ways to combat the disease.
An ORNL team developed the XACC software framework to help researchers harness the potential power of quantum processing units, or QPUs. XACC offloads portions of quantum-classical computing workloads from the host CPU to an attached quantum accelerator, which calculates results and sends them back to the original system.
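The offload pattern described above can be sketched schematically. This is emphatically not the XACC API (XACC is a C++ framework; the class and method names here are invented), only an illustration of a host CPU delegating a quantum kernel to an accelerator and receiving results back; the "QPU" is a classical stand-in that returns the exact expectation value of Z after a one-qubit Ry(theta) rotation:

```python
import math

class MockQPU:
    # Stand-in for a quantum accelerator: "executes" a one-qubit
    # Ry(theta) circuit and returns the expectation value of Z,
    # which for this circuit is cos(theta).
    def execute(self, theta):
        return math.cos(theta)

def hybrid_minimize(qpu, thetas):
    # The host CPU drives the classical loop; each candidate angle is
    # offloaded to the accelerator, which sends its result back.
    return min(thetas, key=qpu.execute)
```

In a real hybrid workload the classical side would run an optimizer rather than a grid scan, but the division of labor is the same: the CPU orchestrates, the QPU evaluates the quantum kernel.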
This is a continuing profile series on the directors of the Department of Energy (DOE) Office of Science User Facilities. Michael E. Papka is the director of the Argonne Leadership Computing Facility.