Let’s talk! Scientists demonstrate coherent coupling between a quantum dot and a donor atom in silicon, vital for moving information inside quantum computers.
The West Big Data Innovation Hub (WBDIH) at the San Diego Supercomputer Center (SDSC) at UC San Diego is one of four regional big data hub partner sites awarded a $1.8 million grant from the National Science Foundation (NSF) for the initial development of a data storage network over the next two years.
In an effort to reduce errors in the analyses of diagnostic images by health professionals, a team of researchers from Oak Ridge National Laboratory has improved understanding of the cognitive processes involved in image interpretation. The work has enormous potential to improve health outcomes for the hundreds of thousands of American women affected by breast cancer each year: the ORNL-led team found that radiologists’ analyses of mammograms were significantly influenced by context bias, or the radiologist’s previous diagnostic experiences.
Argonne materials scientists have discovered a reaction that helps explain the behavior of a key electrolyte additive used to boost battery performance.
The U.S. Department of Energy’s Oak Ridge National Laboratory today unveiled Summit as the world’s most powerful and smartest scientific supercomputer.
In a study published in the May 21, 2018 issue of the Proceedings of the National Academy of Sciences, a team of researchers – aided by supercomputing resources from the San Diego Supercomputer Center (SDSC) based at UC San Diego – created a dynamic computer simulation to delineate a key biological process that allows the body to repair damaged DNA.
Forward-thinking scientists in the 1970s suggested that circuits could be built using molecules instead of wires, and over the past decades that technology has become reality. The trouble is, some molecules have particularly complex interactions that make it hard to predict which of them might be good at serving as miniature circuits. But a new paper by two University of Chicago chemists presents an innovative method that cuts computational costs and improves accuracy by calculating interactions between pairs of electrons and extrapolating those to the rest of the molecule.
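The paper’s actual machinery involves electron-pair wavefunctions, which the teaser does not detail. As rough intuition for the “compute pairs, then extrapolate to the whole molecule” idea, here is a generic pairwise-decomposition toy in Python; every function here is invented for illustration and is not the chemists’ method:

```python
from itertools import combinations

# Toy "energy" of a set of sites with a genuine many-body term; stands in
# for an expensive full calculation over the whole system at once.
def full_energy(sites):
    pair = sum(1.0 / (1 + abs(a - b)) for a, b in combinations(sites, 2))
    many = 0.01 * len(sites) ** 3          # small higher-order contribution
    return pair + many

# Pairwise decomposition: solve every 1- and 2-body subproblem cheaply,
# then sum the corrections as an estimate of the full answer.
def pairwise_estimate(sites):
    one = {s: full_energy([s]) for s in sites}        # 1-body terms
    est = sum(one.values())
    for a, b in combinations(sites, 2):
        est += full_energy([a, b]) - one[a] - one[b]  # 2-body corrections
    return est

sites = [0, 1, 2, 4, 7]
print(full_energy(sites), pairwise_estimate(sites))
```

The estimate reproduces all pair interactions exactly and costs only O(n²) small subproblems; the leftover gap comes from the higher-order term, which is the part such schemes must extrapolate or approximate.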
A team led by Berkeley Lab researchers has enlisted powerful supercomputers to calculate a quantity, known as the “nucleon axial coupling” or gA, that is central to our understanding of a neutron’s lifetime.
Argonne joins its sister national laboratories in powering a new earth modeling system with supercomputers. The system features weather-scale resolution and can help researchers anticipate decadal-scale changes that could influence the U.S. energy sector in years to come.
Tom Jordan and a team from the Southern California Earthquake Center (SCEC) are using the supercomputing resources of the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy Office of Science User Facility, to advance modeling for the study of earthquake risk and how to reduce it.
By stretching the amount of time proteins can be simulated in their natural state of wiggling and gyrating, a team of researchers at Colorado State University has identified a critical protein structure that could serve as a molecular Achilles heel able to inhibit the replication of dengue virus and potentially other flaviviruses such as West Nile and Zika virus.
Jen Marie Phifer and Forest Good of Los Lunas High School won top honors on Tuesday at the 28th Annual New Mexico Supercomputing Challenge held at Los Alamos National Laboratory.
The U.S. Department of Energy will fund research into a novel approach to improving the efficiency of next-generation supercomputer simulations with an award to Rensselaer Polytechnic Institute doctoral candidate Caitlin Joann Ross.
U.S. Secretary of Energy Rick Perry today announced the release of a Request for Proposals (RFP) for the development of at least two new exascale supercomputers, including Lawrence Livermore National Laboratory’s next-generation system code-named “El Capitan.”
The mirror-like physics of the superconductor-insulator transition operates exactly as expected. Scientists know this to be true following the observation of a remarkable phenomenon whose existence was predicted three decades ago but had eluded experimental detection until now. The observation confirms that two fundamental quantum states, superconductivity and superinsulation, arise as mirror images of each other.
Berkeley Lab and Joint Genome Institute researchers took one of the most popular clustering approaches in modern biology – the Markov Clustering (MCL) algorithm – and modified it to run efficiently and at scale on supercomputers. Their algorithm achieved a previously impossible feat: clustering a 70-million-node, 68-billion-edge biological network in hours.
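The Berkeley Lab version distributes sparse matrix operations across many nodes, but the core MCL iteration itself is compact. For readers unfamiliar with the algorithm, here is a minimal single-machine sketch in dense NumPy; the function name, parameters, and thresholds are our own illustration, not the researchers’ code:

```python
import numpy as np

def markov_cluster(adj, expansion=2, inflation=2.0, iters=100, tol=1e-6):
    """Minimal Markov Clustering (MCL) on a dense adjacency matrix."""
    A = adj.astype(float) + np.eye(adj.shape[0])   # add self-loops
    M = A / A.sum(axis=0)                          # column-stochastic matrix
    for _ in range(iters):
        prev = M
        M = np.linalg.matrix_power(M, expansion)   # expansion: flow spreads
        M = M ** inflation                         # inflation: sharpen strong flows
        M = M / M.sum(axis=0)                      # renormalize columns
        if np.abs(M - prev).max() < tol:
            break
    # at convergence, the support of each nonzero row defines a cluster
    clusters = []
    for row in M:
        members = frozenset(np.nonzero(row > 1e-8)[0])
        if members and members not in clusters:
            clusters.append(members)
    return clusters

# two triangles joined by a single edge
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
])
print(markov_cluster(A))   # expect two clusters: {0,1,2} and {3,4,5}
```

The supercomputer challenge is that at 68 billion edges the matrix must be stored sparse and the expansion step becomes a distributed sparse-matrix multiply, which is what the Berkeley Lab modification addresses.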
ORNL model could better predict tiny methylmercury pockets lurking in creek algae; engines work smarter with new fuel innovation; making narrow metallic structures to advance tiny electronics and drug delivery; insights into enzymes that try to break down antibiotics may inform better drug designs for fighting resistant bacteria; current software simulations for small modular reactors upscaled to run on future supercomputers.
Tapping into the tremendous power of the Cherry Creek II supercomputer at UNLV just got easier for faculty researchers and community partners alike, thanks to a new MOU between the university and Altair Engineering.
Computers have helped researchers develop a new phosphor that can make LEDs cheaper and render colors more accurately. An international team led by engineers at UC San Diego first predicted the new phosphor using supercomputers and data mining algorithms, then developed a simple recipe to make it in the lab. Unlike many phosphors, this one is made of inexpensive, earth-abundant elements and can easily be made using industrial methods. As computers predicted, the new phosphor performed well in tests and in LED prototypes.
In a recent demonstration project, physicists from Brookhaven National Laboratory and Berkeley Lab used the Cori supercomputer at the National Energy Research Scientific Computing Center to reconstruct data collected from a nuclear physics experiment, an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.
Grover and GM colleagues Jian Gao, Venkatesh Gopalakrishnan, and Ramachandra Diwakar are using the Titan supercomputer at the Oak Ridge Leadership Computing Facility to improve combustion models for diesel passenger car engines with an ultimate goal of accelerating innovative engine designs while meeting strict emissions standards.
A team of networking experts from the Department of Energy’s Energy Sciences Network (ESnet), together with the Globus team from the University of Chicago and Argonne National Laboratory, has designed a new approach that makes data sharing faster, more reliable and more secure.
For deep learning to be effective, existing neural networks must be modified, or novel networks designed, and then "trained" so that they know precisely what to look for and can produce valid results. This is a time-consuming and difficult task, but one that a team of ORNL researchers recently demonstrated can be dramatically expedited with a capable computing system.
“We’re geeks, and we’re motivated.” That’s how Amin Amooie, a doctoral student in earth sciences at The Ohio State University, explained his team’s efforts to build the supercomputer they’ve dubbed “Buckeye Pi.”
With a top-story list populated by breakthroughs in supercomputing, accelerator science, space missions, materials science, life science, and more, Los Alamos National Laboratory put its Big Science capabilities to wide, productive use in 2017.
An international team of researchers ran multi-scale, multi-physics 2D and 3D simulations at NERSC to illustrate how heavy metals expelled from exploding supernovae helped the first stars in the universe regulate subsequent star formation and influence the appearance of galaxies in the process.
New supercomputer simulations have revealed the role of transport proteins called efflux pumps in creating drug resistance in bacteria, research that could lead to improving the drugs’ effectiveness against life-threatening diseases and restoring the efficacy of defunct antibiotics.
A unique collaboration between a music professor and an engineering professor at Virginia Tech will result in the creation of a new platform for data analysis that will make it possible to understand the significance of data by turning it into sound.
Using the Titan supercomputer, a research team at Oak Ridge National Laboratory has developed an evolutionary algorithm capable of generating custom neural networks that match or exceed the performance of handcrafted artificial intelligence systems.
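The article does not spell out the algorithm (ORNL’s system trains and scores thousands of candidate networks in parallel on Titan), but evolutionary searches over network designs generally follow a mutate, evaluate, select loop. Below is a toy sketch of that loop in Python; the search space and stand-in fitness function are invented for illustration and take the place of actual network training:

```python
import random

# Hypothetical hyperparameter search space; the real ORNL system evolves
# far richer network topologies across thousands of GPUs.
SEARCH_SPACE = {
    "layers":        [2, 3, 4, 5, 6],
    "filters":       [16, 32, 64, 128],
    "kernel_size":   [3, 5, 7],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def random_genome():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(genome):
    child = dict(genome)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice(SEARCH_SPACE[key])  # re-roll one gene
    return child

def evolve(fitness, population=20, generations=10, elite=5):
    """Simple elitist loop: keep the best genomes, mutate them to refill."""
    pop = [random_genome() for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # rank by validation score
        parents = pop[:elite]                      # survivors
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(population - elite)]
    return max(pop, key=fitness)

# Stand-in fitness: in practice each genome is trained and scored on
# validation data, the expensive step parallelized on the supercomputer.
def toy_fitness(g):
    return -abs(g["layers"] - 4) - abs(g["filters"] - 64) / 64

print(evolve(toy_fitness))
```

Because every candidate in a generation can be trained independently, the evaluation step maps naturally onto a leadership-class machine, which is why this kind of search benefits so directly from Titan-scale hardware.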
For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries. The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.
Researchers are grappling with increasingly large quantities of image-based data. Machine learning and deep learning offer researchers new ways to analyze images quickly and more efficiently than ever before. Scientists at multiple national laboratories are working together to harness the potential of these tools.
Argonne National Laboratory is collaborating with Hewlett Packard Enterprise (HPE) to provide system software expertise and a development ecosystem for a future high-performance computing (HPC) system based on 64-bit ARM processors.
The San Diego Supercomputer Center (SDSC) at the University of California San Diego received two key HPCwire awards for 2017, recognizing the use of its Comet supercomputer in the areas of artificial intelligence (AI) research and the life sciences.
A quest to help the systems software community work on very large supercomputers without having to actually test on them has spawned an affordable, scalable system using thousands of inexpensive Raspberry Pi nodes.
A team of computer scientists and engineers from Sandia National Laboratories and Boston University recently won the Gauss Award at the International Supercomputing Conference for their paper on using machine learning to automatically diagnose problems in supercomputers.
Rural counties continue to rank lowest among U.S. counties in terms of health outcomes. A group of national organizations, including the Robert Wood Johnson Foundation and the National 4-H Council, is leading the way to close the rural health gap.
Just one year after the Department of Energy's Exascale Computing Project began funding projects to prepare scientific applications for exascale supercomputers, the Pagoda Project, led by Lawrence Berkeley National Laboratory, has successfully reached a major milestone: making its open source software libraries publicly available as of September 30, 2017.
An INCITE research team, led by Jonathan Aurnou of UCLA, is using Mira to develop advanced models to study magnetic field generation on Earth, Jupiter and the sun at an unprecedented level of detail.
Lawrence Livermore National Laboratory, Frederick National Laboratory for Cancer Research, GSK, and University of California San Francisco will combine vast data stores, supercomputing, and scientific expertise to reinvent the discovery process for cancer medicines.
This week’s landmark discovery of gravitational and light waves generated by the collision of two neutron stars eons ago was made possible by analyses and signal verification performed by Comet, an advanced supercomputer based at the San Diego Supercomputer Center (SDSC) at UC San Diego.
Physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field of nuclear physics forward.
Astrophysicist Chris Fryer was enjoying an evening with friends on August 25, 2017, when he got the news of a gravitational-wave detection by LIGO, the Laser Interferometer Gravitational-Wave Observatory.
On Aug. 17, scientists around the globe were treated to near-simultaneous observations by separate instruments that would ultimately be confirmed as the first measurement of the merger of two neutron stars and its explosive aftermath.
LOS ALAMOS, N.M., Dec. 7, 2016—Scott Crooker, of Los Alamos National Laboratory’s Condensed Matter and Magnet Science group, and William Charles Louis III, of the Laboratory’s Physics Division, have been named Fellows of the American Association for the Advancement of Science (AAAS). Election as an AAAS Fellow is an honor bestowed upon AAAS members by their peers.
Scientists need to learn how to take advantage of exascale computing. This is the mission of the Argonne Training Program on Extreme-Scale Computing (ATPESC), which held its annual two-week training workshop over the summer.