Artificial Intelligence (AI)


Researchers used supercomputing and deep learning tools to predict protein structures that have eluded experimental methods such as crystallography. (Image courtesy of Oak Ridge Leadership Computing Facility)

DOE and its national laboratories have invested in AI development and use since the early 1960s, building cutting-edge AI tools and data science capabilities and establishing leadership in high-performance computing. Advanced Scientific Computing Research (ASCR) supports research across many AI areas, including: scientific machine learning (SciML) for complex systems; generative AI, large language models (LLMs), and foundation models; interpretable and explainable AI; privacy-preserving AI and federated learning; AI for visualization; and co-designed hardware and neuromorphic computing to accelerate scientific discovery. The ASCR-sponsored user facilities support and advance research initiatives in AI. For example, ASCR’s sustained support for leading-edge high-performance computing led to the world’s first exascale computing ecosystem, which provides the hardware and software needed to power future generations of frontier AI.
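
To make one of these areas concrete, the sketch below simulates the core of federated averaging, a common pattern in privacy-preserving AI and federated learning: each client trains on data that never leaves it, and only model parameters are averaged centrally. This is a minimal, single-process illustration; the model, synthetic client data, and round count are assumptions made for the example, not an ASCR-prescribed method.

    import copy
    import torch

    global_model = torch.nn.Linear(10, 1)
    # Synthetic private datasets for three clients; the raw data is never pooled.
    clients = [(torch.randn(50, 10), torch.randn(50, 1)) for _ in range(3)]

    def local_update(model_in, x, y, lr=0.1, epochs=5):
        # Each client trains its own copy of the current global model.
        model = copy.deepcopy(model_in)
        optimizer = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(epochs):
            optimizer.zero_grad()
            torch.nn.functional.mse_loss(model(x), y).backward()
            optimizer.step()
        return model.state_dict()

    for _round in range(10):
        updates = [local_update(global_model, x, y) for x, y in clients]
        # The server averages parameters only; no client data is exchanged.
        averaged = {key: torch.stack([u[key] for u in updates]).mean(dim=0)
                    for key in updates[0]}
        global_model.load_state_dict(averaged)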

ASCR has held several workshops on AI. Priority research directions identified through these workshops include:

  • Domain-Aware Scientific Machine Learning (illustrated in the sketch after this list)
  • Interpretable Scientific Machine Learning
  • Robust Scientific Machine Learning
  • Data-Intensive Scientific Machine Learning
  • Machine Learning-Enhanced Modeling and Simulation
  • Intelligent Automation and Decision Support
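
The following sketch illustrates the first direction, domain-aware SciML, in one common form: a physics-informed loss that combines a data-fit term with the residual of a governing equation. The toy equation (du/dx = -u), network size, and training settings are illustrative assumptions, not a prescribed ASCR approach.

    import torch

    # Neural-network surrogate for u(x).
    model = torch.nn.Sequential(
        torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    x_data = torch.linspace(0.0, 1.0, 32).reshape(-1, 1)
    u_data = torch.exp(-x_data)                      # toy observations of u(x) = exp(-x)
    x_phys = torch.rand(256, 1, requires_grad=True)  # collocation points for the physics term

    for step in range(2000):
        optimizer.zero_grad()
        # Data-fit term: agree with the available observations.
        loss_data = torch.mean((model(x_data) - u_data) ** 2)
        # Domain-aware term: penalize the residual of du/dx + u = 0.
        u = model(x_phys)
        du_dx = torch.autograd.grad(u.sum(), x_phys, create_graph=True)[0]
        loss_physics = torch.mean((du_dx + u) ** 2)
        (loss_data + loss_physics).backward()
        optimizer.step()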

The ASCR High Performance Computing (HPC) Facilities are leading venues for AI research and development. Today they offer advanced supercomputers and data systems, including Aurora at the Argonne Leadership Computing Facility (ALCF), Frontier at the Oak Ridge Leadership Computing Facility (OLCF), and Perlmutter at the National Energy Research Scientific Computing Center (NERSC). Collectively, these systems have more than 100,000 GPUs from AMD, Intel, and NVIDIA. Their distinct architectures support large-scale deployments of AI software for training and inference across diverse GPU offerings, enhancing, diversifying, and hardening the AI software landscape. These systems are proving invaluable for advancing AI research and for applying AI methods to a variety of data-intensive research challenges, spanning extreme-scale modeling and simulation as well as extreme-scale experimental and observational data. The ASCR HPC Facilities also host exploratory efforts with hardware vendors. For example, the ALCF AI Testbed is a growing collection of some of the world’s most advanced AI accelerators, including Cerebras, Graphcore, Groq, and SambaNova systems, available for open scientific research. ASCR’s high-performance network facility, ESnet, also plays a vital role in enabling data-intensive science across geographic, institutional, and domain boundaries.
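
As a concrete, though generic and not facility-specific, illustration of how AI training is commonly expressed on such multi-GPU systems, the sketch below uses PyTorch DistributedDataParallel with one process per GPU; the model, synthetic data, and launch command are placeholders for this example.

    # Launch with one process per GPU, e.g.: torchrun --nproc_per_node=8 train.py
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])     # gradients are averaged across ranks
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

    for step in range(100):
        x = torch.randn(32, 1024).cuda(local_rank)  # stand-in for a real data loader
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()                             # the gradient all-reduce happens here
        optimizer.step()

    dist.destroy_process_group()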

By investing in AI, ASCR aims to accelerate scientific discovery and transform science and energy research by harnessing DOE investments in massive data from scientific user facilities, software for predictive models and algorithms, high-performance computing platforms, and the national workforce.

ASCR Funding

Award abstracts and information about awards made prior to FY2018 can be found here.

ASCR Workshops and Reports

Press Releases

Other Notable Reports


Artificial Intelligence (AI) Program Manager Contacts:

Steven Lee
Privacy-Preserving AI
Steven.Lee@science.doe.gov

Margaret Lentz
Data and Visualization
Margaret.Lentz@science.doe.gov

Kalyan Perumalla
AI Systems
Kalyan.Perumalla@science.doe.gov

Robinson Pino
Neuromorphic Computing
Robinson.Pino@science.doe.gov

Bill Spotz
AI for Complex Systems
William.Spotz@science.doe.gov