Feature Story | Argonne National Laboratory

Artificial Intelligence: Transforming science, improving lives

Argonne implements Laboratory-wide AI for Science initiative — leveraging world-class facilities, exploring new AI techniques, building collaboration, transforming traditional research methods and driving discovery

Commitment to developing artificial intelligence (AI) as a national research strategy may well have defined 2019 as the Year of AI in the United States — particularly at the federal level, and more specifically throughout the U.S. Department of Energy (DOE) and its national laboratory complex.

In February, the White House issued the Executive Order on Maintaining American Leadership in Artificial Intelligence (the American AI Initiative) to expand the nation’s leadership role in AI research. Its goals are to fuel economic growth, enhance national security and improve quality of life.

The initiative injects substantial and much-needed research dollars into federal facilities across the United States, promoting technology advances and innovation and enhancing collaboration with nongovernment partners and allies abroad.

In response, DOE has made AI — along with exascale supercomputing and quantum computing — a major element of its $5.5 billion scientific R&D budget and established the Artificial Intelligence and Technology Office, which will serve to coordinate AI work being done across the DOE.

At DOE facilities like Argonne National Laboratory, researchers have already begun using AI to design better materials and processes, safeguard the nation’s power grid, accelerate treatments for brain trauma and cancer, and develop next-generation microelectronics for applications in AI-enabled devices.

Over the last two years, Argonne has made significant strides toward implementing its own AI initiative. Leveraging the Laboratory’s broad capabilities and world-class facilities, it has set out to explore and expand new AI techniques, encourage collaboration, automate traditional research methods and laboratory facilities, and drive discovery.

In July, it hosted an AI for Science town hall, the first of four such events that also included Oak Ridge and Lawrence Berkeley national laboratories and DOE’s Office of Science.

In July, Argonne hosted an AI for Science town hall, the first of four such events. Engaging nearly 350 members of the AI community, the event stimulated conversation around expanding the development and use of AI, while addressing critical challenges. (Image by Argonne National Laboratory.)


Engaging nearly 350 members of the AI community, the town hall stimulated conversation around expanding the development and use of AI, while addressing critical challenges under the initiative’s framework, called AI for Science.

“AI for Science requires new research and infrastructure, and we have to move a lot of data around and keep track of thousands of models,” says Rick Stevens, Associate Laboratory Director for Argonne’s Computing, Environment and Life Sciences (CELS) Directorate and a professor of computer science at the University of Chicago.

“How do we distribute this production capability to thousands of people? We need to have system software with different capabilities for AI than for simulation software to optimize workflows. And these are just a few of the issues we have to begin to consider.”

The conversation has just begun and continues through Laboratory-wide talks and events, such as a recent AI for Science workshop aimed at growing interest in AI capabilities through technical hands-on sessions.

Argonne also will host DOE’s Innovation XLab Artificial Intelligence Summit in Chicago, meant to showcase the assets and capabilities of the national laboratories and to facilitate an exchange of information and ideas among industry, universities, investors, end-use customers, and Lab innovators and experts.

Argonne’s forthcoming exascale computer Aurora — scheduled to begin operation in 2021 — has the capacity to deliver a billion billion calculations per second. (Image by Argonne National Laboratory.)

What exactly is AI?

Ask any number of researchers to define AI and you’re bound to get — well, first, a long pause and perhaps a chuckle — a range of answers, from the more conventional “utilizing computing to mimic the way we interpret data but at a scale not possible by human capability” to “a technology that augments the human brain.”

Taken together, AI might well be viewed as a multi-component toolbox that enables computers to learn, recognize patterns, solve problems, explore complex datasets and adapt to changing conditions — much like humans, but one day, maybe better.

While the definitions and the tools may vary, the goals remain the same: utilize or develop the most advanced AI technologies to more effectively address the most pressing issues in science, medicine and technology, and accelerate discovery in those areas.

At Argonne, AI has become a critical tool for modeling and prediction across almost all areas where the Laboratory has significant domain expertise: chemistry, materials, photon science, environmental and manufacturing sciences, biomedicine, genomics and cosmology.

A key component of Argonne’s AI toolbox is a technique called machine learning and its derivatives, such as deep learning. The latter is built on neural networks comprising many layers of artificial neurons that learn internal representations of data, loosely mimicking the way the human brain gathers and processes information.

“Deep learning is the use of multi-layered neural networks to do machine learning, a program that gets smarter or more accurate as it gets more data to learn from. It’s very successful at learning to solve problems,” says Stevens.
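For readers who want to see that idea in miniature, the sketch below (our illustration in plain Python and NumPy, not Argonne or CANDLE code) trains a tiny multi-layered network whose predictions improve as the training loop feeds it data, just as Stevens describes:

```python
# A minimal multi-layered neural network, trained by gradient descent.
# Toy example only; scientific deep learning models are vastly larger.
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic toy problem that a single layer of neurons cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of "artificial neurons" with learnable weights.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: each layer builds an internal representation of the data.
    h = sigmoid(X @ W1 + b1)   # hidden layer
    p = sigmoid(h @ W2 + b2)   # output prediction

    # Backward pass: gradients of the squared error, layer by layer.
    dp = (p - y) * p * (1 - p)
    dh = (dp @ W2.T) * h * (1 - h)

    # Update: the network gets more accurate as training proceeds.
    W2 -= lr * (h.T @ dp); b2 -= lr * dp.sum(axis=0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)

print(np.round(p.ravel(), 2))  # trends toward [0, 1, 1, 0] as training runs
```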

A staunch supporter of AI, particularly deep learning, Stevens is principal investigator on a multi-institutional effort developing the deep neural network application CANDLE (CANcer Distributed Learning Environment), which integrates deep learning with novel data, modeling and simulation techniques to accelerate cancer research.

Coupled with the power of Argonne’s forthcoming exascale computer Aurora — which has the capacity to deliver a billion billion calculations per second — the CANDLE environment will enable a more personalized and effective approach to cancer treatment. 

And that is just a small sample of AI’s potential in science. Currently, all across Argonne, researchers are involved in more than 60 AI-related investigations, many of them driven by machine learning.

Argonne Distinguished Fellow Valerie Taylor’s work looks at how applications execute on computers and large-scale, high-performance computing systems. Using machine learning, she and her colleagues model an execution’s behavior and then use that model to provide feedback on how to best modify the application for better performance.

“Better performance may be shorter execution time or, using generated metrics such as energy, it may be reducing the average power,” says Taylor, director of Argonne’s Mathematics and Computer Science (MCS) division. “We use statistical analysis to develop the models and identify hints on how to modify the application.”
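In spirit, the approach can be as simple as the sketch below (hypothetical metrics and synthetic data of our own, not Taylor’s actual models): fit a statistical model of execution behavior, then read the fitted weights for hints about what to modify.

```python
# Fit a statistical model of application runs, then inspect it for hints.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical profiles: cache-miss rate, vectorized fraction, thread count.
runs = rng.uniform([0.05, 0.1, 4], [0.4, 0.9, 32], size=(40, 3))

# Synthetic "measured" execution times: hurt by cache misses, helped by
# vectorization and threads, plus measurement noise.
time_s = (200 * runs[:, 0] - 40 * runs[:, 1] - 0.8 * runs[:, 2]
          + 90 + rng.normal(0, 1.0, size=40))

# Least-squares model: time ~ w . features + intercept.
A = np.hstack([runs, np.ones((40, 1))])
w, *_ = np.linalg.lstsq(A, time_s, rcond=None)
print("weights [miss rate, vectorization, threads]:", np.round(w[:3], 1))

# A large positive weight on the cache-miss rate "hints" that improving
# data locality would shorten execution time; substituting measured power
# or energy as the target gives the same kind of guidance for those metrics.
```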

Material scientists are exploring the use of machine learning to optimize models of complex material properties in the discovery and design of new materials that could benefit energy storage, electronics, renewable energy resources and additive manufacturing, to name just a few areas.

And still more projects address complex transportation and vehicle efficiency issues by enhancing engine design, minimizing road congestion, increasing energy efficiency and improving safety.

“Better performance may be shorter execution time or, using generated metrics such as energy, it may be reducing the average power. We use statistical analysis to develop the models and identify hints on how to modify the application.” — Valerie Taylor, director of Argonne’s Mathematics and Computer Science (MCS) division

Beyond the deep

“Beyond deep learning, there are many sub-ranges of AI that people have been working on for years,” notes Stevens. “And while machine learning now dominates, something else might emerge as a strength.”

Natural language processing, for example, is commercially recognizable in voice-activated technologies — think Siri — and on-the-fly language translators. Beyond those capabilities, it can review, analyze and summarize information about a given topic from journal articles, reports and other publications, and extract and coalesce select information from massive and disparate datasets.

Immersive visualization can place us into 3D worlds of our own making, interject objects or data into our current reality or improve upon human pattern recognition. Argonne researchers have found application for virtual and augmented reality in the 3D visualization of complicated data sets and the detection of flaws or instabilities in mechanical systems.

And of course, there is robotics — a program started at Argonne in the late 1940s and rebooted in 1999 — that is just beginning to take advantage of Argonne’s expanding AI toolkit, whether to conduct research in a specific domain or improve upon its more utilitarian use in decommissioning nuclear power plants.

Until recently, according to Stevens, AI has been a loose collection of methods using very different underlying mechanisms, and the people using them weren’t necessarily communicating their progress or potentials with one another.

But with a federal initiative in hand and a Laboratory-wide vision, that is beginning to change.

Among those trying to find new ways to collaborate and combine these different AI methods is Marius Stan, a computational scientist in Argonne’s Applied Materials division (AMD) and a senior fellow at both the University of Chicago’s Consortium for Advanced Science and Engineering and the Northwestern-Argonne Institute for Science and Engineering.

Stan leads a research area called Intelligent Materials Design that focuses on combining different elements of AI to discover and design new materials and to optimize and control complex synthesis and manufacturing processes.

Work on the latter has created a collaboration between Stan and colleagues in the Applied Materials and Energy Systems divisions, and the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.

Merging machine learning and computer vision with the Flame Spray Pyrolysis technology at Argonne’s Materials Engineering Research Facility, the team has developed AI “intelligent software” that can optimize the manufacturing process in real time.

Argonne intelligent software drives the real-time optimization of the flame spray pyrolysis synthesis. (Image by Marius Stan, Argonne National Laboratory.)


“Our idea was to use the AI to better understand and control in real time — first in a virtual, experimental setup, then in reality — a complex synthesis process,” says Stan.

Automating the process makes it safer and much faster than a human-led one. But even more intriguing is the potential for the AI to discover materials with better properties than the researchers would have found on their own.
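Stripped to its essentials, that kind of closed loop looks something like the sketch below (a hypothetical quality function and control parameter of our own; the real system couples computer vision to the actual instrument): measure the product, nudge a process setting, and keep the changes that help.

```python
# A toy closed-loop optimizer for a synthesis process parameter.
import numpy as np

rng = np.random.default_rng(42)

def measure_quality(flow_rate):
    # Hypothetical stand-in for a vision-based measurement of particle
    # quality; it peaks at an unknown optimal flow rate, with sensor noise.
    return -(flow_rate - 3.2) ** 2 + rng.normal(0, 0.01)

flow, step = 1.0, 0.5          # initial setting and search width
best = measure_quality(flow)

for _ in range(60):
    trial = flow + rng.uniform(-step, step)   # nudge the setting
    quality = measure_quality(trial)
    if quality > best:
        flow, best = trial, quality           # keep improvements
    else:
        step *= 0.97                          # narrow the search over time

print(f"settled near flow_rate = {flow:.2f} (true optimum here: 3.2)")
```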

What drove us to AI?

Whether or not they concur on a definition, most researchers will agree that the impetus for the escalation of AI in scientific research was the influx of massive datasets and the computing power to sift, sort and analyze them.

Not only was the push coming from big corporations brimming with user data, but the tools that drive science were getting more expansive — bigger and better telescopes and accelerators and of course supercomputers, on which they could run larger, multiscale simulations.

“The size of the simulations we are running is so big, the problems that we are trying to solve are getting bigger, so that these AI methods can no longer be seen as a luxury, but as must-have technology,” notes Prasanna Balaprakash, a computer scientist in MCS and ALCF.

Data and compute size also drove the convergence of more traditional techniques, such as simulation and data analysis, with machine and deep learning. Where analysis of data generated by simulation would eventually lead to changes in an underlying model, that data is now being fed back into machine learning models and used to guide more precise simulations.
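As a concrete, if greatly simplified, illustration of that feedback loop (our sketch, not any specific Argonne workflow), the snippet below trains a cheap model on simulation outputs and then lets the model choose where to simulate next:

```python
# Simulate, learn from the results, let the model guide the next simulation.
import numpy as np

def run_simulation(x):
    # Stand-in for an expensive, high-fidelity simulation.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(1)
inputs = list(rng.uniform(0, 3, size=4))            # a few initial runs
outputs = [run_simulation(x) for x in inputs]

candidates = np.linspace(0, 3, 200)
for _ in range(5):
    # Feed the simulation data into a cheap learned model
    # (a polynomial here; in practice, a machine learning model).
    model = np.poly1d(np.polyfit(inputs, outputs, deg=3))

    # Guide the next run: simulate where the model disagrees most with a
    # simple interpolation of the data, a crude proxy for model uncertainty.
    order = np.argsort(inputs)
    interp = np.interp(candidates,
                       np.array(inputs)[order], np.array(outputs)[order])
    x_next = candidates[np.argmax(np.abs(model(candidates) - interp))]

    inputs.append(x_next)
    outputs.append(run_simulation(x_next))

err = np.max(np.abs(model(candidates) - run_simulation(candidates)))
print(f"{len(inputs)} simulations run; model max error = {err:.3f}")
```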

“More or less anybody who is doing large-scale computation is adopting an approach that puts machine learning in the middle of this complex computing process, and AI will continue to integrate with simulation in new ways,” says Stevens.

“And where the majority of users are in theory-modeling-simulation, they will be integrated with experimentalists on data-intense efforts. So the population of people who will be part of this initiative will be more diverse.”

But while AI is leading to faster time-to-solution and more precise results, the number of data points, parameters and iterations required to get to those results can still prove monumental.

Focused on the automated design and development of scalable algorithms, Balaprakash and his Argonne colleagues are developing new types of AI algorithms and methods to more efficiently solve large-scale problems that deal with different ranges of data. These additions are intended to make existing systems scale better on supercomputers like those housed at the ALCF, a necessity in light of exascale computing.

“The size of the simulations we are running is so big, the problems that we are trying to solve are getting bigger, so that these AI methods can no longer be seen as a luxury, but as must-have technology.” — Prasanna Balaprakash, a computer scientist in MCS and ALCF


“We are developing an automated machine learning system for a wide range of scientific applications, from analyzing cancer drug data to climate modeling,” says Balaprakash. “One way to speed up a simulation is to replace the computationally expensive part with an AI-based predictive model that can make the simulation faster.”
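A stripped-down version of that swap might look like the following (a toy stand-in of our own, not the team’s system): fit a cheap predictive model to a modest number of expensive evaluations, then let the simulation’s inner loop call the model instead.

```python
# Replace an expensive kernel with a cheap learned surrogate.
import time
import numpy as np

def expensive_kernel(x):
    time.sleep(0.01)                 # stand-in for heavy physics computation
    return np.exp(-x) * np.sin(5 * x)

# Offline: sample the expensive kernel and fit the predictive model.
xs = np.linspace(0.0, 2.0, 50)
ys = np.array([expensive_kernel(x) for x in xs])
surrogate = np.poly1d(np.polyfit(xs, ys, deg=9))

# Online: the simulation loop now calls the cheap model instead.
t0 = time.perf_counter()
values = surrogate(np.linspace(0.0, 2.0, 1000))
elapsed = time.perf_counter() - t0
print(f"1000 surrogate evaluations in {elapsed * 1000:.1f} ms "
      f"(1000 direct kernel calls would take ~10 s here)")
```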

Industry support

The AI techniques that are expected to drive discovery are only as good as the tech that drives them, making collaboration between industry and the national labs essential.

“Industry is investing a tremendous amount in building up AI tools,” says Taylor. “Their efforts shouldn’t be duplicated, but they should be leveraged. Also, industry comes in with a different perspective, so by working together, the solutions become more robust.”

Argonne has long had relationships with computing manufacturers to deliver a succession of ever-more-powerful machines to handle the exponential growth in data size and simulation scale. Its most recent partnership is with semiconductor chip manufacturer Intel and supercomputer manufacturer Cray to develop the exascale machine Aurora.

But the Laboratory is also collaborating with a host of other industrial partners in the development or provision of everything from chip design to deep learning-enabled video cameras.

One of these, Cerebras, is working with Argonne to test a first-of-its-kind AI accelerator that provides a 100–500 times improvement over existing AI accelerators. As the company’s first U.S. customer, Argonne will deploy the Cerebras CS-1 to enhance scientific AI models for cancer, cosmology, brain imaging and materials science, among others.

The National Science Foundation-funded Array of Things, a partnership between Argonne, the University of Chicago and the City of Chicago, actively seeks commercial vendors to supply technologies for its edge computing network of programmable, multi-sensor devices.

The Waggle project, part of the Array of Things partnership, aims to enable a new breed of smart city research and sensor-driven environmental science. (Image by Argonne National Laboratory.)

But Argonne and the other national labs are not the only ones to benefit from these collaborations. Companies understand the value in working with such organizations, recognizing that the AI tools developed by the labs, combined with the kinds of large-scale problems they seek to solve, offer industry unique benefits in terms of business transformation and economic growth, explains Balaprakash.

“Companies are interested in working with us because of the type of scientific applications that we have for machine learning,” he adds. “What we have is so diverse, it makes them think a lot harder about how to architect a chip or design software for these types of workloads and science applications. It’s a win-win for both of us.”

AI’s future, our future

“There is one area where I don’t see AI surpassing humans any time soon, and that is hypotheses formulation,” says Stan, “because that requires creativity. Humans propose interesting projects, and for that you need to be creative, make correlations, propose something out of the ordinary. It’s still human territory, but machines may soon take the lead.”

“It may happen,” he says, and adds that he’s working on it.

In the meantime, Argonne researchers continue to push the boundaries of existing AI methods and forge new components for the AI toolbox. Deep learning techniques like neuromorphic algorithms, which exhibit the adaptive nature of insects in an equally small computational space, can be used at the “edge,” where there are few computing resources, as in cell phones or urban sensors.

An optimizing technique called neural architecture search, in which one neural network system improves another, is helping to automate the development of deep-learning-based predictive models in several scientific and engineering domains, such as cancer drug discovery and weather forecasting on supercomputers.
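To make the idea concrete, here is a toy architecture search (a random-search baseline of our own devising, far simpler than production systems such as DeepHyper): an outer loop proposes candidate network shapes and keeps the one that scores best.

```python
# A toy neural architecture search via random sampling.
import random

random.seed(0)

def validation_score(depth, width):
    # Hypothetical stand-in for "train this candidate network and report
    # its validation accuracy"; here it simply rewards one shape.
    return 1.0 - 0.05 * abs(depth - 4) - abs(width - 64) / 512

depths = range(1, 9)                  # the architecture search space
widths = [16, 32, 64, 128, 256]

best, best_score = None, float("-inf")
for _ in range(30):                   # propose and evaluate candidates
    arch = (random.choice(depths), random.choice(widths))
    score = validation_score(*arch)
    if score > best_score:
        best, best_score = arch, score

print(f"best architecture: depth={best[0]}, width={best[1]}")
```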

Just as big data and better computational tools drove the convergence of simulation, data analysis and visualization, the introduction of the exascale computer Aurora into Argonne’s complex of leadership-class tools and experts will only accelerate the evolution of AI and its full assimilation into traditional techniques.

The tools may change, the definitions may change, but AI is here to stay as an integral part of the scientific method and our lives.

Some additional material provided by Jo Napolitano.

Acknowledgments

The CANDLE research was supported by the Exascale Computing Project (17-SC-20-SC), a collaborative effort of DOE’s Office of Science and the National Nuclear Security Administration.

Funding for the work on machine learning and computer vision with the Flame Spray Pyrolysis technology is provided by Argonne’s Laboratory-Directed Research and Development (LDRD) program.

The Laboratory’s neuromorphic research is funded by Argonne’s CELS LDRD Expedition and the Defense Advanced Research Projects Agency’s (DARPA) Lifelong Learning Machines Program.

The DeepHyper research is sponsored by the DOE 2018 Early Career Award, which is funded by the Advanced Scientific Computing Research (ASCR) program within DOE’s Office of Science.

Project work associated with the Array of Things is funded by the National Science Foundation.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.