Scientists across the globe are using artificial intelligence (AI) to design better materials and processes, accelerate drug discovery, explore the mysteries of the universe, automate traditional research, and drive an array of scientific discoveries.
This colloquium, the fourth in the series celebrating Argonne’s 75th anniversary, will engage a panel of experts to discuss how AI is used in, and has transformed, science.
At Argonne and other U.S. Department of Energy (DOE) laboratories, “AI for Science” broadly means the next generation of methods and scientific opportunities involving the use of computational learning and machine intelligence. This includes the development and application of AI methods (e.g., machine learning, deep learning, statistical methods, data analytics, automated control, and related areas) to build models from data and use the models to advance scientific research.
Please join us as we explore the transformational potential of AI: how to best define it, how it can impact diverse areas of science, what we’ve learned so far, and where we go from here.
- Jonathan Rowe
Computer Scientist and Professor of Natural Computation, University of Birmingham;
Program Director, Data Science for Science and Humanities, Alan Turing Institute
- Rick Stevens
Associate Laboratory Director for Computing, Environment and Life Sciences, Argonne National Laboratory
- Douglas Finkbeiner
Professor of Astronomy and Physics, Department of Astronomy, Harvard University
- Patrick Riley
Senior Vice President, Artificial Intelligence, Relay Therapeutics (Relay Tx)
- Subramanian Sankaranarayanan
Group Leader, Theory and Modeling Group, Center for Nanoscale Materials, Nanoscience & Technology, Argonne National Laboratory;
Associate Professor, Department of Mechanical and Industrial Engineering, University of Illinois Chicago
- Rebecca Willett
Professor of Statistics and Computer Science, University of Chicago
Jonathan Rowe is a Computer Scientist and Professor of Natural Computation at the University of Birmingham and Program Director for Data Science for Science and Humanities at the Alan Turing Institute. He holds a degree in Mathematics and a Ph.D. in Computer Science from the University of Exeter. He has recently become Chair and Principal Investigator of the £38.8 million AI for Science and Government program at the Turing Institute and Deputy Pro-Vice Chancellor for Strategic Projects at the University of Birmingham.
Rick Stevens is Argonne’s Associate Laboratory Director for the Computing, Environment and Life Sciences directorate. Stevens has been at Argonne since 1982 and has served as director of the Mathematics and Computer Science division and as Acting Associate Laboratory Director for Physical, Biological and Computing Sciences. He is currently leader of Argonne’s Exascale Computing Initiative and a Professor of Computer Science in the University of Chicago’s Physical Sciences Collegiate Division. From 2000 to 2004, Stevens served as Director of the National Science Foundation’s TeraGrid Project, and from 1997 to 2001 as Chief Architect for the National Computational Science Alliance. Stevens is interested in developing innovative tools and techniques that enable computational scientists to solve important large-scale problems effectively on advanced scientific computers. His research focuses on three principal areas: advanced collaboration and visualization environments, high-performance computer architectures (including grids), and computational problems in the life sciences. In addition to his research, Stevens teaches courses on computer architecture, collaboration technology, virtual reality, parallel computing, and computational science.
Douglas Finkbeiner, Professor of Astronomy and Physics at Harvard University, specializes in astroparticle phenomenology. His research interests include galactic dust, especially as a foreground for cosmology; applied machine learning in astronomy, including neural networks and the wavelet scattering transform; inference problems on large data sets; observable consequences of dark matter annihilation; and high-energy astrophysics (e.g., the “Fermi Bubbles”). Finkbeiner has worked in research teams at UC Berkeley and Princeton University, where he was a Hubble Fellow and then a Russell-Cotsen Fellow in the Department of Astrophysical Sciences. In 2006, he joined the faculty at Harvard University in the Department of Astronomy, with a joint appointment in the Physics Department since 2009. He has been active in large optical surveys (SDSS-I, II and Pan-STARRS1) and is currently a member of SDSS-V and DESI.
Patrick Riley leads the artificial intelligence group at Relay Therapeutics (Relay Tx), applying learning methods to the discovery process. He has over 15 years of data science and machine learning experience. He came to Relay Tx from Google, where he was a principal software engineer and a lead of the Accelerated Science team. His work spanned areas as diverse as cell imaging, nuclear fusion, and materials science. His most important work was on the application of machine learning to small molecules, including foundational work on graph neural networks and their application to DNA-encoded small-molecule library screening. Riley holds a Ph.D., M.S., and B.S. in Computer Science from Carnegie Mellon University.
Subramanian Sankaranarayanan is the Group Leader of the Theory and Modeling group in the Nanoscience and Technology division at Argonne National Laboratory and an Associate Professor in the Department of Mechanical and Industrial Engineering at the University of Illinois Chicago. He is also a Senior Fellow at the Institute for Molecular Engineering at the University of Chicago. Prior to joining Argonne, Sankaranarayanan was a postdoctoral fellow at the School of Engineering and Applied Sciences at Harvard University. His research focuses on using machine learning to bridge the electronic, atomistic, and mesoscopic scales for accelerated materials discovery and design. He uses machine learning techniques to develop first-principles-based force fields for simulating reactive and mesoscopic systems. Other programmatic efforts include the development and use of AI algorithms for the inverse design of materials and deep learning for integrated X-ray imaging of ultrafast energy transport across solid-solid and solid-liquid interfaces. His interests span a diverse range of applications, from energy storage, tribology, and corrosion to neuromorphic computing and thermal management. He is a co-inventor on four patents and has co-authored more than 150 journal publications, including several in high-impact journals such as Science, Nature, Nature Materials, Nature Communications, Proceedings of the National Academy of Sciences, ACS Nano, Nano Letters, and Physical Review Letters.
Rebecca Willett is a Professor of Statistics and Computer Science at the University of Chicago. Her research interests include signal processing, machine learning, and large-scale data science. In particular, she has studied methods to leverage low-dimensional models in a variety of contexts, including when data are high dimensional, contain missing entries, are subject to constrained sensing or communication resources, correspond to point processes, or arise in ill-conditioned inverse problems. This work lies at the intersection of high-dimensional statistics, inverse problems in imaging and network science (including compressed sensing), learning theory, algebraic geometry, optical engineering, nonlinear approximation theory, statistical signal processing, and optimization theory. Her group has made contributions both in the mathematical foundations of signal processing and machine learning and in their application to a variety of real-world problems. She is involved in active collaborations with researchers in astronomy, materials science, microscopy, electronic health record analysis, cognitive neuroscience, precision agriculture, biochemistry, and atmospheric science.