
The Future of Chemistry is through Computations

Published online by Cambridge University Press:  22 October 2024

Giulia Palermo*
Affiliation:
Department of Bioengineering, University of California Riverside, Riverside, CA, USA; Department of Chemistry, University of California Riverside, Riverside, CA, USA
Bengt Nordén
Affiliation:
Department of Chemistry and Chemical Engineering, Chalmers University of Technology, Gothenburg, Sweden
*Corresponding author: Giulia Palermo; Email: gpalermo@engr.ucr.edu

Type
Editorial
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided that no alterations are made and the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use and/or adaptation of the article.
Copyright
© The Author(s), 2024. Published by Cambridge University Press

The 2024 Nobel Prize in Chemistry award celebrates the transformative power of computational methods in chemistry and highlights the importance of interdisciplinary approaches in addressing some of the most pressing challenges in science today.

This year’s award recognizes groundbreaking contributions that have reshaped the scientific landscape. David Baker is honored for his pioneering work in computational protein design, while John Jumper and Demis Hassabis are celebrated for their revolutionary advancements in protein structure prediction through the development of AlphaFold. Together, these innovations have dramatically accelerated progress in biology and medicine, fundamentally changing how scientists understand, design, and manipulate proteins.

David Baker, a biochemist at the University of Washington, has been a pioneering figure in computational protein design (Simons et al., 1999). His development of the Rosetta software suite has not only enabled accurate predictions of protein structures but also revolutionized the ability to design entirely novel proteins with tailored functions (Huang et al., 2016). This marks a monumental shift in biochemistry and molecular biology, where understanding protein function has traditionally depended on the study of naturally occurring proteins. Baker’s groundbreaking work has shown that it is possible to transcend the limitations of nature by creating new proteins from scratch, engineered to perform specific tasks such as catalyzing chemical reactions, binding to particular molecules, or even fighting diseases.

John Jumper and Demis Hassabis, working at DeepMind, revolutionized the field of protein structure prediction with their development of AlphaFold, an artificial intelligence (AI) approach that represents a quantum leap in this area (Senior et al., 2020; Jumper et al., 2021). AlphaFold’s breakthrough came in 2020, when it achieved unprecedented accuracy in predicting protein structures from amino acid sequences, solving a problem that had confounded scientists for decades. By leveraging advanced deep learning techniques, AlphaFold was able to predict the three-dimensional structures of proteins, accelerating biological research with far-reaching implications for drug discovery, biotechnology, and the understanding of fundamental biological processes.
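
As a practical illustration of how such predictions are now consumed downstream, the sketch below retrieves a precomputed AlphaFold model from the public AlphaFold Protein Structure Database by UniProt accession. It is a minimal sketch only: the endpoint URL, the 'pdbUrl' field name, and the example accession P69905 (human hemoglobin subunit alpha) are assumptions based on the publicly documented REST interface and may change.

```python
import json
import urllib.request

# Minimal sketch (not part of AlphaFold itself): download a precomputed model
# from the public AlphaFold Protein Structure Database. The endpoint and the
# 'pdbUrl' field name follow the publicly documented REST API and may change.

def fetch_alphafold_model(uniprot_accession: str) -> str:
    """Return the predicted structure (PDB format) for a UniProt accession."""
    api_url = f"https://alphafold.ebi.ac.uk/api/prediction/{uniprot_accession}"
    with urllib.request.urlopen(api_url) as response:
        entries = json.load(response)
    # Each entry is expected to point at the downloadable model coordinates.
    pdb_url = entries[0]["pdbUrl"]
    with urllib.request.urlopen(pdb_url) as response:
        return response.read().decode()

if __name__ == "__main__":
    # Example accession: P69905, human hemoglobin subunit alpha.
    model = fetch_alphafold_model("P69905")
    with open("P69905_alphafold.pdb", "w") as handle:
        handle.write(model)
```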

Protein design and prediction tools can complement experimental techniques such as X-ray crystallography and cryo-electron microscopy. They also support biophysical approaches such as molecular dynamics simulations and single-molecule experiments, offering a powerful toolset for researchers across disciplines. The synergy of these technologies has the potential to revolutionize fields such as drug development and personalized medicine. Researchers can now use structure prediction to guide the design of novel proteins that do not exist in nature; biophysical methods can subsequently characterize their mechanisms of action, while in vitro and in vivo experiments can evaluate their real-world impact. This integrated approach could lead to significant breakthroughs in biotechnology, enabling the creation of enzymes with new functionalities and the development of entirely new classes of therapeutics.

The 2024 Nobel Prize in Chemistry also carries broader implications for the evolution of the discipline. For centuries, chemistry has primarily been an experimental science, rooted in empirical observation and manipulation. However, the rise of powerful computational tools has fundamentally transformed the way chemists tackle problems, enabling them to simulate and predict chemical phenomena in silico before conducting laboratory experiments. This trend is likely to accelerate, as computational methods become indispensable for the development of new materials, drugs, and technologies. This recognition underscores the growing importance of interdisciplinary collaboration in modern chemistry, particularly at the intersection of chemistry, computer science, and artificial intelligence. By integrating these diverse fields, researchers are pushing the boundaries of what is possible in chemistry and biotechnology. Their work exemplifies how collaboration across disciplines can lead to revolutionary discoveries and reshape our understanding of complex biological systems.

This year, the Nobel Committee has also drawn attention to the profound interconnectedness among the fields of computational science, protein design, and artificial intelligence by awarding the Nobel Prize in Physics to John J. Hopfield and Geoffrey E. Hinton. Their foundational contributions to machine learning and artificial neural networks have played a crucial role in advancing these technologies, further bridging the gap between computational methods and biological applications. This recognition highlights how innovations in one field can significantly impact others, fostering a collaborative environment that drives progress across disciplines.

John J. Hopfield, a physicist and neuroscientist, made significant contributions in the early 1980s with the development of the Hopfield network, a type of recurrent artificial neural network (Hopfield, 1982; Hopfield and Tank, 1986). This model introduced associative memory to neural computation, allowing networks to store and recall patterns, thus enhancing our understanding of information processing in a way that mimics biological neural networks. Key features of the Hopfield network include its ability to converge on stable states that represent stored memories or patterns, demonstrating complex behaviors such as pattern completion and error correction. His mathematical framework has influenced fields like statistical mechanics, optimization, and cognitive science.
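
To make the associative-recall idea concrete, the sketch below implements a toy Hopfield network in Python: patterns are stored with a Hebbian outer-product rule, and asynchronous sign updates drive a corrupted cue back to the nearest stored pattern. It is a minimal illustration under simplified assumptions (binary ±1 units, a single stored pattern), not Hopfield’s original formulation in full.

```python
import numpy as np

# Toy Hopfield network: Hebbian storage of binary (+1/-1) patterns and
# asynchronous updates that settle into a stored pattern, illustrating
# associative memory, pattern completion, and error correction.

def train(patterns: np.ndarray) -> np.ndarray:
    """Build the weight matrix with the Hebbian outer-product rule."""
    n_units = patterns.shape[1]
    weights = np.zeros((n_units, n_units))
    for pattern in patterns:
        weights += np.outer(pattern, pattern)
    np.fill_diagonal(weights, 0)          # no self-connections
    return weights / len(patterns)

def recall(weights: np.ndarray, state: np.ndarray, sweeps: int = 10) -> np.ndarray:
    """Asynchronously update units until the network settles."""
    state = state.copy()
    rng = np.random.default_rng(0)
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if weights[i] @ state >= 0 else -1
    return state

# Store one 8-unit pattern, corrupt three bits, and recover the original.
stored = np.array([[1, -1, 1, 1, -1, -1, 1, -1]])
weights = train(stored)
cue = stored[0].copy()
cue[:3] *= -1                              # flip three bits in the cue
print(recall(weights, cue))                # converges back to the stored pattern
```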

Geoffrey E. Hinton, often called the ‘Godfather of Deep Learning’, has been instrumental in advancing artificial neural networks (Fahlman et al., 1983). His work on backpropagation improved the training of deep neural networks, enabling them to learn complex data representations (Rumelhart et al., 1986). Hinton’s development of deep learning architectures has led to breakthroughs in image recognition, natural language processing, and reinforcement learning. His insights into unsupervised learning and hierarchical data representations have fundamentally changed how AI researchers approach problems, paving the way for systems that can generalize from data and perform tasks once thought to require human intelligence.
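
For readers unfamiliar with the mechanics, the sketch below shows backpropagation in its simplest form: a two-layer sigmoid network trained on XOR, where the output error is propagated backwards through the chain rule to obtain the weight gradients. It is a bare-bones illustration with assumed hyperparameters (network size, mean-squared error, a fixed learning rate), not the 1986 implementation.

```python
import numpy as np

# Bare-bones backpropagation: a 2-8-1 sigmoid network learns XOR by
# propagating the output error backwards (chain rule) and descending
# the resulting gradients. Hyperparameters here are illustrative.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(10000):
    # Forward pass through both layers.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    # Backward pass: error signal for each layer via the chain rule.
    delta_out = (output - y) * output * (1 - output)
    delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)
    # Gradient-descent updates.
    W2 -= lr * hidden.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hidden
    b1 -= lr * delta_hidden.sum(axis=0)

print(output.round(2))  # should approach [[0], [1], [1], [0]] after training
```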

The Nobel Prizes in Chemistry awarded to David Baker, John Jumper, and Demis Hassabis, alongside the Nobel Prize in Physics for John J. Hopfield and Geoffrey E. Hinton, reflect the profound interconnectedness of computational science, protein design, and artificial intelligence. The work of these scientists exemplifies a convergence of ideas and methodologies with significant implications for society.

In conclusion, this year’s award emphasizes the revolutionary impact of computational methods in chemistry and highlights the significance of interdisciplinary collaboration in tackling today’s most critical scientific challenges. The work of these scientists is driving innovation across industries, from pharmaceuticals to biotechnology, and its future implications for society promise to be equally groundbreaking.

Acknowledgements

G.P. acknowledges support from the National Institutes of Health [R01GM141329] and the National Science Foundation [CHE-2144823], as well as from the Sloan Foundation (grant no. FG-2023-20431) and the Camille and Henry Dreyfus Foundation (grant no. TC-24-063).

References

Fahlman, SE, Hinton, GE and Sejnowski, TJ (1983) Massively parallel architectures for AI: NETL, Thistle, and Boltzmann machines. In Proceedings of the AAAI-83 Conference. Washington, DC: AAAI Press, pp. 109–113.
Hopfield, JJ (1982) Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the United States of America 79, 2554–2558.
Hopfield, JJ and Tank, DW (1986) Computing with neural circuits: A model. Science 233, 625–633.
Huang, PS, Boyken, SE and Baker, D (2016) The coming of age of de novo protein design. Nature 537, 320–327.
Jumper, J, Evans, R, Pritzel, A, Green, T, Figurnov, M, Ronneberger, O, Tunyasuvunakool, K, Bates, R, Zidek, A, Potapenko, A, Bridgland, A, Meyer, C, Kohl, SAA, Ballard, AJ, Cowie, A, Romera-Paredes, B, Nikolov, S, Jain, R, Adler, J, Back, T, Petersen, S, Reiman, D, Clancy, E, Zielinski, M, Steinegger, M, Pacholska, M, Berghammer, T, Bodenstein, S, Silver, D, Vinyals, O, Senior, AW, Kavukcuoglu, K, Kohli, P and Hassabis, D (2021) Highly accurate protein structure prediction with AlphaFold. Nature 596, 583–589.
Rumelhart, DE, Hinton, GE and Williams, RJ (1986) Learning representations by back-propagating errors. Nature 323, 533–536.
Senior, AW, Evans, R, Jumper, J, Kirkpatrick, J, Sifre, L, Green, T, Qin, C, Zidek, A, Nelson, AWR, Bridgland, A, Penedones, H, Pedersen, S, Simonyan, K, Crossan, S, Kohli, P, Jones, DT, Silver, D, Kavukcuoglu, K and Hassabis, D (2020) Improved protein structure prediction using potentials from deep learning. Nature 577, 706–710.
Simons, KT, Bonneau, R, Ruczinski, I and Baker, D (1999) Ab initio protein structure prediction of CASP III targets using ROSETTA. Proteins: Structure, Function, and Genetics Suppl. 3, 171–176.