According to an official press release, the Nobel Prize in Physics 2024 was awarded to John J. Hopfield, Princeton University, NJ, USA, and Geoffrey E. Hinton, University of Toronto, Canada, “for foundational discoveries and inventions that enable machine learning with artificial neural networks.” The two Nobel Laureates in Physics used tools from physics to develop methods that are the foundation of today’s powerful machine learning.
John Hopfield created an associative memory that can store and reconstruct images and other types of data patterns. Geoffrey Hinton invented a method that can autonomously find properties in data and perform tasks such as identifying specific elements in pictures.
In an artificial neural network, the brain’s neurons are represented by nodes with different values. These nodes influence each other through connections that can be likened to synapses and which can be made stronger or weaker. The network is trained, for example, by developing stronger connections between nodes with simultaneously high values. This year’s laureates have conducted important work with artificial neural networks from the 1980s onward.
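As a rough illustration of that training idea, the minimal sketch below strengthens the connection between any two nodes that are active at the same time (a Hebbian-style update). The network size, pattern values and learning rate are assumptions chosen only for this example, not details from the laureates’ work.

```python
import numpy as np

def hebbian_update(weights, activity, learning_rate=0.1):
    """Strengthen the connection between nodes that are simultaneously active."""
    co_activity = np.outer(activity, activity)   # co-activity of every node pair
    np.fill_diagonal(co_activity, 0.0)           # no self-connections
    return weights + learning_rate * co_activity

n_nodes = 4
weights = np.zeros((n_nodes, n_nodes))
activity = np.array([1.0, 1.0, 0.0, 1.0])        # three nodes active at once
weights = hebbian_update(weights, activity)
print(weights)                                   # co-active pairs now have stronger links
```

After the update, only the pairs of nodes that were active together have non-zero connections, which is the sense in which the network “learns” from the example it was shown.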
“The laureates’ work has already been of the greatest benefit. In physics, we use artificial neural networks in many areas, such as developing new materials with specific properties,” says Ellen Moons, Chair of the Nobel Committee for Physics.
John Hopfield invented a network that uses a method for saving and recreating patterns. We can imagine the nodes as pixels. The Hopfield network utilises physics that describes a material’s characteristics due to its atomic spin – a property that makes each atom a tiny magnet. The network is described in a manner equivalent to the energy in the spin system found in physics and is trained by finding values for the connections between the nodes so that the saved images have low energy. When the Hopfield network is fed a distorted or incomplete image, it methodically works through the nodes and updates their values so the network’s energy falls. The network thus works stepwise to find the saved image that is most like the imperfect one it was fed with.
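The following is a minimal sketch of that store-and-recall procedure. Patterns are vectors of +1/−1 “pixels”, storing a pattern sets the connections so that it sits at low energy, and recall updates one node at a time so the energy falls toward the nearest saved pattern. The pattern length, number of corrupted pixels and number of update sweeps are illustrative assumptions.

```python
import numpy as np

def store(patterns):
    """Hebbian storage: each saved pattern becomes a low-energy state."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)                 # no self-connections
    return w / len(patterns)

def energy(w, state):
    """Spin-system-style energy: lowest for the saved patterns."""
    return -0.5 * state @ w @ state

def recall(w, state, sweeps=5):
    """Update nodes one by one, always choosing the value that lowers the energy."""
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):
            state[i] = 1.0 if w[i] @ state >= 0 else -1.0
    return state

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)
w = store(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                              # corrupt two "pixels"
print(energy(w, noisy), energy(w, recall(w, noisy)))   # energy falls during recall
print(recall(w, noisy))                                 # the saved pattern is recovered
```

Running the recall loop on the corrupted pattern lowers the energy step by step until the network settles on the stored pattern it most resembles, exactly the behaviour described above.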
Geoffrey Hinton used the Hopfield network as the foundation for a new network that uses a different method: the Boltzmann machine. This can learn to recognise characteristic elements in a given type of data. Hinton used tools from statistical physics, the science of systems built from many similar components. The machine is trained by feeding it examples that are very likely to arise when the machine is run. The Boltzmann machine can be used to classify images or create new examples of the type of pattern on which it was trained. Hinton has built upon this work, helping initiate the current explosive development of machine learning.
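As an illustrative stand-in for that training idea, the sketch below uses a restricted Boltzmann machine (a later, simplified form of the Boltzmann machine) trained with one step of contrastive divergence, a practical shortcut also due to Hinton; the layer sizes, learning rate and toy data are assumptions made for the example. The update nudges the connections so that the training example becomes a likely state when the machine is run.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(w, visible, learning_rate=0.1):
    """One contrastive-divergence update for a single binary training example."""
    # Upward pass: probability that each hidden node switches on given the data.
    hidden_prob = sigmoid(w.T @ visible)
    hidden = (rng.random(hidden_prob.shape) < hidden_prob).astype(float)
    # Downward pass: let the machine generate its own reconstruction of the data.
    recon_prob = sigmoid(w @ hidden)
    recon_hidden_prob = sigmoid(w.T @ recon_prob)
    # Make the data more likely and the machine's own sample less likely.
    return w + learning_rate * (np.outer(visible, hidden_prob)
                                - np.outer(recon_prob, recon_hidden_prob))

n_visible, n_hidden = 6, 3
w = 0.01 * rng.standard_normal((n_visible, n_hidden))
data = np.array([1, 1, 1, 0, 0, 0], dtype=float)   # toy pattern to be learned
for _ in range(200):
    w = cd1_step(w, data)
```

After training, the hidden nodes respond to the characteristic elements of the training pattern, which is what allows this family of machines to classify images or generate new examples of the kind they were trained on.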