Chinese researchers claim to have demonstrated how artificial intelligence can extend the reach of classical supercomputing into the domain of quantum chemistry, using China’s Sunway OceanLite system to model molecular behavior at an unprecedented scale.
The team used the OceanLite, also known as the “New Sunway,” to train a neural network capable of simulating the quantum states of molecules, a task traditionally reserved for quantum computers or heavily simplified classical models. Their work applied a method called neural-network quantum states (NNQS), which uses machine learning to approximate how electrons move and interact within atoms and molecules.
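As a rough illustration of the idea (not the team’s actual Transformer-based ansatz), a toy NNQS maps each electron occupation pattern to a wavefunction amplitude using a small neural network; the network shape, sizes, and parameters below are purely illustrative.

```python
# Minimal sketch of a neural-network quantum state (NNQS) ansatz.
# Hypothetical stand-in: a small feed-forward network maps an electron
# occupation bitstring (one entry per spin orbital) to an unnormalized
# wavefunction amplitude. The study itself uses a Transformer-based ansatz.
import numpy as np

rng = np.random.default_rng(0)

N_ORBITALS = 8   # illustrative size, far below the study's 120 spin orbitals
HIDDEN = 16

# Randomly initialized parameters of the toy ansatz.
W1 = rng.normal(scale=0.1, size=(HIDDEN, N_ORBITALS))
b1 = np.zeros(HIDDEN)
w2 = rng.normal(scale=0.1, size=HIDDEN)

def log_amplitude(config: np.ndarray) -> float:
    """Return log|psi(config)| for a 0/1 occupation vector over spin orbitals."""
    hidden = np.tanh(W1 @ config + b1)
    return float(w2 @ hidden)

# Example: amplitude of one configuration (4 electrons in 8 spin orbitals).
config = np.array([1, 1, 0, 1, 0, 0, 1, 0], dtype=float)
print(np.exp(log_amplitude(config)))
```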
Running on 37 million processor cores, the Sunway system achieved 92% strong scaling and 98% weak scaling, meaning the speedup on a fixed problem reached 92% of ideal as cores were added, while runtime stayed nearly constant as the problem grew along with the core count. Efficiency that high is rarely achieved at this scale and indicates close alignment between software and hardware, according to an article by VAST Data’s Head of Industry Relations (and former HPCwire Managing Editor) Nicole Hemsoth Prickett.
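For readers unfamiliar with the terminology, the two figures are conventionally computed as shown below; the timings in the snippet are hypothetical stand-ins, since the study’s raw numbers have not been published.

```python
# Conventional definitions of the two scaling-efficiency figures.
# The timings here are illustrative only, not the study's measurements.
def strong_scaling_efficiency(t_base, t_scaled, core_ratio):
    """Fixed problem size: ideal speedup equals the increase in core count."""
    return (t_base / t_scaled) / core_ratio

def weak_scaling_efficiency(t_base, t_scaled):
    """Problem size grows with core count: ideal runtime stays constant."""
    return t_base / t_scaled

# Hypothetical run: 16x more cores cuts runtime from 1000 s to 68 s,
# giving roughly the 92% strong-scaling efficiency reported for the system.
print(strong_scaling_efficiency(1000.0, 68.0, 16))   # ~0.92
# Hypothetical run: a 16x larger problem on 16x more cores takes 1020 s
# instead of 1000 s, giving roughly the reported 98% weak-scaling efficiency.
print(weak_scaling_efficiency(1000.0, 1020.0))       # ~0.98
```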
Quantum chemistry simulations must account for all possible configurations of a molecule’s electrons, and the number of those configurations grows exponentially with system size, which is why traditional exact methods can model only small systems. NNQS seeks to overcome that limit by training a neural network to approximate a molecule’s wavefunction, which mathematically represents how its electrons are distributed across quantum states.
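A quick back-of-the-envelope count shows why exact approaches hit a wall; the half-filled systems below are assumed purely for illustration.

```python
# Why exact methods stall: the number of ways to place electrons in spin
# orbitals grows combinatorially with system size.
from math import comb

for n_orbitals in (20, 40, 80, 120):
    n_electrons = n_orbitals // 2   # assume half-filling for illustration
    print(f"{n_orbitals} spin orbitals: {comb(n_orbitals, n_electrons):.3e} configurations")
```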
In this study, the researchers modeled systems containing up to 120 spin orbitals, extending the scale of neural-network quantum simulations beyond previous limits. The network was trained to predict local energies for sampled electron configurations, then refined until its predictions converged on the molecule’s true energy distribution.
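In spirit, though not in the paper’s exact formulation, this kind of training estimates the energy by averaging local energies over sampled configurations and then adjusting the network to lower that estimate. The sketch below uses a placeholder local-energy function, not the real molecular Hamiltonian.

```python
# Sketch of the training signal described above: sample electron
# configurations, compute a local energy for each, and average them to
# estimate the variational energy the network is trained to minimize.
import numpy as np

rng = np.random.default_rng(1)

def local_energy(config: np.ndarray) -> float:
    """Placeholder: a real implementation evaluates <config|H|psi> / psi(config)."""
    return -1.0 - 0.05 * config.sum() + rng.normal(scale=0.01)

def estimate_energy(samples: list) -> float:
    """Monte Carlo estimate of the energy from sampled occupation vectors."""
    return float(np.mean([local_energy(c) for c in samples]))

# Hypothetical batch of sampled occupation vectors (8 spin orbitals each).
samples = [rng.integers(0, 2, size=8).astype(float) for _ in range(1024)]
print(estimate_energy(samples))
```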
The Sunway OceanLite, successor to the TaihuLight supercomputer, is powered by SW26010-Pro processors built from clusters of small compute cores that rely on software-managed local memory rather than cache, giving programmers precise control over data movement. Tens of thousands of these processors are linked to form a system with more than forty million cores, capable of exascale performance, according to the VAST Data report. The architecture is well-suited to regular, repetitive tasks such as deep-learning training, so the researchers had to adapt it to the irregular workloads of quantum simulation.
Adapting it involved creating a data-parallel NNQS-Transformer framework tailored to the machine’s layered design. Management cores coordinated communication among nodes, while lightweight compute elements performed calculations within local memory. A dynamic load-balancing algorithm helped distribute uneven workloads, ensuring no cores remained idle.
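One simple way to picture such load balancing is a greedy heuristic that always hands the next, most expensive batch of work to the currently least-loaded worker. The paper’s actual scheduler is not described in detail, so the function and task costs below are illustrative only.

```python
# Illustrative load-balancing heuristic: assign variable-cost tasks,
# largest first, to whichever worker currently has the least total work.
# This is a generic sketch, not the framework's actual algorithm.
import heapq

def greedy_assign(task_costs, n_workers):
    """Map task indices to workers so that no worker sits idle for long."""
    loads = [(0.0, w) for w in range(n_workers)]   # (accumulated cost, worker id)
    heapq.heapify(loads)
    assignment = {w: [] for w in range(n_workers)}
    for task_id, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        load, worker = heapq.heappop(loads)        # least-loaded worker so far
        assignment[worker].append(task_id)
        heapq.heappush(loads, (load + cost, worker))
    return assignment

# Hypothetical uneven sampling costs spread over 4 workers.
print(greedy_assign([5.0, 1.0, 3.0, 8.0, 2.0, 2.0, 7.0], n_workers=4))
```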
The project demonstrates that machine learning can model quantum systems accurately enough for practical chemistry and materials research using existing exascale hardware. The Sunway study expands on earlier NNQS efforts, showing that classical supercomputers can now handle molecular problems once thought to require quantum hardware. The results also highlight a potential bridge between classical and quantum computing: using neural networks on traditional machines to explore the same physical systems that future quantum computers will study directly.
While full performance details remain undisclosed, the research is another step forward in China’s development of large-scale, AI-enabled scientific computing. It also suggests that supercomputers can serve as powerful platforms for quantum-inspired simulations, bringing new materials discovery within reach before practical quantum processors are available. Along with recent work from SandboxAQ and Nvidia, which used AI accelerators to perform quantum chemistry simulations on GPUs, these studies show the growing convergence between AI hardware, HPC architectures, and scientific modeling.