The Chinese Computational Cosmology Group has simulated the biggest virtual universe yet on the world’s fastest supercomputer, Sunway TaihuLight. It is five times bigger than the previous largest virtual universe, created by the University of Zurich, Switzerland.
In June 2017, the University of Zurich produced a catalogue of 25 billion artificial galaxies from 2 trillion digital particles, using a giant supercomputer. The catalogue is currently being used for the Euclid satellite experiment (to be launched in 2020), which aims to analyze the nature of dark energy and dark matter.
Around a month later, China ran a more advanced successor by utilizing 10 million CPU cores of the world’s most powerful supercomputer (with 93 petaflops of processing power), Sunway TaihuLight, in Wuxi.
It seems that China is taking full advantage of the superior machine power that has outpaced other countries over the last couple of years. Recreating the universe is only the first step. Scientists believe that within three years China will lead in making new discoveries about the birth of the universe.
Gao Liang, chair scientist of the Computational Cosmology Group in Beijing, explained that the team simulated the early expansion of our universe with 10 trillion digital particles. To run the simulation, they used 10 million CPU cores, with each core executing multiple instructions in order to boost calculation speed.
But while the Swiss project ran for 80 hours, the Chinese ‘virtual universe’ was executed for only an hour.
Limitations of the Machine
The N-body simulation grows more demanding as the number of particles increases. In the 1970s, it was possible to simulate only 1,000 particles. In four decades, China has reached the trillion-particle scale. But Chinese supercomputers have hardly ever operated at full capacity: running at full throttle would place a lot of strain on the hardware. They need smart software to coordinate the vast number of cores and processing units within the system.
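To see why the cost intensifies with particle count, consider the simplest direct-sum form of an N-body step, in which every particle feels gravity from every other particle, giving O(N²) work per timestep. The sketch below is purely illustrative (all units, constants, and the `step` function are assumptions, not the team's actual code); production cosmology codes replace the double loop with approximate tree or mesh methods and massive parallelism.

```python
# Illustrative sketch only: a direct-sum N-body timestep.
# Each of the N particles sums forces from the other N-1 particles,
# so cost grows as O(N^2) -- the reason trillion-particle runs
# strain even the fastest machines.
import random

G = 1.0          # gravitational constant (arbitrary units)
DT = 0.01        # timestep
SOFTENING = 0.1  # avoids infinite forces at tiny separations

def step(pos, vel, mass):
    """Advance all particles by one timestep using pairwise gravity."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + SOFTENING ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * mass[j] * dx[k] * inv_r3
    for i in range(n):
        for k in range(3):
            vel[i][k] += acc[i][k] * DT
            pos[i][k] += vel[i][k] * DT
    return pos, vel

# Tiny demo with 100 particles; a trillion-particle run would need
# roughly 10^16 times more pairwise work per step than this.
random.seed(0)
n = 100
pos = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n)]
vel = [[0.0, 0.0, 0.0] for _ in range(n)]
mass = [1.0 / n] * n
pos, vel = step(pos, vel, mass)
```

The quadratic scaling is why coordinating millions of cores with purpose-built software, as described above, matters so much at this scale.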
Lin Weipeng, a professor in the Institute of Astronomy and Space Science at Sun Yat-sen University, said: ‘Because of the unique architecture of TaihuLight and its chips, Liang’s team had to create most of the software from scratch. It was a labour-intensive task involving a lot of complex algorithms. A single small error could have crashed the whole machine.’
Due to the lack of efficient software, Chinese supercomputers frequently have to split their processing units to execute small- and medium-scale tasks for different users. ‘This is not what supercomputers are designed for, but it is about to change,’ Lin added.
These kinds of simulations will help astronomers focus on particular regions of the universe. Researchers are calling this milestone a ‘warm-up exercise’. Their eye is on simulating the entire universe from its birth to the current date. But the machine required to hold 13.8 billion years of universe data hasn’t been built yet.
The next, bigger simulation will be executed on TaihuLight’s successor, which will be about 10 times faster and is due to be completed by 2019. That supercomputer will work alongside the world’s largest radio telescope (250 metres in radius), in Guizhou, China.