The EU talks a big game when it comes to the green transition. While decarbonisation efforts have faced significant political backlash in several sectors, one area is proving an undoubted success: the energy efficiency of its EuroHPC supercomputers.

The European joint undertaking’s first exascale supercomputer, JUPITER, is currently being built at Forschungszentrum Jülich in North Rhine-Westphalia, Germany. Its first precursor module, installed in April, is named JEDI (this news should obviously have been released on May 4th, if you ask us).

Beyond offering some nerds the ultimate acronym opportunity, the Jupiter Exascale Development Instrument has also just ranked first on the Green500 list of the world’s most energy-efficient supercomputers.

Data centres consume massive amounts of energy, and the demand for compute will only increase as training and running AI gobble up more of our precious resources. According to Arm CEO Rene Haas, AI could account for as much electricity consumption as India by 2030.

As such, it is imperative that all the supercomputer projects announced over the past couple of years, the machines needed to train state-of-the-art AI, are held to account for the energy they require to run.

Nvidia for the HPC win, again

The Green500 is a ranking derived from the biannual Top500 list. While the latter ranks systems purely on performance, the former rates them according to how energy efficient they are. In another triumph for Nvidia, almost all of the leading systems on the Green500 rely mostly on graphics processing units, or GPUs.
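For context, the Green500’s efficiency score is simply sustained benchmark performance divided by average power draw, expressed in gigaflops per watt. Here’s a minimal sketch of that calculation in Python, using illustrative figures rather than any system’s official submission:

```python
def gflops_per_watt(rmax_tflops: float, power_kw: float) -> float:
    """Green500-style efficiency score.

    Sustained performance in TFLOP/s divided by power in kW
    conveniently comes out in GFLOP/s per watt.
    """
    return rmax_tflops / power_kw

# Illustrative numbers only, not an official Green500 measurement.
print(f"{gflops_per_watt(rmax_tflops=4500.0, power_kw=66.0):.1f} GFLOPS/W")  # 68.2
```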

JEDI is based on Nvidia’s GH200 Grace Hopper Superchip, which combines the company’s GPU and CPU architectures. It also uses direct hot-water liquid cooling, part of the Eviden BullSequana XH3000 architecture, which consumes significantly less energy than conventional air cooling.

When complete, the JUPITER system will feature 24,000 Nvidia Grace Hopper Superchips across 125 BullSequana XH3000 racks and will surpass the threshold of one exaflop: a computational capacity of one quintillion (a 1 followed by 18 zeros) floating point operations per second.
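To put that figure in perspective, here’s a back-of-the-envelope comparison, assuming (purely for illustration) that a typical laptop sustains around 100 gigaflops:

```python
EXAFLOP = 1e18        # one quintillion floating point operations per second
LAPTOP_FLOPS = 100e9  # assumed sustained rate of a typical laptop (illustrative)

# Time a laptop would need to match a single second of exascale compute:
seconds = EXAFLOP / LAPTOP_FLOPS
print(f"{seconds:,.0f} s, roughly {seconds / 86_400:.0f} days")  # 10,000,000 s, ~116 days
```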

For 8-bit calculations specifically, a lower-precision format commonly used in AI training, the computing power climbs to well over 70 exaflops. That would make the system, due to enter general operation in early 2025, the most powerful AI system in the world. At least until the next one comes online, because “always in motion is the future.”
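The gulf between one exaflop at 64-bit precision (the Top500 standard) and 70-plus exaflops at 8 bits comes down to a trade-off: each low-precision number needs a fraction of the memory and bandwidth, so the hardware can process far more of them per second, at the cost of accuracy. A toy illustration of that precision loss, using NumPy’s float16 as a stand-in since NumPy has no native 8-bit float type:

```python
import numpy as np

# float64 is the precision behind the Top500's one-exaflop threshold;
# float16 stands in for 8-bit floats, which NumPy does not provide natively.
x = np.float64(0.1234567890123456)
print(np.float16(x))                    # ~0.1234: most of the digits vanish
print(x.nbytes, np.float16(x).nbytes)   # 8 bytes vs 2 bytes per value
```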


