Selene (supercomputer)

Selene is a supercomputer developed by Nvidia, capable of achieving 63.460 petaflops, which ranked it as the fifth-fastest supercomputer in the world when it entered the TOP500 list. Selene is built on the Nvidia DGX SuperPOD, a turnkey high-performance supercomputing solution based on DGX hardware. The DGX SuperPOD is a tightly integrated system that combines high-performance DGX compute nodes, consisting of AMD CPUs and Nvidia A100 GPUs, with fast storage and high-bandwidth Mellanox HDR networking, and is aimed at demanding machine learning workloads. Selene was built in three months and was the fastest industrial system in the US, as well as the second-most energy-efficient supercomputing system ever.
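A rough sense of how the 63.460-petaflop figure relates to the hardware above can be sketched with back-of-envelope arithmetic. The per-GPU throughput below is an assumption taken from Nvidia's published A100 specification (19.5 TFLOPS FP64 via tensor cores), not a figure from this article, so the resulting efficiency is illustrative only.

```python
# Back-of-envelope estimate of Selene's theoretical peak FP64 throughput.
# ASSUMPTION: 19.5 TFLOPS FP64 per A100 (tensor-core peak, per Nvidia's spec
# sheet); the GPU count and measured petaflops come from the article.
GPUS = 4320
TFLOPS_PER_GPU_FP64 = 19.5
MEASURED_PFLOPS = 63.460  # Linpack result cited in the article

peak_pflops = GPUS * TFLOPS_PER_GPU_FP64 / 1000  # convert TFLOPS -> PFLOPS
efficiency = MEASURED_PFLOPS / peak_pflops

print(f"theoretical peak: {peak_pflops:.2f} PFLOPS")
print(f"achieved fraction of peak: {efficiency:.0%}")
```

Under these assumptions the machine would have a theoretical peak of roughly 84 petaflops, with the measured Linpack result landing around three-quarters of that, a plausible range for a large GPU cluster.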

Using its 1,080 AMD Epyc CPUs and 4,320 A100 GPUs, Selene trained BERT, a natural language processing model, in less than 16 seconds, a task that takes most smaller systems about 20 minutes. IEEE Spectrum reported that, as of December 2021, Selene topped the MLPerf benchmark results among all commercially available supercomputing systems. MLPerf is a benchmark suite developed by a consortium of artificial intelligence developers from academia, research labs, and industry to evaluate, without bias, the training and inference performance of hardware, software, and services used for AI.
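The BERT comparison above amounts to a simple speedup calculation. The 20-minute baseline is the article's approximate figure for smaller systems, treated here as exact for illustration.

```python
# Speedup implied by the training times quoted in the article.
baseline_seconds = 20 * 60  # ~20 minutes on a typical smaller system
selene_seconds = 16         # Selene's BERT training time (under 16 s)

speedup = baseline_seconds / selene_seconds
print(f"Selene is roughly {speedup:.0f}x faster")
```

That is, Selene completes the run roughly 75 times faster than the quoted baseline.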

Selene has been deployed by Argonne National Laboratory to research ways of combating the coronavirus. It has been used to tackle problems in protein docking and quantum chemistry, which are vital to understanding the coronavirus and developing a potential cure for it.

Nvidia used Selene to train its GauGAN2 AI model, which underlies the Nvidia Canvas software for creating art with artificial intelligence; the model was trained on 10 million landscape images. GauGAN2 combines segmentation mapping, inpainting, and text-to-image generation in a single model.