
Supermicro Introduces Rack Scale Plug-and-Play Liquid-Cooled AI SuperClusters for NVIDIA Blackwell and NVIDIA HGX H100/H200 - Radical Innovations in the AI Era to Make Liquid-Cooling Free with a Bonus

June 05, 2024 01:00 PM AEST | By Cision

Generative AI SuperClusters, Integrated with NVIDIA AI Enterprise and NIM Microservices, Offer Instant ROI Gains and More AI Work per Dollar Through a Massively Scalable Compute Unit, Simplifying AI for Rapid Deployment

SAN JOSE, Calif. and TAIPEI, June 5, 2024 /PRNewswire/ -- Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is introducing a ready-to-deploy liquid-cooled AI data center, designed for cloud-native solutions that accelerate generative AI adoption for enterprises across industries with its SuperClusters, optimized for the NVIDIA AI Enterprise software platform for the development and deployment of generative AI. With Supermicro's 4U liquid-cooled systems, NVIDIA's recently introduced Blackwell GPUs can fully unleash 20 PetaFLOPS of AI performance on a single GPU, demonstrating 4X better AI training and 30X better inference performance than the previous generation of GPUs, with additional cost savings. Aligned with its first-to-market strategy, Supermicro recently announced a complete line of NVIDIA Blackwell architecture-based products for the new NVIDIA HGX™ B100, B200, and GB200 Grace Blackwell Superchip.

Plug-and-Play Liquid-Cooled AI SuperCluster

"Supermicro continues to lead the industry in creating and deploying AI solutions with rack-scale liquid-cooling," said Charles Liang, president and CEO of Supermicro. "Data centers with liquid-cooling can be virtually free and provide a bonus value for customers, with the ongoing reduction in electricity usage. Our solutions are optimized with NVIDIA AI Enterprise software for customers across industries, and we deliver global manufacturing capacity with world-class efficiency. The result is that we can reduce the time to delivery of our liquid-cooled or air-cooled turnkey clusters with NVIDIA HGX H100 and H200, as well as the upcoming B100, B200, and GB200 solutions. From cold plates to CDUs to cooling towers, our rack-scale total liquid cooling solutions can reduce ongoing data center power usage by up to 40%."
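The "up to 40%" power-reduction claim above can be put in rough dollar terms with simple arithmetic. The sketch below is illustrative only; the facility size, electricity price, and achieved reduction are hypothetical inputs, not figures from the announcement.

```python
# Illustrative only: annual energy-cost impact of an up-to-40% reduction
# in ongoing data center power usage. All inputs are hypothetical.
def annual_savings_usd(avg_power_mw: float, usd_per_mwh: float,
                       reduction: float = 0.40) -> float:
    """Estimated annual savings for a facility drawing avg_power_mw continuously."""
    hours_per_year = 8760
    return avg_power_mw * hours_per_year * usd_per_mwh * reduction

# A hypothetical 10 MW facility at $80/MWh, assuming the full 40% reduction:
print(round(annual_savings_usd(10, 80)))  # 2803200
```

Actual savings depend on workload, climate, and how much of the facility's load the liquid-cooling loop displaces.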

Visit www.supermicro.com/ai for more information.

At COMPUTEX 2024, Supermicro is revealing its upcoming systems optimized for the NVIDIA Blackwell GPU, including a 10U air-cooled and a 4U liquid-cooled NVIDIA HGX B200-based system. In addition, Supermicro will be offering an 8U air-cooled NVIDIA HGX B100 system and Supermicro's NVIDIA GB200 NVL72 rack containing 72 interconnected GPUs with NVIDIA NVLink Switches, as well as the new NVIDIA MGX™ systems supporting NVIDIA H200 NVL PCIe GPUs and the newly announced NVIDIA GB200 NVL2 architecture.

"Generative AI is driving a reset of the entire computing stack — new data centers will be GPU-accelerated and optimized for AI," said Jensen Huang, founder and CEO of NVIDIA. "Supermicro has designed cutting-edge NVIDIA accelerated computing and networking solutions, enabling the trillion-dollar global data centers to be optimized for the era of AI."

The rapid development of large language models and the continuous introduction of new open-source models such as Meta's Llama-3 and Mistral's Mixtral 8x22B make today's state-of-the-art AI models more accessible for enterprises. Simplifying AI infrastructure and providing access in the most cost-efficient way is paramount to supporting the current breakneck speed of the AI revolution. The Supermicro cloud-native AI SuperCluster bridges the gap between the instant access of cloud convenience and on-premises portability, leveraging NVIDIA AI Enterprise to move AI projects from pilot to production seamlessly at any scale. This provides the flexibility to run anywhere with securely managed data, including self-hosted systems or on-premises large data centers.

With enterprises across industries rapidly experimenting with generative AI use cases, Supermicro collaborates closely with NVIDIA to ensure a seamless and flexible transition from experimentation and piloting AI applications to production deployment and large-scale data center AI. This result is achieved through rack and cluster-level optimization with the NVIDIA AI Enterprise software platform, enabling a smooth journey from initial exploration to scalable AI implementation.

Managed services can compromise control over infrastructure choices, data sharing, and generative AI strategy. NVIDIA NIM microservices, part of NVIDIA AI Enterprise, offer the benefits of both managed generative AI and open-source deployment without those drawbacks. NIM's versatile inference runtime with microservices accelerates generative AI deployment across a wide range of models, from open-source to NVIDIA's foundation models. In addition, NVIDIA NeMo™ enables custom model development with data curation, advanced customization, and retrieval-augmented generation (RAG) for enterprise-ready solutions. Combined with Supermicro's NVIDIA AI Enterprise ready SuperClusters, NVIDIA NIM provides the fastest path to scalable, accelerated generative AI production deployments.
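To illustrate the deployment model described above: a NIM container exposes an OpenAI-compatible HTTP API, so applications can talk to a self-hosted model with a standard chat-completions request. The sketch below only builds and serializes such a request; the endpoint URL and model name are illustrative assumptions, not values from this announcement.

```python
import json

# Assumed values for illustration: a NIM container commonly serves an
# OpenAI-compatible API on port 8000; the model name is hypothetical.
NIM_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-compatible chat-completion payload for a NIM endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("meta/llama3-8b-instruct",
                             "Summarize the benefits of liquid cooling.")
body = json.dumps(payload)  # the JSON body that would be POSTed to NIM_URL
# With a running NIM container, this could be sent via, e.g.:
#   requests.post(NIM_URL, json=payload)
```

Because the wire format matches the OpenAI API, existing client code can usually be pointed at a NIM endpoint by changing only the base URL and model name.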

Supermicro's current generative AI SuperCluster offerings include:

  • Liquid-cooled Supermicro NVIDIA HGX H100/H200 SuperCluster with 256 H100/H200 GPUs as a scalable unit of compute in 5 racks (including 1 dedicated networking rack)
  • Air-cooled Supermicro NVIDIA HGX H100/H200 SuperCluster with 256 HGX H100/H200 GPUs as a scalable unit of compute in 9 racks (including 1 dedicated networking rack)
  • Supermicro NVIDIA MGX GH200 SuperCluster with 256 GH200 Grace Hopper Superchips as a scalable unit of compute in 9 racks (including 1 dedicated networking rack)
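The density advantage of liquid cooling in the configurations above can be sketched with simple arithmetic: each scalable unit holds 256 GPUs, and one rack per cluster is dedicated to networking, so compute density per rack follows directly.

```python
# Rack-density sketch from the SuperCluster configurations above.
# Assumes the stated rack counts, with one rack reserved for networking.
def gpus_per_compute_rack(total_gpus: int, total_racks: int,
                          network_racks: int = 1) -> float:
    """GPUs per compute rack, excluding dedicated networking racks."""
    return total_gpus / (total_racks - network_racks)

print(gpus_per_compute_rack(256, 5))  # liquid-cooled HGX: 64.0 GPUs per compute rack
print(gpus_per_compute_rack(256, 9))  # air-cooled HGX:    32.0 GPUs per compute rack
```

By this accounting, the liquid-cooled configuration packs twice as many GPUs into each compute rack as the air-cooled one.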

Supermicro SuperClusters are NVIDIA AI Enterprise ready, with NVIDIA NIM microservices and the NVIDIA NeMo platform for end-to-end generative AI customization. They are optimized for NVIDIA Quantum-2 InfiniBand as well as the new NVIDIA Spectrum-X Ethernet platform, with 400Gb/s of networking speed per GPU for scaling out to large clusters with tens of thousands of GPUs.
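The figures above imply a simple aggregate: at 400Gb/s of network bandwidth per GPU, a 256-GPU scalable unit injects over 100 Tb/s into the fabric. The sketch below multiplies these out; it is back-of-the-envelope only, since real fabric throughput also depends on topology and switch radix.

```python
# Back-of-the-envelope aggregate bandwidth per the figures stated above.
GBPS_PER_GPU = 400    # Gb/s of network bandwidth per GPU (Quantum-2 / Spectrum-X)
GPUS_PER_UNIT = 256   # GPUs in one scalable unit of compute

def aggregate_tbps(num_units: int) -> float:
    """Total GPU injection bandwidth, in Tb/s, across num_units scalable units."""
    return num_units * GPUS_PER_UNIT * GBPS_PER_GPU / 1000

print(aggregate_tbps(1))   # one scalable unit: 102.4 Tb/s
print(aggregate_tbps(10))  # ten units (2,560 GPUs): 1024.0 Tb/s
```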

Supermicro's upcoming SuperCluster offerings include:

  • Supermicro NVIDIA HGX B200 SuperCluster, liquid-cooled
  • Supermicro NVIDIA HGX B100/B200 SuperCluster, air-cooled
  • Supermicro NVIDIA GB200 NVL72 or NVL36 SuperCluster, liquid-cooled

Supermicro's SuperCluster solutions are optimized for LLM training, deep learning, and high-volume, high-batch-size inference. Supermicro's L11 and L12 validation testing and on-site deployment service provide customers with a seamless experience. Customers receive plug-and-play scalable units for easy deployment in a data center and faster time to results.

About Super Micro Computer, Inc.

Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first-to-market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are a Total IT Solutions manufacturer with server, AI, storage, IoT, switch systems, software, and support services. Supermicro's motherboard, power, and chassis design expertise further enable our development and production, enabling next-generation innovation from cloud to edge for our global customers. Our products are designed and manufactured in-house (in the US, Asia, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions® allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free air cooling or liquid cooling).

Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.

All other brands, names, and trademarks are the property of their respective owners.

SMCI-F

 


