- Supermicro (NASDAQ: SMCI) unveils new Data Center Building Block Solutions (DCBBS) for rapid AI data center deployment.
- DCBBS delivers up to 40% power savings, a 60% smaller footprint, and 40% lower water usage, cutting TCO by 20%.
- The solution supports up to 2,048 NVIDIA Blackwell GPUs, with customization options at the system, rack, and data center levels.
Super Micro Computer, Inc. (NASDAQ: SMCI), a leading provider of IT solutions, has launched its Data Center Building Block Solutions (DCBBS), a comprehensive package designed for building liquid-cooled AI data centers that can be deployed in as little as three months. It bundles the necessary components and services, including servers, storage, networking, racks, liquid cooling, software, and support.
Central to the DCBBS offering is the AI Factory package, which supports up to 2,048 NVIDIA Blackwell GPUs and networking speeds of up to 800Gb/s. The package is aimed at the growing demands of AI infrastructure, enabling organizations to deploy robust AI training and inference environments.
DCBBS also emphasizes energy efficiency and sustainability through Supermicro's DLC-2 direct liquid-cooling technology, delivering up to 40% power savings, a 60% smaller data center footprint, and 40% lower water consumption, which together yield a 20% reduction in total cost of ownership (TCO) compared with traditional air-cooled systems.
Customization is central to DCBBS, with options at the system, rack, and data center levels. Clients can tailor system-level components such as CPUs, GPUs, and networking interfaces, while rack-level configurations include a range of enclosure sizes. Supermicro also provides project management and support services to ensure smooth deployment and operation.
With liquid-cooled data centers expected to grow from less than 1% to 30% of the market, DCBBS positions Supermicro to capture meaningful share of this rapidly expanding segment. The modular approach simplifies AI data center buildouts, addressing industry pain points such as long deployment cycles and high operating costs.