GPU Solutions
Supermicro offers a wide variety of high performance GPU server solutions with massive parallel processing power and networking flexibility. With support for NVIDIA's Tesla V100 GPUs, these solutions provide the parallelism needed for today's high performance server applications such as AI. Supermicro GPU solutions are available in 1U, 2U, 4U, and Tower form factors, with optimization for HPC workloads, Computational Finance, and Oil and Gas Simulation. NVIDIA NVLink greatly improves the efficiency of parallel processing tasks by removing the bandwidth bottleneck associated with PCIe 3.0, which translates to improved performance in applications such as AI. Supermicro GPU servers can support up to 6TB of DDR4-2933MHz memory across 24 DIMM slots, with support for Intel's 2nd Gen Xeon Scalable CPUs and Optane DC Persistent Memory.
Key Features/Applications:
AI/ML, Deep Learning Training and Inference, Big Data Analytics, High Performance Computing (HPC), Research Laboratory/National Laboratory
CPU:
Dual Socket P (LGA 3647) supporting 2nd Gen Intel® Xeon® Scalable processors
Chassis:
4U / 1 Node
Drive:
24x Hot-swap 3.5" drive bays
RAM:
24x DDR4 DIMM Slots
Network Ports:
2x 10GBase-T LAN ports
Key Features/Applications:
AI/ML, Deep Learning Training and Inference, High Performance Computing (HPC), Big Data Analytics, Astrophysics Simulation, Chemistry Simulation, Research Laboratory/National Laboratory
CPU:
Dual Socket P (LGA 3647) supporting 2nd Gen Intel® Xeon® Scalable processors (Cascade Lake/Skylake)
Chassis:
4U Rackmountable
Drive:
Up to 24 Hot-swap 2.5" drive bays; 8x 2.5" SATA drives supported with included H/W, 2x 2.5" NVMe drives supported with included H/W, 1 NVMe based M.2 SSD
RAM:
24 DIMMs; up to 6TB 3DS ECC DDR4-2933MHz RDIMM/LRDIMM, Supports Intel® Optane™ DCPMM
Network Ports:
2x 10GBase-T LAN ports via Intel C622
Key Features/Applications:
AI/ML, Deep Learning Training and Inference, High Performance Computing (HPC), Big Data Analytics, Astrophysics Simulation, Chemistry Simulation, Research Laboratory/National Laboratory
CPU:
Dual Socket P (LGA 3647) supporting 2nd Gen Intel® Xeon® Scalable processors (Cascade Lake/Skylake)
Chassis:
4U Rackmountable
Drive:
Up to 24 Hot-swap 2.5" drive bays 8x 2.5" SATA drives supported with included H/W, 2x 2.5" NVMe drives supported with included H/W
RAM:
24 DIMMs; up to 6TB 3DS ECC DDR4-2933MHz RDIMM/LRDIMM, Supports Intel® Optane™ DCPMM
Network Ports:
2x 10GBase-T LAN ports via Intel C622
Key Features/Applications:
Artificial Intelligence (AI), HPC, AI/Deep Learning, Machine Learning Development
CPU:
Dual Socket AMD EPYC™ 9004 Series Processors
Chassis:
4U Rackmount Liquid Cooling
Drive:
Default: 8 bays total; 8x front hot-swap 2.5" NVMe drive bays
RAM:
24 DIMM slots; up to 6TB 4800MT/s ECC DDR5
Network Ports:
1x RJ45 1GbE dedicated IPMI LAN port
Key Features/Applications:
AI / Deep Learning, High Performance Computing
CPU:
Dual Socket SP5 AMD EPYC™ 9004 Series Processors
Chassis:
4U Rackmount
Drive:
Default: 6 bays total; 2x front hot-swap 2.5" SATA drive bays; 4x front hot-swap 2.5" NVMe drive bays
RAM:
24 DIMM slots; up to 6TB (1DPC) 4800MT/s ECC DDR5 RDIMM/LRDIMM
Network Ports:
1x RJ45 1GbE dedicated IPMI LAN port; 2x RJ45 10GBase-T LAN ports