2024/10/15
Oct 14th, 2024 - PEGATRON, a globally recognized Electronics Manufacturing Service (EMS+) provider, offers customers extensive flexibility in configuring their server solutions. This includes support for ORv3 rack solutions, compliance with the DC-MHS specifications (supporting the DC-SCM module and OCP 3.0 NIC), 19-inch and 21-inch rack options, liquid or air cooling, and a range of other custom options. The company is pleased to showcase its latest advancements in AI solutions at the OCP Global Summit 2024. Join us at Booth #A14, where PEGATRON aims to be your premier partner for a seamless transition to next-generation infrastructure.
Highlights of PEGATRON at OCP Global Summit 2024:
1. 2U 2-Node, AMD EPYC™ 9005 2-Socket solutions (MS303-4A1)
The MS303-4A1 is an OCP ORv3-compliant liquid-cooled server with an ORv3 busbar clip for 48 VDC power delivery. Each node is equipped with 2x E1.S ports, 8x NVMe SSDs (E3.S), 2x PCIe 5.0 LP slots, and 4x OCP 3.0 slots. With its flexible expansion slots and SSD options, the MS303-4A1 is well suited for deployment as a compute server for a variety of cloud computing applications.
2. 2U 2-Node, Intel® Xeon® 6 1-Socket solutions (MS301-2T1 and MS302-2T1)
The MS301-2T1 and MS302-2T1 models are versatile solutions designed to support a wide range of applications, including high-performance computing, gaming, and storage. Their multi-functional design ensures flexible deployment across different environments, adapting seamlessly from 19” to 21” racks. These systems, compliant with the DC-MHS specifications (supporting the DC-SCM module and OCP 3.0 NIC), are built to meet the evolving needs of modern data centers.
3. Large Language Model (LLM) Training and Inference with PEGATRON's Rack-scale AI Server solutions (RA4401-72N1)
The RA4401-72N1, built on the NVIDIA GB200 NVL72 system architecture, is an advanced server system utilizing NVIDIA Blackwell GPUs. Up to 72 Blackwell GPUs and 36 NVIDIA Grace CPUs can be deployed in the RA4401-72N1 rack and fully connected via NVIDIA NVLink™ technology. It offers a high-performance, liquid-cooled, rack-scale solution tailored for demanding LLM training and inference workloads. This AI server also includes the NVIDIA BlueField-3 DPU, providing cloud network acceleration, high-speed data access, robust security measures, and flexible GPU compute options for large-scale AI environments. Furthermore, the system incorporates Axiado's modular BMC solution, enabling enhanced security and modular management of the compute nodes for streamlined, secure operation.
4. 4U 8x GPU server solutions (AS400-2T1)
This innovative 4U GPU server, built on the NVIDIA MGX™ 4U reference architecture, is engineered to revolutionize how businesses manage complex workloads. Equipped with dual AMD EPYC™ 9005 processors and 8x NVIDIA H200 NVL GPUs, the AS400-2T1 delivers exceptional AI and graphics performance, making it ideal for demanding applications such as AI training, LLM training/inference, and high-performance computing.
Additionally, the AS400-2T1 is designed to be future-ready, offering compatibility with upcoming GPUs for long-term flexibility and adaptability.
5. 2U 4x GPU server solutions (AS205-2T1)
The AS205-2T1 is equipped with dual Intel® Xeon® 6 processors and is based on the NVIDIA MGX 2U reference architecture. It supports up to 4 NVIDIA H100 NVL or L40S GPUs. The AS205-2T1 is designed to meet the rigorous needs of AI, graphics, image processing, and digital twin applications.
6. 2U AI Inference/Training Server (AS207-2N1)
The AS207-2N1 is a powerful 2U server optimized for AI inferencing or training, based on the NVIDIA MGX 2U reference architecture. It supports up to 2 NVIDIA Blackwell GPUs and 2 NVIDIA Grace CPUs for superior performance in complex AI workloads.
Equipped with NVIDIA NVLink™ for high-speed GPU communication and a dual-port NVIDIA ConnectX®-7 NIC, it enhances network efficiency and performance for data-intensive tasks. The server includes 1 PCIe Gen5 x16 slot for NVIDIA BlueField-3 DPUs, ensuring scalability and data acceleration. With 4 U.2 NVMe SSD slots, the AS207-2N1 also provides fast data access, making it an ideal choice for organizations focused on large-scale AI applications.