en.Wedoany.com Reported - Samsung Electronics plans to deliver samples of CMM-D memory modules based on the CXL 3.1 standard to major server and data center customers in the third quarter of 2026, with mass production potentially starting as early as the fourth quarter. According to South Korean media outlet THE ELEC, citing industry sources, Samsung internally plans to finish manufacturing CXL 3.1 CMM-D samples after June this year and supply them to customers. If the samples pass customer quality verification, Samsung will determine the production scale and begin mass production in the fourth quarter.
CXL (Compute Express Link) is an open high-speed interconnect standard built upon the PCIe physical layer, supporting cache-coherent data communication between CPUs and accelerators such as GPUs, FPGAs, and memory expansion modules. It is regarded as a core enabling technology for next-generation disaggregated data center architectures. Samsung's upcoming CMM-D 3.1 module integrates multiple DRAM chips with a dedicated CXL controller on a single PCB, offering a maximum capacity of 1TB and bandwidth of 72GB per second, utilizing a PCIe 6.0 interface. Compared to the previous generation product based on the CXL 2.0 specification—which featured a maximum capacity of 256GB, bandwidth of 36GB per second, and a PCIe 5.0 interface—the new generation module doubles the transfer rate per channel, allowing the system to achieve higher data throughput from CXL-attached memory while significantly reducing latency.
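The generational gains cited above can be summarized with a quick calculation using only the figures from the article (the dictionary layout and variable names here are illustrative; real-world throughput also depends on link width, encoding overhead, and controller efficiency):

```python
# Generation-over-generation comparison of Samsung CMM-D modules,
# using only the capacity and bandwidth figures cited in the article.
CMM_D_SPECS = {
    "CMM-D 2.0": {"interface": "PCIe 5.0 / CXL 2.0", "capacity_gb": 256,  "bandwidth_gbps": 36},
    "CMM-D 3.1": {"interface": "PCIe 6.0 / CXL 3.1", "capacity_gb": 1024, "bandwidth_gbps": 72},
}

old, new = CMM_D_SPECS["CMM-D 2.0"], CMM_D_SPECS["CMM-D 3.1"]
capacity_gain = new["capacity_gb"] / old["capacity_gb"]      # 1 TB vs 256 GB
bandwidth_gain = new["bandwidth_gbps"] / old["bandwidth_gbps"]  # 72 vs 36 GB/s

print(f"Capacity: {capacity_gain:.0f}x, Bandwidth: {bandwidth_gain:.0f}x")
# → Capacity: 4x, Bandwidth: 2x
```

In other words, the move to PCIe 6.0 doubles per-channel bandwidth while the new module quadruples per-module capacity.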
Memory pooling is the core value of CXL technology in data center applications. Through CXL switch chips, multiple processors can share a pool of additional memory resources, dynamically allocated based on actual workload demands, eliminating the need to equip each processor with large amounts of redundant memory. This approach significantly improves memory resource utilization and is particularly suited for workload scenarios with urgent demands for high-bandwidth, low-latency memory expansion, such as AI training and inference, and high-performance computing. Industry insiders point out that CXL memory and HBM are not substitutes but complementary—the latter offers higher bandwidth and faster operating speeds, while the former focuses on flexible memory capacity expansion, with both working synergistically within server architectures.
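The pooling idea described above can be sketched as a toy allocator: hosts borrow memory from one shared pool and return it when done, instead of each host being overprovisioned up front. This is a simplified model with hypothetical names; in a real deployment the CXL switch fabric and its firmware manage allocation, not application code:

```python
# Toy model of CXL-style memory pooling: multiple hosts dynamically
# borrow capacity from one shared pool (hypothetical class, not a real API).
class MemoryPool:
    def __init__(self, total_gb: int):
        self.total_gb = total_gb
        self.allocated: dict[str, int] = {}  # host -> GB currently borrowed

    def available_gb(self) -> int:
        return self.total_gb - sum(self.allocated.values())

    def allocate(self, host: str, gb: int) -> bool:
        # Grant the request only if the shared pool can still cover it.
        if gb > self.available_gb():
            return False
        self.allocated[host] = self.allocated.get(host, 0) + gb
        return True

    def release(self, host: str) -> None:
        # Return this host's memory to the pool for others to use.
        self.allocated.pop(host, None)

pool = MemoryPool(total_gb=1024)  # e.g. one 1TB CMM-D module behind a switch
pool.allocate("cpu-0", 512)
pool.allocate("cpu-1", 256)
print(pool.available_gb())  # → 256
```

The design point is that unused capacity stays in the pool and can be granted to whichever processor needs it next, which is what raises overall utilization compared with fixed per-socket DIMM provisioning.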
Samsung has accumulated over three years of experience in the CXL memory field. In May 2023, Samsung completed the development of CMM-D 2.0 samples and supplied them to more than 40 global customers, including major cloud service providers like Microsoft, Google, and Amazon, AI companies with self-built data centers such as Meta, and server enterprises like Dell and Supermicro. CMM-D 3.0 samples have also been supplied to these customers. Samsung Electronics Vice President Kevin Yoon publicly stated as early as September 2025 that the CXL 3.1 and PCIe 6.0 CMM-D solution was planned for launch in 2026. The announcement of this mass production timeline indicates that the roadmap is progressing as planned.
However, the commercialization of CXL memory has not been entirely smooth. With demand for general-purpose DRAM and HBM remaining strong, Samsung has deprioritized CXL in its capacity allocation, pushing back CMM-D 3.1, originally slated for launch by the end of 2025, by about half a year. Meanwhile, the memory bottleneck in the data center market is growing more acute: the exponential growth of AI model parameter counts puts continuous pressure on memory capacity and bandwidth, while traditional DIMM slots offer limited room for physical expansion. CXL memory modules connect to servers much like solid-state drives, expanding memory capacity without significant changes to the server architecture, and thus offer a new path around this bottleneck. The approaching mass production milestone will test Samsung's ability to execute on commercializing CXL technology.
This article is compiled by Wedoany. All AI citations must indicate the source as "Wedoany". If there is any infringement or other issues, please notify us promptly, and we will modify or delete it accordingly. Email: news@wedoany.com