X210c M6 Memory Guide
Memory Organization
Memory is organized with eight memory channels per CPU, with up to two DIMMs per channel, as
shown in Figure 1.
Figure 1 X210c M6 Memory Organization (each of CPU 1 and CPU 2 has eight memory channels, A through H, and each channel has two DIMM slots: slot 1 and slot 2)
Table 1 Available DIMMs and PMem

Product ID (PID)    PID Description                                Voltage   Ranks/DIMM

3200-MHz DIMMs
UCSX-MR-X16G1RW     16 GB RDIMM SRx4 3200 (8Gb)                    1.2 V     1
UCSX-MR-X32G1RW     32 GB RDIMM SRx4 3200 (16Gb)                   1.2 V     1
UCSX-MR-X32G2RW     32 GB RDIMM DRx4 3200 (8Gb)                    1.2 V     2
UCSX-MR-X64G2RW     64 GB RDIMM DRx4 3200 (16Gb)                   1.2 V     2
UCSX-ML-128G4RW     128 GB LRDIMM QRx4 3200 (16Gb) (non-3DS)       1.2 V     4
UCSX-ML-256G8RW¹    256 GB LRDIMM 8Rx4 3200 (16Gb) (3DS)           1.2 V     8

Intel® Optane™ Persistent Memory (PMem)
UCSX-MP-128GS-B0    Intel® Optane™ DC Persistent Memory, 128 GB, 3200 MHz
UCSX-MP-256GS-B0    Intel® Optane™ DC Persistent Memory, 256 GB, 3200 MHz
UCSX-MP-512GS-B0    Intel® Optane™ DC Persistent Memory, 512 GB, 3200 MHz

Intel® Optane™ Persistent Memory (PMem) Operational Modes
UCSX-DCPMM-AD       App Direct Mode
UCSX-DCPMM-MM       Memory Mode

Memory Mirroring Option
N01-MMIRROR         Memory mirroring option

Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
DIMM Guidelines
■ System speed depends on the DIMM speed supported by the CPU. Refer to Table 1 for DIMM speeds.
■ The servers support the following memory reliability, availability, and serviceability (RAS)
BIOS options (only one option can be chosen):
— Adaptive Double Device Data Correction (ADDDC) (default)
— Maximum performance
— Full mirroring
— Partial mirroring
■ DIMM Count Rules:
— Allowed DIMM count for 1-CPU:
• Minimum DIMM count = 1; Maximum DIMM count = 16
• 1, 2, 4, 6, 8, 12, or 16 DIMMs allowed
• 3, 5, 7, 9, 10, 11, 13, 14, 15 DIMMs not allowed.
— Allowed DIMM count for 2-CPUs
• Minimum DIMM count = 2; Maximum DIMM count = 32
• 2, 4, 8, 12, 16, 24, or 32 DIMMs allowed
• 6, 10, 14, 18, 20, 22, 26, 28, 30 DIMMs not allowed.
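These count rules are mechanical, so they lend themselves to a quick configuration check. The following minimal Python sketch (illustrative only; the names are not from any Cisco tool) validates a proposed DIMM count against the allowed values above:

```python
# Sketch: validate a proposed DIMM count against the DIMM Count Rules above.
# The allowed-count sets are taken directly from this guide; the function
# and variable names are illustrative, not part of any Cisco tool.

ALLOWED_DIMM_COUNTS = {
    1: {1, 2, 4, 6, 8, 12, 16},      # 1-CPU configurations
    2: {2, 4, 8, 12, 16, 24, 32},    # 2-CPU configurations
}

def dimm_count_allowed(cpu_count: int, dimm_count: int) -> bool:
    """Return True if the total DIMM count is a supported population."""
    return dimm_count in ALLOWED_DIMM_COUNTS.get(cpu_count, set())

assert dimm_count_allowed(1, 8)        # 8 DIMMs on one CPU: allowed
assert not dimm_count_allowed(2, 6)    # 6 DIMMs across two CPUs: not allowed
```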
■ Mixing Rules:
— Mixing different types of DIMM (RDIMM with any type of LRDIMM or 3DS LRDIMM with
non-3DS LRDIMM) is not supported within a server.
— Mixing RDIMMs of different types is allowed if they are mixed in equal quantities, in a balanced configuration.
— Mixing 16 GB, 32 GB, and 64 GB RDIMMs is supported.
— 128 GB and 256 GB LRDIMMs¹ cannot be mixed with RDIMMs
— 128 GB non-3DS LRDIMMs cannot be mixed with 256 GB 3DS LRDIMMs¹
— Single-rank DIMMs can be mixed with dual-rank DIMMs in the same channel
— Allowed mixing must be in matched quantities (for example, 8x32 GB and 8x64 GB). Combinations such as 10x32 GB and 6x64 GB are not allowed.
— RDIMMs of different sizes can be mixed within a channel. When mixing RDIMMs of
different densities (sizes), populate DIMMs with the highest density first. For
example, if you have to mix 32 GB RDIMMs with 16 GB RDIMMs, then populate the 32
GB DIMMs in blue slots (or slot 1) and then 16 GB DIMMs in black slots (or slot 2).
— Do not mix DIMM types (size, speed, ranks) in a system that uses PMem. In these
cases, all DIMMs must be the same type and size.
— RDIMMs of different ranks can be mixed within a channel. When mixing RDIMMs with different ranks, populate the RDIMMs with the higher rank first. For example, when mixing dual-rank RDIMMs with single-rank RDIMMs, populate the dual-rank RDIMMs in blue slots (slot 1) first and then the single-rank RDIMMs in black slots (slot 2). A sketch of these mixing checks follows below.
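The mixing rules above can likewise be sanity-checked in code. This minimal Python sketch (illustrative; `config` and `mix_allowed` are hypothetical names, not part of any Cisco tool) encodes the equal-quantity rule for RDIMM mixes and the prohibition on mixing LRDIMMs with RDIMMs:

```python
# Sketch: check the mixing rules above for one CPU.
# `config` maps DIMM capacity in GB to quantity, e.g. {32: 8, 64: 8}.

RDIMM_SIZES = {16, 32, 64}     # RDIMM capacities from Table 1
LRDIMM_SIZES = {128, 256}      # LRDIMM capacities from Table 1

def mix_allowed(config: dict[int, int]) -> bool:
    sizes = set(config)
    # LRDIMMs cannot be mixed with RDIMMs or with other LRDIMM sizes.
    if sizes & LRDIMM_SIZES:
        return len(sizes) == 1
    # RDIMM mixes must use equal quantities of each size (balanced pairs).
    return len(set(config.values())) == 1

assert mix_allowed({32: 8, 64: 8})         # 8x32 GB + 8x64 GB: allowed
assert not mix_allowed({32: 10, 64: 6})    # 10x32 GB + 6x64 GB: not allowed
assert not mix_allowed({128: 8, 64: 8})    # LRDIMM + RDIMM: not allowed
```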
Table 2 DIMM Rules

DIMM Parameter              DIMMs in the Same Channel              DIMMs in the Same Slot¹

DIMM Capacity               DIMMs in the same channel (for         For best performance, DIMMs in the
  RDIMM = 16, 32, 64 GB     example, A1 and A2) can have           same slot (for example, A1, B1, C1,
  LRDIMM = 128, 256 GB²     different capacities.                  D1, E1, F1) should have the same
                                                                   capacity.

DIMM Speed                  DIMMs will run at the lowest           DIMMs will run at the lowest speed
  3200-MHz                  speed of the CPU installed             of the CPU installed

DIMM Type                   Do not mix LRDIMMs with RDIMMs         Do not mix LRDIMMs with RDIMMs in
  RDIMMs or LRDIMMs         in a channel                           a slot

Notes:
1. Although different DIMM capacities can exist in the same slot, this will result in less than optimal performance. For optimal performance, all DIMMs in the same slot should be identical.
2. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
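As the DIMM Speed row in Table 2 notes, the memory clock settles to the lowest speed in the system. A one-line sketch (illustrative names only, not from any Cisco tool) captures the relationship:

```python
# Sketch: the effective memory clock is the lesser of the CPU's supported
# DIMM speed and the slowest installed DIMM. Illustrative only; real
# speeds come from the CPU and DIMM specifications.

def effective_speed_mhz(cpu_max_mhz: int, dimm_speeds_mhz: list[int]) -> int:
    return min(cpu_max_mhz, *dimm_speeds_mhz)

# A 3200-MHz DIMM behind a CPU limited to 2933 MHz runs at 2933 MHz.
print(effective_speed_mhz(2933, [3200, 3200]))  # -> 2933
```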
■ Population Rules
— Each channel has two memory slots (for example, channel A = slots A1 and A2).
• A channel can operate with one or two DIMMs installed.
• If a channel has only one DIMM, populate slot 1 first (the blue slot).
— When both CPUs are installed, populate the memory slots of each CPU identically.
Fill the blue slots (slot 1) in the memory channels first according to the
recommended DIMM populations in Table 3. The table gives the DIMM populations
for both mirrored and non-mirrored configurations.
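The population order implied by these rules (blue slot 1 of each channel first, then black slot 2) can be expressed as a small helper. This Python sketch is illustrative only; the exact recommended order for each DIMM count is given in Table 3:

```python
# Sketch: generate the slot fill order implied by the population rules:
# slot 1 (blue) of channels A-H first, then slot 2 (black). The
# channel-by-channel order within a bank is illustrative; consult Table 3
# for the exact recommended population per DIMM count.

CHANNELS = "ABCDEFGH"

def fill_order() -> list[str]:
    return [f"{ch}{slot}" for slot in (1, 2) for ch in CHANNELS]

print(fill_order()[:10])  # ['A1', 'B1', ..., 'H1', 'A2', 'B2']
```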
■ Memory Limitations
— The maximum combined memory allowed in the 16 DIMM slots controlled by any one CPU is 6 TB (for 8 x 512 GB PMem and 8 x 256 GB DIMMs¹).
— The maximum combined memory allowed in the 32 DIMM slots controlled by two CPUs is 12 TB (for 16 x 512 GB PMem and 16 x 256 GB DIMMs¹).
■ For best performance, observe the following:
— For optimum performance, populate at least one DIMM per memory channel per
CPU. When one DIMM is used, it must be populated in DIMM slot 1 (blue slot farthest
away from the CPU) of a given channel.
— For populations of 1 DIMM per channel (1 DPC) and 2 DIMMs per channel (2 DPC), all supported DIMMs on Cisco UCS M6 servers run at their labeled speed, provided the processor supports that speed.
— When populating DIMM slots for optimal performance, multiples of 16 DIMMs are best because there are eight memory channels per CPU socket and both CPUs must be populated.
— At the same memory speed, 2 DPC may perform slightly better than 1 DPC for
RDIMMs (workload dependent).
— For optimum performance, prefer dual-rank RDIMMs, then single-rank RDIMMs, and lastly LRDIMMs. Large LRDIMMs enable high-capacity memory configurations, but their performance is lower than that of standard RDIMMs.
— For small to medium memory capacities, whenever possible, install dual-rank RDIMMs for optimal performance. Dual-rank RDIMMs perform better than single-rank RDIMMs. Single-rank DIMMs limit the performance of memory-intensive workloads in 1 DPC configurations.
— 256 GB LRDIMMs¹ should be used for the largest memory capacity requirements. These DIMMs provide the maximum memory size supported for 2-socket UCS M6 servers.
■ DIMMs for both CPUs must always be configured identically
■ All DIMMs must be DDR4 DIMMs that support ECC. Non-buffered UDIMMs and non-ECC DIMMs
are not supported.
Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
■ Cisco memory from previous generation servers (DDR3 and DDR4) is not supported with the
Cisco UCS X210c M6 Compute Node.
NOTE: System performance is optimized when the DIMM type and quantity are equal
for both CPUs, and when all channels are filled equally across the CPUs in the server.
Table 4 shows the Cisco-supported all-DIMM configurations. These configurations are a subset of the
Intel-supported configurations.
Table 4 3rd Gen Intel® Xeon® Scalable Processors (Ice Lake) All DIMM Physical Configuration
DIMM + PMem Count    Slots Populated (CPU 1 or CPU 2; channels A-H, slots 1 and 2)
1 + 0                1 DIMM in slot 1 of one channel
12 + 0               12 DIMMs across the 16 slots
16 + 0               16 DIMMs (slot 1 and slot 2 of every channel)
PMem Guidelines
■ All installed PMem must be the same size. Mixing PMem of different capacities is not supported.
■ When PMem are installed, all DIMMs installed must be identical (same speed, size and ranks).
■ PMem and DIMMs must be populated as shown in Table 5.
NOTE: In Table 5, all DIMMs must be identical to each other and all PMem must also
be identical to each other. The table shows the Cisco-supported configurations (it is
a subset of the Intel-supported configurations).
Table 5 3rd Gen Intel® Xeon® Scalable Processors (Ice Lake) DIMM and PMem¹ Physical Configuration

DIMM + PMem Count    Slots Populated (CPU 1 or CPU 2; channels A-H, slots 1 and 2)
8 + 4 (AD, MM)       8 DIMMs and 4 PMem across the 16 slots
8 + 8 (AD, MM)       1 DIMM in slot 1 and 1 PMem in slot 2 of every channel

Notes:
1. All systems must be fully populated with two CPUs when using PMem at this time.
AD = App Direct mode supported; MM = Memory Mode supported.
■ Two CPUs must be installed when using PMem. For each memory channel with both a PMem and a DIMM installed, the PMem is installed in channel slot 2 (the black slot closest to the CPU) and the DIMM is installed in channel slot 1 (the blue slot farthest from the CPU).
■ To maximize performance, balance all memory channels
■ For best memory performance, use identical DIMM and PMem types within a server (same speed, size
and ranks).
■ In configurations with PMem installed, memory mirroring is supported, with two restrictions:
— Mirroring is only enabled on the DIMMs installed in the server; the PMem themselves do not support mirroring.
— Only App Direct mode is supported. Memory mirroring cannot be enabled when PMem are in Memory Mode.
Memory Modes
The Ice Lake CPUs support two memory modes: App Direct Mode and Memory Mode.

App Direct Mode

PMem operates as persistent memory and data is retained across power cycles. Both the PMem and DIMM capacities count towards the CPU capacity limit.

For example, if App Direct mode is configured and the DIMM sockets for a CPU are populated with 8 x 256 GB DRAMs¹ (2 TB total DRAM) and 8 x 512 GB PMem (4 TB total PMem), then 6 TB total counts towards the CPU capacity limit.
Memory Mode
PMem operates as a 100% memory module. Data is volatile and DRAM acts as a cache for PMem. Only the
PMem capacity counts towards the CPU capacity limit. This is the factory default mode.
For example, if Memory mode is configured and the DIMM sockets for a CPU are populated with 8 x 256 GB
DRAMs¹ (2 TB total DRAM) and 8 x 512 GB PMem (4 TB total PMem), then only 4 TB total (the PMem memory)
counts towards the CPU capacity limit. All of the DRAM capacity (2 TB) is used as cache and does not factor
into CPU capacity. The recommended Intel DRAM:PMem ratio for Memory Mode is from 1:2 to 1:16.
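The capacity accounting for the two modes reduces to simple arithmetic. This illustrative Python sketch (the function name is hypothetical, not from any Cisco tool) mirrors the two worked examples above:

```python
# Sketch: how installed DRAM and PMem count towards the CPU capacity limit
# in each mode, per the descriptions above.

def capacity_towards_limit_gb(dram_gb: int, pmem_gb: int, mode: str) -> int:
    if mode == "app-direct":
        return dram_gb + pmem_gb   # both capacities count
    if mode == "memory":
        return pmem_gb             # DRAM is cache only; PMem alone counts
    raise ValueError("mode must be 'app-direct' or 'memory'")

dram, pmem = 8 * 256, 8 * 512      # 2 TB DRAM + 4 TB PMem per CPU
print(capacity_towards_limit_gb(dram, pmem, "app-direct"))  # 6144 GB = 6 TB
print(capacity_towards_limit_gb(dram, pmem, "memory"))      # 4096 GB = 4 TB
```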
Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.

Only the following mixed DRAM/PMem memory configurations are supported per CPU socket:
■ The available DRAM capacities are 16 GB, 32 GB, 64 GB, 128 GB, or 256 GB¹.
■ The available PMem capacities are 128 GB, 256 GB, or 512 GB.
Physical Layout
Each CPU has eight memory channels.
Each memory channel has two slots: slot 1 and slot 2. The blue-colored slots are for slot 1 and the black
slots for slot 2.
As an example, slots A1, B1, C1, D1, E1, F1, G1, and H1 belong to slot 1, while A2, B2, C2, D2, E2, F2, G2
and H2 belong to slot 2.
Figure 2 shows how slots and channels are physically laid out on the motherboard. Each CPU has channels A, B, C, D, E, F, G, and H. The slot 1 (blue) slots are always located farther away from a CPU than the corresponding slot 2 (black) slots. Slot 1 slots (blue) are populated before slot 2 slots (black).
Figure 2 Physical Layout of Cisco UCS X210c M6 Compute Node CPU Memory Channels and Slots
(The figure shows the physical positions of the DIMM slots for CPU 1 and CPU 2 on the motherboard, with the front of the compute node marked.)
Table 6 Recommended Memory Configurations for 3rd Gen Intel® Xeon® Scalable Processors (Ice Lake) For Best Performance

Total System   CPU-1 Blue Slots   CPU-1 Black Slots   CPU-2 Blue Slots   CPU-2 Black Slots   DIMM   Total DIMMs
Memory Size    Bank 1 (A1-H1)     Bank 2 (A2-H2)      Bank 1 (A1-H1)     Bank 2 (A2-H2)      Type   in the system

256 GB         8x16 GB            -                   8x16 GB            -                   R      16

Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
NOTE:
■ Table 6 lists only the best recommended memory configurations, based on memory performance data.
■ These memory configurations yield the best performance because memory is populated equally for both CPUs across all eight memory channels.
■ The recommendations of Table 6 are based on memory performance measurements done for a C240 M6 configured with two 3rd Generation Intel Xeon Scalable 8380 processors.
■ 32 GB dual-rank and 64 GB dual-rank RDIMMs provide the highest memory bandwidth at 1 DPC and 2 DPC.
■ Among all mixing configurations, the 8x32 GB + 8x64 GB mix per CPU (1536 GB total system capacity for 2 sockets) provides the highest memory bandwidth.
■ 128 GB LRDIMMs, with up to 4096 GB total system capacity for 2 sockets, and 256 GB¹ LRDIMMs, with up to 8192 GB total system capacity for 2 sockets, provide the largest memory capacities. Note: these LRDIMMs cannot be mixed with any RDIMMs.
Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
The following RDIMM mixes are supported:
■ 16 GB and 32 GB RDIMMs
■ 16 GB and 64 GB RDIMMs
■ 32 GB and 64 GB RDIMMs
Table 7 Supported Memory Configurations for 3rd Gen Intel® Xeon® Scalable Processors (Ice Lake)

Total System   CPU-1 Blue Slots   CPU-1 Black Slots   CPU-2 Blue Slots   CPU-2 Black Slots   DIMM   Total DIMMs
Memory Size    Bank 1 (A1-H1)     Bank 2 (A2-H2)      Bank 1 (A1-H1)     Bank 2 (A2-H2)      Type   in the system

16 GB RDIMMs
32 GB          1x16 GB            -                   1x16 GB            -                   R      2
64 GB          2x16 GB            -                   2x16 GB            -                   R      4

32 GB RDIMMs
64 GB          1x32 GB            -                   1x32 GB            -                   R      2

64 GB RDIMMs

128 GB LRDIMMs

256 GB LRDIMMs¹
6144 GB¹       6x256 GB¹          6x256 GB¹           6x256 GB¹          6x256 GB¹           LR     24
8192 GB¹       8x256 GB¹          8x256 GB¹           8x256 GB¹          8x256 GB¹           LR     32

Notes:
1. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
NOTE: A 1-CPU configuration, using the same per-CPU DIMM mix shown for the 2-CPU configurations in Table 7, is possible but is not recommended for performance reasons.
NOTE: The 8:1 DRAM:PMem ratio is valid for App Direct mode only.

Selection of PMem also requires that all CPUs be fully populated. The rules for mixed DIMM and PMem configurations are as follows:
■ Only the numbers of DIMMs and PMem shown in Table 9 are allowed per CPU.
■ All PMem must be equal in size
■ All DIMMs must be equal in size and type
■ For the App Direct Mode, both DCPMM and DIMM capacities count towards the CPU capacity limit
■ For Memory Mode, only the PMem capacity counts towards the CPU capacity limit. DIMMs are used for cache only and do not count towards the CPU capacity limit.
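The Memory Mode ratio guidance from the Memory Modes section (a recommended DRAM:PMem ratio between 1:2 and 1:16) can also be checked mechanically. This Python sketch is illustrative only; Tables 10 through 13 remain the authority on which configurations are supported:

```python
# Sketch: check the Intel-recommended DRAM:PMem ratio for Memory Mode
# (from 1:2 to 1:16, as stated in the Memory Modes section above).

def memory_mode_ratio_ok(dram_gb: int, pmem_gb: int) -> bool:
    ratio = pmem_gb / dram_gb      # PMem capacity per unit of DRAM cache
    return 2 <= ratio <= 16

print(memory_mode_ratio_ok(8 * 256, 8 * 512))   # 1:2  -> True
print(memory_mode_ratio_ok(8 * 16, 8 * 512))    # 1:32 -> False
```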
Table 10 through Table 13 show all the possible combinations of DRAMs and PMem in each of the four supported DRAM/PMem 2-CPU mixed configurations:

2-CPU Configuration    DRAM    PMem
Table 10               8       8
Table 11               16      8
Table 12               16      2
Table 13               16      16

Notes:
1. Red cells represent the unsupported configurations and ratios for Memory Mode.
2. Review the X210c M6 Compute Node Spec Sheet for additional 256 GB DIMM usage conditions.
Procedure
Step 1 Open both DIMM connector latches.
Step 2 Press evenly on both ends of the DIMM until it clicks into place in its slot.
Note: Ensure that the notch in the DIMM aligns with the slot. If the notch is misaligned, it is
possible to damage the DIMM, the slot, or both.
Step 3 Press the DIMM connector latches inward slightly to seat them fully.
Step 4 Populate all slots with a DIMM or DIMM blank. A slot cannot be empty.