What just happened? Intel's next-gen Xeon CPUs, codenamed Granite Rapids, could come with a massive cache upgrade compared to their predecessors. While the high-end Emerald Rapids SKUs ship with up to 320MB of L3 cache, the leading Granite Rapids chips are said to offer a whopping 480MB of L3 cache.

The news comes from tipster @InstLatX64 (via Tom's Hardware), who spotted a recent entry in the SDE 9.33.0 (Software Development Emulator) update, where Intel lists up to 480MB of L3 cache for its 6th-gen Granite Rapids Xeon CPUs. If the listing is accurate, it would mean a 50 percent increase over Emerald Rapids' 320MB and should help Intel better compete with AMD in the server and high-end workstation space.

It is worth noting that AMD currently offers its Genoa-X 3D V-Cache processors with up to 1,152 MB L3 cache, so it remains to be seen if Intel has anything up its sleeve to compete against products with such high cache capacities. While the company has publicly stated that it plans to offer CPUs with 3D-stacked cache in the future, there's no telling when these chips will be ready for prime time.

Launched in December 2023, Emerald Rapids already brought a massive upgrade in the amount of L3 cache compared to the Sapphire Rapids family that went official earlier in the year. While the flagship Sapphire Rapids chip only had 105MB of L3 cache, Emerald Rapids brought a 3x increase, with the top-end chips in the lineup shipping with as much as 320MB of the good stuff.

Expected to launch in the second quarter of this year, Granite Rapids CPUs will feature only Performance cores and will be based on the Birch Stream platform. They will offer higher core counts and clock speeds than their predecessors, and will add FP16 support to expand the precision options available to AI developers.

Intel also claims that the Granite Rapids Xeon chips will deliver a 2.9x boost in AI inferencing (DeepMD+LAMMPS), a 2.8x boost in memory bandwidth, and up to 3x better performance in AI workloads compared to the Sapphire Rapids Xeon CPUs.