Most laptops released over the past two decades have used replaceable SODIMM sticks for memory, although a growing number of thin and light systems in recent years have featured non-replaceable LPDDR memory soldered to the motherboard.

But last year Dell introduced a new type of removable memory called CAMM that’s capable of higher capacities and higher speeds than SODIMM while taking up less space. At the time the company said it hoped CAMM would be widely adopted by other companies, and now it looks like that’s a real possibility.


As PC World reports, JEDEC, the organization that manages laptop memory standards (among other things), has voted to approve the CAMM 0.5 specification as a potential replacement for SODIMM memory, and version 1.0 of the new specification could be adopted later this year. That would pave the way for companies other than Dell to release their first laptops using CAMM memory as soon as 2024.

CAMM stands for Compression Attached Memory Module, and Dell says the modules it released last year:

  • Are 57% thinner than a SODIMM stick
  • Support higher theoretical speeds than the fastest SODIMM modules
  • Support up to 128GB of RAM on a single module

While a CAMM module is longer and wider than a typical SODIMM module, it’s thinner, which could make it a better fit for thin and light laptops. And since it supports up to 128GB of RAM, it can even be used to reduce the height of high-performance PCs like mobile workstations, which may have to use as many as four SODIMM slots to get that much memory, with two on each side of the motherboard.

Dell also says that the SODIMM format will struggle to support speeds higher than 6,400 MT/s, which shouldn’t be a problem for CAMM.
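
For a rough sense of what that ceiling means, peak bandwidth scales linearly with transfer rate: each transfer moves 8 bytes over a module’s 64-bit bus. Here’s a quick back-of-the-envelope sketch in Python (theoretical peaks, not real-world throughput):

```python
# Theoretical peak bandwidth of a 64-bit memory module:
# (mega-transfers/sec) * (8 bytes per transfer) / 1000 = GB/s.
def peak_bandwidth_gb_s(mega_transfers: int, bus_width_bits: int = 64) -> float:
    return mega_transfers * (bus_width_bits // 8) / 1000

for mts in (4800, 5600, 6400, 7200):
    print(f"DDR5-{mts}: ~{peak_bandwidth_gb_s(mts):.1f} GB/s peak per module")
```

At 6,400 MT/s that works out to about 51 GB/s per module; anything faster would need headroom that, per Dell, the SODIMM format can’t provide.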

And Dell Senior Engineer Tom Schnell, who is also a JEDEC committee member, tells PC World that CAMM modules could also be equipped with LPDDR memory, allowing PC makers to use next-gen low-power memory without sacrificing repairability and upgradeability.

Of course, it could take a while before CAMM fully replaces SODIMM. JEDEC voting to approve the new standard is an important step, but PC makers and memory module manufacturers will still need to actually produce hardware using the new standard. And I suspect that transition could take several years.


7 replies on “CAMM could replace SODIMM memory in next-gen laptops”

  1. Meh. With Apple and unified memory, it makes me wonder why laptop chips need separate RAM anyway. On laptops – as well as phones, tablets, consoles and even some mini PCs – 80% to 90% of the consumers aren’t going to upgrade the RAM. In many cases they can’t … it is soldered in.

    Get rid of the separate RAM. Get rid of the L1, L2 and L3 cache. Just create “Intel Processor” (which replaces Celeron and Pentium) with 4 GB of RAM baked into the CPU. Core i3? Comes with 8 GB of RAM. Core i5? 8 GB and 16 GB variants. AMD Athlon, Core 3 and Core 5? The same. It gets trickier with Core i7 – where anywhere from 16 GB to 64 GB makes sense – and Core i9 but it should at least be an option.

    People aren’t going to buy a Core i3 or Ryzen 3 laptop with a UHD GPU and then for some reason put 64 GB RAM in it. Or if they have some justification – they need A LOT of RAM but don’t need much in the way of performance and want to save a few bucks – then I guess that is what desktops are for.

    Intel in particular has no excuse. With the chiplet thing that they are going to do starting with Meteor Lake later this year, they are already getting TSMC to manufacture their laptop discrete GPUs as 4nm and 3nm tiles until Intel’s actual equivalent processes – not those that Intel claims to be 4nm and 3nm which allegedly go online this year – are available in 2025. Why not have TSMC (or better yet Samsung) make 4nm RAM tiles as well? It would blow AMD’s 3D cache out of the water.

    1. For one thing, I don’t think that cache is used for the same thing as random access memory; it isn’t fast just because it’s on the same chip, but also because there’s relatively little of it. Second, if you make the RAM part of the same chip, you increase the area you need on the silicon wafer, which increases both waste from around the edges of the wafer, and the area that has to have no imperfections on it, as a bad RAM section can mean the whole chip is worthless.
      It’s entirely about cost savings due to more usable yields and multiple options for sources, which is why even phones have separate RAM. And in the case of phones, some of them are literally stacking the RAM on top of the CPU to save on board space, which is still easier than making both as a single block. With laptops, there is the possibility of using chiplets as RAM, but that still constrains the CPU designer into working out the supply contract for both the CPU and RAM with a single fabricator. Also, the CPU designer would have to order various amounts of a greater number of SKUs based on speculation about what kinds of laptops and mini-pcs would sell well next year, rather than letting the laptop manufacturers speculate for them. Furthermore, I don’t know if UEFI is compatible with RAM on the CPU, but if it’s not, that’s a problem for everyone’s operating system, and one that’ll probably kill choice in operating systems on a computer built like this.

      So basically, for that to be viable, Intel and AMD would have to get into designing and selling laptops themselves. But even Apple doesn’t do that, they still have separate RAM chips, they’re just soldered to the same carrier board as the CPU. And I believe “Unified memory” is just a fancy way to say system RAM is used as VRAM, which is what happens in any system with integrated graphics.
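
      To put rough numbers on the yield point: a common first-order model (the Poisson approximation) has yield falling off exponentially with die area, so folding RAM onto the CPU die compounds quickly. A minimal sketch in Python, with an invented defect density:

      ```python
      import math

      # First-order Poisson yield model: yield = exp(-die_area * defect_density).
      # The defect density below is invented purely for illustration.
      def die_yield(area_cm2: float, defects_per_cm2: float) -> float:
          return math.exp(-area_cm2 * defects_per_cm2)

      DEFECTS = 0.2  # hypothetical defects per cm^2
      for area in (1.0, 2.0, 4.0):  # CPU alone vs. CPU plus ever more on-die RAM
          print(f"{area:.0f} cm^2 die: ~{die_yield(area, DEFECTS):.0%} usable")
      ```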

    2. That sort of makes sense. It’s not quite possible with x86 due to certain restrictions (unless you’re Intel/AMD, the platform controller) but I could certainly see it happening with ARM and RISC-V.

      In fact, I had an idea about this back in 2015 or so: basically a motherboard/chipset that scales out with size and thermal allocation. The premise goes SSD -> CPU -> RAM -> GPU -> HDMI cable, with all of these on the board and connected together. So there’s a big square GPU in the middle, with a data link heading out to the right for the video-out connection. The 270° surrounding the GPU are ringed with RAM modules, which are connected to each other and to the GPU. The next layer surrounding the RAM modules is CPU chiplets, which also connect to each other as well as to the RAM modules in front of them. Then there’s another layer surrounding the CPU chiplets of flash chips for storage, which again are connected to each other and to the CPU.

      …why?
      Because modern computing usually works in that direction, it makes sense to physically arrange the components that way. It would make the system much more efficient, and it could have a much more effective yet cheaper cooling system. And you could scale it from a micro size all the way up to something formidable. I always hoped the nomenclature would scale up appropriately as well. If we take Apple as an example, something like this below:

      Watts – SoC generation and nomenclature by year (2020, 2021, 2022, etc.)
      ~~1W: Apple M10, M11, M12, M13, M14… …
      (for 2in / watches, wearables)
      ~~3W: Apple M20, M21, M22, M23, M24… …
      (for 5in / phones, small gadgets)
      ~~5W: Apple M30, M31, M32, M33, M34… …
      (for 7in / phablets, small tablets)
      ~~7W: Apple M40, M41, M42, M43, M44… …
      (for 9in / large tablets)
      ~10W: Apple M50, M51, M52, M53, M54… …
      (for 11in / ultrabooks)
      ~15W: Apple M60, M61, M62, M63, M64… …
      (for 14in / notebooks)
      ~25W: Apple M70, M71, M72, M73, M74… …
      (for 17in / thick laptops)
      ~45W: Apple M80, M81, M82, M83, M84… …
      (for 29in / iMac, All-in-One office pc)
      ~95W: Apple M90, M91, M92, M93, M94… …
      (for Mac Pro, Desktops with strong cooling)

  2. I don’t like the idea. To me this sounds like it has the potential to make RAM upgrades more expensive, and to make it harder to find the configuration you need.

    Presumably a single CAMM module can contain an entire dual-channel array of memory (since they are comparing the size to large arrays of SODIMMs, and you would always expect dual-channel support in a laptop with 4x SODIMM slots).

    This means that when I go shopping for a RAM upgrade, I need to find a specific CAMM module with the exact configuration of the total memory that I want. As opposed to simply buying as many SODIMM modules as I need to make up the configuration I want.

    This means that manufacturers and retailers need to focus on many more different possible SKUs of RAM, rather than just a couple of individual SODIMMs that customers can mix and match.

    For example, if Corsair wants to offer their “Vengeance” line in a CAMM form-factor, to cover all of the different configurations (1×8, 2×8, 1×16, 2×16, 1×32, 2×32), they need to make 6x individual products for every single RAM speed/latency they plan to offer, as opposed to just 3x (8GB, 16GB, and 32GB) – the sketch at the end of this comment spells out the math.

    Another problem is that when I go to upgrade a new laptop’s memory, I need to pay for memory I already own all over again. If I buy a laptop with 16GB of RAM, and I want 32GB, I can’t simply buy another 16GB SODIMM. I need to pay for 32GB of additional RAM, and discard/reuse/sell the original 16GB module.
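
    The SKU math in that Corsair example works out as below; a quick sketch in Python per speed/latency bin (the product names are just shorthand for the hypothetical above, not real SKUs):

    ```python
    capacities_gb = [8, 16, 32]

    # SODIMM: one SKU per stick capacity; buyers combine sticks themselves.
    sodimm_skus = [f"{c}GB stick" for c in capacities_gb]

    # CAMM: every (channel count, stick capacity) combination needs its own module.
    camm_skus = [f"{n}x{c}GB module" for c in capacities_gb for n in (1, 2)]

    print(len(sodimm_skus), "SODIMM SKUs:", sodimm_skus)  # 3
    print(len(camm_skus), "CAMM SKUs:", camm_skus)        # 6
    ```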

    1. If laptops were still being made like they were in the early 2010s, I’d agree.
      But as it is, laptop memory is a mess, with most motherboards just soldering the stuff on, others soldering half of it on and only offering a single SODIMM slot to upgrade, others populating both slots so you can’t upgrade by just buying a second SODIMM. The same goes for most mini-pc boards, although they’re a little better about letting you upgrade. On top of that, you have to make sure your RAM is compatible with your motherboard anyway, even on desktops. If every new laptop jumped on this instead of soldering the RAM, it might actually improve things overall for upgrading RAM.

      1. I wouldn’t say laptop memory is a “mess”. I’d say it’s a healthy market, with lots of variety, flexibility, and options.
