When Intel unveiled its Compute Card platform in early 2017, the company painted a picture of a modular future where you could buy a smart TV, internet-connected home appliances, or even computer docks and upgrade their processor, RAM, and storage in the future by simply swapping out a credit card-sized device.
That future never really arrived.
Few Compute Card-compatible devices were ever developed, and now Intel has confirmed to the folks at Tom’s Hardware that it currently has no plans to develop any new Compute Cards.
Intel does plan to continue selling and supporting current-gen cards through the end of the year.
So if you really want to see what this modular platform has to offer, you could pick up an Intel Compute Card Dock from CDW for $110, and then pick up a card or two. Here are some of the best prices I could find for existing models:
- Celeron N3450/4GB/64GB eMMC for $159 – B&H
- Pentium N4200/4GB/64GB eMMC for $168 – Provantage
- Core m3-7Y30/4GB/128GB SSD for $325 – CDW
- Core i5-7Y57/8GB/128GB SSD for $517 – CDW
But at this point what Intel is selling is a dead-end platform. You could probably pick up a laptop, tablet, or mini desktop PC with similar specs for the same price or less. And since Intel doesn’t plan to offer any next-gen cards with anything newer than an Intel Apollo Lake or 7th-gen “Kaby Lake” Core processor, the only real reasons I can see for investing in this technology now are curiosity or maybe the ability to set up a multi-user workstation.
You could theoretically buy a single docking station and plug it into a monitor, mouse and keyboard, and then pick up a few different cards for everyone in your household, school or office. Insert the card and each user gets their own custom environment. Pop out the card and all the data, apps, and settings go with you on that pocket-sized card.
Intel’s vision for a modular computing future certainly had some appeal. But I can’t help but wonder if it was held back by a few factors.
First, there’s little reason for device makers to adopt the platform. Whether you’re making a PC or a smart TV, it’s probably cheaper and easier to use existing, time-tested technologies. And you probably don’t really want users to upgrade the hardware in a few years by popping out one card and replacing it with another. You’d make more money by selling that same person a completely new device… even if they don’t need a new display, keyboard, or other components.
Second, Intel’s platform wasn’t exactly an open standard. Maybe if third-party companies were able to produce their own compute cards that would be compatible with any display, dock, or other device, the platform would have gained a little more traction.
Intel was hardly the first company to try to build a modular computing platform. Google’s Project Ara was supposed to bring us totally modular smartphones… but Google canceled Ara before it launched. The Fairphone 2 has a modular design that makes it easy to repair, if not necessarily to upgrade. Motorola offers modular add-ons that bring additional functionality to its Moto Z smartphones… an idea that’s been copied by Essential and RED, for better or worse.
After running a successful crowdfunding campaign, BLOCKS is apparently delivering modular smartwatches to (some) backers. And there are all sorts of niche products like the GameShell modular handheld game console, the KITE DIY modular phone, and the long-in-development EOMA68 project, an initiative to build an open-source, modular, upgradable computing platform.
But at this point it looks like modular computing is still a niche category that’s unlikely to go mainstream anytime soon.
These cards had less than 64GB of storage when they first came out. That’s why I didn’t buy one. 32GB or less just ain’t enough for Windows, and eMMC storage is problematic with Linux.
Remember when Intel was making a “push” into the mobile computing space and practically giving away the Atom Z35XX’s for pennies to Taiwanese manufacturers?
I think the more practical option for having your own setup is to have a portable SSD for each user that plugs into a base which already has everything else. Your OS and software will be available to you, while the RAM and CPU will be located in the base. A 1TB SSD nowadays costs less than $200.
The only issue with that is that the industry evolved via a different route: processing instead of memory.
There’s just too much “red tape” in the form of processors and software that prevents an idea like this from getting off the ground. Perhaps if we had full-sized desktop PCs where the consumer could hot-swap the CPU, based on an ARM Cortex-A, this may have been feasible.
But the preferred processor is x86, and the preferred software platform is Windows. Phones have long struggled to break the mould of “expensive toys”.
Could you ever have bought this at retail at all?
It would have been nice to have a standard I/O and power header interface kind of like the Raspberry Pi. That way people could 3D print cases and add batteries and screens… would have been nice to have seen a DIY cell phone built on one of these.
I really like the NUC form factor. I would not want to give up any of its strengths for more portability. I’m not surprised that the compute card form factor is gone.
I remember when these were new and the emphasis was on providing ability to upgrade the specs of “smart” devices like TVs. It was even mentioned that distribution would likely go through the device manufacturers. I don’t recall seeing these offered to consumers before, and I was not expecting that to be the case. The cards here and the dock seem expensive for what they are, unsurprisingly, so it’s a tough sell for consumers. And they were likely too expensive to make sense in embedded applications where they would be up against inexpensive ARM and MIPS chips, as well as external TV boxes.
And I never did understand how the thermal management worked. The cards are quite compact, even compared to compute sticks. An Anandtech article mentioned the docks would have to assist in dissipating heat. It also mentioned, as expected, the cards themselves would get pretty hot after use, which would prevent quick and easy swapping of cards. It’s a neat concept, but not big enough for a company like Intel. Seems like an idea that’s a better fit for a single-board computer seller.
Intel was also too late to the game, once again. Smart TVs running Android were already a thing, and some could say the fad even had time to die on its own before Intel came out with this. The module design is technologically innovative, but commercially it’s of no interest: they completely missed the boat AND failed to showcase it for other uses.