Nearly every new PC announced this year with the latest Intel, AMD, or Qualcomm processors is positioned as an “AI PC,” but until recently companies have been pretty vague about what exactly that means.

Now Intel is spelling it out while trying to attract software developers to actually give users reasons to care.

Intel launched its AI PC Acceleration Program last fall with an emphasis on attracting major software developers to tap into Intel’s tools for enabling on-device AI features. Now the company is expanding the program to small and mid-sized developers by providing access to software tools and a new dev kit: an Asus NUC Pro 14 mini PC featuring an Intel Core Ultra processor with pre-installed software for tapping into the Intel AI Boost capabilities of those chips.

The chip maker has also finally spelled out the definition of an AI PC, which it says has been set by Microsoft. In order to qualify, a PC needs to:

  • Feature a CPU, GPU, and NPU
  • Come with Microsoft’s Copilot feature
  • Have a Copilot key on the keyboard (assuming the system has a keyboard)

That’s Microsoft’s definition, but theoretically there’s nothing stopping PC makers from slapping the “AI PC” name on hardware that ships with Linux or other non-Microsoft software. And if that happens, I doubt we’ll see that Copilot key.

Of course, Intel isn’t the only chip maker that’s integrated NPUs (neural processing units) into its latest processors. But some of the company’s developer tools, like the OpenVINO Toolkit, are optimized for Intel hardware, which could give the company a leg up over the competition.

Much the way GPUs are specialized to run graphics-oriented tasks that would be too energy-intensive to run on a CPU alone, an NPU can be used to bring new capabilities to PCs, or to improve energy efficiency by offloading work that could be done on a CPU to silicon that’s optimized to run those tasks more quickly and efficiently, resulting in better performance and longer battery life.

So far real-world applications for PCs with NPUs have been fairly limited. Windows PC users can access Windows Studio Effects for energy-efficient background blur, eye contact correction, and automatic framing during video calls in supported applications.

But as more developers start to leverage NPUs, we could see applications beyond the current niche uses like generating images from text-based prompts or enhanced image editing features.

These are still early days for PCs with integrated NPUs though. Microsoft says a PC with an Intel Core Ultra 7 165H processor delivers up to 34 TOPS of total AI performance, although only 11 TOPS is provided by the NPU (the rest comes from the GPU and CPU).

But the company says next-gen AI PCs will have NPUs that can offer at least 40 TOPS of AI performance on their own, which could allow some of the AI features that currently require a cloud connection to run on-device. Tom’s Hardware reports that Intel says next-gen AI PCs with 40 TOPS NPUs should be able to “run more elements of Copilot” locally without offloading processing to a Microsoft server.
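To put those figures in perspective, here’s a quick arithmetic sketch based on the numbers above (the exact split between GPU and CPU isn’t broken out in the source):

```python
# Figures reported by Microsoft for the Intel Core Ultra 7 165H (per the article).
total_tops = 34  # combined CPU + GPU + NPU throughput
npu_tops = 11    # contribution from the NPU alone

# The remainder of the AI throughput comes from the GPU and CPU together.
gpu_and_cpu_tops = total_tops - npu_tops
print(gpu_and_cpu_tops)  # 23

# Next-gen NPUs are expected to deliver at least 40 TOPS on their own --
# more than this entire first-gen platform combined.
next_gen_npu_tops = 40
print(next_gen_npu_tops > total_tops)  # True
```

In other words, a next-gen NPU alone would out-muscle the whole first-gen chip, which is what makes running more of Copilot locally plausible.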

It’s interesting to note that nobody’s promising Copilot will ever sever its reliance on cloud computing completely. That would certainly make the feature more useful (and potentially more private), but it would also give users one less reason to fork over money to Microsoft to pay for premium features in Copilot Pro.

via Intel, AnandTech, The Register, Tom’s Hardware (1)(2)


Join the Conversation



  1. What’s becoming more and more obvious is that on-device AI features are just another ploy to get buyers to buy new hardware, yet mainstream consumers have no use for this stuff and are perfectly content playing around with Bing/Copilot and ChatGPT services.

    There are really only specific needs for fast on-device hardware – if you’re wondering “do I need this?”, the answer is “nope”.

  2. How does it compare (performance and power tradeoffs) when running various AI workloads on the GPU vs the NPU? Wondering if some workloads are better run on the GPU. The GPU has been the processing unit for AI processing for years which largely involves a lot of matrix multiplications. From my understanding, these NPUs are optimized for matrix multiplication as well.

    Then there’s the thing this article talks about. I really wonder if devs will find compelling use cases for mainstream users. It feels like end user AI use cases are mimicking GPU use cases (makes sense given the core matrix multiplication task): mostly photo/video processing algorithms. There’s GenAI but that seems like it’ll need beefy servers for a long while. Unlike photo/video processing algorithms, it’s not something you take a coffee break for every prompt/question.

  3. Several thoughts on this:
    If Copilot is run locally, does information about how you are using it still get sent to Microsoft? Can the NPU be used for other, non-A.I. stuff? Is Copilot being offloaded to local PCs because Microsoft doesn’t have or doesn’t want to pay for the servers to run this for everyone? Will it be possible for Microsoft to lock NPU usage on Windows behind a pay wall?

    1. Microsoft isn’t going to want to lock NPU usage behind a paywall; they’re trying to make you cover the cost of buying and running the server hardware they’d have to use otherwise. This is why phones do auto captioning locally, why more and more parts of Google Assistant and Siri also run locally, and why there’s a requirement of 16GB of RAM to even use Copilot.
      However, I would still expect whatever you input and whatever the system outputs to be sent to them; just look at the telemetry people found in Windows Calculator for how much information that sends home.

  4. “..actually give users reasons to care.” <- That illustrates the problem in a nutshell. “a.i.” this, “npu” that, “…A.I. IS YOUR FRIEND. TRUST COPILOT.”
    a.i. is the trendy new solution in search of a problem.. All these giant corporations are trying quite hard to compel the average person to adopt it, convince them that they need it, show them that it is “the future”..while only providing the most nebulous of promises.
    They were just as adamant about those home 3D TVs that required those horrible 3D glasses to show content in 3D becoming “the future”.. In the end I’m pretty sure those initial 720p CRT TVs held out longer in the market.

  5. An M.2 2280 AI accelerator seems like a nice form factor for small and mobile. A full-size GPU card accelerator is the original form factor… again, no need for any PC architecture changes.

  6. I’m not going to buy one of Microsoft’s slimy “AI PCs”.

    OEMs have an opportunity to side with customers by eschewing MS’s newest scam.

    1. You may not have a choice about that.
      Not just because you might not be able to get a laptop without the completely and utterly pointless Copilot key, but because you might be expected to use Copilot to increase your throughput, for example, to tabulate data from a bunch of differently arranged PDFs so you can compare a bunch of similar products from different vendors, or rapidly compose and edit social media posts and modify them for all the multitude of websites you’re supposed to be posting them on for whatever stupid reason.
      It’s a bit hard for someone like me to get a handle on just how urgent the need to step up your productivity (and displace more people/make more noise) really is.