Intel’s first few “Compute Stick” products were basically fully functional PCs packed into a tiny case with an HDMI connector that you could plug directly into a display.

The latest Compute Stick from Intel is a little different. Intel subsidiary Movidius has launched a Neural Compute Stick that you can plug into the USB port of any computer to add hardware-accelerated machine learning/vision processing/artificial intelligence to an existing computer.

The Neural Compute Stick sells for $79 and it’s actually something Movidius was working on before Intel acquired the company… but now that Movidius is part of the Intel team, it gets to wear the Compute Stick name, I guess.

The stick features a Movidius Vision Processing Unit (VPU), which allows it to perform neural network processing without relying on cloud computing, so no internet connection is required.

Intel says the VPU uses under 1 watt of power, while offering 100 gigaflops of performance.
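Taken together, those two figures imply an efficiency of at least 100 gigaflops per watt, since the 1-watt number is quoted as an upper bound. A quick back-of-the-envelope check:

```python
# Rough efficiency estimate for the Movidius VPU, using the figures
# Intel quotes: 100 gigaflops of performance at under 1 watt.
# Because 1 W is an upper bound on power draw, the result is a
# lower bound on efficiency.
perf_gflops = 100.0   # quoted peak performance
power_watts = 1.0     # quoted maximum power draw
efficiency = perf_gflops / power_watts  # GFLOPS per watt, at minimum
print(f"at least {efficiency:.0f} GFLOPS per watt")
```

For comparison's sake, that kind of performance-per-watt is what makes the stick plausible for battery-powered devices like drones and cameras, where a discrete GPU would be impractical.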

You plug the stick into a computer via a USB 3.0 port.

The whole thing measures about 72.5mm x 27mm x 14mm and the stick is fanless for silent operation. Possible applications include powering security cameras, smart home products, and drones, among other things.

via Intel and AnandTech




8 replies on “Intel’s Neural Compute Stick brings machine learning to any PC”

  1. How plug & play is this, software wise? I imagine software needs to be updated to make use of it, or does it just listen to events and processes things as you go (which sounds like a security & privacy nightmare, but it does have its uses)?

  2. “but not that Movidius is part of the Intel team, it gets to wear the Compute Stick name, I guess.”
    Is it supposed to be “now”, or is the “I guess” indicating that the Movidius team is being shunned by the Intel team, only grudgingly being given the Compute Stick name, even though it was acquired?

      1. It would be pedantic to suggest “fewer” in place of “less” in the correct situation. I think you’ll find, however, a *quite significant* difference between the words “now” and “not”.

  3. ARM has their open source MALI library for machine learning and neural compute.
    https://community.arm.com/graphics/b/blog/posts/arm-compute-library-for-computer-vision-and-machine-learning-now-publicly-available
    Many Intel products are discontinued if they don’t become popular… I would definitely go open source and pick a GPU family that will be made by many different vendors.
    The nice thing about ARM is that the SOC that the GPU is on is already very low power… no PC needed.

  4. 100 Gflops/stick is not very impressive for a dedicated ASIC, their so-called Myriad 2 VPU. It might be interesting if it had hardware optimized for true multi-level Convolutional Neural Network (CNN) processing, something that’s sorely lacking today. But I see no mention of CNN support on-chip.

Comments are closed.