Intel’s been releasing processors with decent integrated graphics for years, enabling support for 4K video playback, hardware-accelerated video encoding and 3D content, among other things. But serious gamers, graphic designers, and cryptocurrency miners still tend to prefer discrete graphics solutions from NVIDIA or AMD when they’re looking for better performance.

Now Intel plans to offer its own discrete GPU… in a few years.

Last year the company hired former AMD VP Raja Koduri to head up its graphics division. Now MarketWatch reports that Intel is on track to launch its first discrete graphics solution in 2020.

That might seem like a long time from now, but MarketWatch notes that it usually takes about 3 years to develop a new GPU and bring it to market, so the timeline is actually pretty aggressive.

We’ll have to wait a few years to see how Intel’s upcoming graphics solutions stack up against whatever NVIDIA and AMD are selling by then, though.

It’s also worth noting that while increased competition in the GPU space will likely be good news for gamers, Intel probably isn’t doing this just to go after that market. GPU technology can also be used for hardware-accelerated AI and machine learning, among other things… and in case you hadn’t noticed, that’s a huge growth market in computing right now.




5 replies on “Report: Intel to launch its first discrete graphics chips in 2020”

  1. First “GPU,” yes, but not Intel’s first discrete graphics chip; the i740 from the early AGP days comes to mind, though it was hardly a GPU. Stoked to see more competition, and excited to see what Intel will come up with. Maybe a resurrection of Larrabee cores?

  2. I was scratching my head until the line “AI and machine learning.” Intel finally has decent integrated graphics, which should be good enough for most users. I really can’t see Intel spending the resources to compete with AMD or NVIDIA for the gaming market, and the cryptocurrency market may evaporate or change suddenly and make any work in that area worthless. AI and machine learning will need the additional horsepower of a discrete card, and I’m sure Intel does not want to be beholden to AMD or NVIDIA. Makes perfect sense now: they don’t have to compete for pure graphics horsepower.

  3. It’s about time. GPUs are so closely related to the market Intel is already in that I wondered for a long time why they didn’t branch out. Plus you have all those “pesky” ARM processors cutting away at Intel’s dominance in this space. They’re also going to lose Apple soon-ish.

    I hear about all these different companies – ISPs, telcos, and so many others – branching into fields completely unrelated to their core businesses… it makes me wonder, now that Intel has made the push (very hard to do for a large, conservative company: think IBM), whether they have other fields on their roadmap.

  4. I don’t see Intel’s GPUs being competitive in actual graphics. If they’re targeting AI/ML acceleration, then why not create a dedicated chip for that? It doesn’t necessarily need to be a GPU. They did purchase Nervana, but I haven’t exactly heard good things about the Nervana chip. Anyway, they can improve on it.

  5. Even if these dGPUs don’t end up being as powerful as the competition’s, if Intel sticks to open-sourcing their GPU drivers like they do with their integrated ones, then I can see Linux users opting for them. That’s also assuming they’ll be much better than their iGPUs.

    If they’re targeting ML, then I’m not sure why they’d opt for a GPU. AMD and NVIDIA are already GPU companies, so it makes sense that they’d leverage their GPU expertise in non-graphics applications.

    Anyway, we’ll see. I’m sure Intel has plenty of resources to throw at this project, but it’s still going to be challenging for them.
