Intel has been making big strides with its integrated graphics technology in recent years, to the point that gaming-focused mini PCs and handhelds with Intel HD graphics aren’t a complete joke.
But you still get better graphics performance with a discrete GPU, which is why Intel is partnering with AMD to launch its first Intel CPU + AMD GPU processors this year. And the company hired AMD’s chief GPU architect Raja Koduri last year to help develop new graphics technology in-house.
Now Intel is showing off a prototype for an upcoming discrete GPU.
Update: It turns out Intel presented a paper describing a new power management technique for existing GPU technology rather than a new discrete graphics solution, contrary to earlier reports. According to an Intel spokesperson:
Last week at ISSCC, Intel Labs presented a research paper exploring new circuit techniques optimized for power management. The team used an existing Intel integrated GPU architecture (Gen 9 GPU) as a proof of concept for these circuit techniques. This is a test vehicle only, not a future product. While we intend to compete in graphics products in the future, this research paper is unrelated. Our goal with this research is to explore possible, future circuit techniques that may improve the power and performance of Intel products.
The original article continues below.
According to PC Watch, Intel announced the new graphics processor at the ISSCC event in San Francisco.
It’s still very much a work in progress, and it’s unclear when it’ll be ready to ship, or whether Intel will ever ship this particular GPU at all. It could be more of a demonstration project, and at this point it seems to be designed to bring better graphics performance to low-end PCs, which means it’s not going to be competitive with the latest high-end graphics cards from AMD or NVIDIA.
But here’s what we know so far:
- There’s a 14nm chip that includes the GPU, a system agent with control logic and input/output features, and an FPGA bridge to connect the GPU to the PC.
- The chip has 1.5 billion transistors.
- It has a frequency range of 50 MHz at 0.51 volts to 400 MHz at 1.2 volts.
- One of the key things Intel seems to be focusing on is efficiency, bringing the same kind of fine-grained power controls to GPUs that the company already offers for CPUs.
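Those voltage and frequency figures hint at why fine-grained power control matters. Under the textbook CMOS approximation that dynamic power scales with frequency times voltage squared (P ∝ f·V²), the chip's low-power operating point should use only a small fraction of the power of its top speed. Here's a rough back-of-the-envelope sketch using the reported operating points (the P ∝ f·V² model is a standard approximation, not something Intel disclosed):

```python
# Rough DVFS estimate: dynamic CMOS power scales roughly as P ∝ f * V^2.
# Operating points taken from the ISSCC figures reported above.
def relative_dynamic_power(freq_mhz, volts):
    """Dynamic power in arbitrary units (f * V^2)."""
    return freq_mhz * volts ** 2

low = relative_dynamic_power(50, 0.51)    # 50 MHz at 0.51 V
high = relative_dynamic_power(400, 1.2)   # 400 MHz at 1.2 V

# The top operating point burns roughly 44x the dynamic power of the
# low-power one while running only 8x faster, which is why fine-grained
# voltage/frequency scaling pays off at idle and light loads.
print(f"high/low power ratio: {high / low:.1f}")
```

In other words, dropping from the top operating point to the bottom one cuts estimated dynamic power by a factor of about 44 while only cutting clock speed by a factor of 8.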
You can find more details and a bunch of presentation slides at PC Watch (Google Translate version if your Japanese is rusty).
Maybe a straight GPU compute card? If the efficiency is high, the clock speed shouldn’t matter that much. If there are heavy optimizations in regards to power consumption and instructions per cycle, this could work for GPU compute like OpenCL or others.
For instance, cryptomining normally uses memory capacity and memory speed more than the GPU processor itself.
I wonder if this is with one eye toward a second-generation ‘discrete’ video in package. It seemed odd that they licensed the graphics from AMD, so maybe if it takes off as a solution they want a home-made version.
1.5 Billion transistors?
So it’s somewhere around the RX 540 – GT 1030 level.
I’d be interested to see how it performs in synthetic tests, as well as day-to-day tasks like gaming (Rocket League -> WatchDogs 2 -> Ashes). Who knows, they could provide legitimate competition to the Green Team… or maybe even destroy the low-end graphics card market (<$150) by including it in their laptop and desktop SKUs (to be more competitive with Zen).
My impression of discrete graphics is that you get the performance of a low end graphics card in the same form factor as integrated graphics. It’s a packaging benefit for a customer who wants better performance than integrated graphics.
Are they trying to upsell corporate customers from integrated graphics?
I am a fan of big and little boxes. No thank you.
That’s right.. Intel pulled the Larrabee GPU project for years, and for years Intel’s massive marketing machine predicted the doom of NVIDIA and AMD in GPUs.. It was all fake, but AMD and NVIDIA stock prices went down significantly for a while..
Seems like Intel is going to pull another failed Larrabee on gullible investors that only read old charts and fake news from Intel..
Intel will need years and a completely new company to pull off a high-end GPU of AMD and NVIDIA’s caliber..
Not a chance for Intel.. AMD may become the new Intel within 2 years.. AMD has high-end x86-64 synergistic CPUs, GPUs, and APUs.. while Intel has no high-end GPUs and no real HSA APU architecture.. EMIB is just a packaging technology, and Intel will need years to come up with a native HSA APU like AMD has..
This announcement appears to have made no difference to the stock price of either AMD or nVidia, so investors in the two mainstream GPU producers do not appear to be worried by the possible competition from Intel.
US markets were closed today.
Indeed so, and now that the US markets have opened today, February 20th, 2018, the prices of both AMD and nVidia stock have jumped UP. The price of Intel stock is also up.
Perhaps if the prototype ever becomes a production item, then AMD and nVidia may have something to worry about, but I doubt it.
The Intel + AMD combined chip seems to be the significantly better offering for power efficiency and performance.
Nice follow-up. 🙂 I have no idea whether it makes sense to watch market prices to gauge this kind of thing. I just wanted to point out that yesterday was a bad day to try it. 🙂
lol ofc we know this will never ship; it’s a test chip, not a product. At the very least, the fact that they have two different generations of their GPU on the same chip should tell you that. It’s also very small with very low clocks, a lot lower than their current integrated GPUs. The focus of the presentation is on power-saving techniques, but it tells us nothing at all about anything else.
I think it’s been decades since Intel last released a discrete GPU. I’d totally consider trying out their discrete GPUs though. It’s always interesting to see what third players have to offer compared to the top two, such as with VIA x86 CPUs compared to Intel and AMD.