Intel’s first processors that combine an Intel CPU with AMD Radeon graphics are on the way, but Intel hasn’t provided many details about the upcoming chips yet.

The company did go ahead and list one on its website, though. While detailed specs still aren’t available, the company has a page listing processors that are appropriate for overclocking, and one of them is the previously unannounced Intel Core i7-8809G.

Here’s what we know about that processor so far. It’s a 3.1 GHz quad-core chip with support for Hyper-Threading, which means its 4 cores can run 8 threads simultaneously.
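Hyper-Threading works by exposing each physical core to the operating system as two logical processors, so the scheduler sees twice as many places to run threads. A minimal Python sketch of that relationship (the core and thread counts below are the i7-8809G’s published figures, not values queried from hardware):

```python
import os

# Published figures for the Core i7-8809G (from the listing above)
PHYSICAL_CORES = 4
THREADS_PER_CORE = 2  # Hyper-Threading: 2 logical processors per physical core

def logical_processors(cores: int, threads_per_core: int) -> int:
    """Number of hardware threads the OS scheduler sees."""
    return cores * threads_per_core

print(logical_processors(PHYSICAL_CORES, THREADS_PER_CORE))  # → 8

# On whatever machine runs this script, os.cpu_count() reports the
# same kind of logical (not physical) count for the local CPU:
print(os.cpu_count())
```

Note that `os.cpu_count()` returns logical processors, which is why an OS task manager on a chip like this shows 8 "CPUs" rather than 4.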

The processor has 8MB of cache and supports dual-channel DDR4-2400 memory. And it features AMD Radeon RX Vega M GH graphics as well as Intel HD Graphics 630, which means it should be able to use the higher-performance Radeon graphics for gaming, video editing, or other tasks, while using the lower-power Intel HD graphics for more basic activities.

Intel says the Core i7-8809G has a 100W Target Package TDP and it’s an unlocked processor, which means it’s overclockable. It also means you’ll probably see this processor in desktop PCs rather than laptops.

via VideoCardz


18 replies on “Intel Core i7-8809G chip with AMD Radeon graphics listed on Intel website”

  1. Wonder if this is affected by the Intel CPU bug that the main OS vendors are busy fixing.
    It’s not clear how it affects the desktop experience, but on the server (especially cloud) side the performance impact can go up to 30%… imagine losing 30% of all your computers in one go!

    https://www.theregister.co.uk/2018/01/02/intel_cpu_design_flaw/
    https://forums.anandtech.com/threads/massive-security-hole-in-xeons-incoming.2532563/
    https://lonesysadmin.net/2018/01/02/intel-cpu-design-flaw-performance-degradation-security-updates/

    Brad, what about a detailed article on this… it would be interesting to get your view on it.

    1. Reading up on it now and will write something up soon. But honestly, there are folks with a deeper understanding of how operating systems interact with processors than I have, and The Reg article is probably the best write-up I’ve seen so far. We probably won’t learn the full extent of the issue, its impact, and what it means for future processors until Intel releases an official statement.

      1. I suggested an article from you because you do some pretty good analysis and summaries (judging from all the past articles). Most of the articles I have read are too techy for most people (kernel branch prediction isn’t clear to many, nor is PTI).

        1. Aww shucks. 🙂 Anyway, it’s up:

          https://liliputing.com/2018/01/pcs-servers-get-slower-thanks-security-updates-patching-major-intel-cpu-vulnerability.html

          Let me know if I got anything wrong. 🙂

          At this point I do think there are more questions than answers. And I’m pleasantly surprised to see that the Phoronix benchmarks suggest that only *some* tasks will be slower.

          But this certainly seems like a good opportunity for AMD, particularly if Intel ends up continuing to release 8th-gen chips affected by the vulnerability.

    1. A leaked roadmap showed a dedicated GPU coming in one of this year’s upcoming Intel NUCs. I would wager dollars to donuts that you are onto something there. 🙂

  2. I’m guessing BOM costs will be more than going for an equivalent 2 chip solution because Intel. I wonder what target devices this is supposed to be for. I’m not a cooling expert but if it’s as easy or easier to cool than 2 chips, then maybe smaller PCBs for more product design flexibility?

  3. Hopefully we will see more encouraging GPU benchmarks from this thing. I recall seeing a benchmark recently that suggested it performed no better than the integrated Intel UHD 620 GPU that comes with most 8th Gen laptop CPUs.

    If this did have a decent GPU, I would be really excited to see it used in some small form-factor devices. I think its best use would be a “Steambox” gaming PC (a PC the size and shape of a gaming console).

    Edit: Good news, it appears to be doing much better: https://www.notebookcheck.net/Supposed-Intel-AMD-Core-i7-8705G-benchmarks-rival-the-GTX-1050-in-performance.262790.0.html

    1. At GTX 1050 performance levels, it really is quite interesting. If they manage to put this chip into something like the ASRock Deskmini 110, they would have a *very* tempting device.

      1. Agreed. I would personally like something more flat-shaped. Like the Fractal Design Node 202 case. It would make it fit into a TV entertainment stand better.

  4. Right now we see i7-HQ and Nvidia 1060 VR-ready laptops. I would think that the 8809G chip will use a similar amount of power and be usable in laptops. Is there a major reason to believe this will not be able to be used in laptops?

    1. Yes, there is a major reason: it’s a 100W TDP chip. You know what they don’t stick in laptops? 100W chips.

      Had you read the actual article, you would have encountered the following: “It also means you’ll probably see this processor in desktop PCs rather than laptops.”

      1. Any chance that you think it could make its way into a next-gen cell phone?

      2. But they’ll stick a 45W chip next to a 55W chip for 100W of chips in a laptop. There’s no reason not to have a single 100W chip, though heat dissipation will be harder because it’s all concentrated in one package.

      3. Everything is the same… but different. The combined TDP is what matters here. There are many high-end laptops with power draws around 90W+ from 45W+ CPUs and 65W+ GPUs. The only difference now is that the CPU and GPU reside in the same package. This lowers cost and design complexity for PC manufacturers: a single package is a fair deal more compelling than a discrete GPU/CPU pair with a complex network of fans and heatpipes as well as hundreds more power and bus traces. You can bet your boots that OEMs are going to use this when both Intel’s and AMD’s clout and illustrious histories are associated with this product.

      4. I have to agree. Between the massive power draw (which would eat through a laptop battery very quickly) and the heat such a processor would produce (good luck getting that much heat off of the CPU and out of the laptop), using such a processor in a laptop would be asinine. The power supply required for such a computer would also be large and difficult to carry in a standard-sized laptop bag. There are a few laptops that use 120 watt power supplies, and they are extremely large; a laptop with this processor would require an even larger one.
