Intel has been investing heavily in improving the graphics performance of its processors, and the company is starting to share details at the Consumer Electronics Show. Among other things, Intel has revealed the Intel DG1 graphics card with next-gen Intel Xe graphics.
The company has also made it clear that this isn’t actually a graphics card you’ll be able to buy and put into a desktop computer… and you probably wouldn’t want to anyway.
Instead it’s aimed at software developers. The chip maker is sending samples out to help ensure that by the time it is ready to ship processors with its next-gen graphics, third-party software is able to take advantage of the new GPU technology.
While the Intel DG1 looks a lot like the kind of desktop graphics card you’d expect from NVIDIA or AMD, it’s more of a laptop GPU crammed inside of a desktop PCIe card.
According to The Verge, the card has a 20-watt (give or take) GPU that can consume up to 40 or 50 watts. But it’s basically using the same GPU technology that will be featured in Intel’s upcoming “Tiger Lake” U-series laptop processors, which are expected to use around 15 to 25 watts for both the CPU and GPU.
When Tiger Lake chips arrive later this year, Intel says they should offer twice the graphics performance of last year’s “Ice Lake” chips… which, in turn, offer up to 2X the performance of previous-gen U-series processors.
That could make discrete graphics like NVIDIA’s MX150/250 series GPUs unnecessary in entry-level thin and light laptops… although high-end gaming laptops will probably continue to ship with AMD or NVIDIA graphics.
Intel’s new GPUs aren’t just coming to Tiger Lake chips. During a keynote presentation earlier this week, Intel showed off a laptop with a discrete Intel DG1 GPU playing Destiny 2 at 1080p resolution and 60 frames per second. It’s unclear if the company will ever offer discrete GPUs to laptop makers or if Intel just plans to use Intel Xe graphics technology for the integrated GPUs in upcoming processors.
But they won’t all be low-power, laptop-class GPUs.
Intel Xe will come in three flavors:
- Intel Xe LP – low power, laptop-class graphics
- Intel Xe HP – high power, desktop-class performance
- Intel Xe HPC – higher performance, “exascale” GPU for workstations, servers, cloud gaming, etc.
Well, it looked like a very low-to-mid-range video card and performed accordingly (it looks to be on par with the RX 550, but who knows yet). The cooling system suggests a TDP in the 45-60W range.
Who will install it? Well, it’s hard to say until pricing and at least the reference designs are out.
Price/performance is one thing (I believe Intel’s graphics department has its own marketing strategy, different from the CPU side). The assortment of ports and the availability of low-profile and fanless models is another (that may also help it compete with the $50 750 Tis from AliExpress).
Also, as far as I know (I may be mistaken), Intel IGPs boast first-class driver support on Linux. That could be another strength, compared to nVidia’s blobs and AMD’s mess.
I will not hold my breath (after all, I have a 1080), but it would be nice if Intel brought some value competition to the unassuming and boring FHD budget gaming market.
According to other reporters, the DG1 was an embarrassment.
It could only play some games below 30 fps, with stuttering and screen tearing, and on medium settings. Most have concluded that it’s like a slightly worse RX 550. Yikes!
I wouldn’t mind having an Intel UHD 630 iGPU, as it’s “free”… but I’d hate to have this, because then I would know that I’ve essentially paid money for a crap product. Heck, an APU with 7nm Vega 11 or perhaps Navi 8 graphics would be much more favourable. If I wanted a dGPU, I would’ve asked for something worthwhile like a GTX 1650 or better.
Intel keeps saying that their integrated GPUs will make entry-level discrete graphics obsolete. But they forget that AMD and nVidia also keep developing new cards for the entry level. So while it’s true that the integrated graphics in a current-gen CPU are better than a GT 620, then again, who would use an 8-year-old entry-level card in a new system?
They did. I remember there used to be a lot of Nvidia low-end laptop GPUs like the 710M, 720M, 730M, 910M, 920M, 930M, and others like that. Where are they now? Nowadays the MX150 or MX250 has become the standard for discrete graphics.
Even on the desktop, the 1030 is the most recent low-end graphics card; there’s no 1020 or 1010.
But that’s simply a naming convention. Nvidia just chose not to name its entry-level card the MX120 or MX220.
The MX suffix started at the 40 tier with the 940MX, which was succeeded by the MX150 and then the MX250. And discrete graphics like the 920M were found in budget laptops, not expensive ultrabooks. So they did discontinue them because of improvements in built-in graphics.