PC makers are starting to ship laptops with NVIDIA’s new GeForce MX250 GPU. It’s an entry-level GPU from the company, which means it’s probably not going to satisfy gamers who want advanced features like ray tracing, but it should be a step up from the integrated graphics available in most notebooks with 15 watt Intel or AMD processors.

For the most part, the NVIDIA MX250 is pretty much the same as the MX150 graphics that NVIDIA started offering in 2017… but the clock speed is a bit higher, which should lead to better performance.

But there’s one thing that hasn’t changed — the folks at NotebookCheck report that once again there are actually two different versions of NVIDIA’s new entry-level mobile graphics solution:

  • 25 watt NVIDIA GeForce MX250 1D52
  • 10 watt NVIDIA GeForce MX250 1D13

Unsurprisingly, the 25 watt version offers better performance… at least 30 percent better, according to NotebookCheck. Unfortunately, there’s no easy way for shoppers to know which version of the graphics chip they’re getting, because PC makers just advertise that their systems feature MX250 graphics without indicating which version.
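Taking NotebookCheck’s numbers at face value, the efficiency trade-off is easy to sketch. This is just a rough illustration: the 1.3x figure is the lower bound of the reported performance gap, and real-world scaling will vary by workload.

```python
# Rough performance-per-watt comparison of the two MX250 variants.
# Assumes the 10 W chip as a 1.0 performance baseline and the 25 W
# chip at ~1.3x, per NotebookCheck's "at least 30 percent" figure.

variants = {
    "MX250 1D13 (10 W)": (1.0, 10),
    "MX250 1D52 (25 W)": (1.3, 25),
}

for name, (perf, watts) in variants.items():
    print(f"{name}: {perf / watts:.3f} performance units per watt")
```

By this back-of-the-envelope math, the 25 watt chip delivers roughly half the performance per watt of the 10 watt chip, which is why the low-power version may actually be the better fit for thin-and-light machines.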

There’s certainly a case to be made that the 10 watt version is a better solution for some customers. The lower power consumption means you’ll probably get better battery life and PC makers may be able to include the GPU in thinner and lighter computers. But it’d be nice if you knew which version you were going to get.

So if you’re in the market for a new laptop with NVIDIA GeForce MX250 graphics this year, you might want to wait until independent reviewers get their hands on a device and let you know what you’re actually getting before pulling out your wallet.


5 replies on “NVIDIA quietly launched two different MX250 GPUs (10 watts and 25 watts)”

  1. Man, 30% more performance for 150% more power sounds pretty terrible – especially in a mobile device.

    Unless, of course, they’re playing a little fast-and-loose with those TDP numbers…

  2. Or, even better, let your wallet tell Nvidia to stop this nonsense by buying AMD. Competition is good, and flat-out tricks and lies like this are not.

    1. I agree, but AMD is behind Nvidia in graphics.

      They can’t quite match Nvidia’s money when it comes to buying the latest lithography (from Intel, Samsung, TSMC, GlobalFoundries, etc.), which greatly affects availability and price. They can’t match Nvidia’s R&D when it comes to microarchitecture, which greatly affects battery life and power draw. And they can’t match Nvidia’s developer segment when it comes to making optimisations for games, which greatly affects performance.

      The best they can do is three-generations-old GCN* (HD 7970/R9 290/RX 580) in the form of a “baby Vega” iGPU/APU in their Zen 1 systems.

      So until AMD does a proper upgrade of the underlying microarchitecture (ahem, better rasterisation), then mates it to a modern lithography (TSMC 7nm or better), and throws in some optimisations… you can’t expect much improvement. Hence, it will always be 1-2 generations of performance or efficiency (or both!) behind Nvidia.

      And even then, you can’t expect them to be successful against Nvidia head-on. There’s almost no way.
      They can only beat Nvidia by displacing Intel, i.e., by offering a CPU that’s almost as good as Intel’s and then including their own graphics solution in place of an Nvidia GPU.

      So it’s no easy task, and mindshare is a difficult beast altogether, but at least we saw some decent promise with 16nm Zen 1/Ryzen 1000-Mobile/“Ryzen fake-2000” and their iGPUs making headway against 14nm++ Intel CPUs paired with Nvidia GT 1030 GPUs. Hopefully AMD stays ambitious!!!

      *yes, I know they are calling it the 6th generation with the Vega VII, but realistically, this is still running the same microarchitecture as the RX470/RX580, just with a different memory configuration, more cores, some improvements, and a modern 7nm lithography. It’s not an actual generational jump.

      1. While I more or less agree, this isn’t about AMD vs Nvidia. It’s about there being only 2 REAL options currently for “higher end” graphics, and a company that has repeatedly used its position as “leader” (hard air quotes on that) to basically tell consumers “take what we give you or go kick rocks”. And again, REPEATEDLY: this is at least the 3rd time I can think of that they’ve done just this (same named and advertised graphics with wildly varying performance); they were sued the other 2 times and IIRC lost both, so I really don’t understand wtf they are doing. Actions like this, how they handled the entire Pascal line, the crypto duplicity they were involved with, the Nvidia NDA partner program thing that was pure illegal crap, and frankly their entire treatment of the gamers who got them to where they are today are plenty of reason to pick ANYONE but Nvidia, regardless of whether they are still “the best”.

        1. I know, and I couldn’t agree more.

          However, the world we live in is one of “money talks, losers walk”. So nothing will get better (it may even get worse) in terms of Nvidia and the graphics market… not until they have decent competition, be that from AMD, Intel, Qualcomm, Apple, etc.

          Think back to how much a 2017 8-core CPU cost, and now look at 2019. The price has decreased around fourfold. The same can and will happen in the graphics sector once we get good competition instead of lacking it (e.g. compare 2010 vs 2018 graphics market share).

Comments are closed.