It was only about a month ago that NVIDIA unveiled its new 64-bit octa-core mobile processor, Tegra X1. It didn’t take long for the supercharged chip to show up in devices.

The X1 recently scored 74,977 on AnTuTu, beating the average of the top scores recorded so far by about 25,000 points.


CNXSoft has a comparison of the Tegra X1 next to NVIDIA’s previous Tegra chip, the K1.

By comparison, the X1 zoomed past the K1’s 57,040 score with an overall 30 percent improvement. RAM operations are also about three times faster, and RAM speed is about 20 percent higher.

The Tegra X1’s 256-core Maxwell graphics pushed the 3-D graphics score to 21,093, which is only a slight improvement over the K1 device’s 3-D graphics performance. However, CNXSoft notes that the X1 test was run on a device with a higher screen resolution; if the comparison had been made on the exact same model, the gap would likely have been even wider.

The X1 chip, which supports 4K video playback, uses four ARM Cortex-A57 CPU cores and four lower-power Cortex-A53 cores, which together with the Maxwell graphics cores are more than enough to outpace all current competitors.
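For a quick sanity check, the headline improvement can be recomputed directly from the two AnTuTu totals quoted above (a minimal sketch; the two scores are the only inputs):

```python
# Recompute the overall improvement from the two AnTuTu totals quoted
# above: 74,977 for the Tegra X1 vs 57,040 for the Tegra K1.
x1_score = 74_977
k1_score = 57_040

improvement_pct = (x1_score - k1_score) / k1_score * 100
print(f"X1 over K1: {improvement_pct:.1f}% faster")
```

This works out to roughly 31 percent, consistent with the “overall 30 percent improvement” reported in the comparison.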


21 replies on “NVIDIA’s Tegra X1 crushes the competition”

  1. “It didn’t take long for the supercharged chip to show up in devices”
    What devices are you referring to?

  2. My reaction to the 75K Antutu score is “that’s all?”.

    This is *only* 30 percent faster than the 64-bit K1.

    This implies that the improved score is derived almost entirely from the 64 additional GPU cores,
    and that the additional 6 CPU cores contributed almost nothing.

    There are three possible explanations for this:
    * The Antutu test is so heavily tilted toward graphics performance that the number-crunching
    speedup just didn’t contribute enough to the total score.
    * The X1’s memory design cannot truly support 8 active cores unless the application is carefully
    designed to minimize cache misses. (And the Antutu number-crunching tests are not so designed.)
    * Current Linux/Android is just plain stupid in how it schedules eight 64-bit ARM cores.

    Seriously, for tablet purposes, would you upgrade from a Nexus 9 to get this chip, which will undoubtedly
    drain your battery a *lot* faster? I’d want double the performance to upgrade.

    1. It’s not really “additional 6 CPU cores”, because apparently, the 8 cores are used in a big.LITTLE fashion. Which means only one cluster of 4 cores is active at any given point. But yes, therefore I’d agree that the CPUs contribute very little to the increased scores. Most of it must be due to the 64 additional GPU cores. Which BTW, are all Maxwell cores (vs Kepler in Tegra K1), so even that will have a role to play.

        1. Thanks; didn’t know that. Nevertheless, considering it’s already a severely power-constrained chip, I doubt that the TX1 is ever intended to run all the cores at once. (Unless perhaps it is used in a device that is actively cooled and constantly plugged into the wall.)

        2. The core config might be similar to big.LITTLE, but isn’t NVIDIA using their own implementation rather than a carbon copy of the ARM implementation?

          “However, rather than a somewhat standard big.LITTLE configuration as one might expect, NVIDIA continues to use their own unique system. This includes a custom interconnect rather than ARM’s CCI-400, and cluster migration rather than global task scheduling which exposes all eight cores to userspace applications.”

  3. Let’s not get too excited just yet. Let’s see what Intel brings to the table with the next Cherry Trail Atom, and let’s see what devices ship with what…

    1. In terms of CPU performance, sure. But on the GPU side, I’m not expecting anything to go toe-to-toe with Maxwell. NVIDIA making smaller and smaller chips with such powerful graphics year after year could bring some pretty interesting products out in the next couple of years, products that I think, given the right support, could contend with consoles.

      1. “the pretty interesting products” will be cars only – or other applications where you can install a proper heatsink and fan.

          1. Just my own speculation. I know depending on the clock speeds of the GPUs they put in mobile, my theory could be shot. But the 750 Ti has 640 Maxwell cores and plays a bunch of games at 1080p very well. The X1 has 256 of them. I just figure with it being the same architecture, it has to be good enough to do some console titles at a lower resolution with generally little compromise besides that. That’s why I think the more NVIDIA innovates with powerful graphics in their SoCs, the closer we get to a set-top box that costs a fraction of the cost of an Xbox or PS4, with lower power consumption but all of the capabilities. Obviously the Xbox One and PS4 will have had a stronger foothold on the market, but with the performance being good enough year after year, a company could test the waters by releasing a console- or PC-level title for this platform.
            I think that kind of action could lead to more willingness to give PC versions of games a shot, and maybe, if done right, make Sony and Microsoft reconsider their model of initially selling hardware as loss leaders – and, at least for consoles as we know them currently, almost make them fully one and the same as a conventional PC.

      2. Cherry Trail will have 16 EUs compared to the 4 EUs of Bay Trail. There’s a possibility Cherry Trail will have a very good GPU.

      3. I’d like to see an Android TV console running the X1, with access to NVIDIA’s GRID system as well. Hopefully with the non-volatile memory on some kind of bus; an M.2 SSD would be fine. I don’t think you’d see people complaining about it coming with only 16GB or so if you could easily upgrade it.

  4. What about heat and power? Always NVIDIA’s Achilles’ heel. Oh yeah, and cost.

    1. Don’t forget drivers and corner cutting. I’ve got a Shield tablet; the internal storage is needlessly slow because it’s cheap, and game streaming works but won’t do 1080p 60fps over WiFi because of the slow, cheap wireless card (laptop: 150Mbps actual speed; tablet: 30Mbps on the same supposed 300Mbit link). As for drivers, they recently had a bug where video decode would produce green squares maybe 20% of the time, and the only way to deal with it was a reboot.
