AMD says its next-gen processors for notebooks and low-power desktops will use up to 40 percent less power than their predecessors, while offering a modest performance boost (and significant performance gains in some areas).

While AMD isn’t providing hard numbers about power consumption or speed yet, the company says its first Carrizo chips should be available in the first half of 2015.


The new processors are based on the same 28nm manufacturing process as the Kaveri chips they’ll replace. But AMD says the new chips offer more performance-per-watt and use Heterogeneous System Architecture (HSA) to put the integrated graphics processor to work on some general computing tasks. Among other things, AMD says this allows Carrizo to transcode videos up to 3.5 times more quickly than Kaveri processors.

Carrizo processors include new “Excavator” CPU cores and a new generation of Radeon GPU cores. They feature dedicated H.265 video decoding, and AMD says the new chips offer “double digit” percentage increases in performance and battery life.

Part of that is due to new techniques for optimizing voltage, integration of the Southbridge system controller, and a new architecture that AMD says lets it fit 29 percent more transistors into the same die area.

That’s significant because AMD is still working with 28nm chips at a time when rival Intel is shipping 14nm Broadwell processors and already looking ahead to 10nm and smaller. Shrinking the process node is one of the ways chip makers typically improve efficiency, but it’s clearly not the only way to do it.
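To put those numbers in rough perspective, here’s a back-of-the-envelope sketch. It uses idealized scaling only: real transistor density depends on design rules and cell libraries, not just the feature size, so treat the figures as illustrative rather than as actual chip data.

```python
def ideal_density_ratio(old_nm: float, new_nm: float) -> float:
    """Idealized transistor-density gain from a process shrink, assuming
    the area per transistor scales with the square of the feature size."""
    return (old_nm / new_nm) ** 2

# An idealized 28nm -> 14nm shrink would quadruple transistor density:
shrink_gain = ideal_density_ratio(28, 14)   # 4.0x

# AMD's claimed gain from design changes alone, staying on 28nm:
same_node_gain = 1.29                       # 29 percent more transistors

print(shrink_gain, same_node_gain)
```

By this simplified yardstick, a 29 percent same-node gain is a real but modest step compared to what a full node shrink would offer in theory.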

via PC World, Ars Technica, and AnandTech


20 replies on “AMD Carrizo chips to offer better performance, longer battery life”

  1. Sounds great if some of these are fanless and have native graphics drivers for Linux.

  2. AMD needs to leap forward to graphene production. From what I’ve read it’s possible, and it would produce clean, fast, low-power devices; silicon is environmentally unfriendly. AMD should leap ahead; Intel is old school.

    1. AMD cannot, because it no longer owns any fabs (they were sold to Mubadala and became Global Foundries). AMD is left at the mercy of the progress of those external fabs (Global Foundries and TSMC). This is why it is still on the 28nm node (and still stuck on 32nm SOI for its Piledriver FX chips).

  3. Can someone like Samsung buy them and put this design on 14 (or even 20) nm?

    1. I’d be interested in that for sure. Samsung is said to be working on its own GPU for its mobile chipsets. If not buying AMD, then working with AMD on creating a new mobile GPU to go against what NVIDIA has been doing with the GPUs in Tegra.

      1. True mobile GPUs (for smartphones and phablets) are quite different. AMD’s Radeon GCN is not suitable for ultra low power mobile devices. That is why you do not see any AMD chips used in smartphones. That is also why Intel is using Imagination’s PowerVR (also used by Apple) inside their smartphone chips (like Clovertrail and Moorefield), instead of using their own GPU technology. Unfortunately AMD sold the true mobile GPU technology to Qualcomm which eventually became Adreno inside the Snapdragon.

        1. I know, that’s why I said it would be interesting if they worked together on something for true mobile, as you put it. And like you said about Qualcomm, it might also be possible that they won’t work on tablet and smartphone chips because they previously sold off Adreno. But I do think mobile chipsets, in the tablet, smartphone, and laptop space, are going to get some big bumps in performance soon with what NVIDIA can do packing 256 Maxwell cores into a chip like the X1. Imagine a chip made for a set-top box or laptop. They could do so much more while still being extremely power efficient compared to a desktop computer.

      1. Nvidia’s mobile Tegra chips use ARM cores, which are much simpler, very small, and super efficient (usually more efficient than x86 cores). Tegra’s integrated GPU occupies a smaller die area than the integrated GPU inside AMD’s APUs (and its performance is actually lower than most AMD APUs’). Thus they can use 28nm and have smaller die sizes at the same time.

      2. They’ve been trying to get to 16nm for a while. Everyone is. Smaller feature width = higher transistor count in the same die area.

        1. Tell that to AMD, which has successfully increased transistor density on an aging 28nm process.

          1. Transistor density didn’t change. It was there all along and was used mostly for GPUs. AMD simply switched from low-density, CPU-centric transistors to high-density, GPU-style transistors for its CPU cores.

    1. From what I’ve read, the gains from 20nm aren’t that great compared to 28nm. Going to FinFET is actually a big leap, but other than Intel and Samsung there isn’t anybody else that can provide FinFET chips currently. From what I gather, when first trying out a new fab process things don’t go so smoothly, so it’s possible that some of these improvements would not have been possible on a first-gen 20nm chip, meaning that sticking with 28nm for this generation provides certain advantages. Now this is ONLY speculation. If next time around AMD only moves to 20nm, or doesn’t move at all, then it will be a serious problem. This time around… we’ll have to wait and see. Plus, the inclusion of H.265 hardware decoding is huge in my opinion, since it can cut down on power consumption and storage requirements, and on both when streaming (as long as the hardware solution provides good compatibility). We’ll just have to wait and see how these perform in the real world to see if they are still a good choice for the price/design wins.

      1. The need to optimize the manufacturing process vs. the chip architecture, and the risks inherent in trying to do both at once, are why Intel optimizes only one of these technologies at a time, in what’s called a “tick-tock” strategy.

        The problem I have with AMD is that, while it might be cheaper than Intel, AMD’s performance is so much lower that it almost always pays to just put up the extra bucks and get the Intel product. It may be that Intel is just putting the pedal to the metal in product development, because AMD has been so far behind for quite some time. And Intel is even willing to lose billions to meet AMD price points. I don’t want AMD to disappear, but it’s hard to justify not choosing Intel.

        1. Erm, maybe Carrizo will sway your choice, since its performance should be decent, especially in mobile, now that AMD has gotten its power consumption in check. But generally on the desktop it is way behind, although it has good GPU performance, which may be useful in slim desktops; for the vast majority of users who opt for a full-size desktop, going Intel makes more sense.

  4. I’ve had decent performance and experiences with AMD products (for the price). But their driver packages are not good from a user interface point of view – just terrible UI and layout. As well, that Gaming Evolved crapware they push is borderline malware. Even their installer downloads web content (game ads), and it is most likely subject to some type of injection attack by hackers.

    AMD, not all your customers are gamers… it just seriously undermines trust in your brand in a business setting. Stop shooting yourself in the foot with these stupid moves that SHOULD be easy to fix.

    1. Then you must truly abhor Intel CPUs, with drivers updated once or twice a year.
