Android is ready for 64-bit processing

Apple may be the first company to introduce a smartphone with a 64-bit processor, but it may not be long before you start to see Android phones and tablets with 64-bit chips.

Samsung has already announced plans to release its first 64-bit chips next year, likely based on ARM’s upcoming Cortex-A57 architecture. Intel’s Bay Trail chips also support 64-bit processing and could soon run 64-bit Android apps.

But under the hood, Android is already capable of handling 64-bit processing.

Intel Android roadmap

As Ars Technica reports, Android is based on the Linux kernel… and Linux has supported 64-bit processors for years. The only things Android really needs for full 64-bit support are for companies to produce Android hardware with 64-bit chips, and for developers to start writing apps designed to take advantage of the technology.

One of the key benefits of the move from 32-bit to 64-bit chips is largely theoretical right now: support for exabytes of RAM, rather than mere gigabytes. But the way 64-bit chips handle memory can also lead to improved performance in some tasks.

64-bit apps also tend to be a bit larger than their 32-bit counterparts, which can be a problem if you have hundreds of apps on your phone and not much storage space. But it probably won’t make a huge difference for most users with 16GB or more of storage who typically use around 100 apps or fewer.

via reddit

  • http://opinadorcompulsivo.blogspot.com Miquel Mayol i Tur

    Dalvik is what runs Android Java apps. Until a 64-bit Dalvik exists for ARM64 or x86_64, a 64-bit kernel is not much of an advantage for Android. In fact, on Intel, Dalvik delivers about 50% of the performance that a comparable ARM SoC gets in similar benchmarks with apps that don’t use Dalvik.

    On the other hand, Ubuntu, KDE Plasma, and Tizen, even Firefox OS, will see better performance.

    • LaChuck

      Have you profiled any Android apps? This might come as a bit of a shock, but while Dalvik is incredibly inefficient compared to native code, or even Java SE Embedded, it’s still not the key factor.
      From what I’ve seen, the memory bus and graphics are the real bottlenecks. In both browsing and gaming there’s surprisingly little compute overhead, but there’s a whole lot of data to move around.
      This might change a little with LTE, but Dalvik or not, I think focusing on 64-bit hardware and software is the right call.

  • q37

    The CPU in the Ascend D quad is Huawei’s K3V2 64-bit processor, running at 1.5GHz.

    • Turlon

      That processor is not 64-bit. They were talking about the memory. Just look at the phone’s details on the official Huawei site: they have a graph showing how much faster their 64-bit MEMORY is. There’s no mention of the CPU being 64-bit.

    • Noloqoq

      It’s 4x Cortex-A9 (so it doesn’t even have the 40-bit LPAE memory addressing (1TB) of the third-generation Cortex-A7, A12, and A15), but it does have a 64-bit bus, probably for faster transfers.

  • Edward McCarty

    Aren’t there already a few tablets with 64-bit processors?
    And now that chip manufacturers have started the migration towards more ARM-like chips everywhere, are we really going to move back to x86? What’s the point in that?

    • CyberGusa

      Only Pro tablets running Intel Core processors or similar offer full 64-bit… What ARM has shipped up till now has been 32-bit systems with 64-bit improvements here and there, like a wider memory bus for better bandwidth, but full 64-bit implementation is only now starting to get serious.

      Apple looks to be first with its fully 64-bit A7, but that’s a custom SoC, and most of the market won’t go 64-bit until the second half of 2014 or later.

      Partly that’s because they want to combine the new architecture with next-gen fabs, at either the 20/16nm half node or the full 14nm node… those are where FinFET will start being introduced, giving the best room to scale up…

      And 64-bit is required for ARM to make any real advance toward higher performance and to find serious applications in the server market beyond the limited roles it has played up till now.

      Basically, ARM wants to become a real alternative to x86, and 64-bit is one of the steps it’ll need to take to get there…

      • Noloqoq

        There are already some 64-bit ARM servers, but they’re FPGA-based :).

        32-bit ARM chips are already more efficient than 64-bit Intel chips.

        Thanks to low power usage, some companies already offer, using the older 32-bit Cortex-A9, more computing power for less energy in a 19″ rack than Intel x86 allows. (The Cortex-A15 is more powerful than the Cortex-A9 and improves efficiency further.)

        One of the first limits in ~40U racks in data centers is the cost of the energy needed to power and cool the machines, so companies already have an incentive to move to the ARM architecture, depending first on the maturity of systems and applications on that platform.

        2012 and 2013 saw a lot of SIMD/NEON optimization in important software (GCC 4.8, LLVM 3.3, OpenSSL 1.0.1, some codecs, etc.), which improves power and computing efficiency (several times over, not just a few percent, in some important cases) on ARM-based SoCs for server applications.

      • CyberGusa

        Not accurate: ARM is generally more power-efficient, but it’s not more processor-efficient!

        Even the dual-core Clover Trail Z2760 Atom, with an architecture that’s basically over five years old and still uses in-order execution, provides better CPU performance than a quad-core Tegra 3, and better power efficiency on average as well.

        Mind, even the newer Bay Trail is still over 50% less processor-efficient than Intel’s higher-end Core processors, and Bay Trail is 50% to 2x better than Clover Trail while using no more power on average than Clover Trail…

        And similar optimizations are happening in x86 software too, not just ARM…

        ARM also won’t scale up that much right away when it does go full 64-bit… Designing massive chips is not something ARM designers are used to, for one thing, and it requires a major rethink of how ARM SoCs are designed from now on.

        Also, scaling up means losing power efficiency… It’s not just Intel that has to adapt to entering a new market!

        Market momentum will be hard to gain, too: people in the server business are used to x86, while ARM is still new there and doesn’t have the track record needed to make it a safe choice.

        Not to mention Intel isn’t standing still: they’ve all but closed the power-efficiency gap with the Atom and made major improvements to the Core line with Haswell, and Broadwell stands to improve on that by another 30%!

        So it really depends on who can advance the fastest and outpace the other first… ARM going 64-bit puts it in the race but doesn’t guarantee it a place…

        Not while Intel is accelerating Atom development… This year we saw Silvermont introduced with Bay Trail on a 22nm fab… Next year, at about the same time, we’ll see Airmont on the 14nm fab… That’s advancing the Atom faster than Moore’s law, and much faster than Intel’s usual two-year tick-tock cycle…

        It remains to be seen whether the ARM manufacturers can keep up with that pace, let alone exceed it… but competition is good for consumers, and for now it means more choices for everyone…

  • raindog469

    Didn’t the Asus Fonepad (not Padfone) already ship with a 64-bit Atom running Android earlier this year? I guess if everything was running in 32-bit mode, that might explain why it never took off.

    • CyberGusa

      No, the first-gen mobile 32nm Atom SoCs are limited to 32-bit… mostly because of the limited support for the Imagination-based GMA graphics and other mobile optimizations… So these new Bay Trail systems will be the first to market with full 64-bit support since the older non-SoC Pine Trail Atoms…

      Though MS still has to improve the Windows 8 drivers to properly support Connected Standby on Bay Trail under the 64-bit version, that should be fixed by early 2014. Also, the tablet-optimized Bay Trail T models will only be given up to 4GB of RAM…

      The higher-end Bay Trail M will also make it into tablets, though, so look for the Celeron-branded Bay Trail models for up to 8GB of RAM and probably some extra performance, albeit at a higher TDP than the Bay Trail T operates at.

      Android itself is still 32-bit, though. The kernel is mainly responsible for driving the hardware; the OS that runs on top of it is still 32-bit, and of course all the apps made for Android are still 32-bit. That will take the longest to change, since developers won’t have a real reason to switch until much later.

      64-bit code tends to be much larger than 32-bit code, and there’s less tolerance for imperfections and bugs, while the performance gains require everything to work under ideal conditions.

      Mind, they’re not going to push more than 4GB of RAM in mobile devices for at least another year or two, since more RAM means higher power consumption and battery life is still a major concern. Most mobile usage can’t really make use of the higher performance yet, either.

      So this is more of a long-term effect, and we won’t really see much in the short term… Though devices that cross over into traditional PC usage, like desktops, servers, and laptops, may be exceptions… especially when running traditional desktop software, where 64-bit versions already exist for most major productivity apps, etc.

  • Yowan Rdotexe

    Misleading title is misleading: Intel is just optimizing the Dalvik VM, not Android itself, and they’ve been doing that for ages. Wrong interpretation of the facts, as always. Android can already be compiled to run on top of a 64-bit kernel for AArch64/ARMv8. A few runtimes may need to be optimized to take advantage of the additional registers, but that’s all. And don’t forget that at the Dalvik level the number of bits becomes irrelevant, so apps can’t benefit from 64-bit.

  • Low IQ

    More address bits in a CPU means more memory, which allows more simultaneous tasks. As more sensors and things get added to this environment, the need for more bits is only Logical. Sorry for the pun! LOL!

    • Low IQ

      Can you imagine what 256-bit or higher processors, 10 years into the future, will be able to do?