It’s been nearly a year since Mozilla unveiled Project Quantum, a next-generation engine designed to help Firefox take better advantage of modern hardware to improve performance.

Now Quantum is getting ready to go mainstream. The next major version of Firefox integrates the new engine, and Mozilla says that means Firefox 57 is twice as fast as Firefox 52, which was released a year ago.

So instead of calling the new version Firefox 57, Mozilla is referring to it as Firefox Quantum. It’ll be released to the public on November 14th, but you can try an early build of Firefox Quantum by grabbing a beta or Developer Edition build today.

Firefox Quantum beta is also available for Android and iOS.

Among other things, here’s why Firefox Quantum is so much faster:

  • It can use multiple CPU cores to render web content.
  • It has a new CSS engine that can run across multiple CPU cores (there’s a rough sketch of the idea just after this list).
  • The tab you’re currently viewing is prioritized, so its content downloads more quickly than content for the tabs you’re not looking at.
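
To get a feel for why a parallel CSS engine helps, here’s a toy Rust sketch of the core idea (Quantum’s new CSS engine, Stylo, happens to be written in Rust, though this is not Mozilla’s actual code). Styling one DOM node doesn’t depend on styling any other, so the work can be split across cores with a data-parallel library; the `Node` and `ComputedStyle` types and the use of the `rayon` crate below are illustrative assumptions, not Stylo’s real internals.

```rust
// Toy model of a parallel CSS engine: NOT Stylo's actual code, just the
// core idea. Assumes the `rayon` crate (rayon = "1") for data parallelism.
use rayon::prelude::*;

/// Illustrative stand-in for a DOM node awaiting style computation.
struct Node {
    id: u32,
    class: &'static str,
}

/// Illustrative stand-in for a computed style.
struct ComputedStyle {
    node_id: u32,
    color: &'static str,
}

/// Toy "selector matching": pick a color based on the node's class.
fn compute_style(node: &Node) -> ComputedStyle {
    let color = match node.class {
        "header" => "blue",
        "button" => "green",
        _ => "black",
    };
    ComputedStyle { node_id: node.id, color }
}

fn main() {
    let nodes: Vec<Node> = (0..10_000)
        .map(|id| Node {
            id,
            class: if id % 2 == 0 { "header" } else { "button" },
        })
        .collect();

    // Each node's style depends only on that node, so the pass can be
    // fanned out across every CPU core with a parallel iterator.
    let styles: Vec<ComputedStyle> = nodes.par_iter().map(compute_style).collect();

    if let Some(s) = styles.first() {
        println!("node {} -> {}", s.node_id, s.color);
    }
    println!("computed {} styles in parallel", styles.len());
}
```

Swapping `par_iter()` for a plain `iter()` gives the single-core version; the parallel pass does the same work spread across however many cores the machine has, which is the kind of win Mozilla is describing.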

Firefox Quantum also has a new user interface called Photon, which Mozilla says works better on high-DPI displays and automatically detects whether you’re using a mouse or a touchscreen to interact with the browser, adjusting menu sizes accordingly. The new UI is also designed to be more responsive, and it certainly feels snappier in my brief testing so far today.

Using the Speedometer benchmark (the one Mozilla cites in its own comparisons), I found that the Firefox Quantum beta scored 43.28, compared to 34.78 for Firefox 55, on my laptop with an Intel Core i7-6500U processor and 8GB of RAM. The gains are said to be even bigger if you compare Firefox Quantum with older versions of Mozilla’s web browser.

Of course, Firefox Quantum isn’t just competing against earlier builds of Firefox. The elephant in the room is Google Chrome. So here’s a rundown of my Speedometer scores for Chrome, Firefox 55, and Firefox 57 (Quantum):

  • Chrome 60: 36.29
  • Firefox 55: 34.78
  • Firefox 57: 43.28

Keep in mind, benchmarks like this aren’t always indicative of real-world performance (and I didn’t do much to prepare for the tests, like rebooting my computer or closing other apps, but the conditions were the same for each run). That said, Speedometer is meant to simulate the experience of running multiple web apps.
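
For context on what those numbers mean: Speedometer’s score is a throughput figure, roughly how many benchmark runs complete per minute, so higher is better. Here’s a minimal Rust sketch of that scoring scheme; the busywork `workload()` function is my own stand-in, not anything Speedometer actually runs (the real benchmark drives simulated to-do-list apps built with popular web frameworks inside the browser).

```rust
// Toy sketch of Speedometer-style scoring: run a fixed workload repeatedly,
// time the whole batch, and report runs per minute. The workload here is an
// arbitrary stand-in, not the real benchmark's web-app exercises.
use std::time::Instant;

/// Stand-in for one benchmark "run": some CPU-bound busywork.
fn workload() {
    let mut v: Vec<u64> = (0..100_000).collect();
    v.reverse();
    v.sort();
}

fn main() {
    const RUNS: u32 = 20;
    let start = Instant::now();
    for _ in 0..RUNS {
        workload();
    }
    let secs = start.elapsed().as_secs_f64();
    // Speedometer-style score: how many runs fit in a minute (higher is better).
    let score = f64::from(RUNS) / secs * 60.0;
    println!("score: {:.2} runs per minute", score);
}
```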

Another thing to keep in mind: Mozilla has made a big push to reduce memory consumption, so Firefox Quantum should use less RAM than earlier versions while still loading web content more quickly.

Mozilla says more updates to Project Quantum are on the way, including a new GPU-optimized rendering pipeline, so we should expect additional enhancements soon(ish).


6 replies on “Mozilla releases Firefox 57 beta (Quantum): It’s 2x faster than Firefox 52”

  1. Firefox is onto something. They are desperately trying to pry the title of browser champion from Google’s hands. I guess Google’s recent policy shifts and that HTC deal are part of the countermeasures.
    https://droidinformer.org/Stories/google-and-htc-sign-a-1-billion-deal.html
    Quantum, on the other hand, looks like a completely logical and somewhat innovative decision on Mozilla’s part. They are changing things more, and they’re changing them better.
    https://software.informer.com/Stories/mozillas-upcoming-firefox-quantum-to-be-faster-than-chrome.html
    I’m no Google hater, but they’re at risk of losing the ball and first place. They’ve grown too big for their own good.

  2. Isn’t it strange that I can render a complex immersive multi-user 3D virtual world 90+ times a second, but loading a page of news without it taking over 10 seconds and scrolling randomly all over the place in the process is asking too much?

    1. Only because your computer has dedicated hardware to do complex 3D work and the software is optimized to use that hardware. Browsers have to work with the lowest common denominator, without assuming hardware acceleration is present. Also, if games had to worry as much about making their screens work and look exactly the same across a dozen different platforms with different screen sizes, resolutions, and hardware, they too would not be as fast.

      In short — not strange at all — different priorities yield different results.

      1. As a computer engineer, yeah, I understand that technical side intellectually. But I feel that since so many more people use web browsers than 3D games, the motivation, talent, and resources should be there to make the situation better. I probably know 10 times as many web coders as graphics coders, and I know someone who worked on Firefox and Rust for a while.

        I mean, the inflation of RAM use for web pages has been insane and is a huge problem in itself. We have more interactive content now, but a lot of it is still garbage. And, as much as I hate the memetic influence of social media, I just mean we still have dumb annoyances like the aforementioned scrolling problem, articles that load text then blank the screen then show the text in the exact same place again, and other terrible UX that’s inherent to the underlying engines.

        I’m not saying this is easy, just that we seem to have set a very low bar: the minimum requirements for loading up, say, CNN.com are non-trivial these days, and we don’t get an amazing experience for the resources.

    2. Well, you need to keep in mind there is a slight difference between rendering a scene that’s already loaded in RAM and loading a whole application’s code over a sometimes-slow network.

      1. But that’s the problem. Not only do things take longer than necessary even if they were stored locally, but sites often pull in content from tens of sources willy-nilly for no particularly good reason. It just contributes to poor UX when there are good design choices (front-end and back-end) that can greatly alleviate the issue. And this is on top of things like the near-universally hated autoplaying video, or just any unexpected sound from a random tab in general.

        But the fact is, the article here is about Firefox being faster, and I assume Mozilla realizes that a better Firefox can’t fix a slow connection. Sure, they can be smarter about checking what fetches they need to make and when, but that only helps so much. As the piece above says, this is mostly local improvements to multi-threading the rendering engine, which is broadly comparable to how 3D graphics got good.

        So yeah, there’s a slight difference, but I don’t think it’s salient to the conversation, and it doesn’t detract from the point I’m making.
