Pioneering distributed computing project SETI@home announced this month that it would no longer leverage the processing power of volunteers’ computers to search for alien life. But the Folding@Home project, which uses similar technology for medical research, is still going strong — stronger than ever, really.

Folding@Home director Greg Bowman notes that the combined might of all those computers gives Folding@Home “over 470 petaFLOPS of compute power,” making it more than twice as powerful as the world’s top supercomputer.

The uptick in performance comes as hundreds of thousands of computer users have dedicated spare CPU and GPU cycles to the project following Folding@Home’s announcement that it’s working on projects to gain a better understanding of coronavirus/COVID-19.

Folding@Home uses simulations to better understand how proteins fold, which can lead to treatments for viruses, cancer, and other conditions.

In other words, the work Folding@Home is doing on the coronavirus is unlikely to lead to a vaccine that prevents the spread of COVID-19. But it could help researchers identify or develop drugs to treat people who already have the virus, which could have an impact on just how deadly it is.

At this point, coronavirus research isn’t the only thing Folding@Home is working on. But given the immediacy and widespread impact of the pandemic, Bowman says the project is trying to bring everything it has to bear on COVID-19.

You can find instructions for installing the Folding@Home client on your computer and dedicating unused resources to the analysis of COVID-19 at the project’s website. Once you’ve done that, the project may send small chunks of data to be processed on your computer when you’re not taking full advantage of your system’s resources for other activities. By combining the computing power of thousands of individual computers, the project now has the processing power of the world’s top 7 supercomputers combined.
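For a rough sense of how this kind of volunteer computing fits together, here’s a minimal conceptual sketch in Python. It is not the actual Folding@Home client or protocol; the server URL, work-unit format, idle threshold, and helper names are all made up for illustration.

```python
import time

import psutil    # third-party; used to check whether the machine is mostly idle
import requests  # third-party; used to talk to a hypothetical work-unit server

WORK_SERVER = "https://example.org/work"  # placeholder URL, not a real endpoint
IDLE_CPU_THRESHOLD = 20.0  # percent; only crunch when the machine is mostly idle


def machine_is_idle() -> bool:
    """Return True if overall CPU usage is below the idle threshold."""
    return psutil.cpu_percent(interval=1.0) < IDLE_CPU_THRESHOLD


def process(work_unit: dict) -> dict:
    """Stand-in for the expensive simulation step (the real work is a
    molecular-dynamics computation, not shown here)."""
    return {"id": work_unit["id"], "result": "..."}


def main() -> None:
    while True:
        if not machine_is_idle():
            time.sleep(60)  # back off while the owner is using the computer
            continue
        work_unit = requests.get(WORK_SERVER).json()  # fetch a small chunk of work
        result = process(work_unit)                   # crunch it locally
        requests.post(WORK_SERVER, json=result)       # send the result back


if __name__ == "__main__":
    main()
```

The real client is far more sophisticated (GPU support, checkpointing, credit tracking, and so on), but the basic loop of waiting for idle time, fetching a small work unit, crunching it, and reporting back is the general idea.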

Alternatively, if you’re looking for other ways to help, you can make a donation to Folding@Home through Washington University in St. Louis, or contribute to any number of other national, regional, or local organizations doing their part to combat the disease and to help protect the health and financial security of folks who have been affected.

Update: As of March 25th, 2020, the Folding@Home project exceeded 1 exaFLOP (a quintillion floating-point operations per second) of performance.

via @drGregBowman and Tom’s Hardware

 


4 replies on “As Folding@Home tackles the coronavirus, volunteers deliver more compute power than the world’s top supercomputers”

  1. Intel’s foot-dragging, milking, and withholding of investment all these years is bad timing for this. Too little, too late, as usual. AMD to the rescue, but the volumes still aren’t there. As a side note, it’d be interesting to see how much CPU vs. GPU power this requires, and how the software/OS/kernel allocates each to tasks like protein-folding models.

    1. Or could it be done better on an FPGA or another exotic processor? Is anyone studying this? Links?

  2. This would be a funny thing for a time traveler to read if they came from a time when it had been determined that P=NP and there was a math formula for this.

    Protein folding is a type of constraint satisfaction problem (solving a Sudoku is another example). The reason we need so much computing power is that we haven’t yet figured out a formula that solves the problem in a reasonable amount of time, so solving it means working through the whole space of possibilities (there’s a rough sketch of that blow-up after this thread).

    1. That’s true, but we have developed a few algorithms that do the calculations locally, just not in their entirety.

      The complexity increases exponentially as you zoom out, so it’s unlikely there will ever be an algorithm for it. That’s like what astrophysicists are striving for with a unified equation. Too complex.
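To put the commenters’ point in concrete terms, here is a toy Python sketch (not anything Folding@Home actually runs): it exhaustively counts the self-avoiding paths a short chain can take on a 2D grid, a crude stand-in for chain configurations. The count grows exponentially with chain length, which is exactly the blow-up that makes brute-force folding searches so compute-hungry in the absence of a shortcut formula.

```python
# Toy illustration (not Folding@Home code): count self-avoiding walks of a
# chain on a 2D grid. The number of candidates grows roughly 2.6x per added
# step, so exhaustive search quickly becomes enormous.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]


def count_self_avoiding(n: int) -> int:
    """Count self-avoiding walks of n steps starting at the origin."""

    def extend(path: list) -> int:
        if len(path) == n + 1:
            return 1
        total = 0
        x, y = path[-1]
        for dx, dy in MOVES:
            nxt = (x + dx, y + dy)
            if nxt not in path:  # the chain can't cross itself
                total += extend(path + [nxt])
        return total

    return extend([(0, 0)])


if __name__ == "__main__":
    for n in range(1, 11):
        print(n, count_self_avoiding(n))  # counts grow roughly like 2.6^n
```

Real folding work doesn’t enumerate configurations this naively, but the size of the underlying search space is what makes the problem so demanding.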
