RPi 4b vs RPi 5 benchmark in BOINC - Einstein@Home
TL;DR: The Pi 5 finishes Einstein@Home tasks about 33% faster than the Pi 4b, which works out to roughly 50% more tasks per day.
I finally received my Raspberry Pi 5. I pre-ordered one on release day in September, and it arrived at my door a week ago. I used it for a moment as a desktop device with a full desktop install of Raspberry Pi OS, and the first impressions have been very positive so far: it feels much, much snappier in typical desktop activities such as browsing the Internet or watching videos. But that is not why I bought it; using a Pi as a desktop is at the very bottom of my “what to use a Pi for” list. I have several ideas on how I want to use the new Pi, and the first one is to try it out as a BOINC cruncher. From that came the plan to benchmark it against the previous model, the Raspberry Pi 4, of which I have three.
BOINC
In short, BOINC is a platform that allows people to share their computing power with scientists who need it for scientific analysis. Anyone can install the BOINC client, select a project they are interested in, and let their computer do calculations that are then returned to the scientists. The work is divided into tasks; usually a task uses a single CPU core, and you can crunch (that is the BOINC word for performing calculations) as many tasks simultaneously as your CPU has cores. Some projects also allow crunching on the GPU. I wrote more about BOINC in this blog post: My thoughts as I reach 100 million points in BOINC
I have done a fair share of BOINC calculations on my Raspberry Pi 4 and was curious how it compares with the new Pi 5, so I set up a test.
The BOINC project I chose for crunching in this test is Einstein@Home. I chose it because it is one of only two projects that provide ARM64 tasks (the other being Asteroids@Home), its tasks are relatively short, and the project has a high task size uniformity (more on that later).
Test baseline
To make the test as fair as possible I defined a set of baseline rules:
- both Pis used the same SD cards. I am using SanDisk Ultra 64 GB cards.
- both Pis used the official power supplies dedicated to their specific models.
- both Pis used the same cases and active cooling. Actually, I put both of them in a single cluster case. For cooling I am using a 120 mm 12 V fan powered by a 6 V power brick. I decided to go this way because that large fan running at half speed is almost inaudible, which allows me to sleep in the same room as the test rig. The Pi 5 has the official active cooler attached, but the large fan works so well that the active cooler never spun up during the whole test. I was monitoring the CPU temperatures, and they never exceeded 55°C on either Pi, so I am certain that neither of them was thermal throttling.
- both Pis were set up in the same way. The configuration was completely stock, with no overclocking. I flashed the 64-bit Raspberry Pi OS Lite onto both SD cards, started the Pis, upgraded all the packages and installed BOINC. Both Pis ran headless.
Why the test could not be fully fair
And now I must mention that even though I implemented all the measures mentioned above, the test conditions were not perfectly uniform. The issue is that the tasks the BOINC servers send are not always the same size, and some take longer to finish than others. Einstein@Home tasks are rather uniform in size, but they are not exactly the same, and a BOINC user has no control over which tasks they receive. So it is theoretically possible that one Pi was getting a higher percentage of larger tasks than the other. To mitigate this, I made the task pool bigger: I crunched 80 tasks per Pi, in the hope that a larger sample would reduce the weight of any single task's size. I think around 80 is the sweet spot between having a large enough pool of data and waiting for ages for the benchmark to finish; after all, the Pis are not the fastest crunchers. The sketch below illustrates the idea.
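As a rough illustration of why averaging over more tasks helps, here is a small Python sketch. The 16353-second mean matches the Pi 4 average measured below, but the 800-second spread and the normal distribution are assumptions made purely for illustration, not measured values.

```python
# Illustrative only: how much the *average* task time can drift between runs
# depending on how many tasks are in the pool. Task times are simulated with
# an assumed normal distribution (the spread is not a measured value).
import random
import statistics

random.seed(42)

def sample_average(n_tasks, mean_s=16353, spread_s=800):
    """Average of n_tasks simulated task times, in seconds."""
    return statistics.mean(random.gauss(mean_s, spread_s) for _ in range(n_tasks))

for n_tasks in (5, 20, 80):
    averages = [sample_average(n_tasks) for _ in range(1000)]
    print(f"{n_tasks:2d} tasks: the average drifts by about "
          f"±{statistics.stdev(averages):.0f} s between simulated runs")
```

In this toy model the average over 80 tasks drifts by well under 100 seconds between runs, which is small compared with the multi-thousand-second gap measured between the two Pis.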
With the baseline out of the way, let's see the results.
Results
Average task time
For the 4b, the average task time was 16353 seconds (4 hours, 32 minutes and 33 seconds). For the Pi 5, the average task time was 10858 seconds (3 hours, 0 minutes and 58 seconds), a decrease of around 33%.
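For anyone who wants to double-check the arithmetic, here is a minimal Python sketch that converts the measured averages from seconds to hours, minutes and seconds and computes the relative decrease; the same helpers apply to the minimum and maximum times below.

```python
# Sanity check for the figures quoted in this section.
def hms(seconds):
    """Format a duration given in seconds as hours, minutes and seconds."""
    hours, rest = divmod(seconds, 3600)
    minutes, secs = divmod(rest, 60)
    return f"{hours} h {minutes} min {secs} s"

def decrease_pct(old, new):
    """Relative decrease from old to new, in percent."""
    return (old - new) / old * 100

pi4_avg, pi5_avg = 16353, 10858          # measured average task times, seconds
print(hms(pi4_avg))                      # 4 h 32 min 33 s
print(hms(pi5_avg))                      # 3 h 0 min 58 s
print(f"{decrease_pct(pi4_avg, pi5_avg):.1f}% shorter on the Pi 5")  # 33.6%
```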
Minimum task time
For the 4b, the shortest task took 13866 seconds (3 hours, 51 minutes and 6 seconds). For the Pi 5, the shortest task took 10207 seconds (2 hours, 50 minutes and 7 seconds), a decrease of around 26%.
Maximum task time
For the 4b, the longest task took 17148 seconds (4 hours, 45 minutes and 48 seconds). For the Pi 5, the longest task took 11370 seconds (3 hours, 9 minutes and 30 seconds), a decrease of around 33%.
Average tasks per day
A day has 86400 seconds. Both the Pi 4 and the Pi 5 have four cores, so each has 345600 core-seconds per day. Dividing that number by the average task time gives roughly 21 tasks per day for the Pi 4 and almost 32 tasks per day for the Pi 5, meaning the Pi 5 can crunch around 50% more tasks per day than the Pi 4.
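The throughput figures follow directly from the average task times; here is the same calculation as a short Python sketch.

```python
# Tasks per day estimated from the measured average task times.
CORES = 4
CORE_SECONDS_PER_DAY = CORES * 24 * 60 * 60   # 345600

pi4_avg, pi5_avg = 16353, 10858               # average task times, seconds

pi4_per_day = CORE_SECONDS_PER_DAY / pi4_avg  # ~21.1 tasks/day
pi5_per_day = CORE_SECONDS_PER_DAY / pi5_avg  # ~31.8 tasks/day

print(f"Pi 4: {pi4_per_day:.1f} tasks/day, Pi 5: {pi5_per_day:.1f} tasks/day")
print(f"Throughput gain: {(pi5_per_day / pi4_per_day - 1) * 100:.1f}%")  # 50.6%
```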
Benchmark raw results
I am attaching the benchmark data below. Feel free to use it in whatever way you want; just please link back to this blog post if you publish your work.
Benchmark data in LibreOffice Calc format
Summary and further steps
This has been only a preliminary test of the new Raspberry Pi. To sum up the results so far, the Pi 5 is, to little surprise, faster than the previous model; a 33% decrease in task crunch time is a reasonable generational upgrade.
I want to continue testing the Pi 5 in different scenarios. As a next step, I will crunch Asteroids@Home and see whether the performance difference between the 4 and the 5 stays the same in another BOINC project.
Apart from that, I wonder how much of a bottleneck the SD card is as storage. I think I will also do a test running the Pi 5 from an NVMe M.2 SSD in a USB enclosure to see what kind of difference it makes.
I should also soon receive a Pi alternative, an Odroid C2, and will see how it compares with the two Pis.
As for the Pi 5 itself, I plan to test it out in other scenarios; I think the next one will be turning it into a NAS. Whatever I do with it, I will write about it on my blog.
Thanks for reading!
If you have any comments about the test, its methodology, my calculations, or suggestions for what to test next, drop me an email or contact me on Mastodon. Links are in the footer.
You can help with funding my future projects by supporting me on these crowdfunding sites: