Computer Smashes World Record Math Problem

DAViS's high-performance computer was set up with two AMD Epyc 7542 processors coupled with 1TB of RAM, which isn't sufficient to hold all of the digits they were aiming to come up with. The y-cruncher program, therefore, was used to swap out the digits to an additional 38 hard disk drives (HDD) with a total 16TB of storage space, saving a large part of the RAM on the HDDs.

that's a whole lot of digits
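Since all of those digits can't fit in 1TB of RAM at once, the general idea behind y-cruncher's "swap mode" is just working out of disk-backed storage in small chunks. Not their actual code, obviously; here's a rough Python sketch of the concept, with a made-up scratch path and toy sizes:

Code:
# Rough sketch of the out-of-core ("swap mode") idea: the working data lives in
# files on the HDD array instead of RAM, and only a small chunk is touched at a time.
# The path and sizes below are placeholders, not anything from the actual run.
import numpy as np

SCRATCH = "/mnt/hdd_array/pi_scratch.bin"   # hypothetical file on the disk array
N_DIGITS = 10**8                            # toy size; the real run used trillions
CHUNK = 10**6                               # how much is resident in RAM at once

# Disk-backed array: creating it does not consume N_DIGITS bytes of RAM.
digits = np.memmap(SCRATCH, dtype=np.uint8, mode="w+", shape=(N_DIGITS,))

# Work through the array chunk by chunk, writing changes back out to disk.
for start in range(0, N_DIGITS, CHUNK):
    block = digits[start:start + CHUNK]
    block[:] = np.random.randint(0, 10, size=block.shape, dtype=np.uint8)  # stand-in work
    digits.flush()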
 
 
The day we discover that pi is actually a repeating number is going to mark the end of the universe, as we've learned too much.

Theory states that if the universe is a Gogleplex wide in meters or more, then around 10^100^78 or exponentially long before we reach a gogleplex of 10^100^100, there will be no other random atomic configurations left, so the observer - having left his original position and travelled 10^100^78 or so - will run into an exact repeating duplicate of where he started down to the very arrangement of atoms.
 
Theory states that if the universe is a Gogleplex wide in meters or more, then around 10^100^78 or exponentially long before we reach a gogleplex of 10^100^100, there will be no other random atomic configurations left, so the observer - having left his original position and travelled 10^100^78 or so - will run into an exact repeating duplicate of where he started down to the very arrangement of atoms.
Uh, I just wanna drink beer and watch porn.
 
Theory states that if the universe is a Gogleplex wide in meters or more, then around 10^100^78 or exponentially long before we reach a gogleplex of 10^100^100, there will be no other random atomic configurations left, so the observer - having left his original position and travelled 10^100^78 or so - will run into an exact repeating duplicate of where he started down to the very arrangement of atoms.
Yeah, not sure exactly where this comes from, but I do know that some topologies of the universe have a wrap-around effect, so that if you go one way long enough you'll pop back out on the other side... effectively the same thing, I think. Of course this all requires that the Universe not be expanding faster than the speed of light. Kind of makes sense too: if there is no center of the Universe, because from every point in the Universe you can see the stuff in every direction, then there can't be an edge. How does that work out? Having the ability to see stuff from "the other side" (to give a visual) makes the most sense.

That said... 12.8 trillion digits added to a 50-trillion-digit number... what exactly was the math problem that was solved? It's not like they got any closer to the "end" of pi; honestly this seems like the kind of thing that was done simply to "break the record".

DAViS's high-performance computer was set up with two AMD Epyc 7542 processors coupled with 1TB of RAM, which isn't sufficient to hold all of the digits they were aiming to come up with. The y-cruncher program, therefore, was used to swap out the digits to an additional 38 hard disk drives (HDD) with a total 16TB of storage space, saving a large part of the RAM on the HDDs.

Now this is the most disappointing part of it: I was expecting some massive supercomputer... nope, almost something LTT would build.
 
Yeah, not sure exactly where this comes from, but I do know that some topologies of the universe have a wrap-around effect, so that if you go one way long enough you'll pop back out on the other side... effectively the same thing, I think. Of course this all requires that the Universe not be expanding faster than the speed of light. Kind of makes sense too: if there is no center of the Universe, because from every point in the Universe you can see the stuff in every direction, then there can't be an edge. How does that work out? Having the ability to see stuff from "the other side" (to give a visual) makes the most sense.

That said... 12.8 trillion digits added to a 50-trillion-digit number... what exactly was the math problem that was solved? It's not like they got any closer to the "end" of pi; honestly this seems like the kind of thing that was done simply to "break the record".



Now this is the most disappointing part of it: I was expecting some massive supercomputer... nope, almost something LTT would build.
You mean the theory about a toroidal universe? It is one I tend to believe based on the electromagnetic field of radiating bodies in the universe. It is fun to speculate about the true nature of the universe.

 
Ha, no because every new and fancy computer is busy doing more profitable things like churning over massive amounts of user data
Well, that or mining imaginary currency.

Now this is the most disappointing part of it: I was expecting some massive supercomputer... nope, almost something LTT would build.

Seriously, 64 cores is all this takes? There are some people here on [H] who could best this; the crew on ServeTheHome could probably beat this score twice over.
Something like this with a Threadripper Pro would net you potentially as many cores (but faster) and up to 2TB of RAM.

And this part really hurt my brain:
38 hard disk drives (HDD) with a total 16TB of storage space
What!? Let's assume a few are hot spares or parity dead space; that's still an array of 500GB-1TB drives. I'm all about avoiding e-waste and reuse - I have plenty of tiny, old drives still in use at home. But come on man, that's ridiculous for a university team of researchers to be using.
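Taking the article's "16TB total" wording at face value, the quick math works out about like that (the spare/parity count below is just a guess on my part):

Code:
# Quick math on the article's "38 HDDs, 16TB total" wording.
total_tb = 16
drives = 38
print(total_tb / drives)        # ~0.42 TB per drive if all 38 hold data

# Even assuming a handful are hot spares or parity, you'd still be looking
# at roughly 0.5 TB-class drives (the "6 spares" is an assumption).
data_drives = drives - 6
print(total_tb / data_drives)   # 0.5 TB per drive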
 
Theory states that if the universe is a Gogleplex wide in meters or more, then around 10^100^78 or exponentially long before we reach a gogleplex of 10^100^100, there will be no other random atomic configurations left, so the observer - having left his original position and travelled 10^100^78 or so - will run into an exact repeating duplicate of where he started down to the very arrangement of atoms.

I think you might have a typo somewhere so maybe I'm misinterpreting your post, but it sounds like Google's HQ is going to grow to at least 10^10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000 m^2 -- probably much larger. In other words, Google has not only taken over the world, but is expanding to take over the universe? Google's googols of ad trackers will soon approach infinity.

Well, that or mining imaginary currency.

An imaginary currency! That's it! The birth of a new cryptocurrency based on calculating Pi.

This is too much to take in all at once.
 
I think you might have a typo somewhere so maybe I'm misinterpreting your post, but it sounds like Google's HQ is going to grow to at least 10^10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000 m^2 -- probably much larger. In other words, Google has not only taken over the world, but is expanding to take over the universe? Google's googols of ad trackers will soon approach infinity.



An imaginary currency! That's it! The birth of a new cryptocurrency based on calculating Pi.

This is too much to take in all at once.
Nah, the new hotness is trying to disprove the Collatz Conjecture.
 

I think that:

During operation, the computer and the disks could reach up to 80°C, which is why the system was housed in a server rack with constant air cooling to avoid overheating. This contributed over half of the total 1,700 watts of power that the scientists estimate was required for the full calculation, which would still place the system in 153rd position on the Green500 list.



They mistook 1,700 watts (an almost normal figure for an enthusiast's big machine) for 1,700 kilowatts; the computer in 153rd position on that list has 85,000 cores from over 6,100 Xeon CPUs with 215,000 GB of RAM.

What!? Let's assume a few are hot spares or parity dead space; that's still an array of 500GB-1TB drives. I'm all about avoiding e-waste and reuse - I have plenty of tiny, old drives still in use at home. But come on man, that's ridiculous for a university team of researchers to be using.
That's just very strange phrasing, if not an error, considering the writer of the article seems to think that 1,700 watts on a dual-CPU system is a lot. They had 510TB of storage: it was 38 HDDs of 16TB each for the data, with 2 SSDs to run the OS and the program.

Having a lot of disks also helped with the high bandwidth required (they had 8.5GB/s), which obviously could have been achieved with a small number of SSDs, but beyond the cost of that much SSD storage (the disks and the hardware to connect them all), the giant amount of reads and writes could have been an issue.
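Rough per-drive numbers, assuming the 8.5GB/s is aggregate and spread more or less evenly across the array (my assumption, not something from the write-up):

Code:
# Back-of-the-envelope: what 8.5 GB/s aggregate means per drive, assuming the
# load is spread roughly evenly over the 38 HDDs (an assumption, not a stated fact).
aggregate_gb_s = 8.5
hdds = 38
per_drive_mb_s = aggregate_gb_s * 1000 / hdds
print(round(per_drive_mb_s))   # ~224 MB/s, i.e. around the sequential speed of a big modern HDD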

The actual final answer for pi takes 63TB of storage to save; the intermediate calculations take over 90TB, and over 300TB is used for swapping.
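That 63TB figure lines up with the digit count if you assume roughly one byte per stored decimal digit (that's my assumption about the output format, not something they state):

Code:
# Sanity check on the 63 TB figure for the final result.
previous_record = 50e12                   # digits (the old 50-trillion-digit record)
added = 12.8e12                           # digits added this time
total_digits = previous_record + added    # 62.8 trillion digits

bytes_per_digit = 1                       # assumed: one byte per stored decimal digit
print(total_digits * bytes_per_digit / 1e12)   # ~62.8 TB, close to the quoted 63 TB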

For that kind of stuff, using your browser's translator on:
https://www.fhgr.ch/fachgebiete/angewandte-zukunftstechnologien/davis-zentrum/pi-challenge/#c15509

is sometimes better than the ZDNet/Wired young art & culture & tech reporter who has to push out 15 pieces of content a day.
 
Nah, the new hotness is trying to disprove the Collatz Conjecture.
Collatz-coins? Sure, why not? It would probably align better with the risk tolerance of its investors.

Besides, I'm tired of refabricating the coaster on my desk every time a new digit of Pi is added -- significant digits be damned.
 
Fun fact: physicists do in fact set some constants equal to 1 just for the sake of mathematical simplicity - the speed of light, the universal gravitational constant, Boltzmann's constant, etc. E=m is so much easier to think about, because "everyone" knows there's a factor of c² in there that really doesn't make any difference.
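To make it concrete (the numbers are purely illustrative): in SI units you carry the c² around explicitly; in natural units you pick units where c = 1 and it drops out.

Code:
# E = mc^2 in SI units vs. E = m in natural units (c = 1). Illustrative only.
m = 1.0                      # kg, an example mass

# SI: energy in joules, with the factor of c^2 spelled out.
c = 299_792_458.0            # m/s
E_si = m * c**2
print(E_si)                  # ~8.99e16 J

# Natural units: measure energy in mass-equivalent units so that c = 1,
# and the same statement is just E = m.
E_natural = m
print(E_natural)             # 1.0 -- same physics, the c^2 is hidden in the choice of units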
 
Fun fact: physicists do in fact set some constants equal to 1 just for the sake of mathematical simplicity - the speed of light, the universal gravitational constant, Boltzmann's constant, etc. E=m is so much easier to think about, because "everyone" knows there's a factor of c² in there that really doesn't make any difference.
This is great! I want to buy a new GPU. Going to find a physicist who shops at a physics supply shop to purchase one for me at 1X MSRP.
 
This is great! I want to buy a new GPU. Going to find a physicist who shops at a physics supply shop to purchase one for me at 1X MSRP.
Yeah but do realize we know the answer ultimately has a constant multiplying it.

So for your GPU we know that we need to multiply by one arm and one leg after our simplistic calculations.
 
Yeah but do realize we know the answer ultimately has a constant multiplying it.

So for your GPU we know that we need to multiply by one arm and one leg after our simplistic calculations.
Then we have to take into account the GPU quanta (Planck $), which can be approximated as:

$p=SQRT(hbar(TSMC)/Jensen^3) = $1000
 
I think that:

During operation, the computer and the disks could reach up to 80°C, which is why the system was housed in a server rack with constant air cooling to avoid overheating. This contributed over half of the total 1,700 watts of power that the scientists estimate was required for the full calculation, which would still place the system in 153rd position on the Green500 list.



They mistook 1,700 watts (an almost normal figure for an enthusiast's big machine) for 1,700 kilowatts; the computer in 153rd position on that list has 85,000 cores from over 6,100 Xeon CPUs with 215,000 GB of RAM.

Ahhhh, I understand now:
It is the GFLOPS-per-watt ranking on which they would be 153rd best (and the name of the list makes much more sense now); it is not the absolute power use of 1,700 watts that had anything to do with their ranking.
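In other words, the ranked metric is just sustained performance divided by power draw. Something like this, where the GFLOPS number is a placeholder I made up, not the DAViS machine's actual figure:

Code:
# Green500 ranks by energy efficiency: performance per watt, not raw performance.
gflops_sustained = 2000.0        # assumed sustained GFLOPS for a dual-EPYC box (placeholder)
watts = 1700.0                   # the power figure quoted in the article
print(gflops_sustained / watts)  # ~1.18 GFLOPS/W -- this ratio is what gets ranked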

Interesting, but wouldn't we expect this to keep happening?

You'd think every new and faster computer would keep smashing these records as time goes on, over and over again?
Very few would attempt to go through all this ordeal, but one of the points of the current exercise was doing it with very limited resources (not many more watts than an SLI gamer's machine). The goal obviously wasn't getting more decimals of pi, but developing the knowledge and software to take advantage of a modern high-performance regular PC, so students can use it to calculate actual stuff in the future. The calculation was a proof-of-concept objective: developing ways to make an ultra memory-intensive endeavor work on a "regular PC" instead of a supercomputer.

Considering it went more than 4 times faster than the previous record, which was made with:
  • (4) Intel Xeon E7-4880V2 2.5GHz 15C/30T CPUs
  • (2) Intel Xeon E5-2670 8C/16T CPUs

(much slower, but still 152 threads to work with), it was maybe done with a better approach software- and strategy-wise, not pure brute force.
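For reference, the thread math (the 128-thread figure for the new box assumes SMT is enabled on both EPYC 7542s, which is my assumption):

Code:
# Thread counts: previous record hardware vs. the DAViS machine.
old_threads = 4 * 30 + 2 * 16    # 4x Xeon E7-4880V2 (30 threads each) + 2x Xeon E5-2670 (16 threads each)
print(old_threads)               # 152

new_threads = 2 * 64             # 2x EPYC 7542 (32C/64T each), assuming SMT is on
print(new_threads)               # 128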
 
Maybe I'm missing something, but why not use GPUs? A single 3090 can calculate 32 billion digits using GPUPI in less time than it takes 2 Epyc 7H12s to calculate 10 billion digits using y-cruncher.
 
Yeah but do realize we know the answer ultimately has a constant multiplying it.

So for your GPU we know that we need to multiply by one arm and one leg after our simplistic calculations.
So... it's still 1X, that's not so bad. :(
 
Maybe I'm missing something, but why not use GPUs? A single 3090 can calculate 32 billion digits using GPUPI in less time than it takes 2 Epyc 7H12s to calculate 10 billion digits using y-cruncher.
Maybe the record was for CPUs?
 
Maybe the record was for CPUs?

Not sure, but it looks like everyone who's held it so far has done it with CPUs. The only thing I can think of is that maybe once a GPU runs out of VRAM it becomes slower than just using the CPU/Memory/Storage.
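The scale mismatch makes that plausible: a 3090 has 24GB of VRAM, while the swap space quoted earlier in the thread is around 300TB, so nearly everything would have to stream in from system memory or disk anyway. Rough numbers:

Code:
# Why VRAM becomes the wall for a record-scale run: the working set dwarfs GPU memory.
vram_gb = 24          # GeForce RTX 3090
swap_tb = 300         # swap space for the record run, quoted earlier in the thread
ratio = swap_tb * 1000 / vram_gb
print(round(ratio))   # ~12500 -- the on-disk working set is ~12,500x larger than VRAM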
 