Big Science - Big Data - Supercomputer World Record

FrgMstr
Blowing stuff up has been a great source of technology advancements for decades now. Los Alamos National Lab has posted the video below, which very lightly touches on some incredible tech built around Vector Particle-In-Cell (VPIC) formulas dating back to the 1950s. The Trinity supercomputer has set a new world record by successfully creating 1 trillion files in two minutes. Each file represents an individual particle's trajectory, velocity, temperature, and spin, among other data points.

Check out the video.

A Los Alamos National Laboratory scientist having trouble solving a stubborn research problem needed some help – his scientific simulations had generated a sea of data, but it took so long to search the data that he couldn’t find the information he needed. He found himself looking for the proverbial needle in a haystack. At the same time, the lab’s storage research team had been hard at work on another classic big data problem: creating massive numbers of files as quickly as possible. The day the team met with the scientist, you could say that Big Science and Big Data put their heads together – and now they’re making history.
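To make the "one file per particle" idea concrete, here is a minimal Python sketch of that layout. The JSON format and the field names (trajectory, velocity, temperature, spin) are guesses based on the description above, not LANL's actual on-disk format, and a toy loop stands in for the trillion-file run:

```python
import json
from pathlib import Path

# Hypothetical sketch: one file per particle, as described in the article.
# Field names and the JSON encoding are assumptions, not LANL's real schema.

def write_particle_file(out_dir: Path, particle_id: int, record: dict) -> None:
    """Write a single particle's state to its own file."""
    path = out_dir / f"particle_{particle_id}.json"
    path.write_text(json.dumps(record))

if __name__ == "__main__":
    out = Path("particles")
    out.mkdir(exist_ok=True)
    for pid in range(1000):  # the record-setting run created ~1 trillion of these
        write_particle_file(out, pid, {
            "trajectory": [0.0, 0.0, 0.0],
            "velocity": [1.0, 0.0, 0.0],
            "temperature": 300.0,
            "spin": 0.5,
        })
```

Searching a layout like this means walking the filesystem namespace itself, which is why finding one particle among a trillion files was the needle-in-a-haystack problem described above.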
 
Will be interesting to see how much they can shrink the scales on some computer models. Right now the resolution on, say, a climate model can be measured in square miles.

Still waiting for the "planetary" supercomputer.
 
Sounds like he's using the filesystem as a database. Each file is just a row of data, and I see no reason to do it that way.
There are many databases that scale easily to trillions of rows; this problem was solved long ago. Someone needs to tell this physicist that there's an easier way.
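For comparison, here is roughly what that "one row per particle" alternative looks like, sketched with SQLite and a made-up schema (at real scale this would obviously need a distributed database, not SQLite):

```python
import sqlite3

# Sketch of the alternative suggested above: one row per particle in a
# database instead of one file per particle. Schema is hypothetical.

conn = sqlite3.connect("particles.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS particles (
        id INTEGER PRIMARY KEY,
        x REAL, y REAL, z REAL,
        vx REAL, vy REAL, vz REAL,
        temperature REAL,
        spin REAL
    )
""")
rows = ((pid, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 300.0, 0.5) for pid in range(1000))
conn.executemany("INSERT INTO particles VALUES (?,?,?,?,?,?,?,?,?)", rows)
conn.commit()

# Finding one particle becomes an indexed lookup instead of a filesystem scan.
print(conn.execute("SELECT * FROM particles WHERE id = ?", (42,)).fetchone())
conn.close()
```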
 