top | item 42188229

olao99 | 1 year ago

I fail to understand how these nuclear bomb simulations require so much compute power.

Are they trying to model every single atom?

Is this a case where the physicists in charge get away with programming the most inefficient models possible and then the administration simply replies "oh I guess we'll need a bigger supercomputer"

p_l|1 year ago

It literally requires simulating each subatomic particle, individually. Increases in compute power have been used for the twin goals of reducing simulation time (letting you run more simulations) and increasing simulation size and resolution.

The alternative is to literally build and detonate a bomb to get empirical data on a given design, which might have problems with replicability (important when applying the results to the rest of the stockpile) or with how exact the data is.

And remember that there is more than one user of every supercomputer deployed at such labs, whether it be multiple "paying" jobs like research simulations, smaller jobs run to educate, test, and optimize before running full scale work, etc.

AFAIK, for a considerable amount of time now, supercomputers have run more than one job at a time, too.

Jabbles|1 year ago

> It literally requires simulating each subatomic particle, individually.

Citation needed.

1 gram of Uranium 235 contains 2e21 atoms, which would take 15 minutes for this supercomputer to count.

"nuclear bomb simulations" do not need to simulate every atom.

I speculate that there will be some simulations at the subatomic scale, and they will be used to inform other simulations of larger quantities at lower resolutions.

https://www.wolframalpha.com/input?i=atoms+in+1+gram+of+uran...
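As a rough sanity check of the figures above (assumptions mine: Avogadro's number, U-235 molar mass of ~235 g/mol, and ~2.8e18 operations per second for an exascale machine):

```python
# Back-of-envelope check: atoms in 1 gram of U-235, and how long an
# exascale machine would take just to count them at one atom per op.
# The throughput figure is an assumption, not a quoted spec.

AVOGADRO = 6.022e23          # atoms per mole
MOLAR_MASS_U235 = 235.0      # grams per mole
OPS_PER_SECOND = 2.8e18      # rough exascale throughput (assumed)

atoms = (1.0 / MOLAR_MASS_U235) * AVOGADRO   # atoms in 1 gram
seconds = atoms / OPS_PER_SECOND             # one "count" per operation
print(f"{atoms:.1e} atoms, ~{seconds / 60:.0f} minutes to count")
```

This lands at roughly 2.6e21 atoms and about 15 minutes, consistent with the numbers in the comment.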

pkaye|1 year ago

Are they always designing new nuclear bombs? Why the ongoing work to simulate?

sliken|1 year ago

Well there's a fair bit of chemistry related to the explosions that bring the sub-critical bits together. Time scales are in the nanosecond range. Then as the subcritical bits get closer, obviously the nuclear effects start to dominate. Materials like beryllium are used to reflect and intensify the chain reaction. All of that is basically just a starter for the fusion reaction, which often involves uranium, lithium deuteride, and more plutonium.

So it involves very small time scales, chemistry, fission, fusion, creating and channeling plasmas, high neutron fluxes, extremely high pressures, and of course the exponential release of amazing amounts of energy as matter is literally converted to energy and temperatures exceeding those in the sun.

Then add to all of that the reality of aging. Explosives can degrade, the structure can weaken (age and radiation), radioactive materials have half-lives, etc. What should the replacement rate be? What kind of maintenance would lengthen the useful lives of the weapons? What fraction of the arsenal should work at any given time? How will vibration during delivery impact the above?

Seems like plenty to keep a supercomputer busy.
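The half-life point above is easy to illustrate. A minimal sketch, using tritium (half-life ~12.3 years, a gas used in weapon boosting); the time points are illustrative, not real replacement policy:

```python
# Exponential decay: fraction of tritium remaining after t years.
# N(t) = N0 * 2^(-t / T_half); only the half-life is a physical input.

HALF_LIFE_YEARS = 12.3  # tritium

def fraction_remaining(years: float) -> float:
    """Fraction of the original tritium left after `years`."""
    return 2.0 ** (-years / HALF_LIFE_YEARS)

for years in (5, 10, 20):
    print(f"after {years:2d} y: {fraction_remaining(years):.1%} remains")
```

Roughly a quarter of the tritium is gone after five years, which is why boosted weapons need periodic servicing regardless of how well everything else holds up.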

ethbr1|1 year ago

I'd never considered this, but do the high temperatures impose additional computational requirements on the chemical portions?

I'd assume computing atomic behavior at 0K is a lot simpler than at 800,000,000K, over the same time step. ;)

JumpCrisscross|1 year ago

> Are they trying to model every single atom?

Given all nuclear physics happens inside atoms, I'd hope they're being more precise.

Note that a frontier of fusion physics is characterising plasma flows. So even at the atom-by-atom level, we're nowhere close to a solved problem.

amelius|1 year ago

Or maybe it suffices to model the whole thing as a gas. It all depends on what they're trying to compute.

rcxdude|1 year ago

>Are they trying to model every single atom?

Modelling a single nucleus, even one much lighter than uranium, is a capital-H Hard Problem involving many subject matter experts and a lot of optimisation work far beyond 'just throw it on a GPU'. Quantum systems become intractable without very clever approximations and a lot of compute very quickly, and quantum chromodynamics is by far the worst at this. Look up lattice QCD for a relevant keyword.

CapitalistCartr|1 year ago

It's because of the way the weapons are designed, which requires a CNWDI clearance to know, so your curiosity is not likely to be sated.

piombisallow|1 year ago

These usually get split into nodes, and scientists are allocated a subset of nodes at a time. The whole machine isn't working on a single problem.

GemesAS|1 year ago

Modern weapon codes couple computationally heavy physics like radiation and neutron transport, hydrodynamics, plasma, and chemical physics. While a single 1-D or 2-D simulation might not be too heavy in compute, large ensembles of simulations are often run for UQ or sensitivity analysis in design work.
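The ensemble/UQ workflow described above can be sketched in a few lines: run the same simulation many times with perturbed inputs and study the spread of the output. Everything here is a made-up stand-in (`yield_model`, the input distributions), not a real weapons code:

```python
import random
import statistics

def yield_model(density: float, purity: float) -> float:
    """Toy stand-in for one expensive 1-D simulation run."""
    return density ** 2 * purity

random.seed(0)
results = []
for _ in range(1000):                   # the ensemble
    density = random.gauss(1.0, 0.05)   # perturbed input 1
    purity = random.gauss(0.93, 0.02)   # perturbed input 2
    results.append(yield_model(density, purity))

print(f"mean={statistics.mean(results):.3f} "
      f"stdev={statistics.stdev(results):.3f}")
```

Even when one run is cheap, a few thousand runs per design point across many design points is how the compute bill grows.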

TeMPOraL|1 year ago

Pot, meet kettle? It's usually the industry that leads with the "write inefficient code, hardware is cheaper than dev time" approach. If anything, I'd expect a long-running physics research project to have well-optimized code. After all, that's where all the optimized math routines come from.

glial|1 year ago

I bet the bulk of it is still super-fast Fortran code.

alephnerd|1 year ago

> I fail to understand how these nuclear bomb simulations require so much compute power

I wrote a previous HN comment explaining this:

Tl;dr - Monte Carlo Simulations are hard and the NPT prevents live testing similar to Bikini Atoll or Semipalatinsk-21

https://news.ycombinator.com/item?id=39515697
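To give a flavor of why Monte Carlo transport is compute-hungry: each particle history is a random walk, and tight tallies need enormous numbers of histories. A toy sketch, with all probabilities invented for illustration (no physical data):

```python
import random

random.seed(42)

# Per-step outcome probabilities for one neutron (assumed, not physical):
P_SCATTER, P_ABSORB = 0.6, 0.3   # remaining 0.1 = leakage out of the system

def one_history(max_steps: int = 100) -> str:
    """Follow one neutron until absorption, leakage, or the step limit."""
    for _ in range(max_steps):
        r = random.random()
        if r < P_SCATTER:
            continue                 # scatter: keep walking
        elif r < P_SCATTER + P_ABSORB:
            return "absorbed"
        else:
            return "leaked"
    return "survived"

tallies = {"absorbed": 0, "leaked": 0, "survived": 0}
for _ in range(100_000):             # statistical error shrinks as 1/sqrt(N)
    tallies[one_history()] += 1
print(tallies)
```

Real codes track energy, position, and direction per particle with tabulated cross-sections, so each history is far more expensive than this, and the 1/sqrt(N) convergence means an extra digit of precision costs 100x the histories.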

bongodongobob|1 year ago

My brother in Christ, it's a supercomputer. What an odd question.