Old June 21st 04, 08:30 PM
Parallax
Effects of Nuclear Detonations in Space
(Eric Henry) wrote in message . com...
I've been looking at an old post by George William Herbert, but I can't
figure out how he reaches his conclusions.

To paraphrase and recap; he says that a nearby nuclear detonation
against a target composed of aluminum would require 100 Megawatts for
surface melt, 1000 Megawatts for surface vaporization, and 100,000
megawatts for impulsive shock damage. How were these numbers derived?

I presume that a target composed of different materials would have
different numbers for surface melt, surface vaporization, and
impulsive shock damage, but I am at a loss as to how to determine
them.

Aluminum has the following properties, which might be relevant:

Heat of Fusion = 10.790 kJ/mol

Heat of Vaporization = 293.40 kJ/mol

Heat of Atomization = 326 kJ/mol

Anybody have any ideas or care to comment? I've sent an email
directly to Mr Herbert, but I'm certain he's been much too busy to
respond.

Thanks for the help


-----------------------------------------------------------
A kilogram of TNT produces around 4.2 million joules of explosive
energy. A kiloton is defined precisely not by TNT equivalent, but
as 10^12 calories of energy, which is equivalent to 4.19x10^12
joules. A sphere 1 kilometer in radius has a surface area of
12,566,371 square meters, so a 1 kiloton bomb will put out an X-ray
energy density on the rough order of magnitude of 333 kilojoules per
square meter. That's enough to vaporize about 25 grams of aluminum,
or 10 cubic centimeters, or a layer about 0.01 mm thick off a sheet
surface if we ignore conduction.
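The arithmetic above can be checked with a short script. The yield, range, and the heats of fusion and vaporization come from the posts; the aluminum molar mass, density, specific heat, and the 300 K to 2743 K heating interval are assumed handbook values I've filled in, so treat this as a rough-order sketch, not a definitive energy budget.

```python
import math

# Yield and geometry from the post: 1 kt defined as 1e12 cal ~ 4.19e12 J,
# spread over a sphere 1 km in radius.
YIELD_J = 4.19e12
R_M = 1000.0
area_m2 = 4.0 * math.pi * R_M**2          # ~12,566,371 m^2
fluence_j_m2 = YIELD_J / area_m2          # ~333 kJ/m^2

# Energy budget to vaporize aluminum, per mole. Heats of fusion and
# vaporization are the values quoted in the question; molar mass
# (26.98 g/mol), molar heat capacity (~24.2 J/mol/K), and the
# 300 K -> 2743 K heating interval are assumed handbook values.
E_HEAT = 24.2 * (2743.0 - 300.0)          # heat solid/liquid up to boiling
E_FUSE = 10.79e3                          # heat of fusion, J/mol
E_VAP = 293.4e3                           # heat of vaporization, J/mol
e_per_mol = E_HEAT + E_FUSE + E_VAP       # ~363 kJ/mol

mols = fluence_j_m2 / e_per_mol           # per m^2 of exposed surface
mass_g = mols * 26.98                     # ~25 g
volume_cm3 = mass_g / 2.70                # Al density ~2.70 g/cm^3
depth_mm = volume_cm3 / 1.0e4 * 10.0      # spread over 1 m^2 = 1e4 cm^2

print(f"fluence  {fluence_j_m2 / 1e3:.0f} kJ/m^2")
print(f"mass     {mass_g:.1f} g")
print(f"depth    {depth_mm:.4f} mm")
```

With the heating and fusion terms included, the result lands at roughly 25 g and 0.01 mm, matching the figures in the post.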

In reality, that's not enough to actually vaporize the surface once
conduction and other factors come into play. You need more like
10^9 W/cm^2 for a microsecond (which is about the time it takes the
x-ray pulse to peak and then tail off) to vaporize the surface;
about 10^8 W/cm^2 for a microsecond will melt part of the surface;
and 10^11 W/cm^2 for a microsecond will cause enough vaporization to
lead to impulsive shock damage to the surface. The 1 kt bomb at
1,000 meters is about 3.3x10^7 W/cm^2 for a microsecond.

A 1 megaton bomb at 1,000 meters is going to be about 3.3x10^10
W/cm^2, which will vaporize some of the surface but not quite reach
the impulsive shock damage levels.
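Both flux figures fall out of the same formula: total yield spread over a sphere, averaged over the ~1 microsecond pulse. A minimal sketch, using the post's rough-order assumption that the full yield goes out isotropically as X-rays:

```python
import math

PULSE_S = 1.0e-6  # assumed pulse duration, per the post

def flux_w_cm2(yield_kt, range_m):
    """Average X-ray flux in W/cm^2 over a ~1 us pulse, treating the
    whole yield as isotropic X-rays (the post's rough-order estimate)."""
    energy_j = yield_kt * 4.19e12
    area_cm2 = 4.0 * math.pi * (range_m * 100.0)**2
    return energy_j / area_cm2 / PULSE_S

# Damage thresholds quoted in the post (W/cm^2 sustained for ~1 us):
MELT, VAPORIZE, IMPULSE = 1e8, 1e9, 1e11

f_1kt = flux_w_cm2(1.0, 1000.0)      # ~3.3e7: below even surface melt
f_1mt = flux_w_cm2(1000.0, 1000.0)   # ~3.3e10: vaporizes, no impulse damage
print(f"1 kt @ 1 km: {f_1kt:.2e} W/cm^2")
print(f"1 Mt @ 1 km: {f_1mt:.2e} W/cm^2")
```

Scaling the yield by 1000 scales the flux by 1000, which is why the 1 Mt case lands between the vaporization and impulsive-shock thresholds.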


It used to be my job to do exactly this calculation for nukes in
space. Assume that 75% of a nuke's output is soft x-rays with a
1 keV blackbody temperature. You can do a simple calculation from
Stefan's law to get the time-decay curve of this BB (doing this in
the open literature got me in trouble once, so do it yourself), but
you will find the decay time is probably less than 50 nanoseconds.
Almost all of the very low energy x-rays will deposit within 1
micron of the surface, and not much thermal conduction will occur on
this time scale. Damage thresholds will probably be on the order of
0.25+ cal/cm^2 of 1 keV BB radiation on this time scale. Damage
results from surface melt, spallation, and shock waves.
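As an illustrative cross-check (my own arithmetic, not from either post): converting the 0.25 cal/cm^2 threshold to SI and combining it with the earlier poster's 1 kt example plus the 75% soft-x-ray fraction gives the standoff range at which the fluence drops to that threshold.

```python
import math

CAL_J = 4.184                          # joules per thermochemical calorie
# Damage threshold quoted above: 0.25+ cal/cm^2 of ~1 keV BB x-rays
threshold_j_m2 = 0.25 * CAL_J * 1.0e4  # ~1.05e4 J/m^2

# 1 kt yield with the 75% soft-x-ray fraction assumed in this post;
# solve 0.75 * E / (4 pi r^2) = threshold for the range r.
xray_j = 0.75 * 4.19e12
r_threshold_m = math.sqrt(xray_j / (4.0 * math.pi * threshold_j_m2))

print(f"threshold  ~{threshold_j_m2 / 1e3:.1f} kJ/m^2")
print(f"1 kt reaches it out to ~{r_threshold_m / 1e3:.1f} km")
```

So the ~333 kJ/m^2 fluence at 1 km in the earlier post is well above this damage threshold, which a 1 kt burst would still meet out to several kilometers under these assumptions.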