# How hot can plasma get?

Physics Asked by Pete Oakey on May 9, 2021

I remember reading about an experiment in which fine tungsten rods were super-heated by millions of amps of electric current, vaporizing them into an ionised gas that was then compressed (by magnetic fields?) into a plasma.

The plasma heated up to temperatures never before reached. I can’t remember exactly, but I believe it was a few billion degrees Fahrenheit.

It was a number of years ago – and I can’t find the report via a search engine.

Is there a limit to the temperature of plasma? What’s the current highest recorded temperature of plasma? Is it hotter than nuclear reactions?

The highest recorded temperature of a plasma is not hotter than nuclear reactions. There is a continuum of phenomena that happen at high temperature that includes and extends beyond nuclear reactions.

When temperatures get to be very high, it makes sense to start thinking in terms of the energies involved rather than sticking to the kelvin scale (or Fahrenheit, ugh). At thermodynamic equilibrium, the average energy of a quadratic "degree of freedom" at temperature $$T$$ is $$U=\frac12 kT$$. For instance, a monoatomic ideal gas has mean energy per particle $$\frac32 kT$$, for translations in three dimensions. If you have a system where the allowed energies come in lumps, like rotational and vibrational states in molecules, the mean energy per mode is approximately zero while the thermal energy $$kT$$ is much less than the energy $$E$$ of the first excited state. This means that most systems have a larger heat capacity when they are hot than when they are cold, which makes reaching high temperatures challenging.
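The kelvin–electronvolt bookkeeping used throughout this answer is just the relation $$E = kT$$ with Boltzmann's constant $$k \approx 8.617\times 10^{-5}$$ eV/K. A minimal sketch (the helper names are my own):

```python
# Boltzmann's constant in eV per kelvin (CODATA value, rounded)
K_BOLTZMANN_EV = 8.617333e-5

def kelvin_to_ev(temp_k):
    """Thermal energy scale kT, in eV, for a temperature in kelvin."""
    return K_BOLTZMANN_EV * temp_k

def ev_to_kelvin(energy_ev):
    """Temperature, in kelvin, whose kT equals the given energy in eV."""
    return energy_ev / K_BOLTZMANN_EV

# Room temperature (~300 K) gives kT of about 25 meV, as stated below.
print(f"kT at 300 K = {kelvin_to_ev(300) * 1000:.1f} meV")
# kT = 1 eV corresponds to roughly 11,600 K.
print(f"T for kT = 1 eV: {ev_to_kelvin(1.0):.0f} K")
```

This is why the list below is phrased in eV, keV, and MeV: the kelvin numbers get unwieldy quickly.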

• For room temperature $$kT \approx 25$$ milli-eV; this is a typical energy for a phonon in a solid.

• At $$kT \approx 1$$ eV, a typical atom-atom collision may have enough energy to liberate an electron. This is the minimum temperature required to sustain a dense, ionized plasma. (The sun's photosphere has $$kT=0.5$$ eV, which is "exactly the same" at the level of precision I'm aiming for here.)

• At $$kT \approx 10^4$$ eV, even the heaviest atoms will be, on average, completely ionized. (The binding energy for the last-to-go electron is $$13.6\,\mathrm{eV}\cdot Z^2$$, where $$Z \lesssim 100$$ is the proton number.)

• At $$kT \approx 0.1$$ MeV you start to have enough energy to excite nuclei internally. Light nuclei without stable excited states, like deuterium and helium-3, may be dissociated. Stable lightweight nuclei may overcome their electrical repulsion and fuse. This is the temperature scale inside the core of a star; fusion-oriented tokamaks have to run a little hotter, since stars have the advantage of size. Electrons at this temperature are beginning to be relativistic $$(m_e c^2 = 0.5\,\mathrm{MeV})$$. As the temperature passes the electron mass, a secular population of positrons will develop.

• Somewhere above $$kT \approx 10$$ MeV, helium dissociation will come into equilibrium with helium formation by fusion. Most collisions between heavy nuclei will have enough energy to liberate a proton or a neutron. This is probably the temperature regime in heavy stars, where all the nuclei tend to evolve towards the most tightly bound species near iron-56 and nickel-62.

• At $$kT \approx 100$$ MeV, most collisions have enough energy to produce pions ($$m_\pi c^2 = 140$$ MeV), and many have enough energy to produce kaons ($$m_K c^2 = 500$$ MeV). These unstable particles will produce neutrinos when they decay. Neutrinos are very efficient at carrying heat away from the interaction region, so long-term astrophysical temperatures may top out around this scale. The most energetic collisions here may produce antiprotons ($$m_{\bar p} c^2 = 1$$ GeV).

• There is a factor of a thousand or so in energy where my intuition is not very good.

• As shown at RHIC and at the LHC, somewhere around $$kT \approx 200$$ MeV you start to dissociate nucleons into quarks and gluons, the same way that around 1 eV you started to dissociate atoms into nuclei and electrons. Notice that this is "only" about two trillion kelvin. The LHC's collision energies of 8–14 TeV are higher still by nearly a factor of one hundred thousand.

I am not familiar with your tungsten vaporization experiment. I would wild-guess that the freshly-vaporized tungsten might have a temperature of 1–10 eV, and that by confining and compressing the plasma you could increase its energy density by a factor of 1000. That would put it somewhere under the low end of the energy range for a plasma with nuclear interactions.

Correct answer by rob on May 9, 2021

It depends on the kind of plasma you are talking about. I am setting aside quark–gluon plasmas, which differ from other plasmas in that the nucleons themselves are "broken into pieces".

The hottest plasmas on Earth, otherwise, are generally those aimed at generating nuclear fusion reactions in sizable amounts (for energy generation, or for studying stars, planets, etc.). In the laboratory, a few large instruments have set impressive records, hotter than the cores of stars, in fact:

• tokamaks: 100 million kelvin
• the Z-machine: 2 billion kelvin
• laser facilities such as the NIF: 100 million kelvin

It looks like the Z-machine holds the record, but I am not sure how much plasma each of these facilities produces. In general, tokamaks can keep the temperature high for a long time (minutes) because their plasma is very dilute. The other two techniques can only sustain the plasma for nanoseconds, as it is very dense.
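Expressed on the energy scale used in the accepted answer, these records convert as follows (my own back-of-envelope, using $$kT$$ with $$k \approx 8.6\times 10^{-5}$$ eV/K):

```python
K_BOLTZMANN_EV = 8.617333e-5  # Boltzmann's constant, eV/K

# Record temperatures quoted above, in kelvin
records = {
    "tokamaks":         1e8,  # ~100 million kelvin
    "Z-machine":        2e9,  # ~2 billion kelvin
    "NIF-class lasers": 1e8,  # ~100 million kelvin
}

for facility, temp_k in records.items():
    kt_kev = temp_k * K_BOLTZMANN_EV / 1e3  # kT in keV
    print(f"{facility:18s} T = {temp_k:.0e} K -> kT ~ {kt_kev:.1f} keV")
```

So the tokamak and laser records sit around 10 keV, while the Z-machine's 2 billion kelvin corresponds to roughly 0.17 MeV, near the nuclear-excitation scale mentioned in the other answer.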

Answered by fffred on May 9, 2021