Astronomy Asked by notadoctor on February 18, 2021
The initial portion (and peak) of a SNIa's light curve is powered by the $\beta$-decay reaction:
$$ {}^{56}{\rm Ni} \rightarrow {}^{56}{\rm Co} + e^+ + \nu_e + \gamma $$
Supposing we know how much Ni-56 is created in the supernova (i.e., an initial mass of Ni-56), how can I go about estimating the luminosity peak? The half-life of Ni-56 is 6.1 days, if that’s relevant.
Does it make sense if I just look up/calculate the energy released in this reaction, multiply it by the number of Ni-56 atoms required to comprise the given initial mass, and call that the luminosity? Any other better ideas?
Nickel-56 decays to cobalt-56 via electron capture, with a half-life of 6.1 days and a decay constant of $\lambda = 1.31\times 10^{-6}$ s$^{-1}$.
About 1.75 MeV of energy per decay is released as gamma rays, and a further 0.41 MeV is carried away by an electron neutrino (Nadyozhin 1994).
Let's assume that we are talking about the period of time after the initial detonation, in which the fusion energy of carbon and oxygen is released and is sufficient to unbind the white dwarf.
Let's assume that, apart from the very first few seconds of the supernova, the envelope is transparent to neutrinos, so that their energy is lost.
Let's further assume that the gamma rays are able to thermalise rapidly in the envelope, that the energy is able to diffuse to the "photosphere" of the expanding fireball in $< 6$ days, and that the work done in expanding the ejecta is negligible. This last assumption may not be justified in a type Ia supernova.
The decay equation $N = N_0 \exp(-\lambda t)$ means the rate of gamma-ray energy deposition will be $$ \frac{dE}{dt} = 1.75\,\lambda N_0 \exp(-\lambda t)\ {\rm MeV/s}, $$ where $N_0$ is the number of nickel nuclei you begin with.
Let's assume about $0.5M_{\odot}$ of nickel is produced (see Childress et al. 2015; incidentally, this paper illustrates how complex this question really is). This means $N_{0} \simeq 1.1\times 10^{55}$.
Thus I make $dE/dt$ (assumed to equal the emergent luminosity) at $t=0$ to be $2.5\times10^{49}$ MeV/s, or $10^{10} L_{\odot}$.
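The arithmetic above can be reproduced in a few lines of Python. This is a minimal sketch of the estimate, not part of the original answer; the solar mass, solar luminosity, atomic mass unit, and MeV-to-joule values are assumed standard constants.

```python
import math

# Assumed standard physical constants (not quoted in the answer itself)
M_SUN_KG = 1.989e30        # solar mass, kg
L_SUN_W = 3.828e26         # solar luminosity, W
U_KG = 1.66054e-27         # atomic mass unit, kg
MEV_J = 1.60218e-13        # MeV -> joules

half_life_s = 6.1 * 86400.0            # Ni-56 half-life in seconds
lam = math.log(2) / half_life_s        # decay constant, ~1.31e-6 /s
m_ni_kg = 55.9421 * U_KG               # mass of one Ni-56 atom
N0 = 0.5 * M_SUN_KG / m_ni_kg          # nuclei in 0.5 Msun of Ni-56

E_gamma_mev = 1.75                     # gamma-ray energy per decay (Nadyozhin 1994)
dEdt_mev = E_gamma_mev * lam * N0      # deposition rate at t = 0, MeV/s
L_solar = dEdt_mev * MEV_J / L_SUN_W   # convert to solar luminosities

print(f"N0    = {N0:.2e}")             # ~1.1e55
print(f"dE/dt = {dEdt_mev:.2e} MeV/s") # ~2.5e49 MeV/s
print(f"L     = {L_solar:.2e} Lsun")   # ~1e10 Lsun
```

The printed values match the numbers quoted in the answer to within rounding.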
This value is about a factor of two higher than the average type Ia supernova peak bolometric luminosity measured by Scalzo et al. (2014), which were well modelled with nickel masses of around $0.5 M_{\odot}$. So I would suggest that one or more of the assumptions above (probably the one about no work being done on the ejecta) is a bit flaky. However, I also notice that the peak luminosity occurs about two weeks after the initial rise, so the "instant" reprocessing assumption is also incorrect: the released energy is smoothed out somewhat, and a lot of the nickel has already decayed by the time you reach peak luminosity.
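As a quick check on that last point, the fraction of Ni-56 remaining after the roughly two-week rise to peak follows directly from the decay law. The 14-day figure below is just the rough timescale quoted above, not a fitted value.

```python
import math

LAM = 1.31e-6                  # Ni-56 decay constant, 1/s (from the answer)
t_peak_s = 14 * 86400.0        # ~2 weeks in seconds (rough estimate)

fraction_left = math.exp(-LAM * t_peak_s)
print(f"fraction of Ni-56 remaining at peak: {fraction_left:.2f}")  # ~0.2
```

So by peak light roughly 80% of the original nickel has already decayed, which supports the point that the "instant reprocessing" assumption overestimates the peak.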
Correct answer by ProfRob on February 18, 2021
Okay, yes, this was the way I went about it, and it seems satisfactory enough. Just use a nuclear physics reference (or Google) to get the isobar masses:
$M_{^{56}{\rm Ni}} = 55.9421\,u$
$M_{^{56}{\rm Co}} = 55.9398\,u$
Then you have $\Delta m = 0.0023\,u = (0.0023)(931.5\,{\rm MeV}/c^2) = 2.14\,{\rm MeV}/c^2$, and thus the energy released per decay is $2.14$ MeV.
From here, just figure out how many Ni-56 nuclei you started with by dividing your initial mass by the Ni-56 mass above, then multiply by the energy released per decay and convert to the units you need.
In my case, I ended up with about $2\times10^{10}L_{\odot}$. Hope this helps someone else one day!
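For anyone wanting to reproduce this, here is a minimal Python sketch of the recipe above. The initial Ni-56 mass is left as a free parameter (the answer doesn't state the value it used), and the physical constants are assumed standard values rather than ones quoted here.

```python
# Mass defect -> energy per decay -> peak energy-release rate, per the recipe above.
U_MEV = 931.494            # atomic mass unit in MeV/c^2
U_KG = 1.66054e-27         # atomic mass unit in kg
M_SUN_KG = 1.989e30        # solar mass, kg
L_SUN_W = 3.828e26         # solar luminosity, W
MEV_J = 1.60218e-13        # MeV -> joules
LAM = 1.31e-6              # Ni-56 decay constant, 1/s

def peak_luminosity(m_ni_solar):
    """Peak luminosity in Lsun, if every decay's full Q-value emerges as light."""
    dm_u = 55.9421 - 55.9398              # mass defect, u
    q_mev = dm_u * U_MEV                  # ~2.14 MeV per decay
    n0 = m_ni_solar * M_SUN_KG / (55.9421 * U_KG)   # initial nuclei count
    watts = q_mev * LAM * n0 * MEV_J      # decay rate x energy per decay
    return watts / L_SUN_W

print(f"{peak_luminosity(0.5):.2e}")
```

With $0.5M_{\odot}$ this gives roughly $1.3\times10^{10}L_{\odot}$; matching the $2\times10^{10}$ quoted above would require a somewhat larger assumed nickel mass.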
Answered by notadoctor on February 18, 2021