Radioactive decay is an attribute of an unstable nucleus. When we represent it in an equation, we don't involve any macroscopic attribute of the substance.

But still, the rate of radioactive decay is proportional to the amount of substance available (a macroscopic attribute). Why?

How does one nucleus relate to the others when it comes to shedding its instability?

2 Answers

The chance for a given nucleus to decay doesn't depend on the number of nuclei. In a fixed amount of time each nucleus has a certain chance to decay. Increasing the number of nuclei will increase the number of nuclei that decay, but that's really just what you'd expect.

It's like rolling lots of dice: the number of dice showing a certain digit will be proportional to the number of dice you roll. Continuing this analogy, say you remove a die if it shows 1; this corresponds to a nucleus decaying. Then at first many dice will be removed, but as the number of dice grows smaller, so will the number that are removed.

Anonymous
  • If that's the case, half life would never be constant.. – Earth is a Spoon Jun 16 '12 at 08:43
  • I hope I haven't written something too foolish. The half-life should be constant, but I'm not sure I can argue it in a qualitative manner. Maybe you could view it as a geometric concept. You have a segment in the plane and each second you cut off a certain percentage (i.e. a certain percentage of the nuclei decay); this is something "geometric" and shouldn't depend on the coordinate system, and thus not on the length of the segment. So no matter the initial length (or number of nuclei), you'll end up with the same number of seconds to remove half the segment. – Anonymous Jun 16 '12 at 09:22
  • My point is that an always-accurate constant can't be supported by a probability-based problem. – Earth is a Spoon Jun 16 '12 at 12:44
  • Yes it can. Try reading through the wiki article on half-life http://en.wikipedia.org/wiki/Half-life . If you want to delve further into the connection of the half-life with the quantum mechanical probability, have a look at http://socrates.berkeley.edu/~phylabs/adv/ReprintsPDF/BRA%20Reprints/03%20-%20Beta%20Decay.pdf . – anna v Jun 16 '12 at 12:57
  • @SachinShekhar: Ever heard of the law of large numbers? Of course the half-life of ten or so radioactive nuclei would not be constant, but for 1 gram of, say, carbon-14, which contains $4\times 10^{22}$ nuclei, there is hardly any deviation from the expectation value. – Siyuan Ren Jun 16 '12 at 13:08
  • @SachinShekhar To be a little pedantic, the measured half-life of a small sample is, indeed, subject to random variation, but this is a case where you can actually perform a frequentist-style ensemble of experiments if you desire, so there is no philosophical difficulty in claiming that there exists an underlying real half-life. – dmckee --- ex-moderator kitten Jun 16 '12 at 17:20
  • @KarsusRen Thanks.. It helped. I am not really dealing with few countable nuclei. :) – Earth is a Spoon Jun 17 '12 at 03:19

The core reason is that the chance for a given atom to decay in the next $n$ seconds is always the same.

You can get a more intuitive feel for this by considering a simple game with dice.

  1. Start with some initial population of ordinary dice (I'll assume we're using the basic six-sided (cubical) ones, but you tabletop RPG fiends can use polyhedral ones if you want).

  2. On each round roll all the surviving dice. Collect, count and remove any that show a 1; these are the ones that decayed and they are removed from the surviving population (they do not participate in future rounds). Record the number that decayed and the number that remain.

  3. Continue with step #2 until none survive (or you get bored).

  4. Repeat the whole of 1--3 several times to get some sense of the variation.

  5. Graph both the (average) number that decay and the (average) number that survive as a function of the number of rounds. (For more insight, do this both on linear paper and on semi-log paper.)

You'll notice that the odds for any particular die to fail and be removed on any given round are always 1/6, which is equivalent to the constant odds for an unstable nucleus to decay in some given time frame.

You can also simulate this kind of experiment with a code like:

// Rolling dice radioactive decay demo simulator.
//
// During each round any die that rolls a 1 is removed from the
// population, and we plot the average number of dice surviving as a
// function of the number of rolls.
//
// Relies on the ROOT framework. http://root.cern.ch/
//
// Compile with 
//
//    g++ -O3 $(root-config --cflags --ldflags --libs) decay.cc -o decay
//
// then run with
//
//     ./decay [<initial population> [<number of rolls> [<number of trials>]]]
//
// and find the output in "decay.png"
#include <cstdlib>
#include <iostream>

#include <TProfile.h>
#include <TCanvas.h>
#include <TRandom3.h>
#include <TF1.h>

int main(int argc, char*argv[]){

  // running parameters and default values 
  int population = 1000;
  int timeticks = 25;
  int trials = 20;

  // trivial argument handling
  switch (argc) {
  default: /* FALL-THROUGH */
  case 4:  /* FALL-THROUGH */
    trials = std::atoi(argv[3]);
  case 3:  /* FALL-THROUGH */
    timeticks = std::atoi(argv[2]);
  case 2:  /* FALL-THROUGH */
    population = std::atoi(argv[1]);
  case 1:  /* FALL-THROUGH */
  case 0:  
    break;
  }

  std::cout << "Doing " << trials 
        << " trials of " << population 
        << " dice over " << timeticks << " rolls." 
        << std::endl;

  // Setup a profile histogram of the number of dice surviving at a
  // particular time over some number of trials
  TProfile*hS = new TProfile("hS","Surviving population",
                 timeticks+1,-0.5,timeticks+0.5);
  hS->GetXaxis()->SetTitle("Rolls");
  hS->GetYaxis()->SetTitle("Surviving dice");

  TProfile*hD = new TProfile("hD","Number decaying",
                 timeticks+1,-0.5,timeticks+0.5);
  hD->GetXaxis()->SetTitle("Rolls");
  hD->GetYaxis()->SetTitle("Decaying dice");

  // A PRNG
  TRandom*r=new TRandom3(0);

  for (int p=0; p<trials; ++p) { // Several passes to establish a profile
    if (p%5 == 0) std::cout << "   Trial " << p+1 << "..." <<std::endl;
    int count = population;
    int t=0;
    for (t=0; t<timeticks; t++) { // several time buckets to observe decay
      hS->Fill(t,count);
      int decay = 0;
      for (int i=count; i; --i) { // Try each surviving die
        if ( r->Integer(6) == 0) {
          --count; // discard this die if it rolls 1
          ++decay;
        }
      }
      hD->Fill(t,decay);
    } 
    hS->Fill(t,count);
  }

  // Fit the data
  hS->Fit("expo","","",0.5,timeticks+0.5);
  hS->GetFunction("expo")->SetLineColor(kRed);
  hS->GetFunction("expo")->SetLineStyle(kDotted);

  hD->Fit("expo","","",0.5,timeticks+0.5);
  hD->GetFunction("expo")->SetLineColor(kBlue);
  hD->GetFunction("expo")->SetLineStyle(kDashDotted);

  // Output results
  TCanvas*c1 = new TCanvas("c1","c1",600,900);
  c1->Divide(1,2);
  c1->cd(1);
  hS->DrawCopy("");
  hD->DrawCopy("SAME");
  c1->cd(2);
  gPad->SetLogy();
  hS->Draw("");
  hD->Draw("SAME");
  c1->Print("decay.png");
}

Using the default values (a great many dice for many trials of many rounds) you get something like

[Plot: surviving and decaying dice vs. number of rolls, on linear and semi-log axes, with exponential fits]

or with a number of dice, rounds and repeats that wouldn't bore a human to tears you get something like

[Plot: the same quantities for a small population over a few trials, with visibly larger fluctuations]

The fits are exponentials to show the functional dependence.