One of these methods is based on a substance found in our bodies, in plants, and in all living things: carbon.
So something that lived (and died) when the proportion of carbon-14 in the atmosphere was lower than normal would appear to have died more years ago than it actually did; for example, it might give an age of 3,000 years before present rather than its true age of 2,000 years.
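The arithmetic behind this apparent-age error can be sketched with the standard radiocarbon age equation, t = -(1/λ) ln(N/N₀), where the measured ¹⁴C level is compared against an assumed modern starting level. The initial fraction below is a hypothetical value chosen to reproduce the 3,000-versus-2,000-year example; it is an illustration, not measured data.

```python
import math

T_HALF = 5730.0              # conventional 14C half-life, in years
LAM = math.log(2) / T_HALF   # decay constant, per year

def radiocarbon_age(ratio_measured_to_modern):
    """Apparent age, assuming the sample started at the modern 14C level."""
    return -math.log(ratio_measured_to_modern) / LAM

true_age = 2000.0
initial_fraction = 0.886     # hypothetical: atmosphere held only 88.6% of modern 14C
measured = initial_fraction * math.exp(-LAM * true_age)

# The sample is truly 2,000 years old, but because it started with less 14C
# than assumed, the computed age comes out near 3,000 years.
print(round(radiocarbon_age(measured)))
```

Because the method assumes a known starting ¹⁴C level, any shortfall in that starting level is misread as extra elapsed decay time.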
Prior to 1905, the best and most widely accepted age of the Earth was that proposed by Lord Kelvin, based on the amount of time necessary for the Earth to cool from a completely molten state to its present temperature.
Although we now recognize many problems with that calculation, the age of 25 million years was accepted by most physicists, but considered too short by most geologists. Recognition that radioactive decay of atoms occurs in the Earth was important in two respects: it provided a source of heat not accounted for in Kelvin's cooling calculation, and it provided a means of determining the absolute ages of rocks.

Principles of Radiometric Dating

Radioactive decay is described in terms of the probability that a constituent particle of the nucleus of an atom will escape through the potential (energy) barrier that bonds it to the nucleus.
The energies involved are so large, and the nucleus is so small, that physical conditions in the Earth (i.e., temperature and pressure) cannot affect the rate of decay. The rate of decay, or rate of change of the number N of parent particles, is proportional to the number present at any time:

dN/dt = -λN

where λ is the decay constant.
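The rate law dN/dt = -λN has the analytic solution N(t) = N₀ e^(-λt). As a quick sketch, the numbers below use a hypothetical decay constant and initial atom count, and check a crude Euler integration of the rate law against the exponential solution.

```python
import math

# dN/dt = -lam * N  has the solution  N(t) = N0 * exp(-lam * t)
lam = 1.0e-4        # hypothetical decay constant, per year
N0 = 1_000_000.0    # hypothetical initial number of parent atoms

def n_analytic(t):
    """Exact number of parent atoms remaining at time t."""
    return N0 * math.exp(-lam * t)

# Crude Euler integration of the rate law as a numerical check
dt, N, t = 1.0, N0, 0.0
while t < 5000.0:
    N += -lam * N * dt   # each step removes lam * N * dt atoms
    t += dt

# Both values land near N0 * exp(-0.5)
print(N, n_analytic(5000.0))
```

The close agreement illustrates why first-order decay is so convenient for dating: the fraction remaining depends only on λ and elapsed time, not on physical conditions.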
Geologists draw on this and other basic principles to determine the relative ages of rocks or of features such as faults.
Relative age dating also means paying attention to crosscutting relationships.
Radioactive decay also allows the estimation of the age of geological samples using the decay of long-lived nuclides.
All radioactive decays follow first-order kinetics.
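For a long-lived nuclide, first-order kinetics gives a simple age equation: if the sample started with no daughter product and remained a closed system, then t = (1/λ) ln(1 + D/P), where D/P is the measured daughter-to-parent ratio. The sketch below applies this to a hypothetical rock using the ⁸⁷Rb → ⁸⁷Sr system (half-life roughly 48.8 billion years); the ratio is an invented illustration, and real work would use an isochron rather than this single-sample simplification.

```python
import math

def age_from_daughter_ratio(d_over_p, t_half):
    """Age from a measured daughter/parent ratio, assuming no initial
    daughter and a closed system (a textbook simplification)."""
    lam = math.log(2) / t_half          # decay constant from half-life
    return math.log(1.0 + d_over_p) / lam

# Hypothetical rock: 87Rb -> 87Sr, half-life ~48.8 Gyr
t = age_from_daughter_ratio(0.0285, 48.8e9)

print(t / 1e9)  # ~2 Gyr for this hypothetical sample
```

Because λ for such nuclides is tiny, even small daughter/parent ratios correspond to ages of billions of years, which is why long-lived systems like Rb-Sr and U-Pb are used for ancient rocks.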
Have you ever wondered how the scientists knew the age of the bone?