To explain.
Let’s say they have a method that can test bone age. Up to, say, 1-2k years we can be fairly sure it’s accurate, since we might have believable records about the bones confirming the age the test gives.
Past a certain age, though, there are no more records. How can we know the testing is still accurate, rather than the method only working up to that limit and being inaccurate on anything older? Or are we just assuming?
They do have a method to test bone age: radiocarbon dating.
Radioactive things decay at a steady, fixed rate. This rate, known from physics and chemistry, is expressed as the half-life: the time it takes for half of a sample to decay. For a given isotope it’s a physical constant.
Living things take in a slightly radioactive isotope from the atmosphere, carbon-14. When they die, they stop taking in new carbon, and the carbon-14 already in them steadily decays away. Measuring how much is left tells you how long something has been dead.
This will get us to about 50,000 years, since carbon-14’s half-life is only about 5,730 years; after roughly nine half-lives there’s too little left to measure reliably.
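A back-of-the-envelope way to see this (a minimal Python sketch, not how labs actually process samples; the function name is just for illustration): the age follows directly from the fraction of carbon-14 remaining.

```python
import math

# Decay follows N(t) = N0 * (1/2) ** (t / half_life),
# so solving for t gives: t = half_life * log2(N0 / N).
C14_HALF_LIFE_YEARS = 5_730  # accepted half-life of carbon-14

def age_from_fraction(fraction_remaining: float) -> float:
    """Estimate years since death from the fraction of C-14 left."""
    return C14_HALF_LIFE_YEARS * math.log2(1.0 / fraction_remaining)

# A sample with 25% of its original C-14 is two half-lives old.
print(age_from_fraction(0.25))   # ~11,460 years
# Near the practical limit: ~0.2% remaining is roughly nine half-lives.
print(age_from_fraction(0.002))  # ~51,000 years
```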
To go farther, you can use other isotopes with much longer half-lives, which gets you into the geologic time scale of millions to billions of years. Accuracy goes down a bit, and that’s accepted: the uncertainty range gets wider, because to test for decay the instruments must be able to measure how much has decayed, and there’s a limit to how sensitive those measurements can be.
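To see why longer half-lives are needed for geologic ages, here’s a quick sketch (the half-lives are standard values; comparing them at the age of the Earth is just an illustration):

```python
# Fraction of the parent isotope remaining after a given time:
# N(t) / N0 = (1/2) ** (t / half_life)
def fraction_remaining(age_years: float, half_life_years: float) -> float:
    return 0.5 ** (age_years / half_life_years)

HALF_LIVES_YEARS = {
    "carbon-14": 5.73e3,     # good for organic remains up to ~50,000 years
    "potassium-40": 1.25e9,  # used (potassium-argon dating) for old rock
    "uranium-238": 4.47e9,   # comparable to the age of the Earth itself
}

AGE_OF_EARTH = 4.5e9
for isotope, half_life in HALF_LIVES_YEARS.items():
    frac = fraction_remaining(AGE_OF_EARTH, half_life)
    print(f"{isotope}: {frac:.3g} of the original amount left after {AGE_OF_EARTH:.1e} years")
```

Carbon-14 is effectively all gone on that timescale, while about half the uranium-238 is still there, which is why the very long-lived isotopes are the ones used for rocks.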
There are other methods too, like reading layers (stratigraphy): if something is known to change over time, that change can be seen in buried layers. An example is the KT boundary, a layer of iridium from a meteor strike that has been very well researched, so it’s possible to ask a relative question like “Was this before or after 66 million years ago?”