Young Earth Creationism is So 6000 Years Ago

Question from Cameron:
If the snow-ring dating has been proven wrong (the rings represent cold and warm days, not years), how could I come to believe carbon dating and other dating types? They all seem like a fraud to me. It's like saying the stalagmites in caves formed over millions of years when I could make one in my garage in just a few months.

Answer by SmartLX:
I assume you’re referring to Kent Hovind’s argument regarding the warm-cold layers in snow cores from Greenland. Here’s Hovind’s own spiel on the subject.

There are plenty of rebuttals online if you care to look, as Hovind had been saying the same thing for years (almost literally the same thing; he had a script memorised). To be as brief as possible: once the snow is packed down under enough layers, you might get a maximum of one additional warm-cold layer per year, and not very often; any other fluctuations are mashed together and lost as the layers flatten. Someone digging a couple of hundred feet down will see lots of extra layers, and that's why the deep cores were taken in the first place: to get the good information from down where nature has removed much of the "noise".

The cores are irrelevant to the accuracy of radiometric dating because they were not used to verify the accuracy of radiometric dating. If you wonder about that, actually look up how it’s been tested. If you simply dismiss all old-earth evidence because you think some of it is incorrect and therefore non-creationist scientists aren’t worth listening to, let me introduce you to the genetic fallacy.

Stalactites and stalagmites can form from different materials and in different circumstances, some of which are fast enough to show results in weeks and some of which are slow enough to take millions of years, and geologists know the difference. Even before you consider these structures, the cave they're in has to form first, and that can take millions of years too. There's lots more detail here.

Adam and Eve, not Ug and Eev!

Question from Dontay:
Evidence of dinosaurs has been found…museums show that cavemen existed…. But… How can cavemen be real if Adam and Eve are supposedly the first people on earth?

Answer by SmartLX:
If by "cavemen" you simply mean people who lived in caves and hunted and gathered for a living, then perhaps Adam and Eve's immediate descendants did that once the Garden of Eden was closed to them. The timing doesn't work out at all when you count the supposed 34 generations from the Biblical Adam to the Biblical and historical King David and compare them to the scientifically estimated dates of the cavemen's remains, but people who are motivated to prop up the story of Genesis will accept it anyway.

If on the other hand you mean Neanderthals and other departed species within the genus Homo, there you have a conflict which is less easily dismissed. The story goes that God not only made Man more or less in his present form (or a super-version that was huge and could live for centuries) but He made Man in his own image, which is poorly defined but usually taken to mean an image of perfection. “Lesser” or more primitive versions of Man don’t jibe with this idea at all. That’s why creationist explanations of the evidence simply assert that they were all just modern-type humans with primitive lifestyles.

As for dinosaurs, all evidence points to the fact that the last ones were dead millions of years before the first humans were born. Not so for most creationists; rather than deny they existed, many of them say dinosaurs were present on the Ark, and they’re depicted as such at the new Ark Encounter park in Kentucky. Any evidence or argument that so much as requires the expression “millions of years” is explicitly demonised.

The Dating Game

Question from Bryan:
Why is radiometric dating considered accurate? I will give two examples, one speculative and one based on actual observations:

I understand half-life, log base-2 calculations and radiometric decay. What I don't understand is how we assume a starting point for radionuclides vs. daughter nuclides. For example, what if the meteorite that hit the earth and killed the dinosaurs (or any other large meteorite) was a big ball of lead? We have not found traces of it; therefore, we can only speculate about what it could have contained. If it were a big ball of lead that mixed with the elements on earth, it would give the appearance that more U-238 half-lives had occurred and give falsely high age outputs. (Granted, this is probability-based, but so is the statement that the meteorite wasn't lead.) This is just one possibility of something that could throw off the parent-daughter nuclide ratio. It could just as easily be that there was more lead on earth at the beginning than we previously thought.

This one is not speculation. C-14 dating depends on two things:
1. That production rate and loss rate have been in equilibrium for an extended amount of time, and
2. That the equilibrium ratio has not changed in an extended amount of time.

C-14 is produced in the upper atmosphere when various forms of cosmic radiation produce thermal neutrons. An N-14 nucleus collides with and absorbs a neutron and ejects a proton, thus making C-14. This process is attenuated by the existence of the earth's magnetic field. The stronger the magnetic field, the lower the production rate of C-14; therefore, the C-14:C-12 ratio would initially be more weighted to C-12 and would give a falsely older output. The earth's magnetic field has decreased 10% since the mathematician Gauss started observing it. This decrease is exponential. In the year 7800 BC (rounded) the magnetic field would have been approximately 128 times stronger than it is now, based on current observations of the decay rate of the magnetic field. This would have caused a 100% decrease in C-14 production. (Which also raises the question of how C-14 is in fossils that are millions of years old – throw in the half-life of C-14 being only 5,700 years (rounded) with that question too.) In fact, Dr. Libby noted that the C-14:C-12 ratio was NOT in equilibrium when designing the test, and subsequently decided to assume equilibrium anyway. So why is this test considered accurate when there is definite evidence to the contrary?

Answer by SmartLX:
Radiometric dating is not considered universally accurate. It's a measurement like any other: there are any number of ways to get it wrong, and there are documented examples of it going wrong. Despite this, it's been successfully used to accumulate a mountain of evidence that the world is older than a literal interpretation of the Bible would lead one to believe. The threat to Biblical literalism really is the only reason anyone still challenges the principle, and it's also the only reason to ask a question like this on Ask The Atheist instead of an actual science site. Mind you, it wasn't all used explicitly to disprove the Bible, like the scientific conspiracy some believers imagine. Scientists were investigating all sorts of questions; the answers just happened to lie between tens of thousands and billions of years in the past.

There are almost 20 independent methods of radiometric dating, each based on the decay of a different parent isotope into a daughter isotope. Each has its own starting point: a known past event at which the substance to be dated theoretically contained known proportions of the parent and daughter (which could be all of one and none of the other). This is non-negotiable, because a method is useless without a reliable starting point, as I think you'd agree. It follows that knowledge of serviceable starting conditions is a major reason why each of the ~20 methods was developed in the first place, out of the multitude of unstable isotopes in the heavier half of the periodic table. Pick any method and look it up: the starting point will always be there.
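
To make that concrete, here is a minimal Python sketch of the standard age equation (my own illustration with invented numbers, not anyone's lab code): given a reliable starting point of zero daughter isotope, the age falls straight out of the parent-daughter ratio.

```python
import math

def radiometric_age(parent, daughter, half_life_years):
    """Standard age equation, t = t_half * log2(1 + D/P), which assumes
    every atom of the daughter isotope came from decay of the parent."""
    return half_life_years * math.log2(1 + daughter / parent)

U238_HALF_LIFE = 4.468e9  # years; U-238 decays (ultimately) to Pb-206

# A sample that started with no lead and is one billion years old:
# the surviving parent fraction is 2 ** (-1e9 / 4.468e9), about 0.856.
print(radiometric_age(parent=0.856, daughter=0.144,
                      half_life_years=U238_HALF_LIFE))  # ~1.0e9 years
```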

If a method's starting point is the least bit ambiguous, say, vulnerable to contamination by outside sources of the daughter isotope (like your hypothetical meteorite full of lead), one or more other independent methods are used in conjunction, taking advantage of other elements in the object. The fact that unrelated methods consistently return almost precisely the same result is a major reason for confidence in the principle as a whole, since their potential reasons for failing are so different.
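
Continuing the sketch above with the same invented numbers, here is roughly how a cross-check exposes the ball-of-lead scenario: extra outside lead inflates the uranium-lead age, but an independent rubidium-strontium clock in the same rock is untouched, and the mismatch flags the contamination.

```python
import math

def age(parent, daughter, half_life_years):
    # t = t_half * log2(1 + D/P)
    return half_life_years * math.log2(1 + daughter / parent)

# Two independent clocks read from the same hypothetical 1-Gyr rock:
u_pb  = age(0.856, 0.144, 4.468e9)   # U-238 -> Pb-206
rb_sr = age(0.986, 0.014, 48.8e9)    # Rb-87 -> Sr-87
print(f"U-Pb {u_pb/1e9:.2f} Gyr vs Rb-Sr {rb_sr/1e9:.2f} Gyr")  # both ~1 Gyr

# Add a dose of outside lead. The U-Pb clock now reads falsely old,
# but the Rb-Sr clock is unaffected, and the disagreement between the
# two methods gives the contamination away:
u_pb_bad = age(0.856, 0.144 + 0.10, 4.468e9)
print(f"contaminated U-Pb {u_pb_bad/1e9:.2f} Gyr "
      f"vs Rb-Sr {rb_sr/1e9:.2f} Gyr")  # ~1.62 vs ~1.00
```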

As an example, we'll look at the carbon dating of animal remains in more detail.

Artwork by Randy Russell at Windows to the Universe

As you wrote, carbon-14 is formed when radiation strikes nitrogen in the air, which means it happens all the time above ground. It's absorbed by plants in the carbon dioxide they breathe, and then eaten by animals, so a living thing carries roughly the same C14:C12 ratio as the atmosphere around it. The starting point is therefore when the animal dies and stops taking carbon in. Afterwards the C14 decays back to nitrogen-14 while the stable C12 stays put, so dating the specimen is a matter of comparing its remaining C14:C12 ratio to the ratio it had at death. This works for about 50,000 years post-mortem. Afterwards the amount of remaining initial C14 is so low it cannot be distinguished from the small amount of C14 produced in a different way: alpha or gamma radiation from other radioactive materials in the earth (with millions of years more longevity) can irradiate nitrogen in and around the specimen and convert it to C14 all over again. This effect is the reason why C14 would have been detected in dinosaur fossils, which are WAY too old to retain their own C14. Hence, other elements with longer half-lives are used to date them instead.
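
In code form, with the at-death ratio normalised to 1 (a sketch of the principle only; real labs also calibrate against independent records like tree rings):

```python
import math

C14_HALF_LIFE = 5730.0  # years (modern value; Libby's original was 5568)

def c14_age(ratio_now, ratio_at_death=1.0):
    """t = t_half * log2(R0 / R): the number of halvings between the
    sample's current C14:C12 ratio and the ratio it had at death."""
    return C14_HALF_LIFE * math.log2(ratio_at_death / ratio_now)

print(c14_age(0.5))    # one half-life:  ~5730 years
print(c14_age(0.25))   # two half-lives: ~11460 years

# The practical ceiling: after ~50,000 years only about 0.2% of the
# original C14 remains, too little to tell apart from trace C14 made
# underground by ambient radiation.
print(f"{2 ** (-50_000 / C14_HALF_LIFE):.2%} left after 50,000 years")
```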

On to your other point. The observation that led to your claim about the Earth’s magnetic field was merely that the dipole field has apparently decreased since 1835.
– That it is one consistent exponential decrease is only an assertion, so raising it exponentially as you go back is unsupportable. A line of best fit through only two data points cannot be assumed to be a smooth exponential curve stretching back ten millennia (see the sketch after this list).
– That it has consistently decreased at ANY rate contradicts other evidence collected from the magnetisation of iron particles in ancient clay pottery (mentioned here), which indicates very clearly what the magnetic field was like earlier on. From just two other data points in history we know it was 45% stronger 3,000 years ago and 20% weaker 6,500 years ago (so in this case, the line of best fit is a wiggle).
– The initial observation only took the dipole field into account. Geologist Brent Dalrymple wrote that increases in the nondipole field, discovered from the very same measurements, resulted in no significant change in the overall strength of the field at all over this particular interval.
– As stated, most radiometric dating methods rely on the proportion of the daughter isotope to the parent in a sample that has been a closed system since its starting point, and the magnetic field has no way to alter that proportion afterwards. Carbon dating is the exception in that its starting ratio depends on the atmospheric production rate, which is precisely why C14 dates are calibrated against independent records, like tree rings, that track past fluctuations in that rate.
– Does the strength of the Earth’s magnetic field so directly affect the production of every radio-isotope used in every form of radiometric dating? I don’t know, but you’re the one claiming they don’t work, so have you checked?
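
To see how badly the backward extrapolation fares against the pottery data, here is a rough sketch. The 1,400-year field "half-life" is my reconstruction of the claim, not a measured quantity: 128 = 2^7, and 7 × 1,400 years ≈ 9,800 years before present, which lands at roughly 7800 BC.

```python
# Run the claimed exponential decline backwards and compare it with the
# archaeomagnetic measurements mentioned above.
FIELD_HALF_LIFE = 1400.0  # years (reconstructed from the claim, see above)

def extrapolated_strength(years_before_present):
    # field strength relative to today, per the claimed exponential
    return 2 ** (years_before_present / FIELD_HALF_LIFE)

# (years before present, measured strength relative to today) from pottery:
measured = [(3000, 1.45), (6500, 0.80)]

for ybp, actual in measured:
    print(f"{ybp} yrs ago: extrapolation {extrapolated_strength(ybp):.1f}x, "
          f"pottery {actual:.2f}x")
# 3000 yrs ago: extrapolation 4.4x, pottery 1.45x
# 6500 yrs ago: extrapolation 25.0x, pottery 0.80x
```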

It’s worth pointing out that carbon dating is the best-known dating method but it’s the least of a young-earth creationist’s worries, as it only goes up to 50,000 years. That’s only one order of magnitude off the desired scale, as opposed to the six orders of magnitude of the actual discrepancy between the Biblical timeline and the evident reality of geological time.
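
For the curious, the order-of-magnitude arithmetic behind that closing comparison, assuming roughly 6,000 years for the Biblical timeline and 4.5 billion for the measured age of the Earth:

```python
import math

print(math.log10(50_000 / 6_000))  # ~0.9: carbon dating's ceiling is about
                                   # one order of magnitude past the timeline
print(math.log10(4.5e9 / 6_000))   # ~5.9: the Earth's measured age is about
                                   # six orders of magnitude past it
```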