
The 35 year lifespan myth

Why couldn't they just say '70'?
What is the human lifespan? We still tend to hold on to a magic number from the Bible – “three score years and ten”, or seventy years – as the natural length of a life. But what has human life expectancy really been like through history?

Average life expectancy has grown phenomenally in the last hundred years. From the dawn of history through to the nineteenth century, average life expectancy was between 25 and 35. Now it is in the high 60s, and higher still outside Third World countries.

How, then, did the writers of the Bible come up with the seemingly inflated but visionary figure of 70? The answer is that average figures can be very misleading. The historical averages are dragged down by a very high infant mortality rate: before modern medicine, a large proportion of children did not make it to adulthood. Similarly, many women died while giving birth, in their twenties or younger. If these early deaths are excluded from the average, lifespans in the 50s, 60s and 70s were not uncommon.
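To see how a high infant mortality rate drags the average down, here is a minimal sketch. The population shares and ages are invented, purely illustrative numbers – not historical data:

```python
# Illustrative sketch: average age at death as a weighted average,
# showing how early deaths pull the figure down (numbers are invented).

def life_expectancy(groups):
    """Weighted average age at death over (share_of_population, age) pairs."""
    return sum(share * age for share, age in groups)

# Hypothetical pre-modern population: 45% die in infancy (around age 1),
# 10% die in childbirth or young adulthood (around 25), 45% live to about 60.
pre_modern = [(0.45, 1), (0.10, 25), (0.45, 60)]

# The same population with infant deaths excluded, shares renormalised.
survivors = [(0.10 / 0.55, 25), (0.45 / 0.55, 60)]

print(round(life_expectancy(pre_modern)))  # about 30 - matches the bleak averages
print(round(life_expectancy(survivors)))   # about 54 - a recognisably modern lifespan
```

The same adult lifespans produce an "average" of around 30 or one in the 50s, depending purely on whether the infant deaths are counted – which is the whole trick behind the apparently short historical lifespans.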

Typical lifespans of those who survived into adulthood actually dropped as we moved from the pre-industrial to the early industrial age, though because more children were surviving, this isn’t clearly reflected in the averages. Then lifespans started to rise as modern medicine kicked in, up to the current impressive high. Now childhood mortality is at an all-time low. As Armand Leroi points out in his book Mutants, 1994 was a remarkable year in this respect: that year, not a single eight-year-old girl died in Sweden. While this was just one point in the statistics – the next year, no doubt a handful did – it is still a notable fact that would have been inconceivable to our medieval ancestors.

A funeral for a baby or a child is always a particularly sad and emotional occasion. It is sobering to think that not many years ago – and throughout all of history before that – the majority of funerals were for babies and children.
