According to a paper published last year by Professor Efthimiou and Sohang Gandhi, many supernatural myths can be explained away with basic physics and math. In it, Efthimiou tackles the vampire myth, using simple arithmetic to disprove the legend that humans turn into vampires after they are bitten: had the first vampire appeared in 1600, he argues, the entire human population would have been wiped out in less than three years. It’s an interesting paper (read it all here), but one I have many issues with.
Efthimiou begins the vampire section with a little introduction:
“Anyone who’s seen John Carpenter’s “Vampires” or the movie “Blade” or any of the host of other vampire films is already quite familiar with how the legend goes. The vampires need to feed on human blood. After one has stuck his fangs into your neck and sucked you dry, you turn into a Vampire yourself and carry on the blood sucking legacy. The fact of the matter is, if vampires truly feed with even a tiny fraction of the frequency that they are depicted to in movies and folklore, then the human race would have been wiped out quite quickly after the first vampire appeared.”
And already I have a problem. He brings up vampires from folklore, which are completely different from those in movies and books. They are in no way pertinent to his argument, since nothing he says below applies to them. But moving on…
Here’s the Professor’s mathematical explanation of why vampires couldn’t possibly exist:
“Let us assume that a vampire need feed only once a month. This is certainly a highly conservative assumption given any Hollywood vampire film. Now two things happen when a vampire feeds. The human population decreases by one and the vampire population increases by one. Let us suppose that the first vampire appeared in 1600 AD. It doesn’t really matter what date we choose for the first vampire to appear; it has little bearing on our argument. We list a government website in the references [US Census] which provides an estimate of the world population for any given date. For January 1, 1600 we will accept that the global population was 536,870,911. In our argument, we had at the same time 1 vampire. We will ignore the human mortality and birth rate for the time being and only concentrate on the effects of vampire feeding. On February 1st, 1600, 1 human will have died and a new vampire been born.
This gives 2 vampires and (536,870,911 − 1) humans. The next month there are two vampires feeding and thus two humans die and two new vampires are born. This gives 4 vampires and (536,870,911 − 3) humans. Now on April 1st, 1600 there are 4 vampires feeding and thus we have 4 human deaths and 4 new vampires being born. This gives us 8 vampires and (536,870,911 − 7) humans. By now the reader has probably caught on to the progression. Each month the number of vampires doubles. This sort of progression is known in mathematics as a geometric progression — more specifically it is a geometric progression with ratio 2, since we multiply by 2 at each step.”
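To see how fast that doubling runs away, here’s a short Python sketch (mine, not from the paper) that just replays the progression Efthimiou describes: every month each vampire kills one human, and each victim becomes a vampire.

```python
# Replay of the paper's geometric progression: each month every
# vampire feeds once, so the human count drops by the current
# vampire count and the vampire count doubles.

def months_until_extinction(humans=536_870_911, vampires=1):
    """Count the months until no humans remain."""
    months = 0
    while humans > 0:
        humans -= vampires   # each vampire kills one human this month
        vampires *= 2        # each victim rises as a new vampire
        months += 1
    return months

print(months_until_extinction())  # -> 29
```

It works out so neatly because 536,870,911 happens to equal 2^29 − 1, so after n months there are 536,870,911 − (2^n − 1) humans left, which hits exactly zero at n = 29. Twenty-nine months from January 1600 is June 1602, which is where the paper’s “less than three years” figure comes from.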
Here’s the table to go with the progression (rebuilt here from the figures quoted above):

Date           Vampires    Humans
Jan 1, 1600    1           536,870,911
Feb 1, 1600    2           536,870,911 − 1
Mar 1, 1600    4           536,870,911 − 3
Apr 1, 1600    8           536,870,911 − 7
After n months 2^n         536,870,911 − (2^n − 1)
Ok, here’s where I have more issues. Efthimiou treats vampires more like zombies: every time someone is bitten, they become a vampire. How many vampire tales do you know of where that happens? Not many. Vampires feed on and then kill their prey; they don’t necessarily turn them. Obviously there are countless vampire mythologies out there, and Efthimiou goes with the most common one, but even within that common theme, vampires are never created at such an accelerated rate. His math is correct as far as it goes: if vampires turned every victim at that rate, then yes, the human population would dwindle. But when do you see that happening in movies (other than Daybreakers)?
I feel like this paper was a waste of the Professor’s time. It seems more like a joke than anything, since it covers such a tiny facet of the vast world of vampires. Plus, his information is based on fiction. Not very scientific.
What do you think of what Efthimiou had to say?