Vampire Carrying Capacity

There is a scholarly paper making the rounds of the Internet (“Cinema Fiction vs. Physics Reality,” by Costas J. Efthimiou and Sohang Gandhi) that claims to have disproved the existence of vampires using math. Their central claim is that, if a vampire needs to feed once a month, and every vampire’s victim becomes a vampire, the vampire population would increase geometrically and the human race would be wiped out in a couple of years.

Everybody knows that vampires don’t turn their victims every time. That would be as ridiculous as … as vampires actually existing. Well, anyway. In this essay, I was going to rebut their argument with some more math, but I discovered it’s already been done. Brian Thomas does a wonderful job of explaining the population ecology of vampires and humans in Sunnydale (“Vampire Ecology in the Jossverse,” easily Googleable), and he uses much more advanced math than I could ever muster. Oh, well. Go read both papers. I have only a few notes to add, and all I have to invoke is a little algebra.

There seem to be two major feeding strategies among vampires. As far as I can tell, nobody’s named the strategies, so I’m just going to call them “grazers” and “gorgers.” Gorgers are kind of like your pet snake – they can go for weeks without food, and then they drain all the blood out of a human body in one sitting. Grazers suck people’s blood, but they don’t kill their victims in the process, and presumably they have to feed more often.

Since vampires don’t exist, I have absolutely no facts to go on. All the numbers I’m using, I’ve had to pull out of thin air. The only real number I have is this: the Red Cross says that a healthy adult can lose a unit of blood (about half a liter) every eight weeks without suffering any ill effects.

Say one of your grazing vampires needed a unit of blood every night to “survive.” He’d need, at an absolute minimum, a herd of fifty-six humans – one for each night of the eight-week recovery window – so that the first human has had time to recover before he comes back to feed on her again.

One vampire to every fifty-six humans and the Red Cross would be pissed.

But in light of those numbers, one vampire to every thousand people or so starts to sound pretty reasonable – at that ratio, any given human gets tapped only about once every three years, nowhere near the Red Cross limit. A small town like Northfield could support a coven of 17 or 18.
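If you like to see the arithmetic spelled out, here’s a quick Python sketch of the grazer numbers. The nightly unit of blood and the eight-week recovery window come from above; the Northfield population of roughly 17,500 is my own assumption, picked only because it lands in the 17-to-18 coven range.

```python
# Back-of-envelope grazer arithmetic. The appetite (one unit of blood per
# night) and the Red Cross recovery figure (one unit per donor every eight
# weeks) come from the essay; the Northfield population is my own guess.

RECOVERY_DAYS = 8 * 7        # 56 days for a donor to replace one unit of blood
FEEDS_PER_NIGHT = 1          # assumed grazer appetite: one unit every night

# Bare-minimum herd: a fresh human for every night of the recovery window.
minimum_herd = RECOVERY_DAYS * FEEDS_PER_NIGHT
print(f"Bare-minimum humans per grazer: {minimum_herd}")              # 56

# At a comfortable one-vampire-per-thousand ratio, each human is tapped
# only about once every three years.
COMFORTABLE_RATIO = 1_000
years_between_bites = COMFORTABLE_RATIO / 365
print(f"Years between bites for any one human: {years_between_bites:.1f}")  # ~2.7

# A Northfield-sized town (roughly 17,500 people – an assumption).
northfield_pop = 17_500
coven_size = northfield_pop // COMFORTABLE_RATIO
print(f"Coven a Northfield-sized town could support: {coven_size}")   # 17
```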

For “gorger” vampires, the math is a little different. Here we’re not concerned with how much blood a person can lose without getting hurt, but with how many people are dying. Say a gorger vampire needs to consume one victim every two weeks. A vampire like this would add about 26 deaths a year to whatever population he’s living in. What the vampire needs, then, is a population big enough that an extra 26 deaths a year won’t make people suspicious.

Remember those lovely population histograms from the Elves? Birthrate/deathrate calculations are going to come in handy again. In a nice Western country where the life expectancy is around 80, about 1/80th of the population has to die each year just to keep the numbers constant. So the population whose normal death toll is 26 a year satisfies x × 1/80 = 26, or x = 2,080 people. But wait – if a vampire moved into a population of 2,080, the death rate would double. The local townsfolk would be knocking on Buffy’s door in no time. Let’s multiply that by a hundred. In a city of 208,000, there are about 2,600 deaths a year. Add a vampire and the toll goes up to 2,626. Such a vampire could plausibly manage not to get caught.
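And here’s the same kind of sketch for the gorger numbers. Everything in it restates the paragraph above – 26 victims a year, a life expectancy of 80 – nothing is measured.

```python
# Back-of-envelope gorger arithmetic, restating the essay's guesses:
# one victim every two weeks (about 26 a year) and a life expectancy of 80.

VICTIMS_PER_YEAR = 26        # one victim every two weeks
LIFE_EXPECTANCY = 80         # years; so roughly 1/80 of a stable population dies annually

# Smallest population whose normal yearly death toll equals the vampire's kills:
# x / 80 = 26  =>  x = 2,080. One gorger here would double the death rate.
breakeven_pop = VICTIMS_PER_YEAR * LIFE_EXPECTANCY
print(f"Population with a baseline of 26 deaths/year: {breakeven_pop}")   # 2080

# Scale up by a hundred so the extra deaths disappear into the noise.
city_pop = 100 * breakeven_pop                      # 208,000 people
baseline_deaths = city_pop / LIFE_EXPECTANCY        # 2,600 deaths/year
with_vampire = baseline_deaths + VICTIMS_PER_YEAR   # 2,626 deaths/year
excess = VICTIMS_PER_YEAR / baseline_deaths         # about a 1% bump
print(f"City of {city_pop:,}: {baseline_deaths:.0f} -> {with_vampire:.0f} "
      f"deaths/year ({excess:.1%} increase)")
```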

Notice that the gorger model supports a much lower vampire population density than the grazer model – one individual in a city of 208,000, instead of 17 or so in a town the size of Northfield. And then there’s the fact that a gorger’s victims die under highly mysterious circumstances – it’s hard to hide all those desiccated corpses. A gorger vampire would have to limit herself to people who won’t be missed, which would drive the vampire population density even lower. There are unanswered questions about the grazer strategy, too: how likely are people to notice a fang-shaped hickey, or to wonder why they woke up light-headed? Still, if you’re a vampire, it pays to be a grazer.