
Game theory of vaccination

February 9, 2016

How unreasonable is it to not vaccinate your children?

I ask this not as a rhetorical question, but as a mathematical one. How do we describe, mathematically, the benefits and risks of vaccination? What does this description tell us about the reasonableness (or unreasonableness) of not vaccinating?

These days, most of the debate about vaccination is centered around questions of misinformation, misunderstanding, delusion, and conspiracy. But all this shouting obscures an interesting and very real mathematical question.

So let’s consider the dilemma of a perfectly well-informed and perfectly rational person faced with the decision of whether to vaccinate their child against some particular disease. Making this decision involves weighing issues of risk and reward, and thinking about selfishness and altruism.

Luckily for us, there is an entire mathematical science devoted to addressing these kinds of questions: the science of Game Theory.

In this post I want to take a game-theoretical look at the problem of vaccination. In particular, we’ll ask the questions: under what conditions is a disease dangerous enough that you should vaccinate? And is doing what’s best for your child different from doing what’s best for society as a whole?


Risk and Reward

The key idea in this analysis is as follows. When you vaccinate your child, you provide them with the benefit of immunity against a disease that they might encounter in the future. This benefit is potentially enormous, and life-saving.

However, if your child lives in a population where nearly everyone already has the vaccine, then the benefit of the vaccine to your child is greatly reduced. After all, if everyone around is effectively immune to the disease already, then the group’s “herd immunity” will greatly reduce the chance that your child ever gets exposed to the disease in the first place.

You might therefore be tempted to decide that even a very small risk inherent in the vaccine would make it not worthwhile. And, certainly, such risks do exist. For example, there is a very small chance that your child could have a serious allergic reaction to the vaccine, and this reaction could lead to things like deafness or permanent brain damage. If your child is already getting “herd immunity” from everyone else’s vaccination anyway, then why risk it?

Let’s consider this question in two steps. First, we’ll ask what the optimal vaccination rate is: the rate that maximizes the safety and well-being of the whole population. Then we’ll ask the more pointed question: what decision is best for your child alone, given that as a parent your concern is to minimize the chance of harm to your own child, not to the world as a whole?

Let’s discuss these ideas in a completely theoretical sense first, and then we’ll put some numbers to them to see how the real world compares to the theoretical ideal.

 

The optimal behavior

Imagine, first, a population where everyone is vaccinated against a particular disease except for some fraction x of non-vaccinators. Now suppose that a randomly-chosen individual gets exposed to the disease.

If the vaccine is highly effective, then the chance that this person will contract the disease is the same as the chance that they are not vaccinated: x. In the event that this person does contract the disease, then they will expose some number n of additional people to the disease. This wave of second-hand exposures will lead to a wave of third-hand exposures, and so on. At each wave there is a multiplication by n in the number of potentially exposed people, and a (hopefully small) probability x of the disease being communicated.

You can diagram the spread of the disease something like this:

[Figure: a branching diagram of the disease spreading from one initial exposure, wave by wave]

This picture illustrates the case n = 4 (i.e., every infected person exposes an average of four other people). Each branch labeled “x” shows the probability of the disease being spread at that step.

If you add up the expected number of infected people, you’ll get

(\text{total infected}) = x( 1 + n x (1 + nx(1 + ... ) ) )

(\text{total infected}) = x \sum_{i = 0}^{\infty} (n x)^i

(\text{total infected}) = x/(1 - n x)

This last equation already suggests an important conclusion. Notice that if the rate of non-vaccination, x, gets large enough that n x \geq 1, then the total number of infected people blows up (it goes to infinity).

In other words, if n x \geq 1 then the population is susceptible to epidemics. There is a very simple way to interpret this condition: n x is the average number of new people to whom a given sick person will pass their infection. If each sick person gets more than one other person sick, then the disease will keep spreading and you’ll get an epidemic.
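
As a quick numerical check on this series (a minimal sketch of my own, not part of the original argument, with made-up values of n and x), you can sum the expected infections wave by wave and compare against the closed form x/(1 - nx):

```python
# Minimal sketch: sum expected infections wave by wave and compare with x/(1 - n*x).
# The values of n and x below are arbitrary illustrations.
def total_infected(x, n, waves):
    """Expected number of infections after one initial exposure, summed over `waves` waves."""
    return x * sum((n * x) ** i for i in range(waves))

n = 4  # each infected person exposes four others, as in the figure above
for x in (0.05, 0.20, 0.26):  # non-vaccination rates; the last one puts n*x above 1
    series = total_infected(x, n, waves=200)
    closed = x / (1 - n * x) if n * x < 1 else float("inf")
    print(f"x = {x:.2f}:  wave-by-wave sum ≈ {series:.3g},  x/(1-nx) = {closed:.3g}")
```

For n x < 1 the two agree; for n x ≥ 1 the wave-by-wave sum keeps growing as you add more waves, which is the epidemic regime just described.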

If this condition is met, then there is no question about vaccinating. A population that is susceptible to epidemics is one where you need to get vaccinated. End of story.

But let’s assume that you live somewhere where this particular disease doesn’t cause epidemics anymore. (Like, say, mumps in the USA – more on this example below.) An absence of epidemics generally implies n x < 1, and any flare-up in the disease will be relatively small before it dies off.

Let’s say that every so often someone within the population is exposed to the disease. We’ll call this the rate of exposure, E, which can be defined as the number of initial exposures in the population per year. Combining this rate with the equation above means that

E x/(1 - n x)

people will be infected per year.

This rate of disease-induced sickness should be compared with the rate of vaccine-induced sickness. If a fraction x of people are not vaccinated, that means that N (1- x) people do get vaccinated, where N is the total number of people in the population. As a yearly rate, N(1-x)/T people are vaccinated per year, where T is the average lifetime of a person (or, if the vaccine requires periodic boosters, T is the time between successive vaccinations). Let’s suppose, further, that the vaccine makes a child sick with some probability v.

What this all means is that there are

(1-x) N v/T

vaccine-induced illnesses in the population every year.

(If you’re getting lost keeping track of all these variable names, don’t worry. Only two will matter in the end.)

From a population-wide standpoint, the optimal rate of vaccination is the one that minimizes the total amount of illness in the population per year:

F = E x/(1 - nx) + (1-x) N v/T.

Taking the derivative dF/dx of the function F and setting it equal to zero gives a solution for the optimal non-vaccination rate:

x = 1/n - \sqrt{R}/n.                                                                    (1)

Here, the variable R can be called the “relative disease risk”, and it is a combination of the variables introduced above:

R = ET/Nv.

You can think of R as the relative risk of the disease itself, as compared to the risk associated with getting the vaccine.

(The variable v should be considered to be the probability of getting sick from the vaccine, multiplied by its relative severity, as compared to the severity of the disease itself. More on this below.)

You can notice two things about the theoretically optimal non-vaccination rate, equation (1). First, the non-vaccination rate x is always smaller than 1/n. This guarantees that there are no epidemics.

Second, the rate of non-vaccination declines as the relative disease risk R increases, and at R > 1 the optimal non-vaccination rate goes to zero. In other words, if the risk of the disease is large enough, and the risk of the vaccine is small enough, then the optimal thing is for everyone to get vaccinated.
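
If you want to double-check equation (1), here is a small numerical sketch (mine, with entirely made-up parameter values) that minimizes F directly over the allowed range of x and compares the result with the closed form; it also shows the optimum pinned at zero once R > 1:

```python
import numpy as np

# Sketch: minimize F(x) = E*x/(1 - n*x) + (1 - x)*N*v/T over 0 <= x < 1/n
# and compare with equation (1), x* = 1/n - sqrt(R)/n, where R = E*T/(N*v).
# All parameter values are illustrative, not taken from real data.
E, N, T, n = 1.0, 1e6, 70.0, 4.0          # exposures/year, population, lifetime, contacts
for v in (1e-4, 1e-6):                    # two vaccine risks, giving R < 1 and R > 1
    R = E * T / (N * v)
    xs = np.linspace(0.0, 1.0 / n - 1e-6, 200_000)
    F = E * xs / (1 - n * xs) + (1 - xs) * N * v / T
    x_numeric = xs[np.argmin(F)]
    x_formula = max(0.0, (1 - np.sqrt(R)) / n)     # clamped at zero when R > 1
    print(f"R = {R:6.2f}:  numerical optimum x ≈ {x_numeric:.4f},  equation (1) gives {x_formula:.4f}")
```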

 

Rational self-interested behavior

The analysis in the previous section was only concerned with the question “what is best for the world at large?” If you are asking the more limited question “what is best for my child?”, then the answer is slightly different. For this decision, you only need to weigh the probability of getting the disease against the probability of getting sick from the vaccine. The risk of conveying the illness to others doesn’t enter the analysis.

To figure out the probability of your child getting the disease, you can repeat a similar analysis to the one above: drawing out the tree of possibilities for each instance of infection. That analysis looks a lot like the picture above, except that there is one possible branch (representing your unvaccinated child) that has no protection against infection, and the rate of contracting the disease upon exposure is 1 instead of x.

The corresponding probability of your child being infected after a given initial exposure is therefore

\frac{1}{N} \sum_{i = 0}^{\infty} (x n)^i = 1/[N(1-xn)].

Since we have assumed that there are E initial exposures per year, the probability of your child getting the disease in their lifetime is E T/[N(1-xn)].

As a rational, self-interested parent, you should only vaccinate if this probability is greater than the probability v of your child getting sick from the vaccine. This means the condition for vaccination is

E T/[N (1-xn)] > v.

In the language of game theory, the state that this logic drives the population toward is a “Nash equilibrium”. When the inequality is satisfied, vaccination is a good idea. When it is not satisfied, vaccination is a bad idea, and self-interested individuals will not do it. As a consequence, a population of rational, self-interested people will settle into a situation where the inequality is only just barely satisfied, which is equivalent to

x = 1/n - R/n.                                                                           (2)

This result actually has a lot of features in common with the optimal result for vaccination. For one thing, it implies that you should always vaccinate if x > 1/n, which is the same lesson that has been repeated above: always vaccinate if there is any chance of an outbreak.

More pointedly, however, you should also always vaccinate any time the relative risk of the disease, R, is larger than 1.

In this sense the self-interested behavior is pretty closely aligned with the globally optimal behavior. The disagreement between them is a relatively mild quantitative one, and exists only when the relative disease risk R < 1.

 

Confident self-interested behavior

Now, it’s possible that you don’t accept one of the central premises of the analysis in the preceding section. I assumed above that an essentially healthy population is subject to occasional, randomly-occurring moments of “initial exposure”. In such moments it was assumed that a person is chosen at random to be exposed to the disease. Presumably this exposure has to do with either traveling to a foreign location where the disease is endemic, or with meeting someone who has just come from such a location.

You might think, however, that it is very unlikely that your child will ever be such a “primary exposure point”. Perhaps you know that your child is very unlikely to travel to any place where the disease is endemic, or to meet anyone who has come directly from such a place. If you have this kind of confidence, then the calculation changes a bit. Essentially, one needs to remove the probability of being the initial exposure point from the analysis above.

Under these assumptions, the resulting risk of contracting the disease becomes E T x n/[N(1-xn)], which is smaller than the one listed above by a factor x n. Consequently, the Nash equilibrium shifts to a higher rate of non-vaccination, given by

x = 1/[n(1+R)].                                                                           (3)

This equation satisfies the same “no epidemics” rule, but it is qualitatively different in the way it responds to increased disease risk R. Namely, there is never a point where the population achieves complete vaccination.

In other words, a population of “confident” self-interested individuals will always have some finite fraction x of vaccination holdouts, no matter how high the disease risk or how low the vaccine risk. If enough of their fellow citizens are vaccinated, these individuals will consider that the herd immunity is enough to keep them safe.

The three possible non-vaccination rates can be illustrated like this:

[Figure: the three non-vaccination rates (optimal, self-interested, and confident self-interested) as a function of the relative disease risk R]
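
In lieu of the original plot, here is a short sketch (my own illustration; n = 4 is an arbitrary choice) that tabulates the three rates from equations (1), (2), and (3) for a few values of R:

```python
from math import sqrt

# The three non-vaccination rates discussed above, for an illustrative n = 4:
#   optimal (eq. 1):                   x = max(0, (1 - sqrt(R)) / n)
#   self-interested (eq. 2):           x = max(0, (1 - R) / n)
#   confident self-interested (eq. 3): x = 1 / (n * (1 + R))
n = 4.0
for R in (0.1, 0.5, 1.0, 2.0, 70.0):
    x_opt  = max(0.0, (1 - sqrt(R)) / n)
    x_self = max(0.0, (1 - R) / n)
    x_conf = 1.0 / (n * (1 + R))
    print(f"R = {R:5.1f}:  optimal {x_opt:.3f},  self-interested {x_self:.3f},  confident {x_conf:.3f}")
```

Note that the first two rates hit zero at R = 1, while the confident rate stays positive no matter how large R gets.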

 

Real data: the MMR vaccine

The above discussion was completely theoretical: it outlined the ideal rate of vaccination according to a range of hypothetical decision-making criteria. Now let’s look at where the present-day USA falls among these hypotheticals. As a case-study, I’ll look at one of the more hotly-discussed vaccines: the measles-mumps-rubella (MMR) vaccine.

First of all, it is sadly necessary for me to remind people that there is absolutely no evidence for any link between MMR (or any other vaccine) and autism.

But that’s not to say that there is zero risk inherent in the MMR vaccine. In very rare cases, a vaccination can lead directly to a runaway allergic reaction, which can produce seizures, deafness, permanent brain damage, or other long-term effects. The CDC estimates that these side effects occur in at most one person per million MMR vaccinations. (In terms of the variables above, this means v = 10^{-6}.)

Compare this to the combined rate of measles, mumps, and rubella infections in the USA. The average rate of occurrence of these diseases during the past five years has been something like 1200 cases per year. Given that the MMR vaccine coverage in the United States is about 92%, this implies a rate of “initial exposure” of something like 1200/0.08 = 15000/year across the entire US. (Most exposures do not lead to infection.)

Of course, most people who contract measles, mumps, or rubella recover without any permanent side effects – they just have to suffer through an unpleasant illness for a few weeks. So to make a fair comparison, I’ll discount the exposure rate by a factor that approximates only the risk of acquiring a permanent disability from the disease. For example, about 0.3% of measles cases are fatal. For mumps, about 10% of cases lead to meningitis, and something like 20% of those result in permanent disability (such as hearing loss, epilepsy, learning disability, and behavioral problems). Finally, the main danger of rubella is associated with congenital rubella syndrome, a terribly sad condition that affects infants whose mothers contract rubella early in pregnancy.

Even discounting this last one, a low-side estimate is that about 1.7% of people who get measles, mumps, or rubella will suffer some form of permanent disability as a consequence. So I’ll discount the “primary exposure” rate to only E = 0.017 \times 15000 \approx 260/year.

This number should be compared to the rate of vaccine-induced disability, which is something like 4 instances/year (given that about 4 million people get the MMR vaccine per year, and about one per million gets a permanent disability from it).  Comparing these rates gives an estimate for the relative disease risk:

R \approx 70.

Pause for a moment: this is a large number.

It implies that the risk associated with actually contracting measles, mumps, or rubella is at least 70 times larger than the risk from the vaccine.

This is true even with the relatively low incidence of cases in the US, and even with the relatively robust “herd immunity” produced by our 92% vaccine coverage. R = 70 is also a low-side estimate – there are a number of other disease-related complications that I haven’t taken into account, and I haven’t made any attempt to account for the unpleasantness of getting a disease that you eventually recover from without permanent disability.
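
For the record, here is the back-of-envelope arithmetic in one place (a sketch using the numbers quoted above; it lands in the mid-60s, which the text rounds to roughly 70):

```python
# Back-of-envelope reproduction of the estimate above, using the numbers quoted in the text.
cases_per_year       = 1200            # combined measles/mumps/rubella cases per year in the US
non_vaccination_rate = 0.08            # ~92% MMR coverage
exposures_per_year   = cases_per_year / non_vaccination_rate     # ≈ 15,000 initial exposures/year
disability_fraction  = 0.017           # ~1.7% of cases lead to permanent disability
E_effective          = disability_fraction * exposures_per_year  # ≈ 260 disabling exposures/year

mmr_doses_per_year   = 4e6             # MMR vaccinations per year
vaccine_risk         = 1e-6            # serious side effects per vaccination
vaccine_harm_per_year = mmr_doses_per_year * vaccine_risk        # ≈ 4 per year

R = E_effective / vaccine_harm_per_year
print(f"Relative disease risk R ≈ {R:.0f}")   # prints roughly 64; the text rounds this to ~70
```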

Given this large value of relative risk, we can safely conclude that the current non-vaccination rate in the USA, x \approx 8%, is way too high. At such a large value of R, both the altruist and the self-interested person will agree that universal vaccination is the right thing to do.

Even the “confident self-interested” person, who believes that their child has no chance of being a point of primary exposure to the disease, will agree that the current vaccine coverage is too low to justify non-vaccination. Only at x less than 1% could such a calculation possibly justify non-vaccination in the present-day USA.
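
To put a number on that last claim, here is a small sketch of the “confident” threshold from equation (3). The values of n are my own assumption (the measles basic reproduction number is often quoted as roughly 12–18, but I include smaller values too); the post itself does not pin n down:

```python
# The "confident self-interested" holdout threshold x = 1/(n*(1+R)) for R ≈ 70.
# The values of n are my assumption; the post does not specify n.
R = 70
for n in (2, 4, 12, 18):
    x_threshold = 1.0 / (n * (1 + R))
    print(f"n = {n:2d}:  non-vaccination is justified only if x < {100 * x_threshold:.2f}%")
```

For any plausible n, the threshold sits well below the current x ≈ 8%.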

 

Conclusion

I went through this analysis because I believe that, at a theoretical level, there is room for a conversation about weighing the risks of vaccination against the benefits. It is true that in a relatively healthy population that is herd-immunized against outbreaks, a vaccine’s side effects can be a more real risk than the disease itself. It is also worth understanding that in such situations, the incentives of the altruist (who wants to minimize the risk to the world at large) are not perfectly aligned with the incentives of individual parents (who want to minimize the risk to their own child).

But in the present-day USA, these choices do not appear to be at all difficult, and there are no thorny theoretical issues to worry about. Our vaccines remain safe enough, and the disease risks remain large enough, that any level of rational quantitative thinking, self-interested or altruistic, leads to the same conclusion.

Vaccinate your kids.

 

(Unless, of course, you know that your child has some pre-existing medical condition that makes vaccination unsafe.)

9 Comments
  1. February 16, 2016 8:36 pm

    Thanks for this very nice analysis.

    • Brian
      February 16, 2016 8:40 pm

      You’re welcome! I hope that people find it useful or informative.

  2. February 18, 2016 12:55 pm

    Outstanding piece of work, sir!

  3. March 24, 2016 2:02 pm

    Amazing idea, get it published, sir.

    • Brian
      March 24, 2016 7:37 pm

      I actually tried to get it published on some popular website, but it was apparently too much math for prime-time.

  4. Lee
    July 29, 2016 11:24 pm

    I am wondering if you are basing mortality and disability on statistics that are current and based on those countries where they still have sizable outbreaks of these illnesses, or if you went to the cdc website, and looked at historical US numbers for mortality and disability?

    http://www.cdc.gov/measles/about/history.html

    According to the CDC website, in the United States, in the decade prior to widespread vaccination for measles, somewhere between 3 and 4 million people per year contracted the disease, and there were an estimated 400 to 500 deaths per year from measles complications. Now, I am not a math gal, but that is not a .3 percent mortality. You can’t use mortality and complication statistics from countries where they don’t have the level of healthcare and sanitation that we do. Poverty and malnutrition create much worse outcomes for those individuals.

    Further, while I respect what you are doing here, the data on vaccine related deaths and disability is suspect, because it entirely relies on doctors to report, or parents or victims to report to the VAERS database, which they may or may not do, because they may or may not attribute something to the vaccine as opposed to some other cause, like SIDS for example.

    Also, I think that the outlook would be different for different vaccines. Let’s say the chicken pox vaccine. We all lived through it, and I don’t know a soul who suffered any significant harm from it, which is purely anecdotal, yet you can see where I might have some reluctance to vaccinate for it when it seems that it is being pushed by the big pharmaceutical companies as if chicken pox is suddenly a dreadful affliction on humanity?
    I don’t want to bore you, but I will say that one of the mistakes that people such as yourself often make is believing that people such as myself have no logical basis for believing what they do, and have not or are incapable of looking at something in a dispassionate and analytical way. Because everyone thinks that if you skipped a vaccine, you must be listening to Jenny McCarthy. Given that those who do not or selectively vaccinate are generally well educated, you can be certain that a cost-benefit analysis was done.

    So I respectfully submit that while your statistics are excellent, your data you are drawing from is not.

    • Brian
      July 30, 2016 9:08 am

      Hi Lee,

      Thanks for your comment. All my statistics come from CDC data for the USA in the most recent years available (2012 and later). The links are embedded above.

      It is certainly not my intention to claim that non-vaccinators are being illogical or irrational. Indeed, part of the purpose of this article is to make clear that the question of whether or not to vaccinate can be somewhat subtle. I personally wasn’t sure what the right answer was (from the parents’ perspective) until I did an actual quantitative analysis.

      As I see it, though, the analysis has a pretty clear conclusion: vaccination is at least 70 times more likely to help your child than to harm them.

      • Lee
        July 30, 2016 9:56 am

        The numbers you are using are mortality from measles on a world wide basis. For the reasons I stated, and if you check the link I included, if we in the US were to hypothetically stop vaccinating for measles entirely, re-introduce it, the mortality HERE in the US would be significantly less than the mortality that is stated on the CDC website which includes poverty stricken countries with poor sanitation, poor nutrition, and poor healthcare options. Further, the VAERS system has an inherent flaw, in that it is a self reporting system. Self reporting systems are cheap and easy but they carry the risk of under reporting.

        I don’t think that you specifically are accusing anyone of being irrational, I am simply pointing out that the situation you are looking at is more nuanced than your math has taken into account.
