William H. Foege’s book House on Fire is about his role in the global effort to eradicate smallpox. It’s one of those books that has something insightful on almost every page.
Implementation takes time
A vaccine against smallpox was first developed by British physician Edward Jenner in 1796. The last case of smallpox in the world (outside a lab) occurred in 1977. Two dates, separated by almost 200 years.
Let’s pause first and marvel at the mere existence of a second date. It means that smallpox, the disease that killed hundreds of millions, was eradicated from the face of the earth (except for samples stored in high-security biolabs). To date, it’s the only infectious disease affecting humans that has been eradicated.
On the other hand, it took almost 200 years to eradicate smallpox after the vaccine became available. At first sight, this delay feels surprising. A common intuition is that invention is the main bottleneck on the path to having great things. According to this logic, once a vaccine was discovered, global eradication should have followed relatively soon.
The almost 200-year gap shows that implementation can be just as important as invention. Jenner’s achievement was amazing. Having said that, had Jenner not made his discovery, another brilliant physician, scientist, or tinkerer would most likely have developed a vaccine shortly after. In fact, Benjamin Jesty, a British farmer, had already used the cowpox virus to vaccinate his family against smallpox in 1774, preceding Jenner by 22 years.
Invention is a strong-link problem. It doesn’t matter how many dead ends you run into as long as one approach finally works. Implementation, on the other hand, is a weak-link problem: every element in the chain must be good enough; otherwise, global eradication won’t succeed.
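To get a rough feel for the difference, here is a small illustrative sketch of my own (not from the book). It treats invention as a series of independent attempts where a single success is enough, and implementation as a chain of factors that must all hold at the same time; the uniform, independent probabilities are a simplifying assumption.

```python
# Illustrative sketch only: assumes independent attempts/factors,
# each with the same probability of success.

def strong_link_success(p: float, n: int) -> float:
    """Invention: n independent attempts; only one needs to work."""
    return 1 - (1 - p) ** n

def weak_link_success(p: float, n: int) -> float:
    """Implementation: n factors in a chain; all must be in place."""
    return p ** n

# Fifty attempts with only a 10% chance each still almost guarantee one success...
print(f"strong link: {strong_link_success(0.10, 50):.2f}")  # ~0.99

# ...while a chain of 100 factors, each 99% likely to hold,
# stays intact only about a third of the time.
print(f"weak link:   {weak_link_success(0.99, 100):.2f}")   # ~0.37
```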
When you have hundreds or even thousands of factors to get right, it takes time to figure out all of them.
The following examples illustrate this point.
Edward Jenner took material from cowpox lesions on the hand of a milkmaid and applied it to a wound on a boy’s arm. This approach worked for a single case, but it was not scalable. The quality of the vaccine was unreliable, and the administration method was slow (and unreliable too). Successful eradication required a stable and reliable vaccine supply, as well as ways to administer the vaccine to entire populations. The ability to do mass vaccinations quickly, in turn, required new tools (e.g., the bifurcated needle, the Ped-O-Jet injector) as well as new approaches (e.g., vaccinating people in a central location rather than in their homes).
A reliable vaccine supply can be available, and scores of volunteers can be ready to vaccinate the relevant population, but vaccination won’t work if the target population cannot be identified. This, in turn, requires that the country’s health authorities build out a system to identify and report outbreaks. The existing state capacity might be insufficient, requiring investments to modernize the existing health infrastructure. Existing labor laws that hinder the hiring of sufficient personnel might need to be changed, along with many other factors we might not even think about.
It’s these complexities, and the requirement for many factors to be simultaneously present, that explain the long road to eradication. On the flip side, smallpox needed to be eradicated only once. If we had to do it again today, it’s possible we might not succeed. But we don’t have to do it again because everything was in place for a brief period in the late 1970s, gifting us a smallpox-free world.
The role of financial incentives
Toward the end of the eradication effort, people who reported new cases were entitled to a monetary reward. Great idea in theory, though not one without practical difficulties. One difficulty was that some health workers who received a report would claim the reward for themselves. The second-order effect was that people lost their motivation to report new cases: they knew the reward would be collected by someone else, so why bother? The solution was to pay a reward to both the original informant and the health worker who investigated the case.
A potential concern when paying rewards is that we run into Goodhart’s law: when a measure becomes a target, it stops being a good measure. If you pay people to tell you about new cases, they might be incentivized to create new cases. Not necessarily by intentionally getting infected, which was both logistically difficult and potentially lethal, but by relocating a smallpox victim to a new village and reporting the case in the new location too. This scheme wasn’t sustainable, however: villagers and health workers had good local knowledge and quickly recognized non-locals who had been moved there in the hope of a financial reward.
The motivation to collect the reward must have been strong given the amounts involved, especially as global eradication drew closer. Foege mentions that the initial reward was 10 rupees. According to the WHO’s monograph Smallpox and Its Eradication, the reward in most Indian states was 50 rupees in early 1974, rising to 100 by the end of the year and to 1,000 in July 1975. Finally, when the world was believed to be smallpox-free, the WHO offered a $1,000 reward if someone could report a new case.
The increasingly large rewards reflect the marginal value of discovering and eliminating a smallpox case. When there are many cases, discovering an extra case doesn’t fundamentally change the trajectory of the eradication effort, so it’s not worth a large reward. On the other hand, eliminating an extra case when there are only a few cases left is very valuable.
At the time of the Indian eradication effort, Foege was a CDC epidemiologist seconded to the WHO. He advocated for monetary rewards early in the program, calculating that the budget would be sufficient to cover the expense. His local contacts advised against paying rewards that early. Very wisely, it turned out, as the official case counts vastly underestimated the real number of cases.
This episode leads us to the next point: to eradicate smallpox, the program organizers needed to know the truth, however unpleasant.
The truth, and nothing but the truth
Many health campaigns and other well-intended policies go wrong because they make incorrect assumptions about human nature. They set up a system that would be perfect if people behaved as envisioned by policymakers. For instance, policymakers might assume that virtually everybody will want the vaccine and will make themselves available when the vaccinators arrive in their village. But if some people don’t get vaccinated because they are traveling, don’t know about the initiative, or simply don’t care, a mass vaccination effort that requires very high coverage to be effective might fail. One should try to build a system that works even with imperfect compliance.
In India, cases were tallied via a passive reporting system, whereby infected patients would report their cases to the health authorities. In the state of Uttar Pradesh, 437 cases were reported in September 1973 via this system. Foege’s initial belief that monetary rewards could work was based on this reporting source. If there really had been fewer than 500 cases, he might have been right. But when they switched to the active search and containment strategy, they found 5,989 new cases the next month. Passive reporting massively undercounted cases: smallpox patients had no incentive to report, so they didn’t.
Learning and adapting quickly
No system, no matter how well thought out, will work perfectly the very first time it’s implemented. That’s fine as long as the program organizers can iterate quickly and improve the system over time. In India, active searches took place once every month. This meant that lessons could be shared, and strategies updated, before the next month’s search.
Limited resources meant that not every household could be visited monthly in every village. The searchers needed to prioritize. They identified groups of people who they thought would give them the most useful information—people like schoolteachers, village leaders, and mail carriers. They also added another element to their search strategy, visiting two households at random in the eastern, western, and central parts of each village. In later searches, they updated their strategy, directing more of their efforts at talking to schoolchildren, who turned out to be very useful information sources.
The ability to adapt to changing circumstances was an obvious strength of the program. This was only possible because the leadership at all levels was composed of problem solvers who, importantly, were not hamstrung by too many unnecessary rules.
Capable people and the authority to act
The northeastern Indian state of Assam experienced a smallpox outbreak in 1974. Americans like Foege were not allowed to visit the state. Despite the anticipated difficulties, Foege decided he would try to travel anyway. At Calcutta airport, he was refused a flight ticket to Assam. Traveling with him was Dr. Diesh, a key member of the Indian smallpox team. A mere 45 minutes before the flight’s scheduled departure time, Dr. Diesh spotted an old friend, who happened to be a minister in Assam and could pull the necessary strings to get Foege on the flight.
Thinking on one’s feet was found at all levels, as the case of Dr. George Lythcott also illustrates. He was a pediatrician working on smallpox eradication in the CDC’s Nigerian office in the mid-to-late 1960s. On one occasion, unable to get hold of a head of state in time, he contacted the man’s mistress to secure an important signature. On another occasion, he was traveling on an expired visa. At the immigration desk, he charmed an officer, borrowed her pen, and extended the visa himself.
The game-changing idea of having search and containment as the primary strategy (as opposed to mass vaccination) came about as the result of this type of capable leadership. During his time in Nigeria fighting smallpox, Foege found himself without enough vaccines for a mass vaccination campaign. He decided to vaccinate only the 20-30 households that surrounded the infected household. If even fewer vaccines were available, only members of the same household or close contacts would be vaccinated.
All these stories are about resourceful people solving problems as they arise. Crucially, not only were they capable leaders, but they were also in a position to react quickly and make decisions on the spot. They were not held back by institutional sclerosis, too many unnecessary rules, political infighting, and other factors that frequently prevent capable people from making things better.
Other interesting things
Vaccination had a favorable reputation in Nigeria, which helped with the smallpox eradication effort. This might have originated from people getting penicillin injections in the late 1940s against yaws, a bacterial infection primarily affecting children. Even though it was the penicillin that cured people, not the form in which it was administered, people came to associate injections with a cure.
Before a vaccine was available, variolation was practiced in many places. This meant infecting a person with what was intended to be a mild case of smallpox. Variolation was a source of income for many. As such, vaccination created unwanted competition. Foege mentions that there were cases of variolators trying to harvest scabs from smallpox patients to prevent the virus from going extinct in a community.
This is an interesting example of how businesses (and people in general) react to change: 1) some try to block positive change due to self-interest, 2) some accept change (happily or grudgingly) and use their skills in a related domain (e.g., consulting on chickenpox and other rash diseases), 3) some accept change and move on to something completely different.
Foege makes an interesting critique of church groups. While acknowledging their useful role in providing medical services, he was disturbed by the fact that they prioritized curing diseases over prevention. One reason, he argues, might have been visibility: a disease that a child never contracted was invisible and brought no recognition to the church group. Conversely, curing that same child of the (preventable) disease was highly visible. Invisible prevention led to better medical outcomes, but visible treatment led to better proselytizing. In Foege’s view, the revealed preferences of the church groups showed they cared more about proselytizing than about maximizing health benefits.
Another instance of Foege’s quick thinking led to a humorous encounter. They were driving in eastern Nigeria in 1967, at a time when roadblocks were common. Approaching a roadblock, they realized their brakes didn’t work. Running the roadblock not being an option, the driver swerved the van into a ditch and hit a small tree. In the resulting commotion, the village chief demanded a payment of 10 shillings (a low sum), claiming that the tree was sacred and they needed to make a sacrifice to pacify the gods. Foege countered that it was the van that was sacred, and the villagers should pay him 20 shillings, the price of a sacrificial goat, to placate the gods. There was some laughter, no money changed hands, and Foege was quickly on his way again.
In the Indian eradication campaign, epidemiologists had access to a car and a driver. In late 1973, they started noticing that petrol was regularly stolen from vehicles. Identifying the culprits was not easy because the drivers didn’t stay at the same place as the epidemiologist, so it wasn’t clear to what extent the missing petrol was the result of commuting or plain theft (by the driver or someone else). To realign the drivers’ incentives, they had accommodation booked for them wherever the epidemiologist spent the night. Fuel levels were measured both in the evening and the following morning. Drivers had to pay for any non-trivial difference, incentivizing them to secure the fuel tank against third-party theft (and to stop stealing).