Empty arcade off St Mark's Square in Venice
David Harding
7 August 2020
5 minute read

Earlier this year, as the virus spread rapidly in Wuhan, observers, including myself, scrambled to understand its properties and severity. Its severity has to be gauged in relative terms; that is, how dangerous is it compared to other known viral infections, such as influenza? This property can be partially captured by numbers such as the percentage of people who die from the virus after catching it (the infection fatality rate, or “IFR”).

At that time, preliminary estimates were that 4% of the infected were killed by the virus. This corresponds to a very serious disease indeed (for example, the IFR for influenza is normally about 0.1%). It was recognised, however, that this might be an exaggeration, as it was a percentage of those who were ill enough to be treated in hospital. If there were many mild cases in the wider population, the actual IFR might turn out to be much lower. Since it was apparent that the majority of diagnosed cases were mild, the suspicion was that the IFR would indeed turn out to be much lower.

Amidst the tidal wave of data and statistics that the subsequent pandemic has given rise to, it has now been extensively reported that most cases are so mild as to have remained undetected. This knowledge has been used to infer that the true IFR is much lower than 4%, with estimates now ranging from 0.1% to 1%. The US CDC’s latest estimate is 0.6%, updated from 0.28% a week or two earlier.
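The arithmetic behind this revision is simple. A minimal sketch, using invented illustrative numbers rather than actual surveillance data:

```python
# Illustrative only: how undetected mild infections dilute the fatality rate.
# All counts here are invented for the sketch, not actual case data.

deaths = 40
hospitalised_cases = 1_000                    # infections severe enough to be observed
hospital_rate = deaths / hospitalised_cases   # 4.0%, the early hospital-based figure

# If only, say, 1 in 10 infections is ever detected, the true denominator
# is ten times larger and the IFR falls by the same factor.
detection_fraction = 0.1
total_infections = hospitalised_cases / detection_fraction
ifr = deaths / total_infections               # 0.4%

print(f"hospital-based rate: {hospital_rate:.1%}, implied IFR: {ifr:.1%}")
```

The same number of deaths spread over a larger pool of infections is the whole of the argument; everything turns on how many mild infections went uncounted.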

It is no wonder that political opinion is sharply divided. At 0.6%, the coronavirus is six times as fatal as the flu, and highly infective. As a result, the public policy measures adopted fairly generally worldwide seem justified. At 0.28%, they would be substantially less so.

To compare the virus to other public health risks, there is a second important factor. Instead of viewing fatalities as purely binary – you die or you don’t – we should perhaps measure the life years lost. The virus is most fatal to the elderly, who have short remaining life expectancy. This is the same as for influenza and other viral respiratory diseases, but not the case for many other diseases, ranging from measles to malaria.

For example, before the advent of widespread vaccination, measles primarily affected children. The 0.2% of the infected who died lost around 70 years of life expectancy each, compared with only a fraction of that number for coronavirus victims. Measured in life years lost, measles would therefore represent a 25 to 50 times more serious public health threat, based on current estimates. It is understandable that considerable effort has been made since 1968 to vaccinate the worldwide population against this virus and, at the same time, against mumps and rubella.
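The 25-to-50-times figure can be reconstructed as expected life years lost per infection. A minimal sketch, in which the measles numbers come from the text, but the roughly one year of remaining life expectancy per coronavirus death is my assumption, chosen so that the quoted range comes out:

```python
# Expected life years lost per infection = IFR x average years lost per death.
# The measles figures (0.2% IFR, 70 years lost) are from the text; the one
# year assumed per coronavirus death is an illustrative assumption.

def life_years_per_infection(ifr: float, years_lost_per_death: float) -> float:
    return ifr * years_lost_per_death

measles = life_years_per_infection(ifr=0.002, years_lost_per_death=70)  # 0.14

for covid_ifr in (0.0028, 0.006):  # the two CDC estimates quoted earlier
    covid = life_years_per_infection(ifr=covid_ifr, years_lost_per_death=1)
    print(f"IFR {covid_ifr:.2%}: measles ~{measles / covid:.0f}x worse by life years")
```

Under these assumptions the ratio comes out at roughly 23x to 50x, broadly matching the range in the text; a more generous estimate of years lost per coronavirus death would shrink it considerably.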

It has often been the case, as with BSE or swine flu, that the IFR of a new disease has been overestimated in the heat of public concern during the outbreak. Should the IFR turn out to be significantly lower than 0.6% (indeed, at the lower end of the range of estimates), the public policy measures pursued will probably be seen as having been something between counterproductive and disastrous.

On the other hand, what has happened has happened, and it presumably has its upside for people, as governments were effectively driven to follow their strategy by public pressure. Many people and industries have gained in various ways from the policy response; perhaps more than they have lost. The strong sense of being engaged in a collective endeavour has provided something many found missing from their lives. Moreover, the majority of the workforce, companies and investors have been protected so far by government action. Dramatically negative consequences have been inflicted on a relatively small minority: small business owners, isolated elderly people, next year’s university students, working mothers and other segments of the population. This “opposition” has not been substantial enough to resist the force of majority opinion.

It would seem uncontroversial to observe that such a public policy reaction to a disease with these characteristics could never have occurred in the past. Similarly infective diseases, such as measles, mumps and rubella, scarlet fever, typhoid, whooping cough, tuberculosis, cholera, smallpox, diphtheria and polio, circulated for generations in the human population without resulting in such sweeping government measures. The consequences would have been too catastrophic to justify the measures. Perhaps a major reason why we have adopted the policies we have is because we can. The production of the necessities of life has not been dramatically impaired. The majority of people have not so far suffered dramatically negative consequences.

The overriding reason for this has been the technological revolution we have seen in our lifetimes. Technology discovered the virus, traced it, facilitated and maintained communication and home working, and generally made the public policy measures feasible and acceptable to most people. This explanation doesn’t make the measures wrong. For all we know, the lasting consequences may be beneficial to society as a whole, ranging from accelerated trends in technology, to a huge potential drop in common viral infections, due to improved hygiene standards. On the other hand, the measures might not be so beneficial. Bankruptcy, mass unemployment, loneliness and, in parts of the world, famine and death may ensue.

Surely a world that places such a heavy onus on each individual to consider how their actions may affect the health of others must be one that also curtails the freedom of individuals dramatically. It is well known that schoolchildren pick up countless “bugs” at school, for which they act as vectors of transmission. Will this be acceptable in the future? Cities, public transport, air travel, offices, nightclubs, recreational events: all are sources of viral transmission generally. Will we not detect all future mutations of the influenza virus and be morally obliged to prevent their spread?

It seems likely from this logic that the entire human economy will have to be reconfigured, with incalculable consequences. Is there a reason for this to happen now? Perhaps, again, because we can. A level of prosperity has been attained so that for the majority of the human population, this is a “luxury” that we can afford. For centuries, political pressure has been for a rise in living standards, for adequate food, shelter and comfort and the acquisition of status goods. The abstruse quantity known as GDP (gross domestic product) has been the enshrined yardstick of progress, with governments and their economists competing to raise GDP, come what may. Perhaps we don’t collectively want this anymore.

Before 2008, governments cleaved to the idea of free trade as a way of growing GDP, by exploiting the division of labour and the doctrine of comparative advantage on a global level. Printing money by having central banks fund government deficits was taboo. The consequences were what were termed “structural imbalances”: persistent and irreversible trade and capital flows, and the consequent accumulation of massive foreign exchange reserves by many countries. Faced with a banking system collapsing like a house of cards in 2008, the taboo on monetary financing of government deficits was abandoned. Since then, central banks have essentially financed governments at a zero rate of interest to target this semi-mystical GDP measure, with apparent success.

But is GDP growth what we wanted? What does it measure exactly? Does any of us know? It’s true, this effort has kept unemployment down and the stock market up, keeping two major constituencies happy: labour and capital. However, there may be other constituencies in the world, not captured within that simplistic model: faith constituencies, race constituencies, internationalist and national constituencies. They may not have been so happy; popular analysis has suggested they were not. Is all mankind really homo economicus? Did they like what was being done?

Perhaps they didn’t. They said they didn’t. It isn’t at all clear what the world’s rulers should have done instead. Still, the rulers have started to do different things. And now they are going to do more of them.