Health and Longevity.
THE 2016-2017 FORTNIGHTLY REVIEW serial.
Serial Preface and Contents with Publication Schedule
By ALAN MACFARLANE.
FOR ALMOST ALL OF HISTORY, humans, like other animals, were a rather defenceless prey for numerous pathogens, whether through vectors like mosquitoes, or the various bacteria and viruses which people could not see. Consequently, from one point of view, human history until about one hundred and fifty years ago could be written as a long and usually unsuccessful struggle against pain and death.
Many societies are still what anthropologists call ‘disease-logged’. Having lived in such a society in Nepal, where initially there were practically no doctors, dentists or medicines, and almost everyone was suffering from some sort of pain almost all the time, I realise how fortunate those of us are who now live in the prosperous parts of the world with good medical facilities.
There were, of course, in many civilisations, traditional medicines and people often hit on environmental solutions which had very large effects—boiling water, reasonable housing, clothing and shoes, the drinking of tea and beer, which lowered the toll of various diseases. Yet without being able to understand the causes of disease, and without the possibility of devising powerful anti-bacterial and anti-viral protections, most of life was precarious.
The most vulnerable period was at birth and in the first year. In many traditional societies people lost a quarter or more of their infants, as well as many mothers. Those who survived to ten years old had a good chance, in the absence of serious outbreaks of the great killers like smallpox, cholera, plague, tuberculosis, influenza or malaria, of surviving into their fifties or even sixties. But then, with bodies weakened by heavy toil and no effective cures, even a minor illness could cause death, so that many people died in their sixties or seventies.
In such a situation, there was really very little that doctors could do, for they knew little more than did ordinary people, and there was really no point in institutions such as hospitals, which usually killed more people than they cured.
Three revolutionary developments in the hundred years from the middle of the nineteenth century changed all this, and a fourth current revolution is extending those changes.
THE FIRST HUGE CHANGE is in accurate knowledge. The story of the discovery of bacteria by the use of microscopes by Pasteur and Koch, and soon of the causes of malaria, bubonic plague, TB, cholera and other dreadful killers is well known and does not need to be rehearsed here. By the start of the First World War, much of disease causation had been mapped out. Yet it was not until the antibiotic and vaccination revolutions from the 1930s that the new knowledge could be applied through a wave of new medicines.
These two revolutions combined with a third, which was the growth of a trained medical profession centred on hospitals. After the public health reforms of Chadwick and others, and Lister’s introduction of antiseptic methods, hospitals could become places for healing rather than killing. So an era began when health care was institutionalised and the medical model of referring people to specialists, clinics and hospitals took hold.
The current revolution starts with the discovery of the structure of DNA by Crick and Watson in 1953 and the subsequent mapping of the human genome. Precision genetic manipulation, using techniques for replacing faulty DNA, along with other great breakthroughs such as monoclonal antibodies, stem cell research and three-dimensional imaging, means that we are currently going through another medical revolution.
Of course, it is a war against sickness which can never be decisively won for several reasons. One is that bacteria, viruses, prions and other microscopic forms can mutate faster than humans and each generation of drugs soon becomes ineffective. Another is that death cannot, as yet, be put off for ever and the nature of the killers shifts from the traditional diseases to new, age-related ones, like dementia, Parkinson’s, various cancers, heart attacks and strokes.
Furthermore, new medical problems such as diabetes and obesity, newly emerging diseases like Ebola, and new infections such as AIDS are constantly changing and challenging our ability to improve health and avoid disaster.
Yet things have changed, and now in the more privileged parts of the world, and particularly in the richer segments there, an infant can expect to live into his or her nineties, whereas (largely because of high infant mortality) in most of the past, and still in many parts of the world, the expectation of life at birth is a third of that.
All this may be counted as a blessing and we have come to expect pain alleviation, if not cure, as a human entitlement. We have also come to realise that the vast proportion of diseases and the high continued death rates in much of the world are preventable, not so much by hospitals and doctors but by good food, clean water and a cleaner environment.
WHAT IS NOT so often realised is that, as always, as we solve one problem it generates others. An obvious one comes under the heading of the revolution of rising expectations. We expect to live longer and to be cured and put pressure on our governments to facilitate this. This is combined with a population where there are many people living into their eighties and nineties.
Previously, through most of history, the population pyramid of most societies showed almost everyone being in the First and Second Age (childhood and work), with a few lingering on into their sixties and seventies. Now we have a situation, which is accelerating, where we have seen a change to the four ages of man (childhood and education; work; independent retirement; dependency).
Demographers and economists who have pointed to this very recent phenomenon have often warned us of the dangers of ageing populations, as in most of Europe, Japan, America and now in China. Yet their arguments are weak.
They worry that the young will not, through their labour, be able to support a growing ‘burden’ of old people over the conventional age of retirement and consequently there will be a collapse of the economy. This argument is mistaken for at least two reasons. The first is that it is derived from a previous, labour-intensive (farming and factory) economy where indeed, after the late fifties, the body could do less and people retired. We have moved away from such a world, as I explained earlier, to a knowledge, machine and service economy. A seventy- or even eighty-year-old with a computer can do just as much good ‘work’ as a person in their forties. It is a matter of expectations and organisation.
The second, connected, fact is that there is less and less of the old style of work to be done—as shown above. The problem is leisure and how to fill it, and the old-age-heavy pyramid we now have is no threat here. If we add to this the potential improvement in health provision, which means that those in the third age can keep fit and well, then this is a false worry.
Yet there is another consequence which requires much more thought and has hardly been addressed, what we might call the ‘health-bucket-with-a-hole-in-it’ syndrome. Let me explain.
THE HEALTH SERVICE now current in Western countries was built up from the middle of the nineteenth century on a model based on other institutional structures. Faced with the problem of mass education, you put people into schools and universities where they were taught from the front by teachers. Faced with the problem of spreading religious truths, there was a similar approach, with churches and ministers. With law it was the same: a court building, lawyers and judges. With producing goods, again there were the factories, the foremen and managers. The general aim was to achieve efficiencies and economies of scale by herding people into buildings, with one or two to guard or instruct them.
This worked reasonably well for a while and, in the case of health, was coupled, as the economies grew, with the provision of health care, either for free or through insurance, for most of the population. Healthcare became a vast, institutionalised business. Yet we are all aware that the system is collapsing.
Hospitals are running out of money, doctors and nurses are overworked, people are dissatisfied. It is obvious why this is the case: the needs on which health care is based, including mental health and social problems, are infinite. This is especially so when we remember that doctors, clinics and hospitals are also, for many, especially the retired, a way of passing the time and a chance for human company and conversation in our often lonely world.
So, like population, unless checked the number of patients grows exponentially. As Ivan Illich long ago argued, through their diagnosis and their need for patients and the money to be made from making and selling drugs, doctors and pharmaceutical companies create patients, in the same way as police create criminals, teachers create pupils, missionaries create sinners and intelligence services create terrorists.
Yet there is, as the population ages and new medicines and treatments become available, an ever-expanding perceived need. What can be done to deal with this? Here I will discuss three approaches, two of which seem to me to be useless, and the third a possible way forward, not just for the affluent societies of the West or the richest in India and China but also for what used to be called the Third World, where per capita income is still less than one tenth of what it is in America or Britain. There, and increasingly even within the richest countries, it is not possible to pour vast resources into institutions.
THE FIRST APPROACH, advocated by most politicians partly because it wins votes especially from the old, and obviously by the health lobby of manufacturers of medicines, is to spend more money. Increase the health budget at well above the rate of inflation or GDP growth, build more hospitals, buy more ambulances and expensive equipment, recruit and train more doctors and nurses, increase the budgets for ever more expensive medicines, provide more and more care to keep people alive as long as possible. The same argument is used by every group—more money for the exponential road network needed for cars, for universities for everyone, for a larger and better equipped police or Armed Forces. Yet few of these other needs touch everyone in their daily living in a way that health and sickness do, so they often have less influence.
In all cases, but particularly with the health system, seldom do such politicians or the lobbying groups stand back and ask where it will end. For it is obvious that if we spent the whole national budget on health—closing schools, shutting down all the social services, winding up the army and foreign aid—it would still not be enough. People would still be ill, drugs would still be getting more expensive, hospitals would still be stuffed with people who wanted treatment, doctors and nurses still overworked. Thus more money without thought is useless—it is no longer the way, if it ever was.
A SECOND APPROACH which has been tried, in order to make the health service more efficient, is to treat it as a business, a factory, which produces things. In such a factory, sick people come in on a conveyor belt, move along it and are ejected as ‘cured’. If this is so, then surely the apparently effective management and production-line methods that revolutionised American business in the twentieth century can be applied? This leads one towards managerialism.
The way to deal with the problems is to take power and decisions away from the people who give the care—doctors and nurses—and hand it to management consultants. The same model is applied to schools, universities, the law, prisons and all the other institutions inherited from the nineteenth century.
In all these cases, there will, it is argued, be a double benefit. We can see this with health. The doctors and nurses no longer have to divide their time between administering and healing, but can concentrate on the healing side, where they are qualified. And, it is claimed, there will be efficiency gains, because doctors and nurses were never trained in business schools to run an organisation—they do not know the jargon or the way to produce forms and transparency. Running something like a company or factory has to be taught in a serious course, so the top of each hospital should have a thick wedge of ‘managers’, very well paid, obviously, since it is such a responsible and difficult job. Since they control the wage bargaining, they can ensure that their expertise is suitably rewarded. The same is true of Vice-Chancellors, Principals of big schools and so on.
This business model was the favourite from the 1980s onwards, combined with the idea that doctors needed to be trained for longer and longer (a specialist will not attain his or her full licence until the mid- or late thirties), that nurses must have university degrees, and so on.
Alongside this was the idea that public institutions are inefficient, and that if the services could be privatised there would be big cost savings. The new administrators, as well as the ‘third way’ politicians, liked the idea of private-public partnerships. They also liked the idea of borrowing huge sums from the future through Private Finance Initiatives, which would not start to cripple institutions until they themselves had moved on.
We know what happened. People who knew little about health made the decisions; the doctors and nurses found themselves spending more time on administration than before as they serviced the administrators; there was a loss of commitment and involvement by the health staff; the administrators decided on ineffective and very expensive schemes (computerised records, etc.); and competition over league tables and efficiency targets, and an obsession with budgets and transparency rather than the alleviation of sickness, absorbed most of the energy. In other words, what happened in all the professions at this time—the law, police, universities, schools—where administrators were introduced and the institutions became more inefficient and costly, occurred in health too. So this does not seem to be the solution.
WHAT IS THE ALTERNATIVE? Let me first suggest a few of the models and ingredients which seem part of the prescription. One is derived from the experience of the civilisation which has the longest continuous development of health care for a vast population, namely China. The Chinese population has traditionally been remarkably healthy and has given to the West many of the medicines and techniques which relieve much pain—tea, ginseng, rhubarb and, recently, a new class of anti-malarials based on Artemisia. Along with acupuncture and many other traditional cures and medicines for colds, cuts, bruises, stomach and other complaints, which, from my own experience, seem to work effectively, the Chinese have built up a large, non-hospitalised medical system.
China has a huge population and was not, per capita, wealthy enough to set up a Western-style institutional healthcare system. Nor did it have the institutional tradition of large residential buildings with experts. Such a system of ‘asylums’—in schools, universities, prisons, churches—was only really introduced by missionaries and other Westerners. So when it came to dealing with the health crisis after the devastation of the Sino-Japanese War and the Civil War, when the Communist Party came to power in 1949, the move was not towards hospitals or highly trained doctors, which would not, in any case, have been affordable. As in other experimental places such as Cuba, the approach was informal, local and practical—in China widely known as the ‘barefoot doctor’ system.
This meant that millions of health workers—given a very short training in basic health care, paid very little, peripatetic and carrying only basic medicines—went across a country of three quarters of a billion people and brought healthcare to the remotest villages and the densest of cities.
It seems that, as I have observed, in the face of increasing demand and a huge population, a modified version of this barefoot-doctor scheme is being revived in China. In the experimental areas where it is being tried out, each small village has a little clinic, staffed by a lightly trained and modestly paid worker aided by voluntary or part-paid inhabitants—retired city-dwellers, retired nurses and doctors—who help with diagnosis and prescribing for the ninety per cent of problems which occur at the village level and can be dealt with there. They have the time and the local knowledge, and may live locally, so they can meet the need for reassurance and conversation which takes most people to doctors’ surgeries and pharmacies.
Only in the ten per cent of cases where the symptoms look serious and the workers do not know what is wrong are patients referred to a small hospital in the local town. This institution can then filter out, with simple equipment and lower-level doctors, another ninety per cent of those referred. This leaves only one per cent of all the people who originally came for healthcare to be sent on to major, specialist hospitals for difficult diagnosis and treatment. This is a model for India, Africa, South America and elsewhere.
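The arithmetic of this two-tier filtering can be sketched in a few lines (the function name is my own, and the round percentages are the essay’s illustrative figures, not empirical data):

```python
# Illustrative sketch of the two-tier referral cascade described above.
# The 90%/90% filtering rates are the essay's round numbers.

def referral_cascade(patients, village_rate=0.90, town_rate=0.90):
    """Return (handled at village, handled at town hospital, sent to specialist)."""
    at_village = patients * village_rate   # dealt with in the village clinic
    referred = patients - at_village       # the remaining tenth go to the town hospital
    at_town = referred * town_rate         # nine in ten of those are filtered out there
    to_specialist = referred - at_town     # only one per cent of the original total remain
    return at_village, at_town, to_specialist

village, town, specialist = referral_cascade(1000)
print(village, town, specialist)  # 900.0 90.0 10.0
```

Of a thousand people seeking care, nine hundred are handled in the village, ninety in the local town, and only ten ever reach a major specialist hospital.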
IF WE TAKE this story to the, as yet, more affluent Western societies like Britain, we can add two further recent changes. The first is the effect of the Internet. There is now a vast amount of information, often of very high quality, about most diseases and symptoms available on a number of websites. Many people consult Wikipedia or particular health websites when they have a symptom, and then either go to the pharmacy, or take their self-diagnosis print-out to the GP for checking and confirmation.
This is just the start of a revolution in self-healing, which, with encouragement and some guidance, could expand. It is infinitely cheaper, more personalised, and probably better at delivering all-round health than our inherited system based on highly trained and highly paid professionals in expensive buildings, with hugely expensive equipment and medicines.
The second element is the frequently mentioned feature of leisure in the third and fourth ages. I know that in my small fenland village near Cambridge there are a dozen or more retired or semi-retired individuals, or spouses, or others who already have the skills—perhaps as retired doctors, nurses or paramedics—or who could easily be trained to a reasonable level in a short time, to deal with much of the burden of milder sickness. This kind of localised approach has long been practised in local government, law and the church, with magistrates, church helpers and parish councillors working for free, out of interest and as a contribution to society. ‘Village Colleges’, to which my children and grandchildren went up to the age of sixteen, are another part of this localisation process.
In other words, what could be created is a kind of ‘Neighbourhood Health’ system, a term derived from the self-policing movement of ‘Neighbourhood Watch’, in which people living nearby take on amateur policing functions. So we could develop a system where each community—parish or part of a city—would look after most of its own health problems in very local small clinics and with easy visiting of the sick or needy. A small supply of resources, encouragement and mentoring could work wonders—and if problems of confidentiality emerged, there could be a system of people working in nearby, but not personally known, villages or districts.
Obviously such a solution could apply not only to physical sickness, but also to all those other related problems, mental health, delinquency, loneliness, disability. This would not only be far cheaper and more effective, but, returning to the idea of civil society, it would strengthen community identity, self-government, ‘health’ in the wider sense.
Again I have seen the beginnings of this in my little village, where a recently built small community hall—used for meetings, leisure, elections and now even equipped with a small piece of medical equipment (to be used in case of cardiac arrest)—could also easily be a health centre. It is next to the village playing-field, the children’s play area, the church, graveyard, village vegetable allotments, and the local post office and shop. This returning of many aspects of life to the localities, including healing, could be a really significant development far beyond the health role.
It may all sound, once again, idealistic, utopian, unrealistic. It will face many vested interests and older conventional ways of thinking. It will need to be tested and adapted for different cultures, for what works in China will not easily work in the West and vice versa. Yet if we do not think in a radical way as our world changes dramatically, we are just marching into an ever-narrowing valley which will finally trap us. We should think in new ways to fit a world of increasing expectations and technology, but also increasing leisure and information. China is already leading the way, though it has also partly caught the ‘hospitalisation’ disease. So it may be that once again, as with printing, compasses, paper, porcelain, gunpowder, let alone tea and rhubarb, we can learn from the Chinese.
Alan Macfarlane is the author of more than twenty books and numerous articles covering English social history, demography in Nepal and the industrial history of England, China and Japan. His survey text, The Invention of the Modern World, is published by Odd Volumes for the Fortnightly.