No word has been more fashionable over the past twenty years than ‘community’; Community Care, Community Nurses, Community Physicians, Community Health Councils, even the poll-tax is beautified as the Community Charge. No ministerial speech is complete without some reference to it, and to the urgency of returning to it tasks and responsibilities previously supposed to have been undertaken by hospitals and welfare agencies. All this seems to assume that everywhere real communities still exist, with the qualities and resources necessary for care.

Do Communities Still Exist?

Population data from national or regional statistics tell us very little about real communities. They have to be studied each in their own right. Coal mining and heavy industrial communities all over the world have long been generally accepted as the supreme example of social cohesion and self-organized mutual aid, which strengthen in adversity. Like other industrial communities, their traditions of mutual help have derived from shared work experience and a necessary unity developed in struggle with ruthless employers. The taproot of this tradition has now been cut by the collapse of the industries from which it grew. Is it now so well established that it has other social roots, and can survive long enough to resume growth in whatever new kind of society is to follow, or will it disintegrate into a demoralized, declassed, socially irresponsible rabble?

Glyncorrwg is the community I know, and is as good an example as any of a community previously employed almost entirely in heavy industry, which has now almost completely lost such employment. In 1966 we did our own occupational census. Of 554 men aged 16-64, only four were of unknown occupation, and 92% had employment of some kind. Exactly half (278) worked for the National Coal Board (69% of them underground), 34 (6%) were steelworkers, and 159 (29%) were employed in other manual work mostly dependent on the coal and steel industries. Of the rest, 45 (8%) were unemployed or disabled and permanently unemployable in local industries, or in long-term institutional care; only 21 (4%) were self-employed shopkeepers, farmers and professionals. It was a population typical of many others dependent on heavy industry, with almost no middle class and more than its share of chronic sickness and unemployment, but prosperous compared with the terrible years from 1926 to 1941, when local male unemployment rates reached 80% in some villages of the Afan Valley. Few women had paid employment outside their homes, many had large families with eight or ten children, shotgun marriages were commonplace and single-parent families almost unknown. The village had its own vigorous social, cultural and commercial life, with three working men’s clubs, four pubs, a church and five competing chapels, branches of the Labour and Communist Parties, one cricket, three soccer and two rugby teams, a betting shop, a cafe, a hairdresser, a newsagent, an ironmonger, two drapers, three sweetshops and four grocers.

Since then the industrial base for employment in all the South Wales Valley communities has been almost completely destroyed. The three coal mines in the Afan Valley were all closed between 1968 and 1970, and by the end of the great strike of 1985-6 all the remaining mines in the adjoining Ogwr, Garw and Llynfi valleys were closed. We now have about 30 men in Glyncorrwg who still travel long distances to work in surviving pits; 93% of the jobs in coal have disappeared. The Port Talbot steelworks has reduced its workforce from 15,000 in 1965 to 4,500 in 1987.

In 1986 I carried out another employment census in the practice. Of 368 men aged 16-64, only 178 (48%) were in full-time employment, including 20 students; 22 (6%) had part-time work or were on government job-creation or training schemes with incomes at subsistence level, 114 (31%) were claiming unemployment benefit, and 53 (14%) were certified as unfit for work. As in 1966, the large number certified unfit reflected not only high sickness rates, but also the serious local employment problems of people with any disability; when there are a hundred applicants for any good job, and with a Disabled Persons Employment Act which has never been seriously enforced by any government, employers have no need to take people with problems. Realistically, therefore, the extent of unemployment can be measured only by including those certified as unfit for work as well as those claiming unemployment benefit, giving us a total of 45% without jobs of any kind, or 50% if we include those on job-creation schemes at subsistence wages.

One small factory was built in Glyncorrwg in the late 1960s, assisted by national policies for redistribution of industry, to provide some alternative employment, mostly for women. Three different enterprises have occupied it in succession; the last has now been here for eight years and looks as though it may stay. Collapse of our local economy led to the Upper Afan Valley having the fifth highest rate of out-migration in the UK in 1971, with a fall in total population from 9,480 in 1966 to 8,640 in 1971, and a continued decline to just over 7,000 in 1987. Of all the South Wales mining valleys, the Upper Afan has probably suffered the most, yet 35% of out-migrants still moved away to other valley communities resembling their own and facing the same ultimate future; 56% moved to towns in the South Wales coastal belt, with better economic prospects; mainly because of prohibitive housing costs there, only 8% emigrated to England (Rees, T.L., The origin and destination of migrants to and from the South Wales valleys with particular reference to the Upper Afan valley, Cardiff: Department of Town Planning, University of Wales Institute of Technology, 1976.). Collapse of basic industry, together with competition first from supermarkets, later from hypermarkets, has almost wiped out the small shops, and accelerated the decline of religious, political and cultural groups. The church and four chapels cling to dwindling congregations, one pub still survives, but we now have only the newsagent and two grocers’ shops.

In Glyncorrwg at least, the answer is that community does still exist; but as in most places where community was strongest and at its best, its continued development or even existence is seriously threatened by destruction of what was always its principal root, male employment in industry.

Changes in the Status of Women

These are the losses, but there have also been gains. There has been a historic shift in the lives of women, and because so much care in the community has been borne by women, this has important implications for community care.

Women’s lives have changed through changes in family structure, control of fertility and increase in paid employment. Family size has declined throughout this century. The army of unmarried women who previously gave lifetimes of underpaid or completely unpaid service to aged parents, the chronic sick and the surplus children of their married brothers and sisters has disappeared. Young mothers have one or two brothers and sisters where their parents had eight or ten. The small nuclear family, established as a social norm in the 1930s, is now a minority phenomenon. Only 26% of all British households now consist of a married couple with one or two children, and only 5% of the total workforce consists of an employed man with a wife and two children at home. Between 1961 and 1980 the proportion of one-person households doubled, from 4% to 8%, and the proportion of one-parent families with dependent children also doubled, from 2% to 4% (‘Families in the future’, Study Commission on the Family, London: SCF, 3 Park Road, London NW1, 1983.).

Oral contraception began to be widely used in the early 1960s. Legal termination of pregnancy became available under the Abortion Act of 1967, and by 1969-72 was performed at an annual rate of 7.8 per 1,000 women aged 15-44 in Glamorgan and Monmouthshire, the centres of South Wales industrial valley populations, compared with a live birth rate of 82 per 1,000 (Jones, A.G., Jones, D.A., Termination of pregnancy in Wales 1969-72, Cardiff: Welsh Office, 1973.). Together with widespread acceptance of male sterilization by vasectomy, rapid though still incomplete progress has been made towards two interdependent social goals: that every child born should be a wanted child, and that no woman should have to continue with a pregnancy she can’t cope with.

Attainment of these objectives could have important effects on the health of both women and children, and there is some evidence that for children at least this may already have occurred. In 1921-25, before the 1926 strike and inter-war years of mass unemployment, infant mortality in the South Wales valleys was 2% higher than the rate for England and Wales as a whole; by 1931-35 it was 22% higher, and by 1961-65 25% higher, than the England and Wales rate; but in the period 1971-72 it fell dramatically to a level 4% below the England and Wales rate, and has stayed there ever since. Levels of income, and every other index of social advantage, have all worsened in the South Wales valleys relative to the UK as a whole, in line with renewed mass unemployment and destruction of our industrial base, but nearly all the valley communities happen to be served by hospitals which have adopted liberal policies in applying the Abortion Act in the NHS, in contrast to areas such as Cardiff and Birmingham.

The third big change for women has been their rapid recruitment to paid employment, mainly in light industries attracted to the valleys by regional employment policies. Whereas labour activity rates for men in the South Wales valleys fell from 81% in 1961 to 72% in 1971, for women they rose from 27% to 33% (still well below the UK average) (‘The role of regional policy in South Wales: with particular reference to valley communities’, Home Office/West Glamorgan CDP Research Team working paper no. 7, Cardiff: Department of Town Planning, University of Wales Institute of Technology, 1974.). For the first time since the conscription of young women to work in munitions factories during the war, they have their own money in their pockets. The effect in both eras has been the same; women’s expectations and confidence in their own strength have risen. But whereas during the war there was a simultaneous rise in the expectations and confidence of working-class men, jointly expressed in the 1945 election which gave birth to the NHS, women are now moving forward while men are in retreat.

Community Care of the Sick

These complex, contradictory trends are important, not only because of the destructive effects of mass unemployment on the capacity of industrial working-class communities to preserve their traditions of social discipline and mutual support, which have always been the most important foundation for the work of doctors and nurses, but also because so much of that support at a personal or family level has depended on women’s unpaid and unregulated labour. It is estimated that about one and a quarter million people in Britain care for sick, disabled, or elderly people living at home (Parker, G., With due care and attention: a review of research on informal care, London: Family Policy Studies Centre, 1985.), and at least three-quarters of these carers are women (Cartwright, A., Hockey, L., Anderson, J.L., Life before death, London: Routledge & Kegan Paul, 1973.); wives, mothers, daughters, grand-daughters, nieces, mothers-in-law, sisters-in-law, daughters-in-law, and women friends and neighbours. This has been the true basis of community care for 24 hours a day, to which GPs and Community Nurses have been visiting adjuncts.

This informal community care structure is, in general, effective and efficient. Wilkes (Wilkes, E., ‘Where to die’, British Medical Journal 1973; i:32-3.), reporting GP experience in Sheffield generally similar to ours in Glyncorrwg, found that in 1972 55% of cancer deaths were still occurring at home, 10% in long-stay geriatric units of various kinds, and the remaining third in hospitals. About half of these patients were thought by their GPs to have no significant pain, suffering or distress. About one-third did have problems, but only for 6 weeks or less. Only 15% had problems for more than 6 weeks, and only 12% made heavy demands on GPs. With important reservations, this generally optimistic view was confirmed by Cartwright, Hockey and Anderson’s large study (through surviving relatives) of the last year of life of 785 randomly-sampled people who died (40% at home, 6% in long-stay units, 54% in hospital) over the age of 15 (Cartwright, A., Hockey, L., Anderson, J.L., Life before death, London: Routledge & Kegan Paul, 1973.). Only a few per cent were bedridden or mainly confined to bed for a year or more, 3% for 6 months but less than a year, 8% for 3 months but less than 6 months, and another 15% for less than 3 months. As Wilkes concluded, 85-90% of terminal illnesses were being coped with well enough to present no obvious problems to GPs attending.

Even so, he described the other 10-15% (of patients and their carers) as suffering ‘deplorable neglect’, and that is a lot of people. The task is more difficult than many attending GPs or even Community Nurses appreciate. In Cartwright’s study, 32% of patients nursed at home were incontinent of urine during the last year of their lives, but 53% of them never discussed this problem with either GPs or Community Nurses attending; 28% were incontinent of faeces, but 40% of them never reported it; and even of the 20% who suffered double incontinence of both urine and faeces, 26% failed to present this as a problem to doctor or nurse. Only 11% died without any previous symptoms, and 63% had symptoms for a year or more before death. At some point in their last year 66% had pain, 36% were mentally confused, 36% were depressed, and 30% vomited or felt sick; altogether 69% had symptoms recalled by relatives as very distressing, but only 47% of these symptoms were ever reported to GPs attending.

Almost one-third (30%) of main carers (‘brunt-bearers’) were in full-time employment before they took on their caring role, and a quarter of these had to give up work, 40% of them for a year or more. One-third of brunt-bearers were themselves over 65, and 14% said that the burden of caring had affected their health. In the Sheffield study (Wilkes, E., ‘Where to die’, British Medical Journal 1973; i:32-3.), 14% of terminal cancer patients dying at home were being looked after by relatives who were themselves aged over 70.

In the Cartwright study, 9% of deaths occurred in people who remained in institutional care for the whole of their last year of life. Of the other 91%, 61% were admitted to hospital at some time during their last year, and 78% had been seen as hospital outpatients. The proportions helped at some time by various kinds of community worker are shown in Table 9.1.

Relatives, friends and neighbours gave nearly all the social care and help at night, most of the help with housework and self care, and much of the nursing care. Wives, husbands and daughters were the most likely people to provide all kinds of care (self-care, nursing care, help at night, social care and housework), except financial assistance.

Table 9.1 Help from various kinds of community worker received by non-institutionalized people during their last year of life

  • GP (all contacts): 96%
  • GP (home visits): 88%
  • Community Nurse: 33%
  • Any church worker: 29%
  • Other Local Authority or voluntary worker: 12%
  • Chiropodist: 11%
  • Other nurse: 8%
  • Home help: 5%
  • Special laundry service: 2%

Community Care of the Elderly

Care of the aged involves even more people than care of the dying, and with less help from the NHS and other social agencies. These informal carers cannot go out to work, their family lives are disrupted for months, years or decades, they may be socially isolated and exhausted, and often they are sick or old themselves. Increasingly, the caring generation comes from small families of only one or two brothers or sisters, often dispersed across Britain or to Canada or Australia, while the proportion of very old people or severely handicapped younger people who survive because of successful medical interventions increases their burden.

Dee Jones (Jones, D.A., Vetter, N.J., ‘Formal and informal support received by carers of elderly dependants’, British Medical Journal 1985; 291:643-5.) studied 1,066 people over 70 living in their own or their relatives’ homes or in local institutions for the elderly in Cardiff in 1984. Nearly one-third (32%) needed help with one or more of 15 tasks basic to daily living, such as washing all over or cutting their toenails; of these, 11% were in old people’s homes and 8% had help from statutory services only. All the other 81% (273 old people in need) depended wholly or partly on care from relatives, friends or neighbours, informal helpers and actual or potential brunt-bearers. Of these 273 people, all but 6 (2%) were able to identify one main carer. Of 256 carers interviewed, 79% were women and 20% were aged over 75. Less than half these carers had had even a few days’ break away from the old person dependent on them during the previous year. Those caring for the most seriously disabled were least likely to have had a break; 40% of those caring for people with dementia had not had a holiday of a week or more for the previous five years. Only 7% of all carers had ever received respite or relief care to give them a break; even of those caring for old people with dementia, only 10% had ever been helped in this way.

Statutory services were not fully reaching many of these people. Only 34% of 117 severely disabled old people were being helped by a Community Nurse, 28% by a Home Help and 20% by visits to a day hospital; as 40% had been visited by their GP during the previous month, many GPs apparently failed to refer needy cases to relevant agencies.

Dee Jones concluded:

It was the consistent and unremitting nature of caring for elderly dependants that was particularly stressful to carers. Respite for carers further exemplified the Inverse Care Law: the more disabled or mentally infirm the dependant the less likely the carer was to have breaks or holidays. . . Carers seemed to be faced with the stark choice of looking after the dependant with minimal support from the community at great cost to themselves or putting them into residential accommodation permanently.

Defensive Investment in Community Care

Governments have good reason to be thankful for families and neighbours who give up large parts of their lives to look after handicapped children and sick or disabled adults. According to estimates by Muriel Nissel and Lucy Bonnerjea at the Policy Studies Institute (Nissel, M., Bonnerjea, L., Family care of the handicapped elderly: who pays?, London: Policy Studies Institute, 1 Castle Lane, SW1E 6DR, 1982.), they saved taxpayers about £6 billion in 1980. The Equal Opportunities Commission estimated that about 750,000 people looked after the disabled elderly at home, of whom three-quarters were women, spending an average of 3.5 to 4 hours a day; the same care provided by home helps or hospital staff (all very badly paid) would cost about £3,000 a year for each dependent person at 1980 prices.

Since the Attendance Allowance and other cash benefits to encourage home care were introduced in the 1970s, there has been at least some official recognition of the extent of the sacrifices made by families, and above all by women, to help chronic sick, elderly and handicapped people in their own homes. 595,000 Attendance, Severe Disablement and Mobility Allowances were paid in 1979-80, totalling £620 million. By 1985-6, 1,185,000 allowances were being paid, totalling £1,130 million. More people benefited and the average value of allowances rose, but only from 8.7% of average male gross earnings in 1980 to 11.3% in 1985 (Central Statistical Office, Social Trends 17, London: HMSO, 1987.). NHS spending on community health services rose from 6.2% of all NHS expenditure in 1980 to 6.5% in 1985; estimated expenditure for 1987 has actually fallen to 6.4%. From 1978 to 1984, the number of Community Nurses rose by 8%, but they had to cope with 12% more patients and 26% more people over 65 (Radical Statistics Health Group, Facing the figures: what is really happening to the NHS, London: RSHG, c/o British Society for Social Responsibility in Science, 25 Horsell Road, N5 1XL, 1987.). Spending on GP and pharmaceutical services, unplanned and unplannable because of independent contractor status, has risen from 15.7% of NHS spending in 1980 to 17.5% in 1985, and an estimated 17.8% in 1987. The argument that they are unplannable not because of independent contractor status but because they are demand-led is not tenable; other demand-led industries are planned, even though demand can never be precisely predictable, but GPs remain a law unto themselves.

In return for these investments in formal and informal community care, the government obtained a reduction in the number of hospitals (almost entirely by closing small local hospitals), and a reduction of 30,500 in the number of available hospital beds, despite a 10% increase in the proportion of people aged 75 or more in the general population (Office of Health Economics, Compendium of health statistics, 6th edition, London: OHE, 1987.). DHSS plans for reductions in hospital beds are being implemented faster than expansion of resources for within-community care; by 1984, 45% of the planned reduction in hospital beds had been achieved, but only 17% of planned day-hospital places and 16% of day-centre places had been provided.

Community Care: Sentimentalized Exploitation, or Materially Assisted Altruism?

Virtually all social and political groups affirm belief in within-community care, just as they profess their devotion to the family. As unpaid or almost unpaid care by relatives or neighbours is much cheaper than any kind of institutional care, there are powerful material motives for this, but it is also true that most people do want to lead as much of their lives as possible independently in their own homes. Attitudes to the appropriate place for expected death are changing: because more may be done, or may appear to be done, for patients in hospitals today than in the past; because families are smaller and more dispersed; and because fewer women are able or willing to be taken for granted as the natural carers for all their sick, handicapped or dying relatives, and too few men have come forward to take their place.

It is wrong to assume that the present proportion of deaths in the home rather than in hospital is necessarily right. It is difficult to get good evidence on what either patients or their relatives would really prefer, since if they can’t have what they like they must like what they have, and bereaved relatives are understandably reluctant retrospectively to criticize management of terminal illness. Only 6% of GPs and 4% of Community Nurses surveyed by Cartwright believed that no expected deaths should be allowed to happen at home, but 14% of bereaved relatives said three months after the death that they would have preferred it to have occurred elsewhere than it did, in hospital if the death was at home, or at home if the death was in hospital, and another 16% were not sure.

The trend is towards death in hospitals rather than the home, and for greater professionalization of care for the sick and handicapped. This can be slowed, halted, or even perhaps reversed by either one of two opposed policies: by policies of sentimentalized exploitation, sustaining obsolete beliefs about the family and the status of women, isolating those who do not comply, and maintaining public opinion in a state of intolerance, blaming the victims of a society organized to give low priority to care of the handicapped, sick and aged; or by policies of materially assisted altruism, with greater investment in material support for carers, and practical steps to encourage men to accept a caring role.

Every study of what actually happens to the chronic sick and elderly testifies to the colossal burdens ungrudgingly borne by the vast majority of relatives, and often of neighbours and friends, and to the rarity of refusal by relatives to accept reasonable responsibilities (the definition of ‘reasonable’ being what critics would accept for themselves). However, doctors, nurses, and even caring relatives and neighbours often blame the effects of bed and staff shortages in geriatric and chronic sick hospitals on the unwillingness of other relatives and neighbours to accept their natural responsibilities. When Cartwright asked GPs the question ‘On the whole do you find in this area that most relatives accept reasonable responsibility for home care or do they seek admission to hospital or institution?’, 51% thought that ‘unwillingness of relatives to look after them’ was the main limiting factor on community rather than hospital care, and 25% thought most relatives sought institutional care. Community Nurses were less likely to blame relatives, but still 33% thought unwilling relatives were the main limiting factor, and 9% thought most relatives wanted institutional care. Doctors’ opinions were not apparently influenced by the actual proportions of their patients who died at home or in hospital, but those who believed most relatives accepted reasonable responsibility did more home visits than those who did not, and had experienced less difficulty in getting admission when it was needed.

Attempts to find relatives unwilling rather than unable to care for the sick, and without some obvious explanation (for example, that a parent was divorced and estranged, or had problems with alcohol or domestic violence), have all been unsuccessful. A careful study by Prof. Bernard Isaacs of 280 patients admitted from their own homes to a geriatric unit in the East End of Glasgow (Isaacs, B., ‘Geriatric patients: do their families care?’, British Medical Journal 1971; 4:282-6.) was typical. It showed that two-thirds were admitted because basic care at home was unobtainable, but the reason for this was that close relatives were either non-existent, were not available through their own ill-health, or were already caring for somebody else. In a few cases personal relations between aged parents and offspring made care impossible, but wilful indifference or neglect by relatives was a factor in less than 1% of admissions.

If communities are driven into a state of siege, their young people demoralized by unemployment or driven away in search of work, their traditions of mutual aid and solidarity expressed through their own trade union and political organizations brought into contempt, and personal acquisitiveness and brutal ambition exalted in their place, then real community care, on which all the machinery of the NHS ultimately rests, could collapse. In some inner-city areas it may already have done so.

However, the process of demoralization is always less than frightened outsiders believe. The traditions of solidarity and mutual aid originally used by Lloyd George to initiate construction of a state primary care service are not yet dead, but they are in a damaged state which will not be revived by sentimentalizing ignorance and brutality, past or present. There are, as there have always been, fundamentally conflicting tendencies even within the strong industrial working-class culture of mutual support in adversity. On the one hand, organized altruism on a family and local community scale has been a precondition for survival; on the other, it has depended almost entirely on the unpaid labour of women, expected always to make themselves available for months, years, or even lifetimes, regardless of their other ambitions or commitments, or of their previous personal relationship with the patient. If encouragement of community care is not to become a cover for abdication from responsibility for adequate hospitals, the social machinery of home care must be materially supported by more realistic payment for long-term care; legal safeguards for jobs during prolonged absence; better child-care facilities and nursery schools; greater readiness to provide short-term institutional relief; more Home Helps; and a higher proportion of men accepting caring responsibilities in the home, as well as an expanded primary care team.

Community Participation in Defence and Extension of Hospital Care

If communities do still exist, can they be drawn into effective alliance with doctors and other health workers not only to defend the NHS when it is under attack, but also to administer the service and minimize bureaucratic control? Is it possible for their primary care teams to become accountable to them, so that general practice in the NHS can develop as a participative democracy, giving organizational expression to a new reality of care as a form of active health production shared between the team and its population?

Hospital closures have been resisted by community groups led by hospital cleaners, porters, nurses and eventually even by doctors, together with local citizens. Occasionally, as in the Guy’s Hospital incident, even Boards of Governors have offered serious resistance to government plans, but as these consist almost entirely of unelected representatives of the business world, this is exceptional. Resistance has sometimes taken non-traditional forms such as work-ins, and there have been many examples of sympathetic action taken by other industrial workers, notably coal-miners, and of mass lobbies of parliament which have certainly influenced government policies. Such community resistance tends to get good media coverage, most of it favourable, particularly when it has forced government to reveal the ultimate brutality of its policies by sending in the police to forcibly remove elderly patients from hospital wards earmarked for closure, as happened in London in 1982. These local struggles, which have occurred everywhere but most of all in the London area, have been extremely damaging to the government, and to the Conservative Party’s cherished image as the natural party of doctors, nurses and socially responsible privilege, though it has to be said that the last Labour government, led by James Callaghan, initiated all these policies of retreat later developed more vigorously by the Conservatives.

The weaknesses of these campaigns have been that they were defensive, often sentimental and backward-looking; that in the interests of unity in an extremely fragmented workforce it was necessary to overlook serious weaknesses at all levels, particularly the weaknesses of the highest professionals; and above all that they were episodic and not sustainable in the long term. Successful resistance in some hospitals usually meant only that the same closures were imposed in others where resistance was weak. In the long run, government could always win, though often at heavy political cost. Resistance was built mainly on past loyalties rather than positive plans for the future, and was always vulnerable to the socially divided nature of the hospital workforce: not only the gulf between the earnings and expectations of doctors compared with cleaners and laboratory technicians, but also within the 101 grades, degrees and divisions of non-medical hospital workers, each with their own variety of conflicting trade unions, professional associations, and their own fiercely contested rank in the pecking order (Neale, J., Memoirs of a callous picket: working for the NHS, London: Pluto Press, 1983. Jonathan Neale’s book, now out of print, should be required reading for every medical student, doctor, and everyone else working in the NHS. The only wrong thing about it is the title, which is misleading. In fact, it is a beautifully written, balanced, well-informed and careful account of the almost incomprehensibly complex world of labour in hospitals, and never gives in to the rancour and divisive recrimination endemic among so many opiners on medical care.). Because closures came first to the oldest, least well-equipped units, their defence did not involve the most innovative centres, though in general medical, nursing and technical staffs at the leading edge of medical science were no less hostile to privatization of care than staffs in hospitals threatened with closure.
All too often successful defence looked like preservation of obsolete care at the expense of even slower innovation in the newer hospitals, so that opposition to NHS cuts was divided between those with community-oriented interests, who sentimentalized community care and dismissed high technology, and those with hospital-oriented interests, determined to obtain whatever technology was available but ignorant and contemptuous of the possibilities of community care if it were properly resourced. Occasionally whole chunks of new medical science have broken away entirely from the NHS to develop in the private sector, as with Patrick Steptoe’s team working on in-vitro fertilization, without any effective protest from anyone, and this could be a serious portent for the future.

Successful resistance, maintained over years rather than months and personally involving a majority of the public, is possible and will be necessary if hospitals are to resume expansion in those areas of care for which they are most effective. It will have to be built on a new pattern of local democratic control through new institutions, which must include progress at primary care level.

Community Participation at Primary Care Level

The difficulties of creating and maintaining patient-participation groups or Patients’ Committees were discussed in the last chapter. The most immediate obstacle is usually the resistance of doctors to the idea in the first place, but even when this is overcome, few communities contain many people with enough time, energy or credulity for yet another committee, tying up yet another evening and probably occasional weekends, generating more paper, more meetings and eventually more subcommittees, and on previous experience unlikely in the long run to sustain more public interest, recruit more workers (rather than talkers) or satisfy more than a fraction of the new expectations first raised. The people in any community who really get things done are always already overworked, and reluctant to join anything unless they are convinced it has practical value and will bring about real changes; they don’t want more talking shops.

If we listen to what people talk about in queues, at tea breaks, on the way to and from work, we hear more about their own health and the health of their families than any other topic; there is probably no other subject of greater general interest. To attribute our difficulties in setting up social machinery for participative democracy in the NHS to public indifference must be wrong. There must be some way other than the existing types of Patients’ Committee to tap the interest that is certainly there. If there is not, then we must accept that the pace of progress will continue to reflect the attitudes and interests of central power-holders whose own experience of using the NHS by normal pathways is usually remote, and in many cases non-existent.

The general strategy of retreat from public service begun by the New Conservatives, justified in the broadest terms by the alleged superiority of market decisions over human decisions, always encounters fiercest opposition when translated into concrete and specific local terms. At a local, problem-oriented level at which individual people are directly affected and on which effective action is seen to be possible, the general issues break down into comprehensible parts. Clues to what we need were given by Bagehot, the greatest political theorist the British ruling class ever produced, in his classic The English Constitution, written at the historical moment when Disraeli swept his alliance of industrialists, bankers and landed aristocrats into the political gamble of extending the franchise to include their natural enemy, the skilled working man in industry:

As yet, the few rule by their hold, not over the reason of the multitude, but over their imaginations and their habits; over their fancies as to distant things they do not know at all, over their customs as to near things which they know very well. (Bagehot, W., The English Constitution. First published 1867. London: Fontana, 1963.)

For almost 40 years the common people have lived with the NHS as a social right, for most of that time with no more than token charges at the time of use. They have developed deeply-rooted ‘customs as to near things which they know very well’ regarding access to the service, which are now probably impossible to take away. Access will remain, but access to what? Their imaginations have not changed, or at least they have changed very little, because doctors kept not only their own but the public imagination within the Osler paradigm. Medical science was their property, and criticism of it was their privilege. Patients were wanted only as passive and uncritical consumers, not as fellow-producers.

Both GPs and specialists, and both primary care and hospital nurses, know that the quality of care available is now drifting further and further away from what is scientifically possible, but the community cannot be an effective ally in struggle to rectify this if it is not fully informed and if critical thought is discouraged. If we want effective allies in defending the service, we must learn to encourage our public not only to hold fast to their right of free access, but to believe that the service is theirs, that it ultimately belongs to the people who must be the final judges of its quality and effectiveness, that it is within the ambit of ‘their customs as to near things which they know very well’. These customs need expansion, imaginations and habits must be raised, but this cannot be done without a frank admission that large groups of patients with common chronic conditions requiring simple, easily understood, but regular, conscientious and unhurried ambulant care and monitoring by familiar, friendly, socially accessible staff, are not actually getting these things. Some responsibility for this failure must be accepted by GPs if their demands for better resources are to be taken seriously.

The diabetics, hypertensives, arthritics, schizophrenics, asthmatics, epileptics, and otherwise handicapped adults and children, together with their carers, spouses and relatives, account for a majority of any population if the word ‘family’ is defined as those close enough to be personally concerned about one another. They are people with specific, definable needs, and there are proven effects on outcomes when those needs are not met. These patients and their caring relatives are the realistic base for participative democracy in the NHS, starting from where we are with the people we have, beginning with the immediate, practical needs that are obvious and undeniable to informed patients and their primary care teams.

These people constitute a potential voting public which already guarantees a mandate to any government with the courage to support them; but both mobilization and demands must be specific, not general, and depend on active search and patient education at primary care level in each neighbourhood, to define and locate these needs, and measure precisely the extent to which they are unmet. They require the kind of information, and the kinds of patient-participation, implied by the style of primary care team discussed in the last chapter, with computerized practice information systems and therefore lists of the names, addresses, and telephone numbers of each group sharing a specific chronic disorder, and audited problem-specific clinics which measure the extent to which needs are met. From these large subsets of people sharing specific concerns, educated to perceive new possibilities of what they and their carers could jointly achieve with better organization, more time, and more labour (but little more technology), we could begin to develop indestructible roots of the new popular institutions required for participative democracy. Their activists would be drawn not from the relatively narrow circle of people who already have a wider political concern and confidence that they can change the world, but from the ranks of mothers with asthmatic or brain-damaged children, the wives of men with premature coronary disease, and from hypertensives, diabetics, and epileptics beginning to accept joint responsibilities for their own care, demanding an effective health service at all levels rather than an impersonal hit-or-miss charity.

On such a base, annual reports, annual meetings and annually elected Patients’ Committees would achieve vigorous and independent life, rather than their present tenuous and token existence. They would work in association with Local Authorities and national patients’ organizations like the British Diabetic Association, the Chest Heart & Stroke Association, and the many other disease-specific national consumer groups which are already powerful political lobbies.

If such local practice-based patient groups were organized, it would be difficult to close the cottage hospitals and other small and cherished local facilities so irritating to tidy-minded economists and NHS managers. They would have an intelligible alternative future as local units run in association with local primary care teams, providing day-hospital and simple 24-hour nursing facilities for most transient acute illness, support for chronic disability and handicap, and terminal care for the majority of patients who do not need the full technical support facilities of a modern District Hospital. District Hospitals, currently overstretched by accepting all organized follow-up of common chronic conditions such as diabetes, high blood pressure, obstructive lung diseases and heart failure, would become easier to defend because they would be able to give more time to fewer people, returning the routine care of most common illnesses to expanded and improved Primary Care Teams working within guidelines agreed between specialists and generalists.

Limited Demands, Infinite Resources

Patients and their caring families and neighbours, mobilized both to participate in better care and its control by shared discussion of audited progress, are a new and hitherto neglected resource.

Enoch Powell has a gift for lucid epigrammatic conclusions drawn from plausible but unvalidated assumptions. He was the first Minister of Health to assert that infinite demands were bound to collide with finite resources in an NHS free at point of use. Uncritical acceptance of his argument led to a widely quoted Insoluble Equation of medical care:

WANTS > NEEDS > RESOURCES

The 1944 White Paper which preceded the NHS Act of 1946 foresaw that

The proposed service must be comprehensive in two senses—first, that it is available to all people and, second, that it covers all necessary forms of health care.

Commenting on this objective, the Report of the Royal Commission on the NHS in 1979 (the Merrison Report) justified retreat, using the Insoluble Equation and a less explicit version of the Enoch Powell argument:

The impossibility of meeting all demands for health service was not anticipated. Medical, nursing and therapeutic techniques have been developed to levels of sophistication and expense which were not foreseen when the NHS was introduced.

Since 1944, health services have not been unique in developing to unforeseen levels of sophistication and expense. The 385 Tornado fighters planned for the RAF in 1977 were estimated to cost slightly more than the entire production of Spitfires before and during World War II, and the cost per ton of comparable warships increased ten to fifteenfold over the same period. If trends prevailing since the 1920s continue, by the year 2036 the entire US military budget will be spent on one aircraft (Smith, D., The defence of the realm in the 1980s, London: Croom Helm, 1980, p. 156.). However, the defence of the realm is still seen as a national responsibility, whatever the changes in costs and technology. The real change in attitudes to NHS funding since 1944 is not technical or economic, but political. In 1944, arms expenditure was justified because we were fighting for a better future for the common people, including a free and comprehensive health service embracing all effective medical science, not just its cheaper parts. The cost and efficiency of measures to save life have changed no more than the cost and efficiency of measures to destroy it. Military research does not depend on generals and admirals standing on street corners with collecting boxes.

The phrase and the equation have been used ever since as self-evident axioms, not only to excuse failure to attain the original objectives of the NHS, but to ridicule the objectives themselves. Is there any real evidence to support them? The influential University of York school of health economists led by Alan Maynard claims to be interested chiefly in micro-economic efficiency within small units of the NHS (Maynard, A., ‘In discussion’, pp. 24-5, following ‘Economic directives’, pp. 12-19, in Zander, L. (ed.), Change: the challenge for the future, London: Royal College of General Practitioners, 1984.) rather than macro-economic strategies, a view currently shared by most other health economists. In the early 1970s I was invited to a DHSS-sponsored conference at the City of London College, appropriately entitled ‘Thinking the unthinkable’. The keynote speaker was Professor Alan Williams, founder of the York school. As he was speaking to a largely medical audience, his economic arguments had to be simple, so he explained his macro-economic assumptions. On the blackboard he drew the elementary relationship between price and demand for commodities shown in Fig. 9.1.

Fig. 9.1 National relationship of price and demand according to Prof. A. Williams


The demand for any commodity is related to its price. At infinite prices, demand is zero; at zero prices demand is infinite. Therefore in a free health service there is infinite demand. The supply of any commodity must be finite. Therefore there is infinite demand on finite resources in any health service free at the time of use, whether paid by taxes or insurance. As for the demands > needs > resources equation, it is taken as axiomatic that consumer wants are greater than patients’ needs; GPs, who normally see their patients only when they consult and therefore see what people complain about and not what they endure unaided, rarely disagree.
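Williams’s blackboard argument, and the chapter’s objection to it, can be put side by side in a toy model (entirely illustrative; the functions and figures here are invented for this sketch, not taken from either economist): a commodity-style demand curve grows without bound as price falls towards zero, while a needs-capped model of medical care levels off at the finite volume of measurable need, however low the price.

```python
# Toy comparison (illustrative only): commodity-style demand versus
# demand capped by a finite, measurable level of need.

def commodity_demand(price, k=100.0):
    """Rectangular-hyperbola demand D = k/p: explodes as price falls to zero."""
    return float('inf') if price == 0 else k / price

def needs_capped_demand(price, need=50.0, sensitivity=0.1):
    """Demand bounded above by a finite level of need; price only deters uptake."""
    return need / (1.0 + sensitivity * price)

for p in [0, 1, 10, 100]:
    print(f"price {p:>3}: commodity {commodity_demand(p)}, "
          f"needs-capped {needs_capped_demand(p):.1f}")
```

At zero price the first model returns infinity, which is the whole of the ‘infinite demand’ claim; the second never exceeds the stipulated need, which is the sense in which the rest of this chapter argues that needs, however complex, are finite at any moment.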

Once we accept medical care as a commodity, the rest follows. The micro-economists are aware of this, and therefore usually devote a short introductory macro-economic paragraph to explaining that the reasons for not regarding medical care as a commodity are sentimental, implying that all who refuse to accept medical care as a commodity thereby also deny that medical care is an economic unit of any kind. Since all human activities have an economic aspect, in that they occupy time which might otherwise be devoted to something else, medical care is no exception. But commodity production is not the only economic form that production can take; it is the form characteristic of a capitalist economic system. To assert that medical care requires some different form of production requires more independent thought than most people are willing to give it, at least when fixed by the beady eye of an economist, who will in any case ease their consciences by conceding that though medical care must be a commodity whether we like it or not, it is one of a special kind, to be handled with greater sensitivity than boots or cricket bats. They may concede it the special, apparently nonsensical status of impressionist paintings, grand opera or fossil dinosaurs, for which everyone accepts that the market is an ass; but since markets must rule for our own ultimate good as a part of natural law, no one can think of anything better.

Opponents of these arguments have tended to concentrate on the fact that medical care sold as a commodity puts consumers at a disadvantage because they don’t have enough information to judge what they are buying. Therefore social justice demands that though medical care remains a commodity, the buying and selling of it is underwritten by the State on the consumer’s behalf, and this is supposed to be how the NHS works. This argument applies equally to many other commodities, since as they become more and more technically sophisticated and marketing techniques are used to manipulate demand, fewer consumers are able to make rational choices of any kind, whether they are buying echo-sounders for their yachts or tonsillectomies for their children. It is an ultimately weak defence of the NHS, unless one is prepared to examine the possibility that the stringent consumer protection required to make a reasonably safe medical care market possible is probably required also for an increasing proportion of other commodities essential to life.

A far more effective defence is to question whether effective medical care is a commodity at all. Of course, there are components of medical care which are commodities. For example, prescribed drugs are extremely profitable commodities, which are (in the NHS) ordered by doctors but paid for by the State on the patients’ behalf, so that in effect the doctors are the consumers, relatively unrestrained by price, and doctors are the target of promotional efforts by the pharmaceutical industry.

The processes of diagnosis, treatment, and follow-up, however, are not commodities, but joint products of co-operation between care-givers and care-receivers, requiring work from both. This productive relationship can be, and has in fact usually been, ignored and concealed precisely in order to maintain market relationships: partly because even when private practice had dwindled to a minuscule proportion of all medical work, as it did in the 1960s, it still provided the classical model for professional thought; and partly because, having grown up in a capitalist society, our imaginations tend always to be limited in this way about all creative work, so that the greatest achievements of mankind, which obviously deny the limits of commodity production and exchange, can only be attributed to genius, freakish behaviour so unique that it requires no explanation and need not fit the machinery of everyday life. Yet in reality the public wants, and has a right to expect, everyday medical care as free from commercial considerations as the works of Beethoven or Van Gogh.

There are apparent exceptions to this, forms of medical care in which patients really do appear to be almost entirely passive consumers of professional expertise; but these are precisely those medical and surgical crises which are ethically least tolerable as commodities to a public which still retains any feeling at all for social justice. There is no evidence of any public mandate for a commodity market to operate for emergency care even in the USA, let alone Britain.

Moreover as soon as we look at real demands on the NHS, we can see that though large they are not infinite, but defined by ultimately calculable though very complex determinants. Obviously prices attached to particular medical activities must modify their use, the classic case being the effect of charges on prescribed medicines, shown in Fig. 9.2. However, even this simple example shows that many influences other than price determine demand. From 1948 to June 1952, there were no charges for any prescribed drugs or appliances. The number of items prescribed rose steeply, but started to fall a year before the first imposition of charges, and then levelled off. The next rise in charges, in 1956, did coincide with a rapid fall in prescribing, but from then on, out of 10 occasions when new charges were applied, only one (the rise from £0.70 to £1.00 in December 1980) appeared to initiate a fall. Though prescribing was rising rapidly when prescription charges were briefly abolished in February 1965, the rise was less steep after abolition, and prescriptions were actually falling when charges were applied again in June 1968.

Fig. 9.2 NHS prescription charges and items dispensed by chemists and appliance contractors, UK, 1949-86.

Source: Fig. 4.12, Office of Health Economics Compendium of Health Statistics, 1987. London: OHE, 1987.

What other influences were there on prescribing, other than deterrent charges? Doctors, not patients, are responsible for prescribing. Continued demand does depend on whether patients understand and accept treatment, but no one who actually studies the very complex influences on prescribing and uses of medications (Dunnell, K., Cartwright, A., Medicine takers, prescribers and hoarders, London: Routledge & Kegan Paul, 1972.) could accept so blunt an instrument as prescription charges as a rational means of influencing them. Doctors are influenced by what they learn, for about six years as undergraduates in medical schools where they are taught to be sceptical about the claims of those who make and sell drugs, and then for 30 or 40 years in practice, when more is spent on each doctor for drug promotion by pharmaceutical companies than was spent on training him in the first place (Abel-Smith, B., Value for money in health services, p. 83. London: Heinemann, 1976.). Of course, doctors feel insulted if anyone suggests that their prescribing is influenced by all this, but pharmaceutical companies are not in the habit of throwing money away, and such influence is certainly their intention. A few patients do actively search for medication as other consumers pursue ice-cream or fashion footwear, but a large majority do not want to be ill, and avoid both doctors and their prescriptions if they can. The idea that if medical care is free then people will consume it for fun is ludicrous. If water supplied at a fixed charge unrelated to quantity (which, as I write, it still is) has not led people to leave the tap running all day, why should a free health service lead them to consult GPs or specialists when they don’t think they need to?

The micro-economists will answer that this is not really what they mean by infinite demand. What they have in mind is that when one medical problem is solved, there always seems to be another to take its place, a fact which they think Lord Beveridge overlooked when he made his now famous assumption that a universal free health service would reduce demand by improving the nation’s health. That major and very costly problems were solved, with great relief for the service, is not in dispute. There were huge falls in demand, because of dramatic improvements in health which could not have occurred at the speed or to the extent they did without the NHS: respiratory tuberculosis required 25,000 hospital beds in 1957 (‘A review of the medical services in Great Britain’ (Porritt Report), London: Social Assay, 1962.), but by 1978 the average number in daily use had fallen to 754 (Hospital In-patient Enquiry 1978, London: HMSO, 1981.); the population in mental hospitals fell by more than half, from 153,000 in 1954 to 67,000 in 1984 (Office of Health Economics, Compendium of health statistics, 6th edition. London: OHE, 1987.); and in the first 25 years of the NHS there was a roughly 10% reduction in the number of hospital beds in use (Godber, G., The health service: past, present and future, p. 38 (Heath Clark lectures), London: Athlone Press, 1975.), despite rising standards of care, an ageing population and an overall population increase of 12%.

Their grand discovery, allegedly overlooked by William Beveridge and all the liberal Keynesians of the 1940s, is that medicine never runs out of worthwhile tasks, in which it appears to resemble every other worthwhile art or science. Music is not expected to run out of compositions, architecture is not expected to cease the evolution of building design, so why should medicine run out of new problems of disease, death and unhappiness? Medicine must plead guilty to the charge that like all other worthwhile arts and sciences, there seems to be no end to it.

The pioneer of this argument was Dr D.S. Lees, in his booklet Health through choice. He maintained that with increasing prosperity in the 1960s, more money would be made available for medical care by consumers buying for themselves in a commodity market than by taxpayers for the community at large (Lees, D.S., Health through choice: an economic study of the British National Health Service, Hobart Paper no. 14, London: Institute of Economic Affairs, 1961.). Dr Lees now has the satisfaction of seeing his views generally accepted without ever being subjected to any objective test. There has never been any evidence from opinion polls that the general public is unwilling to pay more taxes for a better health service, but approval from a poorly-informed general public is not what the micro-economists have in mind; they are looking to people with enough power and money to sympathize with and understand their kind of accountancy, who will agree that the general public has ever since the war been both promised and given too much for its own good in the way of medical and every other kind of care. They see demand for medical care as infinite because they are infinitely disinclined to meet the social costs of better care for anyone but themselves.

Demands are, with experience, more or less predictable. More importantly, needs are measurable and at a single point in time finite, though certainly complex and greater than present resources can cope with. What should be measured, and how it should be done, was discussed in the last chapter; if we accept that the job can be done (granted sufficient labour), and if we are serious about doing it, primary care teams with listed populations can begin to measure both needs, and the extent to which they are met.
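A minimal sketch of the kind of measurement this implies, assuming a computerized practice register of the sort described in the last chapter; the record layout, patient names and review interval below are invented for illustration, not taken from any real system:

```python
# Hypothetical practice register for one chronic condition: count how
# many known patients have missed the agreed annual review (unmet need).

from datetime import date

register = [
    {"name": "patient A", "condition": "diabetes", "last_review": date(1987, 11, 2)},
    {"name": "patient B", "condition": "diabetes", "last_review": date(1986, 3, 15)},
    {"name": "patient C", "condition": "diabetes", "last_review": None},  # never reviewed
]

def unmet_need(register, today, max_gap_days=365):
    """Return (overdue, total): patients never reviewed, or reviewed too long ago."""
    overdue = [p for p in register
               if p["last_review"] is None
               or (today - p["last_review"]).days > max_gap_days]
    return len(overdue), len(register)

overdue, total = unmet_need(register, today=date(1988, 1, 1))
print(f"{overdue} of {total} diabetics have unmet review need")
```

At this scale need is a finite, countable quantity, which is the sense in which a listed population lets a primary care team measure both needs and the extent to which they are met.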

Infinite Resources: The NHS as a Mass Employer

Of course, at any particular time, more of one sort of invest­ment must mean less of another. Even the most wasteful spending on health services is more socially useful than manufacture of weapons, but this is not the most powerful argument against this facile formula for defeatism.

We now have at least three million, more probably four million unemployed people, probably well over a million women who would welcome part-time or full-time work as mature entrants to any industry close to their homes but are not registered as unemployed, and a huge reserve of trained nurses who now do unpaid domestic work. Caring for sick and disabled people is, despite an appalling history of low wages, overwork and inadequate resources, a popular career choice. This reserve of labour is readily available for employment in an extended primary care service.

The reserve of unemployed people living as close to subsistence level as is possible in a sophisticated modern economy was created deliberately by the first Thatcher government as its principal social and economic weapon to strengthen employers and weaken trade unions. It was the first and most important part of a general strategy aiming to raise the general rate of profit by destroying the conventions of the post-war social settlement, imposing new terms and conditions of investment and employment. Just as a small-scale war in the Falklands turned out to be politically profitable, so did abolition of ‘overfull’ employment restore the right of management to manage in its traditional autocratic style. However, it’s easier to start wars and mass unemployment than to stop them. Unemployment went out of control, and the destabilization of society it has caused, in terms of rising crime rates and a growing black economy (though not as yet of organized pressure for political alternatives), is now obvious.

A future government that really wants to abolish unemployment will be able to do so if, and only if, it is willing to give it up as a weapon for social control. Expansion of health services, above all at primary care level, would then be the quickest way to get large numbers of people back into socially useful work. Few difficulties at any level in the NHS arise from shortages of technology or of sophisticated skills. The big problems, for care-givers and care-receivers alike, are not enough people and not enough time. The better health services of the future, though very dependent on rapid technical development, particularly in information technology, will be more rather than less labour-intensive. An expanded voluntary sector, together with trained nurses lapsed from production, would probably become the principal source of recruitment for part-time and full-time community health workers. Though an expanding primary care service with a wider social base would encourage a much bigger voluntary sector (for example, the parents of diabetic, asthmatic or mentally handicapped children), there must be expansion in paid staff, because only paid staff can be fully accountable for work planned and verified against a local population base, and because even volunteers require training, organization and support which only paid staff can be expected to give.

Examples already exist of the sort of expanded employment required. At Sheffield’s Birley Moor Health Centre in 1977 (Birley Moor Health Centre, ‘Report to Joint management meeting 17 March 1987’. This centre has pioneered not only excellent neighbourhood care, but also a unique emphasis on occupational health, including management of problems of mass unemployment.), 10,000 people were served by 4 GPs, 5 office staff, 3 community nurses, 2 health visitors, 1 midwife, 1 chiropodist and 3 cleaners; 19 staff altogether. By 1987, 11,000 people were served by 7 GPs, 3 nurse practitioners, 8 office staff, 3 community nurses, 3 health visitors, 1 midwife, 4 community psychiatric nurses, 1 physiotherapist, 1 foot care assistant, 2 occupational health workers, 2 unemployment health workers, 1 dietary nursing assistant, 1 patient liaison worker, 2 employed workers and 2 volunteers on a project for the mentally infirm elderly, 1 Citizens’ Advice Bureau worker, 2 patients’ librarians and still 3 cleaners; 47 altogether. Staff numbers thus rose nearly two-and-a-half-fold, and the staff/patient ratio more than doubled, in 11 years. Though most GPs do not employ even their full entitlement of reimbursable staff under the 1967 Package Deal, a minority of innovating practices have discovered many important tasks that cannot and should not be done by doctors or even by nurses, but require specialized interpersonal skills over a fairly narrow clinical territory; for example, counsellors who can help people to lose weight, stop smoking, control their alcohol or tranquilizer dependence, or manage their children’s diabetes, asthma, or sleeping or eating problems. These skills could be taught in day-release courses initially lasting a few weeks rather than months, provided that subsequent work is audited and discussed by the whole team. They require mature, intelligent people, not over-professionalized, who can connect easily with their clients through shared social experience.
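A quick check of the arithmetic behind the Birley Moor head-counts quoted above (list sizes and staff numbers only; skill mix and part-time hours are ignored):

```python
# Staff-to-patient arithmetic for the two snapshots quoted in the text.
staff_1977, patients_1977 = 19, 10_000
staff_1987, patients_1987 = 47, 11_000

ratio_1977 = staff_1977 / patients_1977 * 1000   # staff per 1,000 patients
ratio_1987 = staff_1987 / patients_1987 * 1000

print(f"1977: {ratio_1977:.2f} staff per 1,000 patients")
print(f"1987: {ratio_1987:.2f} staff per 1,000 patients")
print(f"staff numbers rose {staff_1987 / staff_1977:.2f}-fold")
print(f"staff/patient ratio rose {ratio_1987 / ratio_1977:.2f}-fold")
```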

Would such expanded teams be effective in improving health, giving easier and more effective anticipatory care for disease, and easing the burden on our increasingly costly hospital services by more accurate and closely considered referral policies, and by returning to primary care the millions of patients now attending out-patient departments for apparently indefinite follow-up? There are good a priori reasons for believing they would, but controlled trials over periods of at least five years are needed to measure the difference in health outcomes between enlarged teams working on this broader front, and smaller teams with a narrower clinical approach.

Must Defeudalization mean Dehumanization?

Whether or not such an expanded team would be good for patients, it would almost certainly be good for the newly recruited care-givers. Incomes would probably not be much better than what most of the unemployed get now on the dole, though travel-to-work costs would be small, removing one common reason for not accepting work offered. Positive effects on health, however, could be substantial. A study by the US Department of Health Education and Welfare published in 1973 showed that the best single predictor of longevity was job satisfaction (‘Work in America’, Department of Health Education & Welfare, Washington: US Government Printing Office, 1973.), and job satisfaction in personal health services is potentially high. Paid employment outside the home has an important preventive and therapeutic effect on depressive illness in mothers with preschool children. If creche facilities and nursery school provision were improved, these women could be one important source of recruitment. Another source could be disabled and handicapped people of all kinds. Personal experience of illness or handicap is an important asset in care-givers, which could encourage greater patient input into production of medical care, more effective for care-receivers and more satisfying to care-providers.

Years ago it was assumed that the NHS, the largest single employer in the UK with over one million employees, should set an example as an employer of disabled and handicapped people. Hospitals had a long tradition of providing badly paid but secure sheltered employment for many people uncompetitive in the job market. People were taken on for a lifetime of service, so that high morale and docile acceptance of an almost feudal hierarchy of command could be maintained despite wages at or below subsistence level. The low-wage structure and feudal hierarchies of the old hospital tradition were bound to change, and in 1969 they did, not because consultants and Hospital Governors discovered that it was impossible for a hospital porter to live on a wage of £16 or a ward cleaner on £12 a week, but because hitherto docile health service unions were driven to militancy. In 1967 there was just one strike in the NHS, involving 500 staff for one day, an average of 0.69 days lost per 1,000 staff compared with 100.00 per 1,000 employees for the UK as a whole. In 1973 there were 18 strikes involving 59,000 staff for 298,000 days, an average of 353.5 days per 1,000 staff compared with 324.4 for the UK ('Report of the Royal Commission on the National Health Service' (Merrison Report), Cmnd 7615, London: HMSO, 1979).

In 1983, the DHSS axed the root of benign paternalism: circular HC83(18) ordered all Health Authorities to put all cleaning, laundry and catering out to competitive tender. Subsequent circulars forbade protection of existing pay and conditions. As lowest tenders were normally accepted without regard to existing tenure, quality of service or job satisfaction, hospital workers were forced to submit in-house tenders for their own jobs at lower wages. The DHSS claims it has saved £73 million by contracting-out of services, and £48 million of this has come from in-house tenders, from workers who 'voluntarily' reduced wages, hours, and staffing levels in order to keep any jobs at all. These reductions not only cut the already very low incomes of hospital workers, but also lower standards of work. Reviewing evidence from monitoring of the work of private contractors in public services, the Labour Research Department estimated that the standard of services such as cleaning and catering had fallen by an average of one-quarter to one-third (Privatisation: paying the price, London: Labour Research Department, 1987).

What was once a secure, long-service occupation in which personal relationships and collective morale were important is now a casualized occupation without job satisfaction and with a high labour turnover. The aim is profit, not service, and loyalty of employers to their workforce, or of the workforce to patients, is now a thing of the past, at least so far as management is concerned. About 35,000 hospital workers live in tied hospital accommodation, and may lose their homes with their jobs; when private contractors took over catering at Northwick Park Hospital in Harrow, the Queen Elizabeth Hospital in Birmingham, and Farnham Road Hospital in Surrey, workers were evicted from their homes; the same fate was prevented at St. Mary's Hospital in Paddington only by militant unionists who occupied the houses and organized a public campaign. Up to 1986 there had been a gross loss of 17,500 jobs from the NHS, and a net loss (after allowing for staff employed by the private agencies) of 5,500 jobs; further gross and net losses of about the same size are planned from future privatization.

Since building of new hospitals and scrapping of old ones began in the 1960s, management of the NHS has increasingly been modelled on management of industry. For industry in a capitalist society, production of bricks, boots and all other commodities is not an end in itself, but a particular means to the universal end of all economic activity: the realization of a maximum return on capital—profit. Any company which fails to follow this ruthless policy will quickly find it has been eaten by another which does. Though this is not yet the objective of the NHS (though nothing is now unthinkable), its principal administrators, now drawn from private industry rather than the civil service, as well as most members of its governing boards, have found a close equivalent: savings. In practice, the NHS is not organized to maximize output in terms of real and measurable improvements in health, reductions in age-standardized death rates or the prevalence of measured disabilities, but to minimize costs. Since all public service costs are seen as potentially reducible tax burdens, this is, in a way, still an indirect search for profit, so everyone can feel comfortable with what might otherwise appear a useless endeavour. The women who lose their jobs when hospital cleaning and catering contracts go out to tender, who have only taken such hard and badly paid work because they cannot manage without the money, will so far as possible claim whatever other social benefits are available to pay the rent and feed the children, and most of the savings for the health service will be lost as increased costs for unemployment and supplementary benefits. The women who remain, trying to cope with more work done by fewer people, will no longer have the time to chat to patients and listen to their troubles, which nurses are now, and doctors long ago became, much too skilled, precious and overworked to do.
And the large multinational firms which all over Europe and North America recruit the easily-exploited migrant labour most profitable for this work do very well indeed. The NHS, which should be our principal growth industry for friendliness, fellowship, generosity and compassion, will be dragged into the same trough as everything else, for these qualities turn out to have been tolerable only within the deferential hierarchy of medical masters, patient serfs and lady-nurses-in-waiting, when medical care was all faith and no substance; human relationships are not affordable within a science-based service which is both effective and properly paid.

Unless there is a fundamental change in social direction throughout the NHS, in hospitals as well as in community services and primary care, the idea of the NHS as a source both of expanded, labour-intensive service employment, and of happier and healthier working relationships, is absurd. Above all, health workers at all levels must have the time to talk to people and listen to them—in fact, have time to do their work as they were trained to do it; not only to cope, but to care.

Community as a Precondition for Health and its Loss as a Cause of Mortality

The society which denies society, the state which claims only to hold the ring while every man fights every man, according to rules written by and for rich men and limited only by what the rest will tolerate, still claims to look after its casualties. The gladiatorial life may be a rotten business, but it is, after all, wonderfully productive of the sophisticated gadgets required to make the contestants think they are alive in such intolerable conditions. If we can learn to move with the times, enjoy a life in which everyone and everything is for sale, and even the biggest millionaires have a sporting chance of losing and being eaten by their rivals, we may perhaps be able to keep ourselves amused for the 80 years or so we have on this earth.

There is now a large body of scientific work measuring the damage to health which results from such an alienated society, in which everyone must be either a winner or a loser. Its most obvious application is to mental or emotional illness. There is compelling evidence from rigorous and detailed studies of random samples of women with young children that depression, in the fairly exact sense claimed by psychiatrists, is essentially a social disorder caused by isolation and a lack of confiding adult relationships, treatable by correcting these deficiencies (Brown, G.W., Bhrolchain, M.N., Harris, T.O., 'Social class and psychiatric disturbance among women in an urban population', Sociology 1975; 9:225-54). After a lifetime of careful, cautious work in epidemiological social psychiatry, George Brown (Brown, G.W., 'Depression: a sociological view', pp. 225-34 in Tuckett, D., Kaufert, J.M. (eds.), Basic readings in medical sociology, London: Tavistock Publications, 1978) reached this important conclusion:

I believe that depression is essentially a social phenomenon... I would not make the same claim for schizophrenia, though its onset and course are also greatly influenced by social factors. Society and depression are more fundamentally linked. I can envisage societies where depression is absent and others where the majority suffer from depression. While this is science fiction something not too unlike it has been documented. At least a quarter of working-class women with children living in London suffer from a depressive disorder which, if they were to present themselves at an out-patient clinic, psychiatrists would accept as clinical depression, while women with children living in crofting households in the Outer Hebrides are practically free of depression no matter what their social class... I know of no compelling reason to believe that the many bodily correlates of depression such as those revealed by work on bioamines are any more than the result of social and psychological factors.

Like other people, in unconscious practice if not conscious theory, most doctors most of the time remain Cartesian dualists, seeing mind as an independent inhabitant of the brain, rather than as its function. Of all organs, the brain is dominant, the central control system for rapid response to environmental change; the brain is therefore always involved to some extent in any disorder affecting any organ, and so is its function, the mind. Both by doctors and by the laity, disorders are in practice still considered as either physical or psychological. Within the Osler paradigm it is not easy to conceive of purely social causes of physical disease, that is to say, physical disorder precipitated by disordered social function. Unless some intermediate agent can be found, such as nicotine, alcohol or dietary deficiency, gross social disorders such as isolation, lovelessness, worklessness, egotism, loss of trust, loss of respect, loss of creative function or belief in some world-historical perspective giving meaning and purpose to life, are all unacceptable and incomprehensible as contributory causes of disease.

The first general review I know of this important field was by John Cassel in his Wade Hampton Frost lecture to the American Public Health Association in 1975 (Cassel, J., 'The contribution of the social environment to host resistance', American Journal of Epidemiology 1976; 104:107-23). A wide range of causes of death tend to be associated with unstable or marginalized status and deprivation of meaningful social contact, including tuberculosis (Holmes, T., 'Multidiscipline studies of tuberculosis', Chapter 6 in Sparer, P.J. (ed.), Personality, stress and tuberculosis, New York: International Universities Press, 1956), schizophrenia (Dunham, H.W., 'Social structure and mental disorders: competing hypotheses of explanation', Milbank Memorial Fund Quarterly 1961; 39:259-311), (Mishler, E.G., Scotch, N.A., 'Sociocultural factors in the epidemiology of schizophrenia: a review', Psychiatry 1963; 26:315-51), multiple accidents (Tillman, W.A., Hobbs, G.E., 'The accident-prone automobile driver: a study of the psychiatric and social background', American Journal of Psychiatry 1949; 106:321) and suicide (Durkheim, E., Suicide: a study in sociology, London: Routledge & Kegan Paul, 1952). Though often associated with poverty, this hypothesized general effect on host resistance to a wide range of diseases and causes of death is probably independent of it: both upward and downward social movement are associated with increased disease rates (Christenson, W.N., Hinkle, L.E., 'Differences in illness and in prognostic signs in two groups of young men', Journal of the American Medical Association 1961; 177:247-53) and with mortality from coronary heart disease (Marmot, M.G., Syme, S.L., 'Acculturation and coronary heart disease', American Journal of Epidemiology 1976; 104:225-47) and stroke (Nesser, W.B., Tyroler, H.A., Cassel, J.C., 'Social disorganisation and stroke mortality in the black populations of North Carolina', American Journal of Epidemiology 1971; 93:166-75); so is loss of a spouse, at all ages and in all social groups and classes (Kraus, A., Lilienfeld, A., 'Some epidemiologic aspects of the high mortality rate in the young widowed group', Journal of Chronic Diseases 1959; 10:207-17), (Maddison, D., Viola, A., 'The health of widows in the year following bereavement', Journal of Psychosomatic Research 1968; 12:297-306), (Parkes, C.M., Benjamin, B., Fitzgerald, R.G., 'Broken heart: a statistical study of increased mortality among widowers', British Medical Journal 1969; i:740-3), (Rees, W.P., Lutkins, S.G., 'Mortality of bereavement', British Medical Journal 1967; iv:13-16).

The more general concept of host resistance impaired by loss of meaningful social contact, leading to increased vulnerability to a wide range of environmental risks with several final common pathways of outcome, has been tested prospectively by Lisa Berkman and Leonard Syme in the Alameda County studies in California (Berkman, L.F., Syme, S.L., 'Social networks, host resistance, and mortality: a nine-year follow-up of Alameda County residents', American Journal of Epidemiology 1979; 109:186-204). Their study population was 6,928 adults, 86% of a stratified random sample of households. In 1965 these people answered questions about the number of friends and relatives they felt close to, how often they saw them, and their membership of churches and other formal or informal social groups, and 96% of them were followed up nine years later in 1974. In the age-range 30-69 at entry, there were 371 deaths. Comparing people with most social ties with those who were most isolated, and standardizing for age, isolated men were 2.3 times, and isolated women 2.8 times, as likely to die during the nine-year follow-up. Statistically, these differences were highly significant (p = 0.001). There was no evidence that the effect depended on illness at entry, and it applied to all the main causes of death. The effect was statistically independent of all other known risk factors for premature death, including health status at entry, socioeconomic status, smoking, obesity, alcohol consumption, exercise, and use of preventive health services.
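The comparison at the heart of such a cohort study is a relative risk: the death rate over follow-up among the isolated divided by the rate among the well-connected, after age adjustment. A minimal sketch of the unadjusted calculation, using invented counts for illustration (these are not the Alameda figures):

```python
# Relative risk of death over follow-up: cumulative death rate among
# the socially isolated divided by the rate among the well-connected.
# The counts below are invented for illustration; they are not the
# Alameda County data, which were also age-standardized.

def relative_risk(deaths_a, n_a, deaths_b, n_b):
    """Ratio of cumulative death rates, group A relative to group B."""
    return (deaths_a / n_a) / (deaths_b / n_b)

# Hypothetical cohort: 1,000 isolated men of whom 92 died over nine
# years, against 1,000 well-connected men of whom 40 died.
print(round(relative_risk(92, 1000, 40, 1000), 1))  # 2.3
```

A ratio of 2.3 means the isolated group died at more than twice the rate of the well-connected group over the same follow-up period.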

Community or the lack of it, the extent to which people have a non-adversarial relationship with others to support them through difficulties, perhaps biochemical, dietary, virological, bacteriological, oncological and genetic difficulties as well as the more obvious life crises, may determine outcomes no less than the many other powerful causal factors for disease with which we are familiar. This is as much an area for controlled observation and experimental science (and just as absurd an area for mysticism and unfalsifiable speculation) as the details of biochemical transmission of nerve impulses or the actions of enzymes at cell surfaces. Man has only gone through roughly 10,000 generations since his line branched from the other apes. More than any other animal we are adaptable, omnivorous and omnipotential, capable of all crimes but also of all heroisms. Our rate of evolution has been uniquely accelerated by our capacity for non-genetic inheritance, which depends on giving—not selling—useful knowledge to our descendants. For most of that time, survival of individuals has depended on survival of groups, and survival of groups has depended most of all on that particular kind of far-sighted species-loyalty we call altruism. So much so that even when collective crime becomes state policy, criminal behaviour on the lines of 007, Rambo and other monsters is recognized as counter-productive; military murder has to be romanticized into selflessness even if the purpose and consequence is little better than piracy. Is it therefore not likely that there is, in terms of disordered physiology, some serious discordance between our social need for good fellowship, and our social reality, the society of winners and losers, the war of every man against every man?

Mistrust of Grand Causes

Outside the public health field, clinicians are instinctively hostile to generalized explanations of causes of disease, and to the grand social remedies that follow from them, partly because these explanations are usually speculative and simplistic, and ignore the lifetimes of painstaking work required to test them, but also because they lie outside their own fields of action and competence as defined by the Osler paradigm. Their entire personal and historical experience confirms their view that the way to solve big, apparently insoluble problems is to break them down into little ones, at least some of which may be soluble. Because doctors learn their trade in museums of advanced pathology, they look at the causal chains that lead to end-stage disease at its penultimate point, seeing most clearly the few, often complex and technically demanding, corrective options still open to them. The simpler alternative options that might have existed at an earlier stage are no longer there, and these doctors of advanced pathology have little experience of dealing with them in their only appropriate setting, outside hospitals, within the community.

But even clinicians working in the community tend to have essentially the same attitude to causation, and therefore to treatment. They can, though with some difficulty, redefine Oslerian hospital medicine in terms of primary personal care, acting at a personal level on a different set of options appropriate to earlier stages of disease; but because they are clinicians and only clinicians, they will still fail to recognize pervasive causes of disease which are actually affecting the whole population in varying degrees, precisely because they are so pervasive. The causes are too big to be recognized. Instead, they perceive the effects, and are mainly concerned (as the patient is also) with the questions 'Why her, why him, why me?'. The problem is seen purely in terms of individual susceptibility.

For example, there is now an unmanageably vast research literature on the effects of personality and emotional behaviour on susceptibility to coronary heart disease, originating from the work of Rosenman and Friedman in California (Rosenman, R.H., Brand, R.J., Sholtz, R.I., Friedman, M., 'Multivariate prediction of coronary heart disease during 8.5 year follow-up in the Western Collaborative Group study', American Journal of Cardiology 1976; 37:902-10). They hypothesized that men with aggressive, competitive, ambitious personalities, who drove themselves hard against the clock, were more susceptible to coronary heart disease than men who were more easy-going. Early work in the USA tended to confirm this, at least for white middle-class middle-aged men, though attempts to replicate it in the UK have generally not been successful (Johnston, D.W., Cook, D.G., Shaper, A.G., 'Type A behaviour and ischaemic heart disease in middle aged British men', British Medical Journal 1987; 295:86-9), and even in the USA tests of the hypothesis have tended to give tantalizing but inconsistent results (Sensky, T., 'Refining thinking on type A behaviour and coronary heart disease', British Medical Journal 1987; 295:69-70), suggesting that though the hypothesis contains a truth within it, its formulation is faulty; the question is not being asked in a sociologically meaningful way (Cohen, J.B., 'The influence of culture on coronary-prone behaviour', Chapter 14, pp. 191-8 in Dembroski, T.M., Weiss, S.M., Shields, J.L. et al (eds.), Coronary-prone behavior, New York: Springer Verlag, 1978).
The most interesting point, however, is the way in which specialists in this field seem uniquely preoccupied not with changing a form of society which produces this apparently lethal distortion in personality, exalting it and offering it every kind of material reward, but with identifying those most damaged by it and perhaps offering them some personal advice about how to compete without being competitive. High-risk subjects are advised to adopt 'drills' which disrupt the pattern of aggression and positively reinforce corrective behaviours such as going for a walk in the park, browsing in bookshops, having non-working lunch hours, not wearing a watch, or spreading out work schedules to a more leisurely pace (Friedman, M., Rosenman, R.H., Type A behaviour and your heart, New York: Knopf, 1974). There is some evidence that in patients who have already survived a heart attack such remedial training does help to prevent a second one (Friedman, M., Thoresen, C.E., Gill, J.J. et al., 'Alteration in type A behavior and reduction in cardiac recurrences in postmyocardial infarction patients', American Heart Journal 1984; 108:237-48), but all these measures presuppose that the high-risk subject has control of his pattern of work and working environment, an absurd assumption for most employed people.

The obvious alternative is to remove the cause. If there is a maniac in the town centre spraying passers-by with a machine gun, a shop selling flak suits might do good business, but the answer would surely be to call the police and have the man arrested. There are two reasons why this solution does not appeal to most doctors. First, they say that though the evidence of a causal connection may be suggestive, it falls short of proof. Certainly the evidence against socially encouraged type A behaviour is much less convincing than the evidence against cigarette smoking as the main cause of lung cancer in the 1950s, when the most eminent British and US statisticians were both vigorous opponents of the theory, so convinced of their rightness that they both accepted substantial fees from the tobacco companies. I doubt if this is the main reason why British doctors, at least, are doubtful about tackling social causes. Even if proof is incomplete, many will agree that a less competitive, more tolerant, more sharing society in less of a hurry to get everything finished the day before yesterday would be a help to us all in so many ways that possible benefits in reduced heart disease might be relatively insignificant compared with other social gains. There is less belief in the UK than the USA that what’s best for business must always be what’s best for the country and most of the people who live in it, though perhaps there is much the same cynicism about alternatives.

The second reason is more important. Social causes may be, probably are, important as causes of coronary heart disease and probably many other diseases, and individual care and advice probably do have less potential for measurable benefit to the population; but it’s not our job. Somebody has to offer personalized advice and preventive monitoring, and somebody has to do the best they can to limit or reverse heart damage when prevention fails. If we don’t do it, nobody else will. The socially and politically conscious doctor will answer that this is not enough; if doctors are convinced that a major cause of ill-health exists in society which could be remedied, whether it is the national economy which might be changed by political action, or a dangerous local failure to segregate pedestrian from powered road traffic, they should be out there with other socially and politically conscious citizens, holding up their banners, offering their petition sheets, running their sponsored miles, pestering their Members of Parliament, attending the ward meetings of their chosen parties, or even creeping out in the middle of the night to paint slogans on old colliery workings.

I happen to be a socially and politically conscious doctor, and I have done all those things, but I am far from convinced that they were ever more important or effective than my medical work. Only occasionally have they grown directly out of it and been a completely natural part of it; most of the time they have been a bolt-on option, and the machinery of local general practice would have continued much the same with or without my overtly political activity. If social factors influence the behaviour of disease on a community-wide scale, GPs and other primary care workers must concern themselves with them as a normal and central part of their work, not as a fringe option to be added by some doctors and ignored by others. GPs are paid to meet personally presented demand, not to search for needs or to concern themselves with the health of their registered populations. They have organized themselves to adapt passively to the shape of demand thrown up spontaneously by pressures of the symptom and work-absence markets, which although cash-free is still much the same old balance of supply and demand.

Effective action on community-wide causes of ill-health is too important, and already has too much consensus support, to leave to tiny groups of activists prepared to undertake agitational work, and in any case this is no way to run a railway; we are not talking about political symbolism, but the real care of real people, an important job which at present is simply not being done. If at least one GP in every group had the paid time and in-service training to take responsibility for monitoring local health and organizing the local community to act on its own behalf, we would quickly discover readiness to recognize and act on causes, rather than confine attention to effects. Of course, we would only be at the beginning, and would have to learn as we went. Most of the big ideas about causation will turn out to be wrong, but if we don’t start acting we shall never find out.

Concepts of Disease

Like other workers, doctors are comfortable with what they know, and mistrust philosophical approaches which threaten or discard old ideas without assurance of something better. They are also fond of their skills; they may be willing to improve or add to them, but they naturally resent being told they are obsolete or redundant.

In 1952 the medical scientist Sir George Pickering (Pickering, G.W., 'The natural history of hypertension', British Medical Bulletin 1952; 8:305-9) was the first to perceive what should already have been an obvious truth: that levels of blood pressure were continuously distributed through the general population, were related to a continuously distributed risk of heart disease and stroke, and that separation of people into two qualitatively distinct groups, one 'hypertensive' and the other 'normotensive', was therefore an artefact arising from attitudes in professional observers rather than from biological evidence. Virtually the whole medical Establishment closed ranks against the idea, and Lord Platt continued to fight on its behalf until he died 26 years later, though with somewhat diminishing fervour as his supporters thinned through age and defection. At some point Pickering's heresy became official wisdom; it is difficult to say when, because rather than concede a knockout, most of his opponents quietly ducked out of the ring. The truth of what he was saying about high blood pressure became obvious, but the general issues he raised about the nature of disease, and his most important conclusions about the nature of decision-making in its treatment, are still ignored.
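Pickering's point can be illustrated with a toy simulation (my own sketch, not drawn from his data): sample pressures from a single continuous, unimodal distribution and watch the fraction labelled 'hypertensive' shrink smoothly as the cutoff rises. There is no natural break at which 'disease' begins; any dividing line is the observer's choice.

```python
# Toy illustration of Pickering's argument: systolic pressures drawn
# from one continuous, unimodal distribution (the parameters are
# invented for illustration). Any line dividing 'hypertensive' from
# 'normotensive' is an arbitrary cut through a smooth curve, not a
# gap found in the data.
import random

random.seed(42)
pressures = [random.gauss(128, 18) for _ in range(10_000)]

def fraction_above(cutoff):
    """Proportion of the population labelled 'hypertensive' at a given cutoff."""
    return sum(p > cutoff for p in pressures) / len(pressures)

# The labelled fraction declines smoothly as the cutoff rises; no
# threshold separates two biologically distinct populations.
for cutoff in (130, 140, 150, 160):
    print(cutoff, round(fraction_above(cutoff), 3))
```

Raising the cutoff simply relabels some of the population; the underlying distribution offers no point at which 'normal' ends and 'hypertension' starts.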

In his beautiful and perceptive essay on the Platt-Pickering contest, John Swales (Swales, J.D., Platt versus Pickering: an episode in recent medical history, London: Keynes Press, 1985) traced its origins to the conflicting traditions of personal clinical care and bedside teaching, personified by Platt, and of clinical and laboratory science, represented by Pickering. He thought a similar conflict of experience underlay the more muted differences between Sir John Ryle, the first British professor of social medicine, and Pickering's mentor, Sir Thomas Lewis, who founded the first department of clinical science in the UK, at University College Hospital. Ryle believed that methodical observation of the natural history of disease in real people was central to the medical tradition, and the most fruitful area for research; his most influential book was called The natural history of disease. Platt, Swales believes, was in the same tradition. His ten years in private practice, before he devoted himself entirely to teaching, consultancy and the presidency of the Royal College of Physicians, gave him the time and the personal contact between doctor and patient to develop fully the communication skills on which all clinical skills ultimately depend. Putting his final thoughts after a lifetime at the top of British medicine, Platt had this to say in his valedictory Harveian Oration:

Clinical science has shown an unfortunate tendency to follow only the methods of physical science, which try to prove everything by contrived experiments to the neglect of discovery by deliberate and relevant observation and the kind of evolutionary or if you like teleological thinking necessary to the study of biology.

Pickering, on the other hand, though a first-class clinician, was above all an experimental scientist. To me, the accusation that British clinical scientists are in general any less sensitive to the complexities of individual patients, or any less capable of sustained observation over large numbers of cases, is unsupported by experience. That they are frequently insensitive to the feelings and opinions of patients is certainly true, but unfortunately that is equally true of all doctors; the clinical scientists are no worse in this respect than the rest of us, and often better. 'There is a rabble even among the gentry', and it can usually be roused against all who base their work chiefly on laboratory science.

Swales believes these traditions will continue in inevitable but on the whole fruitful and gentlemanly conflict; 'The naturalist and the scientist have always made uneasy bedfellows in British medicine.' This seems to me to miss the essence of the conflict, which is not between bedside and laboratory science, but between the experience of consultant physicians who make uncontrolled observations on moderately large numbers of patients whose relation to any known population is a matter of guesswork, and the experience of medical scientists whose techniques are too detailed, too costly, and occasionally too hazardous to be applied to more than a few patients, but who are by their scientific training educated to accept the inherent difficulties of applying the conclusions of experiments on small numbers to whole populations. It is no accident that it was Pickering, the supposedly remote and impractical clinical scientist, who used the techniques of epidemiology to test his hypotheses on a large representative population; Pickering was the more imaginative naturalist, and opened the door to an exciting new chapter in medicine which Platt was unable to recognize. Ordinary doctors, certainly all GPs, were bystanders in that controversy, but its pivot was really their central concern. About 150 times each week GPs have to categorize a patient as normal or abnormal, well or diseased, however absurd this absolute distinction may appear, because that is how the Osler paradigm works, and that is what they have been taught. In a better-fed, better-housed society, fewer patients have gross end-stage disease, more have earlier, quantitative rather than qualitative departures from—from what? From normal? From the average?
What if, as all the evidence suggests, the entire distribution of blood pressure, and therefore the average level of blood pressure, is too high compared with entire populations living under different conditions, which do not suffer from diseases consequent upon high blood pressure?

In the old days it was simple. The patient complained of a symptom traditionally attributed to high blood pressure, say headache or shortness of breath. You measured blood pressure once, and if the systolic pressure was over 140 or the diastolic pressure was over 90, the patient had the disease ‘hypertension’, with a treatment, antihypertensive drugs. Now, thanks to the machinations of clinical scientists, where there was certainty, there is doubt. Knowing that neither headache nor shortness of breath is likely to be caused by high blood pressure or cured by its control, GPs still measure it, both to reassure the patient, and because every person’s blood pressure must be known if serious but non-symptomatic high blood pressure is to be fully ascertained and treated. Knowing that blood pressure is extremely variable, they measure it not once but (for example) three times over as many days, with perhaps the following result: 146/92, 134/86, 152/104. They can no longer attribute the symptoms to high blood pressure, so a longer interview is necessary to explore other possibilities. It is no longer possible to put all patients either in a white box labelled ‘normal’ or a black box labelled ‘disease’; there now has to be a third, grey box labelled ‘we don’t know’. In my clinic patients now measure their own blood pressure 28 times before we take a decision on starting what may be a lifetime of treatment. Our decisions are based on the average of these 28 readings, on evidence of other risk factors for stroke and coronary heart disease, and on the results of randomized controlled trials in the UK and Australia which involved more than 40,000 patient-years of treatment.
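The three-box approach described above can be sketched as a short decision function. This is a minimal illustrative sketch only: the 140/90 cut-offs and the averaging of repeated readings come from the text, but the exact boundaries chosen here for the ‘grey’ zone are assumptions for the example, not the clinic’s actual rule.

```python
# Sketch of the three-box classification described above.
# The idea of averaging repeated readings is from the text; the numeric
# boundaries of the "normal" and "disease" bands below are illustrative
# assumptions, not a clinical protocol.

def classify_blood_pressure(readings):
    """readings: list of (systolic, diastolic) pairs, e.g. 28 home readings."""
    n = len(readings)
    mean_sys = sum(s for s, d in readings) / n
    mean_dia = sum(d for s, d in readings) / n
    if mean_sys < 130 and mean_dia < 85:       # assumed "clearly normal" band
        box = "normal"
    elif mean_sys >= 160 or mean_dia >= 100:   # assumed "clearly raised" band
        box = "disease"
    else:                                      # in between: the grey box
        box = "grey"
    return (round(mean_sys), round(mean_dia), box)

# The three surgery readings quoted in the text:
print(classify_blood_pressure([(146, 92), (134, 86), (152, 104)]))
# -> (144, 94, 'grey')
```

The point of the sketch is that the quoted readings average out to neither clearly normal nor clearly diseased: they land in the grey box, which is exactly why a single measurement and a two-box rule no longer suffice.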

The Platt earth provided a sure and stable foundation for achieving not very much; naturally so, since as Pickering pointed out we only had to know how to count to two (normal or abnormal). Pickering and his quantified doubt began to introduce the ideas and methods of science to daily practice, not just the products of science unscientifically used. Unlike the hospital specialists, GPs have the resources for this; not the technical resources (which are not very demanding) but the human resources. In my practice I have 1,700 beds, a lot more than any hospital specialist, and they are in a real community where real people live real lives. On such a population base there is no limit to what could eventually be achieved, joining the techniques of experimental medical science with those of epidemiology and of participative democracy in a new streetwise synthesis; medical science with a human face. The research problems of hypertension will probably continue to pile up an increasing confusion of conflicting and uninterpretable facts until this is done (Julius, S., Weder, A.B., Egan, B.M., ‘Pathophysiology of early hypertension: implications for epidemiologic research’, chapter in Gross, F., Strasser, T. (eds.), Mild Hypertension: recent advances, pp. 219-236. New York: Raven Press, 1983.)

If all conditions susceptible to medical treatment were, like high blood pressure, quantitative deviations within a continuous distribution, rather than qualitative transformations like a stroke or a fractured femur, the entire concept of disease would disintegrate. In fact, diseases involving major organ damage do fit reasonably well into the old idea of diseases as something which people either have or have not got. These can be listed as a sort of bestiary, with their names, modes of recognition, and the best ways to shoot them without killing the patient. The concept presupposes that the disease exists as an entity separate from the patient, a sort of obligatory parasite which cannot exist without a human host, but is sufficiently independent to be a potential target. To attack this concept as an imperfect, incomplete reflection of biological reality is stupid; all scientific concepts are imperfect reflections of reality, and the real question is, is there some other concept which is less imperfect and more useful? The answer to that depends on the nature of the human material we are dealing with. In primary care in an advanced industrialized society, it seems to me that the disease concept applies well to advanced organ damage, but certainly not to the quantified and reversible deviations from health which precede it, eventually contribute one of several precipitating causes of death, and will increasingly become the heart of clinical practice outside hospitals.

The new concept we need is anticipatory care of health rather than treatment of disease. It differs from the old in two ways. First, it deals with quantified points on a continuous distribution of risk for events (diseases) which have not yet occurred and are not causing symptoms severe enough to ensure that people will present as patients. It can, of course, be applied to obviously sick as well as to apparently healthy people. A man of 35 with insulin-dependent diabetes and a blood pressure of 154/98 unquestionably has a disease (insulin-dependent diabetes) which makes him qualitatively different from other people, and will if untreated quickly cause intolerable symptoms (thirst, weight loss, and eventually fatal ketoacidotic coma). Population screening is not needed to identify him; he will come soon enough once the disease has got going. His moderately raised blood pressure is entirely different, and cannot usefully be regarded as a disease; if it is not treated it is likely in a diabetic to cause irreversible kidney damage, with dialysis, transplant or death from kidney failure perhaps 10 or 15 years later, but it is not currently causing any symptoms; in fact he is likely to feel a little less well on antihypertensive treatment than without it. Whereas the outcome of untreated insulin-dependent diabetes is certain death within a few months or perhaps years, the outcome of untreated high blood pressure in this range, even in a diabetic, is based on probabilities, not certainties. He needs both disease care within the old Osler paradigm, and anticipatory care of his remaining health in the new paradigm.

Secondly, anticipatory care differs from the old paradigm in that it drops the pretence that any disorders, illnesses, diseases, or injuries can ever be truthfully defined without a social component, sometimes large, sometimes small, but always present in some degree. At a certain biological age, placed chronologically around 85 years, death becomes a normal event; but social attitudes to this have to be taken into account and cannot in practice be disregarded in medical decisions. The question is not usually serious with well-defined disease states causing gross symptoms; but for quantified deviations from health such as high blood pressure, obesity, airways obstruction, depression, period pains, non-insulin-dependent diabetes, high blood cholesterol, alcohol and tobacco addiction or chronic back pain, the points at which medical intervention begins have important social consequences. These must be weighed in deciding whether to intervene in that poorly-defined frontier between health and disease, so detested by hospital specialists, who naturally prefer a well-defined beast at which to aim their sophisticated weapons, but which is home ground for the GP. All medical decisions have social implications which should be taken into account: follow-up, with its potential problems of dependence; prescription of potentially hazardous drugs; legitimized absence from work, or changes in the division of labour within the family; referral to hospital specialists, entering a pipeline from which it may be difficult to escape; surgery, radiotherapy, chemotherapy; even the decision to do nothing but wait and see.

Clinicians usually have the illusion that they already do all this, but this view is not supported by any evidence. Within the Osler paradigm, they regard ‘diseases’ such as high blood pressure, obesity or airways obstruction as independent, wholly biological entities, requiring specific treatments for their control. Social factors are taken into account only in order to obtain compliance from the patient in accepting that treatment. Yet the controlled scientific evidence on which any rational treatment should be based is social as well as biological in nature. The Australian blood pressure trial (Management Committee, Australian National Blood Pressure Study. ‘The Australian therapeutic trial in mild hypertension’, Lancet 1980; i:121.) showed that 13,000 patient-years of treatment were required to save 5 lives with antihypertensive drugs in people with diastolic blood pressures in the range 95-110; there were 6 deaths in people randomized to treatment and 11 in the equal number of untreated controls. The Medical Research Council mild hypertension trial (Medical Research Council Working Party. ‘MRC trial of treatment of mild hypertension: principal results’, British Medical Journal 1985; 291:97.) found that about one in five men treated with the most commonly-used antihypertensive drugs became impotent, a problem reversible by stopping the drug, but only if the (previously unknown) association was recognized. In return for extremely small gains in strokes prevented (and no gain at all in preventing all causes of death put together, which were the same in both groups in both these trials) a substantial price was paid in impaired social function.
Obviously, wherever we are dealing with continuously distributed risks, choice of a threshold for diagnosis or for treatment (which in practice become interchangeable terms if one always implies the other) must depend on weighing all the available evidence for and against intervention, giving full weight to social as well as biological factors.
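The trade-off quoted above from the Australian trial reduces to a few lines of arithmetic, which a short sketch makes explicit (the figures are those given in the text):

```python
# Arithmetic behind the Australian trial figures quoted above:
# 6 deaths among those randomized to treatment, 11 among an equal number
# of untreated controls, over roughly 13,000 patient-years of treatment.
deaths_treated = 6
deaths_control = 11
patient_years = 13_000

lives_saved = deaths_control - deaths_treated
years_per_life = patient_years / lives_saved  # patient-years per life saved
print(lives_saved, years_per_life)
# -> 5 2600.0
```

That is, some 2,600 patient-years of daily drug-taking, side effects included, for each life saved; which is why the social cost of intervention weighs so heavily at these thresholds.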

The Restabilization of Society

The argument has moved a long way from its starting point, the necessity of community and its present endangered state. I have tried to show that effective clinical medicine is not possible without a community dimension: because the community provides most of the informal care on which professional care outside hospitals has to depend, and because the practice of medicine cannot be scientific unless it is conceived of, planned and researched in relation to the needs of real, non-institutional, unreferred populations.

There are familiar arguments, from both the traditional Right and the traditional Left, which agree that society is, and always has been, going to the dogs. In 1912, BMA president Sir James Barr (Gilbert, B.B., British Social Policy 1914-1939, London: Batsford, 1966.) predicted that health insurance ‘would impair the independence, increase the sickness, and hasten the degeneracy of a spoonfed race’. In 1979 a retired coal miner, asked if he thought things were getting better, replied ‘Oh ah, things are getting better, it’s only people are getting bloody worse’ (Seabrook, J., What went wrong? Working people and the ideals of the Labour movement, p. 31, London: Gollancz, 1979.). There is a lot of support for both these opinions among all classes of society, and though the clichés differ, the thought is essentially the same.

There is enough truth in both perceptions for all of us with sustained experience of real work in real communities to feel some sympathy, at least some of the time, with both these cries of despair; but the world has been going to the dogs for so many thousands of years, in which we have achieved such obvious moral as well as material advances, that we have to recognize them as the snarls of tired and bitter old men which, though understandable, are of no help to anyone. A certain minimum credulity and optimism about the future of the human race is not only a fundamental requirement for humane science, but also justified by the balance of historical evidence. Community is under attack, as it has always been, by powerful people who think they can do better for themselves by climbing on the backs of their fellows, and by weak people clinging to their coat-tails; but to suggest that it is down on a count of nine, and that all we can do is curse the world’s fate, is a betrayal of the cumulative struggle of 10,000 previous generations both to survive and to make some permanent gain for civilization against piracy. As many or more people are working and speaking up for community today as at any time in the past.

One of the functions of medical care has always been to stabilize society. The negative aspect of this has been obvious; doctors have largely replaced parsons as respectable and respected agents of the ruling class, policing the social insurance system in much the same way as clergy once kept the keys of heaven and hell. Yet it is also true that in a world packed with dynamite, some social peace has to be preserved even for necessary and inevitable conflicts between classes and ideologies to continue without destroying civilization itself, whatever opinions we hold about the ways in which it should be socially transformed. As Marx and Engels wrote in 1847, fundamental class conflicts throughout history have ultimately ‘ended either in a revolutionary reconstitution of society, or in the common ruin of the contending classes’ (Marx, K., Engels, F., Manifesto of the Communist Party, 1847). A primary care system based on participative democracy within relatively small units of population, the communities in which people actually live, could be an important obstacle to the de-civilization of society, whatever its motives or cause, as well as a starting point for community of a new kind.

The economic and political system we live in, the assumptions it leads to about what is possible, the limits to popular imagination, have not changed fundamentally since Bagehot wrote in 1867; but the quantitative changes in production of every kind of material and intellectual wealth, actual and potential, are colossal. The capitalist system, which cannot exist without continually revolutionizing techniques of production in search of lower labour costs and higher returns on investment, or without expanding to seek cheaper labour wherever it can be found, is compelled continually to destabilize itself. Nothing and no one is ever allowed to stay still, or even to proceed more calmly and peacefully in search of a better life; relax for one moment and whole industries are gobbled up, to reappear wherever there are poorer, more desperate but for the time being more profitable people. The gap between how the world lives and how it could live gets wider and more obvious every day.

We need a transformation of the ways in which we think about society and community, about medical professionalism, about health services, about how they should be financed from the social product, and about how central and peripheral leadership and initiative should be integrated. A social base for such a transformation exists, real though presently dispersed. How it might be mobilized is the subject of the final chapter.
