“Health for All?” critically explores global moves towards Universal Health Coverage and its language of rights to health, equity, social justice and the public good. Highlighting emerging ethnographic and historical research by both young and established scholars, the series explores the translations and frictions surrounding aspirations for “health for all” as they move across the globe. The series is edited by Ruth Prince.
The ambition of achieving universal health coverage, at least for the working population, was first articulated on the global stage by the International Labour Organisation after World War One. In this article we trace the changing forms and fortunes of the efforts made since then by different international agencies. We suggest that while this ambition has long been justified in terms of both rights and economic pragmatism, there has been a shift towards the latter in recent times.
Universal health coverage (UHC) is a shifting signifier. Today, in World Bank parlance, it implies access for all to a cost-effective package of essential services and medicines, delivered through a mix of public and private providers. Its rationale is framed both in humanitarian terms, of uncoupling medical care from financial hardship, and in the practical economics of human capital. However, adherents of the social medicine tradition prefer another UHC – universal health care – which instead denotes access to a comprehensive range of services, organised by the state and free at the point of use. In the words of Asa Cristina Laurell, this more expansive universalism is anchored in an understanding of ‘health as a human need and a social right’.
This tension runs through UHC’s etymology. The earliest usage of ‘universal’ in this sense dates from 1911, in an English text describing Germany’s Bismarckian system of health insurance, which at that time aspired only to include blue collar workers, not the whole population. Adjoined to ‘coverage’, the term seems to have entered the health policy lexicon in the 1960s, again signifying the field of insurance, which was one mode through which Western nations financed their increasingly inclusive health systems. Thus ‘universal’ might not denote the whole population, and ‘coverage’ need not imply the state as provider.
Consideration of UHC as an aspect of global policy similarly reveals a long-running contest between a rights-based expansive vision and a more constrained, economic formulation. The earliest articulation came a century ago, in the peace settlement that followed the carnage of World War One. To manage dispute resolution, a new League of Nations had been created, whose Covenant vowed also to ‘take steps in matters of international concern for the prevention and control of disease’, and ‘secure and maintain fair and humane conditions of labour’. These goals led to the creation of two agencies important to the foundations of global health: the League of Nations Health Organisation (LNHO) and the International Labour Organisation (ILO). The LNHO’s activities included epidemic disease surveillance, the international classification of diseases, and public health system building. While universalism was not on its agenda, it did pioneer the promotion of social medicine programmes, combining rural health care with economic and social development.
The ILO, meanwhile, became the first international organisation to promulgate notions of universal health care. Medicine was not central to its remit of labour rights, though its constitutional aims of 1919 included ‘… the protection of the worker against sickness, disease and injury arising out of his employment’. The context was the success of various Western nations (though not the USA) in establishing proto-welfare states, following the lead given in the 1880s by Bismarck’s Germany. This had demonstrated that state-mandated workplace insurance to address sickness, accident and old age was both functional and popular. Other nations followed suit, partly responding to the spread of representative democracy, partly seeking to improve national productivity, and partly reacting to the emergence of organised labour, whose dissent could be neutralised with welfare benefits. Thus, in concert with tax-funded public health services and voluntary charities, working-class citizens might attain reasonably comprehensive cover, albeit with many holes in the safety-net.
Drawing particularly on the skills of German bureaucrats, the ILO aimed to generalise the social insurance arrangements which sustained these systems. Initially, it worked through standard-setting conventions that member states were intended to ratify. Its first two ‘sickness insurance’ conventions, for industrial and agricultural workers, were adopted in 1927. The aim was that ‘(p)ractically all persons under a contract of service…’ should be entitled to sickness benefit and medical care, funded by compulsory contributions from worker and employer. In the event only a handful of governments ratified, as the Great Depression forced different priorities on them. The interwar crisis did however push the ILO to develop its advisory and advocacy roles, particularly in the successor states of Central Europe and in Latin America. By 1933 it was arguing, along with the LNHO, that ‘compulsory sickness insurance must be regarded as the most appropriate and rational method of organising the protection of the working classes.’
This position hardened during the Second World War into a bolder call for comprehensive medical care by right, for ‘all members of the community … whether or not gainfully occupied.’ The ILO’s Philadelphia Declaration of 1944, along with its Medical Care Recommendation, outlined two pathways to UHC, either through Bismarckian social insurance, or through taxation on the model pioneered by New Zealand in 1938. Despite endorsement by the ILO’s General Conference, this ambitious vision of UHC would not last. A broad coalition of hostile interests emerged, including medical professionals, governments preferring voluntary provision, newly-independent colonial nations resentful of Western influences, and the US, which decried publicly funded systems as ‘socialised medicine’. The resulting social security convention of 1952 was so diluted, with so many exclusions and exceptions, as to be ineffective. The ILO’s push for UHC was, for the moment, moribund.
The 1950s and 1960s might in principle have seen its revival. Britain in 1948 had implemented its National Health Service, the archetypal universal system that was comprehensive and free at the point of use. Other advanced industrial nations moved towards population coverage funded through mandated insurance, social security, or local taxation. In the Communist bloc, statist health systems were established, such as China’s rural and industrial insurance schemes. Even the United States enacted partial cover for its older and poorer citizens, through Medicare and Medicaid. How though could similar access to medical services be achieved for poorer, post-colonial nations, now euphemistically dubbed ‘developing’ countries? Riding the tide of the UN’s International Year of Human Rights, the ILO sought to answer this by reviving earlier calls to extend ‘economic, social and cultural rights’. It then adopted as a blueprint a more generous Convention on Medical Care and Sickness Benefits (1969).
What about the World Health Organisation (WHO)? At its inception, the WHO had refused a full-blooded leadership role in pushing for UHC. This was despite the efforts of a social medicine faction amongst its founders, who had allied with the ILO to make this a priority. Instead its main focus would be major disease programmes, notably the eradication of smallpox and malaria through biomedical initiatives. WHO’s work on health services would be less ambitious. Its constitution pledged a ‘study and report’ function and the obligation to provide technical assistance to members, for example on planning services and workforce requirements. In the early 1970s, these planning models were infused with the language of ‘systems analysis’, an approach derived from operational research in business and defence. However, this new discourse of ‘health systems’ was technocratic and politically neutral, avoiding contentious issues such as rights and redistributive financing.
In practice then, postcolonial nations found their own way, though resource constraints meant UHC was never a realistic prospect. Funding was through a mix of taxation, insurance for the formally employed, and overseas aid, whether from missions, philanthropies, bilateral flows, or the new multilateralism of UN agencies. The ILO offered technical assistance in establishing social security structures, while painfully aware of their limited reach. A new, more significant player was the World Bank, which, under the presidency of Robert McNamara, became involved through its population programmes. Established at the Bretton Woods Conference in 1944, the Bank’s mission had rapidly evolved from post-war reconstruction to lending for economic development. Initially financing infrastructure projects such as dams and roads, in the 1970s it was influenced by the ILO’s ‘basic needs’ agenda. This argued that social goods should also be intrinsic to economic development, and not deferred until growth took off. Along with lending for maternal and child health services, it joined the WHO in an effort to control onchocerciasis in West Africa. Success with these initiatives encouraged it in 1980 to begin direct loans for health services and essential drugs, offering concessional rates to poorer countries.
This coincided with a brief period in which the global community endorsed ‘Health For All’. The policy goal was announced at the World Health Assembly in 1977, and the following year’s Alma Ata conference of WHO and UNICEF declared health to be ‘a fundamental human right’, attainable through Primary Health Care. Specifically, this signified low-tech medicine, community engagement in health services, and increased use of paramedical staff. The new agenda partially reflected a consensus that vertically organised disease programmes had run their course, following the success of smallpox eradication and the failure against malaria. It also built on initiatives generated in countries across the Global South, which had piloted cheap, basic health care for rural populations, and integrated health with economic development. Once again though, choices about funding were left to individual nations, which would choose their own mix of taxation, insurance, user fees and foreign aid.
However, the hopes for Primary Health Care as the route to UHC soon faded, as the political and economic context changed. ‘Neoliberal’ and conservative governments came to power in the 1980s, hostile to costly welfare states and enthusiastic about market alternatives. Meanwhile the slow collapse of the Soviet bloc meant that Western leaders redefined ‘human rights’ as synonymous with political rights of assembly and expression, not social benefits. More crucially, a sovereign debt crisis confronted much of Latin America, Africa and Asia, the result of ‘exuberant’ Western lending and the long-term impact of the oil price rises. Neo-colonial economic hierarchies reasserted themselves. There would be no debt forgiveness for poor nations, and access to loans from the International Monetary Fund would be contingent on agreeing to ‘structural adjustment’. Essentially this meant trade liberalization, encouraging the private sector and rolling back the state.
For health, the result was a retreat from universalism to ‘selective’ interventions of demonstrable cost effectiveness. UNICEF led the way, prioritising funding for focused programmes of child health, with an emphasis on mass immunisation. Meanwhile, the World Bank made a foray into health systems financing, privileging concerns with equity and efficiency over human rights. Its aim was to divert governmental spending away from costly curative medicine, which unduly benefitted the urban middle classes, and towards preventive measures. With respect to health services, it assumed a broad willingness to pay out-of-pocket costs, and so recommended that borrower nations shift to funding through user fees or insurance. This proved to be a misjudgment, as evidence mounted that user charges were a barrier to utilisation. Compounding these setbacks for UHC was the spread of HIV/AIDS, and the daunting challenges of tuberculosis and malaria. The 1990s saw unprecedented levels of money flowing into development aid for health, and new actors alongside the UN agencies, including philanthro-capitalists and public-private partnerships. All these changes militated against a concern with health services, pulling attention back towards disease-focused interventions where measurable gains could be achieved.
What were the prospects for UHC at the start of the 21st century? The original champion of health rights, the ILO, was now a marginal player, its influence eroded by the numerical decline of once powerful labour movements in the West and Latin America, and by the offshoring of industrial production to places with weak protections. The ILO therefore reverted to campaigning for minimum standards of workplace rights rather than the expansive social benefits which it once advocated. The WHO also diminished in importance within the new global health architecture, although from 2000 it successfully repositioned itself as a broker of advice on health systems design and management, and a disseminator of performance indicators. Countries like Ethiopia (home of WHO’s current Director-General, Tedros Adhanom Ghebreyesus) and Thailand provided promising examples of how UHC could be achieved, through scaling up insurance and innovation in primary care. Meanwhile the World Bank developed more eclectic approaches to financing, including community-based health insurance, microinsurance and the estimation of the ‘fiscal space’ that would allow public funding. Along with the Institute for Health Metrics and Evaluation, it also compiled detailed country estimates of the ‘disease burden’, using the Disability-Adjusted Life Year, an indicator that combined mortality with morbidity to better guide national service planning.
These are the fundamental dynamics shaping conditions for UHC at the start of 2020, in the era of the United Nations’ Sustainable Development Goals, of which UHC is a part. This is very different from its beginnings in an era when nation-states sought legitimation as welfare states, and the ILO couched its crusade as one of workers’ rights. The labour solidarities which first gave it leverage have gone, and there are no stirrings of transnational class sympathies favouring a transfer of resources from North to South. World Bank estimates of the cost of achieving an essential package of UHC today suggest at least a doubling of aid flows, to supplement the capacities of poor countries. These estimates also assume that governments accept the pragmatic and ethical value of pursuing such a policy. Up until the start of the COVID-19 pandemic, slow progress in the teeth of populist nationalism seemed the most plausible global prognosis.
How might the current turn of events affect these prospects? If countries with UHC are seen to have lower rates of coronavirus fatalities than those without, then clearly proponents will have a compelling argument for change. However, at the time of writing (April 5th 2020) it is too early to meaningfully correlate national mortality and morbidity patterns with coverage indicators. There are also emerging reasons to doubt that absence of coverage per se has been decisive to the spread of the virus. Social epidemiology, for example, points to habits of trans-generational co-residence to explain high infection levels in some countries, or to patterns of masculine hygiene behaviour and consumption to account for the common trend of excess male deaths. Clinical epidemiology meanwhile raises possibilities that genetic predisposition, or prior exposure to the BCG vaccine, may also be factors. All this awaits verification.
What is already clear though is that political action on behalf of public health by states and intergovernmental bodies has been decisive. Prior planning for bio-security, decisiveness in applying lockdown measures, and competence in procuring and distributing protective equipment and testing facilities have quickly emerged as key issues. It therefore seems most likely that strengthening public health administration at local, national and international levels will be the most immediate policy focus when the crisis subsides. Whether UHC can advance on the coat-tails of a revivified social medicine, or will be consigned to a second-order problem by battered economies, remains to be seen.
Martin Gorsky is Professor of History in the Centre for History in Public Health at the London School of Hygiene and Tropical Medicine. His early research interests lay in the development of hospitals and mutual health insurance in nineteenth- and early twentieth-century Britain, and in the political history of health system change. His current project examines the history of health systems thinking and policy-making, both within international organisations and in case-study nations including Nigeria, Colombia, Malaysia and New Zealand.
Christopher Sirrs is a Research Fellow at the Department of History, University of Warwick. He earned his PhD at the London School of Hygiene and Tropical Medicine in 2016, focusing on the history of occupational health and safety regulation in Britain. Since then, his research has revolved largely around global health, exploring the intellectual and policy history of ‘health systems’ thinking, and most recently, the rise of anxieties around ‘fake’ drugs. His recent works include a journal article exploring the evolution of the International Labour Organisation’s approach to social health protection, and a forthcoming book chapter on the development of ‘international health accounting’.
- Defining Wellbeing: Tensions in the World Bank’s approach to Universal Health Coverage
- Entrepreneurship and E-Health: How Promises of Innovation are Transforming India’s Primary Care Sector
- Health for all? Access to healthcare among precarious populations in Norway
- The Invention of Infodemics: On the Outbreak of Zika and Rumors
- How to Make Sense of “Traditional (Chinese) Medicine” In a Time of Covid-19: Cold War Origin Stories and the WHO’s Role in Making Space for Polyglot Therapeutics