The first medical model

The end of the asylum era was foreshadowed by the ‘first medical model’ in the 1920s and 1930s. Interest in psychiatry had received a boost during the First World War with the need to treat shell-shocked soldiers, while at the same time the asylums had become even more overcrowded and neglected. It was only from the 1920s onwards that really effective treatments were discovered and introduced. These caused widespread changes in attitudes and restored optimism. ‘Lunatic’ was replaced with ‘mental patient’, ‘asylum’ with ‘mental hospital’, ‘certification’ with ‘involuntary admission’, and voluntary admissions became common for the first time: a truly revolutionary change in perspective.

There had been a steady improvement in the drugs used to control agitation before this time, but two new treatments were epoch-making: malaria treatment for cerebral syphilis and electro-convulsive therapy.

Julius Wagner-Jauregg (1857–1940) and malaria treatment

Wagner-Jauregg was the first psychiatrist to be awarded the Nobel Prize for Medicine, receiving it in 1927 for his 1917 introduction of malaria treatment for cerebral syphilis (then called general paralysis of the insane, GPI).

Before effective treatments for syphilis existed, a small proportion of chronically infected patients went on to develop the disease in the brain, with disastrous consequences. It often took 20 years to develop, by which time the patient might be a settled family man. The terror it represented for 19th-century society is vividly captured in Ibsen’s play Ghosts. Onset of mental symptoms was sudden and dramatic. The philosopher Nietzsche inexplicably embraced a horse that was being abused in the street in Turin and within days was confined to a mental hospital; he died 11 years later, never having recovered. Deterioration was tragic and humiliating. It was often associated with delusions of grandeur (hence all those cartoons of patients convinced they were Napoleon), and ended in dementia.

Wagner-Jauregg’s treatment consisted of infecting the patient with malaria parasites and waiting, with careful nursing, while the high fever raged. Over 10–12 cycles this killed the syphilis germs. The malaria could afterwards be treated with quinine. This treatment was difficult and risky but the alternative held no hope. GPI was effectively cleared from mental hospitals long before effective antibiotics arrived. Malaria treatment restored optimism to mental hospitals and strengthened the professional pride of the doctors and nurses who had to manage this difficult, but effective, treatment. It also forged clearer links with general hospitals where the patients often had to go to be treated. In doing this it became clear that involuntary patients would often cooperate with treatment and this stimulated a reassessment of the need for so much compulsion.

Electro-convulsive therapy

While malaria treatment is purely of historical interest, electro-convulsive therapy (ECT) is still widely used. Psychiatrists knew that epileptic seizures often caused profound changes in mood – either exciting or calming patients in the hours after a fit. It was also thought that epilepsy was uncommon in patients with schizophrenia, so the idea developed that perhaps fits protected patients against this disease. Fits were induced in schizophrenia patients from 1935 by getting them to inhale camphor or by injecting a chemical called metrazol. The results were promising, with many patients improving. Unfortunately the experience (in particular the minutes leading up to the fit after the metrazol was injected) was very unpleasant indeed, with mounting dread, so many patients refused the treatments.

An Italian psychiatrist, Ugo Cerletti, came up with the idea of using a weak electric current to initiate the fit and used it on his first patient in 1938, with striking results. Several psychiatrists started to use ECT and its results were truly remarkable. While it did calm very agitated schizophrenic patients, its most dramatic results were with depressed patients, many of whom made sustained recoveries. If this all sounds a bit barbaric, it pays to remember that depressed patients in the 1930s (even in very good mental hospitals) often stayed for years, and up to one fifth died during the admission. Initially ECT was given without anaesthetic and was clearly a frightening experience, often complicated by headache, memory loss, and small fractures if the fit was very strong. For the last 50 years patients have received a short-acting anaesthetic and a chemical to block the muscle contractions, so there is no fit to see and no risk of fractures. Headache and memory loss are still problems, but patients don’t recall the actual seizure.

The discovery and durability of ECT is typical of many developments in psychiatry. The idea that started it (that epilepsy protected against schizophrenia) was wrong but the treatment worked, although more in depression than schizophrenia. We still don’t know why it works, but it certainly does. It remains one of the most effective treatments in psychiatry and (despite its wider reputation) the one that most patients who have had it say they would want again.

Mental health legislation

Psychiatry is unique within medicine in being able to compel treatment against a patient’s clearly expressed wishes. Consequently most countries have evolved specific legislation to permit this and to monitor it. The whole of the asylum movement was firmly based in such legislation. The developments in England in the 19th century are easy to follow because it was an early nation-state with centralized government and little scope for regional variation.

The first legislation was to regulate madhouses. All this did was to register them: it set no standards, but it allowed an individual madhouse to be closed in the event of flagrant abuse. The purpose of the Asylum Act of 1808 and the Lunacy Act of 1845 was to ensure that care was provided and to prevent exploitation of the vulnerable mentally ill. The 1808 Act allowed for ‘the removal of the furiously mad’ from workhouses to the asylum.

Over the next half century, public concerns shifted from the neglect and abuse of the indigent mentally ill to the spectre of malevolent incarceration of the sane to rob them of their wealth. The ‘Alleged Lunatics’ Friend Society’, with an admiral of the fleet as chairman, gained considerable parliamentary and public support in late 19th-century Britain. Georgina Weldon (a ‘spirited, attractive, wealthy and well connected woman’) filled the Covent Garden Opera House in London in 1883 for a rally to challenge her recent incarcerations, and eventually won her case. Increasing public disquiet was reflected in the 1890 Lunacy Act. This highly legalistic document, several hundred pages and 342 sections long, prioritized the protection of patients’ rights to such an extent that early and voluntary treatment became virtually impossible. The leading historian of mental health legislation, Kathleen Jones, wrote that ‘it stopped progress in mental health policy in its tracks for half a century’.

So swings the pendulum of public attitudes to mental health. Virtually every developed nation is struggling to balance legal rights and therapeutic needs, society’s needs with the patient’s. Asylums limped onwards for another 50–60 years, mired in legislation and inhibited from innovation apart from the welcome treatment advances of the 1920s and 1930s. It was to be another 30 years before this awesome international institution was finally challenged and moved towards its end.

The move into the community

After decades of being hidden from view, the mentally ill are now very much in the public eye. Hardly a week goes by without some headline about the plight of the homeless mentally ill or an incident involving a disturbed individual. ‘Care in the Community’ has become an international preoccupation with much soul-searching and fear of violence and disorder. How has this situation come about? Is it really so disastrous and, if so, what can be done about it?


The number of psychiatric beds in the West has shrunk to less than a third of what it was in 1955. Nearly every large mental hospital in the UK and most in the US have closed. The few remaining house only a fraction of the patients they once did. Chronic wards where long-stay patients lived out their lives have virtually disappeared. In the mid-1950s there were 500,000 psychiatric inpatients in the USA and 160,000 in the UK. Now there are fewer than 100,000 in the USA and fewer than 30,000 in the UK. This trend is virtually worldwide. The process, inelegantly entitled ‘deinstitutionalization’, started by reducing overcrowding and then closing wards; the last 15 years have finally seen the closure of whole mental hospitals.

It is usual to attribute this emptying of the asylums to the discovery of antipsychotic drugs in the early 1950s. This was clearly the major force, but it is not the whole story. Fundamental changes in social attitudes towards the mentally ill were afoot before these drugs were introduced. The impact of the new drugs varied enormously – from wholesale discharges in some countries to almost no effect in others. Social attitudes and radical rethinking within psychiatry also exerted powerful influences. Later, financial considerations entered the picture. But let us start with the drugs.

The drug revolution

Like so many important discoveries, chlorpromazine’s antipsychotic effect was found by pure chance. A French navy anaesthetist researching trauma and shock noted how it calmed patients post-operatively without sedating them. Two psychiatrists, Delay and Deniker, tried the drug in St Anne’s hospital in Paris in 1952 and were astounded by the results. By the tenth patient they knew they had a breakthrough. Over the next four years chlorpromazine became the front-line treatment in psychotic illnesses and the atmosphere in psychiatric wards was totally transformed. At its most immediate, the drugs humanized the wards. Staff could begin to get to know their patients rather than just controlling them. Episodes of illness were both shorter and less disturbed, so that rehabilitation and early discharge (before family relationships and jobs were lost for good) became realistic possibilities. Initially the drugs were used only for treating acute episodes, but by the 1970s it was realized that staying on them reduced the risk of further breakdowns. This ‘maintenance therapy’ has become the cornerstone of long-term treatment of schizophrenia and other psychoses.

Over the last 50 years a whole range of antipsychotics has been developed. Most are about equally effective but their side effects are very different. The original chlorpromazine-like drugs often made patients stiff and lethargic. Newer drugs avoid the stiffness but can cause weight gain and diabetes. Some of these drugs became available as long-acting injections which means the patient can forget about taking them as long as they get their injection every two to four weeks.

The drug revolution was not restricted to antipsychotics. The first of the antidepressants (imipramine) was introduced in 1958. These had a longer lasting effect than ECT and were more acceptable to many more patients – by the early 1980s US physicians were writing 10 million antidepressant prescriptions a year. Lithium carbonate (a naturally occurring substance) was noted in 1949 to have a calming effect. It was introduced as a long-term ‘mood-stabilizing’ treatment for manic depressive disorder in 1968 and has substantially reduced the risk of further breakdowns.

This is not the place to detail the developments in modern psychiatric drugs but just to note that the progress has been accelerating. We now have a wide range of drugs for most recognized disorders. However, these are not ‘magic bullets’. No drug will completely cure all patients with a specific disorder but, carefully chosen, drug treatments can make a real difference to the vast majority of patients with mental illnesses.

The revolution in social attitudes

The Second World War

Psychiatry changed radically during the Second World War and gained new confidence because its contribution was highly valued (both in the selection of soldiers and in the acute treatment of combat disorders). Its increased profile and importance brought many doctors into it who would never have contemplated work in asylums. Fresh minds were brought to old problems. Previously healthy men transformed into nervous wrecks by battle challenged old fatalistic genetic hypotheses. Dramatic recoveries from battle-trauma with practical treatments (e.g. barbiturate injections to release or ‘abreact’ emotions from recent terrifying experiences) confirmed the role of stress and trauma. Psychiatry became an active and optimistic, almost glamorous, branch of medicine.

Therapeutic communities

The treatment of acute war neuroses by drug treatments was not the only Second World War advance. Psychiatrists with a psychoanalytical training obtained influential military adviser posts in both the US and UK. They explored how organizations themselves could influence mental health and recovery and developed the ‘therapeutic community’.

The therapeutic community emphasized that the organization of hospitals (or prisons or schools or offices for that matter) has a major impact on the well-being of those in them. For psychiatric patients it can be an opportunity for self-learning and recovery. Army psychiatrists noted the problems of treating ordinary private soldiers for psychological problems because they, the doctors, were senior officers. Rank and status simply got in the way. They actively reduced status differences in their units, encouraging informality and stressing the patients’ capacity to work together to help each other and solve problems. This allowed neurotic and disabled individuals to learn new ways of dealing with their problems in a democratic, tolerant, and enquiring group environment.

The therapeutic community movement improved care in mental hospitals and subsequently in prisons and residential schools for disturbed children and adolescents. It is a victim of its own success, as its lessons have become so accepted (even in commercial organizations) that their origins are forgotten. Psychoanalysis has suffered a similar fate.

‘Institutional neurosis’ and ‘total institutions’

About the same time it was recognized that traditional mental hospital environments could have a profoundly damaging impact on patients. Hospitals could themselves be the cause of some of the problems that they were striving to treat. Long-stay patients (usually those suffering from schizophrenia) who had been inpatients for years or decades were noted to be apathetic, self-neglecting, and isolated. This had always been considered a consequence of schizophrenia (a so-called ‘schizophrenia defect state’), and the plight and dependency of these individuals was one of the arguments sustaining mental hospitals.

This aspect of schizophrenia (unlike the acute symptoms of hallucinations, delusions, and agitation) did not respond much to the new drugs. But the hospital itself appeared to make a difference. It had always been known that there were good mental hospitals and bad ones. A study of three hospitals of similar size and staffing with equally ill schizophrenia patients in the 1960s found markedly different levels of apathy and self-neglect. The study showed that the differences related to the levels of activity and variety provided in daily routines.

A psychiatrist, Russell Barton, went further and proposed that much of this apathy was a response to living in an institution which denied personal responsibility. The apathy was a consequence of disuse – you simply stopped looking after yourself because somebody else always did it for you. Barton called this ‘institutional neurosis’ to stress that its cause was the hospital, not the schizophrenia. He reorganized things to give his patients more independence, with remarkable results. Many patients flourished in the new regime and were soon discharged. Rehabilitation (helping patients regain their lost skills and abilities) became a preoccupation in most mental hospitals and optimism grew that most of these apathetic, disabled patients would no longer need inpatient care.

‘Institutional neurosis’ stimulated change but its extent was undoubtedly exaggerated. There is an apathetic state that develops as part of long-term schizophrenia but it had been magnified by hospital routines. There were even some patients who had recovered and the staff had simply not noticed! Many of Barton’s early patients embraced their independence effortlessly, but such ‘overlooked’ patients are now rare and ongoing support is usually needed.

Erving Goffman and total institutions

The Three Hospitals Study and Russell Barton’s institutional neurosis shook up the professions, but they pale alongside the international shock wave caused by Asylums (1961) – a book by the American sociologist Erving Goffman. This groundbreaking study (he worked ‘undercover’ for a year as a cleaner in the wards of an enormous mental hospital in Washington, DC), his clear and radical thinking and, not least, his elegant writing simply stunned the establishment. Goffman described in convincing detail what really went on in an asylum – not what people thought went on. Doctors and nurses thought they shared a common understanding, but Goffman showed that they did not – doctors judged patients using a disease and treatment model, whereas the nurses made judgements based more on behaviour and on patient motives. More tellingly, doctors thought they ran the units, but it was clear that for day-to-day life nurses, aides (and even other patients) set the rules and culture and held the authority. Goffman was not sympathetic to the asylum.

He went further. He concluded that the dehumanization and degradation of patients resulting from inflexible routines and the absence of individualized care were not simply the regrettable effects of poorly trained staff and lack of resources (the usual explanations). He argued that such institutions actively eroded individuality. This was particularly characteristic of what he called ‘total institutions’ such as asylums, prisons, and the army. These typically meet all their members’ needs – e.g. food, shelter, company, leisure. They rely on rigid distinctions between staff and patients (or prisoners and warders, or officers and men) and on demeaning rituals to erode and suppress individual identity. He argued that they do this to enforce discipline and make large groups of people more easily manageable. In the hospital in which he worked, he cited the highly structured admission process – which included not only medical examination but delousing, bathing, and hair cutting for all patients – as one such potent and symbolic degradation.

Whilst (understandably) initially unwelcome to the professions, Goffman’s writings have been a major force in driving the closure of the mental hospitals. His book Asylums is still the most quoted text in modern sociology 40 years after its publication. Ken Kesey’s 1962 book One Flew Over the Cuckoo’s Nest (and its enormously successful 1975 film adaptation starring Jack Nicholson) vividly portrayed the unacceptable face of such large impersonal asylums.


