Into the 21st century
New technologies and old dilemmas
It doesn’t matter what he does, he’ll never amount to anything.
(Albert Einstein’s teacher, 1895)
Computers in future may weigh no more than 1.5 tons.
(Popular Mechanics, 1949)
We don’t like their sound, and guitar music is on the way out.
(Decca Records on the Beatles, 1962)
It is risky to make predictions. Psychiatry at the beginning of the 21st century is very different to that of just a few decades ago. Who could have imagined that we would be able to visualise not only the living brain’s structure in minute detail but even watch as different regions light up with specific emotions or during hallucinations? Could we have foreseen diagnoses derived by computers, or psychotherapies delivered on the web with neither psychiatrist nor psychologist involved? Psychiatry is changing, and the rate of change intensifies the dilemmas raised above.
Optimists believe these will fade away as the scientific base of psychiatry becomes firmer, but there is little evidence for this yet. The areas of concern may shift, but they seem, if anything, just as pressing as ever.
Improvements in brain science
We will continue to experience an acceleration in our understanding of brain functioning. Neurosciences have become the ‘hot topic’ in biomedical research, driven by increasingly powerful tools for visualising and measuring brain functioning. For so long the brain was a mysterious, apparently inert organ with no moving parts. Modern imaging techniques reveal its dynamism, allowing us to observe activity spread through it, with individual areas activating sequentially in response to stimulation. And not just responding to external stimuli, but solving mathematical problems or even discriminating between pictures of people we like or don’t like.
The big leap forward came with imaging. The structure of the brain has been studied in great detail by anatomists for over a century. They identified the function of areas by examining the brains of people who had had strokes and lost different functions. Shrinkage or damage to the brain could be demonstrated in post-mortems and by x-ray techniques. These later involved either injecting dye into the bloodstream or air into the ventricles (fluid-filled cavities that exist normally in the brain). Visualisation of the body’s structures took a great leap forward with CAT scanning and then MRI scanning. CAT scanning uses x-rays, and MRI scanning magnetic fields, to produce amazingly detailed images of ‘slices’ through any part of the body (including the brain) that can be used to construct 3D pictures. While these techniques were useful in diagnosing brain tumours and demonstrating dementia, they were of little help in most psychiatric disorders. Indeed, the term ‘functional disorder’ has long been used as shorthand for psychiatric illnesses precisely because no structural abnormalities could be shown.
Functional imaging has been a further advance for psychiatry. There are already three different types – measuring increased blood flow, measuring cell metabolism with the use of labelled chemicals, and now even directly measuring the electrical activity of nerve cells. We can now show that thinking and feeling are reflected in activity in different parts of the brain, and that the same parts of the brain are active when patients hallucinate as when we hear real voices. Functional imaging has confirmed the complexity and interconnectedness of brain activity.
Have these imaging techniques changed psychiatry yet? They have certainly increased knowledge and helped us understand the biochemical systems in the brain associated with disorders, and this has improved drug research. There are, as yet, no major advances in clinical practice as a direct result. Some early experiments are under way to transplant brain cells in Parkinson’s disease into areas where activity has been demonstrated to be deficient. There has even been a trial of inserting minuscule ‘batteries’ into the brains of some people with chronic depression to see if they stimulate an increased release of transmitter substances and alleviate the depression.
This is a long way from the ‘Cyborg’ fantasy of so many films where small computer chips are inserted into the brain and control behaviour. There is a noticeable reluctance among neuroscientists to develop brain interventions for mental illnesses. Interfering directly with an individual’s consciousness and taking it out of their control generates strong resistance in scientists, as in the rest of us.
This is in contrast, however, to surgery and cell transplantation in brain diseases such as Parkinson’s disease, which do not have the same implications for selfhood.
The human genome and genetic research
Ever since Crick and Watson clarified the double helix structure of DNA in 1953, genetic research has been in overdrive. Genetics used to be the territory of plant and animal breeders applying Mendel’s laws, and medical researchers tracking familial diseases such as haemophilia and Huntington’s disease. Now it has blossomed into a programme which has mapped the very chromosomes and genes themselves. Previously geneticists could only inform patients about their statistical chances of passing on disorders. Now, in some cases, they can know for certain whether the patient is carrying the gene for a disorder, and even (as in the case of Huntington’s disease, a rare distressing disorder with both psychiatric and movement manifestations) predict if an at-risk individual will develop it years in the future.
Not many psychiatric disorders, however, have simple ‘Mendelian’ genetic patterns where half (dominant) or a quarter (recessive) of the offspring are destined to have the illness. Most of the major disorders (e.g. schizophrenia, bipolar disorder) do run in families and have an undeniable genetic component but there are almost certainly several genes involved and they have been very difficult to identify. There have been many false dawns. Currently, the likeliest candidate is Neuregulin 1 (a gene identified in schizophrenia families in Iceland and the West of Scotland). However, this gene is more widespread than is schizophrenia - possibly up to 30 per cent of the population may carry it. Neuregulin 1 appears to be necessary for developing schizophrenia, but not sufficient. Some life experiences (or perhaps a combination with other genes) are also required. So an interaction between nature and nurture is indicated. This may explain why the issue has been so resistant to the ‘either-or’ solutions of the past. It holds out hope that even in those with the genes for the disorder it may be possible to prevent schizophrenia developing.
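The simple ‘Mendelian’ ratios mentioned above follow directly from enumerating the equally likely allele combinations an offspring can inherit. The short sketch below (an illustration, not part of the original text) shows where the one-half and one-quarter figures come from:

```python
from itertools import product
from fractions import Fraction

def offspring_risk(parent1, parent2, is_affected):
    """Enumerate the four equally likely allele pairings from two
    parents and return the fraction of offspring whose genotype
    satisfies the is_affected predicate."""
    combos = list(product(parent1, parent2))
    affected = sum(1 for genotype in combos if is_affected(genotype))
    return Fraction(affected, len(combos))

# Dominant disorder: one heterozygous affected parent
# ('D' = disease allele, 'd' = normal); one copy of 'D' suffices.
dominant = offspring_risk(("D", "d"), ("d", "d"),
                          lambda g: "D" in g)          # 1/2 of offspring

# Recessive disorder: two unaffected carrier parents
# ('r' = recessive disease allele); illness needs two copies.
recessive = offspring_risk(("R", "r"), ("R", "r"),
                           lambda g: g == ("r", "r"))  # 1/4 of offspring
```

As the text notes, most major psychiatric disorders do not follow these clean single-gene ratios; several interacting genes (and life experiences) are involved.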
While genetic research has yet to have a major impact on clinical practice, it has certainly stirred up thinking. What level of risk are we willing to take if we know that we have a higher than average chance of our child developing schizophrenia or depression? Would it be ethically acceptable to start screening for these disorders once the genes have been confidently identified? What if we were also able to identify genes for being clever – would it be OK to screen for that? Screening implies selection. It is usually only done if the individual wants to know whether to start or continue with a pregnancy.
These problems already confront some families with schizophrenia. An Australian service for treating young people with schizophrenia as early as possible has begun successfully to identify individuals who are at very high risk of developing the illness. These are usually adolescents in families with schizophrenic members, who are themselves ‘odd’ or ‘withdrawn’ and report unusual, but not clearly psychotic, experiences. Should the team, being fairly certain that the young person is likely to become ill, offer treatment with antipsychotic drugs? They have conducted a trial in which one half were given drugs and the other half placebo. Those given drugs developed fewer psychoses. However, not all of those without drugs became ill (i.e. had it not been a trial, some would have been given the drugs unnecessarily). The implications for young people at such a sensitive stage of their development are clearly enormous. This is just one example of the sort of decisions we will increasingly face as technologies improve.
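The dilemma in that trial is a matter of simple arithmetic: because not every untreated high-risk adolescent becomes ill, treating everyone means medicating some who would have stayed well. The figures below are purely illustrative (the trial’s actual numbers are not given in the text):

```python
from fractions import Fraction

def unnecessary_treatment(n_treated, conversion_rate):
    """Of n_treated 'high-risk' individuals all given prophylactic
    drugs, return how many would never have become ill anyway."""
    would_convert = n_treated * conversion_rate
    return n_treated - would_convert

# Suppose (hypothetically) 40% of untreated high-risk adolescents
# go on to develop psychosis; then treating 100 of them means
# 60 receive antipsychotics they never needed.
print(unnecessary_treatment(100, Fraction(2, 5)))  # prints 60
```

The lower the true conversion rate, the larger this unnecessarily treated group becomes, which is exactly why the decision is so fraught.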
Specificity of gene identification in mental illness is still a long way off. Any large-scale genetic screening to avoid psychiatric disorders would inevitably mean a steady reduction in the rich variety of human behaviour. How happy would we be with that - a world without Van Gogh or without Schumann?
Brainwashing and thought control
Most of the scarier fantasies about psychiatry have usually been about its ‘awesome powers’. Ever since the term ‘brainwashing’ was first used during the Korean War in the 1950s, these fears have condensed around it. In truth, psychiatrists have little extra knowledge about such procedures beyond those well known in cognitive and group psychology. Psychiatrists and psychologists do advise governments and the military about how to persuade people, but their techniques are hardly more advanced than (if indeed as advanced as) those of successful advertising agencies.
Milgram’s famous experiment is mistakenly quoted as an example of this power. He used actors to demonstrate that normal people were prepared to deliver severe, even life-threatening, electrical shocks to other people if told it was part of a psychology experiment. This study did not demonstrate the awesome power of psychology, but rather the scarier, but commonplace, propensity we all have to surrender our judgement to ‘authorities’.

Where some of the earlier ‘science fiction’ fantasies have proved right is in the pervasive use of mood-altering drugs. ‘Soma’ in Aldous Huxley’s novel Brave New World was a drug to keep the masses contented and submissive in a totalitarian state. How far are we from that when 30 per cent of the adult population of France were taking psychiatric drugs in the late 1990s, and when 10 per cent (and rising) of US schoolboys are taking Ritalin? There is an increasing availability of such drugs which both treat mental illness and also enhance how healthy people feel. ‘Better than well’ is how many describe the effects of these ‘designer’ drugs. People have always self-medicated with recreational drugs, but now prescribed drugs are widely used to deal with normal life stresses.

The risk that psychiatry may invade all aspects of our life and ‘medicalise’ the human condition is increased by the emphasis on the simpler, more reliable but democratically negotiated approach to diagnosis discussed above. The size of the psychiatric population used to be restricted by psychiatrists only giving a diagnosis when the patient’s experience and behaviour were felt to be fundamentally ‘different’. If a diagnosis follows automatically from a series of complaints (without being filtered through such a judgement), then there is little restriction on expansion. We increasingly encourage self-disclosure and attention to our feelings, hence perhaps the rapidly rising number of people who consider themselves depressed or anxious.
Most of us welcome this more accepting, open approach to human experience. Equally, most of us support a more balanced relationship embodied in a psychiatric consultation which takes the patient’s symptoms more seriously than the psychiatrist’s preoccupations. But are we happy with the consequences as ever-increasing segments of our lives become labelled as a psychiatric disorder?
Despite all this, we enter the 21st century with remarkably similar dilemmas to those with which we entered the 20th. Compulsion in psychiatry has not gone away; if anything, it has increased somewhat. Similarly, the fear that psychiatry may trivialise individual differences and treat people as objects remains just as strong. This conflict may now be played out between ‘evidence-based medicine’ and ‘post-modern individualism’, where once it was the crushing uniformity of large asylums versus the dignity of the patient. Society and psychiatry will always have (and probably should always have) an uneasy relationship, balancing duty to patients against duty to society in the social control of a small number of potentially dangerous individuals. The very durability of these debates reveals them as not simply technical problems. They reflect the tensions and paradoxes that are inherent to psychiatry as a discipline and with which we started these texts.
Will psychiatry survive the 21st century?
The imminent demise of psychiatry has been predicted for most of its history. Optimists (particularly those engaged in medical and biological research) anticipate dramatic breakthroughs that will tame mental illnesses in the way that antibiotics defeated tuberculosis or vaccination eradicated smallpox. The mental hygiene movement also hoped that rational child care, reduced alcohol consumption, and improved social conditions would make the analyst and psychotherapist redundant. It hasn’t happened so far. The very success of modern medicine has brought with it the challenges of an ageing population with increased depression and Alzheimer’s disease. Greater openness and respect for individual feelings has resulted in an enormous increase in the demand for counselling and psychotherapy. The numbers of psychiatrists and mental health professionals have risen inexorably across the world. On a simple head-count of staff and the mounting demand for its services, yes, it should continue to flourish.
But will it survive as it is now? Certainly, things are changing. Might the psychological and psychotherapeutic treatments separate from the more traditionally medical treatment of the psychoses? In many parts of the world, psychiatry has only recently gained its independence from neurology, but we now hear strong calls for reuniting them as a logical development of a more powerful medical psychiatry for the future. Many psychiatrists already call themselves ‘neuropsychiatrists’; this is particularly the case in Germanic systems. There are several health care systems where the psychiatrists deal with the diagnosis and inpatient care, emphasising a highly scientific medical model, while long-term outpatient care of disabled patients is managed by social workers and psychologists/psychotherapists using a more interpersonal approach. These pressures are not new. What is new is the wide availability of highly trained clinical psychologists, nurses, and social workers with the necessary skills. Responsibilities and power structures are shifting, and radically different practices are evolving.
There is logic to such developments. Increased knowledge drives specialisation, so some fragmentation of psychiatry is inevitable. Despite this, psychiatry is entrenching as a discipline. Establishing departments of psychiatry independent of neurology or internal medicine is still seen as a marker of progress. Similarly when people can choose they still seem to want that mixture of medical expertise (or is it authority?) combined with psychological and emotional sensitivity traditional to psychiatry. Psychiatry’s medical pedigree gives reassurance yet few of us believe that it is really just a branch of medicine.
The mind is not the same as the brain. The defining characteristic of mental illnesses (and consequently psychiatry) remains their impact on our sense of self and on our closest relationships. Working with these is the hallmark of psychiatry and there is no evidence of society losing interest in it.