
Germs, medicine, and surgery
Despite the disagreements and no little nonsense in the name of science, the trust was justified, for two theoretical and two practical reasons. Of the theoretical, neither was entirely new, but both found their full realization after germ theory. The first was the separation between the cause of disease and the patient’s body. Germs were external, and although the individual’s response needed to be understood through events inside the body, the cause had to be identified elsewhere. The disease was something that happened to the patient, and although the blame culture of illness hardly disappeared (and is still strong, especially for sexually transmitted and so-called lifestyle diseases), the gap between patient and cause made it easier for doctors to develop objective criteria for diagnosis.
The second theoretical implication of germ theory was the heightened notion of disease specificity. The earlier sanitarian movement approached most epidemic diseases as of a piece, capable of changing their character as they moved through a community. ‘Filth disease’ was for Edwin Chadwick a single diagnostic category, whether it manifested itself as typhus, typhoid, cholera, erysipelas, scarlet fever, or any other of the epidemic diseases that spread through the overcrowded urban poor. Germs provided a biological basis for the distinctiveness of the different ‘fevers’, and finally rendered fever a sign of disease, not the disease itself. Disease classification had become a major medical issue after routine registration of deaths (and their causes) became common throughout industrialized nations. International interest in major epidemics, above all cholera, increased the need for these registers of causes of death to be understandable across national boundaries, and concern with nosology was merely part of the extensive effort to make scientific and medical vocabularies more precise.
The practical reverberations of germ theory were extensive, but two ought to be highlighted. The first was antiseptic, followed by aseptic, surgery. The use of the anaesthetic agents ether and chloroform from the 1840s had transformed the priorities of the surgeon, now that pain could be controlled. That these two substances were the products of chemical investigations highlights the ongoing importance of the laboratory for clinical practice. Ether anaesthesia was, incidentally, the first major American breakthrough in medicine, although its introduction was fraught with gothic tales of priority disputes, failed attempts at patents, and sordid ends to promising careers. The first public demonstration of surgery under ether took place in the Massachusetts General Hospital on 16 October 1846. News spread to Europe as fast as boats could carry it, and national medical histories are full of local ‘first’ operations using the new substance. Chloroform followed within a year, and the search was on for other agents that could render patients pain-free.
No medical innovation is ever without contention, and anaesthesia was no exception. Its use in childbirth was resisted by a few who believed that the Biblical injunction to Eve meant that childbirth should be painful; some military doctors thought that wounded soldiers needed the stimulus of pain the better to endure the operation; and a few deaths during anaesthetic administration alerted doctors to the dangers of the substances. These issues are sometimes emphasized in the historical literature, but the rapidity with which the new possibility of pain control spread, among both doctors and their patients, is the most striking aspect of anaesthesia’s early history. Giving surgeons more time to operate made conserving tissue easier, but the longer exposure of open wounds to the air also increased the possibility of post-operative infection. Consequently, anaesthesia enlarged the range of operations surgeons could perform, but not necessarily the chances of a patient’s surviving the ordeal.
Anaesthesia provided the basis of one aspect of modern surgery. Antisepsis, and especially asepsis, provided the second. Antiseptic surgery was pioneered in the late 1860s by Joseph Lister (1827–1912). Lister was of Quaker stock. His father helped develop the achromatic microscope, so he grew up in a scientifically orientated household. He was reputedly present at the first public operation in Britain using ether, performed by the professor of surgery at University College Hospital, Robert Liston (1794–1847). Lister published substantial papers on microscopy while still a medical student, and after graduating from University College London, he headed to Edinburgh to further his surgical studies. There he married his professor’s daughter and spent almost two decades in Edinburgh and Glasgow, during which time, in 1867, he introduced his system of antiseptic surgery. Lister was inspired by Pasteur’s researches on the role of micro-organisms in fermentation, putrefaction, and other vital processes, and cited Pasteur in his original publication. Combining Pasteur’s insights with the knowledge that carbolic acid (phenol) had been successfully used to disinfect sewage, he applied carbolic dressings to surgical wounds and showed that compound fractures (that is, where the broken bone perforates the skin and is thus exposed to the atmosphere) could be successfully closed with this treatment. The usual alternative for a compound fracture was amputation, so poor were the surgical attempts to close the wound and thus save the limb.
Lister’s rationale was complex, and he later reconstructed his early work to make it appear that his antiseptic system was rooted in a germ theory of wound infection. It was actually based on a belief that dust particles in the air transmitted the sources of contamination (Pasteur had excluded dust from his flasks in his spontaneous generation experiments), and that carbolic-soaked dressings shut out these sources of wound infection. Lister’s system worked, and he began teaching it to his students. A number of surgeons rejected it, especially those who had already been achieving good surgical results through simple cleanliness. The Franco-Prussian War offered a good, if unplanned, comparative trial, since German surgeons had begun to take up his measures and French ones had mostly resisted them. The German surgical experience of the war was much superior to the French, and Lister’s name began to be associated with a particular kind of surgical technique.
Lister himself was always a fairly conservative surgeon, confining himself to the traditional domains of the craft: the limbs, joints, bladder, and superficial parts of the body. He continued to modify his antiseptic regime, introducing a carbolic spray and changing the routine of after-care for the surgical wound. He went on getting good results and acquired an international reputation. He and Pasteur had great respect for each other, frequently appearing on the same platform at the international medical meetings that became increasingly common in the latter decades of the 19th century. As appreciation of the role of bacteria in wound infections grew, Lister’s own system gradually changed its theoretical justification and became more closely identified with the new science of bacteriology. Antiseptic surgery had a limited life in any case: it was soon replaced with aseptic surgery, the aim being not to kill contaminating germs, but to exclude them in the first place.
Asepsis excluded bacteria as completely as possible, by sterilizing equipment, instruments, dressings, the surgeon’s hands, and the patient’s skin. It worked on the general principle that the tissues of the body are germ-free to begin with, and that if bacteria could be excluded during the operation, the wound would heal naturally, by what surgeons had long called ‘first intention’: healing of the wound without pus formation. Aseptic principles finally opened the three major body cavities – abdomen, thorax, and cranium – to the scalpel, and surgery became the glamour speciality during the last third of the century. Techniques that Koch and others had introduced into the bacteriological laboratory found their natural application in the operating theatre, increasingly a separate, carefully regulated space in hospitals.
As surgeons operated on the previously forbidden cavities, their initial success rates were very low, as other problems, such as excessive bleeding and infection, emerged. (The gastro-intestinal tract, for instance, is open to the outside world at both ends, so the bowels, unlike most internal parts of the body, are not sterile.) Knife-happy surgeons became convinced of the adage ‘A chance to cut is a chance to cure’, as many conditions that physicians had been able to diagnose but not to do much about suddenly seemed amenable to radical treatment. We should remember the early mortality of heart transplants before we condemn a previous age, but the structures of modern-day audit were not in place, individual surgeons answered only to their individual patients, and conditions that we would not now judge surgical were subjected to the knife. Thus, ovaries were removed for hysteria or menstrual pain, large segments of bowel for constipation or chronic tiredness, and tonsils were taken out routinely, as a prophylaxis against all sorts of childhood complaints. The doctrine of ‘focal infection’ enjoyed great vogue during the early 20th century, and was used to justify the removal of portions of bowel, teeth, tonsils, and other organs for all manner of ailments, including insanity.
Modern surgery was thus built on new power relations between surgeons and patients: surgeons could do more, and patients needed to believe in their surgeons. The historical literature tends to emphasize the outlandish operations, or those with high mortality rates and little chance of success. Yet the impressive technical developments within surgery in the half-century before World War I show that surgical technique grew faster than its supporting network (blood transfusion, antibiotics to counter infection, intensive-care monitoring), and that the ethical standards that (mostly) govern modern medical and surgical practice were not yet in place. There was wide variation in diagnostic fashions as well as in technical ability among surgeons, so it behoved patients to choose their surgeons well. It still does.
The second major practical legacy of bacteriology was the ability to understand the sources and patterns of infections and epidemic diseases, and to react appropriately: laboratory medicine informing community medicine. Bacteriologists were ‘experts’ in ways that old-style sanitarians were not, and therefore they had more clout with governments and politicians. Chadwick had advocated ‘clean’ water, but what counted as clean changed with the realization that specific pathogenic bacteria were transmitted by water, which therefore needed to be analysed before being passed fit to drink. The same goes for food additives, meat quality, air purity, and the host of other things that we consume. Scientists have taken the lead in defining these standards, and have provided a basis for an all-encompassing public health.
Physiology: the new rigour
Bacteriology was the medical science with the most impact on the lives of ordinary individuals during the late 19th century. Experimental physiology aroused the most tangible outcry, since physiologists began systematically to operate on living animals. Bacteriologists used a lot of animals too, but their experiments did not arouse the emotion that experimental physiology did, especially in Britain, where physiology was better developed than bacteriology.
The Germans created institutes in all the medical sciences, the most notable one in physiology being that of Carl Ludwig (1816–1895) at the University of Leipzig, where students from all over the world trained. Ludwig was one of a group of four young physiologists who, during the revolutionary year 1848, issued a manifesto declaring that all the problems of physiology could be solved by the systematic application of physics and chemistry. Two of the others in the group went on to head physiology institutes in Berlin and Vienna, and the fourth, Hermann von Helmholtz, eventually turned to physics. In addition to important work in electromagnetism and the conservation of energy, he was an expert in the physiology of the special sense organs and the physics of hearing. All four of the group maintained their basic physical orientation to physiology. Ludwig’s principal research interests were the functions of the heart and kidneys, and his textbook was popular both in the German-speaking lands and, through translations, abroad. German was the language of medical science in the period, so even the German edition had a wide international readership.
The laboratories of these and other German physiologists began to acquire a modern look, as scientists availed themselves of the latest technological aids. Helmholtz invented the ophthalmoscope, and Ludwig developed the kymograph, a turning drum connected to a recording device that allowed the measurement of continuous functional variations, such as the pulse, muscle contractions, or changes in tension. The graphical recording of vital events has increasingly characterized biomedical research and clinical medicine ever since.
Physiology flourished in Germany, although the pre-eminent physiologist of the century was French: Claude Bernard (1813–1878). He went through the Paris medical school, and recognized that the clinical orientation that dominated it could go only so far in understanding disease mechanisms or in searching for new remedies. An unhappy marriage at least brought him a dowry that allowed a career in medical research, although his animal experimentation further alienated him from his wife and daughter. Bernard was above all a gifted surgical craftsman within the laboratory. His early work elucidated the role of the liver in sugar metabolism and the function of the pancreas in digestion. He made further important discoveries about the functions of the peripheral nerves, showed how carbon monoxide poisons by displacing oxygen in the blood, and produced a kind of diabetes through the selective destruction of a portion of the brain. He was especially intrigued by the way physiological mechanisms worked together to produce a functional whole animal. His concept of the ‘internal milieu’ helped explain how organisms function by keeping many physiological parameters, such as temperature, the ionic salts in the bloodstream, and blood sugar, within a narrow range. The concept was later renamed ‘homeostasis’ by the American physiologist Walter Cannon, and it remains fundamental to our understanding of health, disease, and evolution. Bernard had a philosophical turn of mind, and he summarized his own research career, as well as developing a philosophy of medical research, in his classic Introduction to the Study of Experimental Medicine (1865). It remains a book well worth reading. In it, Bernard argued that the laboratory was the true sanctuary of medical science.
In the hospital, where sick patients need care and the number of variables means that observations are only piecemeal, no real experimental science can flourish. Only in the laboratory can the experimenter hold variables constant, so that changes are unambiguous. Pasteur once opined that chance favours the prepared mind, and Bernard was alive to the role of chance observations in leading him into fruitful investigative paths. For instance, rabbit urine is usually alkaline and turbid; observing the urine of fasting rabbits turn acidic, he reasoned that they were metabolizing their own tissues, which led him to investigate the digestion of various foodstuffs. His philosophy of discovery was what is now called the hypothetico-deductive method: a scientist forms an hypothesis about some phenomenon, deduces what ought to happen in consequence, and experiments to see if the hypothesis is correct, being careful to put aside his expectations while doing the experiment. Bernard likened these expectations to a hat: the good scientist puts his hat on the rack while doing the experiment, but does not forget to put it on again when leaving the laboratory, to think about what he has seen and what it means. On the basis of his experiment, he can confirm, reject, or modify his hypothesis, and then, if necessary, test it further.
For Bernard, the three pillars of experimental medicine were physiology, dealing with normal function; pathology, investigating abnormal function; and therapeutics, concerned with discovering effective remedies. His own researches contributed to each of these fields, but the important point was that each had to be rigorously experimental, a goal achievable only in the laboratory. Field work, autopsies, and bedside observations could provide the raw data and help formulate pertinent questions. The essential goal of science, however, was to elucidate mechanisms and causes. Bernard and Pasteur were friends, and Bernard recognized the importance of Pasteur’s work, even if he did not live to see its full potential realized. Pasteur saw in Bernard an eloquent apologist for the experimental method within medicine: the future.
Although experimental physiology took the brunt of the antivivisection movement, only in Britain was there legislation to regulate animal experimentation. The 1876 Cruelty to Animals Act initially worried medical researchers, but in fact it provided a reasonable framework within which to pursue animal-based research, and, by moving research away from private laboratories in scientists’ own homes, helped to institutionalize it within public and university settings. The most important tool for physiologists was anaesthesia: not only did it prevent pain in experimental animals, it also made operative conditions easier. Antiseptic and aseptic techniques also served physiology, another instance of clinical medicine and experimental science reinforcing each other. A number of medical specialties benefited from physiological research. Neurology, for instance, relied on work on cerebral localization. Cardiologists made use of animal research on cardiac contraction and the regulation of the heart-beat. Endocrinology (the medical specialty of the glands) was made possible by the discovery of hormones by two physiologists, Ernest Starling (1866–1927) and William Bayliss (1860–1924). Medical and surgical specialties were not simply ‘natural’; they also relied on the activities of groups of individuals keen on careers and prestige. But by the outbreak of World War I, medicine and surgery could call upon much knowledge that had been gained within the laboratory, increasingly by individuals whose careers lay within medical science rather than clinical medicine.