
Medical scientists have devised numerous measures of the health of individuals. Examples range from those that are easy to observe, such as body temperature, blood pressure, and pulse, to those such as X-rays, ultrasound, and CAT scans that require complex technologies.
In a like manner, social scientists have devised a variety of statistical measures of the health of nations. These can be grouped under three broad headings: health outcomes, the provision of health services, and lifestyle choices that affect health. All are central to any evaluation of a nation's standard of living and quality of life (Engerman 1997).

At the outset, it should be understood that measuring the health aspects of the quality of life is a complicated endeavor because they include many attributes such as longevity, morbidity (illness or disability), physical vigor, and deaths from various diseases. Health is all the more complicated to measure over long time spans because theories of disease, and therefore the kinds of data collected, have changed over the centuries. Moreover, the data available for study are more limited the farther into the past one searches for them.
Nevertheless, it is possible to provide a brief overview of progress using life expectancy at birth, which is the most widely used measure of health. 1 Over the past 150 years life expectancy has doubled, increasing from 38.3 years in 1850 to 76.7 years in 1998. Childhood mortality weighs heavily on life expectancy, which was so low in the mid-1800s largely because mortality rates for this age group were very high. For example, roughly one child in five born alive in 1850 did not survive to age 1, but today the infant mortality rate is under 1 percent. The past century and a half witnessed a significant shift in deaths from early childhood to old age (Cutler and Meara 2001). At the same time, the major causes of death have shifted from infectious diseases caused by germs or microorganisms to degenerative processes that are affected by lifestyle choices, such as diet, smoking, and exercise.
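The grip of infant mortality on life expectancy at birth can be sketched with a back-of-the-envelope decomposition. The figures below (a 20 percent infant mortality rate, death at half a year on average, and the survivor lifespans) are hypothetical round numbers chosen for illustration, not values taken from the series:

```python
# Life expectancy at birth sketched as a weighted average of lifespans
# for infants who die and for those who survive infancy.
# All inputs are illustrative assumptions, not values from the tables.
def life_expectancy_at_birth(infant_mortality, age_at_infant_death, e_survivors):
    return (infant_mortality * age_at_infant_death
            + (1.0 - infant_mortality) * e_survivors)

# Mid-1800s-style assumptions: ~1 in 5 infants dies (at ~0.5 years on
# average) and survivors average about 47 years of total lifespan.
e0_1850_style = life_expectancy_at_birth(0.20, 0.5, 47.0)    # 37.7 years
# Modern-style assumptions: well under 1 percent infant mortality and
# survivors averaging about 77 years.
e0_modern_style = life_expectancy_at_birth(0.007, 0.5, 77.0)  # about 76.5
```

Even holding the survivor lifespan fixed, eliminating most infant deaths by itself adds several years to life expectancy at birth, which is why the shift of deaths from early childhood to old age dominates the long-run trend.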
Although the increase in longevity was approximately continuous during the twentieth century, there were substantial fluctuations during the 1800s. Annual data on life expectancy during the nineteenth century are unavailable, however; health fluctuations are considered in the discussion on physical stature later in this section.
The largest gains were concentrated in the first half of the twentieth century, when life expectancy increased from 47.8 years in 1900 to 68.2 years in 1950. Factors behind the growing longevity include the ascent of the germ theory of disease, programs of public health and personal hygiene, better medical technology, higher incomes, better diets, more education, and the emergence of health insurance. Table Bd-A provides a chronology of important medical developments that contributed to improving health. In drawing conclusions, one should keep in mind that the table lists discoveries and first uses of new techniques that often took years or decades to diffuse to large numbers of patients.
The research of Pasteur and Koch was particularly influential in leading to acceptance of the germ theory of disease in the late 1800s. 2 Prior to the work of these scientists, many diseases were thought to have arisen from miasmas or vapors created by rotting vegetation. Thus, swamps were accurately viewed as unhealthy, but not because they were home to mosquitoes and malaria. The germ theory gave public health measures a sound scientific basis, and shortly thereafter cities began cost-effective measures to remove garbage, purify water supplies, and process sewage. The notion that “cleanliness was next to godliness” also emerged in the home, where bathing and the washing of clothes, dishes, and floors became routine.
The discovery of Salvarsan in 1910 led to the first use of an antibiotic (for syphilis), which meant that a drug was effective in altering the course of a disease. This was an important medical event, but broad-spectrum antibiotics were not available until the middle of the century. The most famous of these early drugs was penicillin, which was not manufactured in large quantities until the 1940s. Much of the gain in life expectancy was attained before chemotherapy and a host of other medical technologies became widely available. The cornerstone of improvement in health from the late 1800s to the middle of the twentieth century was, therefore, prevention of disease by reduction in exposure to pathogens and by preparation of the immune system by means of vaccination against diseases such as smallpox and diphtheria. 3
The significant gains in life expectancy do not guarantee that health, broadly defined, increases in proportion. James Riley has argued that as mortality rates fall, morbidity rates (the incidence of illness and disability) rise (Riley 1991). His conclusion is based on the records of friendly societies and other organizations that give information on sick days of adult males; these records show that the number of workdays lost per worker due to illness increased from the eighteenth century to the 1980s. Suchit Arora challenges this analysis, arguing that increased income or more generous sick-leave provisions, rather than objective medical factors, were behind the trend (Arora 2003). His data showed that sickness days in the U.S. military, which give a more objective medical measure of morbidity, declined along with mortality rates during the early twentieth century.
In the past quarter century, historians have increasingly used average heights to assess health aspects of the standard of living (Steckel 1995). Average height is a good proxy for the nutritional status of a population because height at a particular age reflects an individual's history of net nutrition, or diet minus claims on the diet made by work (or physical activity) and disease. Growth may cease in poorly nourished children, and repeated bouts of biological stress – whether from food deprivation, hard work, or disease – often lead to stunting or a reduction in adult height. The average heights of children and of adults in countries around the world are highly correlated with their life expectancy at birth and with the logarithm of the per capita GDP in the country where they live (Steckel 2000).
This interpretation of average height has led to its use in studying the health of slaves, health inequality, living standards during industrialization, and trends in mortality. The first important results in the “new anthropometric history” dealt with the nutrition and health of American slaves as determined from stature recorded for identification purposes on slave manifests, which were required in the coastwise slave trade (Steckel 1998). The subject of slave health has been a contentious issue among historians, in part because vital statistics and nutrition information were never systematically collected for slaves (or for the vast majority of the American population in the mid-nineteenth century, for that matter). Yet the height data showed that children were astonishingly small and malnourished, while working slaves were remarkably well fed. Adolescent slaves grew rapidly as teenagers and were reasonably well off in nutritional aspects of health (Steckel 1986).
Figure Bd-B shows the time pattern in height of native-born American men, obtained in historical periods from military muster rolls, and for men and women in recent decades from the National Health and Nutrition Examination Surveys (NHANES series A 52.1). 4 This historical trend is notable for the tall stature during the colonial period, the mid-nineteenth-century decline, and the surge in heights of the past century. Comparisons of average heights from military organizations in Europe show that Americans were taller by two to three inches. Behind this achievement were a relatively good diet, little exposure to epidemic disease, and relative equality in the distribution of wealth. Americans could choose their foods from the best of European and Western Hemisphere plants and animals, and this dietary diversity, combined with favorable weather, meant that Americans never had to contend with harvest failures. Thus, even the poor were reasonably well fed in colonial America.
Loss of stature began in the second quarter of the nineteenth century when the transportation revolution of canals, steamboats, and railways brought people into greater contact with diseases. 5 The rise of public schools meant that children were newly exposed to major diseases, such as whooping cough, diphtheria, and scarlet fever. Food prices also rose during the 1830s, and growing inequality in the distribution of income or wealth accompanied industrialization. Business depressions, which were most hazardous for the health of those who were already poor, also emerged with industrialization. The Civil War of the 1860s and its troop movements further spread disease and disrupted food production and distribution. A large volume of immigration also brought new varieties of disease to the United States. Estimates of life expectancy among adults at ages 20, 30, and 50, which were assembled from family histories, also declined in the middle of the nineteenth century (Pope 1992, Table 9.4).
In the twentieth century, heights grew most rapidly for those born between 1910 and 1950, an era when public health and personal hygiene took vigorous hold, incomes rose rapidly, and congestion in housing was reduced. The latter part of the era also witnessed a larger share of income or wealth going to the lower portion of the distribution, implying that the incomes of the less well-off were rising relatively rapidly. Note that most of the rise in heights occurred before modern antibiotics were available, which means that disease prevention – rather than the ability to alter its course after onset – was the most important basis for improvement in health. The growing control that humans have exercised over their environment, particularly increased food supply and reduced exposure to disease, may be leading to biological (but not genetic) evolution of humans with more durable vital organ systems, larger body size, and later onset of chronic diseases (Fogel and Costa 1997).
Between the middle of the twentieth century and the present, however, the average heights of American men have stagnated, increasing by only a small fraction of an inch over the past half-century. Figure Bd-B refers to the native-born, and so recent increases in immigration cannot account for the stagnation. In the absence of other information, one might be tempted to surmise that environmental conditions for growth are so good that most Americans have simply reached their genetic potential for growth. But heights have continued to increase in Europe, which has the same genetic stock from which most Americans descend. By the 1970s, Americans had fallen behind Norway, Sweden, the Netherlands, and Denmark and were on a par with Germany. While heights were essentially flat in America after the 1970s, they continued to increase significantly in Europe (Steckel 2000). Dutch men are now the tallest, averaging six feet, about two inches taller than American men. Lagging heights lead to questions about the adequacy of health care and lifestyle choices in America. As discussed later in this chapter, it is doubtful that lack of resource commitment to health care is the problem, because America invests far more than the Netherlands. Greater inequality and less access to health care could be important factors in the difference. But access to health care, whether limited by low income or by lack of insurance coverage, may not be the only issue – coverage must also be used regularly and wisely. In this regard, Dutch mothers are known for getting regular pre- and postnatal checkups, which are important for early childhood health.
Note that significant differences in health and the quality of life follow from these height patterns. The comparisons are not part of an odd contest that emphasizes height, nor is big per se assumed to be beautiful. Instead, we know that on average, stunted growth has functional implications for longevity, cognitive development, and work capacity. Children who fail to grow adequately are often sick, suffer learning impairments, and have a lower quality of life. Growth failure in childhood has a long reach into adulthood because individuals whose growth has been stunted are at greater risk of death from heart disease, diabetes, and some types of cancer. Therefore, it is important to know why Americans are falling behind.

Health also includes the vigor of life while a person is living, which is becoming increasingly relevant as the average lifespan has been extended and as baby boomers in the American population are approaching older ages. This aspect of health can be measured in numerous ways, including the absence (or incidence) of disease, illness, and injury, and by the conditions that limit activity or restrict functions that are normal for someone of a particular age. Compromised or poor health is reflected in days lost from school or work. Biological indicators such as average stature and birth weight also measure important aspects of health.
Some health professionals prefer to measure health by combining length of life and vigor of life in a concept called “quality-adjusted life years.” This approach evaluates attributes of health, such as mobility, dexterity, hearing, and so on, as well as the ratings that people give to alternative health states. Overall health at a particular age is gauged on a scale of 0 (most severe state, equivalent to death) to 1 (no disabilities or limitations). Thus, the health quality of life is measured relative to its disability-free level of 1.0. Attractive as this concept may be, it is unavailable for these volumes because data are lacking for historical time series. If sufficient data accumulate using this concept, they may become suitable for future editions.
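The accounting this concept implies can be sketched briefly: each year of life is weighted by its 0-to-1 health rating, and the weighted years are summed. The health scores and durations below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Quality-adjusted life years (QALYs): each year lived is weighted by a
# health score between 0 (equivalent to death) and 1 (no limitations).
def qalys(yearly_health_scores):
    if any(not 0.0 <= s <= 1.0 for s in yearly_health_scores):
        raise ValueError("health scores must lie in [0, 1]")
    return sum(yearly_health_scores)

# Hypothetical profile: ten years in full health followed by five years
# rated 0.6 (say, limited mobility).
profile = [1.0] * 10 + [0.6] * 5
quality_adjusted = qalys(profile)   # about 13 QALYs from 15 calendar years
```

The gap between the 15 calendar years and the roughly 13 quality-adjusted years is exactly the "vigor of life" component that unadjusted life expectancy misses.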
The example of quality-adjusted life years raises an important issue for this work: the availability of data. These volumes present historical time series, and unlike modern researchers who may design surveys or interview people to obtain information, historians are limited to what has already been collected. 6
Although historians cannot conduct surveys in the past, they have been inventive in using information collected for other purposes. Much of our price history, for example, has been acquired from old newspaper quotes of transaction prices designed to inform contemporary buyers or sellers. Use of average height, which has been discussed, illustrates this point for the history of health.
An important boon to health in the last century was the control or near elimination of numerous severe infectious diseases, such as diphtheria, scarlet fever, whooping cough, and smallpox. Figure Bd-C shows the demise of measles near the middle of the twentieth century. Prior to 1945, this disease fluctuated widely around 400 cases annually per 100,000 population, but vaccinations nearly eliminated measles by the end of the 1960s. Set against this good news has been the rise of AIDS (acquired immune deficiency syndrome) in the 1980s (also in Figure Bd-C). The epidemic peaked in the early to mid-1990s at about 40 cases annually per 100,000 population. Though not nearly as frequent as the major crowd diseases of the early twentieth century, this one is more debilitating and more likely to be fatal.
The National Center for Health Statistics compiles various measures of disability for children, including school-loss days, bed-disability days, and restricted-activity days (series Bd699–701, Bd741–743, and Bd778–780). All climbed approximately 25 percent between the late 1960s and 1980, which is the era when American heights stagnated. In the early 1980s, they abruptly fell to late 1960s levels, but then drifted upward to 1990. Thereafter, two of the measures (bed-disability days and school-loss days) fell to early 1980s levels.
Series Bd920 shows that injury rates have declined over the past several decades. The number rose in the late 1960s and early 1970s, when more than one third of Americans were injured in some way each year. These high rates helped spawn a safety movement that has brought injury rates down to about a quarter of the population per year, primarily through lower accident rates at home (series Bd948) and in places outside of work, the home, or automobiles (series Bd957).

Most Americans are accustomed to receiving medical care in doctors' offices or in hospitals, but during the nineteenth century and earlier, most medical care was received in the home. Until the early twentieth century, most babies were born at home under the supervision of midwives (see Banks 1999). Similarly, home health care was commonly provided by relatives or by practical nurses. Various factors led to the rise of hospital-based care near the turn of the twentieth century, including new medical equipment that was expensive and bulky (or nonportable); the growing importance of diagnosis and treatment provided by a team of closely cooperating physicians, nurses, and technicians; the possibility of maintaining a sterile environment for medical procedures; and greater opportunity for monitoring and for rapid response to emergency situations. Nevertheless, much health care is still received at home, particularly by children with minor illnesses, by people who are recovering after treatment in hospitals or clinics, and by older individuals who have lost some capacity for self-care (Kahana, Biegel, and Wykle 1994).
It is doubtful that the practice of medicine a century and a half ago was, on net, beneficial for health. In Challenges to Contemporary Medicine, Alan Gregg quotes Lawrence Henderson's statement that "I think it was about the year 1910 or 1912 when it became possible to say of the United States that a random patient with a random disease consulting a doctor chosen at random stood better than a fifty–fifty chance of benefiting from the encounter” (Gregg 1956, p. 13). On the positive side, physicians of the eighteenth century had devised reasonably effective inoculations against smallpox that were replaced in the nineteenth century by the safer technique of vaccination, which created antibodies but did not transmit the disease. Physicians also set broken bones, but many of their actions were misguided because medical scientists lacked the modern germ theory of disease. Thus, most physicians spread infections by washing their hands following, rather than before, surgery. In addition, they also lacked an effective arsenal of medications to alleviate or treat most diseases or disorders. As a result, the demand for doctors remained small and few resources flowed into the medical profession.
Today the public recognizes the contributions of modern medicine to levels of health that are extraordinarily high by standards of the nineteenth century. The transformation began somewhat over a century ago when medical scientists formulated the germ theory of disease, and improved health followed from reduced exposure to pathogens provided by clean water supplies, waste removal, antiseptic practices, and personal hygiene. Table Bd448–462 details the decline of numerous infectious diseases, such as typhoid, pertussis (whooping cough), and diphtheria, which eventually followed from the revolution in knowledge.
Bacteriologists of the late nineteenth and early twentieth centuries readily identified numerous infectious agents, leading eventually to several new vaccinations and medications. The most astonishing of these to the general public was the antitoxin against diphtheria, which was developed at Robert Koch's laboratory in 1891. The capability of curing a serious infectious disease significantly upgraded the image of medicine in the public mind. The antibiotic revolution was well underway by the 1940s, when sulfa drugs and penicillin were used to combat a wide array of infections. The obvious success of professional medicine in preventing diseases or finding cures in the first half of the twentieth century led to a substantial increase in the demand for health services.
Society realized very high economic rates of return on its investments in the early years of modern medicine (see Meeker 1980; Preston and Haines 1991). It was relatively cheap to provide clean water and waste removal, and the benefits were enormous. Similarly, the early antibiotics were inexpensive and remarkably effective against many infections.
The average rate of return was far lower per dollar spent on medical care in the second half of the twentieth century. It may seem paradoxical, but the lower return followed partly from the great success of early modern medicine. After the conquest of most infectious diseases, the next challenge for medical science was the treatment and cure of chronic disease. In retrospect, we have learned that these conditions are much more difficult to cure or treat than the infections that caused so many deaths up to the twentieth century. The rise in cancer rates up to the early 1990s is instructive in this regard (series Bd490). While the rise is evidence of the difficulty of curing cancer, it also reflects the fact that, a century earlier, infectious diseases would have killed many of these people before their cancers became evident.

The enormous flow of resources into modern medicine cannot be explained simply by the demise of diseases that were readily prevented by low-cost public health measures or easily treated with chemotherapy. Health insurance, which is now often part of employment benefit packages, emerged to greatly expand the use of medical facilities. The roots of this coverage can be traced to the middle of the nineteenth century, when insurance companies wrote coverage specifically to pay cash benefits following loss of income or inability to work from accidents. 7 In 1875, Americans also borrowed from Europe the concept of mutual aid societies, in which small contributions were collected from groups of workers to pay cash benefits after disability from injury or sickness.
Although single-hospital benefit plans existed as early as 1912 in Rockford, Illinois, the modern approach to health insurance – provision of hospital or medical services to workers – began in 1929 when Justin Ford Kimball established a hospital insurance plan for schoolteachers in Dallas. For a premium of fifty cents a month, the teachers were given up to twenty-one days of hospitalization in a semiprivate room. This approach to coverage became the model for various hospital insurance plans and the Blue Cross plans that spread thereafter.
The success of Blue Cross plans for hospital expenses led physician groups to create a similar model. In 1939, the California Medical Association established the California Physicians Service, which was the first of what were called Blue Shield plans for payments to physicians. Price and wage controls imposed during World War II took wages off the bargaining table, but working conditions and other forms of compensation could still be negotiated. In this environment, unions brought health insurance to the table as an element of collective bargaining. After the war, the Supreme Court affirmed that health insurance and other employee benefits could be negotiated, and thereafter, coverage expanded rapidly. By the 1970s, four systems were providing health care services: commercial insurance companies; nonprofit Blue Cross and Blue Shield groups; managed care plans, which include health maintenance organizations (HMOs); and the government, through plans such as Medicare and Medicaid, which were established in 1965.
But health insurance coverage is not yet universal. Figure Bd-D shows that in the early 1980s, about 85 percent of the American population was covered by some type of health insurance. The coverage rate peaked at 87 percent in 1987, and in the next decade fell to 84 percent. Today somewhat more than 40 million people lack medical insurance in the United States. Figure Bd-D also shows that in the past two decades, coverage rates have remained roughly constant at 86 percent for whites and 79 percent for blacks, but have fallen from 71 percent to 66 percent for Hispanics. Thus, the growing share of Hispanics in the labor force has contributed importantly to the overall decline in coverage within the United States.
While insurance coverage has been enormously beneficial for health, the administration of this coverage has arguably led to abuses. Because insured care often carries a low marginal cost (or copayment) for the patient, some patients request tests and procedures of low or questionable value at little cost to themselves. And physicians who may be afraid of malpractice litigation are often happy to oblige, even if the tests being urged by patients have little basis in medical fact. Opposing this view, however, are those who argue that the tests and procedures sponsored by the system are sometimes helpful in making diagnoses. One cannot appraise the “efficiency” of health care administration without some assumptions about the value of health and life and the payoffs from batteries of tests or procedures.
Health care analysts have also suggested that the American health care system has unbalanced priorities, overbuilding acute-care facilities and underbuilding to prevent and manage chronic illness and disability. Daniel Fox notes, for example, that about 40 percent of the acute hospital beds in the country have been staffed and empty, at a cost of perhaps $12 billion in 1992 (Fox 1993, p. 127). Whatever the balance (or lack thereof), hospital costs have certainly risen dramatically, increasing 15-fold from 1967 to 1996 (series Bd116) while costs of physicians' services increased 7.5-fold, at a time when the number of hospital beds decreased by more than one third (series Bd119) and the average length of stay declined from about 17 days to 8 days (series Bd174).
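The cumulative multiples just cited can be translated into implied compound annual growth rates; a minimal sketch, with the simplifying assumptions that growth compounded smoothly and that 1967 to 1996 spans 29 years:

```python
# Convert a cumulative growth multiple over a span of years into the
# implied compound annual growth rate.
def annual_growth_rate(multiple, years):
    return multiple ** (1.0 / years) - 1.0

# Hospital costs rose 15-fold and physicians' services 7.5-fold over
# 1967-1996, roughly 29 years.
hospital_rate = annual_growth_rate(15.0, 29)    # roughly 9.8% per year
physician_rate = annual_growth_rate(7.5, 29)    # roughly 7.2% per year
```

Seen this way, the gap between hospital and physician costs is a difference of two to three percentage points per year, compounded over three decades.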
Another element in rising health care costs is new technology. Diagnosis and treatment of numerous diseases have gone high tech, and therefore become more expensive. Computerized axial tomography (CAT) scans, magnetic resonance imaging (MRI), and the like were unknown a few decades ago but have become commonplace in the modern medical world. Despite these new machines and techniques, however, modern medicine is largely unsuited to the practice of factory or assembly-line methods that have made manufactured goods so cheap by comparison. Unlike widgets, each individual medical case is potentially different enough to require interviews and medical tests, careful weighing of possible complicating factors, specialized treatment, and individual follow-up. Although some have suggested that computers might eventually help to automate diagnosis and treatment, which would lower costs, medical practice has yet to take significant steps in this direction.
Whatever the explanations for the huge expansion of the medical sector in the American economy, Tables Bd1–447 document this growth in several dimensions: expenditures, prices, facilities and use, personnel, insurance coverage, and administration.
From the perspective of economics, expenditures provide the most comprehensive single measure of resource commitment. Figure Bd-E places American expenditures on medical care as a share of GNP (series Bd34) in comparative international perspective. The figure shows that substantial growth occurred in the past several decades. In 1960, medical expenditures as a share of gross national product were about 5 percent, only slightly above the share in Western European countries with a relatively good health status, such as Denmark, Norway, Sweden, and the Netherlands. 8 Between 1960 and the mid-1970s, the share rose in all these countries to the range of 6–8 percent. But unlike the other countries, the resources devoted to medical care in the United States continued rising and in 1995 absorbed 13.5 percent of GNP, nearly twice that of the other countries shown in Figure Bd-E. Thus, it is appropriate to ask whether medical resources in the United States are being efficiently and effectively allocated.
Tables Bd1–103 track this flow of resources in various ways, including type of service, source of funds, type of expenditure, and hospital expenses. Typical of an industry that faces growing demand for its output, Table Bd104–117 shows that medical care prices have also risen dramatically, usually outstripping the gains in prices of other products by a wide margin.
Tables Bd118–293 provide details on resource inputs in the form of facilities and personnel. These include hospitals and beds by type of service and by ownership (Tables Bd118–171), hospital use rates (Table Bd172–189), and numbers of patient contacts (Tables Bd190–201, Bd202–211, and Bd285–293). Information on mental health facilities and patients is presented in Tables Bd212–216, Bd217–231, and Bd232–240. Hospital personnel are documented in Tables Bd257–266 and Bd267–276. Tables Bd241–256 and Bd277–284 show numbers of physicians, dentists, and nurses, as well as data on facilities for their training. Notably, there has been a dramatic rise between 1967 and the mid-1990s in the number of skilled medical personnel per 100,000 population, with physicians increasing from 162 to nearly 300 and nurses increasing from 322 to approximately 800.
The trends in enrollment of medical students, given in Table Bd241–256, have some connection to the evolution of the American Medical Association (for discussion of the AMA, see Rayack 1967; Berlant 1975). The association was formed in 1846, and in 1904 it established its Council on Medical Education to raise standards for training by advocating a rigorous preparatory curriculum and by urging higher standards for facilities such as libraries and clinics. Between 1904 and 1920, the number of medical schools declined from 160 to 88 and the number of students shrank by approximately 50 percent, as numerous weak institutions closed their doors or merged with stronger schools. In contrast with mildly expansionist policies on enrollment in the 1920s, the AMA strongly advocated limits on enrollment as a way to reverse the declining incomes of physicians in the Depression years of the 1930s. World War II led to a sudden jump in enrollments through accelerated medical programs, which ended following the war. In the face of rising demand for medical education and the strains placed on facilities for training, the AMA favored substantial restrictions in the 1950s. Between 1958 and 1962, this policy was reversed and training capacity was expanded, leading to a doubling of enrollments between the early 1960s and the late 1970s.
Title IX of the Educational Amendments Act of 1972 was a second dynamic factor affecting the pattern of medical education. Prior to the early 1970s, the share of women among degree recipients in medicine and dentistry (and law, too) was relatively low, typically falling well below 10 percent. Within twenty years, however, the share had risen to more than one third and continued to increase in the 1990s. 9
Tables Bd294–447 provide information on the administration of health care delivery. Tables Bd294–305 and Bd306–317 give data on health insurance coverage, and Table Bd318–326 shows the number of HMOs by plan and type of enrollment. Series Bd318 shows that a dramatic change in health care administration occurred with the rise of HMOs, which grew in number from 176 in 1976 to 651 in 1997. Medicare enrollment, persons served, and utilization are given in Tables Bd327–406, and Tables Bd407–412, Bd413–430, and Bd431–447 depict Medicaid utilization, recipients, and payments.

Infectious diseases were major causes of death in the first half of the twentieth century. Although personal hygiene was a factor in contagion from some diseases, early in the century several major killers operated through the water supply, the food supply, accumulation of waste, or congestion in housing or place of employment. Thus, many of those who fell ill were victims of events or circumstances outside their control (given their incomes). The eradication of disease could be viewed in large part as the elimination or reduction of forces external to individuals and their lifestyle choices.
Although medical scientists have much to learn about the causes and control of today's major killers, such as heart disease and cancer, it seems clear that individual choices play a more important role in determining longevity and health-related quality of life than they did in the early twentieth century. Considerable evidence shows that substances such as tobacco and hard drugs are quite harmful to health. Tables Bd630–638 and Bd639–652 provide data on smoking and drug use.
Smoking has been on the decline for the past two decades. Series Bd631 on per capita cigarette consumption since 1900 shows that the habit grew significantly after each world war and peaked in the 1960s. Figure Bd-F shows that slightly more than 42 percent of people 18 and older smoked in 1966, but the share declined to about 26 percent in 1990 and has remained roughly constant since that time. The decline in smoking occurred for both men and women and across all education levels, but was substantially more pronounced among the highest education group (Table Bd630–638).
The National Household Survey on Drug Abuse queries several aspects of drug use, including the number of initiates for various types of drugs. The trends during the 1970s, the 1980s, and the past few years are disturbing. The number of people who first tried cocaine increased from 77,000 in 1968 to 1,389,000 in 1982 (series Bd643). It then fell to 480,000 in 1991 but has once again increased in recent years, reaching 675,000 in 1996. First use of hallucinogens also climbed dramatically, from slightly less than 100,000 in 1965 to more than 1,000,000 in 1996 (series Bd645). Most of the increase in this substance occurred in two waves: 1967–1971 and 1992–1996. Initiates to heroin declined from about 100,000 per year in 1970 to 32,000 per year in 1992, but reached 171,000 per year in 1996. Marijuana initiates climbed from 68,000 in 1962 to 3,185,000 in 1975, then fell to 1,376,000 in 1991 and increased to 2,540,000 in 1996 (series Bd639). Overall, drug use grew rapidly in the 1970s and early 1980s, fell off during the middle and late 1980s, but surged again after the early 1990s.

For many generations, people have known of a connection between health and components of the diet. In the middle of the eighteenth century, for example, experiments established that eating citrus fruits or drinking lime juice could prevent scurvy. The scientific basis of the connection progressed rapidly in the twentieth century, beginning with the discovery of chemical identities for numerous vitamins, which made it readily possible to synthesize additives that fortify many foods. Although it was a significant advance in nutritional science, the success of this effort has not been altogether beneficial, as vitamins acquired a mystique of healing in some circles, leading to excess consumption. Too much vitamin A may cause birth defects, for example, and large intakes of calcium and vitamin D can create kidney stones.
In recent decades, numerous studies have explored the connection between diet and health. 10 A significant portion of the population, particularly well-educated professionals, has absorbed the results of this research and modified lifestyles in accordance with recommendations. It is a demanding process, not only to read the outpouring of research but also to evaluate and act on suggestions. It is challenging for professionals to stay abreast of the field, and even diligent readers in the general public can be confused by highly distilled versions of research that are often presented without perspective by the press. 11 For example, eggs were once portrayed as a villain that raised blood cholesterol, but recent press reports observe that they are high in protein, contain little saturated fat, and pose no significant health risk if consumed in moderation by people without high cholesterol levels. 12 Salt has been linked with hypertension, but it now appears to be harmful only for those at risk for other reasons. No doubt some people concerned about their nutrition have simply thrown up their hands, retreating to familiar dietary habits or to whatever is affordable and tastes good.
The apparent confusion in the field makes it more difficult to assess the responsiveness of the public. Some people may be apathetic, but others trying their best may not change behavior simply because messages have not been consistent and powerful for long periods of time. Yet some recommendations have been consistently proclaimed, among them the benefits of eating fruits and vegetables and the risks of consuming large amounts of saturated fats. The outcomes of dietary choices are presented in Tables Bd559–622. The decline in per capita food consumption after World War II reflects the large increase in the birth rate and the corresponding change in the age distribution of the population. By the time baby boomers reached the teenage years in the 1960s, food consumption per capita had increased.
Evidence discussed in the Surgeon General's Report on Nutrition and Health establishes the benefits of consuming fruits and vegetables. Figure Bd-G shows that per capita consumption of fruit (fresh and processed) rose irregularly from about 229 pounds in 1970 to nearly 281 pounds in 1995, a gain of more than 20 percent. The growing year-round availability of fresh fruits and vegetables in grocery stores may have contributed to this trend. Similarly, over the same period, consumption of fresh and processed vegetables also increased by approximately 20 percent, from 335 pounds to 405 pounds. At the same time, per capita consumption of red meat, which tends to be high in saturated fat, declined from 192 pounds to 163 pounds. This 15 percent decline was offset in the diet by a gain of more than 50 pounds per person in poultry, which is relatively low in saturated fat. Per capita consumption of total fat fluctuated but remained roughly unchanged at 155 pounds between 1970 and 1994 (series Bd601). Despite a few favorable trends for health in the composition of American diets, there has been an impressive increase in the number of calories consumed per person per day (series Bd598), bolstered by a substantial increase in total caloric sweeteners (series Bd595).
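The percentage changes quoted above follow directly from the per capita figures; a minimal arithmetic check (the `pct_change` helper is ours, not from the source):

```python
# Verify the percentage changes cited for per capita consumption
# (pounds per person) between 1970 and the mid-1990s.
def pct_change(start, end):
    """Percentage change from start to end."""
    return (end - start) / start * 100

print(f"fruit:      {pct_change(229, 281):+.1f}%")  # gain of more than 20 percent
print(f"vegetables: {pct_change(335, 405):+.1f}%")  # approximately 20 percent
print(f"red meat:   {pct_change(192, 163):+.1f}%")  # decline of about 15 percent
```

The computed values (+22.7, +20.9, and -15.1 percent) match the text's "more than 20 percent," "approximately 20 percent," and "15 percent" characterizations.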
The benefits of moderate physical activity for health are now well established, particularly for reduction of cardiovascular disease, which is the leading cause of death (U.S. Department of Health and Human Services 1996). Unfortunately, it is impossible to obtain a substantial time series of information that consistently measures physical activity or fitness. It is clear, however, that Americans are becoming increasingly obese, and a decline in the physical demands of work combined with a lack of physical recreation are factors contributing to this trend. Activity can be measured in a variety of ways, including self-reports contained in diaries, logs, or recall surveys that tabulate participation in specific types of physical activity. Direct monitoring with mechanical or electronic devices, which detect duration and intensity of activity, is another approach, as is physiological measurement of air intake or carbon dioxide production. All of these approaches, however, are expensive or burdensome for participants, and self-reports may be inaccurate or lack consistency. Physical fitness can be assessed directly through endurance, muscular fitness, or body composition, but these measures, too, are expensive and burdensome and have not been collected over long periods of time for a representative cross section of the population. Given the growing importance of physical fitness to health, however, we can hope that results from a consistent monitoring program will be available in the future.

This chapter documents a revolution in American health over the past century and a half. Important changes have occurred in health status, lifestyle choices, and health care administration. Beginning in the late nineteenth century, cheap and highly effective public health measures led to the eradication or control of several major infectious diseases. By the middle of the twentieth century, chemotherapy had also enabled physicians to reverse numerous diseases after they appeared. These and other progressive steps in health practices led to significant improvements in health, including a gain of twenty-five years in life expectancy and nearly ten centimeters in average height of Americans.
Given the benefits of public health and modern medicine, significant new resources flowed into the industry. Realizing that good health improved productivity, many employers began making health insurance available to workers near the middle of the century. Government-sponsored programs such as Medicare and Medicaid arose in the 1960s and 1970s to expand health care coverage. Recently, employers have increasingly sought HMOs to provide medical services for their employees.
The health achievements of the twentieth century are remarkable in long-term historical perspective, but they should not be a source of complacency. Strains of bacteria resistant to antibiotics are evolving, and new infections such as AIDS and the Ebola virus have appeared in the past quarter century. Americans are falling behind several Western European countries in average height, despite the relatively large share of gross domestic product devoted to health care. Thus, numerous concerns remain for health and for the provision of health care services in America.
Figure Bd-B. Height of native-born men and women, by year of birth: 1710–1970
Figure Bd-C. Incidence of measles and acquired immune deficiency syndrome (AIDS): 1912–1998
Figure Bd-D. Percentage of persons covered by health insurance, by race and Hispanic origin: 1975–1997
Figure Bd-E. Health care expenditures as a percentage of gross national product: 1960–1994. Source: U.S. National Center for Health Statistics, Health, United States (1996–1997), Table 117, p. 250.
Figure Bd-F. Percentage of adults who smoke, by years of education: 1966–1994. Data are for persons age 25 and older.
Figure Bd-G. Per capita food consumption – vegetables, fruits, red meat, and poultry: 1970–1996. Figures include both fresh and processed fruits and vegetables.

Arora, Suchit. 2003. "The Relation of Sickness to Deaths: Evidence from the U.S. Army, 1905–39." Unpublished manuscript, April.
Banks, Amanda Carson. 1999. Birth Chairs, Midwives, and Medicine. University Press of Mississippi.
Berlant, Jeffrey Lionel. 1975. Profession and Monopoly: A Study of Medicine in the United States and Great Britain. University of California Press.
Cutler, David, and Ellen Meara. 2001. "Changes in the Age Distribution of Mortality over the 20th Century." National Bureau of Economic Research Working Paper number W8556, October.
Dublin, Louis I., and Alfred J. Lotka. 1938. An Era of Health Progress. Metropolitan Life Insurance Company.
Engerman, Stanley L. 1997. "The Standard of Living Debate in International Perspective: Measures and Indicators." In Richard H. Steckel and Roderick Floud, editors. Health and Welfare during Industrialization. University of Chicago Press.
Fogel, Robert W., and Dora L. Costa. 1997. "A Theory of Technophysio Evolution, with Some Implications for Forecasting Population, Health Care Costs, and Pension Costs." Demography 34: 49–66.
Fox, Daniel M. 1993. Power and Illness: The Failure and Future of American Health Policy. University of California Press.
Gallagher, Charlette R., and John B. Allred. 1992. Taking the Fear out of Eating: A Nutritionists' Guide to Sensible Food Choices. Cambridge University Press.
Gregg, Alan. 1956. Challenges to Contemporary Medicine. Columbia University Press.
Kahana, Eva, David E. Biegel, and May L. Wykle, editors. 1994. Family Caregiving across the Lifespan. Sage Publications.
Komlos, John. 1998. "Shrinking in a Growing Economy? The Mystery of Physical Stature during the Industrial Revolution." Journal of Economic History 58: 779–802.
Meeker, Edward. 1980. "Medicine and Public Health." In Glen Porter, editor. Encyclopedia of American Economic History. Scribner.
Pope, Clayne L. 1992. "Adult Mortality in America before 1900: A View from Family Histories." In Claudia Goldin and Hugh Rockoff, editors. Strategic Factors in Nineteenth Century American Economic History: A Volume to Honor Robert W. Fogel. University of Chicago Press.
Preston, Samuel H., and Michael R. Haines. 1991. Fatal Years: Child Mortality in Late Nineteenth-Century America. Princeton University Press.
Raffel, Marshall W., and Camille K. Barsukiewicz. 2002. The U.S. Health System: Origins and Functions. Delmar.
Rayack, Elton. 1967. Professional Power and American Medicine: The Economics of the American Medical Association. World Publishing.
Riley, James C. 1991. "Working Health Time: A Comparison of Preindustrial, Industrial, and Postindustrial Experience in Life and Health." Explorations in Economic History 28: 169–91.
Riley, James C. 2001. Rising Life Expectancy: A Global History. Cambridge University Press.
Shorter, Edward. 1996. "Primary Care." In Roy Porter, editor. The Cambridge Illustrated History of Medicine. Cambridge University Press.
Smille, Wilson George. 1955. Public Health: Its Promise for the Future; A Chronicle of the Development of Public Health in the United States, 1607–1914. Macmillan.
Steckel, Richard H. 1986. "A Peculiar Population: The Health and Nutrition of American Slaves from Childhood to Maturity." Journal of Economic History 46: 721–41.
Steckel, Richard H. 1995. "Stature and the Standard of Living." Journal of Economic Literature 33: 1903–40.
Steckel, Richard H. 1998. "Strategic Ideas in the Rise of the New Anthropometric History and Their Implications for Interdisciplinary Research." Journal of Economic History 58: 803–21.
Steckel, Richard H. 2000. "Alternative Indicators of Health and the Quality of Life." In Jeff Madrick, editor. Unconventional Wisdom: Alternative Perspectives on the New Economy. Twentieth Century.
U.S. Department of Health and Human Services. 1988. The Surgeon General's Report on Nutrition and Health. Department of Health and Human Services (Public Health Service) Publication number 88-50210. U.S. Government Printing Office.
U.S. Department of Health and Human Services. 1996. Physical Activity and Health: A Report of the Surgeon General. Superintendent of Documents.
Weatherall, Miles. 1996. "Drug Treatment and the Rise of Pharmacology." In Roy Porter, editor. The Cambridge Illustrated History of Medicine. Cambridge University Press.
1. For more information, see Chapter Ab on vital statistics.
2. For a discussion of the history of public health, see Smille (1955) and Riley (2001).
3. For a discussion of medical practices, see Shorter (1996). Weatherall (1996) discusses the evolution of medications.
4. Average heights increased by more than one inch from 1920 to 1930, when the data source changes from military records to the nationally representative NHANES. This was a period of substantial improvements in public health and personal hygiene, and thus most or all of the increase could have been genuine. But it is also possible that those who served in the military had more net-nutritionally deprived backgrounds on average, which would impart an upward bias to the series in 1930 compared with 1920.
5. For additional discussion, see Steckel (1995) and Komlos (1998).
6. The time series on various diseases are taken from a set of sources that are consistent, but unfortunately, they do not always encompass the time span of most dramatic decline. Although not entirely comparable, the time series presented could be supplemented by evidence on causes of death from insurance records, which are discussed in Dublin and Lotka (1938).
7. For a discussion of the evolution of health insurance, see Raffel and Barsukiewicz (2002), pp. 24–28.
8. Life expectancy at birth in each of these countries exceeds that in the United States.
9. See Chapter Bc on education.
10. For additional discussion of nutrition and health, see U.S. Department of Health and Human Services (1988) and Gallagher and Allred (1992).
11. Adding to the confusion, some recommendations that have been "overturned" were never based on solid evidence, and for various reasons the media has perpetuated some nutritional myths.
12. See "Health: Eating Smart for Your Heart," Time, July 19, 1999, pp. 40–54.