
Chapter 1

Primary Care Roots

Big Doctoring is about a way of medical life, an approach to health care and healing, a skill set, and a mind set that is called primary care. It is about doctoring that is humanist, comprehensive, efficient, and flexible, doctoring that builds on the legacy of the past and the rich tradition of care in medicine and nursing. To that it adds the science and technology of the contemporary world, applied in a measured, evidence-based, and coordinated fashion. In our current culture of medical care—noteworthy for its sophistication of technology, its inexorable cost increases, the absence of uniform access to its benefits, a high rate of medical errors, and the uncertainty of many outcomes—primary care provides a foundation for health care that blends good science with good judgment.

Yet primary care is not a philosophy or a vocational inclination shared by everybody in the healing sciences. In fact, for decades a tug-of-war has been taking place between advocates of generalist approaches to medical care and proponents of narrower, specialty-based philosophies. The fifteen primary care clinicians profiled here are men and women whose work is characterized by a broad approach to patient care and the community: they are practitioners of applied generalism, dedicated holists. Their work, however, takes place in the larger setting of the economics and politics of health care in the United States, and an understanding of this larger context is essential to any discussion of the present and future of primary care in the United States. The definition of primary care, its values, the role of generalism in contemporary life, and a review of the history of primary care in the United States are all elements of that understanding.


What Is Primary Care?

Like the enigmatic smile of Leonardo da Vinci's Mona Lisa, primary care means different things to different people. Unlike many elements of health care today, primary care is not defined by an organ system (cardiology), a place (the outpatient department), or a financial precept (capitation payment). It is an idea that, by turns, describes a type of practitioner, a domain of service, or a philosophy of care. This definitional ambiguity means that many roles can be ascribed to primary care, but it also can be an area of troublesome imprecision and confusion.

To clinicians primary care is a label that describes certain types of practitioners—even though no one is actually schooled or board-certified in a discipline called "primary care." Family physician, pediatrician, internist, physician assistant, and nurse practitioner are the professions most often grouped under the heading of primary care, but on occasion many other medical and nonmedical specialties claim primary care status. To patients primary care can mean the provider (a word that is itself greeted with ambivalence) who knows them best, giving comprehensive, "high touch" care over the years. Or it can mean the designated medical grinch who bars their access to coveted specialty services. Payers and policymakers see primary care as a set of attributes and a level of care that promote the rational and cost-effective delivery of medical services in a culture much given to unwarranted subspecialty care and the use of the hospital.

These definitions are further complicated by an important but largely unarticulated divergence of opinion about the ethos of primary care, about why primary care matters. Beyond its functional role, what is the moral role of primary care in society? Many proponents have seen the primary care movement as a battle for the soul of medicine. This struggle has been especially apparent in poor and rural communities that have been largely abandoned or neglected by contemporary medical practitioners and where primary care is viewed as a special mission to serve the underserved. In both mainstream and marginalized communities the role of primary care is seen as bringing competent, comprehensive doctoring to bear, healing in an omnibus sense. In this view, biopsychosocial skills are important, as are capabilities in areas as diverse as epidemiology, Spanish, community organizing, and short-term psychotherapy. This might be called the "social justice" view of primary care.

Set against this is the "industrial efficiency" view that sees primary care as the foundation of all systems of health care. From this perspective, the primary care provider is simply the "field captain" best qualified to make sense of a complicated and often inefficient system. Not only can the primary care clinician treat the majority of the problems that patients bring to the medical system, but he can carry out well-informed triage for hospital and specialty referrals. It is this latter gatekeeping capability that has special appeal to the business-oriented values of health systems planners and insurers.1

Working in the midst of these competing and sometimes conflicting definitions, the Institute of Medicine of the National Academy of Sciences convened a panel in 1994 to deliberate on the future of primary care. That panel produced a working definition of primary care that has stood up well in the ensuing years and certainly captures the essence of primary care as it is being lived by the practitioners whose lives are documented here: "Primary care is the provision of integrated, accessible health care services by clinicians who are accountable for addressing a large majority of personal health care needs, developing a sustained partnership with patients and practicing in the context of family and community."2 Here I have limited my discussion of primary care to clinicians whose training and practice conform most closely to this definition: general internists, general pediatricians, family physicians, physician assistants, and nurse practitioners.



The values of primary care have always been present in the medical care system under one label or another. These are the values of medical generalism (a term I use interchangeably with primary care), and they center on the treatment of the whole person with attention to his or her biology, psychology, and social and community situation. Primary care encompasses mental health, public health, and community health as well as personal health care. It involves both first-contact care and care that is given over time. Comprehensiveness, continuity, and coordination have always been associated with primary care practice, as have accessibility and accountability.3 The proverbial horse and buggy, the home visit, the doctor who delivered the baby and treated grandma in her final illness, affordable care, and general practice are traditional ideas that have given expression to the values of primary care. To these concepts have recently been added family medicine, health promotion and disease prevention, a "medical home," the general internist and pediatrician, geriatric care, patient and community education, and the new disciplines of the nurse practitioner and the physician assistant.

The generalist approach to medical care is more than a tradition or a stylistic preference. Primary care brings benefit to both the care of the individual and to the health care system as a whole. From a personal perspective, "knowing a doctor who knows you" is a widely held value but one that can be hard to fulfill in a system dominated by specialists skilled in one organ system or another but who disavow responsibility beyond their own area of expertise. A friend recently discharged from a hospital stay captured this sentiment poignantly: "It would have been fine if I could have dropped my body off the way I do my car at the repair shop, and picked it up a week later. The problem was that I had to stay with my body, and that was awful. Nobody really took charge. Everybody took care of their own thing, and I was left ultimately to fend for myself. Whatever happened to doctoring?" A skilled clinician who values discussion, education, and prevention, and who can make referrals and provide insider advice on the system, is an asset to individuals and families. Finding and retaining a clinician with these qualities is not always easy, but few among us would question the value of having a proficient generalist as a personal physician.

Studies have documented time and again that systems of care based on the generalist model cost less, provide excellent quality, and have high levels of patient satisfaction.4 In nations such as Canada and Great Britain, which have built their medical care systems on an explicit primary care model, health care systems enjoy higher levels of citizen satisfaction despite considerably smaller expenditures than in the United States.5 Similarly, the health maintenance organization (HMO) movement, from its earliest manifestations in the 1930s to the present, has always held strong primary care to be the basis of any sensible system of quality care.

Primary care has its detractors who argue that it is a bad idea or, at best, a nice idea from the past that has diminishing relevance in the present and the future. This argument stems from the premise that doctoring today has emerged from centuries of practice that were based largely on tradition, personal belief, and hokum. Today's medicine is increasingly evidence-based and scientifically complex. This being the case, the argument goes, specialism, reductionism (focusing on the parts rather than the whole), and the division of knowledge and practice into ever-smaller units are natural and necessary developments. Because no practitioner can possibly stay abreast of the exploding world of clinical information, specialization and subspecialization are requirements of competency. And because the growth of knowledge is accelerating, the current arguments for clinical reductionism will be even more compelling in the future. The idea of the general practitioner is hopelessly ill suited to the epoch of the heart transplant and laser surgery.

This view is not wrong in its assessment of science or the challenges facing clinical medicine. Rather it is wrong in its assumptions about the human being. Despite the magnificent march of science, the human being remains a complex animal whose body and mind, self and family, person and community are linked in ways that will resist the effort to compartmentalize every pain or blemish as the domain of an expert but narrow specialist. Doctoring as serial specialty visits has not worked well in the past and, despite the onrush of specialized knowledge, will not work well in the future. The late Avedis Donabedian, the leading American scholar and proponent of quality measurement in health care, reflected this in a commentary about his final illness. Dr. Donabedian noted the irony in his need to coordinate much of his own care, observing that quality seemed to mean only "technical competence and, more recently, superficial attention to the interpersonal process. Keep the patient happy, be nice to the patient, call him Mr. or Mrs., remember his name. . . . Today people talk about patient autonomy but often it gets translated into patient abandonment."6 The increasing complexity of medical science, in fact, will create the need for more—not less—integrative medical care. To the traditional generalist values of comprehensiveness, continuity, and coordination will be added imperatives from the emerging system: interpretation, integration, and navigation.


Generalism in Human Enterprise

Generalism as a phenomenon is not limited to medicine. To some extent, there is a competition in all human endeavor between the instinct to keep things whole, complete, and general, and the tendency to distinguish, sort, and reduce. The famous distinction between "lumpers" (those who prefer pulling things together) and "splitters" (those given to dividing things wherever possible) is evident in our daily lives. The way we organize our desks or our refrigerators, for instance, is subject to lumping and splitting preferences, as are our patterns of friendship and our choice of jobs. Virtually any task can be approached holistically or in a reductionist manner, though some clearly commend themselves more to one approach than the other.

Generalism in human terms can be defined as a tendency to remain broadly focused, protean, and varied in worldview and activity. The generalist is interested in the big picture with all of its nuances, connections, and complexities. Generalism requires a willingness to think broadly and to maintain sets of knowledge and ideas that are disparate and often not mutually reinforcing. Since the generalist's domain is typically large and complex, it often lacks the certainty and predictability that typifies the world of the specialist. The generalist needs to have a reasonable tolerance for living with uncertainty.

Generalism in human enterprise and as an approach to professional life is vitally important to society as a whole. The generalist labors in broad areas of human endeavor that call for an integrator and a coordinator, someone who can see the big picture and work accordingly. In earlier times the generalist was the norm. The family farmer, the local school teacher, the owner of the general store, the lawyer, and the banker were all general practitioners. Over the past century, however, developments in transportation, communications, and information management have created an environment where vocational specialization is possible, useful, and encouraged. Consequently, most professions have shifted toward more specialized training and practice. The benefits of reductionism are apparent in the growing variety and sophistication of educational opportunities, choice of foods, and telephone service. The specialization of knowledge and the expansion of consumer services seem to go hand in hand and suggest an inevitability to the march of specialization.

The development of specialism is favored not only by the growth of technology but also by the universal desire for personal mastery. One is more likely to achieve proficiency and excellence if one can reduce one's task to the smallest and most specific elements possible. Labor efficiency likewise favors dividing tasks in such a way as to assure a high level of worker competence in a focused and repetitive area. Specialized training and experience tend to reinforce each other and assure efficient production from a compartmentalized work force. In short, reducing knowledge, information, and tasks to units that are as small and as specialized as possible addresses important needs of individuals and societies.

The very word "specialist" implies a superiority over the nonspecialist in the hierarchy of knowledge. The specialist, indeed, has often earned that title through advanced training that has upgraded her skills in a specified area and, in the process, narrowed her field of endeavor. The result is a presumption of advanced practice and high technical competence that society tends to reward with increased prestige and compensation. Implicit also in the hierarchical idea of competency is the assumption that specialist work is more difficult and more taxing than generalist work; that teaching everything to the third grade is easier than teaching calculus to high school seniors; that doing mental health intake at a community health center demands less expertise than practicing analytic psychotherapy; that being a family physician is less challenging than being an anesthesiologist.

And yet what is the evidence that specialty work is harder than generalist work? The specialist deals, by definition, in a limited range of challenges. Teaching calculus involves a much narrower curriculum and a far more restricted set of issues than teaching arithmetic, social studies, science, and reading to eight-year-olds while simultaneously coping with classroom discipline and inquiring parents. The specialist in any field may deal with severe manifestations of problems, but the problems will have predictability and repetitiousness. If a problem falls outside the specialist's zone of competence, he refers it on. The generalist does not have that luxury; she starts with ownership of all of the problems that patients bring. Triage and referral are important roles for the generalist, but diagnosis and treatment come first. The elementary school teacher, the general dentist, the parish priest, the neighborhood police officer are examples of practicing generalists, all of whom have training and capabilities that provide the basic foundation of much of societal life. Generalist practice is multifaceted, unpredictable, and often complicated, calling on skills drawn from various disciplines. Sorting and weighing problems, deciding on programs of action, and knowing when and how to refer (sending a student to special education, calling for police backup, referring to a psychiatrist) are important generalist skills, competencies of exquisite importance for effective human services, yet they receive far less attention and approbation than finely honed skills in very limited areas of professional service.

Education is a domain in which these issues have a long history of debate. As young people progress through school, their education tends to move from the general toward the specific. Never again will students be as generalized as they are in, say, the sixth grade, when it is fair game for the teacher to ask questions about Egyptian history, the division of fractions, the chemical composition of salt, and the Spanish word for table. As grade school progresses into high school, students begin to specialize by selecting some disciplines and avoiding others. General knowledge takes a back seat to a growing store of special knowledge in the sciences, or the arts, or in automobile mechanics. By the time a student enters the workforce, her realm of competence usually has become quite focused, and knowledge from other fields is considered of marginal importance.

The role of general education at the university level has been hotly debated over the years. Should post-secondary education produce graduates with a broad exposure to knowledge and the ability to further educate themselves, or should it train individuals with specific vocational capabilities to assume places in the workforce? Although areas of concentration ("majors") are the norm, virtually all American colleges mandate some quantity of general education—the dreaded language, or science, or humanities requirement. These requirements represent the educational establishment's effort to hold the line for generalism amid the ubiquitous and powerful pressures to specialize.

Similar tensions exist in business, with many executives building careers in one or another aspect of management—finance or human resources or communications—and climbing toward the top with little sense of the corporate whole. The organizational strategist Peter Senge opens his popular book The Fifth Discipline with the following: "From a very early age, we are taught to break apart problems, to fragment the world. This apparently makes complex tasks and subjects more manageable, but we pay a hidden, enormous price. We can no longer see the consequences of our actions; we lose our intrinsic sense of connection to a larger whole. When we then try to 'see the big picture,' we try to assemble the fragments in our minds, to list and organize all the pieces. . . . The task is futile—similar to trying to reassemble the fragments of a broken mirror to see a true reflection."7


The History of Primary Care in America

Throughout the twentieth century, the practice of medicine in the United States was marked by a struggle between the specialist and the generalist. Specialists have harvested the products of the rapid growth in science and technology over this period and put them to work in clinical practice. Although the practice of generalist physicians has likewise become more sophisticated, generalists have tended to labor in the backwash of the specialty surge, continuing to provide care to much of the population in a relatively uncelebrated fashion. While the first fifty years of the century were characterized by the relatively uncontested retreat of the generalist physician, the next fifty saw generalists identify and redefine themselves in a variety of ways and initiate a fight for their position and role in medicine. By the century's end there was substantial intellectual, political, and commercial support for the generalist concept.

When the twentieth century began, the vast majority of American physicians were general practitioners treating, as well as they could, all of the maladies that patients brought to their doorsteps. They were, perforce, generalists wrestling with the medical, surgical, obstetrical, and psychiatric problems of people young and old, urban and rural, well heeled and not-so-well heeled. By current standards their science was limited, but they were present in significant numbers throughout the country, accepting payments in cash and in kind for their labors.8 The concept of specialties in medicine did not exist to any great extent in the nineteenth century. Although an occasional urban physician might achieve a reputation for expertise in one or another area of medical practice, formalized training beyond medical school did not really exist, and most doctors went to work as general practitioners on receipt of their diploma. Rapid developments in medical science in the latter years of the nineteenth century, however, set the stage for dramatic changes in medical practice in the twentieth. Anesthesia, antisepsis, the germ theory of disease, and the development of the diagnostic X ray were among the most significant of these developments.

The formal differentiation of specialty practice began early in the twentieth century with the formation of the American College of Surgeons in 1913. Surgical leaders argued that the skills and techniques involved in surgery required special training and competencies different from those practiced by the general practitioner. Internists followed suit, forming the American College of Physicians in 1915. The course followed by most ensuing specialty groups was to establish a "college" or "academy" whose membership was limited to physicians who had received training in the field and who specialized in the area. In the 1930s board examinations were introduced as measures of special competence and as requirements for membership in specialty organizations.

The norm, nonetheless, remained the GP. In 1932 the Committee on the Costs of Medical Care, sitting as the first national body to study health care in the United States, concluded that "each patient would be primarily under the charge of the family practitioner . . . (and) . . . would look to his physician for guidance and counsel on health matters and ordinarily would receive attention from specialists when referred."9 Yet the pace of change was to increase dramatically with World War II and its aftermath. The organized use of medical manpower by the armed services during the war favored specialty-trained physicians and, in fact, provided training to many in areas such as surgery and anesthesia. Following the war, the G.I. Bill was made available to physicians leaving the military and entering residency programs, providing financial incentives for postgraduate training that had never existed before. The twin postwar forces of the rapid growth of employer-sponsored, private health insurance and continued technological developments moved medical care toward the hospital and away from the community—an evolution that greatly favored specialists.

Between 1942 and 1954, the number of residency positions in the country jumped from 5,796 to 25,486, and the number of specialties grew to nineteen.10 The majority of students graduating from medical school were choosing residency programs of three or more years that would qualify them to take specialty board exams. Those who entered practice as GPs after a single year of internship became the exception rather than the rule, and the GP was replaced as the principal provider of care for the family by various specialists, many of whom were hospital-based and none of whom treated the whole family. By 1960 specialists outnumbered generalists in practice.

The rebound of the generalist started in the 1950s with the idea that a generalist could be trained as a "specialist" with a broad set of competencies. In 1961, Kerr White published an essay in the New England Journal of Medicine titled "The Ecology of Medical Care," in which he used epidemiological analyses of patterns of medical care to show that in a population of 1,000 people, 250 would seek some form of medical care in the period of a month.11 Of those, nine would be hospitalized, but only one in a university teaching center—and yet that setting was where virtually all of medical education took place. He argued that we needed to pay more heed to training of "primary physicians" who in fact worked where most medical care took place. This conclusion received concurrence from three national committees that were impaneled during the mid-1960s by leadership groups in medicine. The Coggeshall report (1965, the Association of American Medical Colleges), the Millis Commission report (1966, the American Medical Association), and the Willard Committee report (1966, the American Academy of General Practice) spoke variously about the need for reviving, defining, and upgrading the training and practice of the generalist physician.

The first substantive moves in the revitalization of the idea of a primary physician came from general practice. As early as the mid-1950s, the AMA undertook an examination of general practice, releasing the Sawyer Committee report in 1955 which called for the expansion of GP training and led to the first GP residency programs in the early 1960s. These programs developed slowly at first, but following the establishment of the American Board of Family Practice in 1969 and the advent of federal funding for family medicine residency training in 1971, they grew rapidly. In 1971 the American Academy of General Practice renamed itself the American Academy of Family Physicians, formally completing the molt of the old GP into the new family physician.12

The second phase of the generalist resurgence was the recognition in the 1970s that training programs in internal medicine and pediatrics had largely become way stations on the road to subspecialization. If generalism were to persist in medicine and pediatrics, attention and legitimacy would need to be given to generalist values and capabilities in these well-established disciplines. During the mid-1970s, the federal government, through its health professions funding (Title VII of the Public Health Service Act), and several private foundations—notably the Robert Wood Johnson Foundation—began providing explicit support of training programs in general pediatrics and general internal medicine.

The third phase of the new generalism was the emergence during the same period of two new, nonphysician disciplines in primary care—the nurse practitioner and the physician assistant. Jointly seen as additions to the medical workforce to deal with the widely perceived shortage of physicians, these two disciplines sprang from quite different roots.13 The nurse practitioner discipline successfully tested the premise that nurses could be trained at an advanced level to take on certain diagnostic and treatment activities once held to be the exclusive domain of "medical practice." The physician assistant idea was triggered by the return from Vietnam of military medical corpsmen who had rendered major medical care on their own but could find no employment or career tracks in civilian life. Federal Health Professions Act support was also important to the growth of these disciplines, which occurred slowly at first, but at an increasing rate in more recent years.

These achievements, however, did not mean that the generalist ideal was once again securely planted at the center of the health care system in the United States. The steady growth in specialty residency programs and the continued decline of the GP meant that generalists, outnumbered by specialists by about 1960, fell to 37 percent of physicians in 1970 and to just under 33 percent for the decade 1990-2000. The actual number of generalist physicians grew during this period because of a doubling in the number of medical school graduates in the United States, with the net effect of a slight increase in the ratio of primary care physicians to population. The numbers of nurse practitioners and physician assistants in practice grew steadily, with most of the former and about half of the latter engaged in primary care. Through the 1980s, however, these five groups of consciously generalist practitioners, possessing federal funding, certifying authorities, and professional academies, could do no more than hold the line against the predominance of specialist practice. The nadir in medical education was reached in 1991 when an all-time low of 14.6 percent of graduating students indicated their intentions of becoming primary care physicians.14

It was increased public debate about expense and inequity in the medical system that led to the health care reform movement of the early 1990s and provided new impetus for the generalist movement. The continued escalation of medical costs in the United States troubled both patients and the business community. The continued presence of a large uninsured population and, not insignificantly, poorer health indices than those of many other developed nations expanded the debate. Both in general public deliberations and in the president's Health Care Reform Task Force, the importance of primary care as a foundation for cost-effective medical care received broad endorsement. Simultaneously, the growth of managed care in the country accelerated, stimulating the employment market for generalist providers and adding palpability to the pro-primary care policy arguments.

Suddenly, primary care was "in." Medical student interest rose rapidly, hospitals and health systems competed to purchase primary care practices, and the number of nurse practitioner and physician assistant training programs grew quickly, with 58,500 N.P.s15 and 40,000 P.A.s16 estimated to be in practice by 2000. Some physician specialties reported difficulty finding positions for their recent trainees. Many specialists and their organizations began asserting that they in fact practiced primary care, in the hopes of favorable treatment in future compensation or training systems. The primary care provided by various specialists in the course of their practices was posited by some as a good strategy to meet care needs. Many specialties began to worry about competing with themselves and considered controlling their own numbers. For a moment, the generalist was in an unaccustomed position of approbation and demand.

The failure of President Clinton's health care reform legislation did not relieve the pressure for change in the system. The business community, payer for much of the cost of employment-based health insurance, wanted reform—by which it principally meant cost containment. If government-led reform was not to be, business was prepared to invoke "the market," a powerful distributional mechanism from which health care had previously been relatively protected. Managed care became the instrument of that "reform," the agent of the market, the institution with which increasing numbers of patients and physicians had to deal. But managed care was not a fixed entity. It began to evolve quickly, offering a staggering variety of "products" covering service delivery, finance, and risk. Some health plans, such as those with a long history of providing HMO-based care, represented serious efforts at restructuring the health care system in a financially responsible way, with a focus on systematizing quality care. Others were unapologetic business schemes designed to wring profits from the inefficiencies found everywhere in the delivery of health care. Both versions saw primary care as key to a rational and efficient system and together, often touting the "gatekeeper" model, they succeeded in moving primary care to the center of the health care stage and gave the generalist a visible if somewhat ambiguous prominence. Primary care providers were at the core of most managed care arrangements—as clinicians, as care coordinators, and sometimes as gatekeepers empowered to make decisions about specialty referrals and procedures. Some arrangements actually provided personal financial incentives to the primary care provider to limit referrals, hospitalizations, and lab work, thus creating a constant conflict of interest. It was this latter role that contributed to primary care's sudden reputation as medicine's designated miser, limiting procedures and pinching pennies.
This, in turn, contributed to new patient demands for "choice," meaning the ability to opt out of primary care. "Choice" quickly became one of the planks of "patient rights." In the process, some critics labeled primary care as bureaucratic, unfriendly, or unnecessary—attributes contrary to the ethos of primary care and ones that generalist practitioners vehemently disavowed.

Ironically, then, the move to managed care in the United States made the "primary care provider" a much better known but also a much more controversial idea. The very success of primary care as a central player in the delivery of health care also burdened the concept with the problematic responsibility of cost containment. Medical student interest in primary care peaked in the late 1990s and started to decline.17 Managed care plans began promoting options that allowed patients to circumvent primary care and go directly to specialists. Despite the two-to-one predominance of specialists nationally, some began to argue that the United States had too many primary care providers and training programs should be cut back.

The managed care environment prompted changes in the way medicine was practiced in virtually all settings. Primary care providers, in particular, reported that they were required to see more patients per hour, per session, and per year, often for incomes that were falling. As with other physicians, the advent of prior approvals, specified formularies (limitations on drugs that could be prescribed), and practice profiling (computer-generated analyses and criticisms of physicians' prescribing and test-ordering habits) raised the hackles of many in primary care.

But managed care was not the only force creating uncertainty. The "hospitalist"—a physician caring only for hospitalized medical patients—emerged as a new player in the 1990s and was cautiously adopted by a growing number of institutions.18 The hospitalist idea firmly divided the worlds of inpatient and outpatient medicine, aligning primary care with ambulatory care and inserting the hospitalist as a formalized practitioner of "secondary care." From one perspective this change clarified and simplified the role of primary care, but it also impinged on continuity of care for those generalists with hospital as well as ambulatory practices. Hospitalists have made the argument that they are generalists in the hospital setting and that practice as a hospitalist is an opportunity for primary care practitioners. It is likely that the efficient division of labor inherent in the hospitalist idea will continue to prove attractive to some hospitals and that the concept is here to stay. Although it may be difficult to decide whether the hospitalist is "us" or "them" from a purely generalist point of view, most primary care physicians have adapted to the presence of hospitalists and find them to be an asset in patient care.19

One area of internal challenge for primary care is the increased clinical authority asserted within the field of nursing. Throughout the country nurses have been successful at amending state clinical practice acts to expand the scope of practice and prescriptive authorities, as well as direct compensation provisions for nurse practitioners (and physician assistants). Although this has led to increasing numbers of brush wars with state medical societies and the AMA, these legislative changes have enabled the increasing numbers of nurse practitioners (and physician assistants) to realize their clinical potential in a way that would never have been possible under the old statutes. In 1999, Dr. Mary Mundinger and associates at Columbia University published a study comparing the experiences of a group of patients whose primary care, including hospitalizations, was completely managed by nurse practitioners with those of a matched group who were treated by primary care physicians. On balance, the outcomes for the two groups were without substantial difference.20 While there are questions about the generalizability of the study, its implications raise a new set of issues for primary care—and for medicine and nursing. Can nurse practitioners actually supplant physicians? Should nurse practitioners supplant physicians? The situation raises philosophical questions as well. If nurse practitioners can "practice medicine," are they not doctors? If nurses are being trained to be doctors, what will become of the traditional stand-alone values of nursing? And if doctors and nurses engage shoulder-to-shoulder in the practice of medicine, are there two professions or one?

"Alternative medicine"—treatments from chiropractic to aroma therapy, from multivitamins to massage therapy, on which patients spend billions of dollars annually—emerged in the 1980s and 1990s as an important element of patient self-care that variously perplexed, angered, and chastened physicians. The increased availability of information (good and bad) and ever growing patient expectations for desired outcomes have contributed to the growth of alternative medicine. Dissatisfaction with physicians and the often inscrutable, fragmented system has spurred it on as well. Here lies an important opportunity for primary care as custodian of the doctor-patient relationship. By attending to the frustrations of patients and by being generally versed in the more popular "alternatives," the generalist can provide protection, instruction, and an affirmation of good doctoring.

While all of these recent developments challenge the field of primary care in one way or another, they all relate closely to it and derive credibility and energy from it. Primary care—how we care broadly for people and populations—is, in fact, at the center of many of the most important health and health policy debates of our time. As these controversies demonstrate, primary care enters the twenty-first century as an integral and important part of health care in America. Despite the unimaginable growth in medical science and technology that has taken place over the past hundred years and the inevitable demise of the GP, the offspring of general practice are well established today in medicine and in nursing. Within medical schools, primary care teaching is the basis on which all medical education is built. In policy deliberations, primary care is seen as the key to future strategies to provide service to the underserved and, ultimately, health care coverage to everyone in the country.

The men and women whose lives are chronicled in Big Doctoring tell us where primary care has been and what it has done. In their stories, some themes of the future emerge. Most of these people find great satisfaction in their work and receive enormous appreciation and affection from the patients they care for. These basic themes of human effectiveness and gratification are omnipresent. The practitioners speak with pride about their work, a sense of common utility in their practices, a satisfaction with the teaching that they do, and a general sense of having come up the hard way, sometimes having gone against the advice of their mentors in school who counseled against primary care. They report problems as well. Managed care has tipped almost everybody off balance, including many of those who have chosen to work in managed care settings. Primary care incomes are decreasing in some quarters, and the market is tighter everywhere. But all across America there are physicians, nurse practitioners, and physician assistants who are dedicated to the science and art of primary care, who day and night use their minds, their technologies, and their hearts to practice big doctoring. They are busy building the future of primary care.