American anti-intellectualism and its impact on physicians

By Karen S. Sibert | December 1, 2018

For several years now, I’ve been the social media curmudgeon in medicine. In a 2011 New York Times op-ed titled “Don’t Quit This Day Job”, I argued that working part-time or leaving medicine goes against our obligation to patients and to the American taxpayers who subsidize graduate medical education to the tune of $15 billion per year.

But today, eight years after the passage of the Affordable Care Act, I’m more sympathetic to the physicians who are giving up on medicine by cutting back on their work hours or leaving the profession altogether. Experts cite all kinds of reasons for the malaise in American medicine: burnout, user-unfriendly electronic health records, declining pay, loss of autonomy. I think the real root cause lies in our country’s worsening anti-intellectualism.

People immigrated to this country to escape oppression by the well-educated upper classes, and as a nation, we never got past that distrust. Many Americans have an ingrained suspicion of “eggheads.” American anti-intellectualism propelled the victory of Dwight Eisenhower over Adlai Stevenson – twice – and probably helped elect Bill Clinton, George Bush, and Donald Trump.

Don’t make the mistake of thinking that American anti-intellectualism today is exclusive to religious fundamentalists and poorly educated people in rural areas. Look at the prevalence of unvaccinated children in some of America’s most affluent neighborhoods, correlating with the location of Whole Foods stores and pricey private schools. Their parents trust Internet search results over science and medical advice.

Remember when physicians were heroes?

For a long time, physicians were exempt from America’s anti-intellectual disdain because people respected their knowledge and superhuman work ethic. The public wanted doctors to be heroes and miracle workers. The years of education and impossibly long hours were part of the legend, and justified physician prestige and financial rewards. Popular TV series in the ‘60s and ‘70s lionized the dedication of Ben Casey, Marcus Welby, Dr. Kildare, and Hawkeye Pierce. In real life, heart surgeons Michael DeBakey, who performed the first coronary bypass operation in 1964, and Christiaan Barnard, who performed the first heart transplant in 1967, became famous worldwide.

But over the next decades, greater opportunities for women to enter medicine coincided with a decline in public respect for physicians. Though many women in medical school and residency worked just as hard as men — or harder — to prove themselves, the money and prestige didn’t follow. Women physicians working full-time today earn an average of 28 percent less than men, a gender wage gap that persists across specialties.

Could it be that the anti-intellectual tradition in America tolerates highly educated men in the doctor’s role, but can’t quite stomach giving the same respect and pay to highly educated women? Nearly everyone has heard of the Apgar score for assessing the health of newborn babies, but how many people know that Virginia Apgar, who developed it in 1952, was a physician?

Less formality, less respect

Even as more women entered the medical profession, other social trends dimmed the public image of physician infallibility. The tragic Libby Zion case in 1984, in which exhausted residents made a series of errors resulting in the death of the 18-year-old college freshman, prompted the first-ever law to limit resident work hours.

While Depression-era parents raised the “baby-boomer” generation to work hard without questioning it, their grandchildren in Generation X demanded extended parental leave, shorter work days, and more vacation time. “Work-life balance” became their mantra. Workplaces everywhere became more informal and dress codes more casual.

Patients and hospital staff began to address physicians by their first names. (As a Baylor medical student, I would have loved to see the fallout if anyone in the operating room at Methodist Hospital had addressed Dr. DeBakey as “Mike”.) Younger physicians, especially women, went along with it so they wouldn’t seem elitist or unfriendly; they started answering their phones saying, “This is Emma,” instead of “This is Dr. Smith.” It should come as no surprise that the line between physician and non-physician “care providers” began to blur.

The trap of “evidence-based medicine”

The concept of “evidence-based medicine” gained traction, mandating that every disease and procedure be managed according to a standardized set of guidelines. Never mind that science evolves, and that early research findings often don’t pan out in large-scale studies. Forget that some published research proves to be fraudulent or tainted by conflicts of interest. Ignore the fact that a protocol that works well for one disease may be exactly the wrong treatment for another, and that many patients have multiple diseases.

Individual physician judgment today is presumed wrong if it defies a standardized protocol. Compliance with checklists is viewed as proof of quality care. Ezekiel Emanuel, one of the architects of the Affordable Care Act, has even suggested that medical training be cut by 30 percent, as he believes healthcare by protocol makes all that book-learning unnecessary. In this view, all “providers” are interchangeable pawns.

Today, young physicians start their careers in a world where their advancement and pay may depend on patient satisfaction surveys, and the Internet fuels distrust of medical advice. They spend their days functioning as data-entry clerks, with more face-time in front of a computer than with patients. Innovation is stifled. Their clinical decisions are reviewed for compliance with protocols, and their hospitals are run by administrators for whom the delivery of healthcare quickly and cheaply is the main objective. They fear replacement by mid-level “providers” who can be trained to follow a protocol without question.

Today’s medical students and residents see the dissatisfaction all around them, and they note the growing number of physicians who want to change careers. Many look for pathways out of clinical care from the start of their training, obtaining additional degrees — in public health, information technology, bioengineering, or business administration — that can lead to creative careers outside medicine. Some young physicians turn away from clinical care to become entrepreneurs, designing smartphone apps or using mobile vans to deliver IV therapy for hangovers.

The dystopian future

American anti-intellectualism is growing worse. Our national inability to debate political issues with reason rather than emotion is a symptom of this disease. So is the distrust of higher education and of experts in every field including medicine. I wonder every day if we are being honest with college students about the future when we encourage them to apply to medical school.

The Association of American Medical Colleges predicts a shortage of up to 120,000 physicians by 2030, in both primary care and the specialties. A third of currently practicing physicians will be older than 65 within ten years. They’ll be retiring soon, and too many young physicians are already looking for an exit strategy. Even if we train more physicians, we won’t keep them in clinical practice unless the malaise in American medicine gets better.

Unless something changes, we may find ourselves in a dystopian future with only ten physicians who spend all their time in Washington writing “evidence-based” protocols, while people without the education to realize the full implications of what they’re doing will decide at your bedside which protocol applies to you. Are you feeling lucky?

Karen S. Sibert is an anesthesiologist who blogs at A Penned Point.  

Image credit: Shutterstock.com

