A History of HIV/AIDS Crisis: Global and National Contexts

I. Detection


In 1981, a rare form of pneumonia appeared in five young, previously healthy, gay men in Los Angeles. They also had the fungal infection candida, or thrush—most often seen in babies. At the same time, there were reports of men in New York and California with an unusually aggressive form of a generally benign skin cancer. On June 5th of that year the U.S. Centers for Disease Control (CDC) reported on these unusual clusters in its newsletter.[1] Over the following months, more of these disease clusters were discovered across the country, along with cases of generalized but abnormal inflammation of the lymph nodes. All of these are known as opportunistic diseases—diseases that occur more often or are more severe in people with weakened or compromised immune systems. By the end of the year, there were 337 reported cases of severe immune deficiency in the United States resulting in 130 deaths.


In June of the following year, a group of cases among gay men in Southern California suggested that the transmission of the immune deficiency was sexual. The syndrome was initially called Gay-Related Immune Deficiency. Soon, however, the disease was reported in heterosexual Haitian immigrants, hemophiliacs, recipients of blood transfusions, and intravenous drug users. In September of 1982, the CDC used the term “Acquired Immune Deficiency Syndrome” (AIDS) for the first time, defining it as “a disease at least moderately predictive of a defect in cell-mediated immunity, occurring in a person with no known cause for diminished resistance to that disease.”[2] Initially, eleven opportunistic infections and diseases were considered specific enough to be diagnostic for AIDS. Cases of the illness were also reported in the Caribbean, Africa, and Europe. A global epidemic had begun that, forty years later, continues to kill nearly one million people every year.


II. Calculation

A. Global

According to the Joint United Nations Programme on HIV/AIDS (UNAIDS), 75.7 million people have become infected with HIV since the start of the epidemic in 1981. In that same period 32.7 million people have died from AIDS-related illnesses. Globally, new HIV infections peaked in 1998 and AIDS-related deaths peaked in 2004.


Despite this apparent success in decreasing levels of infection and death, in 2019 (the last year for which statistics are currently available) around 1.7 million people were newly infected with HIV and around 690,000 people died from AIDS-related illnesses worldwide. As of 2021, around thirty-eight million people globally were living with HIV.[3]


B. National

The United States Department of Health and Human Services (HHS) and the CDC report that, in the United States, more than 700,000 people have died of AIDS since 1981. As of 2018 there were more than 1.2 million people living with HIV. That same year 37,968 people received an HIV diagnosis, while there were 15,820 deaths among people diagnosed with HIV.


HIV continues to have a disproportionate impact on certain populations. People from ethnic minority groups make up eighty-eight percent of all new cases. Young people aged 25-34 have the highest rate of infection, making up nearly one-third of all new cases. The economically disadvantaged have a far higher incidence of infection. The CDC states, “The lower the income, the greater the HIV prevalence rate.”[4] Gay men account for sixty-nine percent of all new cases. The highest rates of new diagnoses occur in the southern states.[5]


C. Local

Over 50,000 North Carolinians have been diagnosed with HIV, according to estimates compiled from the North Carolina Department of Health and Human Services (NCDHHS).[6] As of 2019, there were 38,400 people living with HIV in North Carolina. That same year 1,383 people were newly diagnosed with HIV/AIDS in the state.[7]


According to the CDC, North Carolina has the seventh highest rate of new diagnoses among US states.[8] Additionally, the CDC ranks the Charlotte metropolitan area (which includes the Belmont-Gastonia corridor) among the worst twenty-five percent of metropolitan areas in the nation for new HIV infections.[9] Because of this infection rate, Mecklenburg County is the target of an eradication program called “Getting to Zero Mecklenburg.”[10] However, in the southern United States, one quarter of new cases are identified in suburban or rural areas. Therefore, the state of North Carolina has allocated funds to expand Mecklenburg’s program of medical services and medication assistance to Gaston, Cabarrus, Union, and Anson counties.[11]


III. Manifestation


Long before the symptoms of immunodeficiency were detected among men in California, before they spread to people in the cities and towns of North Carolina, it is likely that somewhere between 100,000 and 300,000 people, mostly in Africa, contracted HIV. In or around the year 1908, in or near a wedge of land between two rivers in what is today Cameroon—and was then the German colony of Kamerun—the Simian Immunodeficiency Virus (SIV) was transferred from a chimpanzee into a human being through blood-to-blood contact, probably while hunting, creating the Human Immunodeficiency Virus (HIV). The virus began to spread slowly through human sexual contact along river, forest, and road networks. However, studies of variants of the virus reveal that this was not the first, nor the only, time the virus was transferred from a chimpanzee to a human being. In fact, there have been at least twelve, and probably many more, instances of cross-species transmission of SIV into humans. In these other cases, spread of the virus was limited and halting, perhaps even dying out among its human hosts. This case would be different. The difference was Western colonialism.


The virus spread downriver to the cities of Brazzaville and Léopoldville (today Kinshasa) built as the respective capitals of French Equatorial Africa and the Belgian Congo, facing off against one another across the Congo River. The European colonial powers promoted single-crop agriculture, resource extraction, and urbanization that led to increasingly concentrated populations. The majority of the labor force was male. This disrupted traditional customs around marriage and family, leading to more casual sexual activity with an increased number of partners. The women in the developing urban areas, though smaller in numbers and in percentage of the population, were also somewhat liberated from traditions regulating their behavior. Many remained unmarried or divorced for long periods. Some turned to sex work, which was encouraged by colonial authorities. Moreover, forced labor camps had poor sanitation, meager diets, and grueling labor demands, creating a perfect storm of conditions for the development of weakened immune systems. The workers on colonial enterprises supplemented their diet with wild game, which contributed to an increase in hunting that was exacerbated by the growing availability of firearms and the environmental disruptions caused by colonial exploitation. This likely led to further incidence of human exposure to SIV.


Even well-meaning attempts to vaccinate people against smallpox, dysentery, and sleeping sickness may have had catastrophic consequences. Multiple injections to hundreds or thousands of people were administered with only a handful of syringes. One 1916 sleeping-sickness control expedition treated 89,000 people using just six syringes. Although the importance of sterilization techniques was well understood, they were not applied to African populations. Transfer of pathogens was inevitable. Serial inoculation against smallpox—relying on the derivation of new vaccines from the pustules that grew upon recently vaccinated people—possibly led to further spread of the virus. The technique had been abandoned in Europe twenty years before, as the likelihood of infectious transmission was well known. It was still used in the African colonies. (A theory suggesting tainted oral polio vaccines may also have helped to spread HIV has been largely, but not entirely, discredited.) Despite the growing numbers of people who were infected with HIV during this period, the virus remained undetected given the high mortality rates, rudimentary or nonexistent health care, and wide range of opportunistic diseases across colonial central Africa. What is certain is that, by the late 1950s, Africans were contracting HIV and dying of AIDS.


IV. Expansion


On the 30th of June, 1960, the rapacious Belgian government reluctantly relinquished the Congo. Belgian administrators abruptly departed in droves. Their departure created a huge vacuum in the newly independent nation, as the Belgian administration had specifically avoided educating its subjects or developing the structures of a functioning state. There were no Congolese doctors and very few teachers. Outsiders had to be recruited as physicians, educators, lawyers, functionaries, administrators, and professionals. Many came from Haiti, already speaking French, proud of their African roots, and with few opportunities at home. Close to five thousand Haitians took up positions in the Congo, the second largest contingent of foreigners serving in the country. The turmoil of the early years of independence led to the US-sanctioned assassination of the first prime minister, Patrice Lumumba, the adoption of the name Zaire for the country, and the rise of the US-backed dictator Mobutu Sese Seko. It also led to the exit of most Haitians from the country by the early 1970s. Some of those who returned home to Haiti brought HIV with them.


The disease spread rapidly through the Haitian population. Later studies of five hundred young Haitian mothers revealed that by 1982 almost eight percent of them had been infected with HIV. At the same time, Americans began harvesting Haitian blood for the growing American market for plasma. Haitians were paid three dollars per liter to undergo a process that filtered out the liquid plasma while returning the rest of the blood cells to the donor. It did not filter out blood-borne viruses. Donors could be infected by the blood of others undergoing the procedure. They could also pass along any infection to those who were ultimately transfused with the plasma. One company, Hemo-Caribbean, exported sixteen hundred gallons of plasma to the United States monthly in 1971. This plasma was used by American hemophiliacs, many of whom ultimately died of AIDS. Meanwhile, Haitian migration to the United States, particularly to Miami, was increasing. Haiti itself was also a sex-tourism destination for gay men. Sometime between 1966 and 1972, genetic sequencing of HIV reveals, the virus migrated, one way or another—from an infected person, or infected plasma—into the United States, first to New York City, then, by 1976, to California.[12][13]


V. Proliferation


Increasing evidence suggests that forms of pneumonia and tuberculosis, retrospectively diagnosed as HIV-related, began to afflict heroin addicts and homeless people in New York City by the mid-1970s. At the time these were not recognized as signs of a new infectious disease because the people affected had precarious access to health care. Moreover, in the spring of 1975, the government of New York City suffered a fiscal crisis that led to the closing of many social services, particularly those agencies with health responsibilities. This made it even less likely that the health problems of marginalized populations would come to the attention of authorities. The few people aware of the situation did not bother to investigate.[14]


However, in cities like New York, Los Angeles, and San Francisco, the virus began to proliferate in communities of gay men due to the prevalence of multiple sexual partners and the higher transmission rates associated with gay sexual activity. The existence of a new infectious disease was finally recognized. Still, due to the long incubation period of HIV (which can last up to a decade before symptoms become apparent), by the time the first cases were reported, the prevalence of HIV infection in some communities exceeded five percent.


Early histories of the epidemic often incorrectly identified one man, French-Canadian airline steward Gaëtan Dugas, as “Patient Zero,” or the individual responsible for spreading the virus across the United States. Dugas had in fact been labelled “Patient O”—the letter O, for “Out-of-California,” not the numeral zero—in a CDC study of HIV transmission in California that linked him directly to eight cases of the disease and, indirectly, to forty others. He was a handsome, charming, flamboyant, widely travelled, and sexually active gay man in the earliest years of the epidemic. He was not the original carrier of HIV in the United States. Unfortunately, his identification as the source of HIV in the US obscured the timeline for manifestation of the disease, further stigmatized it as a “gay cancer,” and took attention away from the apathetic US government response to the crisis. Gaëtan Dugas died of AIDS in 1984.[15]


Despite the prevalence of HIV among intravenous drug users and recipients of blood transfusions, initial descriptions of the disease in the United States conveyed the impression that the virus was confined to gay men. By late 1982, however, it was also clear that heterosexual women were contracting HIV from their sexual partners. In June of 1983 the first reports of AIDS in children led to fears that the virus could be passed through casual contact, feeding a popular panic. It soon became evident that children were contracting the disease from their mothers before or during childbirth, or from breastmilk. By September of that year the CDC had ruled out transmission by casual contact through food, water, air, or surfaces. This did little to quell rising widespread fear of the disease.


The virus itself had still not been identified, though a worldwide effort was underway. Between 1981 and 1984 three independent teams of researchers—one in France, two in the United States—identified the virus responsible for AIDS, calling it by three different names. A committee of retrovirologists settled on Human Immunodeficiency Virus (HIV) in 1986. After years of accusations, denials, lawsuits, and squabbles over royalties among the rival teams, the leaders of the French team, Luc Montagnier and Françoise Barré-Sinoussi, emerged with the 2008 Nobel Prize for Medicine.


When, in 1984, the US Department of Health and Human Services communicated this discovery of the virus that caused AIDS, it also announced the development of a blood test that could screen for the virus, and predicted that a vaccine would be available within two years. But no vaccine ever came. By the end of 1985, cases of AIDS had been reported in every region of the world. Over the course of the next thirteen years, despite significant developments in testing and treatment, HIV infections and deaths from AIDS would continue to climb sharply. By 1990, nearly twice as many Americans had died of AIDS as were killed throughout the entire Vietnam War. The following year the number of Americans infected with HIV reached one million. And, by 1995, AIDS had become the leading cause of death among all Americans aged 25 to 44.


In 1998, the year infections peaked, the World Health Organization (WHO) announced that AIDS was the fourth biggest cause of death worldwide and the number one killer in Africa. An estimated thirty-three million people globally were living with HIV and fourteen million people had died from AIDS. By the year 2000, WHO reported that more than ninety-five percent of all HIV-infected people were living in the developing world, where ninety-five percent of AIDS deaths had also occurred.


VI. Reaction


The first mainstream news coverage of the developing health crisis was provoked by the CDC announcement on June 5, 1981 that clusters of gay men were suffering from compromised immune systems. The Associated Press, the Los Angeles Times, and the San Francisco Chronicle covered the statement. The New York Times would not cover the story for another month, at which time it reported, “There was no apparent danger to nonhomosexuals from contagion.”[16] The first televised news report on the disease from NBC began with anchor Tom Brokaw assuring viewers that only “the lifestyle of some male homosexuals” triggered symptoms.[17] Due to this widespread belief that the disease exclusively affected gay men, the common initial reaction from both the broader public and political leaders was homophobic prejudice and apathy.


When, in October of 1982, the Reagan administration was first asked publicly about its response to the growing crisis, Larry Speakes, presidential press secretary, turned the question into a joke at the expense of gay men. The press corps laughed when he responded to a question about the president’s awareness of the disease by saying, “I don’t have it,” then insinuatingly asking, “Do you?”[18] Six months later, in May of 1983, Pat Buchanan, who would soon become White House Communications Director, wrote, “The poor homosexuals: they have declared war upon nature, and now nature is exacting an awful retribution.”[19] The following month televangelist Jerry Falwell, founder of the Moral Majority and head of the religious right political coalition that was, in large measure, responsible for Reagan’s election as president, declared that “AIDS is the wrath of a just God against homosexuals…AIDS is not just God’s punishment for homosexuals; it is God’s punishment for the society that tolerates homosexuals.”[20] Conservative commentator William F. Buckley, Jr. called for the tattooing of every gay man on the buttocks.[21] Other conservative Republicans talked about quarantining HIV-positive people and “rounding up” all gay men. When Dr. Marcus Conant—a dermatologist, one of the first physicians to diagnose and treat AIDS, and founder of the San Francisco AIDS Foundation—met that same year with the White House liaison for medical care, Judi Buckalew, her response was that the crisis was a legal problem, not a medical problem. Because of who the sexual partners of AIDS patients were, she told Conant, “These people are breaking the law.”[22]


Ronald Reagan did not publicly utter the word AIDS until 1985, when over 12,000 Americans had died and the virus had begun to spread through populations of blood transfusion recipients, injection drug users, and some heterosexual women. He did not give a speech on the epidemic until 1987.[23] The Reagan Administration also withheld payments to the World Health Organization (WHO) as HIV/AIDS emerged as a global killer, leading to charges of gross hypocrisy and moral bankruptcy as the United States was both the epicenter of the AIDS epidemic and the only country in default to WHO. Reagan finally released the payments in September of 1988 as he prepared to give his final address to the United Nations General Assembly.


As it became clear that AIDS was not simply a “gay cancer,” individuals far beyond the at-risk populations overreacted to potential exposure. Mass hysteria resulted. In 1983 a New York doctor was threatened with eviction from his office building for treating patients with AIDS. In 1985, Ryan White, a teenager from Indiana who acquired AIDS through contaminated blood products used to treat hemophilia, was banned from school. (White would die of AIDS in 1990 at the age of eighteen.) That same year the US Department of Defense announced that it would test all new military recruits for HIV and would reject those who tested positive. In December, polls showed that a majority of Americans favored forced quarantine of AIDS patients. The following year, 1986, Ricky Ray, a nine-year-old hemophiliac with HIV, was barred from school and his family’s home was burned by arsonists. (Ray died in 1992.) In 1987, a Florida judge ruled that a young girl with AIDS could only attend school if she remained inside a glass enclosure. That was also the year that the United States introduced a controversial immigration policy that stopped people with HIV from entering the country. (The ban was not lifted until 2010.) The US Congress directly linked AIDS to homophobic bigotry with the adoption of the Jesse Helms Amendment that banned the use of federal funds for AIDS education materials that “promote or encourage, directly or indirectly, homosexual activities.”[24]


In response to government inaction, homophobic prejudice, and public hysteria, protest movements and community-based service providers emerged to demand and model more effective approaches to the disease and its victims. In 1982, the Gay Men’s Health Crisis was established in New York City as the world’s first AIDS-oriented charitable organization. Its objectives were to raise money for research and to provide legal and counseling services for those suffering from the disease. The NAMES Project AIDS Memorial Quilt was conceived in 1985 as a way to humanize the disease’s victims by commemorating individual lives on quilt panels measuring three feet by six feet—the size of a typical grave. It was first publicly displayed on the National Mall in Washington DC in 1987. The quilt was last displayed in its entirety in 1996, when it covered the whole of the National Mall. (When the quilt was displayed in 2012 the panels were rotated on and off display by volunteers to ensure that the entire work could be seen.) As of 2020 it consisted of 48,000 individual memorial panels and was the largest piece of community art in the world.


Out of continuing frustration with limited funding for research, scarce access to existing treatment, and lack of public and political commitment to confronting the epidemic, the AIDS Coalition to Unleash Power (ACT-UP) was founded in 1987 to launch direct and active protests against government agencies and corporations by engaging in civil disobedience. Over the following several years members of ACT-UP would chain themselves inside the New York Stock Exchange, block off and shut down the Food and Drug Administration, storm the National Institutes of Health, interrupt on-air presentations of nightly network news, cover Senator Jesse Helms’ home with a giant condom, and scatter the ashes of people who had died of AIDS on the White House lawn. ACT-UP has been widely credited with raising awareness of HIV/AIDS, changing the perception of people living with AIDS, and effectively transforming public health policy.[25]


Two symbols that have become associated with HIV/AIDS activism also helped raise public consciousness of the epidemic and increase compassion for its victims. In 1987, the pink triangle, used in Nazi Germany to identify those sent to concentration camps for homosexuality, was adopted as a symbol of AIDS activism, often accompanied with the slogan “Silence=Death.”

In 1991, the Red Ribbon Project launched to create a symbol of compassion for people living with HIV. The red ribbon became an international symbol of AIDS awareness.


These efforts, along with a growing number of celebrity deaths, gradually began to change public attitudes and government policy. In 1985, the actor Rock Hudson died from AIDS—the first high-profile fatality. Major media coverage of AIDS tripled over the next six months. Other well-known figures to die of AIDS would include pianist Liberace, fashion designer Perry Ellis, choreographer Alvin Ailey, photographer Robert Mapplethorpe, philosopher Michel Foucault, painter Keith Haring, singer Freddie Mercury, and dancer Rudolf Nureyev. These deaths helped to familiarize Americans with the disease and those who suffered from it.


Other celebrities lent their fame to the promotion of compassion and empathy. In 1985 actress Elizabeth Taylor formed the American Foundation for AIDS Research to raise money to fight the disease, as well as to fight for the acceptance of AIDS patients in society. For many Americans, it was Elizabeth Taylor who brought the issue of HIV/AIDS into the mainstream. In 1987, Diana Spencer, Princess of Wales, made international headlines when she was photographed shaking the hand of an HIV-positive patient in a London hospital. She became a passionate advocate for people living with HIV and spoke forcefully against HIV/AIDS-related stigma and discrimination. In 1990, the Americans with Disabilities Act, passed by the US Congress, prohibited discrimination against individuals living with HIV/AIDS.


When the tennis player Arthur Ashe announced that he had contracted HIV from a blood transfusion received during heart bypass surgery, his efforts to promote education about the disease provoked widespread admiration and compassion before his death from AIDS in 1993. But it was the announcement in 1991 from professional basketball superstar Earvin “Magic” Johnson that he was HIV positive—and retiring immediately from the sport—that really shattered public stereotypes and misconceptions about the disease, and prompted greater awareness of emerging treatments and the necessity of making them more widely available.


Popular culture began to reflect the changes in public attitudes. In 1993, Angels in America, Tony Kushner’s play about AIDS, won both the Tony Award for best play and the Pulitzer Prize for best drama. Philadelphia, a film starring Tom Hanks as a lawyer with AIDS, opened in theaters that same year, becoming the first major Hollywood movie about the disease. For his performance, Hanks won the Academy Award for best actor. The next year, 1994, Pedro Zamora, a charismatic young man living with HIV, appeared in the cast of MTV’s popular reality television show, “The Real World.” His death from AIDS later that year provoked a national outpouring of grief. The growth in public awareness, concern, and compassion was accompanied by developments in understanding and treating the disease.[26]


VII. Education, Medication, and Prevention


In June of 1981 the CDC formed the first task force to study what would become known, the following year, as AIDS. By 1982, the CDC was publicizing cases spread through blood transfusions, perinatally from mother to child, and through heterosexual contact. In 1983 the CDC established the first National AIDS Hotline to respond to public inquiries about the disease. It also published studies demonstrating that the virus was spread sexually or through exposure to blood. These studies emphasized that the virus could not be spread through casual contact, food, water, air, or environmental surfaces.[27]

The following year, 1984, the CDC stated that avoiding injection drug use and the sharing of needles were effective means of preventing the transmission of the virus. In October, the San Francisco Health Department asked for and received court orders that closed bath houses and private sex clubs in the city due to high-risk sexual activity. Similar court orders in New York City and Los Angeles followed within a year. Some gay activists supported these measures as necessary to curb the spread of HIV while others protested the closures as discriminatory.


It took four years from the outbreak of the epidemic—until 1985—for the US Food and Drug Administration (FDA) to license the first commercial blood test to detect antibodies to the virus. At that point, blood banks began to screen the US blood supply. The CDC reported that more people were diagnosed with AIDS in 1985 than in all previous years combined. New AIDS cases had increased eighty-nine percent compared with 1984. Of all AIDS cases diagnosed by 1985, fifty-one percent of adults and fifty-nine percent of children had died. On average, AIDS patients were dying fifteen months after initial diagnosis. Public health experts predicted there would be twice as many new AIDS cases in 1986.


1986 was also the year that one member of the Reagan administration—perhaps its most sincerely devout evangelical Christian—stood up to homophobic bigotry and treated the epidemic simply as a health crisis. On October 22, Dr. C. Everett Koop issued the Surgeon General's Report on AIDS.[28] Due to his conservative views on social issues and his staunch opposition to abortion, Koop’s nomination to the position of Surgeon General had been bitterly opposed by public health groups, women's and gay rights groups, and medical associations. He was frequently referred to as “fanatical.” It took eight bruising months of hearings before he was finally confirmed by the Senate.


But from the beginning of his time in office, even as religious conservative politicians pushed against addressing the growing AIDS crisis, Koop sought authorization to report publicly on the epidemic. When Ronald Reagan finally granted permission, and preparation of the report began, key advocates for people with AIDS were nervous, even fearful, of what the report might contain. They were apprehensive that it would be little more than an obsessive condemnation of homosexuality and extramarital sex. But Koop refused to allow Reagan's domestic policy advisers to review the report before its release. And critics were pleasantly taken aback that the report was guided by principle and not by ideology.


Dr. Koop wrote in the report's foreword, “At the beginning of the AIDS epidemic, many Americans had little sympathy for people with AIDS. The feeling was that somehow people from certain groups ‘deserved’ their illness. Let us put those feelings behind us. We are fighting a disease, not people.” The report made it clear that HIV could not be spread casually and called for a nationwide education campaign (including early sex education in schools), increased use of condoms, and voluntary HIV testing. It shocked many people with its explicit descriptions of oral, vaginal, and anal sex, as well as commercial sex work. But Koop made no excuses for putting medicine above moralizing. He frequently responded to critics by saying, “I'm the nation's doctor, not the nation's chaplain.”


Two years later, Koop oversaw the production of an eight-page brochure, “Understanding AIDS,” that was sent to every household in the United States—107 million copies in all.[29] When the Reagan Administration would not authorize funding for the brochure, Koop worked with Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases (NIAID), to arrange an interagency transfer from the NIAID budget to that of the Surgeon General’s office. Printing presses worked twenty-four hours a day for weeks and thirty-eight boxcars were deployed to deliver the flyers to postal facilities. Once again, Koop did not allow advisers outside the health department to clear the flyer’s content. It was the first time that the federal government provided explicit sex information to the public, describing safe sex practices and recommending the use of condoms and spermicides.


The document urged the public to display compassion and support for those who were infected, stating, “Who you are has nothing to do with whether you are in danger of being infected with the AIDS virus. What matters is what you do.” In interviews, Koop acknowledged that the issues involved were not ones that many families were used to discussing openly, but he urged parents, grandparents, children, and teenagers to set a time to sit down and review the information together. He expressed the hope that schools would do the same thing. Some critics pointed out that this effort toward public education came more than two years after similar efforts had begun in European countries. But fellow evangelicals in the Reagan administration were practically apoplectic that the brochure had been distributed at all.[30]


In March of 1987, the FDA finally approved the first promising treatment for HIV, the antiretroviral drug zidovudine (AZT). Over the next four years studies demonstrated the utility of the drug in treating adolescents and children as well as in preventing transmission of the virus from HIV-infected pregnant women to their infants. By 1992, the first combination drug therapies for HIV were introduced—using two or more drugs in concert—that slowed the replication of the virus in the body, decreasing damage to the immune system and slowing the transition to AIDS. Despite these promising developments, in 1992 AIDS became the leading cause of death for US men aged twenty-five to forty-four.[31]


Three years later, in 1995, the FDA approved the first protease inhibitor—a class of drug that blocks viral replication and can reduce the amount of HIV in the body (called the viral load) to undetectable levels. This began a new era of Highly Active Antiretroviral Therapy (HAART) that quickly became the standard for HIV care. Once incorporated into clinical practice, HAART brought about an immediate decline of sixty to eighty percent in rates of AIDS-related deaths. By 1996, HIV was no longer a leading cause of death for American men aged twenty-five to forty-four. The CDC, in 1997, reported the first decline in the number of AIDS deaths since the start of the pandemic. Due largely to the use of HAART, AIDS-related deaths in the US declined by forty-seven percent compared with the previous year.[32] By 1998 national treatment guidelines issued by the CDC standardized antiretroviral therapy for adults and adolescents with HIV.[33]

However, these breakthroughs in the treatment of HIV and the prevention of AIDS were available only to those who lived in countries with access to medication and who could afford the extremely expensive treatment regimen, which at the time cost over $10,000 per patient per year. Within the US in 1996, AIDS remained the leading cause of death for African-Americans aged twenty-five to thirty-four. In 1998, the CDC reported that African-Americans accounted for forty-nine percent of all US AIDS-related deaths. The AIDS-related mortality rate for African Americans was almost ten times that of whites. Despite this grim news, the press release that announced the findings was headlined “New Data Show Continued Decline in AIDS Deaths.”[34]


As the end of the millennium neared, the World Health Organization (WHO) announced that HIV/AIDS was the world’s fourth biggest killer and the number one killer in Africa. WHO estimated that thirty-three million people were living with HIV worldwide, and that fourteen million had died of AIDS. To combat what was still very much a global crisis, UNAIDS had been established in 1996 to advocate for global action on the epidemic and to coordinate HIV/AIDS efforts across the UN system. UNAIDS began negotiations with five pharmaceutical companies to reduce antiretroviral drug prices for developing countries. Ramping up the pressure, in 2001 the World Trade Organization announced the Doha Declaration, which affirmed the right of developing countries to buy or manufacture generic medications to meet public health crises such as HIV/AIDS. In response to diplomatic and political insistence, generic drug manufacturers offered to produce discounted forms of HIV/AIDS drugs; several major pharmaceutical manufacturers also agreed to offer reduced drug prices to developing countries. This reduced costs to three hundred fifty dollars per patient, per year.


Despite these efforts, in July of 2002, UNAIDS reported that, worldwide, ten million young people, aged fifteen to twenty-four, and almost three million children under the age of fifteen, were living with HIV. HIV/AIDS was still by far the leading cause of death in Africa. Because the virus in Africa spread primarily through heterosexual contact, women accounted for fifty-five percent of infected African adults; among those aged fifteen to twenty-four, women accounted for sixty-six percent of those infected. Average life expectancy in sub-Saharan Africa had fallen from sixty-two years to forty-seven years as a result of AIDS. Over the course of 2002, approximately 3.5 million new infections occurred in sub-Saharan Africa, and the epidemic claimed the lives of an estimated 2.4 million Africans.


Explanations for the devastating levels of HIV/AIDS in Africa included: rapid urbanization, long-distance migration of men between homes and workplaces, the prevalence of tropical diseases and parasites, limited access to healthcare, inconsistent communication networks, low literacy rates, cultural taboos against the discussion of sex, low rates of condom use, and, perhaps, sexual practices that prioritize fertility and lineage perpetuation above exclusivity. Some African countries responded with frank education campaigns, widespread condom distribution, the promotion of male circumcision (which limits transmission of the virus) and the reduction of female circumcision/genital mutilation (which may facilitate transmission of the virus), and calls for a return to pre-colonial sexual practices (with limited promiscuity and sex work)—as well as demands for more affordable antiretroviral treatment.


Provoked by the shocking toll the disease was still taking around the world, in January of 2003 US President George W. Bush announced, and Congress authorized, the creation of the United States President’s Emergency Plan for AIDS Relief (PEPFAR), an $18 billion, five-year plan to combat AIDS in countries with a high number of HIV infections. To date, this is the largest commitment by any nation to an international health initiative dedicated to a single disease. It has been called “arguably the most positive of President Bush’s legacies,”[35] “the most lasting bipartisan accomplishment of the Bush presidency,”[36] and “among the Bush administration's most notable foreign-policy successes.”[37] President Barack Obama made PEPFAR a core component of his Global Health Initiative, a six-year, sixty-three billion dollar effort launched in 2009 to develop a comprehensive approach to global health care in low- and middle-income countries.


By 2010 it appeared that these efforts were beginning to have some success. That year WHO, UNAIDS, and the United Nations Children’s Fund published a joint Universal Access Report showing that 5.25 million people were receiving antiretroviral therapy and that 1.2 million people had started treatment that very year—the largest annual increase yet recorded. Also in 2010 the US lifted its ban on entry for people with HIV, allowing the International AIDS Conference to be held in the United States, in 2012, for the first time in more than twenty years. By 2015, African deaths from AIDS had declined by one-third from five years previously.


Meanwhile, medical advances continued. In 2002, the FDA approved the first rapid diagnostic HIV test kit. With near one hundred percent accuracy, and providing results in as little as twenty minutes, the new test allowed for wider availability of HIV detection. In 2010, the NIH began publishing studies of pre-exposure prophylaxis (PrEP) demonstrating that, when uninfected people take HIV medicine, their risk of contracting HIV if they are exposed to the virus is substantially reduced. PrEP can stop HIV from taking hold and spreading throughout the body. In 2012, the FDA approved Truvada for use as PrEP. Taken daily, the drug reduces the risk of contracting the virus through sexual activity by about ninety-nine percent. In 2014, consumer protection elements of the Affordable Care Act went into effect that barred insurers from discriminating against customers with pre-existing conditions and eliminated annual limits on coverage. Both of these provisions increased access to, and the affordability of, medication and treatment for people living with HIV/AIDS.


Yet access to care and treatment in the US remains uneven and is particularly impacted by age, ethnicity, and income. In 2007, the CDC reported that over 562,000 people had died of AIDS in the US since 1981.[38] Five years later, in 2012, the CDC released data showing that only a quarter of all Americans with HIV had the virus under control, and that African-Americans and younger people were the least likely to receive ongoing care and effective treatment. Young people between the ages of thirteen and twenty-four represent twenty-six percent of new HIV infections each year and sixty percent of these youth are unaware they are infected with HIV.[39]


The CDC released several reports in 2014 demonstrating gaps in care and treatment. Only half of gay and bisexual men diagnosed with HIV were receiving treatment for their infections. Among Latinos who have been diagnosed with HIV, just over half (fifty-four percent) were receiving care. Fewer than half (forty-four percent) of those diagnosed had been prescribed antiretroviral therapy, and just thirty-seven percent had achieved viral suppression. Only thirty percent of all Americans with HIV had the virus under control—nearly two-thirds of those with the virus had been diagnosed but were not receiving regular and consistent medical treatment. The New York Times reported in 2017 that, as a group, America’s Black gay and bisexual men have a higher HIV prevalence rate than any nation in the world.[40]


Most recently, new challenges for HIV/AIDS prevention and treatment have appeared. In 2015, a new and more aggressive strain of HIV was discovered that can progress to AIDS within just two to three years of exposure to the virus. The following year, 2016, researchers began encountering patients who failed to respond to some antiviral drugs, indicating that treatment-resistant forms of the virus were becoming increasingly common. There is still no vaccine against HIV. There is no cure.[41]


In 2015, UNAIDS announced the 90-90-90 Project. It aims to have ninety percent of all people living with HIV know their HIV status; ninety percent of all people with diagnosed HIV infection receive sustained antiretroviral therapy; and ninety percent of all people receiving antiretroviral therapy achieve viral suppression. There is still some way to go before reaching these targets. As of the end of 2020, eighty-one percent of people living with HIV knew their status; sixty-seven percent of all people with an HIV diagnosis were receiving antiretroviral therapy; and fifty-nine percent of people had achieved viral suppression. As of 2019 there were still forty-five hundred new HIV infections globally every day. Of these, fifty-nine percent were in sub-Saharan Africa. Forty-seven percent were among women. Thirty-one percent were among young people aged fifteen to twenty-four. And ten percent were among children under fifteen years of age.[42]


VIII. HIV/AIDS and the Catholic Church


The historical response of the Catholic Church to HIV/AIDS juxtaposes doctrine and practice—unlike Surgeon General Koop, the Church acts as both chaplain and physician. The Catechism of the Catholic Church teaches that homosexual acts are intrinsically disordered.[43] Moreover, in 1986, at the height of the AIDS crisis in the United States, the Congregation for the Doctrine of the Faith released a letter to the Bishops of the Catholic Church which decried an “overly benign interpretation [of] the homosexual condition itself,” concluding that the “inclination of the homosexual person…is a more or less strong tendency ordered toward an intrinsic moral evil.” Thus, homosexuality itself, “must be seen as an objective disorder.”[44] In addition, until 2010, the Catholic Church adamantly opposed the use of condoms in any circumstances to prevent the spread of HIV.


At the same time, the Catholic Church was (and is) a major provider of medical care to HIV/AIDS patients. Catholic hospitals were among the first to treat HIV/AIDS patients in the early 1980s. The Vatican estimates that Catholic Church-related organizations provide approximately twenty-five percent of all HIV treatment, care, and support throughout the world. Much of that work is done in developing countries.[45]


The US Conference of Catholic Bishops was, in 1987, the first church body to address the pandemic through a document entitled “The Many Faces of AIDS: A Gospel Response.”[46] It called for medical and pastoral care for those infected with HIV and condemned discrimination against people with AIDS, while rejecting the use of condoms to halt the spread of the disease. That same year, during a visit to San Francisco, a city hit hard by the pandemic, Pope John Paul II physically embraced AIDS patients. The Pope spoke of the importance of giving medical care to people with AIDS and condemned discrimination, while also saying that AIDS resulted from an “abuse of sexuality.”[47]


In November of 1989, the Vatican held a conference on AIDS that drew over one thousand delegates from eighty-five countries and included church leaders as well as top scientists and researchers. It also highlighted the delicate, divisive nature of the issue of HIV/AIDS for the Catholic Church. At the close of the conference, Pope John Paul II pledged the full support of the Catholic Church to those who were battling AIDS. He said the church was called both to help prevent the spread of the disease and to care for those infected with it. At the same time he deplored what he viewed as the destructive behaviors that spread the disease.


The conference also included presentations like that of Italian political scientist Rocco Buttiglione (whose nomination to the European Commission in 2004 was withdrawn due to his views on homosexuality) entitled “AIDS: the Wrath of God?” at which he asserted that AIDS was a “divine punishment…sent by God to call people back to truth and justice…a strong sign against the evils of our time.”[48] Although people with AIDS were invited to attend the conference, none were allowed to speak. At the opening session of the conference, John Cardinal O'Connor, the archbishop of New York, urged the public to treat AIDS patients with respect and not as public health hazards or outcasts. He also reiterated his opposition to condoms as a method to prevent the transmission of HIV and condemned those who “refuse to confront the moral dimensions of sexual aberrations or drug abuse.”[49]


Those latter comments sparked what became the best-known incident involving AIDS and the Catholic Church. The AIDS Coalition to Unleash Power (ACT UP) decided to carry out a dramatic protest during a mass at St. Patrick’s Cathedral in New York City presided over by Cardinal O’Connor. On December 10, 1989, forty-five hundred protesters assembled, many entering the cathedral for a silent “die-in.” As O’Connor began the homily, protesters threw themselves to the floor. However, when it appeared that the demonstration was having little effect, some protesters began chanting slogans, blowing whistles, throwing condoms, and chaining themselves to pews. The organist began playing loudly to drown out the mayhem. More than one hundred people were arrested and taken out of the cathedral. One protester took a consecrated host and crushed it, letting the crumbs fall to the floor. That desecration became the focus of media attention, causing a tidal wave of condemnation that led many members of ACT UP (including the man responsible for the deed) to believe their exploit had gone too far.[50]

Yet it also seemed to have impacted Cardinal O’Connor. While O'Connor remained committed to the Catholic teaching that homosexual acts and the use of condoms are sinful, he made a greater effort to minister to people dying of AIDS and their families. He visited Saint Vincent's Catholic Medical Center, where he cleaned the sores and emptied the bedpans of over one thousand patients and was fully supportive of priests who ministered to gay men and others with AIDS. In later years, O'Connor endorsed statewide hate crime legislation that further censured offenses motivated by animus toward an individual’s sexual orientation.[51]


In 1991, Pope John Paul II responded to a request from the Jesuit order and named St. Aloysius Gonzaga the patron of those suffering from AIDS. Aloysius was a Jesuit who worked among typhoid victims during an epidemic in Rome in 1591, dying himself of the disease at the age of twenty-three. This provided AIDS-afflicted Catholics with their own saint to venerate and to ask to intercede for them for mercy and protection.[52]


The approach of the Catholic Church to the HIV/AIDS crisis was complicated by tensions over homosexuality in the priesthood and the incidence of AIDS among members of the clergy. In January of 2000, after eighteen months of investigation, the Kansas City Star newspaper published a controversial series of articles on cases of AIDS among Catholic priests. It claimed that priests were dying of AIDS at a rate at least four times that of the general US population. After acknowledging criticism of the authors’ methods, the Star lowered its estimate to a death rate double that of the adult male population. The Jesuit magazine America argued in an editorial that the Star story exposed “the real problem” which was “tension in a church that defines homosexuality as ‘intrinsically disordered’ but relies on many gay men to celebrate the sacraments and carry out the work of the church.”[53]


The total number of priests who have HIV or have died of AIDS will remain unknown as diagnoses are kept confidential, no official records are maintained, and estimates vary widely. While there were reports of bishops and superiors of monastic orders ostracizing HIV-stricken priests or ejecting them from their positions, there are many more accounts of such priests receiving an outpouring of concern and support, particularly from the congregations or schools where they served. Thus, the experience of priests with HIV/AIDS closely mirrors the Catholic Church’s broader approach to the disease.[54]


In 2010, Pope Benedict XVI—while not endorsing the general use of condoms or changing official Catholic Church teaching, which still strongly opposes contraceptives—allowed that the use of condoms by men and women infected with HIV could be a “first step in the direction of moralization,” a “first assumption of responsibility, taking into consideration the risk to the life” of a sexual partner. While the Pope did not change Church doctrine, he was indicating that condoms could be a responsible option in preventing disease.[55] More recently, Catholic Church officials have consistently lobbied drug makers and western governments to increase the provision of antiretroviral medicines to poor nations. Pope Francis invited pharmaceutical executives to meetings in Rome with representatives from the United Nations and the United States.[56]


Beyond the hierarchy of the Catholic Church, out of the limelight and with little regard for controversies over doctrine, hundreds—indeed thousands—of more ordinary Catholics simply stepped up to offer care and compassion to those in need. Among these were the members of the Sisters of Mercy of North Carolina. Endeavoring to follow their vocation—serving those in society with unmet needs—they decided, in 1988, to begin ministering to people living with AIDS. It soon became clear that housing was the greatest need in the area; at that time, there were no local housing options for persons living with HIV/AIDS. The House of Mercy officially opened its doors to residents on May 18, 1991, and, in the years since, has provided a home for 360 men and women living with HIV. Today, The House of Mercy remains a family care home licensed by the North Carolina Department of Health and Human Services. The facility is able to serve up to six residents at a time in a home-like setting.[57]

IX. Conclusion


The global epidemic of HIV/AIDS began with colonial oppression, environmental destruction, and economic exploitation in Africa and the Caribbean. It expanded to the United States and then worldwide due to the urgent but fraught search by human beings for the gratification of mutual physical connection. It was prolonged due to the callous calculations and apathy of politicians; aggravated by fear, ignorance, and bigotry; and intensified—particularly in the developing world—by avarice and venality.


However, the story of the epidemic also contains innumerable instances in which kindness and compassion were shown—in tangible and practical ways—toward those suffering from disease. It includes devoted and heroic investments of time, intellect, and capital to the search for treatment, to campaigns for prevention, and to the stimulation of empathy and understanding. It demonstrates that it is possible for governments and corporations to put people above profit and power, to value human lives above capital, and to esteem “the least of these” as greater than mundane and momentary gain.


None of that, of course, erases the damage done by bigotry and discrimination, or the suffering endured by so many individuals who lived with, and died from—who are living with and dying from—HIV/AIDS. The examples of individuals and institutions who acted callously and cruelly offer a reproach to our own tendencies to do the same. Equally, the instances in which people extended humane charity and creative benevolence inspire us as we continue to encounter those who suffer from this disease and from others that will impact us now (as we have all recently experienced with Covid-19) and in the future.







[1] CDC, “Pneumocystis Pneumonia—Los Angeles,” Morbidity and Mortality Weekly Report (MMWR) (June 5, 1981, 30/21); p. 1-3 (https://www.cdc.gov/mmwr/preview/mmwrhtml/june_5.htm)

[2] CDC, “Current Trends Update on Acquired Immune Deficiency Syndrome (AIDS) --United States” Morbidity and Mortality Weekly Report (MMWR) (September 24, 1982, 31/37) p. 507-508,513-514. (https://www.cdc.gov/mmwr/preview/mmwrhtml/00001163.htm#:~:text=Editorial%20Note%3A%20CDC%20defines%20a,diminished%20resistance%20to%20that%20disease.)

[3] UNAIDS Data 2020. (https://www.unaids.org/sites/default/files/media_asset/2020_aids-data-book_en.pdf)

[4] CDC, “Communities in Crisis: Is There a Generalized HIV Epidemic in Impoverished Urban Areas of the United States?” Paul Denning, MD, MPH and Elizabeth DiNenno, PhD (2019) (https://www.cdc.gov/hiv/group/poverty.html#:~:text=The%202.1%25%20HIV%20prevalence%20rate,that%20have%20generalized%20HIV%20epidemics.)

[5] CDC, “Diagnoses of HIV Infection in the United States and Dependent Areas, 2018 (Updated),” HIV Surveillance Report 2020; 31. (https://www.cdc.gov/hiv/pdf/library/reports/surveillance/cdc-hiv-surveillance-report-2018-updated-vol-31.pdf); CDC, “Estimated HIV incidence and prevalence in the United States, 2014-2018. HIV Surveillance Supplemental Report 2020; 25 (No. 1).” (https://www.cdc.gov/hiv/pdf/library/reports/surveillance/cdc-hiv-surveillance-supplemental-report-vol-25-1.pdf)

[6] Compiled from annual HIV Surveillance Reports from the HIV/STD/Hepatitis Surveillance Unit, Division of Public Health, North Carolina Department of Health and Human Services. (https://epi.dph.ncdhhs.gov/cd/stds/annualrpts.html)