  • November 15, 2020 2:53 PM | Sarah Halter (Administrator)

    by Norma Erickson

    (photo: Sisters of Charity's first hospital, Indianapolis News, June 10, 1911)

    When Vice President-elect Kamala Harris made her speech the night her running mate Joe Biden was projected as the next President of the United States, she poignantly recognized “Women who fought and sacrificed so much for equality, liberty and justice for all, including the Black women, who are often, too often overlooked, but so often prove that they are the backbone of our democracy.” She confessed she stood on the shoulders of Black women who came before her, women who struggled to transform our nation from a society that derided and excluded people of color into one that could, as a whole, become a better place when all were lifted to an equal standing. In the decades surrounding the turn of the twentieth century, women of all races and even classes undertook an effort to improve society, approaching the problem from different value systems.

    For white women who embraced the Progressive ideas of the time, their work became known as “municipal housekeeping.” Rooted in the idea that the woman was the mistress of her household domain, it held that the existence of a healthy, well-run home depended on a healthy, well-run public sphere. These women sought to “clean up City Hall” and improved many facets of life and work outside the home.

    (photo: ad from The Freeman, January 25, 1913)

    Some Black women were instead inspired by the Social Gospel Movement, which recognized that Society, not just the Individual, required salvation. One historian framed Black clubwomen’s motives as a desire to take control of their lives and to fulfill the Social Gospel through action, like the women who followed the historical Jesus of the New Testament. Social reforms became the vehicle for saving individuals and, by extension, the civic realm. As with the white push for change, the goals of Black women included uplift specifically for women. The marriage of these two manifestations of faith, uplift for salvation and female empowerment, became important for many of them.

    Before going deeper into the healthcare history that involved these women, two points must be made. First, being Black in Indianapolis in the first decades of the twentieth century did not automatically mean you were poor. There might have been only one Madam C. J. Walker, the famous self-made millionaire, but there were many successful Black businesswomen and wives of businessmen who lived a comfortable life and desired respectful treatment. Second, even working-class women, many of whom were domestics, desired the same respect and were members of clubs that provided social interaction and improvement activities, including adequate and dignified healthcare. One example of their battle for respectability and their quest to improve lives stands out: the founding of a hospital for African Americans in 1911.

    Several women’s clubs worked for improved healthcare in the African American community of Indianapolis. Their efforts ranged from directly providing care and supporting existing facilities to creating new places for care and training caregivers. They undertook these projects with the firm conviction that women possessed unique abilities that allowed them to carry out their missions of care, and to do so with as much autonomy as possible. The most ambitious of these projects was the Sisters of Charity Hospital.

    The Grand Body of the Sisters of Charity (GBSC), not to be confused with Catholic women’s religious orders of similar name, was formed in Indianapolis in 1874 in response to the needs of the large numbers of southern Blacks moving to the city near the end of Reconstruction. Many women’s clubs formed for a variety of ends, some social or utility-minded (for instance, sewing clubs) and some with public goals in mind (one example is the Women’s Improvement Club). Many of them embraced the motto of the National Association of Colored Women: Lifting as We Climb. The GBSC differed slightly from other women’s clubs in that it operated as a lodge, offering its members benefits ranging from burial to financial assistance when needed. The underlying purpose of the hospital was to care for lodge members, but service was also extended to the entire Black community.

    Originally housed in a former residence at 15th and Missouri Streets in 1911 (where the parking garage for the IU Health Neuroscience Center now stands), the hospital moved to another house at 502 California Street in 1918 (now an open lawn on the IUPUI campus). The hospital also served the community by formally training young women as nurses, a professional activity that held great prospects for the advancement of Black women. The Sisters also worked with the juvenile courts and “wayward” girls. However sorely needed these services were, such a small institution had a difficult time keeping up with the maintenance and improvements that would make the hospital a suitable place for surgery or maternity care. Keep in mind that the Sisters of Charity Hospital and Lincoln Hospital were providing a place for care and treatment that should have been accessible to Black doctors, nurses, and patients as citizens and taxpayers of Indianapolis. It closed around 1921.

    The Sisters of Charity pursued the quest for uplift for their community and briefly accomplished something unique: the Sisters of Charity Hospital was a rare instance of an African American hospital owned and operated by Black clubwomen in a northern state.


    (photo: site of the former Sisters of Charity Hospital, now the IU Health Neuroscience Center garage at 15th and Missouri, Google Maps, 11-14-2020)

  • October 08, 2020 11:27 AM | Sarah Halter (Administrator)

    by Haley Brinker

    The idea of drinking human blood or consuming bones might sound like something out of a horror movie to people today, but it was a fairly common practice during the early modern period of history. It actually goes back even further. Medical cannibalism can trace its roots all the way back to ancient Rome, where spectators of gladiatorial fights would drink the blood of fallen gladiators in an attempt to cure their own ills [1]. It was also thought that this vital blood could cure things like epilepsy [4]. Now, some might say that this could just be a rare case, a few ancient vampires among a sea of ‘normal people.’ They would be wrong. Medical cannibalism was incredibly widespread. (Image: "Cannibalism in Russia and Lithuania 1571")

    The popularity of medical cannibalism hit its peak in the 1500s and 1600s [2]. The practice of consuming body parts in various, creative ways was everywhere in Europe during this time. Egyptian mummies were thought to be incredibly powerful, so grave robbers went to Egypt to steal them [1]. Now, any movie archaeologist or horror movie enthusiast would eye this practice warily; these robbers were begging to be cursed by the spirits of the former pharaohs. However, no such reports of curses can be located. Not everyone believed that mummies needed to be Egyptian in order to be medicinally powerful; many thought it just needed to be the mummified cadaver of any “healthy man” [5]. The demand for mummy parts was so high, though, that it created a black market of sorts, with industrious would-be grave robbers creating mummies of their own [3]. Like Ina Garten, they believed in the power of homemade. With this can-do attitude, they made local mummies by robbing the graves of local poor people or criminals, sometimes even just using animals and passing them off as human remains [3].

    With medical cannibalism being so popular, it, of course, had its famous supporters throughout history. King Charles II was a believer in the power of human remains’ ability to cure the medical maladies of the living. He believed in a medicine called “spirit of the skull,” which contained real skull [1]. In fact, he wanted to make it so badly that he paid six thousand dollars for the recipe, which he referred to as “King’s Drops” [3]. Another enormous fan of consuming literal human body parts in order to cure common ailments was the 17th century brain scientist Thomas Willis. He believed that one could cure excessive bleeding by mixing together the tantalizing concoction of human skull powder and delicious chocolate [2]. Who doesn’t love a little chocolate when they’re feeling down?

    The 16th-century German-Swiss physician Paracelsus preferred the power of “fresh corpses” [1]. Now, while it might seem that he was a vampire, working to create an army of other vampires, that is, unexcitingly, not the case. More affluent would-be blood drinkers could go to their local apothecary to acquire the hemoglobin they so desired [2], while those of less wealth and status would simply attend a public execution and kindly ask the executioner himself for a cup of the deceased criminal’s blood [1]. Paracelsus believed that when someone died suddenly (in a hanging or an execution, for instance), their “vital spirits” could “burst forth to the circumference of the bone,” and the living could use their highly powerful body parts to heal their ailments [3].

    The list of supporters didn’t end there, either. Marsilio Ficino, an Italian scholar from the 15th century, believed that the elderly should “suck the blood of an adolescent” who was in good spirits and of sound body to regain some of their former vigor [3]. Saint Albertus Magnus stated that a distillation of blood could “cure any disease of the body” [3]. Elizabeth Bathory’s belief in bathing in the blood of young women doesn’t seem so far-fetched now, does it? Heinous? Yes. A horrific crime of tremendous magnitude? Absolutely. A belief system totally out of line with the times? Nope.

    Bones and blood weren’t the only ‘useful’ remedies at the time. The practitioners of medical cannibalism were what some might call… creative. Blood was thought to be the “vehicle of the soul,” so it was considered especially powerful [4], but how to deal with the pesky taste of drinking warm, human blood? Marmalade! Blood marmalade, to be precise. A Franciscan apothecary in the 1600s had a delightfully descriptive recipe for creating the culinary confection that is blood marmalade [1]. Step one (the most important step, as we all know) was to find a donor with the following traits: “warm, moist temperament, such as those of a blotchy, red complexion and rather plump of build” [3]. It is quite difficult to pin down exactly what a ‘moist’ temperament is, but I’m sure those at the time had someone in mind as soon as they read the recipe. Bones were allegedly useful as well. It was believed that ‘like treated like,’ so skull powder was a great cure for any ailments of the head [3]. Even objects near the cadaver could hold power. A moss that grew on skulls, called usnea (literally “moss of the skull”), was thought to prevent nosebleeds if you simply held it or shoved it right into your nose [1].

    As stated previously, bones and blood weren’t the only parts of the body that could ‘cure.’ Human fat was thought to have all sorts of medicinal properties. For instance, fat could prevent bruising of the skin [3]. The fatty fun doesn’t stop there, though. It was believed that the magical properties in human fat could be used to create something called a ‘Thieves Candle.’ This human-fat-containing candle was thought to be able to “paralyze enemies” [2]. Fat was so important to medicine that the local executioners would directly deliver the fat from executed criminals right to the apothecaries around town [3].

    While the practice of consuming human remains was widespread and incredibly popular at this time, it didn’t prevent white Europeans from condemning tribal practices involving cannibalism with extreme revulsion. Puritans rejected the Catholic doctrine of “transubstantiation” [5]; they believed that transforming bread and wine into the body and blood of Christ and then consuming it was a form of cannibalism [2]. Cannibalistic ritual practices performed by Native Americans were seen as ‘barbaric’ and used as an example of why they should be subjugated by the Europeans [3]. It is an interesting juxtaposition: Native American cannibalistic practices were social and sacred, done in order to “reintegrate the deceased into the tribe” [3]. On the flip side, Europeans often didn’t know whose remains they were consuming. Often, bodies used for medical cannibalism belonged to those on the lowest rungs of the societal ladder: the poor, the disenfranchised, the ‘other.’

    Using ritual cannibalism as a stick with which to beat down those that the Europeans deemed ‘less’ was very common. During the subjugation of the Irish by the English, Irish skulls were unburied and sent to German pharmacies and apothecaries to be ground into powder and sold as a commodity [3]. Joseph Hall, Bishop of Exeter, delivered a fiery sermon referring to the Turkish people as “bloody, man-eating cannibals, mongrel troglodytes feeding upon bloody carcasses” [3]. Bishop Hall was apparently fine with his own people consuming bones mixed with chocolate and alcohol or smearing a little blood marmalade on crusty bread, but not with social, religious rituals of respect performed by non-white, non-Protestant individuals.

    While the practice of medicinal cannibalism gradually dwindled, a book published in Germany in the early 1900s noted that a pharmaceutical company was still offering “genuine Egyptian mummy” in its catalog [5]. The human body is still used in medicine today; however, these practices, such as blood transfusions and organ donations, are far more medically sound and don’t require any visits to the local executioner.



    Bibliography

    [1] Sugg, R. (2008). The art of medicine: Corpse medicine: mummies, cannibals, and vampires. The Lancet, 371(9630). https://doi.org/10.1016/S0140-6736(08)60907-1

    [2] Dolan, Maria. “The Gruesome History of Eating Corpses as Medicine.” Smithsonian.com. Smithsonian Institution, May 6, 2012. https://www.smithsonianmag.com/history/the-gruesome-history-of-eating-corpses-as-medicine-82360284/.

    [3] Lovejoy, B. (2016). A Brief History of Medical Cannibalism. Lapham's Quarterly, 9(5).

    [4] Himmelman, P. K. (1997). The Medicinal Body: An Analysis of Medicinal Cannibalism in Europe, 1300-1700. Dialectical Anthropology, 22(2).

    [5] Gordon-Grube, K. (1988). Anthropophagy in Post-Renaissance Europe: The Tradition of Medicinal Cannibalism. American Anthropologist, 90(2).


  • September 28, 2020 12:11 PM | Sarah Halter (Administrator)

    by Norma Erickson

    By the time the Lincoln Hospital opened in December of 1909, the African American doctors of Indianapolis could no longer tolerate the state of medical practice in Indianapolis. Shut out of the hospitals of the city, they could not continue to care for their patients who required hospitalization, a situation that led to disastrous outcomes for some Black patients. Sometimes the disconnect that occurred when a patient was moved from home to hospital left a very sick person vulnerable to mistakes.

    One such case happened in March of 1905, when Thomas Jones, a seriously ill African American man, was denied an examination at the City Hospital. He had recently been seen by two Black physicians; one wrote an order for him to be admitted to City Hospital. A carriage was called, and when the driver, a man named Willis, arrived at the hospital, the intern on duty looked at the man in the carriage, saw blood on the front of his clothes, and immediately determined that he had tuberculosis. The doctor did not take Jones’s temperature or move the patient to an examination room, because the clerk on duty would not help him. Tuberculosis cases were prohibited from City Hospital, so the intern told Willis to take him to the county poor house. He did so, and thirty minutes later Thomas Jones died. Had Jones actually had TB, the physician, knowing that City Hospital would not accept TB patients, would not have requested he be admitted there, nor would the nurse who saw him in his home have called for the carriage to take him there. The nurse had collected a sputum sample at his home before he was removed; when tested later, the sample was negative for TB. The Black community was outraged, and reporting on the case appeared in both the Indianapolis Star and the Indianapolis News for three weeks.

    During this era, the role of hospitals was undergoing great changes. No longer merely places for the poor to receive treatment, they underwent modernization that allowed life-saving surgeries to take place. But a Black physician did not have access to those facilities, even in public tax-supported institutions like City Hospital. Black patients who would have preferred treatment in a hospital to home care were put off by the uncomfortable environment of all-white medical and nursing staffs. Between the loss of revenue and prestige as surgeons and the patients’ low confidence in the system, it was clearly time for a new approach by the African American community. If the segregationist rules did not change, then it was time for a public hospital for African Americans, and the only way to get one was to start their own.

    Like Ward’s Sanitarium, the Lincoln Hospital launched a nurse training program that attracted students from around the state. It also included a free dispensary to treat the poor, just like the public hospital. Women’s clubs stepped up to gather funds and donate goods. Two prominent white men, a business owner and a politician, donated substantially to get the project off the ground. The physicians published a first annual report with glowing successful cases that also revealed the cases they lost. Five years later, the hospital closed.

    The reason most often cited was lack of funding. That certainly could be true, but could there be another reason? Could it be that the Black doctors of Lincoln Hospital allowed it to end because it was time to push to be installed at the City Hospital? For five years, they had managed a facility and demonstrated their ability to successfully perform operations. One of their own would run for city council and win that year, putting the hope of making changes at City Hospital almost within reach. War had begun in Europe, bringing the possibility that young Blacks would enter military service soon, another way to prove the mettle of the Race.

    But the entrance of Black physicians into Indianapolis’s public hospital would not happen for another thirty years, and access to both adequate and trusted healthcare would continue to deteriorate.

    Next month: The Sisters of Charity Hospital


  • September 14, 2020 3:05 PM | Sarah Halter (Administrator)

    by Haley Brinker

    The story of radium began in the laboratory of Marie Curie and her husband, Pierre, in 1898. It was there that they discovered the power of this element, a discovery that would earn them a Nobel Prize in Physics [1], but radium’s story was just beginning. Soon, the entire world would be at rapt attention, hungry for any news of or product containing what some called a “miracle element” [2].

    By the early 1900s, radium was synonymous with health. Products of all kinds touting the benefits of radium for the human body were everywhere. Radium water was incredibly popular, and one company, Bailey Radium Laboratories, proclaimed far and wide that its product, Radithor, was the tonic to cure all of the public’s bodily ills [3]. It was even picked up by a celebrity sponsor of sorts, Eben MacBurney Byers. He loved Radithor so much, and spoke its praises so highly, that he consumed over a thousand bottles in only five years [3]. There were radioactive toothpastes, radioactive makeup, and even entire spas dedicated to the healing power of radium [1]. One product, the Radio-Active Pad, even claimed that it could cure blindness (yes, blindness) when worn “on the back by day and over the stomach at night” [4]. Consumers at the time were spoiled for choice. They could have their radium in the form of a kissable pout with lipstick. They could spend a day with the girls, lavishing in a lush spa, cucumbers covering their eyes, while they received treatments enhanced with the restorative powers of this wondrous cure-all.

    Radium in branding was so popular, in fact, that some companies simply named their products after it without actually putting any into them. One of these products was “Radium Brand Creamery Butter,” which likely didn’t contain any of its namesake element [4]. Where people today would jump at products labeled ‘organic’ or ‘gluten free,’ folks during the radium craze lunged for any product that claimed it was associated in any way with radium. This was the power of popularity. Radium was trendy; radium was chic.

    Like all fads, the public’s love affair with radium would not last forever. Stories began circulating around the world, highlighting the serious health problems of those who were ingesting radium. Our Radithor-loving celebrity endorser, Eben MacBurney Byers, was soon afflicted with a host of health problems. After five years and over a thousand bottles of Radithor, the radium in his system had caught up with him. An attorney, Robert Hiner Winn, went to interview Byers in 1930 on behalf of the Federal Trade Commission. When he met Byers, he was dumbfounded; this previously hardy and healthy man was now a shadow of his former self. Byers’ radium poisoning was so severe that “half of his face was missing” [3]. The health tonic Byers had spent years promoting and using had been his downfall. He had been misled, and he was not the only victim.

    During the peak of radium mania, the US Radium Corporation had set up factories to produce watches with glow-in-the-dark dials. These dials were hand painted with radium-laced paint by young women who, to keep the numbers crisp and precise, pointed the tips of their brushes with their lips. Each lip-pointed brush stroke was “slowly irradiating” the women “from within.” Soon, the women began to come down with symptoms of radium poisoning [2]. Bright, young women attempting to make a good living became bed-ridden, in worse health than most septuagenarians, and they had not even reached the age of thirty.

    With these two stories, and a plethora of others, the world turned its back on radium. The government, namely the U.S. Food and Drug Administration, outlawed any patent medicines that had radium as an ingredient [2]. Ingesting radium as medicine became a thing of the past, a bad memory in the public consciousness. Its popularity and downfall give meaning to the phrase that anyone eyeing a product that seems too good to be true should remember: buyer beware.


    Bibliography

    [1] Crezo, Adrienne. “9 Ways People Used Radium Before We Understood the Risks.” Mental Floss. Mental Floss, October 9, 2012. https://www.mentalfloss.com/article/12732/9-ways-people-used-radium-we-understood-risks.

    [2] Moss, Matthew. “The Radium Craze – America's Lethal Love Affair by Matthew Moss.” The History Vault. The History Vault, January 15, 2015. https://thehistoryvault.co.uk/the-radium-craze-americas-lethal-love-affair-by-matthew-moss/.

    [3] Brumfield, Dale M. “The Blessings of Radium Water Made His Head Disintegrate.” Medium. Medium, March 18, 2019. https://medium.com/lessons-from-history/the-blessings-of-radium-water-made-his-head-disintegrate-3ac052cb8620.

    [4] Orci, Taylor. “How We Realized Putting Radium in Everything Was Not the Answer.” The Atlantic. Atlantic Media Company, March 7, 2013. https://www.theatlantic.com/health/archive/2013/03/how-we-realized-putting-radium-in-everything-was-not-the-answer/273780/.


  • August 31, 2020 1:14 PM | Sarah Halter (Administrator)

    by Haley Brinker, IMHM graduate intern from the Public History Department at IUPUI

    In the Bacteriology Laboratory of the Indiana Medical History Museum, you’ll find a photograph of Dr. John Hurty, hard at work at his desk. Next to this photograph, you’ll discover a large poster depicting a goblinesque typhoid germ, beckoning and inviting you to meet it at the town pump. This poster, commissioned by the ever public health-conscious Dr. Hurty and created by cartoonist Garr Williams, is a reflection of the very serious typhoid problem threatening the health of Indiana’s citizens at that time. In order to combat this problem, Dr. Hurty recognized that commissioning memorable posters that left little room for confusion of their messages would make it easier for the public at large to understand the public health issues facing them.

    Another of these posters (above) depicts a Creature from the Black Lagoon lookalike, rising from a bottle of milk, while a helpless, diapered child looks on, his rattle his only defense. Looking at this poster today, one can’t help but wonder what on earth could be so deadly about drinking something so seemingly harmless as milk.

    To put it simply, milk, prior to pasteurization and federal regulation, was absolutely disgusting. One analysis showed that a sample of milk in New Jersey had so many bacterial colonies that the scientists just stopped counting. Dairymen at the time often used cost-saving and morally questionable tricks to ensure that they could milk (sorry) the most profit out of their product. One such trick was thinning the milk with water. In one case, a family reported that their milk appeared to be “wriggling.” Upon investigation, it was discovered that the milkman had used “stagnant” water from nearby, which was apparently full of tiny insect eggs that grew into tiny insect larvae, causing the “wriggling” the family had noticed. Aside from being a scene out of one of your elementary school lunchtime nightmares, it further illustrated the need to regulate the industry. After the thinning process, the milk would sometimes be discolored. To solve this problem, the dairymen simply added things like chalk or plaster to turn it back to the crisp, white color their customers expected. Then it gets nauseating. To make doctored dairy look “richer” and more cream colored, a puree of calf brains would sometimes be added to the mixture.

    Samples of milk tested during that time often had “sticks, hairs, insects, blood, and pus,” but it gets worse. There was also a lot of manure present. There was so much manure in Indianapolis’s milk in 1900 that “it was estimated that the citizens of Indianapolis consumed more than 2000 pounds of manure in a given year.” How could the powers that be possibly fight against all the rampant bacteria and the illness it caused? With formaldehyde, of course! What better way to cure society’s ills than with embalming fluid in the food we eat and the milk we drink? Even our illustrious Dr. Hurty was on board at the beginning. However, he soon realized that it was doing more harm than good. Formaldehyde-related outbreaks of illness would occur, and they could even be deadly, especially for children. In 1901, Hurty stated that over 400 children had died from milk tainted with the chemical, dirt, or bacteria.

    When the federal government finally got around to passing the Federal Pure Foods and Drugs Act in 1906, the practice of putting formaldehyde in food was finally banned. While government-mandated pasteurization of dairy was still a long way off, the tireless efforts of Dr. Hurty to remove formaldehyde from milk helped pave the way for legal change to better protect the public from those that would profit at the expense of their health.

    To see more of Hurty's commissioned cartoons and learn more about Indiana's 1899 Pure Food & Drugs Act, visit the online exhibit "Food Fight!"


    References:

    Blum, D. (2018). The 19th-century fight against bacteria-ridden milk preserved with embalming fluid. Smithsonian Magazine. Retrieved August 6, 2020, from https://www.smithsonianmag.com/science-nature/19th-century-fight-bacteria-ridden-milk-embalming-fluid-180970473/

    Thurman B. Rice, MD. “Dr. Thaddeus M. Stevens- Pioneer in Public Health [Chapter XIV].” In The Hoosier Health Officer: A Biography of Dr. John N. Hurty, 57–60, n.d.


  • August 24, 2020 9:43 AM | Sarah Halter (Administrator)

    by Norma Erickson

    It’s sometimes difficult to grasp why racial health disparities still exist in the twenty-first century. There are many aspects to the problem. One that is very relatable to everyone today is money: how is healthcare paid for, and who pays for it?

    In the late 1800s and early 1900s, there were few choices. Starting with the most expensive: the very rich were cared for in their homes, where their physician made house calls and private duty nurses provided round-the-clock care. If one had the means, a private sanitarium (a for-profit hospital typically owned by one doctor, sometimes by a group of them) cared for patients in need of surgery or other higher-level care. If you had a little money, the public or municipal hospital offered affordable care for paying patients, and the patient’s own doctor could still have charge of the case.

    The public hospital also admitted the poor, whose care fell to the hospital staff physicians. In the case of a municipal hospital with connections to medical colleges, interns and student nurses gave care under the guidance of professional staff (Indianapolis City Hospital, for instance). For minor care and medications, the very poor could access publicly funded dispensaries; again, these often doubled as teaching sites.

    At the end of the Civil War, most of the nation’s African American population lived in the South, existing in an agriculture-based economy that placed no expectations on education. Eventually, many would leave to find better opportunities in the North’s large cities. Indianapolis was a very interesting northern city because, unlike some of the larger metropolitan areas, its African American population grew at a relatively slow pace. This allowed the white population to become more familiar with their new neighbors and permitted the establishment of businesses and occupations that crossed over the color line, a line of social segregation between the races that stood solidly until the latter years of the twentieth century.

    The Black community developed class strata, just as the white community did. There were well-to-do folks, a hardworking middling group, laborers, and the indigent on both sides. On the Black side of the line, no matter the group, an underlying missing element, one that most of the white side enjoyed as a given, was respect. African Americans could find that respect within their own environment, but truly adequate healthcare existed only on the other side of the line, where respect was hard to gain. For many in the middle group (small business owners, craftspeople, high-level service workers like train porters), the public hospital was the only option, and they knew that even if they paid, they would be admitted to the worst section of an aging building, without access to their own doctor and at the mercy of a staff that might not respect them.

    The leaders of the Black community understood that providing and receiving healthcare was an economic issue. The community was missing out on opportunities for employment (as nurses and as specialists in developing technologies) and for higher-level physician skills that required modern surgical equipment and support.

    Except for the Alpha Home for Women, which cared for aged Black women, no institutional medical facilities for Blacks existed in the city until 1896, when a new physician, Fernando Beamouth, opened a sanitarium at 651 North Senate Avenue. The Freeman, a major Black newspaper, noted that this was the first sanitarium in the state for Black patients and only the second in the nation started by a doctor of color. Beamouth died in 1897. In August 1903, several prominent men in the Black community, including Dr. Sumner Furniss, tried to purchase a building in the 900 block of North Meridian to start a clinic, but abandoned the project when white neighbors objected.

    Later, Dr. Joseph H. Ward opened his sanitarium on Indiana Avenue around 1906 (the actual date is unclear). This first viable effort mostly served the portion of the population able to pay for private care. For the first few years, Ward did not advertise his sanitarium in the newspapers, but the society pages occasionally announced hospitalizations there, naming patients known as elite members of the Black community. Later, he was Madame C.J. Walker’s personal physician. It is likely he cared for a few charity patients as well.

    Beamouth, Ward, and Furniss were also members of the Black business league. Ward acted on the fuller economic function of healthcare as a source of good-paying jobs by starting a nurse training program. His sanitarium filled a gap for the elite, but the middle class needed an alternative to the City Hospital. In 1909, Furniss and several other Black doctors formed Lincoln Hospital, which would function as a public hospital for members of the African American community able to pay for care. Lincoln Hospital and its physicians will be the subject of next month’s blog post, The Struggle for Adequate Healthcare for African Americans in Indianapolis, 1906-1925, Part III.

    Photo: Officers of the National Negro Business League, at Indianapolis in 1904 from the collection of the Schomburg Center for Research in Black Culture at the New York Public Library. Dr. Sumner Furniss is the first on the left in the second row. 

  • August 19, 2020 3:02 PM | Sarah Halter (Administrator)

    by Sarah Halter

    Despite the ongoing pandemic and our temporary closure, these are exciting and productive times at the Indiana Medical History Museum.

    This organization has come a long way in recent years. Among other things, we are making it a priority to better manage and care for all of our collections, and, as much as possible, make them accessible to the public. In late 2019, after successfully completing a large project to catalog the Museum’s library collection, we began a similar project to catalog, organize, and better protect our extensive archival collection. Our goals were to improve accessibility of the materials, identify holes in the collection, better track conditions, prioritize materials for digitization, and better manage and track use of the materials.

    We don't yet know the precise extent of our archival collection, but we estimate that it contains approximately 5,500 documents (personal papers, research notes, pamphlets, charts, instruction sheets, loose records, photographs, sketches, advertisements, class photos, etc.), including many oversized or rolled documents, plus hundreds of pieces of framed artwork, ledger books, and 16mm film reels and about 11,000 (!) glass plate negatives.

    As was the case with the library collection before we completed Phase I of this project, we just don't know everything we have. We can't always locate materials that we know we have, because storage locations in many cases have changed numerous times over the years. Our archival collections have been disorganized and inadequately protected on shelves that are sometimes unstable and frequently inefficient and unsecured. To protect and make better use of these materials, we must organize and store them using archival-quality materials and secure (and in some cases fire- and water-resistant) shelves and cabinets. Last month we were awarded a $15,000 Heritage Support Grant provided by the Indiana Historical Society and made possible by Lilly Endowment, Inc. to help us accomplish this.

    This is such important work. It’s critical, in fact, to our mission to preserve and present Indiana’s rich medical history. We are stewards of a wonderful collection that contains a wealth of knowledge and many rare and very historically significant materials. When this project is completed, these materials will be much more useful for our internal research, publications, and exhibits. And most will be available to patrons, as well, when we reopen to the public and establish our Reading Room hours.

    We miss seeing you all here in the Old Pathology Building for tours and programs. But we’re making good use of this time to improve our digital and virtual offerings and to improve your experience and your access to our collections when it’s safe to have you back. Thanks for your patience and your continued support! It means so much to us.


    PHOTOS

    Top: The IMHM collection includes many pieces of artwork, including works created by patients. The works of the transient artist John Zwara are among the most exceptional. We have 22 of his paintings, 21 of which were done while he was a patient at Central State Hospital in the spring and summer of 1938. Most depict the grounds of the hospital as they were at that time. He painted several of the hospital’s large buildings, like this one of the Pathological Department that now houses the IMHM, as well as areas of the grounds.

    Bottom: Our collection includes many ledgers of autopsy records from Central State Hospital, as well as admissions, bookkeeping, and other types of records from a number of other hospitals. Here is a ledger from Long Hospital in Indianapolis.

  • August 13, 2020 11:34 AM | Sarah Halter (Administrator)

    by Erin Powers of Ball State University's Applied Anthropology Laboratories

    Before the pandemic, I loved that my job involved being outdoors; now, in the midst of the pandemic, I’m grateful that my job involves being outdoors! I am an archaeologist in the Applied Anthropology Laboratories (AAL) at Ball State University, and while we are braving these unprecedented times as best we can, we have been fortunate to be able to continue doing student-centered research in the past months. This means that we are now able to set a date to conduct a geophysical survey in an unmarked cemetery used during the early years of Central State Hospital. Originally, we scheduled this project for June 2020, but the pandemic had other plans. This project targets the first cemetery associated with Central State Hospital, which was used from 1848 to 1905.

    Currently, we do not know how many individuals are buried there or the extent of the cemetery. Sometime in the mid-twentieth century, the grave markers were removed. These individuals deserve to be acknowledged and represented, which is what we hope to address with this project through geophysical survey. Along with the Indiana Medical History Museum (IMHM), the Indianapolis Metropolitan Police Department Mounted Horse Patrol and Canine Unit, and the Caroline Scott Chapter of the Daughters of the American Revolution, we intend to determine the extent of the cemetery and how many individuals are buried there using non-invasive ground-penetrating radar (GPR).

    This project is using GPR as opposed to traditional archaeological excavation because it is non-invasive and will not destroy the cemetery or the property. GPR projects require fewer archaeologists and are much faster than traditional archaeological projects. Normally when you think of archaeology, you imagine a big square trench where everyone is troweling the soil, exposing artifacts, and removing dirt. Since cemeteries are incredibly sensitive historic spaces, traditional archaeology cannot be applied here. Instead, archaeologists have started to use geophysical equipment, like GPR, that sends electromagnetic waves into the earth and collects the reflected signals. This information is analyzed in the laboratory, letting archaeologists see into the ground without removing any earth.

    When we analyze the data, we look for breaks in the horizontal soil strata, which indicate that the ground was disturbed. In cemeteries, we typically look for patterns of disturbance that resemble grave shafts, along with metal artifacts that could be associated with the burials. For example, coffin hardware or metal jewelry buried with an individual will typically show up in the GPR data. In an ideal situation, once the GPR data is processed, an AAL staff member can clearly demarcate each grave shaft and the boundary of the cemetery. In most cases, the GPR picks up many disturbances, some of which are associated with animal activity, various construction phases, or nearby metal fences and poles. GPR can tell us the extent of the cemetery, the number of grave shafts, and the minimum number of individuals.

    Back in March 2020, Sarah Halter (Executive Director, IMHM) and I created a crowdfunding campaign at Ball State University to raise the funds needed for this project at Central State Hospital. Thanks to all of our donors and supporters, we exceeded our goal of $5,500 and raised over $5,800. This money will pay for AAL staff members to conduct the GPR survey and process the data. The AAL team is going to resume fieldwork very soon in a socially distant, safe, and healthy way. We will keep the public updated about the fieldwork, data processing, and results of this project. We are looking forward to getting this project underway and bringing to light the marginalized individuals buried at Central State Hospital.


  • August 04, 2020 9:00 AM | Sarah Halter (Administrator)

    by Rhea Cain and Allison Linn

    “Does public opinion indorse [sic] sterilization? The following report of a nationwide poll gives the answer.” 

    The Indianapolis Star, Sun, May 23rd, 1937

    It sounds a bit like a line from a dystopian novel by Margaret Atwood, but alas, it is from Indiana’s own not-so-distant past. Most folks may not realize that Indiana has a lot of “firsts” under its belt: The first city illuminated by electric light? Wabash, Indiana. The first gas pump? Ft. Wayne, thanks to the forward-thinking Sylvanus Freelove Bowser (what a name!). The first compulsory sterilization laws used against the mentally ill? Indiana again.

    This law, enacted in Indiana in 1907 with overwhelming support from the eugenics community, was simply referred to as the Indiana Plan. Its justification can be traced back to theories of eugenics. Eugenics developed as a branch of science in the late 1800s, sprouting from Gregor Mendel’s crossbreeding of plants in the mid-nineteenth century. Mendel is commonly referred to as the father of genetics, and it was his research and his theories on inherited traits that provided the foundation for the eugenics movement.

    By the beginning of the twentieth century, the adapted theories of eugenics had established a firm following. The eugenics movement became popular because it was believed that by instituting some form of control over human reproduction, society would produce a healthier and stronger human population in later generations. By the time Indiana passed its sterilization law, eugenics was regarded as a serious scientific field with powerful advocates within the state and across the country. Indiana was known at this time for its reforms in welfare and charity, and eugenics was a response to the state’s concern with its impoverished and mentally disabled citizens. Looking back, was the movement’s goal to improve the human race by assisting evolution (a slow process) or simply to keep troublesome inherited characteristics from reaching the next generation (quick and brutal)?

    So how did Indiana become the first state culpable in designing such a destructive law? Three key Hoosiers were major proponents of the passage and implementation of the Indiana Plan, the first being Dr. Harry Sharp. Sharp was the prison physician at the Jeffersonville reformatory. Starting in 1899, Dr. Sharp, with written consent from the convicted, began conducting routine vasectomies on his male prisoners. Between 1899 and 1909, Dr. Sharp performed four hundred and sixty-five of these surgeries, three hundred and eighty-four of them before the passage of Indiana’s sterilization law. Dr. Sharp’s clinical studies of the surgeries themselves, as well as of their impact on the prisoners afterward, were published in medical journals throughout the country. They were then used as scientific legitimization of the claim that sterilization was physically and mentally beneficial to those operated on. Dr. Sharp often argued in his publications that crime and degeneracy were hereditary in nature, and that sterilization was therefore necessary to eliminate most crime within a community.

    The next two individuals were G. Henri Bogart and John Hurty, both physicians highly revered within the medical community, who advocated for the sterilization of those deemed “unfit” by licensed professionals. Their opinions were published in a plethora of medical journals as well as influential national papers during the early twentieth century. Dr. Bogart alone published thirty articles on human sterilization between 1908 and 1910. Dr. Hurty promoted not only the use of sterilization but also the enactment of marriage restrictions (who was restricted?), public vaccination, and public sanitation. He bundled all of these subjects together and used them as his platform during his involvement with the United States Health Movement.

    There were some legal hiccups along the way; the law was overturned in 1921 by the Indiana Supreme Court due to perceived violations of the 14th Amendment. But diligent legislators put forth a new, more stringent law in 1927 once the United States Supreme Court sided with Virginia in the now-landmark case Buck v. Bell. As a result of the Buck case, states felt emboldened to move forward with their crusades to combat poverty and disease via eugenics.

    Ultimately, the real reason became quite clear. Indiana legislators worked to enact laws that would sterilize citizens whom they deemed undesirable: criminals, the mentally ill, the rural poor, orphans, county home residents, unwed mothers, etc. Why? To save the State of Indiana as much money as possible. “Many of these shiftless, feebleminded folks can barely eke out a living for themselves, but that does not deter them from marrying and propagating their kind, thus adding to the burden of the state.” (Associated Press 1996) If “weak minded” and “morally corrupt” people were prohibited from reproducing, there would be less need for institutions of all kinds (because clearly all behavior must be hereditary, right?).

    So how did this process work in an institutional setting? It was up to the superintendent of the institution to “nominate” patients for sterilization and present them to the institution’s governing board. Then a hearing was held, in which the patient and potentially a family member could take part, to determine whether the patient was a candidate for sterilization. If the patient or their family was unhappy with the outcome of this hearing, they could fight the governing board in the courts. In 1931, it was decided to allow county judges (as long as they had the approval of two licensed physicians) to order the sterilization of a patient during the commitment procedure, circumventing the institutions’ boards altogether. In 1935, Representative Dr. Horace Willan went even further when he proposed that any patient who could potentially become a parent be sterilized within 30 days of admission to a state hospital if physicians deemed it “necessary.” (United Press 1935) (And 80% of those polled in the cited article were most definitely in favor of mandatory sterilization in 1937.)

    Except for the gap between 1921 and the enactment of the 1927 law, the sterilization laws remained in effect in Indiana until 1974. (State of Indiana 1974) While the laws were in force, roughly 2,500 Indiana residents were sterilized in order to protect the taxpayers of Indiana. And while state-mandated sterilizations all but stopped in the 1960s due to cost, there were still other ways of thwarting “undesirables” from having children. One of the most enforced was simply making it unlawful for certain populations of Indiana residents to get married. Indiana Law 111, passed in 1905, prohibited a marriage license from being issued to anyone under guardianship as a person of unsound mind, and specifically prohibited men who had spent any time in the past five years in a county asylum or home for indigent persons. The Indiana marriage laws remained on the books in some capacity until 1977. (State of Indiana 1977)

    Public approval of eugenic theory began to wane after clear examples from Nazi Germany showed what could happen in a society when its people began placing value on certain members of a group and denigrating members of others. Science was also starting to reveal that earlier data on biological heredity was not as accurate as once believed. New research showed that a multitude of mental disabilities are not inherited at all, and that human behaviors are shaped more by environment than by heredity. Near the end of the war, the world learned of the atrocities committed at the hands of the Nazis in their concentration camps; this only strengthened western culture’s opposition to sterilization and eugenics. America would instead place higher emphasis on individual and personal rights, and finally, in 1974, Indiana’s sterilization laws were repealed.

    In reviewing the history of Indiana’s compulsory sterilization laws, it becomes all too clear that sometimes, history does repeat itself—and not for the better. Overwhelmingly the victims of this program were the under-educated, the poor, and people of color. We owe it to those victims to not only know about our state’s part in this chapter of American medical history, but to work diligently to ensure it never happens again. We also need to acknowledge that systemic racism and classism, direct mechanisms behind Indiana’s eugenics movement, still negatively impact our communities today.


    Further Learning:

    Imbeciles: The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck by Adam Cohen

    Eugenic Nation: Faults and Frontiers of Better Breeding in Modern America by Alexandra Stern

    A Century of Eugenics in America: From the Indiana Experiment to the Human Genome Era edited by Paul Lombardo

    The Eugenics Crusade. WGBH Boston https://www.pbs.org/wgbh/americanexperience/films/eugenics-crusade/


  • July 15, 2020 9:02 AM | Sarah Halter (Administrator)

    by Norma Erickson

    Imagine for a moment that you are in desperate need of a complicated surgical operation, one that cannot be performed as an outpatient. It is such a serious surgery that you could die, if it is not successful. Some of the success of the operation depends on good nursing care during your recovery. 

    Now imagine you are in Indianapolis in 1904 and you are a Black patient in need of surgery in a hospital at a time when your personal physician, also Black, is not allowed to practice in City Hospital (now Sidney & Lois Eskenazi Hospital)—the only hospital that will admit you. You don’t know how your white surgeon feels about operating on an African American. You also don’t know—when you are finally anesthetized—if an intern will be holding the scalpel that will be slicing into your abdomen, practicing his newly-learned surgical skills. The nurse who will be taking care of you will also be white and may not like going to the dank and dark basement Colored Ward to care for you.

    Is it any wonder that you waited so long to see a doctor and relied on home remedies or even resorted to magical charms to evade the possibility of mistreatment that folks in your neighborhood warned you about? They described the hospital many times as a “terror.” Your Black doctor may not know the feelings of the white doctor, because the normal way of getting to know other physicians, the local medical society, does not allow him membership.

    This scenario shows just one of the reasons the African American community experienced a disparity in healthcare in the early twentieth century.  But what could be done? Racial segregation was a fact of life, and it appeared that nothing would change. To gain some control over the situation, there had to be healthcare that the community could trust, and there had to be adequate places to deliver such care. For this reason, individuals and groups decided to “yield to the inevitable” and began an effort in the city to alleviate this problem by establishing hospitals and private sanitariums to provide good medical care and nurse training programs to uplift the Black citizens of Indianapolis both economically and socially.

    Subsequent blog posts will tell a bit of the story of three such institutions that existed in Indianapolis between the years 1906 and 1925—Ward’s Sanitarium, Lincoln Hospital, and the Sisters of Charity Hospital. As you read these stories, keep in mind that although they disappeared in the first quarter of the century, the problems they sought to cure did not, persisting even to the present day.



Copyright © 2017 Indiana Medical History Museum

3045 Vermont Street, Indianapolis, IN 46222   (317) 635-7329
