Women's Healthcare in the 20th Century United States
Women's healthcare in the United States evolved throughout the twentieth century to more fully address women's health needs. Many policies, practices, and treatments were improved to better serve women. Men were often viewed as the appropriate professionals to attend to healthcare needs in the United States, including those of women, which made the American healthcare system a sexist one.[1] Women holding positions of power in government, along with some men, used their political influence to better address women's needs. Older policies were revoked and replaced with new ones, including policies that covered women from minority groups who faced racial prejudice both within the workforce and within healthcare institutions. In the 1950s and 1960s, scientists and doctors at these institutions worked to develop contraceptives despite public controversy surrounding them.[2] Feminists in the U.S. spoke out against the injustices and inequality women faced, bringing awareness to women's needs and pressing the country to address them more fully over the course of the century.