Overview
Women in American Indian Society, part of the Indians of North America series, delves into an area that has long been misrepresented, if not entirely neglected, by mainstream scholars. Traditionally, native women played important roles in their societies, but as soon as Europeans set foot on Indian soil, these women began losing ground. Denied their rightful positions of responsibility, excluded from tribal councils, and stripped of their property, Indian women sorely felt the chauvinism that whites imposed on their cultures. In the 19th century, however, after being placed on reservations and forced to learn the ways of whites, native women drew on both newfound and traditional knowledge to withstand U.S. government efforts to obliterate Indian cultures.