There are many myths of the American West that have been deeply ingrained in our written history and our popular imagination. Within these myths, women's roles mirror the landscape: both are rendered submissive through a dichotomy of being simultaneously glorified and degraded. Women were seen as important elements of the colonization of the West, bringing civility, community, and culture to a "new frontier." However, when the history was written, women's roles, stories, and work were deemed unimportant. The land itself is likewise treated with great reverence for its beauty and its expanse, yet it is also something to own, use, and dispose of as needed.