Saturday, June 18, 2016

An Honest Look at Rape in American Culture: Rape Culture Defined


Rape culture is real. It isn't a made-up concept or a political phenomenon. Rape culture is not exclusive to the West, the East, or the Middle East. It isn't something new or newly discovered. Rape has been a common element in every society, from the ancient to the modern. Rape is more than sex; rape is power.

While the act of rape victimizes both men and women, rape culture is patriarchal in nature. Emilie Buchwald defined rape culture in her book, Transforming a Rape Culture, as
a complex set of beliefs that encourages male sexual aggression and supports violence against women. It is a society where violence is seen as sexy and sexuality as violent. In a rape culture, women perceive a continuum of threatened violence that ranges from sexual remarks to sexual touching to rape itself. A rape culture condones physical and emotional terrorism against women as the norm . . . In a rape culture both men and women assume that sexual violence is a fact of life, inevitable . . . However . . . much of what we accept as inevitable is in fact the expression of values and attitudes that can change.
This culture creates a system that harms all of us. Men are taught to be silent and unemotional; dominant and aggressive; ruled by logic, yet unable to control their sexual appetites. Male victims of sexual assault are ignored, trivialized, viewed as weak, and often seen less as victims than as lucky recipients. Women are taught to be meek and emotional; weak and subservient; hypersexualized, but vilified for being sexual beings. Female victims of sexual assault are ignored, trivialized, and often viewed as having sought out or otherwise caused their own victimization. These are expectations no one can live up to.


