America’s Rape Culture
The term “rape culture” was first introduced in American society in the 1970s by feminists in an effort to raise awareness and educate the public about problems associated with sexual violence towards …