
The West too has a ‘rape culture’

Rape culture is a concept in feminist research that describes the prevalent attitudes, norms, and practices in a society that trivialise, excuse, tolerate, or even condone rape. The Encyclopedia of Rape defines it as "a complex set of beliefs that encourages male sexual aggression and supports violence against women".
