American women
are ceasing
to feel ashamed of accusing men of raping them.
The idea of being ashamed of being assaulted in any fashion makes no
moral sense to me. I can only understand it intellectually.
It was once common for families to despise women who were raped, along
with women who had sex without authorization, treating women as
possessions of the family rather than as persons. For instance, a
legend about the foundation of the Roman Republic admires a woman for
committing suicide after being coerced into sex. In some
twisted societies, this attitude persists today and is the basis for
"honor killings".
I suppose women internalized this condemnation and converted it into
shame.