So I made a post about feminism in anime, and got some “LOL yeah, feminism in Japan. yeah right” replies.
This idea that feminism ONLY exists in Western culture is really baffling to me, and honestly, it reeks of “white culture is so much more sophisticated and enlightened than the rest of the world.”
Like, do people really think that Asia, Africa, the Middle East, Latin America (and, oh, ya know, everywhere in the world that isn't the US and Europe) just…doesn't have feminism or civil rights movements or…?
Women exist everywhere, therefore feminism exists everywhere.