dark cloud
Well-Known Member
Since I was young I've had an inferiority complex because I was born a girl. I know I shouldn't, but it's difficult for a girl not to feel inferior to boys and men when most or all of the important people in history she learns about in school are men. At least that's how it is in my country (I live in Europe).
I've been trying to figure out whether patriarchy really exists by reading people's opinions online.
I recently read that feminism was created for no reason by two misandrist women. It made me feel weird.
Some people think that women are on top now and always have been. In my opinion that's a really strange thing to believe, given that the majority of people in authority, both then and now, are men.
Some people think toxic masculinity is a myth, and yet many women have a story to tell about toxic masculinity that hurt them. Does that mean all these women are suddenly crazy, dramatic, or lying?
The way I understand patriarchy makes sense to me, and I'll explain it.
Patriarchy doesn't mean that men have easy lives and no struggles compared to women. Of course they have struggles! A patriarchal system demands specific roles of men and women and requires them to follow them, which hurts both: if they don't want to follow those roles, they struggle, and if they follow them perfectly, they're admired by others.
The difference is that the role a woman has to follow is more confined.
There are still people who believe, now in the 21st century, that women are the weaker sex and that a woman's most valuable role is to become a wife and a mother. Any woman who doesn't want this role gets blamed.
Some people really underestimate women.
They blame women for almost everything out of insecurity. They get annoyed by women's sexual freedom and power, yet men enjoy those same things without anyone stigmatizing them. Some men complain that they have a hard time finding sex or a relationship (and they blame women for it). I don't know if it's just me, but seriously?
They can go whenever they want to have sex with sex workers, women or men, and even underage girls and boys, to make their perverse fantasies (if they have them) come true without shame or guilt. There is pornography for them to watch whatever they like. They learn that their sexual desires are important and that they have the freedom to express them.
They can have many sexual partners if they want, or take part in orgies and threesomes, and never be called a whore or a slut. They can sexualize themselves, and society has normalized it so much that we might not even realize it sometimes (I mean they can sexualize and capitalize on their bodies without stigma). They can walk around in public without a shirt and nobody ever tells them "what are you doing? Put a shirt on, do you want to get raped by a woman?" or "don't complain if women view or treat you like a piece of meat now".
I believe they can also orgasm more easily than women can.
They can do all of this, while women, even though they can do the same things, get stigmatized for it and may find it difficult to orgasm. And men are the ones who complain? As for relationships, all they have to do is take care of themselves and have a decent character (not be a jerk) and they can find a partner.
I don't know how many people with this mentality exist, but if there are many of them, I believe feminism is needed.
Recently I was called a feminazi just because I said men and women are equal: different, but equal.
Do I have to be a feminist just to say the obvious? I can't understand it.
I really want to have a reasonable, objective opinion on this, but I've realized it's unfortunately not black and white, and that's driving me crazy.
On the other hand, I really don't want to be viewed as the weaker sex just because I'm female, as someone whose most valuable offering to this world is children, because that's not something I want.
And at the same time I tend to gaslight myself, and I hate it. I think the anti-feminists might be right that feminism isn't needed anymore. Maybe feminists are overreacting and want payback and superiority. But what do anti-feminists actually believe? That we are already equal, or that women should stay home raising children and be good, obedient wives?
If men and women are really equal, that means that if a woman wants to work in a job or field we think of as male-dominated, she won't face underestimation or sexual harassment, right?
Do we really still need feminism, or is it my confusion and my inferiority complex speaking, and all feminists are suddenly hypocrites and liars who talk about situations that don't exist anymore?
What's your opinion?