One thing that has bothered me a little in recent years is that being a man seems to have been twisted into something negative. Take the phrase "toxic masculinity," for example. It's a little upsetting: you're born a man, you grow up a man, and then you have people telling you that who and what you are is wrong. How can it be wrong when we're born that way? It's not nice, and people seem to expect us to just swallow it and accept it. Men are often masculine; that's just nature. It's not something men did or chose. I just think it's not nice.
What's weird is that it's also deemed not OK to have feminine traits as a man. Which one does society want? lol
Oddly, I think you described the negative connotations coming from females and I described the ones coming from other males, but it's still funny that people try to shape society through other people instead of themselves.