I've been told that women are independent and equal to men, and then, on dating apps, I see that a lot of women are looking for dominant men. Do you understand why men are so confused now? I don't mean this to be unkind, but I feel the problem is inadequately discussed in society, and we are only allowed to tell one side of the story. If this is truly a liberal society, then individuals should ultimately decide what is right for them. This is really more of a question than a comment. Am I crazy, or is this for real?