Uh, have you EVER checked out the headlines of women's magazines? There are TONS of articles about how to "make your man happy"… plus plenty of relationship books are published every year, and I'd bet you anything that most of the people reading those books are women.
Now, some of these books are of the "be a stronger, more empowered woman instead of a doormat" variety, but there are also plenty that try to help women understand men better, make changes so they're better partners, and so on.
Bottom line: of course it's just as important for men to be happy in a relationship, and women need to take responsibility for their part too.
But from what I've seen, women read a lot more about relationships and discuss them with each other more than men seem to. Maybe I'm wrong, but I have both male and female friends, and this is what I've observed.