I have a theory that may or may not be true.
It seems to me that women are socialized more than men to care whether somebody likes them, and to worry more if someone is upset with them.
Something interesting happened to me yesterday. I've been very frustrated with my optometrist, and in fact have tried to get a full refund and start over with somebody else. I was arguing with the female optician in the office about this yesterday (after showing up for my umpteenth appointment, waiting for half an hour, and STILL having four people in front of me) when she said something that really caught me off guard.
She said, “Look, Barbara, I really like you, and want to make things right for you.”
My answer probably wasn't very nice. It was something like, "Frankly, I don't care whether you like me or not. I just want glasses that help me see properly!"
So that's my question: Would anyone ever try to reassure a man that he was liked?