Men have been taught by decades of Western films, books, and TV series that sex is either a goal to accomplish or something they need to “get” from women: that women own it, and that men have to take it from them.
American men do not learn that it’s an activity to be shared, or even that it’s supposed to be fun for everyone involved. How often have you watched movies and shows where it’s a deadly serious event that has to be approached like rocket science? Where nobody is smiling, laughing, or being silly? And where, once it’s accomplished, it’s over: you smoke a cigarette and nothing more needs to be done or said with that person? No wonder so many men are confused.