I have never been a big, die-hard fan of Westerns the way some people are. I just don't understand the mentality behind these movies. Practically every Western follows the same pattern: a bunch of guys in silly pants shooting each other without mercy, with some sex- and alcohol-driven subplot thrown in. These aren't the most upstanding values, as I think most people would agree. So why are Westerns such a big part of American culture? Sure, nearly all of them have been made by Americans, but why would Americans want to label themselves with that kind of image?
I ask because I genuinely don't understand it. Apparently movies about guys acting like unforgiving, disrespectful dimwits (to put it less politically correctly) are enjoyable to watch? Help me understand.