The cowboy is one of the most enduring icons in American popular culture. Thanks to Hollywood, we imagine cowboys as rugged gunslingers, always ready for adventure and danger on the open range. But how accurate is this image? Movies and TV shows have painted a thrilling but often misleading picture of cowboy life. The reality of the Old West was far more complex, and sometimes far less glamorous, than what’s been shown on screen. Let’s bust some of the biggest myths about cowboys that Hollywood got totally wrong.