Golf is a pretty simple game, right? Just hit the ball into the hole in as few attempts as possible. The person who takes the fewest strokes after 18 holes wins.
While the scoring is a little counterintuitive -- golf is one of the few sports where a lower score is better -- most people know the difference between a birdie and a bogey, a putter and a driver, a green and a fairway. Then there's the golf handicap. Even the most avid weekend golfers may stumble when trying to explain what it does or how it's calculated. But what may seem like a product of arbitrary numbers and equations is actually a highly developed and largely standardized system that allows players of different skill levels to compete on a level playing field.
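To see how a handicap levels the playing field, here is a minimal sketch of net scoring, in which a player's handicap is simply subtracted from their gross score. The names and numbers are made up for illustration; the actual calculation of a handicap index is more involved, as the rest of this article explains.

```python
# Hypothetical illustration of net scoring. A player's course handicap
# is subtracted from their gross (actual) score to produce a net score,
# letting golfers of different abilities compete against each other.

def net_score(gross_score: int, course_handicap: int) -> int:
    """Return the net score: gross strokes minus course handicap."""
    return gross_score - course_handicap

# A scratch golfer (handicap 0) shoots 72; a weekend golfer
# (handicap 18) shoots 89.
scratch = net_score(72, 0)    # 72
weekend = net_score(89, 18)   # 71

# On net score, the weekend golfer wins by one stroke.
print(scratch, weekend)  # prints "72 71"
```

In real competition the course handicap itself is derived from the player's handicap index and the difficulty of the course being played, but the basic idea is the same: the subtraction lets a raw score of 89 beat a raw score of 72.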
Standardized golf handicapping has been in existence for more than a century. Around 1900, the Ladies Golf Union (LGU), an organization based in Great Britain and Ireland, became the first group to establish a unified handicap system. The United States Golf Association (USGA) followed suit in 1911, creating a standardized course rating system that made handicapping consistent from course to course. In Great Britain, the Royal and Ancient Golf Club of St. Andrews followed in 1926, adopting the Standard Scratch Score and Handicapping Scheme. Today, the USGA administers the handicap system at more than 10,600 golf courses in the United States, providing precise rules and regulations that golf associations and clubs must follow in order to participate in the country's unified handicap system [source: USGA].
Now that you know a little about the history and purpose of the golf handicap, we're ready to dive into the details of just how it is calculated.