The Hot List: Hot or Not?

January means Golf Digest’s annual Hot List issue. But is it really a helpful resource, or closer to one large advertisement?

Bag Drop

We’ve all been there – it’s time for a new set of irons, or perhaps a new driver. Old reliable isn’t getting it done anymore, and deep down, though you’d never admit it, you’re jealous of some of your buddies and the shiny new clubs in their bags. So you take to the Internet in search of a new driver, but after 5-10 minutes you’re overwhelmed with options and totally unsure of what your next move should be. Of course, you could take a trip to your nearest retailer and ask for opinions, but you never really know if they have a hidden agenda, like a free trip to Sawgrass if they sell one more R9. It sure would be nice to walk into that store with four or five options in mind and then test them all to see what works best.

Hot List 2011

Then the thought hits you – Golf Digest’s annual Hot List. Of course! Back in the day, it was a great place to find out what’s new and what’s hot. So you run out, pick it up, and hurry back home to see what valuable info it holds. A short while later, you’re left even more dejected than before. Why? Follow along to find out.

Before we dive headfirst into the behemoth that is Golf Digest’s Hot List, let me throw a few statements out there so that everyone is on the same page. First and foremost, I am not here to bash it just for the sake of bashing something. There are a hundred other subjects I could take down that road long before I’d get to this one, including New Orleans traffic, the questionable dignity of people at Wal-Mart, and why some people can’t accept the fact that the Steelers played a great second half in their comeback against the Ravens. If you’re just looking for a big whine-fest, you won’t find it here, but the Internet is filled with places to do that, so good luck.

While opinions on the Hot List will be voiced, the agenda of this column is to improve the quality of the information available to everyone and to offer constructive criticism. In this installment, we’ll take a look at some of the common complaints, and in the next, we’ll offer up suggestions on how the list can be improved.

A week ago, in our equipment forum, I posted a poll and the following questions about the Hot List in order to get an understanding of our members’ feelings on it:

  1. It’s perfect; leave it just how it is!
  2. Your take on the Hot List, good or bad?
  3. What parts of it would you change, and why?
  4. Is the way they rate clubs helpful to you as a potential buyer?
  5. What parts would you keep exactly as they are, and why?
  6. Do you think it has/has not lost a little bit of its luster or credibility over the last few years, and why?
  7. Do you think the smaller companies get the short end of the stick, while the big boys get all of the spotlight?
  8. I also wonder if (and how many) people out there make purchases based solely on content in the Hot List?
  9. If not, I wonder if it even has any influence at all on a purchase.
  10. When you see a full-page ad for the XYZ iron set, and that silver or gold star is somewhere on the page, does it even register or draw attention?

I also submitted these same questions to the staff here at The Sand Trap to get their opinions as well. As you can see from the questions above, nothing was presented in an overtly negative manner, and the criticisms were accompanied by questions about how to improve the list. As I said before, we’re not bashing it just for the sake of bashing something; it’s a product that I think a lot of people would honestly like to see improved.

So with all that in mind, let’s get back to that little story and see why our potential customer was so dejected after purchasing the Golf Digest Hot List issue.

First and foremost, we’re forced to flip through 100 pages before we get to the equipment, and 46 of those 100 pages are full-page ads. Now, I think we all understand that ad revenue keeps magazines going these days, as print media has certainly taken a hit in recent years. The problem is that while this is supposed to be a definitive equipment guide, it really isn’t. Apparently they’re saving ball ratings for another issue. So 100 pages were given to other content and advertisers, then there’s a little under 40 pages of equipment, and at the end, a small note tells you to visit golfdigest.com for expanded coverage.

Let’s examine this for a sec. Quite simply, there are more ads than there are pages dedicated to equipment evaluation. Not only that, but you’re advised to go to their website for expanded coverage (and likely more ads). Now, I have no problem with building a great online resource, but there’s just one problem – people paid money for this issue! Everything should be in print, in the magazine people actually pay for. On top of that, there will be a whole separate issue dedicated to ball ratings. All of this leads me to my first major point: the Hot List issue should be comprehensive.

The fact that it isn’t immediately makes me question the credibility and validity of the information inside. I’ve also seen previous issues in which some clubs didn’t make the list simply because they didn’t arrive in time. Well, maybe this particular issue should be pushed back a month. Like we say here, it’s not about being first; it’s about having the best quality. Adding fuel to that fire is the subject of one of the most frequent responses I got, and probably the biggest can of worms we have here – the way the clubs are rated, including the scoring, criteria, and methodology.

“Demand” is factored into the ratings system? Come on, that’s simply a marketing tool. I’m surprised they rated any club less than three stars in that category. The only reason I can see for it is that it IS the Hot List, telling us which clubs are hot… but is that because they are the best-performing clubs or because they are in demand due to marketing?

Dave Koster

This is probably one of my biggest gripes as well. What effect does “demand” have on a club’s performance? None whatsoever. Rarely is the product with the most “demand,” or the biggest degree of public hype, actually the best on the market. Our own Jamieson Weiss had a great addition on this matter:

The “Buzz” or “Demand” or whatever it is this year (however meaningless to the overall rankings) is pointless. Consumers know the buzz about the clubs because we are the ones being bombarded with ads in their magazine and on TV.

Jamieson Weiss

The consensus is that this criterion adds nothing useful at all. Neither do the player comments and HOT/NOT statements, according to a number of responses.

The comments aren’t very helpful and neither are the ratings. Tech Talk is the most useful.

Alan Olson

Can you really sum up a club in a couple of condensed paragraphs? If you’re going to publish club reviews, you need to really review the clubs and include the findings. What the manufacturers’ marketers say is pretty irrelevant. Every driver out there promises more distance and forgiveness, after all.

George Promenschenkel

Occasionally they seem to nit-pick in the HOT/NOT thing at the bottom – why is Callaway’s Jaws wedge being knocked because Golf Digest would “Like to see more of an effort to educate players on finding the right mixture of bounce/loft”?

Jamieson Weiss

Let’s take a deeper look at this, as it really does stand out as one of the reasons why this issue isn’t as helpful as it could be. Listed as the NOT comment for the Nike VR Pro fairways:

We’d like to meet any golfer who needs 32 settings for a 3-wood

How exactly is that a negative against a club? So you’re allowed a wide array of adjustment options – how is that a bad thing? The STR8-Fit tech allowed for this on last year’s models as well as the new VR line. Why is this relevant all of a sudden?

On the Mizuno JPX-800 Pro:

Having a JPX-800 and JPX-800 Pro model could lead to consumer confusion

Why is this a negative against the JPX-800 Pro and not the JPX-800? Furthermore, how is this any different from TaylorMade’s TP designation, the “Tour” designation used by Nike, Callaway, and many others, or the countless other ways manufacturers name their clubs? In some cases the difference between the two models is pretty significant; other times it’s simply a better shaft. Regardless, how does this have any effect on the performance of the club?

On the TaylorMade R11:

Even with two degrees of launch adjustability, this is still a driver for better players

Well, so are the Nike VR Pro and Titleist 910 D3, which are both adjustable, but neither has that as a mark against them.

Comments such as these are littered throughout the list. The magazine doesn’t specifically say whether these Hot/Not comments affect a club’s score, but if they have no bearing on the final rating, why include them? Is it really necessary to find some irrelevant dig for every single club on the list?

An additional complaint is the narrow selection of testers, in terms of ability, as well as the lack of insight into the testers’ swing tendencies. Ron made a good point in his response: it would help to know which clubs worked well for someone with more of a sweeping swing versus someone who is more of a digger:

I’d prefer to see them describe the sort of swing that a certain set of clubs worked best for. Did diggers like one set of irons as opposed to sweepers who liked another? Which drivers were better suited to someone with a steep angle of attack and which worked better for which swing speeds? That would give better context for golfers who know the tendencies of their swings.

Ron Varrial

I, along with Alan (as you can see below), also thought there was a curious lack of 20+ handicappers.

Their criterion for high handicap was 15 or higher, and only three of the 20 fit that category. I would use a few more. Throw in some 20+ folks out there.

Alan Olson

Hot List Drivers

The problem is that the number of players above a 15 or 20 handicap is far greater than the number of players on the other end of that spectrum. So one must ask: is the Hot List aimed only at better players? I wouldn’t think so, especially with the inclusion of a number of super game improvement (SGI) clubs, but then why aren’t the people those clubs are aimed at among the guys doing the testing? Additionally, in previous years we at least got the benefit of irons being classified as players, game improvement, or super game improvement. Why can’t drivers and fairway woods be classified in the same manner? The majority of manufacturers have made clear distinctions between their game improvement drivers and their players drivers. Prime examples include TaylorMade’s Burner/R(7/9/11) lines, Nike’s SQ (which includes Machspeed) and VR lines, Ping’s G/K/I lines, and even Titleist, which makes the distinction between the D2 and D3 models. Why hasn’t Golf Digest caught on to this trend, especially when the driver seems to be the club the majority of high handicappers struggle with most?

While there are a number of other topics that could easily surface here, at some point it becomes overkill. There is one final critique that resonated across almost all responses: the overall silver/gold star rating system really has become pointless. Some of what has been said above feeds into that, but it’s only part of the bigger picture. Anytime the final gold/silver ratings come up, so does the observation that the big guys in the industry always seem to be the ones racking up the medals, and furthermore, that all we’re really looking at is a huge advertisement for those guys.

The famous story is that one of the big OEMs threw a hissy fit when it didn’t receive the rating it expected. Whether that’s fact or myth, it feels more like reality when reading the latest issue. The Gold List reads like a lineup from PING, Titleist, TaylorMade, and Callaway. This fact alone robs the list of its luster, because if everything’s the best, then everything’s average. As I read through, I caught myself paying attention only to the smaller companies who earned a mention, such as Tour Edge Exotics, Scratch, and Adams, thinking that they must have really done well to get on the radar. Perception or reality? As a reader, does it really matter?

Ron Varrial

The Hot List to me is just advertisement, plain and simple. I like the fact that they put in a lot of effort, and having a larger panel is great. I just want to see more critical opinions of the clubs, but that will never happen when the manufacturers are paying the bills of the magazine (and I’ve been wrongly accused of that here as well!). To have every driver listed as gold or silver does not give me any clue as to which is better, other than showing which one was a “category leader.”

Dave Koster

Mostly bad. I find it an advertisement for the major equipment manufacturers. Everybody gets either a gold or silver, and the alleged negatives are pathetic. I’m not sure how impartial the judges are when the major advertising in Golf Digest comes from the same equipment companies you are testing. It reminds me of how sports are played in school: everybody gets a trophy and a pat on the back.

Alan Olson

And the gold/silver system needs to go. A club in the Hot List is either “Great” or “Really Good.” They can’t all be, can they?

George Promenschenkel

The rating system is next to useless for the purpose of a mass-testing of golf clubs. The star system is decent, but not prominent enough. I think a numbered scale would be better.

Jamieson Weiss

I could add my own comments, but I think these guys pretty much said it all. Pull the curtains back; give us more detail, technical data, and other pertinent information. I believe it would go a long way toward bringing some credibility to the Hot List, and toward providing a much more useful resource to those in the market for new clubs.

So that leaves us with a few major points to take into the second part of this topic (which will be coming later this week):

  1. The Hot List issue should be comprehensive.
  2. A significant portion of the space for each listed club could be put to much better use than the Hot/Not comments.
  3. The range of testers, in terms of skill, is too limited.
  4. The classification of clubs is not consistent.
  5. The silver star/gold star rating system is useless, considering the lack of technical data as well as the lack of a full list of tested clubs.

So stay tuned, as in the upcoming days we’ll deliver the second part in this series, in which we present a number of different ways to actually improve the Hot List in ways that would benefit everyone. See ya then.

14 thoughts on “The Hot List: Hot or Not?”

  1. I opted out of the poll, since my opinion of the “Hot List” may be somewhat tainted by my overall impression of Golf Digest’s editorial content and the way golf equipment in general is presented to potential consumers. Suffice it to say, my opinion is a somewhat negative one, but since I have read previous “Hot List” issues cover to cover (while in the washroom), here’s why I have such disdain for that particular issue.

    1.) They don’t seem to take into account that people who’ve played this game for a long time already know what they want (and if they don’t by now, they never will). At the same time, they don’t attempt to educate new buyers on what particular design characteristics are meant to do. Why do some experienced players swear by a certain clubhead design for their game, yet have no use for the one on the next page that looks very similar but is somehow different? How and why is it different? I realize the issue isn’t a textbook, but how about at least standardizing the terminology from review to review instead of using OEM doublespeak?

    I know from personal experience what types of clubs will help or hurt my game, and for the most part I can adjust to any small bias a particular piece of equipment (within a certain genre or type) brings to the table, but new golfers can’t necessarily do that. Either way, nowhere is there solid data on any of these tendencies and biases. Maybe that’s because there’s no “one way” to set up a piece of equipment for a new golfer or a lifelong 20+ capper, just like there’s no “one single swing fault” that causes an avid golfer to be stuck at a 20+ index.

    The variables are too great. But are they really? When I go to the golf shop, all the clubs I see on the shelves have stock grips and stock shafts. Why not provide data on just those and let the club hos sort out the rest on their own? Without any science behind the review of each piece of equipment, I’m left making my short list based on aesthetics (personal), brand recognition (marketing), and experience (based on a mishmash of purely subjective data and hearsay).

    2.) Anyone familiar with any other sport, or a major purchase of any kind, is familiar with “equipment issues.” We’re accustomed to getting an overview of all products available on the market that year. Granted, not every product will get an in-depth review, and of those that do, only a few get a nod for being a good value, a top performer, or just generally worthy of consideration, but at least they’re all there to see (or at least as close to “all” as possible), so we don’t wonder later why we never even tried out Brand XYZ irons that were a similar price and look much better.

    3.) I guess I don’t see the logic in buying a product solely because it’s “Hot.” A new set of clubs isn’t cheap. Some people will only get one or two new sets in their life, so having the tools to make an informed decision is important. In the case of many other consumer products, publications like Consumer Reports can give you a feel for the overall quality and value of each brand and product line based on real customer feedback and editor test data.

    One thing that becomes abundantly clear when looking at reliability and value data is that jumping in with both feet to buy the first generation of any product is a dicey proposition. Oftentimes customers pay through the nose to be beta testers for a product that either disappears from the marketplace almost immediately after release or is quickly reintroduced with all the necessary improvements.

    Basically, I just want a comprehensive equipment buyer’s guide, not a “Hot List,” so perhaps I’m not the intended audience for Golf Digest in the first place.

  2. sean_miller, a good point tucked away at the end, though: if you’re not the intended audience – you’re a golfer, after all – then who is?

    I’m not sure that Golf Digest has a good answer.

  3. Do you think it’s likely that the reason they don’t put drivers into the hands of 20+ handicap players to test is that those players just don’t have the consistency to hit that club? Hell, I play off 10 and I don’t carry a driver because I can’t justify taking it out on the course with me. After all, no manufacturer wants their shiny new driver put in the hands of someone who reports “I was slicing this piece of rubbish all over the place!” when it’s likely that the problem was a swing that needs a lot of work, not the club.

    I suppose I have a bit of a bugbear when it comes to drivers, though. When adverts promise “more distance” and “more accuracy,” it’s difficult to convince someone that, until they develop a somewhat repeatable swing, those qualities would more likely emerge if the money were spent on lessons.

    I am reminded of a humorous situation at the golf range last year. I’m in a bay next to a couple of guys who are obviously quite new to the game. Most shots are going along the ground, or nowhere at all – and their shots weren’t much better, boom boom. Thanks, I’m here all night. That’s fine, of course; we’ve all been there. I mean, we’re at the range to improve, right? From the conversation they’re having, one of them has obviously bought new clubs and is trying them out. After he heavily tops the umpteenth shot, he turns to his mate and says, “Wow, these are so much nicer to hit than my last set. The four iron is beautiful!” :o)

    I guess the point I’m trying to make is that we all love to buy new gear, even if we don’t have the game for the clubs we’re looking to buy. Do you think, however, that there is a minimum standard of play, below which the equipment has no measurable effect? Particularly with drivers?

    Great article, by the way. Let’s hope they pay attention.

  4. I think Golf Digest’s Hot List is awesome. I wouldn’t want too much detail on all the clubs they look at anyway; that would be too boring. The way they approach the article makes it fun to read. Each club is looked at from various perspectives, and I find it refreshing. They do an excellent job of categorizing the clubs by swing speed and handicap. As a result, it’s easy to see which clubs are a good match for you. Another thing worth praising is the sidebar articles, like the one on “the upside/downside of hitting a 3-wood off the tee” and another on “comparing a 19 degree hybrid and fairway metal”. Frankly, the Hot List gets better every year.

  5. My favorite “Not” comment in this year’s issue is for the Callaway X Series Jaws CC wedge (which got a gold award and 4, 5, 4, and 3.5 stars in the categories). It seems like GD couldn’t think of anything to complain about, so they put “We’d like to see more of an effort to educate players on finding the right mix of bounce/loft.” Okay, but what does that have to do with this wedge?

  6. i hate it.

    i would like to see robot testing on all clubs and balls. this would be definitive, rather than garbage like “the ball flies off the face” or “the feel is great” or “the club practically swings itself”. i mean, what does this tell us? nothing.

    give me robot testing that shows how the clubs perform on center hits and off-center hits. show me spin and distance results for balls at different clubhead speeds.

    to me, anything else is worthless.

  7. I agree with Ron Varrial’s comments.

    The main reason I read the Sand Trap is for the club reviews. The GD Hot List feels like paid advertising.

    In that regard, I’d like to see more discussion of what kind of irons different types of players should use and why. I can hit any iron off a mat, but playing off grass is a different story. Discuss “diggers” vs. “sweepers” with pictures, please.

    And more articles about golf balls. How about some data on swing speed vs. golf ball type. And remember that some of us only swing 85mph. Any studies on new balls vs. recycled balls?

    I’m sure there is more I want to know, but let’s start with those questions. Thanks

  8. It’s interesting that computer and photography magazines, which also rely on advertising revenue, don’t seem to have a problem denouncing sub-par equipment.

    And this “Demand” category is completely bizarre and has no place in a serious objective evaluation.

    Finally, maybe Iron Byrons should be used in evaluating clubs intended for high handicappers (like me). They could very consistently hit shots off center, etc., and the forgiveness factor could be evaluated that way.

  9. I agree with Colin007: get the human element out of it and give me Iron Byron. Not only could you see how clubs from this year compare, but you could also see how much or how little improvement there is over previous years’ models.
    My suggestion is to get fit by a professional or find a professional club maker to build you a set. Otherwise, you are certainly welcome to spend hundreds of dollars every year on the new “Hot” club, if that is what makes you happy.

  10. Honestly, the ‘Hot List’ seems to be more of a ‘Companies that spend a lot of money on advertising’ list…

    I think the original Hot Lists had more credence. Now they are filled with quotes like “If you don’t like this club, you don’t like vanilla ice cream.” (actual quote) If you asked NASCAR to put together a golf review, this would be it.

    When I first started out 6 years ago, I thought the Hot List was awesome. After going to my first demo day with all the big companies there, I can’t even bear to look at the Hot List.

  11. The “Hot List” now ranks right up there with televised Skins Games and the NBA Slam Dunk Contest. Worthless and pointless.

  12. The Hot List lets me know what’s out there. Lets me know what to check out at the next demo day.

    If the GD Hot List authors were invited to be a guest speaker at a college class, they would end up in a lot more marketing classes than physics classes.

    GD took a hit with me when it published the “Obsolete” article last May, saying we should sell off our clubs if the batteries are low. My clubs, however, are not powered by batteries – they are powered by me!
