We’ve all been there – it’s time for a new set of irons, or perhaps a new driver. Old reliable really isn’t getting it done anymore, and deep down, though you’d never admit it, you’re jealous of some of your buddies and the shiny new clubs in their bags. So you take to the Internet in search of a new driver, but after five or ten minutes, you’re overwhelmed with options and totally unsure of what your next move should be. Of course, you could take a trip to your nearest retailer and ask for opinions, but you never really know if they have a hidden agenda, like if they sell one more R9, they get a free trip to Sawgrass or something like that. It sure would be nice to go to that store with four or five options in mind and then test them all to see what works best.
Then the thought hits you – Golf Digest’s annual Hot List. Of course! Back in the day, it was a great place to find what’s new and what’s hot. So you run out, pick it up, and hurry back home to see what valuable info it holds inside. A short while later, you’re left even more dejected than before. Why? Follow along to find out.
Before we dive headfirst into the behemoth that is the subject of Golf Digest’s Hot List, let me first throw a few statements out there so that everyone is on the same page. First and foremost, I am not here to bash it just for the sake of bashing something. There are a hundred other subjects that I could take down that road long before I’d get to this one, including New Orleans traffic, the questionable dignity of people at Wal-Mart, and why some people can’t accept the fact that the Steelers played a great second half in their comeback against the Ravens. If you’re just looking for a big whine-fest, you won’t find it here, but the Internet is filled with places to do that, so good luck.
While opinions on the Hot List will be voiced, the agenda of this column is to improve the quality of the information that is available to everyone and offer constructive criticism and ways that the annual list can improve. In this installment, we’ll take a look at some of the common complaints, and in the next, we’ll offer up suggestions on how it can be improved.
A week ago, in our equipment forum, I submitted the following questions about the Hot List in order to get an understanding of our members’ feelings on it:
- Your take on the Hot List, good or bad? (Or is it perfect; leave it just how it is!)
- What parts of it would you change, and why?
- Is the way they rate clubs helpful to you as a potential buyer?
- What parts would you keep exactly as they are, and why?
- Do you think it has/has not lost a little bit of its luster or credibility over the last few years, and why?
- Do you think the smaller companies get the short end of the stick, while the big boys get all of the spotlight?
- I also wonder how many people out there make purchases based solely on the content in the Hot List.
- If not, I wonder if it even has any influence at all on a purchase.
- When you see a full page ad for xyz iron set, and that silver or gold star is somewhere on the page, does it really even register or draw attention?
I also submitted these same questions to the staff here at The Sand Trap to get their opinions as well. As you can see from the questions above, nothing was presented in an overtly negative manner, and they were accompanied by questions about how to improve the list. As I said before, we’re not bashing it just for the sake of bashing something; it’s a product that I think a lot of people would honestly like to see improved.
So with all that in mind, let’s get back to that little story and see why the potential customer was so dejected after the purchase of the Golf Digest Hot List issue.
First and foremost, we’re forced to flip through about 100 pages before we get to the equipment, and 46 of those pages are full-page ads. Now, I think we all understand that ad revenue keeps magazines going these days, as print media has certainly taken a hit in recent years. The problem is the realization that while this is supposed to be a definitive equipment guide, it really isn’t. Apparently they’re saving ball ratings for another issue. So 100 pages are given to other content and advertisers, then there’s a little under 40 pages of equipment, and at the end you see a small note that tells you to visit golfdigest.com for expanded coverage.
Let’s examine this for a sec. Quite simply, there are more ads than there are pages dedicated to equipment evaluation. Not only that, you’re advised to go to their website for expanded coverage (and likely more ads). Now, I have no problem with building a great online resource, but there’s just one problem – people paid money for this issue! Everything should be in print, in the magazine that people actually pay for. On top of that, you’ll be looking at a whole separate issue dedicated to the ball ratings. All of this combined leads me to my first major point: The Hot List issue should be comprehensive.
The fact that it isn’t immediately makes me question the credibility and validity of the information therein. I’ve also seen previous issues in which some clubs didn’t make the list because they didn’t arrive in time. Well, maybe this particular issue should be pushed back a month. Like we say here, it’s not about being first, it’s about having the best quality. Adding fuel to that fire is the subject of one of the most frequent responses I received, and probably the biggest can of worms we have here – the way the clubs are rated, including scoring, criteria, and methodology.
“Demand” is factored into the ratings system? Come on, that’s simply a marketing tool. I’m surprised they rated any club less than three stars in that category. The only reason I can see it is because it IS the Hot List and telling us which clubs are hot… but is that because they are the best performing clubs or because they are in demand due to marketing?
– Dave Koster
This is probably one of my biggest gripes as well. What effect does “demand” have on a club’s performance? None whatsoever. Rarely is a product with the most “demand” or biggest degree of public hype actually the best on the market. Our own Jamieson Weiss had a great addition on this matter:
The “Buzz” or “Demand” or whatever it is this year (however meaningless to the overall rankings) is pointless. Consumers know the buzz about the clubs because we are the ones being bombarded with ads in their magazine and on TV.
– Jamieson Weiss
The consensus is that this criterion adds nothing useful at all. Neither do the player comments and HOT/NOT statements, according to a number of responses.
The comments aren’t very helpful and neither are the ratings. Tech Talk is the most useful.
– Alan Olson
Can you really sum up a club in a couple of condensed paragraphs? If you’re going to publish club reviews, you need to really review the clubs and include the findings. What the manufacturers’ marketers say is pretty irrelevant. Every driver out there promises more distance and forgiveness, after all.
– George Promenschenkel
The rating system is next to useless for the purpose of a mass-testing of golf clubs. The star system is decent, but not prominent enough. I think a numbered scale would be better. Occasionally they seem to nit-pick in the HOT/NOT thing at the bottom – why is Callaway’s Jaws wedge being knocked because Golf Digest would “Like to see more of an effort to educate players on finding the right mixture of bounce/loft”?
Let’s take a deeper look at this, as it really does stand out as one of the reasons this issue isn’t as helpful as it could be. Here is the NOT comment listed for the Nike VR Pro fairways:
We’d like to meet any golfer who needs 32 settings for a 3-wood
How exactly is that a negative against a club? It simply means you’re allowed a wide array of adjustment options. Nike’s STR8-Fit technology has offered this on the new VR models as well as last year’s, so why is it relevant all of a sudden?
On the Mizuno JPX-800 Pro:
Having a JPX-800 and JPX-800 Pro model could lead to consumer confusion
Why is this a negative against the JPX-800 Pro, and not the JPX-800? Furthermore, how is this any different from TaylorMade’s TP designation, the “Tour” designation used by Nike, Callaway, and many others, or the countless other ways manufacturers name their clubs? In some cases, the difference between the two models is pretty significant; other times, it’s simply a better shaft. Regardless, how does this have any effect on the performance of the club?
On the TaylorMade R11:
Even with two degrees of launch adjustability, this is still a driver for better players
Well, so are the Nike VR Pro and Titleist 910 D3, which are both adjustable, but neither has that listed as a mark against it.
Comments such as these are littered throughout the list. The magazine doesn’t specifically say whether these Hot/Not comments have any effect on a club’s score, but if they have no bearing on the final score, why include them? Is it really necessary to find some irrelevant dig at every single club on the list?
An additional complaint concerns the narrow selection of testers, both in terms of ability and in the lack of insight into the testers’ swing tendencies. Ron made a good point in his response: it would be helpful to know which clubs worked well for someone with more of a sweeping swing versus someone who was more of a digger:
I’d prefer to see them describe the sort of swing that a certain set of clubs worked best for. Did diggers like one set of irons as opposed to sweepers who liked another? Which drivers were better suited to someone with a steep angle of attack and which worked better for which swing speeds? That would give better context for golfers who know the tendencies of their swings.
– Ron Varrial
I, along with Alan (as you can see below), also thought there was a curious lack of 20+ handicappers.
Their criteria for high handicap was 15 or higher, and only three of the 20 fit that category. I would use a few more. Throw in some 20+ folks out there.
– Alan Olson
The problem is that the number of players above a 15 or 20 handicap is far greater than the number of players on the other end of the spectrum. So one must ask: is the Hot List aimed only at better players? I wouldn’t think so, especially with the inclusion of a number of SGI clubs, but why aren’t the golfers some of these clubs are aimed at among those doing the testing? Additionally, in previous years we at least got the benefit of irons being classified as players, game improvement, or super game improvement. Why can’t drivers and fairway woods be classified in the same manner? The majority of manufacturers have made clear distinctions between their game-improvement drivers and their players drivers. Prime examples include TaylorMade’s Burner/R(7/9/11) lines, Nike’s SQ (which includes Machspeed)/VR lines, Ping’s G/K/I lines, and even Titleist, which makes the distinction between the D2 and D3 models. Why hasn’t Golf Digest caught on to this trend, especially when the driver seems to be the club the majority of high handicappers struggle with the most?
While a number of other topics could easily surface here, at some point it becomes overkill. There is one final critique that resonated across almost all responses: the overall silver/gold star rating system really has become pointless. Some of what has been said above feeds into that, but it’s only part of the bigger picture. Anytime the final gold/silver ratings come into discussion, so does the observation that the big guys in the industry always seem to be the ones racking up the medals and, furthermore, that all we’re really looking at is a huge advertisement for those companies.
The famous story is that one of the big OEMs threw a hissy fit when it didn’t receive the rating it expected. Whether that’s fact or myth feels more like reality when reading the latest issue. The Gold List reads like a lineup from PING, Titleist, TaylorMade, and Callaway. This fact alone robs the luster from the list, because if everything’s the best, then everything’s average. As I read through, I caught myself paying attention to only the smaller companies who earned a mention, such as TourEdge Exotics, Scratch, and Adams, thinking that they must have really done well to get on the radar. Perception or reality? As a reader, does it really matter?
– Ron Varrial
The Hot List to me is just advertisement. Plain and simple. I like the fact that they put in a lot of effort and to have a larger panel is great. I just want to see more critical opinions of the clubs but that will never happen when the manufacturers are paying the bills of the magazine (and I’ve been wrongly accused of that here as well!). To have every driver listed a gold or silver does not give me any clue on which is better other than showing which one was a “category leader.”
– Dave Koster
Mostly bad. I find it an advertisement for the major equipment manufacturers. Everybody gets either a gold or silver and the alleged negatives are pathetic. I’m not sure how impartial the judges are when the major advertising in Golf Digest is from the same equipment company you are testing. It reminds me of how sports are played in school, everybody gets a trophy and a pat on the back.
– Alan Olson
Can you really sum up a club in a couple of condensed paragraphs? If you’re going to publish club reviews, you need to really review the clubs and include the findings. What the manufacturers’ marketers say is pretty irrelevant. Every driver out there promises more distance and forgiveness, after all. And the gold/silver system needs to go. A club in the Hot List is either “Great” or “Really Good.” They can’t all be, can they?
– George Promenschenkel
The rating system is next to useless for the purpose of a mass-testing of golf clubs. The star system is decent, but not prominent enough. I think a numbered scale would be better.
– Jamieson Weiss
I could add my own comments, but I think these guys pretty much said it all. Pull the curtain back; give us more detail, technical data, and other pertinent information. I believe that would go a long way toward restoring some credibility to the Hot List, and toward providing a much more useful resource to those in the market for new clubs.
So that leaves us with a few major points to take into the second part of this topic (which will be coming later this week):
- The Hot List issue should be comprehensive
- A significant portion of the space devoted to each listed club could be put to much better use than the Hot/Not comments
- The range of players, in terms of skill, is too limited
- The classification of clubs is not consistent
- The rating system and the silver star/gold star system are useless, given the lack of technical data as well as the lack of a full list of tested clubs
So stay tuned: in the coming days we’ll deliver the second part of this series, in which we present a number of ways to actually improve the Hot List that would benefit everyone. See ya then.