The Hot List: How to Improve It

Here are our suggestions on how to turn this issue into the ultimate resource for someone buying new equipment.

We’re not going to waste much time here with introductions, just a quick bit of background. Last week, we took a look at many of the common criticisms of Golf Digest’s Hot List, and received a lot of quality feedback from the staff and forum members here.

The bottom line is that subscribers and newsstand customers want to see a change. As one of the biggest names in golf media, if not the biggest, you’ve got the money and resources to give us something better, something more helpful. What is it exactly that we want? Well, it’s funny you should ask – I’ve got those answers for you right here.

Before we get started, let’s have a look at the five major points of improvement we addressed in the first part of this look at the Hot List.

  1. The Hot List should be comprehensive
  2. The space devoted to the Hot/Not comments for each listed club could be put to much better use for other information
  3. The range of players, in terms of skill, is too limited
  4. The classification of clubs is not consistent
  5. The rating system and the silver star/gold star awards are useless, given the lack of technical data and the absence of a full list of tested clubs.

In this second installment, we’re going to offer some advice on how to make improvements in these areas.

The Hot List Should be Comprehensive
Yeah, we get it. Spread out clubs and balls over two issues, and overall you sell more magazines, right? Combine that with effectively doubling your ad revenue, and from a business standpoint, yeah, it makes a lot of sense. The problem is that, by taking that approach, the credibility and quality of product both begin to suffer. There are a few relatively simple answers here.

First, combine the two Hot List issues (clubs and balls). Seriously, it’s that simple. If you want, charge an extra $1-$2 for it. Again, I realize that the bulk of the income doesn’t come from subscriptions but from advertisements, but consider this a way to reach out and show readers that you are paying attention to them and trying to do the right thing.

Okay, so you’re intent on keeping it spread over two issues. The advertisers are simply paying too much money. If it must be that way, include a voucher for the ball Hot List in the club Hot List issue. Make it so that people buying the issue off the newsstand (like me – I canceled my subscription sometime last year) get the second one for $2 or so after the coupon. The advertisers are happy because they still get double the space, and GD is still happy because they’re making the same amount of ad revenue as before.

Overall, I enjoy the issue a lot because it is fun to see what new stuff is out there. It is a good starting point when buying new equipment and it feeds my equipment buying addiction in the winter months. That being said, I assume Golf Digest makes its money by advertising and not by subscriptions. This of course presents a conflict of interest and hence why we have so many Gold/Silver rated items from all the big-name manufacturers. Additionally, criticism seems to be limited to “the shaft looked too busy” and “it’s expensive.”

Pensfan79, Forum Member

As an additional note, and I said this in the first part, if you haven’t received a club or two from a couple of manufacturers, especially ones you know a lot of people will be curious about, don’t release the issue yet! It’s not about being first to the press, it’s about being the most thorough and comprehensive.

I also noted that they recommend visiting golfdigest.com for expanded coverage. All of this should be in the actual magazine! A lot of people say print media is on the fast track to obsolescence, but that doesn’t mean you should give the people still buying the magazine an incomplete product. Give us a reason to want to continue to read your physical magazine! There’s a large percentage of golfers who can safely be classified as traditionalists, the same people who likely still have newspaper and magazine subscriptions. Take care of these people. Faldo’s trials and tribulations can be saved for next month.

I’ll even compromise on this one. If you want to drive people to golfdigest.com, that would be a great place to have a direct comparison of the items from this year’s Hot List and last year’s. In a few minutes, we’ll get to other suggestions on the rating system, but when you get to that section, keep this in mind and tell me that some sort of comparison tool wouldn’t be perfectly suited to the website, where, for example, you could compare the original Machspeed’s technical results to the new Machspeed Black’s. I don’t see how anyone could deny that.
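
For what it’s worth, here’s a minimal sketch of what such a comparison tool might do behind the scenes. The field names and numbers below are made up purely for illustration, not actual Hot List test results:

```python
# Hypothetical sketch of a year-over-year club comparison tool.
# Field names and numbers are placeholders, not real test data.
machspeed_2010 = {"ball_speed_mph": 148.2, "carry_yds": 251, "spin_rpm": 3100}
machspeed_black_2011 = {"ball_speed_mph": 149.5, "carry_yds": 256, "spin_rpm": 2900}

def compare(old: dict, new: dict) -> dict:
    """Return the change in each shared metric from last year's model to this year's."""
    return {metric: round(new[metric] - old[metric], 1)
            for metric in old.keys() & new.keys()}

print(compare(machspeed_2010, machspeed_black_2011))
# Example output: {'carry_yds': 5, 'spin_rpm': -200, 'ball_speed_mph': 1.3}
```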

Make Better Use of Space to Include Valuable Information
This one ties heavily into #5, but it’s probably better off on its own. As you saw in part one, nearly all of us thought the whole Hot/Not thing is dumb, incoherent, and inconsistent. Think of all the relevant information that could be squeezed into that space. How about listing the stock shaft length for every driver and fairway wood? Weight, or the reduction thereof, has been a hot topic – how about listing the total weight of each club? Volume (in cc) for each would be great too, for a side-by-side comparison; manufacturers have gotten better about releasing something smaller than 460cc, and we’d like to be able to see that easily. Price is listed for each item, but really, isn’t that almost out of date as soon as the issue hits newsstands? That’s the one thing that varies the most, as opposed to these raw, technical points of information. Other items that could easily be listed include stock shaft brand/model, ….
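
To show how compact this information really is, here’s a rough sketch of the kind of spec record we’re talking about for each driver. The club name and all values below are invented for the example:

```python
from dataclasses import dataclass

# A sketch of the raw spec line we'd like to see printed for every driver/fairway wood.
# Every value here is invented purely for illustration.
@dataclass
class ClubSpec:
    model: str
    head_volume_cc: int      # e.g. 460, or the smaller heads we'd like to spot easily
    total_weight_g: int      # total club weight
    stock_shaft: str         # stock shaft brand/model
    stock_length_in: float   # stock shaft length

example = ClubSpec("Hypothetical Driver X", 440, 298, "Example Shaft 60g", 45.5)
print(f"{example.model}: {example.head_volume_cc}cc, {example.total_weight_g}g, "
      f"{example.stock_length_in}\" {example.stock_shaft}")
```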

We’re going to go a little light here for now, but again, keep this suggestion in mind when we get to the topic of the overall rating system.

Range of Tester Skill Levels is Too Limited
Remember this response?

Their criteria for high handicap was 15 or higher and only three of the 20 fit that category. I would use a few more. Throw in some 20+ folks out there.

Alan Olson, The Sand Trap .com Staff

Or maybe this one?

I’d prefer to see them describe the sort of swing that a certain set of clubs worked best for. Did diggers like one set of irons as opposed to sweepers who liked another? Which drivers were better suited to someone with a steep angle of attack and which worked better for which swing speeds? That would give better context for golfers who know the tendencies of their swings.

Ron Varrial, The Sand Trap .com Staff

Perhaps these ring a bell?

I like that they do it – if nothing else, it’s good discussion fodder — but the ratings can be very misleading because individual swings can vary so much. Just because a club earns a gold medal on the list does not mean that it will fit your swing. When they publish the reviewers’ shot tendencies (not just handicap), that alone makes the Hot List useful. You can find someone with similar tendencies to yours and then find the clubs that they like (which may or may not earn gold medals).

I’d add more about the reviewers’ swing characteristics and how they play the game. Then include each reviewer’s individual rating of the club on each category. That would help readers match up clubs to their swings.

George Promenschenkel, The Sand Trap .com Staff

I’d like to see a club test where they get 50 golfers in each of these 3 handicap ranges:

– Scratch to 7
– 8 to 15
– 15 and up

Being a 19/20 handicapper, I could care less about what a scratch golfer likes/dislikes as we are not playing the same type of clubs.

Then let each group test: Blades (forged), Tour level clubs (cast), Game improvement and Super Game Improvement irons.

Base the awards on the following categories:
1) Best Technology
2) Best Looking
3) Best Value for the Money
4) Best Playability (i.e. which clubs did you hit the best at the range)
5) Which clubs would you be most willing to buy

erock9174, Forum Member

Do I really need to say much more on this one? Expand the selection of club testers and give us a better distribution of higher handicappers to go along with the mid and low handicappers. Plain and simple.

Also, as a few people have said, how about figuring out a way to provide information on each tester’s swing? Ron said it above – some people have more of a sweeping swing, while others dig. Just like George said, this absolutely makes a difference in how a tester perceives a tested club. I’d imagine that if that type of feedback were provided, trends would emerge quite quickly, leading to more customized suggestions based on the player’s knowledge of their own swing.
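
If that kind of tester information were captured, surfacing those trends is almost trivial. Here’s a minimal sketch, using made-up testers and ratings, of grouping ratings by swing type:

```python
from collections import defaultdict
from statistics import mean

# Made-up tester feedback: (swing type, handicap, rating of a given iron set out of 5).
ratings = [
    ("sweeper", 4, 4.5), ("digger", 12, 3.0),
    ("sweeper", 18, 4.0), ("digger", 22, 2.5),
]

by_swing = defaultdict(list)
for swing_type, handicap, score in ratings:
    by_swing[swing_type].append(score)

# Average rating per swing type makes the trend obvious at a glance.
for swing_type, scores in by_swing.items():
    print(f"{swing_type}: {mean(scores):.1f}")
```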

Classify Clubs Consistently
Plain and simple – ALL pieces of equipment should be classified as game enhancement, game improvement, and, to a further extent, super game improvement. Balls can potentially be classified this way as well. Let’s be honest: I know you love your Pro V1, but there are probably a lot of people who shouldn’t be playing it. Hell, I may be one of them (okay, the Penta in my case, but you get the point)! Playing something a little harder might leave you in the fairway as opposed to 5-10 yards off it.

I definitely hit on this in part one, but with the major manufacturers now classifying their drivers, Golf Digest should have been following suit for a year or two now. Again, this section will be a bit light as it heavily ties into the next point.

The Ratings System is Useless Given Lack of Technical Data and Full List of Tested Clubs
This one is a monster, but let’s break it down into smaller parts. First and foremost, pull the curtain back and give us every single technical detail and test result you can. A big part of me hates to say this, but take the human element out and get a robot! Testing robots have been around for a while, the Hot List has been published for years, and I assume you expect to continue publishing it for years to come. It’s the only way to remove any and all questions about what performs and what doesn’t. The robot couldn’t care less what brand the club is; the robot will tell the story of the club as a whole. What are we looking for out of robot tests? (A rough sketch of how these results might be laid out follows the list.)

  • Spin rates
  • Distances, potentially broken down for 4 or 5 swing speeds (80 MPH, 90 MPH, 100 MPH, 110 MPH, etc.)
  • Distance (Carry)
  • Distance (Total)
  • Launch angle, or at the very least, whether it’s low/mid/high launch
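
Here’s a rough sketch of how those robot results might be organized for a single driver; every number below is a placeholder, not a measured value:

```python
# Sketch of robot-test output for one driver, broken down by swing speed.
# Every number here is a placeholder, not an actual test result.
robot_results = {
    80:  {"carry_yds": 195, "total_yds": 214, "spin_rpm": 3400, "launch_deg": 13.8},
    90:  {"carry_yds": 220, "total_yds": 241, "spin_rpm": 3150, "launch_deg": 13.1},
    100: {"carry_yds": 245, "total_yds": 268, "spin_rpm": 2900, "launch_deg": 12.4},
    110: {"carry_yds": 268, "total_yds": 293, "spin_rpm": 2700, "launch_deg": 11.9},
}

# Print a simple table so readers at any swing speed can find their row.
print(f"{'MPH':>4} {'Carry':>6} {'Total':>6} {'Spin':>6} {'Launch':>7}")
for mph, r in sorted(robot_results.items()):
    print(f"{mph:>4} {r['carry_yds']:>6} {r['total_yds']:>6} "
          f"{r['spin_rpm']:>6} {r['launch_deg']:>7}")
```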

I really want to say it’s good, but I don’t think it is. The gold/silver/etc. breakdown that GD provides isn’t very objective in that the scientific/technical data isn’t presented or published. I’d like to see more numbers. Average carry distance + average total distance, average distance from target line, average distance from hole, etc.

uttexas, Forum Member

I’d like to see a more pointed approach to club testing, where specific metrics are defined and clubs are tested and evaluated using discrete data. Things like distance, spin, ball speed, shot dispersion, even weight and swingweight.

delav, Forum Member

Not only that, publish this data for EVERY club tested.

One question also sticks in my head: What was left off the list because it didn’t rank high enough, and what simply wasn’t tested. I can’t ever see them doing that, because it would bring attention to the negative instead of focusing on the positive, but it would be a much more honest project if they did.

Ron Varrial, The Sand Trap .com Staff

Let’s take it even one step further. Test individual clubheads, by themselves. Why go that far? There are a few reasons. First, I’ll use myself as an example, though I know I’m not the only one in this boat. This big trend toward extremely light components isn’t for me. But what if I’m interested in a new driver, yet not interested in something as light as some of these have become? By testing each clubhead on its own, you can produce a number of EXTREMELY useful measurements, including but not limited to:

  • CoR Values
  • Size of Sweetspot
  • Forgiveness (potentially as distance in relation to a percentage away from sweetspot)
  • Graphical representation of location of sweetspot unique to each driver
  • Spin rates

Take these measurements across a wide range of speeds, starting at something like 75-80 MPH and going up to 120-130 MPH in 10 MPH increments. See whether a given club behaves differently on a fast swing than on a slower one. Shaft flex should be chosen per the manufacturer’s specs/recommendations, and as a constant, or control, the same model of ball should be used throughout all tests.
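
To spell out what that test sweep looks like, here’s a small sketch of the measurement matrix. The driver name is hypothetical, and the measurements are left blank as stand-ins for what the test rig would actually fill in:

```python
# Sketch of the clubhead test sweep: roughly 80-130 MPH in 10 MPH steps,
# measuring each head at several impact points on the face.
swing_speeds_mph = range(80, 131, 10)            # 80, 90, ..., 130
impact_points = ["center", "heel", "toe", "high", "low"]

def measure(head: str, speed: int, point: str) -> dict:
    """Stand-in for a real launch-monitor/robot measurement."""
    return {"head": head, "speed_mph": speed, "impact": point,
            "carry_yds": None, "spin_rpm": None}  # filled in by the actual test rig

test_matrix = [measure("Hypothetical Driver X", s, p)
               for s in swing_speeds_mph for p in impact_points]
print(f"{len(test_matrix)} measurements per head")  # 6 speeds x 5 points = 30
```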

How would you test the clubheads without testing the shaft? Actually, quite easily. In this video (at the 2:30 mark), they have a chamber in which the clubhead is tested by firing golf balls at 130 MPH at various points on the face. It’s really pretty slick, and I highly recommend you take a look at the video. I’d probably give body parts to visit the Oven and see this stuff in person; this kind of thing fascinates me. Next week would be a great time, seeing as how my Steelers will be in the Super Bowl in Dallas. Ummmm, Arlington, I mean. Sorry! 🙂

Let me pose another question. How many of you buy aftermarket shafts and have a few makes and models that are favorites, or that simply work better for you? Do you know exactly why they work better for you? I’d imagine the answers vary among feel, spin, launch, and weight, or some combination of all of those.

Here’s the reason I ask those questions: by testing clubs as a whole plus the clubheads by themselves, I believe the player will be able to accurately determine a near-perfect combination (that is, if the stock setup isn’t optimal). Consider this scenario for a minute: suppose the player is very strong and athletic (read: high swing speed), but has problems striking the ball accurately. Before anyone can say it, yes, I know, save money on the club and get lessons – I agree, but bear with me here. We know the player needs what would likely fall into the game-improvement category for driver, fairway woods, and irons. The problem is that nearly everything in this category is being produced with lightweight shafts, grips, etc. Our player has no use for these swing-speed-enhancing technologies, but will definitely benefit from an abundance of forgiveness. In fact, the additional speed could very well turn into a hindrance, as the ball will end up that much further off line and have that much more of a hook or slice due to the increase in sidespin.

Of course, the immediate response is for the player to get fitted, but the same argument can be made for the Hot List as it stands now. By providing isolated clubhead data, the player will be able to browse the Hot List and say to themselves, “Hey, the Burner SuperFast driver head looks like a great option, but maybe with the Fujikura Motore 60 shaft instead of the lightweight Matrix Ozik Xcon shaft.” The same goes for irons: a number of irons now come with TT Dynalite Gold, while some players would likely benefit more from something like a Dynamic Gold.
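
A reader armed with isolated clubhead data could do that kind of browsing almost mechanically. Here’s a minimal sketch; the heads, shafts, and numbers are invented stand-ins, not actual products or measurements:

```python
# Sketch of browsing isolated clubhead data and pairing a forgiving head
# with a heavier aftermarket shaft. Forgiveness scores and weights are invented.
heads = [
    {"model": "Forgiving Driver A", "forgiveness": 9.1},
    {"model": "Players Driver B",   "forgiveness": 6.4},
]
shafts = [
    {"model": "Lightweight Shaft", "weight_g": 49},
    {"model": "Heavier Shaft",     "weight_g": 65},
]

# Our hypothetical strong-but-wild player: maximum forgiveness, no ultralight shaft.
head = max(heads, key=lambda h: h["forgiveness"])
shaft = next(s for s in shafts if s["weight_g"] >= 60)
print(f"Try the {head['model']} head with the {shaft['model']} ({shaft['weight_g']}g)")
```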

This brings up two potential problems. Manufacturers may not like having the raw technical data put out there, as it could put them at a significant advantage or disadvantage. My take is that this becomes all the more reason for any manufacturer lacking in certain areas to step up its game. I don’t see it as being any different from the tech industry, where components are routinely benchmarked. Of course, lower-performing components can at the same time be great values in terms of performance versus price, and there are other intangibles that play into the decision to purchase a club anyway, qualities such as feel and aesthetics. For example, while a certain club from manufacturer A may slightly outperform the same club from manufacturer B, the club from manufacturer B might be more visually appealing to you and have a superior feel.

The other potential problem is that your casual player may be easily overwhelmed by the abundance of technical data. This can be overcome with a good introductory explanation at the beginning of the Hot List section, along with a simple guideline on how to use the numbers and the reminder that the only way to be certain you’re making the right choice is to visit your local clubfitter. Again, the purpose of the guide should be to inform in an unbiased manner and let the reader determine a number of options they’d like to try prior to making a purchase decision.

Again, give us the full list of clubs tested, along with this data, for a variety of swing speeds.

By actually measuring performance on the items listed above, you then have a basis for awards and can fully justify every award or ranking that is published. Now you can also start to see how our previous points tie in here. By providing this type of data for all tested clubs, the list gets a lot closer to being a comprehensive resource, and certainly a lot of this data could fit into the area previously dedicated to those Hot/Not comments. I would also expect that the line between game enhancement and game improvement would be pretty evident in the statistical data provided.
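
As one possible way to make those awards defensible, here’s a sketch of a weighted score computed from published measurements. The metrics, weights, and club numbers are arbitrary placeholders, not a proposed standard:

```python
# Sketch of turning published measurements into a justifiable ranking.
# Weights, ranges, and club numbers are arbitrary placeholders.
weights = {"carry_yds": 0.4, "dispersion_yds": -0.4, "spin_consistency": 0.2}

clubs = {
    "Driver A": {"carry_yds": 260, "dispersion_yds": 18, "spin_consistency": 0.90},
    "Driver B": {"carry_yds": 255, "dispersion_yds": 12, "spin_consistency": 0.95},
}

def score(metrics: dict) -> float:
    # Normalize each metric to a 0-1 scale before weighting, so units don't dominate.
    ranges = {"carry_yds": (230, 280), "dispersion_yds": (5, 30), "spin_consistency": (0.5, 1.0)}
    total = 0.0
    for name, w in weights.items():
        lo, hi = ranges[name]
        total += w * (metrics[name] - lo) / (hi - lo)
    return round(total, 3)

for model, metrics in clubs.items():
    print(model, score(metrics))
```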

Of course, I could expand on all of these topics even more, but I think you get it and that I’ve made our point pretty clear. I realize that you don’t go from a 30 handicap to scratch overnight, and I wouldn’t expect drastic changes such as these to be made over a one-year period, but how about showing us, the players, and ultimately the people spending the money, that you are working on your game, making improvements, and always trying to get better?
