I get where Bluecollar is coming from - there is a lack of objective data from a scientific approach to golf club testing. But I also suspect that if there were truly such a thing that data would likely be misused and misunderstood by those trying to buy clubs and those trying to sell them.
Initially, it would seem there would be some value in taking a robot and testing different drivers hitting the ball in different spots - for example, balls hit dead center, balls hit 1/4 inch toward the toe or heel, etc. Theoretically, this would give a better objective understanding of how well each driver hits the ball under identical conditions. Perhaps one would see that an R11 hits it 4% farther than a Callaway XHot on dead-center hits, but the distance falls off faster on mishits with the R11 than with the Callaway (my example here is purely made up and not meant to be indicative of these clubs' actual performance). Some buyers might find this useful information - someone who has trouble hitting the sweet spot may then find the Callaway better than the R11 (in this fictional example).
But the problem is that no golfer hits like that robot. One would almost have to take detailed measurements of their own club dynamics and then program the robot to mimic them perfectly to get useful advice. And most golfers have sufficient variability in their swing that it would negate much of those results anyway. The danger in having such objective data is that club sellers would of course jump on any data that makes their club look a little better: "Proven in Robotic Testing: The New TaylorPing Callamade is 6 yards longer when struck 11 millimeters from the Sweet Spot!" Buyers would be swayed by this whether or not it mattered to their individual swing. We certainly already see this in advertising, such as the RocketBallz claim of 17 yards longer, and this advertising is powerfully persuasive.
So much of hitting a club is subjective to the user, and the things that make it subjective vary completely from user to user. What we perceive when we look at the head, how the balance feels in our hands, how the shaft feels as it loads up, the sound made at impact, the feel of any vibration through the grip... these are all things that make each of us perceive clubs differently, and they cannot be measured objectively. Most players don't even really know what makes these factors different or how we perceive them. One example is putter shapes: Edel has built a business around how the eye perceives different head shapes - a head may look good to you, but why? Another example is Mizuno's research showing that impact "feel" with irons is almost all about sound, not what the hands perceive. Players will say that Mizuno forged clubs feel like butter at impact and that they love that, but in its research Mizuno could make testers think the feel was completely different simply by having them wear headphones and changing the sound they heard.
To me, the only really useful club reviews are those that focus on one or a few clubs specifically and offer a detailed explanation of how the club performs for that particular reviewer and why. The reviews in something like Golf Digest are, to me, totally useless. For example, they'll say "One of the longer drivers tested" or "Some testers found this longer than their current club." So what? Unless I know what their current club is, that is a useless piece of information. For all I know they could be hitting an old MacGregor persimmon, in which case the new driver being longer is no surprise. And unless I know what their swing tendencies are, there is no way to relate that to what I'm looking for in a club.
Detailed reviews such as you'll find on forums like this tend to offer a lot more information, which makes it easier to assess whether what the reviewer is describing is applicable to your own game. And the comments from other forum members may help ferret out the needed details. But even with that information at your disposal, it is still not sufficient to make a buying decision. Reviews may help you narrow down the field, but then you need to go get the only objective information that really matters - how you personally hit the club, as measured on a launch monitor.
So, I understand where the OP is coming from in his opinion that club review data is often too subjective, but I feel that trying to get an objective third-party perspective is not very helpful. Instead, seek out detailed subjective explanations, examine your own capabilities and compare them against those subjective viewpoints to narrow down your choices, and then use launch monitor data to objectively finalize your decision.