I often see customers perusing the myriad of different beers in my store with their phones in hand, looking at Untappd, BeerAdvocate or RateBeer. I have all three of those apps on my phone too, and I reference them daily. They are great for tracking what you’ve had in the past and jogging your memory as to what you thought of those beers. They also show you what others are saying about the beers, but who are these “others,” and why should anyone care what they’re saying? What does a four-cap beer mean versus a beer with an 86 rating? Why do any of these numbers matter?
The concept of ratings is taken straight from the wine world and applied in the same manner to beer, but there is a large difference between the two: wine ratings are done by two wine rating sources staffed by professional sommeliers, while beer ratings are done by people like you and me from all over the world. In theory, the few people who make the wine ratings know what they are doing, and wine stores love using those ratings as part of their displays. Part of the reason for displaying ratings in wine stores is that it makes it easier for the general public to come in and browse without having to talk to a person who knows the wine. Talking to people is horrible, I guess.
Another large and important difference between beer and wine ratings is the environment the product comes from. For wine ratings, the product goes to the sommeliers straight from the winemaker, and if there happens to be an issue with the wine, a bad cork or something else, it can be rectified right then and there. Beer ratings don’t always come from such a nicely controlled environment.
Maybe you try a beer at your neighborhood brewery that tastes like a bag of nickels was used in the brewing process and the beer was then left out overnight so it was good and flat, which, unbeknownst to you, is the result of an infection in the tap line. You’ll get on Untappd and give it a ½-cap score, or get on BeerAdvocate or RateBeer and give it a 55 or whatever their lowest score is. Or maybe you bought an IPA from a grocery store and didn’t notice that the beer’s “best by” date was a year and a half ago. That beer will then be sent to the digital land of suck and trolled for everyone to see. The point is that environment has a lot to do with the beer you’re trying. Tap lines can be infected or dirty, servers can get tap handles confused and give you a hoppy amber when you thought you were getting a wheat beer, corks can fail, a bottle can sit light-struck on a store shelf for months, or the beer can get overheated and cooked in a back room.
Then there’s the issue of personal taste. A massive hop-head who only drinks Imperial IPAs orders a Belgian beer, gets knocked over by the bizarre complexity that comes with that style, and is saddened by the lack of hops. ½-cap review: ‘Belgians L2Brew!!!’ Clearly that guy likes hops, which is fine. He doesn’t like Belgian styles, also fine. Fortunately there are thousands of other beers for him to enjoy that aren’t Belgian. But what happens to the beer that was horribly down-ranked with a nondescript ½-cap rating? Someone else sees the ½-cap review and immediately dismisses the beer. Who wants to drink a beer that’s bad? All of that brings me to my point: you don’t know who’s rating these beers, you don’t know the conditions the beer was bought in, and you don’t know where it was drunk.
When reading these reviews, it also helps to understand what beer tastes like after it has been infected, light-struck (skunked), cooked or is just old, so you’ll know what is likely meant when a beer is described as tasting like “nickels” or “cardboard” or the ever-charming “ass.” For example, there’s a brewery from New Mexico that arrived in Columbus last year. When deciding whether to bring the beer into our store, I looked at BeerAdvocate to find out a little bit about them (location, styles, different beers made by the brewery, history, etc.). The overall scores were quite low, and the reviews were consistently bad, but everyone was describing the same thing: beers that had been overheated and cooked and also heavily light-struck. Skunk and wet-dog smells with stale, cardboard-tasting beer. All of these reviews were isolated to one region and came in around the same time. Their distributor hadn’t stored the beers properly, and the brewery’s customers paid for that mistake. There were a few other reviews that were very positive, but they came from different times and different locations. Based on the good reviews I brought their beers in, and as it turns out, that brewery makes really good beer.
Next time you’re out of town or shopping for beer locally, pull out your phone if you need to and check things out, but don’t rely solely on what you read. Ask your friends what they like and what you should look for. Talk to someone at your beer store and try something they enjoy. Ask a store worker to pick out some new beers for you based on your likes and dislikes. Take a chance on a beer you’ve never had, and don’t pass on a beer that sounds good to you just because it has an 86 rating. That may be the beer that makes your life complete.

3 Comments on "The flaw in beer ratings"
One problem with these ratings: I’ve seen people who don’t rate according to style. If they’re a hophead, they won’t give a brown or an amber over 3 caps (or the equivalent). BeerAdvocate does a decent job with reviews in their mag, but they don’t make a habit of telling us to stay away from a beer. Local reviewers don’t call out bad local beer. We’re all on our own to determine what’s what, which is a chore these days. No immediate solution in sight.
I am one of those people who have a phone in hand every time I go into a beer store, but for me the reasoning is a bit different. I’m from West Virginia, and the beer selection there, while getting better, is still subpar. I have been lucky enough to try thousands of beers from around the world over the last 10 years, so when I go out of state I pull out the phone to try to make the best choices on beers I have never tried before. I will say that 9 times out of 10 the beers with the higher ratings are actually good beers, but that doesn’t mean I don’t choose local offerings that I will likely never have another chance to pick up. When I visit a beer store, I want to make sure I’m getting the best possible beers they have and not settle for mediocre brews, when my opportunities to grab quality craft beers are so few and far between.
Case in point: I just drank a Bam Biere from Jolly Pumpkin. BeerAdvocate rates it at 88. RateBeer has it at 96. Untappd puts it at a 3.67-cap average. Hopefully, as the Cicerone program expands, we can get to some sort of unified ratings system similar to the way the wine industry does theirs.