How Good is Your Homebrew? Get the Best Feedback on your Beer
How good is your homebrew? How would you know? Some homebrewers just want to know if they’re truly making good beer, and how to improve it. Others would like to know how their brews stack up against some sort of average. In this article we’ll examine some ways to approach answering these questions, including the advantages and the limitations of each approach. I’ll categorize these approaches as self assessment, friends’ feedback, competition feedback, and competition results analysis. This article will not be so much about how to assess your beer, but more about who should assess it.
Self Assessment
Perhaps the most obvious method of evaluating a beer is by using self assessment. Taste your own brews and decide if you like them, using whatever criteria you want, e.g. comparison to commercial examples, or Beer Judge Certification Program (BJCP) guidelines, or your personal preferences. That sounds simple enough. But is it objective? In 2014, I polled 100 homebrewers on a popular homebrewing forum, and of those who felt they had enough information to form an opinion, 69% believed their homebrew was better than the average homebrew at large, 25% believed their homebrew was about average, and only 6% believed their homebrew was below average. Assuming those who responded represent a sample even remotely close to random, it would be fair to say that many homebrewers have a somewhat higher opinion of their own beers than is warranted. That’s not necessarily a bad thing. Homebrewing is largely a personal pursuit, and a brewer who enjoys the fruit of his or her efforts can be a happy hobbyist. At this point, some readers are probably feeling a little uncomfortable, or proclaiming “But I’m my own worst critic” in their heads. Good! But I encourage you to read on.
One limitation to the self assessment approach is that there are very often flaws that you are not particularly sensitive to, or haven’t trained yourself to recognize. For example, the taste threshold for diacetyl — a buttery or butterscotch flavor that can be produced by yeast or by bacterial infection — varies from person to person. If you are not particularly sensitive to a flavor, it will be a potential blind spot in your evaluations. Another, more serious, limitation to self assessment is that it can lack objectivity. In the wine world there’s a term, “cellar blindness,” that refers to the tendency to overlook the flaws in the wines in one’s own cellar. Homebrewers can be susceptible to a similar phenomenon when evaluating their own beers. So if self assessment isn’t enough, the obvious next step is other people’s assessment.
Friends’ Feedback
Having friends taste and evaluate your beer can provide useful information, provided the friends have good palates, a good beer tasting vocabulary, and are capable of brutal honesty. For tasting purposes, I’m defining friends as anyone you know: Close friends, family, acquaintances, the guys in your homebrew club, or even a professional brewer at your local craft brewpub. An advantage to getting feedback from friends is that it puts additional palates — beyond your own — on the job. If you’re hearing similar specific details, particularly concerning flaws, from multiple sources, those are probably valid comments worth considering.
There is, however, a major limitation to this approach: the tendency for tasters to avoid negative comments when providing feedback to someone they know. My friends love my beers, and rarely say anything negative about them. The same friends also loved my beers when I was a new brewer and probably making a lot more mistakes. While my early brews were not terrible (in my opinion!), they were also not great. In retrospect I’m fairly certain I was (and am) not getting unfiltered feedback from most of the people I know. It’s equivalent to my mom telling me I’m handsome. Also, some tasters have better palates and more training than others, and it can be hard to know whose opinion deserves more weight.
Competition Feedback
Entering your brews in competitions and receiving feedback, in the form of comments on the score sheets, is a method of assessment that overcomes some of the limitations of the previous methods. Because the judges don’t know who you are, they are unbiased. And they are less likely to sugarcoat their comments, because not only do they not know who you are, it’s also their job to identify and describe flaws. In the case of BJCP certified judges, they have been trained in the BJCP guidelines and have passed tests proving their tasting and assessment skills. This does not mean that judges can’t make mistakes. They can and do. Beer judges are only human, and things like palate fatigue or being assigned to judge a category they don’t like (something that competition organizers are supposed to avoid) can lead to poor judging. Also, if you enter competitions that are mostly local, you can end up having evaluations by the same judges — with the same biases and blind spots — at two different events. However, if you believe that good beers largely tend to score well overall and that poor beers largely tend to score poorly, then written feedback from judges can serve you very well if you’re looking for an honest assessment.

As with friends’ assessments, if you are receiving similar comments from two or more competitions, those are comments worth considering. Note: I said “two or more competitions” and not “two or more judges.” In most competitions, a beer will be judged by two (sometimes three) judges in the flight in which it is entered. These judges are not working in isolation, and the scores and comments will tend to be similar between the two sheets. Thus any mistakes (errors in judgment) are likely to appear on both sheets. On the other hand, when judges from two (or more) different competitions are telling you the same thing, it’s a much safer bet that there is something to it, and there is a true flaw to address in your recipe or process.
Also, two competitions is the smallest possible sample size; if you really want to get a broad range of opinions and feedback, enter more competitions.
Naturally, there are limitations. The first has already been hinted at: the feedback from a single competition may or may not be useful without confirmation from a second (or third, or fourth, and so on). You can mitigate this somewhat by tasting a sample of the same beer while reviewing a score sheet, and I wholeheartedly recommend this. But see the self assessment section earlier to understand its limitations. Second, reviewing judges’ comments, even from multiple competitions, probably will not provide a very good sense of where your skills stand overall as a homebrewer, unless perhaps every one of your entries is winning ribbons or every comment is glowing. So, recalling the survey discussed earlier, how can you possibly determine if your beers are above average? That’s where the next method comes in.
Competition Results Analysis
You don’t have to believe that judges “get it right” every time, only that they tend to get it right more often than not, in the sense that good beers tend to score higher than beers that are not as good. Take constructive feedback, particularly when it’s from two or more independent sources, and use it to modify your recipes and processes. If the changes are successful in improving your beer, you should see your ratio of successful entries (those that place or score well) to total entries begin to increase. You may want to compare ratios from different periods. For example, compute them month by month (if you enter a lot of competitions) or year by year. Or compute them for entries before and after you made a major change in your brewing process, like switching from mashing in a converted cooler to using a RIMS system, for example.
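For brewers who like to keep records, the period-by-period comparison above is easy to automate. Here is a minimal sketch — the entry log, the years, and the definition of a “successful” entry (one that placed or scored well, by whatever threshold you choose) are all hypothetical, not from the article:

```python
from collections import defaultdict

# Hypothetical competition log: (period, succeeded), where "succeeded"
# means the entry placed or scored well by your chosen threshold.
entry_log = [
    ("2013", False), ("2013", False), ("2013", True),
    ("2014", False), ("2014", True), ("2014", True), ("2014", True),
]

def success_ratios(log):
    """Return {period: successes / total entries} for each period."""
    totals = defaultdict(int)
    successes = defaultdict(int)
    for period, succeeded in log:
        totals[period] += 1
        successes[period] += succeeded  # True counts as 1
    return {p: successes[p] / totals[p] for p in sorted(totals)}

ratios = success_ratios(entry_log)
for period, ratio in ratios.items():
    print(f"{period}: {ratio:.0%} of entries succeeded")
```

The same function works for before/after comparisons: just label each entry “pre-RIMS” or “post-RIMS” instead of by year. With small entry counts, treat period-to-period swings with caution — two competitions is, as noted, the smallest possible sample size.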
In summary, you can improve your beers by looking for common feedback from multiple independent sources, and making changes based on that feedback. Don’t completely discount comments from friends, but unbiased feedback can be more valuable. Entering competitions — the more the better — is the best way to get unbiased feedback.
Photo by Paul Peng Wang