SA’s best sauvignons? I don’t think so…

There might well be good reasons for having the huge panel tastings favoured by Wine magazine and the various competitions, but we should be immensely cautious about the claims made for the results. The latest Sauvignon Blanc competition’s list of winners is more or less plausible – but are these really “South Africa’s top sauvignon blancs”, as claimed? (The full results are available on the Wine mag website.)

Consider, firstly, the following possible top ten (in alphabetical order) – surely even more plausible, going by track record and reputation, than the actual winners, wouldn’t you think?

  • Ataraxia 2010
  • Cape Point 2009
  • Chamonix Reserve 2009
  • Constantia Glen 2009
  • Neil Ellis 2009
  • Quoin Rock The Nicobar 2009
  • Reyneke Reserve White 2009
  • Springfield Life from Stone 2010
  • Vergelegen Reserve 2009
  • Waterkloof 2009

In fact none of the above ten were even entered in the Wine magazine line-up (unless they were among the unspecified five wines that achieved no rating – on which note, I wonder why those also-rans were not named: there didn’t use to be this coyness. Perhaps it’s the thin end of a wedge, and soon we’ll be given only the high scorers, as happens in most competitions to save the organisers and the producers embarrassment). In the absence of wines like these, claims about what is best should be made rather more modestly.

Even more significantly, what about the following as a plausible list of winners?

  • Boschendal Reserve Collection 2009
  • Diemersdal Eight Rows 2009
  • Durbanville Hills Biesjes Craal 2010
  • Fleur du Cap Unfiltered Limited Release 2010
  • Fryer’s Cove Bamboes Bay 2009
  • Iona 2010
  • Lomond Sugarbush 2010
  • Oak Valley 2009
  • Southern Right 2010
  • Waterford 2009

All of these well-known examples were relegated by the panel to the bottom end of the rankings – to three stars or, in most cases, fewer. If the panel couldn’t recognise the virtues of such wines (it’s pretty safe to assume that most of them, at least, were up to their usual standard), then why should we trust that they got things right at the top end? What basis has the panel given us for accepting its judgement that five-star Anura is “superlative, top class”, while one-star Iona is merely “acceptable, ordinary”?

Should Thys Louw of Diemersdal be scratching his head and wondering how he can get things so wrong that, of his five wines entered, the two cheapies were rated superior to two of his more serious wines? No, I suspect that he’ll do as most producers do and realise that the results are basically a load of nonsense.

Should the team at Fryer’s Cove, makers of some of the country’s most interesting sauvignons, wonder where they’re going wrong? No – they’ll realise that their consistently bad showing in Wine mag’s competition this year stems from two things. Firstly, tasters, however talented, are not good at picking up subtle styles in a huge line-up of blind-tasted wines; secondly, tasters, however talented, cannot unerringly identify quality in such a line-up. Quality and subtlety simply do not reveal themselves in these circumstances. The best one can hope for is plausibility and a slightly better-than-random result. Winemakers know this only too well, and it’s perpetrating a fraud on consumers to pretend otherwise.

You’ve got to love the way the magazine’s introduction to the results talks about how the panel decided to reward such-and-such characters and spurn others – when, frankly, it looks like the usual set of results, which could largely have been achieved by drawing lots.

Let’s have such tastings by all means – they provide the panel with innocent amusement (innocent, that is, as long as they don’t drive themselves home afterwards!) – but let’s be clear that they seldom tell consumers much that’s useful.
