Controversial post - points are a useful lie.
Posted: Sun Nov 01, 2015 10:37 pm
So, I’ll just go ahead and put this out there. There’s SO much talk about scores and points and numbers in this activity. I have to wonder – why does anyone care what these judges think?
I visited one of the more successful SoCal bands a few years ago – I guess it was on a Monday following a performance, and the director seemed very pleased with, in fact almost obsessed with, the numbers his band was putting up. In fact it was all he could talk about to his kids – this number, that number, all the scores and rankings and all that junk. Like, dude. You’re going to be at championships, and you’re going to be in the finals, and you’re going to be in the top 5. Shut up about your numbers already. And it really wasn’t all that much different at a follow-up visit several months later.
Nowhere was there any mention of how the performance could have been improved, or where they did well and where they did less so, or how to address these things and why to approach certain performance challenges or practice/rehearsal objectives in particular ways – just numbers. It was all he could talk about, and I thought it was really bizarre. I know he did the work, and so did his band, because they play and move really, really well, but everything seemed to be about getting more points.
These scores. How on earth can anyone take these things seriously? If you’ve got two bands, and one wins by 0.1 points, or 0.5 points, or even 2 points, how is that not just statistical noise? How is an 85.6 any different – qualitatively – from an 85.7? Hell, how is an 86 any different from an 88? Not only that, but a single judge could deep-six a band if he wanted to, just because of the way the point system works – maybe it's unlikely to happen, but the fact that it's possible at all shows that this ranking system is broken.
If you don't know how it works, the judges basically evaluate the band and then assign a number that describes how well the band did within some particular category - like individual musicianship (which apparently means, are individual players handling their parts well). A bunch of these numbers for different categories get added up, and that determines your score, which then determines your ranking. But in an entirely subjective field, with judging panels that have little or no knowledge of the works they're evaluating, is a fine-grained evaluation system like this really the way to go?
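To make the noise argument concrete, here's a back-of-the-envelope simulation of that "sum up category numbers" process. The category names, the 0-20 per-category scale, and the 1.5-point judge noise are all my own assumptions, not any circuit's actual sheet - the point is just what happens to a 1-point total gap once you admit that each judge's number has some wobble in it:

```python
import random

# Hypothetical judging captions -- real sheets vary by circuit.
CATEGORIES = ["music individual", "music ensemble", "visual individual",
              "visual ensemble", "general effect"]

def judged_score(true_quality, noise=1.5):
    """Sum per-category numbers, each nudged by assumed judge-to-judge noise.

    true_quality: the band's 'real' level in each category (0-20 scale, say).
    noise: assumed standard deviation of a judge's number, in points.
    """
    return sum(random.gauss(q, noise) for q in true_quality)

random.seed(1)
band_a = [17.2] * 5   # "truly" an 86.0 in total
band_b = [17.0] * 5   # "truly" an 85.0 in total -- a 1-point gap

# How often does the "weaker" band out-score the "stronger" one anyway?
flips = sum(judged_score(band_b) > judged_score(band_a) for _ in range(10_000))
print(f"band B beats band A in {flips / 100:.1f}% of simulated runs")
```

With these assumptions the "wrong" band wins a large fraction of the time - which is the sense in which a 0.1, or even a 1-point, margin is noise rather than signal.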
The fact is, this ranking system is basically nothing more than throwing numbers at a wall within some very loose parameters and then crunching them in some way. I wouldn't mind so much if they didn't then pretend that the numbers imply some sort of objectivity and therefore validity to the process.
The whole BAND XYZ WINS SWEEPSTAKES BY 0.05 POINTS! thing is misleading at best and a lie at worst. I think we need to rethink the way we evaluate bands and rank them in competition, and we can start by eliminating the concept of scoring.
It’s a detriment to the activity that people chain themselves to this sort of thing.