SCSBOA judging question

Topics and polls that cover the overall marching band activity

Moderators: malletphreak, Hostrauser, instrumental director, Trumpet Man 05

Does SCSBOA need to change their scoring system?

Poll ended at Tue Dec 09, 2003 2:59 pm

B. Yes
C. Or make a couple of adjustments
Total votes: 41

New Recruit
Posts: 19
Joined: Wed Nov 20, 2002 1:27 am
Location: southern california

Post by Sam » Tue Nov 11, 2003 12:43 pm

Looking at the 6A division, it seems that the judging has been consistent in the rankings over the weeks. But if you look at the other divisions, you can see inconsistencies. I guess it depends on what tournament you go to and which judges are on the panel. There are some scores that have changed over 4 points from one week to the next, and that goes in both directions. An example would be Serrano, who went from a 72 to a 76 and back down to a 70. Or how about Claremont, who stayed around the high 70's, scored an 82 at one show, and then was back down in the high 70's the next week. That type of fluctuation does not help the "Championship" grid. I am not taking anything away from these schools; I am just trying to show a couple of examples in the scoring. I could go up and down the list with more examples. How much does personal preference play into the SCSBOA scoring? Having read the rubric on the back of the sheets, I thought the judges were supposed to judge on what you have and how you do it, not on their personal feelings toward a certain type of music or marching style.

Section Leader
Posts: 937
Joined: Mon Nov 11, 2002 7:04 pm


Post by altohack » Tue Nov 11, 2003 1:26 pm

Maybe the kids just had a bad day.
It was freezing at Chino, and I'm sure I wouldn't be able to play my best in those conditions.
No regrets

Posts: 426
Joined: Mon Nov 11, 2002 10:43 pm
Location: Cypress, CA


Post by Teever » Tue Nov 11, 2003 9:14 pm

While it's true that the scoring criteria are all objective, it is impossible for any judge, under any conditions, in any association, to completely eliminate the subjectivity of his musical background, training, and beliefs. By that I don't mean that a judge's personal "opinion" should ever affect their evaluations, but that their interpretation of how a group meets the objective criteria will always be shaped by the eyes they see things through.

Regarding score spreads (again!), a variance from a 72 to a 76 to a 70, or from the high 70's to the low 80's and back, would be consistent with a group that was always in the same criterion range (i.e., always a high "box 3", etc.). Quite to the contrary of your point, it shows a great amount of consistency in the evaluation process! Keep in mind also that the size of a show and the divisions within a show affect the amount of usable numbers available, and that also produces scoring anomalies. A typical example of this might be a show that has a large 6A and 4A division: the 4A numbers tend to get "squeezed" because some adjudicators are anticipating needing some of the higher numbers later and want to avoid ties (which should be their #1 concern if their priorities are correct).

The use of proper numerical spreads, keeping track of point totals within your caption, and the avoidance of subcaption ties would help this greatly, and I wish the association would reinforce this concept much more strongly. Case in point: at a recent competition, there were only two 6A bands, and one judge, while giving different subcaption scores, ended up giving both bands the same total, thus negating his scores' #1 purpose for those two groups entirely! I have seen recaps with several caption ties within divisions as well, and it is just embarrassing to be part of a panel when this happens that often. A good topic for conference, I suppose.

Ryan, I think you're right on target with your parade tape response - it would really slow things down tremendously! I have had to do tapes on the jr. high groups a little bit north of us, however, and it is kind of fun, but somewhat difficult to do commentary and follow the band down the route! (reminds me of on-the-field judging).

New Recruit
Posts: 10
Joined: Sun Nov 09, 2003 4:38 pm
Location: So Cal

Scoring should vary with the weather

Post by guardlady » Tue Nov 11, 2003 9:19 pm

An example of how weather can affect the score:

The group I work with performed at Hacienda Heights and it was dreadfully hot. The side effect was that the kids were dropping like flies even though we kept pushing the water. The performance lacked energy and there were lots of mistakes, technique basically went out the window. Score= 76.x

Then this weekend at Capo, we had optimum weather. The show went sooooo much better. The band sounded great, with tons of energy, and the musical quality was so much better. The guard was tons better and really put out a great visual effect. Way fewer mistakes and much better recovery. So you would think the score would go up, even a little. Great performance translates to 76.x.

So as a guard instructor, and a musician going back to school to become a band director, I have really struggled with this. At this point, I have realized that I have to depend on the sheets, and that there must have been something the judges were seeing that we didn't. Also, improvement like that is hard to quantify, so they put out the number they thought fit what they saw. I am trying to learn everything that I can over the next few seasons, so that I can be more prepared to be a director.

So any other insights into these events would be great.

SCHS Guard Instructor
"It's just music without the Guard!"

New Recruit
Posts: 62
Joined: Sun Oct 12, 2003 8:27 pm
Location: Murrieta

Post by Coach » Tue Nov 11, 2003 10:00 pm

As a former performer and current educator, I have had much time to evaluate my thoughts on the association. I personally see no problem with the current judging system. We all got into this activity for one reason: MUSIC. With much respect, I appreciate the fact that SCSBOA gives so much attention (50%) to the musical aspect of its judging criteria. We all need to look at the roots and reevaluate why we decided to participate in this great activity. You did not decide to start playing trumpet or drums because you thought marching was cool. Otherwise, there is ROTC.

Over the years, there is one thing that concerns me more and more with this association every year. It has nothing to do with the judging criteria but with the judges themselves. I do know that prospective judges need to be referred to the association by another judge in order to be considered for the panel. The problem I see is that more discretion needs to be used in this selection process. Without getting too deep into the matter, a director who cannot coordinate and produce a quality music program of their own does not deserve to be on a panel of adjudicators offering constructive criticism. Try experimenting on your own group to see if things work, rather than using others as guinea pigs for your experiments. If you are, save the tape for someone else.

Once again this is from my own experience and does not reflect the ideas of others.
Be the Best YOU can be,
Because YOU can!

Drum Major
Posts: 1668
Joined: Sun Nov 10, 2002 4:22 pm

Re: Ties

Post by dr » Tue Nov 11, 2003 10:41 pm

Teever wrote: Case in point: at a recent competition, there were only two 6A bands, and one judge, while giving different subcaption scores, ended up giving both bands the same total, thus negating his scores' #1 purpose for those two groups entirely!
I guess I'm confused. Your opinion about eliminating ties seems rampant in SCSBOA, though, so I guess I am in the minority. If the judge gave honest, thoughtful, and candid consideration to his/her subcaption scores and they came out a tie, then why ISN'T it a tie? Are you suggesting that one subcategory, already assigned a pre-determined weighting toward the final score, should be deemed after the fact to be more relevant than the others (e.g., the award goes to the higher music score)? Or are you suggesting the judge should reevaluate his/her subcaption scores and change them so there is no tie? I've seen that happen. Even in the Olympics there can be a tie, and those results are usually determined by a single measurement. As a tabulator and a mathematician, it seems very odd to me that people have decided that a final score totalled from three to six different judges should NEVER be a tie. Ties happen, and the "solutions" I've seen appear arbitrary. I realize a tie can be disappointing to the competitor who wants to know who is rated the best, but I thought this was supposed to be an educational experience, not a sport. We can't share a title?

Sorry, I'll go back to lurking now. That's a sore subject with me.

Posts: 426
Joined: Mon Nov 11, 2002 10:43 pm
Location: Cypress, CA


Post by Teever » Fri Nov 14, 2003 2:51 pm

Don't worry Dennis, I've heard that comes with age!

My point wasn't so much that ties should NEVER occur as that they should be avoided as much as possible. All too often, I've sat next to someone who just wouldn't make the call, or didn't have the cojones to use an appropriate spread.

BTW, regarding the Olympics and scoring, they have an interesting process (at least in some events, like skating): after the first competitor, they stop and "adjust" the scores. They look at the numbers given, and if any one judge seems way off (i.e., everyone else gave a 9 or an 8 and they gave a 5), the head official has that judge adjust their number and base the rest of their scores in relation to the new one!
