In Pursuit of a Truly Objective Awards Ceremony
A recent commenter noted that by their very nature as fan-voted awards, the Singing News Fan Awards aren’t a particularly objective measure of success. He was right, as are other readers who make the same observation every year. This is a genre in which people pick favorites and then stay loyal to them for a long, long time. If we ask loyal fans for their favorites, that’s probably not the most objective measure of success.
If we seek greater objectivity, perhaps we should follow the lead of other genres and institute peer-voted industry awards. Problem is, peers can be, and often are, subjective, voting for people they personally like. So that, too, is probably not the most objective measure of success.
We could get even greater objectivity if we determined awards the way NQC determined this year’s Songwriter award—based on the Singing News Radio Airplay charts. Problem is, there’s still some level of subjectivity there; no matter how good a song is, Southern Gospel radio DJs are human, too, and prone to favoring songs from artists (and radio promoters) they personally like.
(Of course, far be it from me to say that DJs don’t give the next “Canaanland is Just in Sight” a fair shot, too. Those exceptions happen, but most songs on the charts are from established artists or at least established promoters.)
So let’s say that fan votes, peer votes, and DJ votes are all too subjective. We could get even greater objectivity if we could obtain actual airplay reports from performance rights agencies (BMI, ASCAP, SESAC, and SoundExchange). Since those reflect actual airplay across thousands if not tens of thousands of venues, they are more objective. But, even here, there is some level of subjectivity: To this day, most airplay decisions are made by human beings capable of subjective opinions—with the program directors at a few large stations sometimes having a substantial impact on the end result.
Perhaps we could base a truly objective awards ceremony on actual sales reports from Nielsen SoundScan, tracking how fans vote with their wallets. There’s only one problem: Many if not most Southern Gospel groups don’t bother reporting their sales through SoundScan, and a very significant percentage of Southern Gospel album sales take place at concert tables.
(And while those numbers are quite objective, many would argue that they aren’t the only valid indicator of success. Legend has it that the Blackwood Brothers outsold the Statesmen by 3-to-1 or more back in the groups’ glory days, but nobody would claim the Blackwood Brothers were three times as successful.)
In the end, then, it seems that the relentless pursuit of objectivity rapidly leads us into absurdity.
So let’s take a step back. Of course, as another reader will invariably point out every year, none of this matters from the vantage point of eternity. Go ahead and throw up an online poll here or there, and let an accounting firm certify that there aren’t duplicate votes if you like. Cheer on Southern Gospel’s finest once or twice a year, as they thank fans for their support and recognition. But let’s not take any of this too seriously.
(It’s not like most groups, especially the ones at the top, take it all that seriously themselves.)