The great internet radio swindle
All media, online or offline, survive and trade on credibility. Trust is everything, and it is possibly more of a factor in a media business than in any other type of business. Lose this trust and you close your media business down. Rupert Murdoch closed down one of the largest newspapers in the world when trust was broken. It gets that bad.
You lie about your numbers, and it’s over. Media companies sell audience to advertisers, and if they are paying money for a phantom audience, it’s fraud.
That’s why, I guess, there has been such a fuss (because some have been wondering: what’s the fuss?) over the true listener numbers of internet radio in South Africa. There have been some recent runaway successes and quite a bit of hype in the sector, with many heavy hitters playing in the space.
But has a fraud been committed? Unlikely. What is more likely is that we’re witnessing confusion and breathtaking misinformation from just about every player in this nasty little spat so far: from the whistle-blower (Shaun Dewberry, the blogger/techie who broke the story), to the service provider of the audio streams (a company called NetDynamix), to the internet radio stations themselves (there are several of them). None of these parties has actually understood what is going on here, nor have they been talking to each other (lots of shouting and posturing, though).
History repeats itself
It’s not the first time we’ve seen this on the internet. In the late 90s, major, reputable websites reported their numbers chaotically; the situation wasn’t too different from the current position of South Africa’s internet radio stations. It got so bad that it precipitated a breakup and split from the Audit Bureau of Circulation, or ABC (the online arm was known as ABCe). To sort it all out, the online media industry decided to take matters into its own hands and promptly formed the Online Publishers Association (OPA), later relaunched as the DMMA to include digital agencies.
What we are seeing in this internet radio saga is almost a mirror image of what the early online publishers went through in the 90s when trying to work out a measurement standard we could all trust. Back then the dominant form of measurement was a site’s page impressions. A unique browser, roughly equivalent to a reader, was unheard of (*see glossary of measurement terms below).
The figures and numbers that the top sites were reporting became ridiculous, so ridiculous that the number one site in the country, called Africam, was merely the largest by virtue of an automatically refreshing page that served pictures from a camera in a game reserve overlooking a watering hole. The second biggest site was a site called iafrica.com. It was then massive because its online chat facility (which no longer exists) was a refreshing HTML page. That’s how fraught the industry was. No one’s fault, just mass confusion in the absence of a standard and an independent auditing body.
And the online publishers knew it. They knew that if advertisers were going to take them seriously, and if they were going to become proper media businesses and make revenue from online content, they needed a proper standard. So the OPA (later the DMMA) was set up to regulate these figures and create confidence in the industry. It was started by a handful of players including iafrica.com, News24, Mail & Guardian Online, MWEB, Ananzi, Avusa and others, and has grown from strength to strength.
It’s about standards, stupid
It’s been interesting to see how misinformed the players in this debacle are: from the blogger who made the accusation (that’s Dewberry), to the company that provides the audio streams (NetDynamix), to the internet radio stations, who have, I think, rather deftly passed responsibility for this saga on to their technical supplier, NetDynamix.
The concurrent user trap
Dewberry smelled a rat: the audiences that some new internet radio stations were reporting simply did not add up. Their fast-growing, magnificent numbers seemed too large. But Dewberry was only half right, and his argument was weak (his first point, which used social media followers as an indicator of traffic, proved little). In his detailed blog post, Dewberry built an argument around concurrent user numbers from the streaming provider, saying that these numbers did not add up. And he’s right, but only to an extent.
Concurrent users are an indicator, but there are reasons why a concurrent user figure will not stack up against a media entity’s daily, weekly or monthly audience figure:
1. Concurrent users are tiny compared to a monthly or weekly audience, because those audience figures are an aggregate of a media entity’s audience over the whole period (with duplication removed).
2. For example, Memeburn does about 120 concurrent users at any one time, yet its monthly readership tallies about 200 000 unique browsers (or thereabouts, according to Google Analytics). The session figure is higher, but we don’t report that.
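The gap between concurrency and monthly uniques is easy to sketch with Little’s Law: average concurrent users equal total time spent by the audience divided by the length of the period. The numbers below (visits per unique, minutes per visit) are illustrative assumptions, not measured figures — they simply show that ~120 concurrent users and ~200 000 monthly uniques are entirely consistent:

```python
# Little's Law sketch: average concurrency = total time spent / period length.
# All inputs are illustrative assumptions, not measured data.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60  # 2 592 000

def avg_concurrent_users(monthly_uniques, visits_per_unique, seconds_per_visit,
                         period_seconds=SECONDS_PER_MONTH):
    """Average concurrency implied by a monthly unique-browser figure."""
    total_time_spent = monthly_uniques * visits_per_unique * seconds_per_visit
    return total_time_spent / period_seconds

# Assumed Memeburn-scale inputs: 200 000 uniques, ~5 visits each,
# ~5 minutes (300 seconds) per visit.
concurrency = avg_concurrent_users(200_000, 5, 300)
print(round(concurrency))  # ~116 average concurrent users
```

Under those assumed habits, a site with 200 000 monthly uniques averages only around 116 people on it at any instant, which is why a small concurrency figure tells you very little about the monthly audience.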
Looks odd, but it isn’t. How do we get there? The difference is that the figure is out of our control: it is produced by an industry standard that weeds out illegitimate traffic such as bots and prevents duplication (i.e. counting a returning visitor as more than one user). The standard is applied by virtue of us using a shared stats authority (such as Effective Measure, administered by the DMMA, or Google Analytics) to calculate our figures. It’s not perfectly accurate, but it’s good enough. More importantly, everyone’s audience is calculated the same way, so we are comparing apples with apples.
It appears NetDynamix was showing user sessions, which count the number of connections a user makes to the radio station and are not an indication of unique listeners. The radio stations were inferring their monthly/weekly/daily listenership from the hourly session numbers instead of using an industry standard, resulting in massive duplication. (I can be one user and create a hundred sessions in a month, but that still means I am one user.)
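That duplication can be sketched with a hypothetical connection log keyed by a listener cookie (all identifiers below are invented for illustration): every reconnection adds a session, but deduplicating on the cookie collapses them back to unique listeners.

```python
# Sketch: why session counts overstate audience. Hypothetical stream
# connection log; each entry is (listener_cookie, timestamp). One listener
# reconnecting many times produces many sessions but stays one listener.

connection_log = [
    ("cookie-a", "2013-02-01T08:00"),
    ("cookie-a", "2013-02-01T12:30"),
    ("cookie-b", "2013-02-02T09:15"),
    ("cookie-a", "2013-02-03T08:05"),
    ("cookie-c", "2013-02-03T20:40"),
    ("cookie-b", "2013-02-04T07:55"),
]

sessions = len(connection_log)                              # every connection
unique_listeners = len({cid for cid, _ in connection_log})  # deduplicated

print(sessions, unique_listeners)  # 6 sessions, but only 3 listeners
```

Report the first number as "listeners" and the audience doubles on paper; a shared stats authority exists precisely to enforce the second kind of count.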
So for Dewberry to look at the concurrent users and then extrapolate that the larger figures reported by the stations are wrong is to misunderstand how audience numbers are reported to advertisers. But despite this, there is some merit in Dewberry’s claims.
NetDynamix the scapegoat
I feel sorry for NetDynamix. The company has become something of a scapegoat in this bar fight (is that what you call it when players engage in confused shouting?). The company is easy to hate right now because it sent a rather nasty legal letter to a blogger. When a company threatens legal action against a lone blogger who appears to be acting on his conscience, it is not going to win the hearts and minds of the internet. A public relations blunder.
But let’s put that moment of stupidity aside and look at a few facts. When Memeburn interviewed the company, it was clear to us that it did not really understand the difference between a “session” and a “unique browser”… and you know what? That’s actually OK, and it’s unreasonable to expect otherwise.
Expecting your hosting and technology supplier to also provide you with advertising-grade audience numbers, and to be the standard, is ridiculous. In fact it is very “early internet”, where no one quite understands where their expertise starts and ends. It happens all the time on the internet because the medium converges disciplines and brings them clashing and smashing together.
It’s the equivalent of the Mail & Guardian blaming its hosting company, Internet Solutions (IS), for inaccurate readership numbers because it based them on raw server stats (remember when we talked about “hits”?). It’s not really IS’s responsibility or field of care/influence/expertise, even though the company can provide raw stats (a convergence, but not quite an overlap, of expertise).
I don’t agree that it is NetDynamix’s responsibility to provide advertising-standard statistics, the equivalent of Google Analytics or the DMMA’s Effective Measure. These are entirely separate fields and businesses. NetDynamix’s response should have been to acknowledge this, then clarify its role and express the need for a standard. It actually did the latter, which was good.
So whose responsibility is this then? Let’s move to the internet radio stations themselves.
How the internet radio stations passed the buck
I’ve also watched with fascination how some of the internet radio stations have ducked responsibility, appearing to lay the blame at NetDynamix’s doorstep. NetDynamix, which provides the technical solution and some stats to go with it, is not responsible for a media entity’s business model. If a media business sells audience, it had better make sure those figures are accurate; if it is not sure, it should come up with a standard or report conservatively.
In the absence of a standard, I suspect that, at worst, some internet radio stations were happy to accept the numbers as long as they looked good, making overly optimistic interpretations of their listener numbers based on either sessions or concurrent users. Either that or they are just completely clueless, which is a distinct possibility too.
Now what?
I think we have to realise there are no bad guys here, just an absence of understanding and communication in what is a very new and pretty innovative medium (yes, audio streaming has been around for ages, but it has only recently become viable as a media business).
NetDynamix should supply audio streams and stick to that. The company should not attempt to be at the forefront of creating an auditing body, unless it thinks it can create a new business along the lines of Google Analytics, Nielsen Media or Effective Measure (and good luck with that, if that’s its decision). NetDynamix should bow out and do what it does best and, by all accounts, has been doing a great job of: supplying a solid technical solution.
The internet radio companies then need to engage an industry body, whether they find one in the digital media realm (the DMMA) or on the traditional radio side (a traditional approach may even cut it, believe it or not), to measure their listenership and restore faith in their brands.
Let’s be frank here: the two main radio stations mentioned in all this, 2OceansVibe and Ballz radio, may have dodgy-looking audience figures, but they have pretty exciting and innovative offerings. I’d like them to succeed and grow, as we all would, and they will sort out this mess by coming up with a solution the industry has faith in. (PS: firing NetDynamix won’t get you anywhere. The next solution provider is not going to give you the solution you need either.)
So can we all now just grow up and recognise the problem for what it is, then sort out a solution?
* Glossary of terms:
Unique users = an old measurement, equivalent to a reader/listener/viewer, now abandoned in the multiple-device age because it is impossible to measure unless you are Facebook (where usage depends on an authenticated state).
Unique browsers = the “new” standard, acknowledging duplication in audience owing to multiple devices/browsers, with duplication controlled via cookies over a month-long period. If a user visits 10 times in a month, he/she is counted as one user.
User session = if a visitor visits 10 times in a month, he/she is counted as 10 user sessions. A sign of activity. Not reported for advertising or business purposes, but a useful metric.
Page impression = a single loading of a page; banner ads are sold on this basis.
Hits = the number of requests the server receives when a page loads (every file — images, scripts, the page itself — counts). An indication of server activity really, not much else. If someone you know talks about “hits” in reference to a website, immediately take out a hit on his/her life.
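The glossary terms can be tied together with a toy web-server log (all entries invented for illustration; `is_page` marks full page loads, while the rest are asset requests that count only as hits):

```python
# Sketch: hits vs page impressions vs unique browsers from a toy log.
# Each row is (cookie_id, url, is_page); asset requests (images, CSS)
# count as hits but not as page impressions.

log = [
    ("cookie-a", "/home", True),
    ("cookie-a", "/logo.png", False),
    ("cookie-a", "/style.css", False),
    ("cookie-b", "/home", True),
    ("cookie-b", "/logo.png", False),
    ("cookie-a", "/news", True),
]

hits = len(log)                                    # every server request
page_impressions = sum(1 for _, _, p in log if p)  # full page loads only
unique_browsers = len({c for c, _, _ in log})      # cookie-deduplicated

print(hits, page_impressions, unique_browsers)  # 6 3 2
```

Same traffic, three very different numbers — which is exactly why it matters which one a media entity quotes to advertisers.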