A Blog by Jonathan Low

 

Jul 1, 2015

Did Nielsen Kill the Radio Star?

Metrics can kill and bad metrics can kill faster. JL

Carl Bialik reports in the 538 blog:

At the start of 2008, everything was going well at Chicago’s WNUA 95.5-FM. The smooth-jazz station ranked in the top five in the city in listeners age 25 to 54, and its research showed that the station had a passionate, loyal, engaged audience. “The arrows for WNUA were still pointing up,” said Rick O’Dell, a former host and program director at WNUA.
But by July, arrows for WNUA and other smooth-jazz stations started pointing down, seemingly overnight in some cases. Ratings fell, sometimes sharply — down 20 percent for WNUA from the spring period. Advertisers fled. Station owners ditched the format, at WNUA and in just about every other major market.
The gloomy headlines followed. “‘Smooth jazz’ outlets across America have been tumbling like redwoods after an earthquake,” the Chicago Tribune reported in 2010. “Smooth jazz stations already have been shuttered in most major U.S. cities,” wrote the Seattle Times in 2011. Jazz Times asked, in 2012, “Is Smooth Jazz Dead?”
That question is easy enough to answer: Yes, smooth jazz is essentially dead on major-market radio. But as to what killed it … that’s a mystery that’s bedeviled the radio industry.
The usual suspect is demographics. Advertisers buy radio like they buy television, paying more for the 25-to-54 demographic. Smooth-jazz stations struggled to attract younger listeners.
But what if smooth jazz wasn’t killed by neglect, but by bad data? Radio stations run on ratings, and Nielsen is by far the dominant ratings provider, as it is in television. It has a near-monopoly on the biggest markets in the U.S. But many in the industry are starting to wonder if Nielsen has been getting the ratings wrong.
What if people kept listening, but weren’t all being counted anymore? What if a new Nielsen counting method wasn’t working as it was intended to? That failing would hurt many stations in the ratings, but some more than others, and possibly none more than smooth jazz.
That, at least, is how an alternative theory goes. And it’s a theory that’s gaining supporters because of a new device that’s helping stations of all types regain some of the listeners they lost. All they have to do is turn it on.

The more radio stations turn on that device — a box called Voltair — the more Nielsen’s business and statistical model is at risk. Nielsen measures radio exposure using Portable People Meters — pager-like devices that detect audio codes embedded in radio signals. Arbitron, the former dominant radio ratings provider, developed the PPM devices and encoding before Nielsen bought the company in 2013. Arbitron engineers did extensive testing to make sure the audio signals weren’t audible to the human ear, but were at a frequency just below the dominant one of the audio being encoded.
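The exact PPM encoding scheme is proprietary, but the masking idea can be sketched in a few lines of Python: find the dominant frequency in a short block of audio and mix in a very faint tone just below it, so the louder program material hides the code from the ear. This is a hypothetical illustration only, not Nielsen's actual algorithm; the 200 Hz offset and the -30 dB level here are invented for the example.

import numpy as np

def embed_inaudible_tone(block, sample_rate, offset_hz=200.0, level_db=-30.0):
    # Toy sketch of masking-based embedding, NOT the real PPM scheme:
    # place a faint tone just below the block's dominant frequency so the
    # louder program audio masks it.
    spectrum = np.abs(np.fft.rfft(block))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sample_rate)
    dominant = freqs[np.argmax(spectrum)]        # strongest component in the block
    carrier = max(dominant - offset_hz, 50.0)    # sit a bit below it (assumed offset)
    t = np.arange(len(block)) / sample_rate
    amplitude = np.max(np.abs(block)) * 10 ** (level_db / 20)
    return block + amplitude * np.sin(2 * np.pi * carrier * t)

A matching decoder would then look for energy at that carrier in each block. The quieter and sparser the program material, the fainter the tone has to be to stay inaudible, and the harder it becomes to detect.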
Rather than trusting panelists to remember and record their radio listening, Arbitron claimed PPMs could pick up exactly what they’re exposed to — whether it be the radio station they chose to listen to in the car or the classic rock blaring from the next cubicle.
This was what was supposed to make Arbitron’s technology better suited for the 21st century compared with the old method, in which people had to recall what they’d listened to and enter it into diaries. Arbitron sold the technique as a way to help radio stations accurately measure their audience, producing numbers their advertisers would believe when paying for placement. When The New York Times Magazine reported on the new meters in 2005, it wrote, “In all likelihood, the Houston trial will show that people are exposed to far more media and advertising than they think, or remember.”
Radio ratings determine how profitable different radio genres are, which in turn determines what commercial stations play and what options you have when you tune your radio. If ratings aren’t accurate, and haven’t been since the introduction of PPMs, that could have cost the industry billions of dollars by artificially shrinking its audience — an audience that remains immense but is listening far less than it used to, according to Nielsen’s numbers. It could also mean that Nielsen was driving stations toward formats that weren’t really what their listeners wanted to hear, or that they could hear at least as easily online — while hurting other formats, costing the jobs of radio professionals who specialized in them.
Some types of radio, and the people who worked in them, might have lost out because PPMs weren’t listening, not because people weren’t. “What a shame to not play certain songs or hire certain talent because they don’t ‘encode,'” said Valerie Geller, a radio consultant. “That’s nuts.”
Sixteen smooth jazz stations in PPM markets changed to formats such as rock and sports soon after the meters were introduced, according to radio consultant Richard Harker, a longtime PPM critic. There were 101 stations playing jazz of all types in all markets in April 2015, down from 159 nine years earlier, according to data provided by InsideRadio.com. “No other format had this level of defection,” Harker said in a Skype interview.
Smooth jazz may have suffered more than other types of radio because its soft sound didn’t leave much room under which to tuck audio signals while keeping them inaudible, Harker and others say. “It’s a format that’s played in the background,” Harker said. “It’s a format that doesn’t necessarily have a lot of energy in that critical band that encoding signals are embedded in.”
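Harker's point about "energy in that critical band" can be made concrete with a rough measurement: how much of a program block's spectral energy falls inside the band where the code supposedly lives. The band edges below are placeholders invented for illustration, since Nielsen does not publish the real ones; a quiet, sparse smooth-jazz mix would score low, leaving little for an inaudible code to hide behind.

import numpy as np

def code_band_energy_fraction(block, sample_rate, band=(1000.0, 3000.0)):
    # Fraction of the block's spectral energy inside an assumed code band.
    # Low values mean little masking material for a hidden code.
    power = np.abs(np.fft.rfft(block)) ** 2
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = power.sum()
    return power[in_band].sum() / total if total > 0 else 0.0

Comparing this figure for a dense rock mix and a soft piano-and-sax bed would show the kind of gap Harker describes.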
“Smooth jazz was at the edge of a cliff,” O’Dell, who in November 2013 launched SmoothJazzChicago.net, said by email. “The Portable People Meter could have helped pull the format back or push it over. It turns out PPM gave it a swift kick right over the edge.”

PPMs were created, in part, to turn radio-audience measurement into a science fit for the digital age. Radio is inherently difficult to measure, Michael Harrison, editor and publisher of the talk-radio information website Talkers.com, said in an email, and that encourages programmers to game the measurement. “There are no tickets being sold, seats being filled or physical products being shipped,” he said. “Effective programmers have always tried to beat the system by playing to the way in which the game was scored.”
For instance, during the diary era, because Arbitron used to credit stations for quarter-hours of listening only when listeners tuned in to the first five minutes, “programmers would make sure never to break on the quarter hour — but, rather, at least five minutes into it,” Harrison said.
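A toy version of that diary-era rule makes the incentive obvious. The sketch below encodes only the article's description (a quarter-hour is credited when the listener is tuned in during its first five minutes); the real Arbitron crediting rules were more detailed.

def credited_quarter_hours(tune_in_minute, tune_out_minute):
    # Toy crediting rule as described above: a quarter-hour counts only if
    # the listener was tuned in during its first five minutes.
    # Times are minutes past the hour.
    credited = []
    for qh_start in (0, 15, 30, 45):
        credit_window_end = qh_start + 5
        if tune_in_minute < credit_window_end and tune_out_minute > qh_start:
            credited.append(qh_start)
    return credited

# A break that starts exactly at :15 invites listeners to tune away before
# the :15-:20 credit window is earned; holding the break until :20 or later
# locks in that quarter-hour first.
print(credited_quarter_hours(0, 15))   # [0] -- the :15 quarter-hour is lost
print(credited_quarter_hours(0, 21))   # [0, 15] -- both credited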
There were still questions, though, about whether PPMs would work: Would panelists remember to take the meters off their charger when they got up in the morning? Would the sample sizes be big enough to measure niche stations? Should ambient sound from the next cubicle or apartment count as much to advertisers as a program listeners chose?
There was also, early on, evidence that the devices missed a lot of potential listening. Radio Joint Audience Research, or Rajar, which is responsible for radio measurement in the U.K., tested PPMs and other, similar devices in a north London hotel in November 2004. People holding the meters walked through rooms with various levels and types of recorded background noise, such as from cars or shopping centers. A person in each room rated the ease of hearing, from 0 — optimal listening conditions — to 10, the most difficult. PPMs didn’t count 41 percent of exposures overall, which could be acceptable if they were missing only the exposures where the meter carrier probably didn’t hear the radio. But for level 1, which was nearly optimal listening conditions, PPMs picked up just 79 percent of exposures. For level 2, they picked up just 71 percent.
“One of the prices you pay for granularity and more data is that a lot of your listening ends up just disappearing,” Rajar Chief Executive Jerry Hill said in a telephone interview.
From the start of development of the new technology, Arbitron engineers were trying to balance sound fidelity with the potential for meters to not pick up the code. “We had concern about various types of formats,” a former Arbitron engineer said in an email. “It was thought that classical and some talk formats would be challenging to encode and possibly decode reliably.”
As markets switched to PPMs, their audience numbers fell.
Arbitron told stations it would try to persuade advertisers to pay more for a rating point because ratings were now harder to come by — a 0.7 rating now was worth roughly 1 rating point before. It didn’t work. The average time people spent listening to radio declined in major markets. Radio’s ad dollars, and its share of all ad dollars, fell accordingly, according to the Interactive Advertising Bureau.
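The arithmetic behind Arbitron's pitch is simple: if a 0.7 PPM rating corresponds to roughly 1.0 diary-era rating point, the price of a point has to rise by about 1/0.7, or roughly 43 percent, just to keep a station's revenue for the same real audience flat. A quick sketch of that equivalence (the dollar figure is made up for the example):

def equivalent_ppm_cost_per_point(diary_cost_per_point, ppm_per_diary_point=0.7):
    # If 0.7 PPM rating points roughly equal 1.0 diary rating point, this is
    # what a PPM point would have to sell for to keep revenue level.
    return diary_cost_per_point / ppm_per_diary_point

print(equivalent_ppm_cost_per_point(500.0))  # ~714.3, about a 43% higher price per point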
Broadcasters had doubts about the new technology that was being used to measure their audience, but no good way to test it for themselves. PPMs picked up raw data and transmitted it to Nielsen for editing and interpretation. Only the company and its auditors could get their hands on the devices and test how and whether PPMs worked. The process felt to many broadcasters like a black box.

But now a silver box is testing Nielsen’s credibility. It’s that device I mentioned earlier, the one called Voltair. Its maker says it’s being used by more than 600 radio stations nationwide — including at least one in each PPM market. It costs $15,000 and promises to more than pay for itself by improving the encoding process that stations use to mark their audio. The maker claims that stations that use Voltair will be picked up by more PPMs and see their ratings increase. And, according to one radio-company executive I talked to whose company has used Voltair since February, it really works — spurring ratings increases in certain demographics by 20 percent to 80 percent.
Others agree. According to its maker, 25-Seven Systems, no one has asked for a refund and several broadcasters have come back for second or third orders. Two stations shared data with the consulting company Harker Research, leading Harker, the company’s senior partner, to write, “We can say with confidence that the two stations’ gains were a direct result of Voltair.”
Dick Harlow was hearing Voltair testimonials at the same time he was considering dropping Nielsen. Harlow is the vice president and general manager of Dick Broadcasting Co. Inc., which owns two stations in Greensboro and Winston-Salem, North Carolina. Harlow said he was paying more for Nielsen ratings than for any budget item other than personnel. He decided to drop Nielsen, opting for competitor Eastlan instead. But his company’s stations will still appear in the Nielsen ratings, because some of his competitors continue to pay Nielsen to measure the market.
DBC is using some of the savings on Nielsen to buy a Voltair box, which is expected to arrive next month. Harlow decided to buy one when he saw a competitor’s ratings increase after it got a Voltair device. “That’s where we said, ‘Wait a second,'” Harlow said in an interview. “Number one, this isn’t right, but number two, we can’t ignore it.”
“When you see those kinds of gains from your competitors, it’s a pretty strong incentive to buy one yourself and then scream at Nielsen,” Harker said.
To the chagrin of radio-station managers and others in the industry, Nielsen won’t comment beyond a statement that it “is evaluating and testing the Voltair product,” working with the manufacturer and with the Media Rating Council (MRC), which accredits its measurement methods. “Until we’ve completed our analysis, Nielsen cannot endorse use of the Voltair product.”
“The PPM generally works,” said George Ivie, executive director of the MRC, in a telephone interview. “I mean, it generally works pretty well.” He said the MRC was studying Voltair. “If it turns out that Voltair is creating credit where credit is due, then we need to fix the system,” Ivie said.
Numeris, Nielsen’s nonprofit counterpart in Canada, also uses PPMs, and it asked Canadian broadcasters to stop using Voltair as of June 14 while it tested the technology. Jim MacLeod, president and CEO of Numeris, said in a telephone interview that about a dozen stations in Canada were using Voltair. “If one station puts it in and one does not, how does that contribute to a level playing field?” he asked.
In a statement, representatives from the Telos Alliance, a group of brands that includes Voltair manufacturer 25-Seven, called Numeris’s move a “power grab.” “What’s next? Search warrants on your equipment racks? Processing security police?” the company’s representatives said. “Shouldn’t Nielsen-Numeris back down and show some transparency before it costs radio more millions in lost revenue?”
Voltair users are reticent, careful not to publicly provoke Nielsen nor to tip off competitors. The one I spoke to asked for anonymity, as did the two who shared data with Harker.
Sean Hannity led a remarkable panel at a June 12 talk-radio conference in New York about the industry’s measurement problem. Hannity said he’d stayed up until 2 a.m. the morning of the conference reading Harker’s blog. He pressed panelist Jon Miller of Nielsen about Voltair. “Nielsen realizes how big of a deal this is,” Miller said. “We want to be as thorough as possible, as rigorous as possible.” He said Nielsen plans to “share our viewpoint” in the coming weeks. “This is our lives, this is our careers here,” Hannity said. “This is a big deal.”
Exposing the industry’s measurement problems could backfire. Sure, the outcome could be more accurate, and higher, ratings. Or it could be that the industry’s credibility as a measurable medium is undermined, making Nielsen and radio look unfit for the data-rich digital age. What if advertisers look at the mess and say, “Get me Spotify on the line”?
It’s worth the risk, said the radio executive who has used Voltair since February. “When you’re buying on a flawed system, nobody’s winning.”
O’Dell incorporates Voltair into a radio programming course he teaches at Columbia College in Chicago. He likens it to steroid use in baseball in the 1990s, pointing out many parallels: a belief they work, without much hard data to prove it; users won’t say they’re using it, let alone what effects they see; and competitive pressure on non-users to get with the program. “Sure, they might not work, but you just can’t take the chance,” O’Dell said.
Even if Voltair increases ratings, it might not mean PPMs killed smooth jazz. Some in the industry blame other factors such as demographics for most of the decline, which has continued since the introduction of PPMs.
O’Dell is no longer in a position to test Voltair, with no spot on the radio dial. Can Voltair help smooth jazz? “Sadly, we’ll probably never find out,” O’Dell said. “There aren’t any major market stations left playing smooth jazz to allow us to do a proper test.”
