Tuesday

Local Measurement Takes a Giant Step Forward. Interview with Nielsen's Kelly Abcarian



With the recent announcement of the addition of Comcast set-top box data to Nielsen’s data arsenal, there has been a dramatic expansion of Nielsen’s local measurement capabilities. Kelly Abcarian, SVP, Nielsen Product Leadership, has been on the front lines of these efforts. I had the opportunity to sit down with her and ask her the following questions:

Charlene Weisler: When did Nielsen first launch their initiative to add STB data to their local measurement and what prompted that initiative?

Kelly Abcarian: We’ve always believed in the power and strength of big data, and set-top box integration has been part of our overall strategy of incorporating large, census-like data into all of our services and products. We are changing local TV measurement across all 210 markets over the next 12 months, bringing our measurement the scale and granularity that our clients need as audiences continue to fragment across screens.

Weisler: What does Comcast add to the measurement potential if previous STB datasets have already been modeled into the measurement? Why add more?

Abcarian: With Comcast’s addition, Nielsen now has the largest set-top box data set of any supplier. We have partnerships with multiple providers that give us the breadth and depth of data across all 210 markets. For Nielsen, it is not about the number of households; once you get past a certain high threshold, the value of each additional household diminishes. It is much more about the quality of the data, the methodology and the variety of sources. What is critical for local is making sure that “locality,” or true local information, is included, not just modeled down from a national view.

Weisler: How granular will all of this data be across all 210 markets, and particularly in markets 71+?

Abcarian: Advertisers and media owners will have consistent daily electronic measurement for all markets, 365 days a year. With consistency in measurement and larger sample sizes, clients can dig deep into custom data segments or geographic areas, or go big and look across markets that match the advertiser’s media plan.

Weisler: Is there a plan to match local measurement capabilities to national - that is, daily overnights for some if not all markets? If so, when? If not, what will be offered, especially to smaller markets?

Abcarian: We will still produce daily data with a one-day delay in the markets where we do so today. We will shift to next-next-day reporting for any overnight markets with set-top box data. Former diary markets will receive data monthly, with granularity at the daily quarter-hour level. Eventually we would like to provide data within 48 hours to all 210 markets, but that will require close collaboration with our data partners.

Weisler: What is the status of code readers in diary markets to measure OTA? Will STB data be incorporated?

Abcarian: Over-the-air (OTA) homes can account for 10%-65% of a station’s audience for news and sports alone, depending on the network. And simply having a home in every zip code does not mean the ratings are representative or reflective of this growing, important viewing segment. Remember that OTA viewing is not captured in set-top box data. Nielsen meters are being rolled out across 140 markets (15,000 TV audience meters in approximately 7,000 homes), directly targeting OTA homes to provide a projectable measurement source for all OTA viewing occurring in local markets.

The installation of these electronic meters will fill viewing gaps and provide the truth set needed to overcome the limitations of relying solely on STB data for audience measurement, and it will also deliver actual persons-level viewing. The meters will be concentrated in specific types of homes, particularly OTA households, with known demographic and TV viewing information, allowing Nielsen to project audience estimates for over-the-air tuning.
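For readers unfamiliar with how a metered panel “projects” to a market, here is a minimal sketch under a deliberately simplified assumption of a single projection weight. The home counts and tuning figures are invented for illustration; Nielsen’s actual weighting and projection methodology is far more elaborate.

```python
# Minimal sketch of projecting metered OTA homes to a market universe.
# All numbers below are invented for illustration only.

universe_ota_homes = 120_000   # estimated OTA households in the market (assumed)
metered_ota_homes = 400        # OTA households carrying a meter in that market (assumed)

# Under a simple weighting scheme, each metered home "stands in" for this many households:
projection_weight = universe_ota_homes / metered_ota_homes

# Suppose 90 of the metered OTA homes tuned to a station's newscast in a given quarter hour
tuned_homes_in_panel = 90
projected_audience = tuned_homes_in_panel * projection_weight

rating = projected_audience / universe_ota_homes * 100  # household rating among OTA homes
print(f"Projected OTA households: {projected_audience:,.0f} ({rating:.1f} rating)")
```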

Weisler: Where do you see local measurement in the next two years? 

Abcarian: Consumers will have new devices, new ways of engaging with content and new behaviors both in and outside of the home that will be important to continue to measure. As technology evolves and new broadcast standards like ATSC 3.0 become more prominent, new “big data” sources from smart TVs, connected devices and other sources will enable interactive advertisements, audience-based buying, addressable advertising and many other enhancements.

Full electronic measurement will be in place across the entire ecosystem in the next 12 months, ensuring continued quality of this data as it evolves from STB tuning behavior to smart TV to virtual reality. We will also have the capability to understand audience changes in real time with our Nielsen Marketing Cloud across a wide spectrum of marketing execution platforms such as search, social media, email, video, mobile and OTT. It will analyze real-time streams of anonymous audience data and instantly adapt segments to reflect changes in consumer media and buying behavior, movement in the consumer path to purchase, audience composition across millions of consumer attributes (including demographic, geographic, behavioral and personality attributes) and market dynamics (including seasonal and local market demand, competitive actions and advertising).

Two years from now, advertisers will be able to purchase more traditional TV programmatically, and their efforts will be supplemented with product purchase data and demos. And as long as we remember to keep the consumer at the center and connect media exposure with buying behavior, this industry will continue to grow. Ultimately, this is what matters most to the advertiser: what combinations of media choices drive optimal sales results.

This article first appeared in www.MediaVillage.com
 

Friday

To Advance Data Quality, The First Step Is Best Taken Together

GUEST COLUMNIST JANE CLARKE 
This article first appeared in AdExchanger:

As 2018 gets underway, there is one issue that should be at the top of everyone’s list in the new year: data quality.

Certainly, transparency and data accuracy were already dominant topics in 2017, but as the industry moves toward ever more granular forms of targeting and measurement, while combining differing data sets, the need for assurance that the information represents what it purports to represent will become even more critical as the year progresses.

2017 saw some solid progress toward addressing this.

CIMM and ARF proposed a data labeling initiative to create a “nutritional label” for data, enabling users of third-party data to understand its source and composition. A similar effort is being undertaken by the IAB.

The initiatives provide a critical first step in ensuring confidence and trust in the data that’s used in increasingly granular and effective targeted advertising. They, in effect, help marketers better understand the impact of their advertising efforts and the role that the data played in those outcomes. Innovation in targeting would stall if uncertainty about data – the “raw ingredient” used – persisted.

While the approaches being undertaken by these associations differ, they do complement each other, and each has merits and challenges. The CIMM-ARF initiative is broader in scope than the IAB’s initiative, which is limited to digital media and doesn’t include a TV focus. CIMM and ARF want to make the labeling appropriate to audience-based buying in TV, and also bring more buyers into the process.
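To make the “nutritional label” idea concrete, here is a minimal sketch of what such a label might contain, written as a small Python data structure. The field names and example values are illustrative assumptions, not the actual CIMM-ARF or IAB specification.

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class DataLabel:
    """Hypothetical 'nutritional label' for a third-party audience data set (illustrative only)."""
    provider: str               # who compiled the data
    source_type: str            # e.g., "set-top box", "survey panel", "transaction log"
    collection_method: str      # how the records were gathered
    geographic_coverage: str    # markets or regions represented
    refresh_cadence: str        # how often the data is updated
    deterministic_match: bool   # matched on known IDs rather than modeled
    modeled_fields: list = field(default_factory=list)  # attributes that are inferred, not observed

# Example label a vendor might self-report, to be checked later by third-party validation
label = DataLabel(
    provider="Example Data Co.",
    source_type="set-top box tuning records",
    collection_method="opt-in MVPD return-path feed",
    geographic_coverage="top 25 U.S. DMAs",
    refresh_cadence="daily",
    deterministic_match=True,
    modeled_fields=["household income", "presence of children"],
)

print(json.dumps(asdict(label), indent=2))
```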

But as is sometimes the case with our industry, methodology and technology are not the main impediments to addressing challenges – reaching consensus and unanimity is.

At this stage, what is needed is not that the perfect solution be found, but that the first step be taken together. The best technology or methodology is not, by itself, enough to achieve the objective. It is also important that all facets of the industry agree that the end goal is of sufficient importance that we move forward together. By working together, the industry will arrive at the best approach for ensuring transparency on data quality.

Certainly, an agreement on a universal data identification protocol benefits everyone. The stakeholders for such a protocol are primarily the data owners, but the benefits are widespread. And while the initiative is focused on disclosure, there is also a validation aspect to the project.

It promises to link together all current and future big data sets used to accurately measure the consumer journey across platforms. Data labeling, which would be an actual listing of the components, sources and characteristics that were compiled to create targets, would raise overall data quality because of its focus on transparency. But it also requires a uniform, agreed-upon structure so that all interested parties can participate, and an implementation process that fits into both standard and proprietary systems.

There are inherent challenges in gaining consensus on technically complicated problems and navigating the natural forces of competitiveness that define our industry and make it great. For example, as the information for the labels would first be self-reported, vendors could, of course, provide misleading or incorrect information. But that misinformation would ultimately be revealed through third-party validation.

At the moment, we don’t even know how many lists and providers would participate. But through this process, some of the poorer-quality lists and providers would undoubtedly be found out as third-party validation, and the experience of matching reported information to reality, kick in.

Never has the need for consensus, and for placing cooperation over competition, been greater than in finding systems to advance data quality. All that we want to accomplish as an industry, in terms of better targeting and accurate measurement, depends on confidence in the data on which we base insights and decisions.

I am interested in people’s ideas, approaches and their “aha” moments for how best to ensure data quality. As 2018 gets underway, let’s take that first step toward this goal together, as a statement that even when dealing with the most challenging of issues, consensus can emerge as the greatest trend in the year ahead.

Follow CIMM (@CIMM_NEWS) and AdExchanger (@adexchanger) on Twitter.

Sunday

Looking Back on the Technological Advancements of Yesterday



At this time of great disruption in the media industry, I find it interesting to look back and realize that disruption in media has always been a constant. The systems we use to measure content have continually changed, improved and even disrupted our ways of doing business. I recall that, when I was an intern at NBC years ago, I was impressed that my computer did not require punch cards. Am I dating myself? Probably.

As we embark on 2018, I asked others in the industry to answer the question: “When you first started in the industry, what was the most amazing device/application/program/aspect/item at the time?” One person noted that in the 1990s when she was at Discovery “it was PCs and the internet. That technology changed everything.” For others, it was a range of other advancements:

Arlene Manos, President Emeritus, AMC Networks: When I started at A&E, we did a lot by spreadsheet. Someone I hired as an intern recently mentioned in an article that he shared a computer with me since they were scarce. The first system we were on was Columbine, followed by a Nesbitt system for planning and posting. I don’t remember any more than that.

Mitch Oscar, Advanced TV Strategist, USIM: In 1999 it was the introduction of TiVo. The inventor came to my office to talk to me about advertising and TiVo’s functionality. At about the same time, the head of IPG called me and said, “So advertising is dead?” TiVo was momentous because everyone was worried about the impact of two functionalities: the recording of programming and the ability to fast-forward through commercials. We wondered if the speed would be fast or slow enough to see the brand messaging.

Caroline Horner, Co-Founder, Spicy Tequila: Well, this will show my age... a desktop PC with a spreadsheet and database application, and for data... LNA, MRI, scanner data (IRI, I think) and IMS (I started in a healthcare agency). Then it was online services (pre-AOL) and then anything internet… and a laptop, cellphone and modem. Then there was the introduction of Java and JavaScript and dynamic webpage generation with ad serving, SAS Enterprise Miner, set-top box data, mobile video, the growth of marketing database companies, programmatic. Addressable TV!

Kathy Newberger, Advanced Advertising Consultant: I was working in local ad sales at the time and we said it was going to be digital ad insertion. We were going from six networks that were inserted using tape decks to sixteen networks using digital equipment. We thought that was going to be amazing … and it was. Now it’s amplified 500 times over – every network is insertable. And on top of that is OTT.

Brad Adgate, Independent Media Researcher: I think the most important introduction early in my career was the spreadsheet. Long-gone products like Lotus 1-2-3 and, later, Quattro Pro were in use. Before that, workers used those large green accounting pads and calculators to fill in the data, which took a lot longer and was more error prone.

Dave Morgan, CEO and Founder, Simulmedia: In early 1993, I was working in "new media," helping newspaper companies develop ad and content strategies for early online services and partnerships with telcos and cable companies, and I had a chance to play with the Mosaic browser. It was pretty clear, even then, that a user-managed rendering engine like the browser would change the media industry, particularly for print companies with text and still photos, which rendered well even without high-speed internet. It certainly did.

Jane Clarke, CEO, Managing Director, CIMM: Back in 1982, we were analyzing clickstream data from set top boxes in a Pilot Test for Time Teletext, which was a text and graphic service similar to the early AOL, but delivered via the Vertical Blanking Interval (VBI) of a channel on Time Warner’s cable system!  I never thought it would take this long to get to nationally representative samples of Return Path Data!

Sheryl Feldinger, Media Consultant: I often comment to my 16-year-old that the biggest difference between growing up today versus the 1970s is the pace of life. Everything happens so much faster today. The pace of communication, especially, flies at warp speed. Confession: early in my career, fax machines were a game changer. They revolutionized the workplace. No longer could you tell the client, "We will messenger it to your office first thing tomorrow." The new retort was, "Why wait? You can fax it tonight!" It didn't matter that the edges of the thermal paper curled. All of a sudden, deadlines got pushed up and we all had to work faster.

Next article – Looking Ahead to 2018.

This article first appeared in www.Mediapost.com


Thursday

ABC and Accenture Strategy Discover the Secret of Sales ROI



One of the more challenging aspects of advertising sales is calculating ROI. This is made even more complex by the proliferation of content platforms and consumer devices. What really contributed to sales uplift? In addressing this issue, ABC just released Phase 2 of an attribution analysis conducted by Accenture Strategy. Phase 2 is the follow-up to a custom study completed in 2016 and used four big data sources to prove the role and value of content in context in driving sales ROI.

Cindy Davis, Executive Vice President Consumer Experience, Disney ABC Television Group, took me through the study and how it contributes to their overall research strategy. “Our objective in this study is to measure what matters and there was industry pressure to measure ROI,” she stated. To that end, “we found a very interesting connection between engaged audiences and their content and the sales and ROI we can drive to clients who participate.”

To achieve that goal, Disney|ABC commissioned Accenture Strategy which, according to Davis, “had a robust database of marketing spend. This year in our Phase 2 of the study, we examined 26 national brands over six industries and their corresponding sales data representing $25 billion in marketing spend, with $11 billion in television spending.” 

Mike Chapman, Managing Director, Accenture Strategy, global lead for the Media and Entertainment Strategy Practice, added that his company provided three years of data, which offered a “closed loop view of advertising ROI – types of impressions delivered, how many, which channels, what prices were paid and the impacts from those impressions delivered on incremental sales week over week.”

In addition to Accenture’s marketing data, Davis asked Accenture Strategy to incorporate four other datasets in the recent study – Nielsen ratings, E-Poll, Nielsen Social and Magid’s Emotional DNA, which Davis described as “intriguing because we are in the business of connecting with viewers emotionally and Magid’s DNA work speaks to that.”

Phase 1 Takeaways
Davis and her team, focusing on the impact of multi-platform TV (premium long-form video across screens and devices) and how advertisers can leverage that impact, discovered three major takeaways from the Accenture Strategy 2016 study:

      1.       There is a halo effect on sales with multi-platform television. “This doesn’t get talked about a lot,” Davis noted. “You hear about last-click attribution in digital advertising. But TV goes a long way to establish and amplify the impacts of all media.”

      2.       Multi-platform is under-valued, under-attributed and under-represented in the industry. Eighteen percent of all of the ROI impact is traditionally attributed to digital but should actually be attributed to multi-platform television. “Television has been traditionally undervalued and digital over-valued,” Davis concluded.

      3.       Multi-platform TV has a long-term amplification impact. The study compared sales lift over years and found that by year two or three, you no longer see a sales lift impact from digital. But the study proved that there is a long-term effect on sales lift with TV.

Phase 2 Takeaways - Drivers of ROI        
Davis highlighted three key drivers to ROI that were identified in the Phase 2 study. 

      1.       Audience size matters. “Higher-rated programs deliver more ROI than lower-rated programs by 2X,” she stated, “so not all programs are created equal, which makes sense.” Notably, these higher-rated programs deliver more ROI than their cost premium would suggest, indicating that higher-rated, and therefore more expensive, programming is worth the cost in greater sales-lift ROI (a simple arithmetic sketch of this trade-off follows this list). This is because these programs have a greater footprint and greater social amplification, and therefore have the ability to reach people beyond a narrowly defined target audience.

      2.       Consumers’ commitment to the content matters. “We looked at both the expressed and the observed commitment to the content,” Davis stated, “and found that the greater the effort to watch, the greater the ROI.” And there is 2X the ROI with Magid’s Intentionality measurement.
  
      3.       Content quality matters. Davis’ group examined perceived quality, as defined by the viewer, and quantified quality indicators using Magid’s emotional dimensions. They found that the higher the perceived quality of the content, the greater the ROI. Using Magid, the three most impactful dimensions for higher ROI were Smarts (programs that are informative, real and inspiring), Edge (unpredictable, outrageous and funny) and Relatability (original, suspenseful and intelligent). “There is a direct connection between the emotions viewers feel about a show and the benefits advertisers get in terms of greater ROI,” Davis concluded.
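To make the “more ROI than the cost premium” point in the first takeaway concrete, here is a small arithmetic sketch with invented numbers; the spot costs, sales figures and 50% premium are hypothetical assumptions, not values from the ABC/Accenture Strategy study.

```python
# Hypothetical illustration of how a higher-rated program can carry a cost premium
# and still return more sales per dollar. Figures are invented for this example.

def roi(incremental_sales, media_cost):
    """ROI expressed as incremental sales generated per dollar of media spend."""
    return incremental_sales / media_cost

low_rated_cost, low_rated_sales = 100_000, 250_000     # assumed lower-rated buy
high_rated_cost = low_rated_cost * 1.5                 # assumed 50% cost premium
high_rated_roi = 2 * roi(low_rated_sales, low_rated_cost)   # "2X ROI" headline finding
high_rated_sales = high_rated_roi * high_rated_cost

print(f"Lower-rated:  ROI = {roi(low_rated_sales, low_rated_cost):.2f}, sales = ${low_rated_sales:,.0f}")
print(f"Higher-rated: ROI = {high_rated_roi:.2f}, sales = ${high_rated_sales:,.0f}")
# Even after paying the premium, the higher-rated buy returns more sales per dollar,
# which is what "delivers more ROI than its cost premium" implies.
```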

“We are already starting to have good conversations with clients as to what this means for them,” Davis stated. “It goes without saying that not all GRPs are created equal, and now we can prove that. Yes, higher-rated shows command a premium, but they deliver even greater ROI at that level.” Add to all of this insight the impact of social connection and emotional dimensions, and ABC is poised to help its clients take advantage of the best that multi-platform TV can offer.

This article first appeared in www.MediaVillage.com
 

Wednesday

How to Attain Digital Satisfaction. Hint: Don’t Be Creepy.


 

As retail becomes increasingly digital, greater pressure is being felt on the brick-and-mortar side of the business. But even online retailers face challenges, according to Esteban Ribero, Senior Vice President, Planning & Insights, Performics. His company recently fielded a study using the Digital Satisfaction Index™ (DSI) to measure online consumer attitudes. “We executed a retail-specific DSI surveying 1,500 respondents that compared digital satisfaction for retailers in general, as well as for specific brands,” he explained.


Retailers, whether online or in-store, need to deliver the goods to consumers in terms of quality, service and value while also striking a careful balance between personalization and privacy. I sat down with Ribero and asked him the following questions:


Charlene Weisler: What do you mean by digital satisfaction? What are the most important drivers in digital satisfaction?

Esteban Ribero: There are all kinds of studies done around consumer satisfaction, but there has never been a retail-specific DSI. We wanted to see what drives customer engagement in this area. We found that there are four components of digital satisfaction:

       1.       How useful the experience is. Can people accomplish what they set out to do when they visit your site? How easy is that to do?
       2.       How secure is your online environment? There are still a lot of concerns about privacy; people have to feel comfortable sharing their personal information online.
       3.       Trust, which is different from privacy. Retailers have to make sure that the information they are giving online is truthful, accurate and reliable, especially in the context of fake news. It is more important than ever now.
       4.       How social is the experience? How much can customers get a peek into other people’s lives to create a more engaged experience, and how much can they read reviews and comment on those reviews?
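As an illustration only: the article does not disclose how Performics combines these four pillars into the DSI, but composite indices of this kind are often computed as a weighted average of normalized pillar scores. The weights and scores below are invented for the example and are not Performics’ methodology.

```python
# Illustrative sketch of how a composite satisfaction index *could* be assembled
# from pillar scores. Weights and survey scores are invented for this example.

PILLAR_WEIGHTS = {
    "utility": 0.30,    # can visitors accomplish what they came to do?
    "security": 0.25,   # comfort sharing personal information
    "trust": 0.25,      # information is truthful, accurate, reliable
    "social": 0.20,     # reviews, comments, peeks into other shoppers' experiences
}

def composite_index(pillar_scores: dict) -> float:
    """Weighted average of 0-100 pillar scores, returned on a 0-100 scale."""
    return sum(PILLAR_WEIGHTS[p] * s for p, s in pillar_scores.items())

# Hypothetical average scores for one retailer's respondents
example_scores = {"utility": 82, "security": 68, "trust": 74, "social": 61}
print(f"Composite digital satisfaction score: {composite_index(example_scores):.1f}")
```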

Weisler: What do shoppers generally think about the user interface of retail websites and apps? Is there a constant? Do some retailers do it better and if so, what do they do to stand out?

Ribero: We were surprised to find that consumers were already very satisfied with the utility of their retail websites and apps. We thought that perhaps some consumers would find sites clunky or not very human, but the research shows that people find the experience positive. Of the three retailers in our study (Lululemon, Gap, H&M), Lululemon customers were the most satisfied with the landing page and app experience, and Gap customers were the least satisfied. This could be due to Lululemon offering a more modern, digitized experience.

Weisler: What is the balance between privacy and personalization? Is there a concern about the ultimate use/sale of personal data?

Ribero: This was the most interesting takeaway from the study. There is a trade-off between privacy and personalization. We go with the assumption that consumers want more personalization, and the industry strives to ascertain ahead of time what consumers may want in order to ensure greater personalization. However, as we have done that, consumers may push back because we have been tracking them using information that they did not give explicit permission for us to use for tracking. So they feel more concerned about privacy and all the information we gather about them. But at the same time, they say that they want more personalized experiences. The struggle is that they want more personalization without giving us any information to do that. What we need to do as an industry is be more open with the consumer as to what information we use to track them. When we don’t do this, it tends to backfire on us. The trick is to make it feel like a generic message while finding a way to tailor it to the consumer. Don’t put the person’s name on it – it feels creepy.

Weisler: What is showrooming? What kinds of shoppers are most likely to showroom, and how does showrooming fit into driving digital satisfaction for retailers, if at all?

Ribero: Showrooming is the ability of consumers to experience the merchandise without having to actually order online. When consumers are shopping in a brick-and-mortar store, they may at the same time use their phones’ mobile apps to browse products from that same store or its competitors. That behavior is here to stay and we are seeing it more and more.

Weisler: Based on what you have seen in your research, where do you see the future of retail in the next 3-5 years?

Ribero: We always dream about that moment where, as in the movie Minority Report, Tom Cruise enters a store and they know all about him – his preferences, his past purchasing. I think we are getting to that, but in a way that consumers see as more controlled in terms of their environment and their choices. At the end of the day, that is where I see the future headed: consumers taking control of their experiences, of the events they want, of the way they want to engage with brands. And I see continued merging between the digital space and the brick-and-mortar space. Brands will continue to transform their stores into showrooms to give people a seamless way to interact with the brand.


 This article first appeared in www.MediaVillage.com