
Aug 14, 2022

Clear Channel Succeeds in Capturing OOH Incremental Reach

There have been unprecedented advancements in measuring the effectiveness of an out-of-home ad, with Clear Channel Outdoor leading that effort. For Campbell Keller, Clear Channel Outdoor’s Director of Product Development, this initiative not only is best-in-class but also sets a new standard for measuring incrementality and, by extension, attribution.

Incrementality, according to Keller, is the observation and comparison of consumer behaviors in a group examined before and after a campaign. It ascertains, as he explained, “how actions changed as a result of the campaign exposure causing new incremental behaviors that wouldn't have been seen as a result of the campaign going live.” It is not simply a lift metric comparing exposed versus unexposed. “It's taking into account other additional attributes that can really paint a picture for a brand where they are actually driving new customer interactions among people that haven't visited the brand before or consumers who were already engaging with the brand but are engaging more so as a result of the media exposure,” he added.
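To make the distinction concrete, a minimal sketch of the arithmetic might look like the following. This is purely illustrative Python with toy data and hypothetical column names, not Clear Channel's RADARProof methodology: it compares visit rates for exposed and unexposed consumers, then looks separately at consumers with no pre-campaign visits to approximate the "new customer" view Keller describes.

# Illustrative only: exposed-vs-unexposed lift, plus a new-customer cut.
# Data and column names are hypothetical.
import pandas as pd

# One row per consumer: campaign exposure, visitation during the campaign
# window, and visitation before the campaign started.
df = pd.DataFrame({
    "exposed":      [1, 1, 1, 1, 0, 0, 0, 0],
    "visited_post": [1, 1, 0, 1, 1, 0, 0, 0],
    "visited_pre":  [1, 0, 0, 0, 1, 0, 0, 0],
})

# Simple lift: exposed visit rate relative to the unexposed visit rate.
rates = df.groupby("exposed")["visited_post"].mean()
lift = rates[1] / rates[0] - 1
print(f"overall lift: {lift:.0%}")

# "New customer" view: restrict to consumers with no pre-campaign visits
# and compare their post-campaign visit rates by exposure.
new = df[df["visited_pre"] == 0]
new_rates = new.groupby("exposed")["visited_post"].mean()
print(f"new-customer visit rate, exposed vs. unexposed: "
      f"{new_rates[1]:.0%} vs. {new_rates[0]:.0%}")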

To measure incrementality in an unbiased, standardizable manner, Clear Channel leveraged its RADARProof attribution solution, which measures “offline visitation for a consumer visiting a particular location - what days of the week, what times of the day, how often from a frequency perspective,” as well as online activities such as visits to a particular site or other points of interest. “In thinking about it from an attribute perspective, it depends on what are we trying to measure. We wouldn't model visitation behaviors if the KPI was a tune in, for example. We want to make sure it's customized for each campaign goal.”

The results offer added insights, for advertisers and for Clear Channel alike, that prove the value of out-of-home. “There has been higher scrutiny on advertising dollars as a whole and we want to demonstrate how our channel is driving strong outcomes for our brand, especially for new customers, or driving increased number of customer interactions so that we can not only preserve spend but also drive new spend towards our channel,” he explained.

To solidify all insights, Clear Channel has partnered with leading attribution providers who set the pre and post baselines for measuring each campaign and construct the consumer groups. In this way, Clear Channel is not grading its own homework, instead offering advertisers an unbiased study of impact using a range of data and metrics. Keller noted that, “From an upper funnel perspective we've been doing brand surveys,” measuring upper-funnel lifts in awareness, consideration and intent, and checking where lifts in purchase intent align with an actual lift. “But,” he added, “we're seeing a lot more traction in lower funnel not just on CPG sales lift and purchase panels, but also on automotive purchase and driving spend for new categories like pharmaceutical drug lift as well as sourcing loyalty card data to define pre purchasing behaviors.”

There is also an effort to merge different studies using “multiple different measurement solutions that drive correlations between two different studies, with two completely different panels,” he stated, as well as adding more mobile app-based data collection. This is particularly interesting because, by using geolocation data, it is now possible to match out-of-home exposure as closely as possible to a point-of-sale interaction.

The pandemic has changed old parameters and Clear Channel is pivoting to be able to address new consumer behaviors. “There are now these new behaviors from Covid where individuals might not be interested in sitting in a fast food chain restaurant but actually just picking it up and leaving that location and having that engagement driven through their app,” Keller noted.

In some categories, this pivot has presented challenges. With credit unions, for example, “the market and consumers have changed. Most new account openings are now happening online versus in person so customizing measurement solutions to be more web based for us obviously has its own set of challenges since you can't necessarily click on a billboard. But being able to showcase how we're still able to drive consumers to a brand new website has been incredibly key for us,” he stated.

When it comes to out-of-home and the use of geo location among other data sets, privacy is pivotal. Keller explained that, “We license aggregated and anonymized persistent location data to understand the consumer movement against our assets. Privacy compliance is at the core of everything that we do. It sets us up for a mobile ad ID that is essentially our currency in the attribution landscape.”

Because of the range of data available, it is possible to glean surprising insights from the results. In a specific case where the advertiser experienced no lift, Keller revealed a surprising takeaway. “We had one particular study where there was no lift at all,” for the campaign. “But we were able to identify that out-of-home did a great job in increasing competitive share for the CPG product. We were able to steal market share from other competitors and we were able to drive new customers.” The surprise was, “understanding that lift does not (necessarily) determine success of a campaign. There are a lot of ways you can tell a story and how effective a campaign is through other aspects of conducting measurement and attribution,” he shared.

The challenges in out-of-home are no different from those in measuring other platforms. “Scale – the challenges of scale and wanting to make sure that the measurement that you're running is statistically significant across the different attributes,” Keller stated. “Those are challenges we're going to continue to face with changes in the location data landscape, where individuals are consenting or not consenting to their ID and location services being shared.” The secret is to monitor the privacy landscape and guarantee that all measurement studies are in compliance with state and federal laws.

For Keller, the goal is, “to continue to offer solutions for advertisers to quickly make decisions, especially among sales channels like programmatic out-of-home that drive strong results,” as well as to increase all measurable touch points and align out-of-home with other platforms on both a national and local level.

This article first appeared in www.MediaVillage.com 

Artwork by Charlene Weisler

 

Jul 18, 2021

Is There a Future for Research? An Interview with Jeff Boehme

I have known media veteran Jeff Boehme since our days at NBC in the 1980s, and since then he has had a varied and interesting media career path. “I’m a veteran of local broadcast rep firms, NBC, ABC, NCC Media, Nielsen, Kantar Media, Rentrak and Comscore,” he explained, noting that he concentrated on audience evaluations and processes for media currency acceptability. He has some strong opinions about where media is today and the role that research and data play in it.

Charlene Weisler:  What role should data play in media today?

Jeff Boehme: Data has always played a critical role in media. Content is now distributed on more types of technology than ever. Virtually all of these digital devices collect usage information and have been enabled in the marketplace by a multitude of companies. Content providers have taken advantage of technology by supplementing their traditional distribution infrastructure with streaming capabilities through over-the-top (OTT) platforms. Brand marketers realize the potential of reaching customers with far greater efficiency and effectiveness through addressable advertising across multiple platforms and content.

But defining the benefits of efficiency and effectiveness is not a standardized process; there are real issues surrounding the massive data sets collected from these digital devices as they become ubiquitous as media currency. Ultimately, data can and should be leveraged to maximize the effectiveness of the three basic pillars of brand advertising – creating awareness, reinforcing equity and driving purchases.

Weisler: What types of data are most important and what is currently missing?

Boehme: Over five years ago we understood the remarkable advantages of ‘big data’ expressed as the three V’s - volume, velocity, and variety. The sheer scale of anonymous, passively collected user information provides much more statistically sound results than traditional small panels and surveys. However, almost every big data set is incomplete and may not include essential data elements required for currency acceptance, making traditional tools still necessary to supply missing data points. I would add there should be a few more Vs to consider – the validation of the data (how accurate it is) and the ultimate V – its value. The value of the data ultimately answers the questions posed by the brand and can be accepted as currency on all sides of the ecosystem with confidence.

The good news is we now have more data than ever before - the bad news is that there are significant inconsistencies with the sources, collection techniques, methodology, standards, transparency and, importantly, conclusions. All major cable MSOs are offering their tuning data to a variety of companies, as are virtually all connected TV (CTV) manufacturers. I have seen significant disparities in results depending on who is processing the data and how that company manages it, applies statistical corrections and matches census segments.

Weisler: Should age and gender still form the basis of currency?

Boehme: While age/gender metrics are still valuable criteria for brands and media, they have been supplemented with more relevant information including major census breaks and product usage. It was only in the late ‘70s that automotive brands finally looked at the data and discovered that women were the dominant influencers in car purchases. This transformed the industry in terms of understanding the real consumer, how to design new vehicles (think mini-van) and media investment placement strategies. Currency options now include actual auto-ownership household impressions based on ‘auto intenders’ created by matching massive tuning and car ownership data sets.

It really wasn’t until 1987, when Nielsen launched its people meter service, that age/gender metrics became the de facto currency. However, many brand marketers learned that age/gender weren’t enough to efficiently plan or buy media – specifically for high-spending categories such as automobiles. Most consumer purchaser data sets available today are household-specific and include information more relevant than just age/gender. Knowing that a household has a pending lease expiration on a BMW is more valuable than simply counting adults 25-54.

Weisler: What is your opinion of the general state of attribution?

Boehme: Channeling Sergio Leone’s epic western masterpiece “The Good, the Bad and the Ugly” - The Good is we now have a plethora of consumer-based intelligence, and media companies are able to use attribution techniques to see a finer view of the customers’ behavior across screens and determine what components of media campaigns work (or don’t). The Bad is the complexity of data, multiple data sources, missing data points/deprecation and differing methodologies. The Ugly is there don’t appear to be any consistent standards – resulting in significant outcome discrepancies.

Last year, CIMM completed a study on attribution which found that the inconsistency of key television attribution inputs, not technology, is the main cause of variance in outcome measurements. They compared eleven different providers and determined that “more stringent media measurement standards are required to ensure attribution results that are consistent and comparable from provider to provider, with exposure data, more than occurrence data having the biggest impact on outcome results.” I agree with their findings and with their report’s other recommendation requiring additional standardization, such as commercial IDs similar to Ad-ID, for identifying ad occurrences and for defining exposure and reach.

Weisler: What do you think is the most important issue facing Research at this time?

Boehme: Most research groups are a cost entry on a ledger, requiring investment without a direct responsibility for cash flow. Many successful researchers have learned to move quickly, adopt better data skill sets, provide actionable input into the sales process and discover how their company can be more profitable. Many companies see data scientists as a replacement for the research process, but smart companies see the value of both, with complementary skill sets and valuable disciplines. The simplest distinction may be that the data scientist determines what could be accomplished with data and the researcher helps define what should be done with the data.

Weisler: Where do you see the Research function at media companies in the next five years?

Boehme: Data science has helped us improve our capabilities with disciplined, scientific, technology-enabled approaches that go beyond traditional research processes. However, Research is still a vital function, as it is responsible for the objective analysis of the data and the clear communication of insights, business implications and recommendations. We have all witnessed the perils of utilizing large datasets without sufficient oversight of their contextual use case. Ultimately the most successful companies will discover that research and data science are opposite sides of the same coin – connected, they bring greater value.

This article first appeared in www.Mediapost.com

 

 

Feb 3, 2021

Getting the Industry to Work Together Towards Attribution and Cross Media Measurement Solutions. An Interview with CIMM’s Jane Clarke


On February 3 and 4, CIMM, as part of the ARF, presents its 10th Annual Cross-Platform Video Measurement & Data Summit. CIMM has been focusing on attribution for a few years and has helped spearhead efforts to get the industry to work together. 

Where are we with attribution and cross media measurement? Jane Clarke, CEO and Managing Director of the Coalition for Innovative Media Measurement (CIMM), explains.

Charlene Weisler: How close are we to true attribution? 

Jane Clarke: The challenge with complete multi-touch attribution is identity resolution, and particularly cross-channel identity resolution.  Most of the ID resolution solutions can’t incorporate impressions from the walled gardens with identity attached since the walled gardens see this as a data security challenge.  Their POV is that they are protecting the identity of their customers within their platform.  The WFA initiative was created primarily to address this challenge: how to get impressions linked with an identity out of the walled gardens.  The initial approach is called Virtual ID (VID), which is a probabilistic model that also groups customers into segments.  They’re also working on a Secure Media Identifier (SUMID), which should be announced soon.  There are also many commercial approaches to linking identity across TV and digital media, but only a few of them have managed to incorporate secure ad impression data from the walled gardens. 

Currently, attribution is being conducted mostly for digital media, but increasingly for TV as well, using Smart TV or STB data or commingled datasets. These approaches create exposed and unexposed groups of customers and measure outcomes against them to see if those exposed have a lift in the outcome metric. In 2020, CIMM completed a study on Unpacking Data Inputs into TV Attribution, which pinpointed some of the challenges with getting consistent results across providers and suggested best practices, such as standardizing ad measurement and combining Smart TV and STB data.

Weisler: What are the best practices for cross media measurement and who is doing it correctly?

Clarke: There aren’t enough successful solutions to cross-media measurement for best practices to have emerged yet.  However, vendors can be evaluated by how far along they are with the four building blocks for cross-media measurement, which are: 1) Standardized and scaled granular Smart TV and STB data for content and ads combined to be as nationally representative as possible; 2) Standardized digital content and ad exposure data across sites and mobile apps; 3) A single-source cross-media measurement panel, or a linked combination of single media measurement panels, to calibrate the large “census-like” datasets; and 4) a solution for ID resolution to connect all the datasets and deduplicate them.  CIMM has produced recommended best practices in some of the areas, such as combining Smart TV and STB Data and TV Attribution.

Weisler: Where do you see media measurement one year from now?

Clarke: Now that media companies and TV OEMs are “data owners,” more walled gardens are being created.  This means that there will be even more pressure to perfect solutions in development from the walled gardens for secure encrypted ID resolution.   Additionally, more companies will try to authenticate their customers, to enable more personalized experiences and better measurement.  Regulations surrounding privacy and data security will become clearer, so companies will be able to adapt their solutions.  Nielsen will most likely offer “ad” measurement for TV by this time next year.  This will be a big step towards comparable metrics across media.  There may be some changes coming in which TV OEMs are willing to license their data for measurement purposes, since they are increasingly wanting to use the data for their own ad sales purposes.  So, progress will be made by a number of the vendors working on cross-media measurement solutions, but new technical challenges will also present themselves.  I don’t think that the WFA/ANA solution to ID resolution will end up working for TV, as it is currently envisioned, so alternative solutions will need to be developed.  The WFA/ANA solution is designed to dedupe individual devices vs. the way that Smart TV and STB data are deduplicated at the household level.  
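Clarke's last point about device-level versus household-level deduplication can be illustrated with a small sketch. The Python below uses entirely hypothetical IDs and a made-up device-to-household mapping (it is not the WFA/ANA design or any vendor's implementation); it simply shows that the same set of impressions yields different reach counts depending on the level at which they are deduplicated.

# Illustrative only: device-level vs. household-level deduplication.
# IDs and the device-to-household mapping are hypothetical.
from typing import Dict, List

impressions: List[str] = ["dev_a", "dev_b", "dev_c", "dev_a", "dev_d"]

# Hypothetical ID graph mapping each device to the household it belongs to.
device_to_household: Dict[str, str] = {
    "dev_a": "hh_1", "dev_b": "hh_1",   # two devices in the same home
    "dev_c": "hh_2", "dev_d": "hh_3",
}

device_reach = len(set(impressions))                                  # 4 devices
household_reach = len({device_to_household[d] for d in impressions})  # 3 households

print(f"unique devices reached:    {device_reach}")
print(f"unique households reached: {household_reach}")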

This article first appeared in www.MediaVillage.com

 

Helping the Industry Move to Cross-Media Measurement. CIMM’s 10th Annual Measurement and Data Summit.

Every year CIMM holds its Annual Cross-Platform Video Measurement & Data Summit, which brings together experts from across the industry. This year’s summit, its tenth, is virtual and offers insights into how the industry is adjusting during the pandemic and beyond.

Charlene Weisler: What are the biggest issues facing media measurement at this time?

Jane Clarke: Media and cross-media measurement are viewed differently depending on whether you are a buyer or a seller. From a buyer POV (marketer/agency), the biggest issue is complete cross-channel ROI measurement, which includes all marketing, advertising and promotional aspects of a campaign or ongoing marketing effort. Marketers try to link one common impressions metric across all forms of advertising and marketing by connecting them to an ID-graph that can provide ID resolution across all touchpoints and link the impressions to an outcome KPI, such as sales, site visits, app downloads, offline store/restaurant visits or another metric. Media sellers, from their POV, are typically trying to deduplicate reach across traditional and digital forms of their media, such as between all forms of TV/premium video, and prove outcomes for their inventory.
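As a rough illustration of the buyer-side linkage Clarke describes, the sketch below resolves channel-specific identifiers to a common ID through an ID graph, deduplicates reach on the resolved ID and joins the deduplicated audience to an outcome KPI. Table names, column names and data are hypothetical, not any vendor's actual schema.

# Illustrative only: linking cross-channel impressions to an outcome KPI
# through a hypothetical ID graph.
import pandas as pd

impressions = pd.DataFrame({
    "channel":   ["tv", "digital", "digital", "ooh"],
    "source_id": ["hh_9", "cookie_1", "cookie_2", "maid_5"],
})

# ID graph resolving channel-specific identifiers to one person/household ID.
id_graph = pd.DataFrame({
    "source_id":   ["hh_9", "cookie_1", "cookie_2", "maid_5"],
    "resolved_id": ["p_1",  "p_1",      "p_2",      "p_3"],
})

# Outcome KPI (e.g., a sale or site visit) keyed on the resolved ID.
outcomes = pd.DataFrame({
    "resolved_id": ["p_1", "p_3"],
    "converted":   [1, 1],
})

exposed = impressions.merge(id_graph, on="source_id")
deduped_reach = exposed["resolved_id"].nunique()   # 3 IDs, from 4 impressions

linked = exposed.drop_duplicates("resolved_id").merge(
    outcomes, on="resolved_id", how="left").fillna({"converted": 0})
conversion_rate = linked["converted"].mean()

print(f"deduplicated reach: {deduped_reach}")
print(f"conversion rate among reached IDs: {conversion_rate:.0%}")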

Weisler: What initiatives are in the forefront of solving for these issues? 

Clarke: The Media Committee of the World Federation of Advertisers (WFA) has published a Framework for Cross-Media Measurement, along with a Technical Blueprint.  The main goal is to deduplicate reach across the walled gardens and other digital publishers and TV, in a way that protects data security for the data owners.  The WFA design is being adapted to work as a Pilot Test by ISBA in the U.K. and the ANA in the U.S.  However, since the design was originally from a digital data security POV, it has been challenging to incorporate TV data, which uses different methodologies in different markets.  There are also many commercial initiatives to address these measurement challenges, as well as proprietary systems created or in development from agencies, media companies and MVPD consortiums. 

Additionally, the MRC launched their cross-media measurement standards, and the IAB is working on a replacement for the cookie.  CIMM has completed initiatives aimed at addressing some of the four building blocks for cross-media measurement: 1) Standardized and scaled granular Smart TV and STB data for content and ads combined to be as nationally representative as possible; 2) Standardized digital content and ad exposure data across sites and mobile apps; 3) A single-source cross-media measurement panel, or a linked combination of single media measurement panels, to calibrate the large “census-like” datasets; and 4) a solution for ID resolution to connect all the datasets and deduplicate them.  We just launched Best Practices in Combining Smart TV and STB Data, and last fall we published to our site a design for TV Data Interoperability & ID Resolution.

Weisler: Has the pandemic impacted any measurement issues and if so, how and what? 

Clarke: TV panel measurement has been more challenged than other research during the pandemic, since it’s been hard to recruit new panelists when they don’t want to allow home visits.  Existing panels, such as Nielsen’s, have had challenges replacing panelists, monitoring issues with current panelists and maintaining compliance with “checking in for persons measurement,” as more panelists stay in the panels longer.  New panels have been challenged to launch due to these same considerations. 

Weisler: What will be the most impactful efforts we can do to improve measurement? 

Clarke: Data owners need to agree with the methods being developed to protect data security in order to make their data available to industry solutions.  Standardizing digital video app and site player usage is critical to cross-media measurement.  Many companies use Conviva as a standardized mobile SDK for monitoring customer experience within an app, and it gathers second-by-second viewing data that is standard across their customers, but the data are still owned by the media company.  It would be great for the media companies to standardize around this solution. 

Weisler: How close are we to an industry effort?

Clarke: It has been a big change to get marketers involved in creating the solutions for cross-media measurement, since they have leverage.  However, the TV industry needs to decide which solution it wants to support.  The different media companies, MVPDs and consortia such as OpenAP, Ampersand and Xandr all have different proprietary approaches to creating a unified and standardized platform to plan, activate, measure and conduct attribution against all their TV/premium video inventory.  They need to come together around one solution before they can collaborate additionally with the walled gardens to deliver the solution that marketers seek.

 

This article first appeared in www.Mediapost.com