Nov 22, 2020

New York Is and Will Be the Future of Media. An Interview with NYC Media Lab’s Steven Rosenbaum

A big discussion point in some media circles is how to interest, prepare and recruit the next generation of media executives. For New York City, this effort has been in place for several years in the form of the NYC Media Lab. Executive Director Steven Rosenbaum shares his views on how the efforts of the Lab have helped and are still helping to harvest the best and brightest for our industry.

Charlene Weisler: What exactly is Media Lab and what is its history?

Steven Rosenbaum: Back in 2010, the New York City Economic Development Corporation commissioned a survey to plan for the future of media and technology in New York City. The report was exhaustive, and it is still available to read.

There were myriad conclusions, one of which called for the funding and creation of the NYC Media Lab, which was launched as a consortium of NYC universities: NYU, Columbia, Pratt, CUNY, The New School, SVA and Manhattan College all joined together. The concept was that New York’s universities had at their disposal the best and brightest students and faculty in the world, and that together they could provide the City’s largest media and technology companies with a powerful resource to invent the future of media. There was of course a workforce development aspect as well: we wanted students who graduated from our leading universities to feel welcome to stay here, build companies, get great internships, and create career-long opportunities that would strengthen our infrastructure of innovation. I don’t think the drafters of that report could have known just how successful those plans would turn out to be.

Weisler: How can you keep pace with all of the changes in the industry? 

Rosenbaum: The changes may seem shocking and destabilizing to outsiders, but if you have the benefit of working with extraordinary students, as we do at the Media Lab, then you can see with some clarity what the road ahead looks like. Just by way of example, artificial intelligence, which is sometimes called machine learning, is going to change everything about how information is created, sorted, shared, monetized, and validated. Already our students are working with media partners to create content that is produced in conjunction with AI, and that results in an increasingly massive firehose of information. Then AI steps in to help platforms find and filter information. Of the thousand pieces of content that fight for screen time, it’s an AI algorithm that is choosing what makes it to your feed. And finally, AI is increasingly reading content to help humans filter what is relevant. So machines are making media, and machines are reading media. These changes will only move faster as more devices capture information and shift passive consumers into media producers.

Weisler: What are the major trends that you can identify for the Lab?

Rosenbaum: Well, certainly machine learning and AI. But that’s the tip of the iceberg. Our students are working with The New York Times on a practice known as photogrammetry, capturing news stories in full 3D so that readers and viewers can walk inside stories and locations. Another team is working with Consumer Reports to explore how consumers control their data, and what companies ask for versus what should be shared with the click of a button. In education, we’re deep in the world of 5G, working with the public schools and the folks at Verizon’s 5G Labs to develop software and teaching environments that expand education into virtual worlds. And with ASCAP Labs, we’re exploring the future of music and entertainment: just how will performance change as machines become part of the creation and performance experience? And with our friends at Bloomberg, the future of how machines engage in media is core to our ongoing investigations. And let’s not forget virtual reality and augmented reality - those worlds are going to be core to how stories are told to new immersive audiences.

At the same time, there are some big questions on the horizon. How is social media impacting our community, our democracy, and our children? Tristan Harris from the Center for Humane Technology keynoted our three-day Summit in October, and his data was eye-opening and shocking. Finally, there are some significant questions ahead about how government should limit or legislate technology. Do we really want free speech curtailed by legislation, or do we trust that platforms like Facebook, Twitter, YouTube, and Reddit will limit themselves?

Weisler: How prepared are students today for work in the industry? 

Rosenbaum: I think that’s the wrong question. The question is: how prepared is industry to embrace the changing expectations of young audiences, and will it explore and learn from new audiences and provide information and entertainment that meets their changing expectations? The nature of holding an interactive platform that creates and distributes content has changed forever the formerly passive experiences that audiences expect. Young viewers want to watch, comment, share, and remix content within their social networks. For media professionals used to creating work that is ‘viewed’ by an audience, this new remix culture can be very destabilizing. But new audiences will expect nothing less.

Weisler: How has the pandemic impacted your work? 

Rosenbaum: Our organization was already working on distributed platforms. And our students and companies were comfortable shifting to remote projects. So the impact has been in some ways a driver for us. We feel a unique urgency to think about how New York tech will evolve to remain vibrant, and we’re working to open more of our programs so that we can grow our audience and our community. 

Weisler: Where are the openings for jobs?

Rosenbaum: New York’s tech community is growing quickly. We’re seeing huge footprints and headcounts for Facebook, Amazon, Apple, and Google, and in the media space Netflix and Amazon are growing their production footprints. I think you’re going to see a growing demand for developers, designers, writers, creators in 3D and VR, and people who have a creative mix of stories to tell and new visual ways to tell them. Will you be acting on stage, acting in a virtual reality world, or performing as an avatar? It’s too early to tell, but the nature of storytelling is about exploring ideas in front of an audience and then learning from that experience - so New York is positioned to lead the future of innovation and ideas.

Weisler: Where do you see the trends 3-5 years from now? 

Rosenbaum: Well, there’s no doubt that there’s a rough patch ahead. New York will need to tighten its belt and find new ways to invite young creators to call the city home. This is in many ways a media challenge. My experience in New York over the past six months has been amazing. People in my neighborhood are being careful - masks always - but also being social, and warm, and supportive. So it’s important that we share those stories within our networks, rather than let the stories that tend to be reported by the news be amplified and exaggerated; those stories are certainly serious, but not necessarily representative of the city at large.

The city and business need to pull together and create some clarity around aligned interests. We need to double down on innovation, on the start-up economy, and on providing infrastructure for the idea economy. We’re going to need new spaces for collaboration, new ways to support how teams work in both collaborative and remote units, and the funding to embrace what makes New York the best place in the world. I’m a writer, a filmmaker, a technologist, and an academic. But what fuels me is listening to jazz on the sidewalk on the Upper West Side, seeing Calder and Hopper at the Whitney, having dinner on a rooftop in the West Village. Music, food, fashion, literary events, as well as public discourse about politics, the environment, media, and technology. Our diversity is our most powerful strength, and embracing and supporting that diversity is our superpower. We must not forget that.

This article first appeared in Mediapost.com

 

 

The Challenge of Measuring Video Inventory. An Interview with AMC Networks’ Tom Ziangas


The question of measurement looms large, especially in how we treat remnant inventory. How should the industry monitor, handle and best measure this inventory? Tom Ziangas, Senior Vice President of Research at AMC Networks, recently participated in a panel on addressable advertising hosted by Mitch Oscar, USIM’s Advanced TV Strategist, for BIA. Ziangas shared his views on how addressable will impact linear and remnant inventory measurement.

It should be noted that the definition of remnant inventory can vary by platform. For addressable, it is often the fraction of a specific unit that might be unsold. In linear, it can be the full unit, which might then be sold as direct response.

Charlene Weisler: What do you envision as the best methodology to measure addressable?

Tom Ziangas: Just by virtue of legacy measurement and the utility of census-level RTB STB data, we will need a hybrid approach to measure addressable ad exposure. While today national ads are measured via the Nielsen panel to provide C3/C7 commercial ad measurement, measuring national addressable will need the hybrid approach of panel and census-level data. Addressable will be a footprint of the total U.S.; we will still need to back out the addressable impressions (census-level RTB STB data) from the currency C3/C7 national panel measurement from Nielsen. While this complicates measurement, we need to make sure that the advertiser is made whole and provided with accurate measurement of their ad placement in the linear and digital world. This applies to both linear and remnant inventory, with a small caveat: since most remnant inventory is not guaranteed, we may have more flexibility for non-currency reporting.
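
For illustration, here is a minimal sketch of the “back-out” Ziangas describes, with hypothetical numbers and field names: addressable impressions measured from census-level STB data are removed from the panel-based C3/C7 total so linear and addressable delivery can be reported side by side without double counting.

```python
# Hypothetical sketch: splitting a national C3/C7 total into linear-only and
# addressable components. Numbers and names are illustrative, not Nielsen or
# AMC Networks figures.
def reconcile_national_impressions(panel_c3_impressions: float,
                                   addressable_stb_impressions: float) -> dict:
    """Back out census-measured addressable impressions from the panel total."""
    linear_only = panel_c3_impressions - addressable_stb_impressions
    return {
        "linear_only": linear_only,
        "addressable": addressable_stb_impressions,
        "total_delivered": linear_only + addressable_stb_impressions,
    }

# Example: 10.0M panel-measured C3 impressions, 1.2M of which ran addressably.
print(reconcile_national_impressions(10_000_000, 1_200_000))
```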

Weisler: What are the other challenges that you see regarding data and measurement in this space?

Ziangas: As stated above, the biggest challenges are the “mixing” of methodologies (panel and census), creating standards among the addressable players in this space, and how we will all work with Nielsen to get this done. Just think about it: we have Nielsen traditional linear measurement on one side, and we will need to integrate addressable measurement from multiple players (Canoe, OAR, Nielsen Addressable, etc.) on the other side; all sides will need to collaborate to make the buying/selling process seamless. If this does not happen, we will be in the same place VOD is in today: under-valued and under-monetized. These challenges affect both linear and remnant similarly; we need to have measures and metrics for both to best understand the performance of the campaigns.

Weisler: Where do you see measurement in this space next year at this time?

Ziangas: While addressable is moving in the right direction, it is not keeping up with the pace of change. While I am talking about measurement, there is a lot of work that needs to be done operationally, and along with that, we need the time to make sure this process, from traffic to broadcast and technology to planning and reporting, is all in sync. So we will take the steps we are taking today, such as running pilots with our partners, learning with our partners, and implementing. We will be in a better place next year at this time, but it will continue to be a work in progress.

 

Oct 27, 2020

Ensuring the Advancement of Quality Data. An Interview with Truthset’s Scott McKinley

The question of data quality is becoming critical. Which datasets are useful and accurate, and which might lead us to false or misleading conclusions? Scott McKinley, CEO of Truthset, has set out to clarify and define the term “quality data” and set the standard for its future usage. “The world increasingly runs on person-level data,” he explained. “Marketing, advertising, personalization, dynamic pricing, and customer analytics all rely heavily on person-level data to power every interaction between a consumer-facing business and its customers and prospects. But all that data has substantial error, which hurts performance and costs businesses money.”

Charlene Weisler: What does your company do?

Scott McKinley: Truthset is the first company to measure the accuracy of record-level consumer data at scale, so that companies using record-level data for any use case can understand the quality of that data, make better decisions, and improve the performance of any data-driven activity. We believe that bringing accuracy to the data ecosystem will help all boats rise. Data providers, data marketplaces and platforms, marketers, and even consumers will benefit from improving accuracy in the data used to understand, profile, target, and activate.

Weisler: What is your definition of quality data?

McKinley: There are many aspects to data quality. We focus exclusively on the accuracy of record-level data. We know that accuracy is not binary, so we have developed a method to measure the likelihood that a given key-value pair is true on a spectrum of 0.00-0.99. As an example, if a data provider makes an assertion as to the gender of a specific ID, we can measure the likelihood of their assertion being true. Higher likelihood of truth equates to quality. Our mantra is that there is no perfect data set out there, and that even a good data set can have portions that are not good. Data sets can be measured at both the aggregate and record level to ensure that buyers can compare entire data sets, and users can pick and choose their own level of accuracy, because scale and price are still highly relevant in decision making about data purchases. For data sellers/providers, we measure the accuracy of their data at the record level and provide both an absolute metric that the provider can use internally and a relative index that can be used externally to help them differentiate in a sales environment.

Weisler: Does quality data vary by company, use, timing etc? How do you manage the changeable nature of what is actually quality data?

McKinley: Absolutely. From a data provider standpoint, they certainly believe that data quality varies, and many sell on that point to differentiate themselves from competitors. The tough part, up to this point, is that they haven’t had an independent third party to back them up. We have created our Truthscore Index to give the market a relative look at how data providers are performing comparatively, and an easier way to say they are X% better than the average score. It is very much the same on the data buyer/marketer side. Everyone has their own threshold as to what level of data quality they will accept. Throughout the year, campaign by campaign, marketers choose different segments at varying granularity or scale, and with each they have to make individual decisions, balancing scale, their budget, and now the quality of the data. With Truthset scoring at the record level, the marketer/data buyer can experiment with the level of quality that meets their needs and feel confident that they are getting what they are paying for. Regarding the changing nature of data, we require that every data provider we work with be measured quarterly to ensure that we always have the freshest possible data.
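
As a rough illustration of that relative index (with invented provider names and scores), a provider’s average Truthscore can be expressed against the average of all measured providers, so a seller can credibly claim to be “X% better than average”:

```python
# Hypothetical sketch of a relative accuracy index; names and scores are invented.
provider_scores = {"provider_a": 0.82, "provider_b": 0.74, "provider_c": 0.66}
market_avg = sum(provider_scores.values()) / len(provider_scores)  # 0.74

index = {name: round(100 * score / market_avg, 1) for name, score in provider_scores.items()}
# provider_a -> 110.8, i.e. roughly 11% better than the market average
print(index)
```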

Weisler: Does quality data span first to third party?

McKinley: The measurement and rating of data quality that we are focused on is record-level data. This means that whether the data is zero-party, first-party, third-party, or anywhere in between - collected via SDKs or pixels, by aggregating other data sources, CRM files, etc. - we can reliably assign a Truthscore to the record-level data so that the owner, buyer, or user of that data can understand the accuracy of each record. A Truthscore is a numerical value between 0.00 and 1.00 that quantifies the probability that consumer-level data is truthful and accurate.

Weisler: What data sets do you process and vet?

McKinley: We have built what may be the largest cooperative of consumer data with leading data providers, and we use that data to compile the most accurate view of demographic assignments for most of the US population. Today, Truthset keys off of hashed emails and the demographic attributes that describe those records. For example, a record would have my email address (hashed, of course) with “female, age 35-44, Hispanic, etc.” as the descriptors. We evaluate whether each one of those attribute values is correctly assigned to that hashed email. To do this, we work with multiple data providers and have them all contribute their weighted vote as to whether that association is correct. The weighting comes from comparing each data provider to validation sets (these are our “truth sets”), which gives each provider a weighted vote at each attribute value. All of the data providers then come together to vote, giving each record a Truthscore (0.0-1.0). We then offer the market an index of these Truthscores, data provider by data provider, attribute value by attribute value, to make the relative comparisons that we talked about earlier.
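
A minimal sketch of that weighted-vote scoring, with hypothetical provider names, weights, and votes, might look like the following: each provider’s weight for an attribute reflects how well it matched the validation “truth sets,” and the Truthscore for a record’s attribute value is the weight-normalized share of providers agreeing with it.

```python
# Hypothetical sketch of weighted voting on one attribute value for one record.
def truthscore(votes: dict, weights: dict) -> float:
    """votes: provider -> True if the provider asserts the attribute value is correct.
    weights: provider -> accuracy weight for this attribute, derived from truth sets."""
    total = sum(weights[p] for p in votes)
    agree = sum(weights[p] for p, vote in votes.items() if vote)
    return round(agree / total, 2) if total else 0.0

# Three invented providers voting on "gender = female" for one hashed-email record.
votes = {"provider_a": True, "provider_b": True, "provider_c": False}
weights = {"provider_a": 0.9, "provider_b": 0.6, "provider_c": 0.4}
print(truthscore(votes, weights))  # 0.79
```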

Weisler: How can quality data best be used by a company?

McKinley: In so many ways. The desire to use sets of consumer IDs combined with attributes (“audiences”) is expanding as more companies rely on data to inform processes such as marketing and advertising, offering financial services, attracting tenants for real estate, or recruiting talent. One example is digital advertising. A major alcoholic beverage campaign may be targeting Hispanic beer drinkers. We have seen that many of the IDs that end up being targeted are either under the legal drinking age or not Hispanic. By applying Truthscores to that ID pool before the ads are delivered, the brand can suppress unqualified IDs and make sure their advertising dollars are spent on IDs that are actually in the target and might actually convert.
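
In practice, that suppression step can be as simple as filtering the ID pool on the relevant Truthscores before the campaign activates. A sketch, with hypothetical field names and an arbitrary accuracy threshold:

```python
# Hypothetical sketch: keep only IDs whose Truthscores for the campaign's target
# attributes (legal drinking age, Hispanic) clear the buyer's chosen threshold.
MIN_SCORE = 0.7  # each buyer trades off scale vs. accuracy with this bar

def eligible(record: dict) -> bool:
    scores = record["truthscores"]
    return scores.get("age_21_plus", 0.0) >= MIN_SCORE and scores.get("hispanic", 0.0) >= MIN_SCORE

audience = [
    {"hashed_email": "a1b2...", "truthscores": {"age_21_plus": 0.92, "hispanic": 0.81}},
    {"hashed_email": "c3d4...", "truthscores": {"age_21_plus": 0.40, "hispanic": 0.88}},
]
target_pool = [r for r in audience if eligible(r)]  # second record is suppressed
print(len(target_pool))  # 1
```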

Another example is data enrichment. Many large enterprises acquire third-party demographic data to append to their CRM records. We have seen error rates of up to 40% in commodity demo data, even for the most common demographics. That error causes the enterprise to draw incorrect conclusions about their customers and leads to waste in advertising, as target ID pools are built on incorrect data. Truthset can “filter” the appended file to allow the enterprise to understand the probability of each third-party record being accurate, and use that information to hold the provider accountable, negotiate pricing based on accuracy, and suppress incorrect records. Better data leads to better results for all downstream use cases.

Weisler: How can you track the use of data through a process to see if at any point the data becomes compromised?

McKinley: This is another great reason why Truthset was created. A number of us have been at companies that specialize in identity, have used data science to transform data, or have bought data and inventory based on one understanding, only to learn after the fact that measurement told us something else had actually happened. It’s not just a story about how data can be good or not-so-good; in fact, a number of data providers we work with have great data at scale, but when it goes through “the hops” the accuracy can be degraded (or improved, in some cases!). Truthset believes we should be inserted at every hop to provide transparency into whether each step improved or maintained the quality of the records handled.

Weisler: Should the industry have a standard for data quality and if so how to implement and monitor? Or is it not possible given all of the walled gardens and silos?

McKinley: This is a really great question, and one that hits on so many things. First, yes. In order to make things better, you have to measure and understand, as well as have ways to continue to improve. In our estimation, to be successful as a data quality measurement solution, you need to hit on six key points:

  1. Independent, unbiased, unconflicted
  2. 100% transparent methodology
  3. Agnostic to ID spaces, attributes, and marketing channels
  4. Prescriptive, measuring BEFORE and not after-the-fact
  5. Supports any data-driven use cases
  6. The phrase “all boats should rise” should be true; the entire advertising ecosystem should benefit from transparency in data accuracy

Second, as we move toward a consumer-privacy-first ID space, even one with the potential to be fragmented, Truthset’s focus on record-level data means we can score whenever and wherever these IDs exist. It’s one of the reasons we chose to start with hashed emails. Lastly, markets run better with measurement. When there is opacity and uncertainty in any market, there is friction. Standards bring transparency between buyers and sellers and remove friction, so the market can grow faster.

Weisler: Has the pandemic created issues with data quality?

McKinley: The pandemic has created opportunity with data quality. Now, more than ever, the data driving marketing decisions has to be accurate, to be efficient with spend and to ensure a good customer experience continues. As people’s individual lives change, targeting someone with a past income range that they may no longer have will waste budgets and make that customer feel negatively toward the brand.

Weisler: Can we rely on COVID-19 tracking data for the US - is it of high quality?

McKinley: While we aren’t engaged in the COVID-19 data tracking space, the situation does bring into frame that transparency and granularity in data (within consumer privacy guidelines) are critical to making decisions and acting in the best interest of creating solutions. Also, measurement of data is crucial to determining whether there is actual success. Very much like understanding COVID’s daily funnel of metrics (tests, cases, hospitalizations, recoveries and deaths), setting a benchmark for data quality via an independent third party, and continuing to track and measure, is important to determining whether actions are resulting in success. We have encountered a number of data providers who are about to go through data cleansing, bringing on new sources, or other data science transformations, and some have hesitated to be measured at this point because they want the best data (assumed to be post-cleansing/data science) to be scored. Our response is that we think they should measure twice and cut once: be scored before the changes, make the change, and score again afterwards.

Weisler: Where is Truthset now and where do you see the company in 2 years?

McKinley: We want to become the standard for how buyers, sellers, and users of consumer data measure the accuracy of record-level data. The Truthset flywheel starts with data cooperation as the first step. We are squarely focused on bringing accuracy to the marketplace for hashed email addresses as consumer IDs tied to the attributes that describe those records. Next, we want to help brands and enterprises understand how bad data hurts marketing performance. We are already engaged with major CPGs and will be producing case studies to demonstrate the cost of error in data, and how we can help fix it.

In the upcoming months (this year and early next year), we’ll be working with leading sell-side and buy-side companies in the ad ecosystem to bring data accuracy scoring into the equation. We want Truthscores to be available wherever data is available to build audiences and activate. From there, we expand to additional attributes (interest-based segments, purchase, etc.), move into new environments like CTV, and move further into the programmatic exchange of buying and selling.

This article first appeared in www.Mediapost.com

 

 

Oct 19, 2020

The Unruly World of CTV During the Pandemic. An Interview with Unruly’s Terence Scroope


CTV is still the Wild West, and never more so than during this time of pandemic turbulence. Unruly’s Terence Scroope, VP of Insights & Solutions, has recently released a study, in collaboration with Tremor Video, to help make sense of the landscape for marketers.

It is a global survey of 2,562 consumers in the US and the UK, conducted in the summer of 2020. “We work with accredited panel providers in every market to recruit a demographically representative panel. We have also conducted the research in Germany and will be rolling the research out to APAC markets as well,” he added.

Charlene Weisler: How often is this study conducted - are there any comparisons pre and during the pandemic?

Terence Scroope: This was the first time we have conducted this specific survey, but we have a long history of data and research including our recent consumer behavior survey at the start of COVID-19, which showed US consumers increasing their media consumption significantly — particularly across mobile and CTV formats. 

Weisler: What were the takeaways and were there any surprises?

Scroope: We found three major topline takeaways: that CTV is more effective than linear TV in driving consumer behaviors, that 35% of US consumers have tried a new ad-supported streaming service since the outbreak of COVID-19, and that 42% now plan to cancel cable TV.

It was interesting to see that more than a third (35%) of US consumers have tried new ad-supported content since COVID-19, with 79% saying they will continue to do so. It was also clear from the research that CTV crosses all ages and is not just something that Millennials are drawn to. And US consumers are more likely to take positive actions — including searching, purchasing and improved brand favorability — after being exposed to CTV ads compared to linear TV. The presence of ads is not seen as a negative for consumers, with only 21% of people who pay for subscription services doing so to avoid ads.

Weisler: Any differences by age or gender or region?

Scroope: We discovered that CTV consumption has mass appeal across all age groups. 74% of 35-44 year olds actively seek free ad-supported TV content, followed by 71% of those ages 45-54 and 63% of those 55+. These groups were actually more likely to seek free ad-supported TV content than younger groups: 25-34 year olds (62%) and 18-24 year olds (57%). While the US market is the most mature when it comes to CTV adoption and offerings, we are seeing some global markets quickly catching up.

Weisler: What do you think this bodes for the future of linear? And of CTV?

Scroope: Based on the convenience CTV offers, and the increased usage we’re seeing, I think it’s safe to say that CTV is here to stay. As media buyers move from buying shows and broad segments to buying specific audiences, the wealth of data and the addressable nature of CTV will make it a far more valuable proposition moving forward. 

Weisler: How should marketers re-apportion within the next year and three years from now?

Scroope: We expect the trend of adoption to continue upward, and as more viewers cut cable and limit paid subscriptions, there will be an increasingly distinct audience that is only available within CTV. We may also see some evolution in monetization models, as some subscription services are already testing out ad supported formats in foreign markets.

This article first appeared in www.Mediapost.com

 

 

Oct 16, 2020

The State of the U.S. Video Marketplace. An Interview with FreeWheel’s David Dworin

The media marketplace has not only transformed over the past eight years, it has also offered advertisers and programmers a wealth of information from premium video content data. 

Discerning the trends has been especially important to FreeWheel which has been using its data since 2012 to help inform the industry. “We wanted to get a sense of size and the pace of growth of the market. And it’s really grown,” explained David Dworin, Vice President Advisory Services, FreeWheel.

The most recent wave of their U.S. Video Marketplace Report, fielded during the pandemic, offers a fascinating overview of the market in flux.

Methodology

Using FreeWheel data offers a “comprehensive look into ad-supported premium video, because we see it across all of the different devices, across all of the major movie companies, the major television companies, the different platforms, the different apps, which offers a holistic view,” he noted. The methodology has been refined over the years, including the terminology they use, and the data set has also expanded over time. The data is powered by FreeWheel for dynamic advertising, “anywhere where two different people may get a different ad, versus scheduled advertising in traditional advertising,” he explained. The data consists of digital, set-top box VOD and some addressable television.

Study Takeaways

“We’ve been saying for years that the power of premium digital video is that you weren’t constrained by watching it on a cable box. You could watch whenever you wanted, wherever you wanted, whatever you wanted, regardless of when something was traditionally programmed or what device it was for,” Dworin began. “The wherever has changed,” he continued, “because everyone was watching stuck at home in the first half of the year” because of the pandemic. This has resulted in impressive growth in the number of video views and ad views. “What is really interesting is that we are seeing these double-digit increases every year, but they are increasing off an ever-growing base, up 30% in ad views.”

Another takeaway is that “nearly three quarters of viewing is happening on the big screen, on a traditional television; about half of it is happening through a connected TV device and about a quarter of it is happening through set-top box VOD,” he stated. Compared to 2012, when the study was launched, “it was almost all happening on desktop, with a little bit happening on tablets.”

Finally, “we have seen the content mix that people watch be fairly stable over the years,” he explained. “But this year, because of the pandemic, we’ve seen sudden shocks to it. Traditionally we see a steady share of entertainment, news, sports, but we’ve seen it shift. Entertainment stayed high. But we saw in March a spike around news, with a turn to digital channels as the source for news.”

Is this a temporary result of the pandemic? Dworin doesn’t think so. “This is a year of accelerated change,” he explained. “What we are seeing happening in our data and anecdotally is that this is not a blip in 2020. All of these trends that we have been talking about for years, this year was the catalyst that really started to transform the industry and push so many of these things forward faster.”

What the Data Revealed

FreeWheel focused on a specific week in March, at the peak of the pandemic, to answer the question: “Are people shifting when they decide to watch TV now that they are home all the time? We saw almost a 15% bump in the amount of time people spent watching between noon and 6 p.m. Then another bump in the late afternoon, and then bumps in the morning, almost 10%, and in primetime,” he revealed.

When you examine the data by platform, Dworin noted that, “the biggest differences by platform occurred in the first half (of the pandemic). But when we looked at the full first half, the biggest thing we see is continued explosive growth around connected TV, growing over 40% year to year, and it is now about half of the digital viewing.”

With the lack of live sports at this time, “advertisers have had to use data in more sophisticated ways in order to find those audiences in other places. We will see a return to sports by viewers and advertisers as well as seeing advertisers adopt new tactics that they had to employ without sports there,” he stated.

As far as frequency is concerned, “the technology is there to reduce the amount of repetition of creative, so in 98% of on-demand sessions, creative repeats either zero or one time,” he said, although “it is difficult to frequency cap today across different publishers. If you watch something on one app at one point and then on a different app, you may not get a frequency cap across all of them.” But he noted that there are some initiatives out there to address this issue. Instead of the term incremental reach, Dworin prefers to call it “unique reach, which is people that you are not able to reach on traditional linear television. This is a way that you can do it in a less cluttered environment.”
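
To make the mechanics concrete, a per-viewer frequency cap within a single publisher can be as simple as a counter keyed by viewer and creative; the cross-publisher gap Dworin describes exists because no such shared key spans apps. This is a hypothetical sketch, not FreeWheel’s implementation:

```python
# Hypothetical sketch of a per-publisher frequency cap keyed by (viewer, creative).
from collections import defaultdict

CAP = 2  # allow at most two exposures of a creative per viewer in a session

exposures = defaultdict(int)  # (viewer_id, creative_id) -> exposure count

def can_serve(viewer_id: str, creative_id: str) -> bool:
    return exposures[(viewer_id, creative_id)] < CAP

def record_exposure(viewer_id: str, creative_id: str) -> None:
    exposures[(viewer_id, creative_id)] += 1

if can_serve("viewer_123", "creative_A"):
    record_exposure("viewer_123", "creative_A")
```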

Next Steps for Advertisers

According to Dworin, advertisers are still playing catch-up. “The amount of advertising dollars is not necessarily proportional to the amount of viewing that’s happening in what is still a fairly constrained supply environment,” because there is less inventory in connected TV. He advises advertisers to consider their cross-screen video strategy and “how to best incorporate data, how to incorporate programmatic and addressable advertising, and accessing this inventory long term in a way that gives them access to really differentiated and important inventory, while also making sure that publishers are able to keep it brand safe and compliant - everything everybody loves about traditional advertising.”

This article first appeared in www.MediaVillage.com

 

Oct 15, 2020

Viewership and Its Impact on the Business. An Interview with NBCU’s Mark Marshall


Perfecting cross platform measurement is not only a lofty goal for the industry but also a pivotal one. NBCU has been focused on it. “We started down this path in 2018 of looking at cross measurement in a different way in order to give marketers a single view of impressions whether that impression was running on digital or television,” noted Mark Marshall, NBCU’s President of Advertising Sales and Client Partnerships.

Cross Platform Consumption Report

His recently released Cross Platform Consumption Report, now in its second year, confirmed what he and his team have seen in viewing data. “Consumers are really their own programmers at this point,” he began. “You can look at the study and see that people are consuming at their own times, in their own manners, in the way that they want to consume.”

But the surprise for Marshall was, “as much as we talk about the disaggregation of how people are consuming content, the thing that brings them all together is typically the television set.”  He noted that, “Ninety-seven percent of consumption is on a television set. Even with the huge growth we had in streaming over the past few years, that streaming has driven people back to the set just like they were years and years ago.”  

Viewership Trends

The migration back to the set is an important takeaway. But we need to go deeper to really understand how this can impact networks and marketers. The report, he explained, “for the first time, broke out the programming on digital as well as linear and put it all together. It showed what was on that night, the consumption that happened that night on linear and what was the consumption that happened on digital.”

What it showed for a program like This Is Us, for example, is that it is a top show on the night it airs on linear as well as a top show across the entire week when all of the streaming is added. “The idea of looking at linear television as a point in time is something we want people to get away from,” he stated. Happily, “most marketers have evolved. But maybe not as quickly as the viewership has changed overall,” he averred.

What is notable is that this consumption pattern has changed the way programmers gauge success. “It’s funny,” Marshall said, “We used to sit and wait for the overnights and that was a determining factor if a show was a success from the night before.  Now we really start to evaluate shows after 35 days. That is really the full picture.” Of course the viewing pattern depends on the type of show. The Voice, as a contest, may have higher viewership within a few days of its original airing while This Is Us can be viewed over a longer timeline.

The Impact on Sales

All of this raises the question: if we are looking at a month of combined linear and streaming to confirm viewership, how does that impact sales flights? Marshall explained that when they brought CFlight, their one-platform sales solution, to market, “We did two things. We wanted to unite and get a consistent vision of where impressions were running and start to flight their schedules differently.” Pricing, he stated, “is determined by the marketplace supply and demand,” but, “let’s stop looking at linear and digital as two distinct markets. Let’s look at it as one holistic market, as one holistic supply and demand view of all of it, in order to give marketers the most comprehensive package. And price that package to work for them instead of having two separate negotiations as was done in the past.”

The Impact on Scheduling

When the viewer is in charge and can essentially view on demand, scheduling strategy – time period, day of week, lead-in and lead-out – may no longer be a factor. But Marshall disagrees. “Scheduling does matter,” he noted. “What you see is appointment television at the front end of the week, and as you get to the back half of the week, people are catching up digitally. When we think of scheduling, we no longer think of scheduling as just a specific flow from show to show. We think about it from medium to medium.” The report notes that consumption toward the end of the week has risen 23% in the last decade.

Is it possible that the pandemic has been accelerating these trends? “It’s possible,” he noted. “We saw growth in our co-viewing numbers, even in news. So the idea that people are home more, and people are sitting down and watching television more often, absolutely has shown changes in viewer habits over this time.”

Metrics and Measurement

Measurement, for Marshall, is personal. “I just turned 50 this year. At 50, I no longer count in 18-49. I am in the market right now to buy a car. The idea that I no longer count to a carmaker makes no sense. What an automaker really wants is to reach a consumer who is in market. Our metrics need to keep up with where we are – a buying audience instead of a demographic audience – to drive sales,” he explained.

With metrics, “delivery is where it’s at now, where impressions are the common measurement tool. But the long-term goal is to get us to transact in different ways, such as our recently announced total transaction impact. We can actually talk to the auto industry and show an auto manufacturer what percentage of sales NBCU was responsible for, and start to build towards guaranteeing on actual sales as opposed to impressions,” he stated.

With full cross platform measurement, “This is an issue that we have to take on as an industry to get us to a common metric. Let’s stop treating impressions the same. Let’s start looking at what the real value and impact of an impression is, and not pretend that a two-seconds-with-the-sound-off impression is the same as a thirty-second spot seen in its entirety. Those can’t be valued the same.”

The Future

Looking ahead, Marshall predicts the full confluence of consumption. “We will continue to see the evolution to digital, but at some point you are not going to see the term ‘digital’ being used. It is going to be the video consumed and total consumption,” he explained. And for advertisers, “It is hard to evolve and make distinct change unless you take a risk and throw away some of the legacy. Align with the future consumer and viewing habits instead of past trends: look at things on a one-platform basis, at an audience level, and let consumer habits determine where your ads are going to run,” he advised.