Kodi Foster, SVP of Data Strategy at Viacom, is watching the media ecosystem carefully and seeing trends that could spell disaster for media companies. His talk at the recent PSFK Conference was a lesson in caution.
For one thing, he believes we are marketing to the wrong targets. Further, we may be drawing the wrong conclusions from the data we are capturing, because the data itself is skewed, collected in echo chambers of like-minded voices. Tread carefully, he warns, lest you be led astray.
The Internet Is Dead
Foster’s company, Viacom, is focused on understanding the current themes in technology, and one theme is clear: “The internet is dead,” he began. “But what I really mean by that is that the World Wide Web is dead. It is essentially cloud storage and social media. When you really think about it, how many times are you on a dot-com nowadays? Maybe Google, but everything else you are doing is on an app.” Or, he added, on messaging, which is poised to be even bigger than apps in the next year or so. He also believes that surfing the web is “not a thing” anymore and will probably not come back. The overall direction of the internet, as with everything else, is decidedly evolving.
Technology Moves Our Cheese
If Foster is right that the tech landscape is shifting as people use apps and messaging rather than websites, that shift leads marketing down a dark path. According to Foster, “We don’t market to people anymore. We are marketing to the devices that are between us and people. We game theory the algorithms of these technologies so we can get our marketing and content in front of a human being.” Because of this, marketers are no longer trying to understand human beings; they are trying to understand the biases of these technologies. If you are in marketing, Foster explained, understanding those biases has become your job. “Your job is NOT to understand human beings,” he stated. This is leading us down a dangerous analytical path where the data we create every day “is shit” and “when you put shit in you get shit out.”
Losing Focus in the Echo Chambers of Data
So when you use this bad data to make predictions and craft insights, there is a risk that the outcomes will be wrong. “Machines are not trying to connect people, they are usually promoting something,” Foster added. Promotion is curated. “Who is deciding that curation?” he asked. It becomes a self-fulfilling prophecy within these ecosystems: the algorithms surface only the things they want you to see, and when you click and engage, that engagement reaffirms the algorithm, even if it was not something you wanted to see in the first place.
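To make that feedback loop concrete, here is a toy simulation of my own (not from Foster’s talk, and not any platform’s actual system): a recommender starts with no preference, but every click nudges its weights, so the topics a user happens to engage with quickly crowd out everything else. The topic names, click probabilities, and round counts are all invented for illustration.

```python
# Toy sketch of an engagement-driven feedback loop (illustrative only).
import random

TOPICS = ["politics", "sports", "tech", "food", "travel"]

def simulate(rounds=1000, seed=42):
    rng = random.Random(seed)
    # The recommender starts neutral: every topic is equally weighted.
    weights = {t: 1.0 for t in TOPICS}
    # Assume the user genuinely likes two topics and merely tolerates the rest.
    likes = {"politics", "tech"}

    for _ in range(rounds):
        # The algorithm shows whatever its current weights favor...
        shown = rng.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
        # ...and a click (more likely on liked topics) reaffirms that choice.
        clicked = rng.random() < (0.8 if shown in likes else 0.1)
        if clicked:
            weights[shown] += 1.0

    total = sum(weights.values())
    return {t: round(w / total, 2) for t, w in weights.items()}

if __name__ == "__main__":
    # After enough rounds, the liked topics dominate the feed,
    # and the other topics all but disappear from view.
    print(simulate())
```

Nothing in this sketch models what the user actually wants beyond the first few clicks; the narrowing comes entirely from engagement reinforcing the weights, which is the self-fulfilling prophecy Foster describes.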
“Socially connected people tend to be similar,” he posited, so there is the danger of creating “an echo chamber.” When you are fed information and you agree with it, the algorithm learns to feed you more of the same. “It’s giving you what it thinks you want because that is the only stuff you are seeing,” he added. This creates a social contagion, especially when it involves false information. The effect is only starting to come to light and, even now, we may not be fully aware of the ramifications. The data we are gathering and using from this echo chamber may be biased and suspect.
Amplifying Bad Results
The advent and growth of fake news are especially troubling. “We assume, because of these finely structured networks, that everyone within our network believes the same thing that we believe. We assume that the belief is bigger across the world than it actually is and that more people share that belief, because the only people we are around believe the same thing,” he warned. That is the danger of social echo chambers.
We may not be taking this war on reality as seriously as we need to, despite the revelations of the privacy transgressions of Facebook and Cambridge Analytica. The reliance on harvested data, collected in stagnant social pools and analyzed without context, is leading us down a dark societal path. How can you have an accurate reflection of people’s behaviors if the data is not accurate? “What happens to human civilizations when the connective tissue of reality becomes more and more fragmented? How do you create a predictive algorithm for anything when there are tens of thousands of different versions of reality? Whose reality are you predicting?” Foster asked.
Knowing that this is happening and seeing the ramifications brings us to a stark juncture. Should we just keep calm and continue to play the same marketing game? I believe complacency at this point would be a grave mistake. If we are being led astray now by human-created algorithms, wait until machine learning and artificial intelligence really ramp up in the next few years. Sleep tight, children.
This article first appeared on www.MediaVillage.com.