Can we liberate and protect our digital usage? Or will a misguided policy response end up making it all the more confined?
Your online choices are rarely private. With the recent revelations of the ‘do not track’ scandal, consumers need to be more aware of whether the privacy options they are offered actually perform as advertised. More importantly, they need to understand that the wrong policy response may improve some versions of privacy at the expense of accelerating other, less innocuous trends that are shaping how online communities are created and organized.
Giving people control of their data without giving them the tools to understand that control may end up doing harm, not because control is negative in itself but because the process does not go far enough. With new data laws, we are being given more choice as to how and when our data is used – or at least more visibility into its use – while being shown how much easier it is to create echo chambers than the spaces that connect us. The implication is that the very process we expect to liberate and protect our digital usage may end up making it more confined.
For the most part, we meet people by accident and without prior design. But who we choose to become friends with is never accidental. We tend to pick people who agree with us in some regard. The same goes for the sites we repeatedly visit, the news we read, the items we buy, how we find apartments, how we find online dates – the list goes on.
We can curate our reality more readily and more precisely than ever before, but under such trends the nature of that curation, and our control over it, comes to depend on what kinds of information we allow the algorithms to use.
We adapt to the spaces available, to the tools given. Whether or not we intend it, we live in a marketplace of social judgement. We avoid some kinds of relationships and choose others, often by how they make us feel – comfortable, part of a community, or whatever it may be that makes us repeatedly pick that site or person over another.
How businesses see us is increasingly a matter of the data they have access to and the inferences they feel comfortable making with that data. Producers at Netflix are known to base production decisions heavily on data, identifying how the choices they make translate into viewers gained or lost. Indeed, Netflix is less a game of content than of producing a menu for various forms of disengagement: what to leave on in the background, what people feel comfortable with, tuned at a fine-grained level in the script and structure of the content itself.
The process of self-selection means businesses might be incentivized to curate personalized experiences in a way that mirrors echo chambers – competing to be the most inviting, the most ‘like-you’, the least judgmental. We pick experiences that affirm us in some sense – and often this means affirming our beliefs.
This process of self-selection increasingly plays out in how we respond to privacy, or how we understand it at all. The data profiles that businesses and governments hold on us – or that we may eventually choose to allow – are already shifting the framework of who understands us best, who has access to the data that predict our choices and preferences, and who might judge us least. Echo chambers are only incidentally digital: they stem from a more fundamental habit of following the path of greatest convenience and the desire to curate our experiences accordingly.
Nike’s recent venture into political marketing offers a clear view of this trend without the personalization. Can Nike be everything to everyone? Can it selectively appeal to the right and the left with targeted political marketing and a neutral, widespread campaign?
Beyond business, we can push the question deeper into our personal lives. Socially, every person we meet holds a different mental projection of us, created from different experiences and from their own biases; the same goes for businesses and governments. Differences in data sets and proprietary algorithms create – however slightly – different experiences of us across these firms.
The worry is less that this process will accelerate the rate at which echo chambers outnumber the spaces that connect us, and more that it will push us to a place where we cannot tell the difference.
As more control over data is ceded to citizens, they can opt into different arrangements and businesses based on how they believe their data will be used. When we self-select how algorithms view us, choosing the data available for them to predict and perceive us, we need to consider not only how we want to be judged but also, unfortunately, what that judgement serves to do within the larger social dynamics at play.
We should beware a world where the likelihood of unexpected connections keeps decreasing. The attempt to correct the imbalance in data control between firms and citizens may end up consolidating that control with businesses, the very outcome many have tried to avoid. We are moving towards a world by design, where serendipity holds less sway.
This article, written by Danny Goh, Mark Esposito, and Terence Tse, is also featured on The RSA and the World Economic Forum.
Danny Goh
Serial entrepreneur and early-stage investor; co-founder and CEO of Nexus FrontierTech; has invested in more than 20 early-stage start-ups; currently serves as an entrepreneurship expert with the Entrepreneurship Centre at Said Business School, University of Oxford.
Mark Esposito
Professor of business and economics at Hult International Business School and at the Thunderbird School of Global Management at Arizona State University; a faculty member at Harvard University since 2011; a socio-economic strategist researching the Fourth Industrial Revolution and global shifts.
Terence Tse
Professor at ESCP Business School and a co-founder and executive director of Nexus FrontierTech, an AI company. He has worked with more than thirty corporate clients and intergovernmental organisations in advisory and training capacities.