This article originally appeared on the World Economic Forum and the Oliver Wyman Forum.
The benefits of digitization are growing. Even before COVID-19 struck, digital goods and services were expanding four times faster than the overall economy in the United States. Then video conferencing, online shopping, telemedicine and the like enabled tens of millions of people around the world to adapt after the pandemic erupted last year. Today the five largest US technology stocks account for nearly a quarter of the value of the S&P 500 Index while China’s big three account for nearly a third of the value of the MSCI China Index.
Yet consumers worry about the way companies capture their data and influence everything from their news and music feeds to the advertisements suggesting what they should buy and where. The majority of consumers say they prefer to maintain their privacy and avoid sharing information with companies, according to a 10-country sentiment survey by the Oliver Wyman Forum.
Without deep reform of the way companies treat data and governments regulate it, this mistrust threatens to become for the digital economy what carbon dioxide is for the physical world: an unseen pollution that threatens the sustainability of data ecosystems. And like carbon, this mistrust has externalities that can cause societal harm. Willingness to share health information to contain the coronavirus, for example, declined as the pandemic worsened last year.
A Tipping Point in Data Mistrust?
According to a survey of US consumer attitudes toward 400 brands by Lippincott, the brand consultancy arm of Oliver Wyman, people rate major global social media brands lower than others in healthcare, finance, media, retail, and consumer products on questions including whether the brand understands my needs, shares my values, always has my interests at heart, and does more good than bad for society. Consumers also express less willingness to share data with those companies than with firms in the other industries.
That finding may seem paradoxical given that people in practice share large amounts of data, some very personal, with social media companies. Yet each new breach or misinformation campaign erodes public trust and risks a tipping point in consumer willingness to share.
Political pressure is growing for tighter regulation. The European Commission has drafted legislation that would enhance consumer rights and protections and crack down on potential monopolistic behavior by tech platforms. The CEOs of several big social media firms told a recent Congressional hearing that they were open to reforms of the liability shield they enjoy under US law.
Focus on Transparency, Consumer Choice and Competition
Companies should take the lead in rebuilding trust, and that starts with transparency. A seven-country survey by the Oliver Wyman Forum found that providing transparency about how data is shared was one of the two top priorities of consumers, with 51 percent saying it would make them feel comfortable giving mobility companies access to their data.
Firms should be open about the types of information they collect, the steps they take to keep it secure, how the data will be used, what benefits consumers can expect, and whether data will be shared and for what purposes. Equally, firms should specify wherever possible what data they will not collect. These disclosures should be in everyday language, not dense legalese. And companies should consider working with nonprofits or civil society organizations to reinforce transparency by auditing data practices.
Data-sharing also should be reasonable. One way to ensure that would be to share only anonymized data. Fifty-one percent of respondents to the Oliver Wyman Forum mobility survey said this assurance would make them more comfortable sharing data. And given the ubiquity of available information sources, anonymized data is sufficient for many tasks, such as serving relevant ads to consumers.
Transparency needs to extend beyond data itself to the algorithms companies use to tailor news feeds, sell advertising, and make decisions on things like hiring and lending. Pressure is growing for new rules to enforce accountability and prevent algorithmic bias, but industry doesn’t have to wait for regulators or legislators to act. Reassuring consumers that the choices and information they receive are sound and fair should promote responsible data-sharing and build trust.
Finally, companies should reinforce transparency with empowerment. That means giving consumers the ability to access their data, to decide whether it can be shared with third parties, and to request that data be deleted or made portable so a customer can take it to another service provider. Companies also might work with other organizations to foster the creation of data trusts or cooperatives, which would store data and give consumers greater control over how it is used.
For policymakers, building trust and ensuring a level playing field should be the guiding principles of any new regulatory initiatives. Existing measures like Europe’s General Data Protection Regulation and the California Privacy Rights Act have strengthened privacy protections but don’t address issues like misinformation or algorithmic accountability.
Filling that gap won’t be easy considering today’s political polarization and the sensitivity and lack of global standards on issues like free speech. But a few key principles should guide policymakers.
Start by measuring the effectiveness of existing regulations in building trust. Then ensure that new regulations are designed to promote greater choice. Measures that empower consumers or require algorithmic accountability should have enforcement mechanisms proportional to the size of the firm. Imposing the same burdens on start-ups as on tech giants can stymie innovation and competition.
Rebuilding trust won’t be easy, but the risks of inaction are far greater. It’s time for technology companies and policymakers to get to work.