An Ideas-Based Online Magazine of the Global Network for Advanced Management

Can We Protect Data Privacy?

Earlier this year, the Cambridge Analytica scandal brought new scrutiny to the ways in which online companies make use of customer data; a few weeks later, the European Union's General Data Protection Regulation came into effect, creating new requirements for data protection and disclosure. Global Network Perspectives asked experts around the Global Network for Advanced Management how consumers, companies, and governments in their regions are responding.


Germany

Martin Schallbruch, Deputy Director and Senior Researcher of Cyber Innovation and Cyber Regulation, Digital Society Institute (DSI) Berlin, ESMT Berlin

Germany has a long tradition of regulating in detail, by law, how companies handle their customers' data. The first Data Protection Act came into force in 1977. The principle of "prohibition with reservation of permission" has always been decisive: a company must have an explicit legal basis or explicit consent in order to process customer data. This principle also underlies the GDPR. However, its implementation will probably be observed even more consistently in Germany, because the penalties the GDPR imposes for breaching the legal regulations are extraordinarily high.

However, this strict regulation of personal data handling is also criticized by many European companies because it makes new digital solutions, such as big data analyses, considerably more difficult. This applies, for example, to the health sector, where the aggregation of data from different sources could provide considerable added value for research and product development but often faces high legal hurdles.


Hong Kong

Hui Kai Lung, Chair Professor and Deputy Head of the Department of Information Systems, Business Statistics and Operations Management, HKUST Business School

In Hong Kong, regulation leans more towards the European Union model, which essentially treats individual privacy as a basic right. This significantly restricts firms’ freedom in using consumer, employee, or other individual data. For example, the Privacy Commissioner in Hong Kong has taken action against firms for misusing consumer data from loyalty reward programs and in direct marketing, for extended storage of consumer bankruptcy records, for unnecessary collection of employees’ biometric data, and for aggregating and sharing consumers’ litigation profiles without consent. These incidents have made local firms cautious in handling personal data. However, a serious limitation is geographical and political boundaries: Hong Kong law largely applies to firms located in Hong Kong or conducting their primary business there, which means foreign firms are largely unaffected by these regulations.


South Africa
Tim London, Senior Lecturer, Allan Gray Centre for Values-Based Leadership, UCT Graduate School of Business

There are three directly interrelated issues to tackle to improve policies for the use and sharing of data: the legal requirements placed on companies, the companies’ statements of intended data usage, and consumers’ ability to understand and make informed decisions regarding how their data will be used. The legal requirements for how companies handle data – for example, how data can and cannot be shared with third parties – have come under increased scrutiny lately. While the reasons for this public attention are indicative of significant problems, hopefully the scrutiny now afforded to the issue will help ensure that better, more nuanced laws and regulations are written.

Related to this, however, is the need for companies to develop readable data usage policies. Currently, the legalistic, incredibly long and detailed statements are complete gibberish to the vast majority of consumers. In addition, even the most committed consumer would have trouble reading through the voluminous statements on data sharing, leaving almost every consumer completely in the dark as to just what they clicked “I Agree” to in the first place. While companies certainly need legally robust protection in this arena, they can also develop “translations” of the complex legal terminology into short, clear statements that their consumers can actually understand and interrogate. If companies are serious about complying with both the letter and intent of the relevant laws, having both of these available to their consumers is a must. While lying or omissions have certainly happened in the corporate space as it pertains to data usage and sharing policies, the vast majority of issues we see are actually due to data sharing that is buried somewhere in a massive pile of legalese.

Finally, assuming companies take the necessary steps to create these more clearly understood statements, consumers need to commit to reading them. Many consumers have previously just clicked on the necessary boxes to get to using their apps and programmes; as recent scandals have shown us, there can be severe consequences for this lack of attention. So while there is, and should be, a much higher expectation for companies to commit to ethical and understandable policies, it is also incumbent on consumers to engage with these policies more rigorously. While that’s not terribly fun, now that we have seen what can happen when we’re not paying attention, we have to put some of the onus on ourselves to ensure the situation improves.

Tied into all of this should be a change in our expectations of the companies we give our business to, as it relates to their transparency. For the vast majority of products and services, we have many different options; consumers should begin to factor in how user-friendly a company’s data usage and sharing policies are, just as we would weigh the UX of its website, service, or product. Increasingly, we’re going to see consumers stop choosing companies that continue to make it difficult to understand their data sharing policies. In the shorter term, this will be at least a small aspect of product and service differentiation in the market; in the longer term, this type of user-friendly approach to transparency in data sharing will become a bare minimum requirement.

Switzerland

Vanina Farber, Professor for Social Innovation, IMD Business School

Facebook’s most valuable asset is the knowledge it accumulates about its 2.13 billion monthly active users. Every emoji, post, and friend connection gives the social network more information about a user’s preferences, including shopping choices and political views. That data is then used by advertisers, who provide Facebook with almost all of its annual revenue, which stood at US$40.7 billion for 2017.

Advertisers pay for that data to target their messages to the right audience segments. Facebook and other ad-stuffed tech companies have faced some scrutiny, particularly from privacy regulators and campaigners about their business practices, but the Cambridge Analytica scandal has brought those concerns to the masses.

It’s worth noting that other industries have been berated over such business practices. Whether it is the pharmaceutical industry’s approach in the early 2000s to marketing AIDS medication in low-income countries, the impact of mining practices on the environment, or the contribution of highly processed foods to childhood obesity and diabetes, corporations are under pressure from investors and consumers to account for the impact of their behavior on society.


Michael Wade, Professor of Innovation and Strategy, Cisco Chair in Digital Business Transformation, IMD Business School

Imagine for example that you could identify a segment of voters that is high in conscientiousness and neuroticism, and another segment that is high in extroversion but low in openness. Clearly, people in each segment would respond differently to the same political ad. But on Facebook they do not need to see the same ad at all – each will see an individually tailored ad designed to elicit the desired response, whether that is voting for a candidate, not voting for a candidate, or donating funds.

Cambridge Analytica worked hard to develop dozens of ad variations on different political themes such as immigration, the economy and gun rights, all tailored to different personality profiles. There is no evidence at all that Clinton’s election machine had the same ability.


United Kingdom

Andrew Stephen, Associate Dean of Research & L’Oréal Professor of Marketing, Saïd Business School, University of Oxford

How was it that Cambridge Analytica was able to collect the personal information of 71 million Facebook users, despite only 270,000 of them giving consent? The answer lies in the apps we download and the ways in which they harvest data from phones and other mobile devices. Each time we click ‘accept’ to install an app, it opens a backdoor that enables it to access our contacts, which could include not only personal details but also call logs and photos of family and friends. Studies have shown that people rarely review privacy policies and permissions, and in my own research, conducted with a sample of 287 business students in London, 96% of participants failed to realize the scope of all the information they were giving away. Imagine that each consumer has 200 contacts on their phone, and each of those has a further 200, and it’s easy to see how the multiplier effect works to rapidly build a huge data repository for an organization, even though that data was gained without permission from its owners.
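The multiplier effect described above can be sketched in a few lines of arithmetic. The contact count and the overlap factor below are illustrative assumptions for this sketch, not figures from the study cited in the article:

```python
# Back-of-the-envelope sketch of the contact-graph multiplier effect.
# All inputs are illustrative assumptions, not measured data.

consenting_users = 270_000   # users who actually clicked 'accept'
avg_contacts = 200           # assumed contacts stored on each phone

# First hop: contact records reachable directly from consenting users.
raw_contact_records = consenting_users * avg_contacts

# Real contact graphs overlap heavily (friends share friends), so the
# number of unique individuals is lower than the raw product. Even
# assuming two-thirds of records are duplicates, tens of millions of
# people are reached without ever having given consent themselves.
unique_people_estimate = raw_contact_records // 3

print(f"raw contact records harvested: {raw_contact_records:,}")
print(f"unique individuals (heavy overlap assumed): {unique_people_estimate:,}")
```

Even this single-hop estimate lands in the tens of millions, the same order of magnitude as the harvest the article describes; a second hop (contacts of contacts) only compounds it.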


United States

David Bach, Deputy Dean, Yale School of Management

Much of the world, with the U.S. a notable exception, had converged on earlier European data privacy rules, and countries such as Argentina, Brazil, and South Korea are already discussing legislation to bring their domestic rules in line with the GDPR. Their own business communities are often driving these efforts because (a) they need to comply with EU standards to process European data and (b) ensuring greater protection for European customers than for domestic customers is not only costly and inefficient but also generates reputational risk. In fact, for the same reason, many American companies are taking the leap and extending GDPR-compliant policies to all their customers. As my colleague Abe Newman and I explained in an earlier paper, it’s the combination of a highly attractive market of 500 million consumers and potent regulatory authority over that market, including the ability to deploy this authority extraterritorially, that gives the EU so much global regulatory clout: call it the “Brussels effect.”

