Indecent Proposal? When Identity, Privacy and Ethics Collide

How much would I have to pay you to do something against your moral code? The 1993 film “Indecent Proposal” explores this question: the price, literal and figurative, people are willing to pay, and the compromises they will make, to gain something they want. “I buy people every day,” John Gage (a billionaire played by Robert Redford) says when Diana (Demi Moore) tells him he can’t buy people. “In business, maybe, but not when real emotions are involved,” Diana returns.

Are “real emotions” (trust?) involved when it comes to privacy, ethics and data? Can organizations respect the privacy of their customers and instil trust whilst simultaneously using their data to drive business growth? 


In the movie, Gage ultimately gets the object of his affection, but the damage is done: trust, relationships and moral codes are broken. Businesses today face a similar form of mental and organizational discomfort, a moral dilemma of sorts, when trying to navigate the competing demands of privacy and transparency. And for consumers, the question is whether organizations can be trusted to respect and value the privacy of their data.

The sheer volume of data now available invites misuse if it is not treated with great care. As a business whose goal is to make money, how do you both mine and capitalize on the data you collect and ensure that you treat the identities of your customers and employees fairly and ethically? On the one hand, earning and cultivating your customers’ trust is critical to business success. On the other, creating value and succeeding in the digital environment requires building technology platforms that deliver customized, tailored services, ideally at industrial scale, to many millions of individuals and businesses. Understanding individual search habits, buying behavior and online preferences, which are often driven by emotion, is core to the value proposition of Google, Netflix, Amazon and Airbnb. And while data privacy and protection laws exist, they often can’t keep up with the velocity of new technologies, creating a potential ethical grey area.

Traditionally, privacy was something the consumer owned and could formally hand over to a business by signing a letter of consent or accepting the terms of a service agreement. In today’s all-digital landscape, privacy is far more transactional: you share a piece of your identity every time you access a private Facebook group or download a piece of content.

The ability to mine business intelligence and provide real-time analytics gives organizations a competitive edge in developing, adapting and providing services and processes. How, then, do organizations gain that edge whilst maintaining the protection and privacy of consumers and employees?

This question has become a real concern for organizations, which are having to grapple with the role of data and the individual. For instance, how much information should you collect on your employees in order to protect them from cyber criminals and phishing attacks? Because digital business is global, we cannot expect consistent answers to such important questions around privacy and data usage. The EU has put a strong stake in the ground around personal data and enshrined it in the GDPR. In 2018, California passed the California Consumer Privacy Act (CCPA). Many other governments are following suit, yet clear and fundamental differences remain in how these regulations are interpreted and in the assumptions that underlie them. Privacy laws don’t necessarily align with cultural expectations, so navigating the collision between privacy and ethics can become a full-time job.

Many companies are being forced to define a data ethics strategy, incorporate it into their organizational values and embed it in their culture. The challenge is how to enforce and implement that strategy. Recognising the identity attributes, needs and protections of all of your stakeholders allows you to assess how open access to and use of data can be balanced against the need to track and control that access. Identity and privacy really are individual things, and most consumers simply trust that you are acting ethically with their data. I think it’s fair to say that Facebook users never suspected Cambridge Analytica could gain access to their personal information. Taking a holistic view of your organization will help inform your ethics strategy.

What Optiv can do for organizations is highlight the difficult, often competing nature of transparency and privacy, and then address the data governance challenge at the individual, employee, consumer and corporate level. We believe the key to a successful ethics strategy is to place identity at the centre.

Whether or not your next privacy paradox involves a tuxedo-clad man in a casino, a coherent identity and data governance strategy will be your best route to solving the dilemma.  


Assess your privacy and governance needs and improve business operations. Download our service brief to learn more. 


Andrzej Kawalec
CTO and Head of Strategy, EMEA
Andrzej brings experience from some of the world’s largest companies. Most recently, as chief technology officer and head of strategy and innovation at Vodafone, he led the company’s enterprise vision of cybersecurity preparedness for more than 462 million users. He previously served as CTO and director of security research at Hewlett Packard.