The tech industry isn't a bunch of scrappy underdogs anymore. In the wake of numerous data lapses and incidents of personal data being monetized in ways users never expected, consumer trust is dropping—according to a recent survey from Selligent, 75 percent of consumers are worried about brands tracking their browsing behavior. And it's not just customers: unethical behavior is causing tech talent to balk at working at certain companies, and some employees are refusing to work on projects they deem morally dubious.
It's become more urgent than ever for organizations to establish a code of ethics, laying down strict guidelines to circumscribe potentially dubious actions. But drawing up a set of guidelines is only half the battle: you need to give an ethics policy teeth, or it'll be cast aside as soon as breaking the rules can give the company—or some division or employee—a momentary advantage. "This comes down to the top team owning and embracing the policy," says Alexander Lowry, who directs the ethics-focused Financial Analysis program at Gordon College. "They need to live it and embody it. That example is essential."
Importance of a code of ethics
Most of us probably think of ourselves as ethical people. But within organizations built to maximize profits, the drift toward more dubious behavior can seem inevitable, especially when it comes to users' personal data. "More companies than not are collecting data just for the sake of collecting data, without having any reason as to why or what to do with it," says Philip Jones, a GDPR regulatory compliance expert at Capgemini. "Although this is an expensive and unethical approach, most businesses don't think twice about it. I view this approach as one of the highest risks to companies today, because they have no clue where, how long, or how accurate much of their private data is on consumers."
This is the sort of organizational ethical drift that can arise in the absence of clear ethical guidelines—and it's the sort of drift that laws like the GDPR, the EU's stringent new framework for how companies must handle customer data, are meant to counter. And the temptation is certainly there to simply use such regulations as a de facto ethics policy. "The GDPR and laws like it make the process of creating a digital ethics policy much easier than it once was," says Ian McClarty, President and CEO of PhoenixNAP. "Anything and everything that an organization does with personal data obtained from an individual must come with the explicit consent of that data owner. It's very hard to subvert digital ethics when one's ability to use personal data is curtailed in such a draconian fashion."
But companies cannot simply outsource their ethics codes to regulators and think that hewing to the letter of the law will keep their reputations intact. "New possibilities emerge so fast," says Mads Hennelund, a consultant at Nextwork, "that companies will be forced by market competition to apply new technologies before any regulator has been able to grasp them and impose meaningful rules or standards." He also notes that, if different silos within a company are left to their own devices and subject to their own particular forms of regulation and technology adoption, "the organization as a whole becomes ethically fragmented, consisting of multiple ethically autonomous departments."
And creating an ethics policy has definite business benefits. "It's advantageous for brands to put consumer data privacy at the forefront of their data strategies," says Gladys Kong, CEO of UberMedia. "Brands have a responsibility to be clear on knowing what data they have access to and what permission levels they've been granted by consumers. As long as companies are respectful of that principle, there is no 'gray area' in a data strategy—it's clear to both company and user how data is gathered and handled."
Building a digital ethics policy
If you're going to build your own ethics code, how should you go about it? Nextwork's Hennelund offers a framework for doing so, based on five ethical themes.
- Security: "Do we have the necessary security in place to protect users?"
- Individual control: "Do we put the individual before business value and enable him or her to control the personal data that he or she owns?"
- Segmentation: "Do we segment the customer base in any way that might harm customers?"
- Behavioral change: "Does our digital interaction and monitoring change the customer's behavior in any way that may not be sound or reasonable?"
- Incentivization: "Are we incentivizing data sharing, which raises socioeconomic issues, or incentivizing the customer to lie about his or her data—to get a better insurance product, for instance?"
This is a fairly theoretical treatment of the issues, but it covers most of the ways you might interact with customers and their data. Hopefully it will help you start to figure out how to create a policy around your company's own data retention and other practices, and will allow you to proclaim your ethical standards to the world.
Now comes the hard part: following the rules.
How to enforce ethics in the workplace
Most companies aren't run by mustache-twirling villains who abuse customer data for the thrill of it. In practice, they take incrementally more unethical steps because of the siren song of profits or doing less work—and it's hard to combat those kinds of pressures with an ethics policy that's only a statement of principles without enforcement mechanisms.
One of the simplest methods of enforcing an ethics policy is to ensure it isn't profitable for individual employees to behave unethically. For instance, Gordon College's Lowry suggests withholding a bonus from an employee who tries to monetize data in ways that violate policy.
But of course, for that to work, there needs to be a way for those policy violations to come to light—something that's especially difficult when the ethical violations are being requested by people higher up in the company food chain. Katie Smith, the Chief Ethics and Compliance Officer at ethics software company Convercent, describes three major structures companies need to have in place in order to create what she calls a "speak-up culture":
- Start at the beginning. "New employees should start work with a clear understanding of how to report perceived violations of the company's code of ethics." Smith also notes that "it's impossible to report unethical behavior and perpetuate an ethical culture without knowledge of what is and is not off limits," which is why it's necessary to have a written ethics code that new hires are familiarized with right away.
- Keep the lines open. "Employees must feel safe and trust the process, and feel confident that organizational justice will occur," says Smith. "They need to know that the company wants to hear from them, and when they speak up, that their concern is heard, taken seriously, and action is taken. When it does come time to report unethical behavior, clearly delineated avenues for reporting must be in place to make this process as simple as possible. It is important to provide a variety of mechanisms to report a concern—phone, the web, anonymous SMS, chatbots, etc.— to meet the reporter where they are comfortable. If the process for reporting is complicated or unclear, those that are already hesitant to report or concerned about privacy and anonymity may refrain from doing so altogether." (Jonathan Briggs, Co-Founder and Academic Director at Hyper Island, who's run projects and courses on whistleblowing, specifically says that employees should have anonymous, digital access to boards and shareholders to report wrongdoing.)
- Keep whistleblowers safe. "Companies must have systems in place to protect employees against potential retaliation once they have reported a concern," says Smith. "Following up with reporters 30, 60, 120 days after an investigation closes, tracking a reporter's compensation and performance data to spot negative trends, and measuring anonymous reporting rates are ways to identify potential retaliation. Companies need to create a culture that prioritizes reporting and makes speaking up against unethical activity the clear expectation."
An impossible task?
In the end, though, no matter how many safeguards you build in, none of these policies will work unless the people at the top want them to. "For-profit companies are normally those that must attract private investors, who expect a return on their investment," says John Hooker, Professor of Business Ethics and Social Responsibility at the Tepper School of Business at Carnegie Mellon University. "Yet investors are human beings who are bound by ethical obligation like the rest of us. You can't 'convince' anyone to do the right thing just because it's right. It's like trying to convince someone that bachelors are unmarried. The right thing is, by definition, what you should do."
Dary Merckens, CTO of Gunner Technology, is somewhat cynical about convincing mega-corporations to do the right thing. "As much as Facebook and Google talk about ethics, it's mostly disingenuous because they know as publicly traded companies, they have a fiduciary responsibility to increase shareholder wealth, and you don't do that by reining in your products, making them less addictive, or by being more concerned about where and how your user data is being used," he says. "You do whatever it takes to make as much money as you can and never look back."
His own company is rolling out a digital media sharing app for private groups with a strict ethics policy baked into it. "More institutions need to put their collective feet down and say we won't play the game, we won't make our apps hyper-addictive, we won't sell our user data to the highest bidder," he says. "And then as users learn more about what these technologies are doing to them from companies that do play that game, they'll start looking for different solutions. So our hope is that companies whose foundational principle is to never compromise on producing ethical products, cost/profit/business considerations be damned, will be the ones left at the end of the day, successful enough and providing a lot of good for the world."