
Unethical Engineering: Understanding the Discrepancy Between Intent and Behavior in Tech 

Clarifai, an AI startup that translates images and video into structured data, was committed to philanthropic practices such as donating software to socially beneficial causes. This was the Clarifai that Liz O’Sullivan was familiar with when she joined the team in 2017. 12 Several months later, O’Sullivan found herself in a workplace with a vastly different character. The office, its windows covered with paper, was dubbed “The Chamber of Secrets” due to the clandestine nature of its meetings. Outsiders, and even the team itself, were unsure of what they were building, but the CEO assured them that they were saving lives. Shockingly, upon further inspection of project documentation, O’Sullivan and the rest of her team deduced that the company was working for the Department of Defense. When confronted with their discovery, the Clarifai CEO merely confirmed that their software would likely be used for autonomous weaponry. Liz O'Sullivan quit the next day. 9

Experiences similar to O’Sullivan’s are ubiquitous within tech companies, most notably within giants such as Google, Meta, and Amazon. During my time working in big tech, I encountered several engineers who quietly parted ways with their company due to ethical concerns about what they were building. Now, as graduation creeps closer, I am inclined to ask myself what I would do if I found myself in a situation like this. My immediate response seems obvious: of course I would speak out and leave the company if necessary. That is the right thing to do; therefore, it is what I would do. Despite my gut reaction, I am reminded that Liz O’Sullivan was the only person to leave Clarifai, even though each person on the team was privy to the same information. 9 Those who stayed are not alone: a survey of 1,000 workers conducted by Indeed found that 38 percent of participants had worked for a company involved in a public scandal, and of that group, 15 percent continued to work at the company anyway. 1 Fundamentally, I know that people who choose to continue working on unethical projects are not evil. They may be good people who, like most, want to contribute meaningfully to the world. For this reason, I ask: why do people engage in unethical behavior when their intent is to do what is right?

The answer to this question is rooted in the three-step process of ethical decision-making illustrated below in Figure 1. The process begins with awareness, a person’s ability to recognize that a situation calls for ethical consideration. This is followed by judgment, the ability to decide which potential outcomes are right or wrong. Last comes the action actually taken. These actions are based on a mix of what a person believes is right and the inevitable, unconscious biases that affect their decision-making process (Gino, 2015). 7 Recognizing these unconscious biases is crucial to understanding why people engage in unethical behavior that seemingly conflicts with their moral predispositions. Ultimately, these biases can be classified into five situational and social forces.

Figure 1: The Ethical Decision-Making Process (Gino, 2015). 7

The first of these situational forces is the appeal of a short-term reward. In 2015, a Harvard study found that people hold two competing motivations: a long-term desire to be good and a short-term desire to reap an immediate gain such as recognition, a promotion, or financial compensation. 7 Engaging in and justifying unethical behavior in exchange for such a reward is known as “justified neglect.” Unfortunately, justified neglect is pervasive in the tech community, as high compensation motivates employees to focus primarily on what they might gain from the job at hand. 3 A cursory analysis of tech compensation explains why it is so enticing. “Dice’s Tech Salary Report” found that the average tech salary in 2021 was $104,566. 5 For engineers, the average is slightly higher, at around $130,000, as shown in Figure 2 (Dice, 2022). However, salaries are regularly far greater than this. Owing to the demand for engineers, even companies facing public backlash have been offering higher pay to attract talent. One such company is Amazon, which recently announced that its maximum base pay would increase to $350,000, an almost 120 percent increase from the previous cap of $160,000. 13 In addition to base pay, employees typically receive restricted stock units, which can easily double their total compensation. A quick search on Glassdoor or Levels.fyi, websites that allow employees to anonymously share their compensation, confirms that similar earnings exist across most big tech companies and highly valued startups. Thus, as people try to balance their motivation to do good with their motivation to be paid well, they are often inconsistent in their moral behavior.

At the same time, rather than seeking reward, employees seek to avoid loss. One study found that self-serving cognitions are heightened in organizations that expect higher levels of performance, because “high performance elicits performance pressure to avoid significant consequences” such as loss of employment. 11 Ultimately, it is the urge to avoid these potential losses that motivates unethical behavior.

Armed with this knowledge, companies have capitalized on these motivational forces to discourage employees from resisting directives. Take, for example, Meta’s performance review process, in which employees are graded and ranked against each other by their managers. At the end of each biannual review, the bottom 15 percent of performers are dismissed from the company. 4 To maintain a high ranking, employees are dissuaded from giving managers critical feedback or from challenging decisions, even ones they know to be harmful. In fact, Facebook data scientist Sophie Zhang found that simply bringing software misuse to light can come at the cost of unemployment. In 2017, Zhang became aware of authoritarian governments that were using fake Facebook pages to influence politics around the world. When she took her findings to management, Zhang said her managers told her to “stop finding fake political accounts and focus on her main job responsibilities.” Soon afterward, Zhang was fired for what Facebook claimed to be poor performance. 6 Zhang’s story illustrates how crucial the economic privilege to speak out, without fear of being fired or blacklisted from the tech world, is to employee activism. To an employee supporting a single-income family, or to a new graduate on an H-1B visa with limited job opportunities, self-protection may be more valuable than acting ethically and facing retaliation.

Figure 2: Average Tech Salary by Occupation, from "Dice’s Tech Salary Report" (2022)

Occupation | 2021 Average Salary | Change from 2020
C-Suite Management | $151,983 | +6.0%
Systems Architect | $147,901 | +5.1%
Cyber Security Engineer | $135,059 | +0.5%
Business Consultant | $126,531 | +4.0%
Program Manager | $120,755 | +0.8%
Software Developer | $120,204 | +8.0%
Data Engineer | $117,295 | -1.1%
UI/UX Designer | $101,260 | +10.6%
Web Developer | $98,912 | +21.3%
Network Engineer | $93,373 | +2.0%
Systems Administrator | $88,642 | +6.2%
Technical Support Engineer | $77,169 | +12.4%


However, avoidance of loss is not the sole reason employees rarely challenge the status quo; the human tendency to obey authority also plays a part. In Stanley Milgram’s notorious shock experiments, a researcher instructed ordinary people to press a button that delivered a “shock” to another, unseen person. Though no shocks were actually being delivered, the participants did not know this and received feedback that they were inflicting pain on the recipient. Despite this, most did not stop pressing the button when prompted by the researcher to continue. From this study, Milgram concluded that people, even knowing that they were causing harm, still felt a compelling need to obey an authority figure. 10 Such obedience starts young, as Pepperdine research aptly states:

Children are primed to obey their parents; their survival depends upon it, and in school, this conditioning continues. Students automatically know that they must show deference to their teachers. Consequently, later in life, when the boss orders an employee to do something, many people quickly obey without thinking. 8

Thus, people are conditioned to comply with an order regardless of its ethical nature. This is concerning not only because it discourages people from challenging authority, but also because it contributes to leaders feeling omnipotent. Harvard research indicates that as leaders become more senior and hear less and less dissent, they come to feel invincible. To the omnipotent leader, ethical boundaries apply to everyone except themselves. 15 A cycle thus forms in which employees acquiesce to unethical directives, which in turn encourages omnipotence in leadership, and so on.

In addition to situational pressures, there are social forces, such as witnessing the unethical behavior of others, that can be antecedents to unethical behavior. In 2009, a study tasked a room full of students with completing a puzzle for a monetary reward; the more of the puzzle a student completed, the larger the reward they could receive. The experiment purposefully allotted insufficient time to complete the entire puzzle, making it impossible to earn the maximum reward. In the first condition, students were simply asked to complete the puzzle, turn in their work, and collect their reward. In the second condition, a member of the research team posed as a student and, a minute after the solving time began, announced that he had finished the entire puzzle. He then submitted his work and collected the full reward in front of the other students. Ultimately, the study found that students who witnessed someone cheat were more likely to cheat themselves, replicating the behavior they observed. 7 In the workplace, the dishonesty of others can be similarly influential and become the social norm. This phenomenon, known as “cultural numbness,” shifts individual moral bearings toward the culture of an organization slowly, rather than in one abrupt switch. 15 Thus, dishonest behavior coupled with a gradual homogenization of ethical standards begets an environment that overlooks unethical decision-making.

Just as seeing other people act unethically is damaging, so is merely believing that they do. A University of New Mexico study asked a group of students to take a problem-solving test. Regardless of their actual answers, half of the participants received a passing score and half received a failing score. When prompted to predict the results of other students, those who were told they succeeded were more likely to assume that others had also succeeded, whereas those who were told they failed were more likely to assume that others had also failed. 14 These results highlight how susceptible people are to the “false consensus effect,” a form of bias that causes people to see their own actions and judgments as “relatively common and appropriate to existing circumstances,” 2 meaning that people assume others are likely to do what they would do. In the workplace, this sentiment allows employees to justify their wrongdoing by believing that everyone else would have done the same. And unfortunately, the more people are able to justify their behavior, the more likely they are to behave unethically.

Indeed, the media stories exposing unethical practices in tech companies have highlighted the gap between how people believe they ought to behave and how they actually behave. Ultimately, this alarming discrepancy can be attributed to a range of situational and social forces so strong that they seem to make individual choice almost irrelevant as they corrode our ethical decision-making process. Over time, the enticement of reward, the fear of loss, obedience to authority, numbness to culture, and the deceit of the false consensus effect warp our ethical standards and moral principles, enabling us to act in ways that conflict heavily with what we believe to be right.


References 

  1.
  2.
  3.
  4.
  5.
  6. 103(1), 54-73.
  7.
  8. Sherman, S., Chassin, L., Presson, C., & Agostinelli, G. (1984). The role of the evaluation and similarity principles in the false consensus effect. Journal of Personality and Social Psychology, 47(6), 1244-1262. https://doi.org/10.1037/0022-3514.47.6.1244