Why Do People Bend the Rules?

Well-intentioned individuals often break the rules. Breaking the rules provides a cheater's high and a sense of freedom from constraint (Pincott, 2014). For a glorious moment, a person feels on top of the world. This moment of joy is one of the immediate rewards of rule-breaking. Left unchecked, a well-intentioned person can unconsciously lower their ethical standards so frequently that they lose their moral ground. Understanding the psychological undertones of unethical decision-making is important for countering its warning signs preemptively. The higher a leader rises within a company, the more imperative it is that they make ethical decisions. This paper discusses how morals and values have evolved throughout history. It also reviews the psychological theories that explain why people make unethical decisions and the implications of those behaviors in the workplace.

Morals and Values Throughout History

Morals and values are two similar constructs that play a large role in ethical decision-making. Values are a person's judgments of right and wrong; they form the foundation of what influences a person's behavior. Morals are the system of beliefs that grows out of an individual's values; they are the standards for which behaviors are acceptable. Ethics are the rules that create a structured system governing appropriate conduct in society.

May (2006) stated that in the 1950s, corporations sought to create domestic oligopolies for wealth and power. In the 1960s, conglomerates expanded without considering consumers' needs. This led to several hostile takeovers in the 1970s and insider trading scandals in the 1980s. By the 1990s, corporations were being scrutinized for exorbitant executive benefits while downsizing in a global market. Nowadays, corporations dedicate resources to ethics by developing programs and hiring people responsible for monitoring and managing ethics throughout the corporation (Hartman et al., 2021). Ethics is becoming ingrained in daily workplace activities.

Throughout United States history, the focus of morality has varied, according to Moral Foundations Theory. Moral Foundations Theory (MFT) proposes a method to consistently categorize the unique moralities encountered worldwide (Haidt & Joseph, 2004). The theory consists of five moral foundations: harm, fairness, ingroup, authority, and purity.

The underlying virtues of the first foundation, harm, are care and compassion; this foundation views injustice as pain and suffering (Haidt & Joseph, 2004). Fairness-based morality values impartiality and tolerance over bias and prejudice. Ingroup, the third foundation, holds patriotism and community in high regard and feels threatened by betrayal. Duty and social order drive the fourth foundation, authority, which detests those who show defiance. Lastly, there is purity. The religious notion of sanctity shapes this foundation, and violators are considered contaminated and tarnished.

Wheeler et al. (2019) explored trends in morality across the twentieth century by using Google Books to analyze the moral language used throughout history. This research was limited because it reviewed only English-language books within the specified period. They deduced that fairness- and authority-based morality was driving industries in the 1950s. Harm-based morality, however, steadily increased from 1980 to at least 2007, and as it increased, the purity foundation decreased. Ingroup-based morality rose steadily from the 1910s until 2007.

The researchers posited that the rise of culture wars influenced the influx of harm-based moral language, noting an upsurge of social justice concerns since the 1980s. Additionally, the political climate of conservatism affected the trajectory of purity-based morality.

Good People Gone Bad

What ethical lessons can be learned from past organizations? Organizations do not make decisions; people do. Sometimes those decisions are unethical. Employees are often presented with ethical quandaries that could cross a moral line. Three psychological undertones lead people to bend the rules: omnipotence, cultural numbness, and justified neglect (Wedell-Wedellsborg, 2019).

Omnipotence is an aggrandized sense of entitlement. This dynamic applies to those who feel the rules do not apply to them and is often identified in senior leaders. Prentice (2007) noted that over-optimism and overconfidence are two types of bias that lead to omnipotence. Over-optimistic people consistently believe that bad things will not happen to them. Generally, optimism is not a concern, but an overabundance of it can lead to systematic errors in judgment that induce unethical behavior (Prentice, 2007).

Prentice (2007) suggested that overconfidence compounds over-optimism. Irrational confidence in the accuracy of one's decisions exacerbates errors in ethical judgment (Prentice, 2007). This overconfidence can lead people to double down on decisions that have significant ethical implications; they become so engrossed in a decision that they fail to see the harm their unethical approach can cause.

Each time an individual takes a risk and it works out, they become more inclined to take another. This is known as the cheater's high. The individual enters a confidence cycle of risk, win, rise, and repeat. The cycle then becomes the norm, lowering inhibitions and increasing irrational, reckless decision-making. As these individuals focus more on winning, they lose sight of morals and ethics and begin to negotiate them incrementally.

An omnipotent professional can be an asset to an organization; they tend to take risks on innovation and take bold actions to facilitate change. However, the higher one climbs in an organization, the more impactful one's decisions are. Without people around to keep them grounded and point out their flaws, such leaders will continue to spiral down a path of unethical decision-making. To combat this behavior, professionals should curate a group of trusted colleagues who will call out flawed decision-making.

The second dynamic pointed out by Wedell-Wedellsborg (2019) is cultural numbness. While a person works at an organization, that organization's culture influences them, and over time it can sway their moral compass. This is a common way a decent, ethical person can come to behave in unethical ways. Like the formation of company culture itself, it is a gradual process: an individual may start to make concessions on what is ethical to fit in with the culture. Prentice (2007) identified several ways an individual can become culturally numb.

Generally speaking, cultural numbness stems from conformity bias. Irrespective of one's beliefs and morals, conformity bias inclines individuals to behave like the culture around them, and this acculturation can alter daily behavior in that environment. Even when individuals are certain a behavior is unethical, they take cues from those around them about appropriate actions. This can lead to an insular organization and stifle creativity and originality.

Another form of cultural numbness described by Prentice (2007) is obedience to authority, in which an individual does not question an authority figure's decision simply because of their position. Obedience to authority is natural; however, blindly following a leader without question can lead to unethical situations. The willingness to please authority overrides the desire to act ethically, and in the workplace, displeasing authority often carries negative consequences.

The most famous example of this scenario is an experiment conducted by psychologist Stanley Milgram. The experiment's premise was for participants to administer increasing levels of shock to another person at the experimenter's command. Sixty-five percent of the participants continued to inflict pain even after reaching a point of consternation (Marcus, 1974). Although wildly unethical, the experiment demonstrated the lengths an individual will go to in order to please a superior.

Incremental acculturation and pressure from superiors can suppress innovation and increase groupthink (Prentice, 2007). Groupthink occurs when conformism eliminates individualism. Well-intentioned individuals converge on a solution that may be fueled by irrational or premature decisions, and group members agree to those decisions either to conform or to complete the task. The decision can be swayed by peer pressure or by the most vocal members or risk-takers, all of which can be detrimental to ethical decision-making (Prentice, 2007).

As previously mentioned, cultural numbness is an incremental change. Incrementalism is change in degrees so minuscule that unethical behavior goes unnoticed, desensitizing employees to unintentional unethical behavior. The longer one stays in an organization, the more oblivious one becomes to these cultural changes. This blindness is even more significant for decision-makers (Hartman et al., 2021).

Being an engaged and active team member is essential to successful teams. However, regularly assessing one's moral compass is crucial to countering cultural numbness. Wedell-Wedellsborg (2019) suggested identifying moments where you compromise your morals for the collective. People should ask themselves when they last countered a decision or expressed their thoughts to influence peers (Hartman et al., 2021). Again, having an honest group of friends or colleagues can help in evaluating moral decisions.

Lastly, when people know a situation is immoral or unethical but choose not to intervene because of a reward, they exhibit justified neglect. Organizational aggravators, such as reward systems, facilitate unethical behavior (Moore & Gino, 2013). Reward systems attach immediate incentives that can outweigh ethical considerations and leave a lasting effect. Out of fear of negative repercussions, employees will rationalize their actions to justify unethical choices, and institutional acceptance often vindicates these infractions. Additionally, the larger the reward, the stronger the individual's propensity to act unethically.

Prentice (2007) referred to this process as self-serving bias, in which information is gathered and processed in a self-serving way. Although presented with the facts, individuals' judgment is clouded by self-interest. This can cause them to look only for evidence confirming their ideas, known as confirmation bias (Nickerson, 1998). It can also cause them to ignore evidence that discredits their beliefs, known as belief persistence (Prentice, 2007).

Employees must commit to upholding an ethical work environment to combat self-serving bias. Otherwise, self-serving bias can lead a well-intentioned person to bend the rules for immediate gain, triggering an avalanche of unethical exceptions.


Even a snowflake can become an avalanche. Each avalanche starts with a justifiable infraction: a small exception to the rules. Each infraction lowers the bar on what is consciously perceived as ethical or unethical (Prentice, 2007). These transgressions are so minute and frequent that they go unnoticed, yet these seemingly insignificant lapses in judgment can taint the corporate culture.

Individuals fear retaliation for speaking out against unethical behavior; therefore, cultural and structural checks and balances aid in overcoming cultural bias (Wedell-Wedellsborg, 2019). Most companies have ethical guidelines that protect whistleblowers.

Prentice (2007) suggested the following solutions: debiasing, always keeping ethics in mind, monitoring rationalizations, and taking a stand. Debiasing is the process of course-correcting for cognitive bias (Prentice, 2007). When employees keep ethics in mind, they use ethics as a frame of reference for each action, which can help minimize rationalizations that contradict ethics. Lastly, when individuals discover that an action or decision compromises their ethics, they should take a stand and voice their opinion, regardless of potential judgment or negative repercussions (Prentice, 2007).


References

Haidt, J., & Joseph, C. (2004). Intuitive ethics: How innately prepared intuitions generate culturally variable virtues. Daedalus, 133(4), 55–66. https://doi.org/10.1162/0011526042365555

Hartman, L., DesJardins, J., & MacDonald, C. (2021). Business ethics: Decision making for personal integrity & social responsibility (5th ed.). McGraw-Hill Education.

Marcus, S. (1974, January 13). Obedience to authority: An experimental view. The New York Times. Retrieved February 28, 2022, from https://www.nytimes.com/1974/01/13/archives/obedience-to-authority-an-experimental-view.html

May, S. K. (2006). Case studies in organizational communication: Ethical perspectives and practices. Sage Publications, Inc.

Moore, C., & Gino, F. (2013). Ethically adrift: How others pull our moral compass from true north, and how we can fix it. Research in Organizational Behavior, 33, 53–77. https://doi.org/10.1016/j.riob.2013.08.001

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175

Prentice, R. A. (2007). Ethical decision making: More needed than good intentions. Financial Analysts Journal, 63(6), 17–30. https://doi.org/10.2469/faj.v63.n6.4923

Wedell-Wedellsborg, M. (2019, April 12). The psychology behind unethical behavior. Harvard Business Review. Retrieved February 28, 2022, from https://hbr.org/2019/04/the-psychology-behind-unethical-behavior

Wheeler, M. A., McGrath, M. J., & Haslam, N. (2019). Twentieth century morality: The rise and fall of moral concepts from 1900 to 2007. PLOS ONE, 14(2), 1–12. https://doi.org/10.1371/journal.pone.0212267