Action bias is the psychological phenomenon whereby people tend to favor action over inaction, even when there is no indication that acting will lead to a better result. It is an automatic response, similar to a reflex or an impulse, and is not based on rational thinking. One of the first appearances of the term "action bias" in scientific journals was in a 2000 paper by Patt and Zeckhauser titled "Action Bias and Environmental Decisions", which expounded its relevance in politics.[1]
People tend to have a preference for well-justified actions. The term "action bias" refers to the subset of voluntary actions that one takes even when there is no explicitly good reason to do so.[2] When a decision has both positive and negative outcomes, action is taken in pursuit of an apparently advantageous final result, which is preferred over inactivity; any accompanying losses or adverse redistribution of resources tend to be neglected in the decision-making process.[2] Its opposite is the omission bias.[3]
Multiple theories as to why people prefer action over inaction have been suggested. Humans may naturally aspire to act because action is perceived as the most beneficial option, even though it can occasionally worsen the outcome.[2] Inaction may be perceived as an inferior alternative to action. This view can be explained from an evolutionary perspective: early action proved adaptive for survival and became a reinforced behavioral pattern. Even though living circumstances have changed beyond the need to prefer action over inaction to ensure survival, the bias persists in modern society because actions produce visible positive outcomes more often than omissions do, a link that is continually reinforced.[1]
There is a general tendency to reward action and punish inaction.[1] As shown by operant conditioning, rewards are more effective at increasing a behavior than punishments are at decreasing it, which leads humans to choose action over inaction. Engaging in action can also serve as a means of signalling and emphasizing one's productivity to others, and society rewards this with praise more readily than positive results originating from inactivity. Action also gives the doer the impression of having control over a situation, which creates a feeling of personal security.[1] This is in contrast to inaction, which is more readily linked with feelings of regret in the face of a lack of praise and even possible punishment for it.[4] The outcome associated with each action or inaction also affects future decisions, since the link is inevitably and immediately reinforced or punished each time a behavior is carried out; only a neutral outcome does not contribute to learning.[1][2]
Another reason for the existence of the bias might be that people develop the decision heuristic of taking action but then transfer it to an inappropriate context, resulting in action bias.[2]
In politics, action bias manifests when politicians do not take substantive action on issues such as global warming yet wish to appear to be doing so: they make statements, which are not actions in themselves, and offer relatively ineffective proposals and implementations. Actions and promises of future action are taken not primarily to bring about impactful change, but to show that one is working on the issue and making progress.[1][2] The symbolic power and external image of the action are far more important than its true benefit for change.[1][2]
In the field of medicine, action bias can occur in diagnosis and subsequent treatment, a problem caused in part by specific diagnostic criteria. If a patient does not meet enough criteria, or happens to meet just enough criteria, a premature diagnosis or misdiagnosis may result.[3] This can leave the patient without satisfactory or needed treatment. One way to counteract the action bias is to use a broader range of tests or to obtain a second opinion from colleagues and technical experts in relevant fields before making a final diagnosis.[3] In medical decision-making, professionals are predisposed to intervene even when not intervening would be the better option. In this context, the action bias is known as intervention bias, and its existence has been demonstrated by many studies in the medical community.[5]
Action bias occurs among patients as well. When a physician presents the options of taking medicine or simply resting as equally appropriate, most patients strongly prefer taking the medicine. This preference prevails even when patients are warned that the medicine could cause certain side effects, or when they are explicitly told that taking it would have no effect.[6]
The causes of intervention bias in medicine are most likely an interplay of two other biases studied in humans: self-interest bias and confirmation bias.[5] Another contributing factor is the fear of malpractice cases, since charges may be pressed against a physician who fails to act.[5]
The self-interest bias occurs when a person engages in self-serving behaviors and justifies them in terms of their own interests. Medical intervention is partly guided by the financial self-interest of practitioners and the health-care industry. Industry-sponsored studies and analyses can create conflicts of interest and biased interpretations of results; specialists then make questionable decisions and defend already biased information.[5] Doctors also appear to be more satisfied when they are more involved in their patients' treatment, which links the amount of intervention to career happiness and personal gratification.[5] Confirmation bias influences decision-making because sources that confirm one's pre-existing hypotheses are incorporated more readily than those that challenge them. Studies and assessments that justify and promote medical intervention are given more emphasis, while data that contradict the practitioner's assumptions are either ignored or weighed as less reliable than the practitioner's own experience and evaluation.[5]
Due to the action bias, medical intervention becomes less objective: the physician's primary focus may no longer be the best possible therapy for the patient, and therapies may be implemented without proper, tailored testing.[5][7] Other consequences include incorrect and biased medical advice, physical harm to the patient, and the collapse of health-care systems.[5] Although physicians also have the option to wait and see whether the symptoms subside or intensify and then perform a follow-up check, a form of temporary inaction, it is common instead to proceed directly to testing and the prescription of medication.[3]
According to some psychologists, the goalkeeper shows an action bias in over 90 percent of the penalty kicks in soccer by diving to either the left or the right. These theorists assert that it is more effective to stand still, or to wait and see which direction the ball is kicked before moving, because guessing wrong will almost guarantee giving up a goal.[8] Researchers surmise that goalkeepers take the risk of guessing because "action" is preferred by their teammates, and success will bring social recognition and other rewards.[8]
However, this analysis ignores game theory and the dynamics of the sport. Because the penalty spot is only 12 yards from the goal, the goalposts are 24 feet apart, the crossbar is 8 feet high, and the ball is struck with great force, the goalkeeper cannot stand still and wait for the ball to be struck: there is not enough time to reach it, and the ball could go to any of the four corners. To have a chance of making a save, the optimum strategy is to guess the target location and begin moving before the opponent's foot touches the ball. This context negates the claim of bias, because the goalkeeper is expected to make a visible effort to save the shot and actively prevent a goal rather than arrive too late by waiting for directional certainty. Some penalty takers counter this strategy by rolling or chipping the ball down the middle; the softly chipped version is called a Panenka penalty, after Antonín Panenka, the Czech player who made it famous in the UEFA Euro 1976 final.
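The timing argument above can be made concrete with a rough back-of-the-envelope calculation. The figures used here (shot speed, reaction time, dive duration) are illustrative assumptions rather than values from the cited studies; this is a minimal sketch, not a measurement:

```python
# Rough sketch of the penalty-kick timing argument.
# All numeric values are illustrative assumptions, not measured data.

distance_m = 11.0        # penalty spot to goal line (12 yards is about 11 m)
shot_speed_mps = 25.0    # assumed speed of a firmly struck penalty (~90 km/h)
reaction_time_s = 0.2    # assumed time to perceive the ball's direction
dive_time_s = 0.5        # assumed time to dive and cover a corner

ball_travel_time = distance_m / shot_speed_mps            # about 0.44 s
keeper_response_time = reaction_time_s + dive_time_s      # about 0.70 s

print(f"Ball reaches the goal line in roughly {ball_travel_time:.2f} s")
print(f"Keeper needs roughly {keeper_response_time:.2f} s after the kick to reach a corner")
print("Waiting for the kick leaves the keeper too late:",
      keeper_response_time > ball_travel_time)
```

Under these assumed numbers, a goalkeeper who waits until the ball is struck cannot reach a well-placed corner in time, which is why committing early, despite the risk of guessing wrong, is the only realistic way to cover the corners.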
Action bias is also influenced by previous outcomes. If a team loses a match, the coach is more likely to choose action by changing some of the players, than inaction, even though this might not necessarily lead to a better performance.[4] As expressed by one coach, “Just because I can do something doesn’t mean that I should, or that that activity is relevant.”[9]
Action bias also influences decision-making in economics and management. In situations of economic downturn, central banks and governments face pressure to take action, as they come under increased public scrutiny. Because they are expected to fix the situation, action is seen as more appropriate than inaction; even if the outcome is not successful, taking action lets public figures avoid criticism more easily.[8] In periods of good economic performance, the authorities lean instead towards an omission bias, as they do not wish to be accused of making wrong choices that might destroy the current equilibrium.[8] The action/omission bias can be seen in other, similar scenarios, such as investors changing their portfolios, companies switching strategy, people applying for a different job, or moving to a different city. At the macroeconomic level, the action/omission bias comes into play in decisions about policy variables such as interest rates, tax rates, and various types of expenditure.[8]
The effect of action bias on environmental policy decisions has been investigated by Anthony Patt and Richard Zeckhauser. They argued that action bias is more likely to lead to nonrational decision-making in this domain because of the uncertainty and delayed effects of actions, contributions from many parties, the absence of effective markets, unclear objectives, and few strong incentives.[2] The study concluded that the value of a decision is influenced by one's perceived involvement, individual susceptibility to action bias, and framing and context, leading to the occurrence of action bias in environmental policies.[2]
The utility-based action bias is a type of action bias that underlies purposive behavior. It works by comparing the advantages of the possible effects of different actions and selecting the action expected to lead to the outcome with the highest utility value: the values of the different options are predicted and compared, and the action with the greatest chance of reward is chosen.[10] The advantage of this approach is that it finds the most beneficial option available in the environment; the main disadvantage is that the subject must test the environment through trial and error in order to identify the utility value of each action.[10] When unfamiliar with a new environment, a person will often choose an action that proved advantageous in a previous situation. If this is not suitable in the current scenario, the utility value of the action decreases, and the person will opt for a different action, even though changing strategy might not be entirely beneficial.[10] The utility-based action bias is the opposite of goal-based action selection, which aims at completing a goal without taking into consideration the utility value of the actions performed. Unlike the utility-based approach, not all possible actions are compared: once an action that leads to the goal is found, the other options are disregarded, making it a less time-consuming strategy.[10]
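The contrast between the two selection strategies can be illustrated with a small sketch. The environment, action names, and utility values below are hypothetical, chosen only to show the difference: utility-based selection estimates every action's utility by trial and error and picks the maximum, while goal-based selection stops at the first action that reaches the goal.

```python
import random

# Hypothetical environment: each action's true (hidden) utility.
TRUE_UTILITY = {"act_a": 0.3, "act_b": 0.8, "act_c": 0.5}
GOAL_ACTIONS = {"act_a", "act_b"}  # actions that reach the goal at all

def utility_based_selection(trials: int = 200) -> str:
    """Estimate each action's utility by trial and error, then pick the best."""
    estimates = {a: 0.0 for a in TRUE_UTILITY}
    counts = {a: 0 for a in TRUE_UTILITY}
    for _ in range(trials):
        action = random.choice(list(TRUE_UTILITY))            # explore the environment
        reward = TRUE_UTILITY[action] + random.gauss(0, 0.1)  # noisy observed outcome
        counts[action] += 1
        # running average keeps an up-to-date utility estimate per action
        estimates[action] += (reward - estimates[action]) / counts[action]
    return max(estimates, key=estimates.get)                  # highest estimated utility

def goal_based_selection() -> str:
    """Return the first action that satisfies the goal; utility is never compared."""
    for action in TRUE_UTILITY:
        if action in GOAL_ACTIONS:
            return action   # remaining options are disregarded
    raise ValueError("no goal-reaching action available")

print("Utility-based choice:", utility_based_selection())  # typically 'act_b'
print("Goal-based choice:   ", goal_based_selection())     # 'act_a', the first that works
```

The utility-based routine needs many exploratory trials before its estimates become reliable, mirroring the trial-and-error cost described above, whereas the goal-based routine returns as soon as any goal-reaching action is found.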
The term single-action bias was coined by Elke U. Weber when she took notice of farmers’ reactions to climate change. Decision-makers tend to take one action to lower a risk that they are concerned about, but they are much less likely to take additional steps that would provide risk reduction. The single action taken is not always the most effective one. Although the reason for this phenomenon is not yet fully confirmed, presumably the first action suffices in reducing the feeling of worry, which is why further action is often not taken.[11] For example, Weber found that farmers in the early 1990s who started to worry about the consequences of global warming either changed something in their production practice, their pricing, or lobbied for government interventions. What they generally did not do is engage in more than one of those actions. This again shows that undertaking a single action possibly fulfills one's need to do something; this could prevent further action.[11] In the end, the single-action bias improves a person's self-image and eliminates cognitive dissonance by giving the false impression that they have been contributing to the greater good.[12]
Another example of single-action bias involves homeowners who live in coastal regions likely to be flooded due to sea level rise (SLR). They can take small actions, such as stockpiling resources or making sandbags in case of flooding, or bigger actions, such as taking out flood insurance, elevating their homes, or moving to a region less at risk of flooding. The first, smaller action they take (making sandbags) relieves their anxiety about possible flooding and thereby makes it less likely that they will take actions with a better long-run outcome, such as moving to another region.[13] One option for countering the single-action bias is to hold group discussions in which people suggest different ideas for solving the problem, giving the individual more alternatives.[14]
Awareness of the action bias can help people think carefully about the consequences of action versus inaction in a given situation. This makes the decision process less impulsive and brings in logical thinking, which facilitates choosing the most efficient outcome. In some situations, inaction can enhance patience and self-control.[1] New contexts that encourage thoughtful decisions or reviewing the full range of possibilities can also be beneficial.[1][15]
In medical contexts, full disclosure about the effects of action, especially the negative side effects of medication, and of inaction during treatment can reduce the effect of the action bias. The percentage of people choosing medication drops even further (to 10%) when a doctor actively discourages the use of medication.[6]