Over the past few years, certain psychological phenomena have become more pronounced and more problematic across a range of industries and events around the world. These include the Dunning-Kruger effect, the Bandwagon effect, and the Echo Chamber effect. While each has been studied and analyzed independently, there are many situations in which they interact and reinforce one another, producing what has been termed the "Foolish Feedback Cascade". This cycle can be observed in many situations worldwide, especially with the explosive growth of social media.
The Foolish Feedback Cascade
When those with little understanding share misinformation, it creates a domino effect where more people join in promoting a bad idea until it becomes the norm, drowning out any good ideas.
The "Foolish Feedback Cycle" is a phenomenon that results from the devastating combination of the Dunning-Kruger effect, the Bandwagon effect, and the Echo Chamber effect.
The Dunning-Kruger Effect
The Dunning-Kruger effect is a cognitive bias where individuals with low ability in a particular task overestimate their competence, while those with high ability tend to underestimate their competence. In other words, people who lack knowledge or expertise on a topic may think they know more than they actually do.
The Dunning-Kruger effect can have negative consequences, as people who overestimate their competence may be less likely to seek out new information, less likely to learn from their mistakes, and more likely to spread misinformation. Conversely, people who underestimate their competence may be less confident in their abilities, even if they are highly skilled or knowledgeable.
The Echo Chamber Effect
The echo chamber effect is a phenomenon in which individuals are exposed only to information and opinions that reinforce their beliefs and values. This can happen when people seek out news and social media content that aligns with their worldview or when algorithms tailor content to users based on their previous searches and activity.
The echo chamber effect can have negative consequences, as it can lead to confirmation bias, where individuals are more likely to accept and remember information that supports their beliefs while ignoring or dismissing information that contradicts them. This can limit people's exposure to diverse perspectives and prevent them from considering alternative viewpoints.
The echo chamber effect has important implications for areas such as politics, social media, and the spread of misinformation. It can contribute to creating polarized and ideologically homogeneous groups, where people are less likely to be exposed to alternative viewpoints or information. This reinforces existing beliefs and values and discourages critical thinking or engagement with alternative perspectives.
The Bandwagon Effect
The bandwagon effect is a cognitive bias where individuals are more likely to adopt a particular opinion or behaviour if they believe it is popular or widely accepted. People tend to follow the crowd, especially regarding social or cultural trends.
The bandwagon effect can have negative consequences, as it can spread misinformation or harmful ideas, even if they are not supported by evidence or logic. The more people who believe a particular idea or opinion, the more likely others are to adopt it, regardless of its validity or usefulness. This can create a self-reinforcing cycle, where the popularity of an idea or opinion rests on its perceived popularity rather than its merit.
A combination of the three effects forms the Foolish Feedback Cascade
The "Foolish Feedback Cascade" is a self-reinforcing cycle supporting bad ideas and misinformation. The cycle starts when individuals with little knowledge or expertise on a topic share their opinions, often with confidence and conviction, which can be convincing to others. This is where the Dunning-Kruger effect comes into play. People who lack expertise on a topic tend to overestimate their abilities and knowledge, leading them to be more confident in their opinions, even if they are wrong.
As more people join in promoting a bad idea, it gains momentum and popularity, often fueled by the Bandwagon effect. This effect describes how people tend to follow the crowd, especially when it comes to social or cultural trends. The more people who believe a particular idea or opinion, the more likely it is that others will adopt it.
This creates an echo chamber effect, where people within a particular community or social group are exposed only to opinions that align with their own, further reinforcing the cycle of misinformation. Social media can exacerbate this effect by tailoring content to users' interests, which can limit their exposure to different perspectives.
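To make the mechanics of the cascade concrete, the sketch below is a deliberately simplified toy simulation of my own; every name, parameter, and number in it is an assumption chosen for illustration, not something drawn from the case studies or the cited research. It models a small, over-confident minority whose idea appears more popular than it really is, and a population that adopts the idea in proportion to that perceived popularity.

```python
import random

# Toy, illustrative simulation (all values are assumptions for illustration).
# It combines the three effects described above:
#   - a confident minority that shouts loudest (Dunning-Kruger stand-in),
#   - adoption driven by perceived popularity (Bandwagon effect),
#   - over-exposure to agreeing voices (Echo Chamber effect).

random.seed(1)

N = 1000                     # population size (assumption)
ROUNDS = 25                  # number of simulated "sharing" rounds (assumption)
ADOPTION_RATE = 0.3          # base willingness to follow the crowd (assumption)
BELIEVER_CONFIDENCE = 0.9    # confident minority (Dunning-Kruger stand-in)
SCEPTIC_CONFIDENCE = 0.4
ECHO_BIAS = 1.5              # >1: feeds over-expose agreeing voices (Echo Chamber stand-in)

believers = int(0.02 * N)    # the cascade starts with a tiny group
history = [believers]

for _ in range(ROUNDS):
    share = believers / N
    # How popular the idea *appears*: inflated by confident sharing and
    # echo-chamber over-exposure, capped at 1.0.
    perceived_popularity = min(
        1.0, share * ECHO_BIAS * (BELIEVER_CONFIDENCE / SCEPTIC_CONFIDENCE)
    )
    # Bandwagon effect: each remaining sceptic converts with a probability
    # proportional to the idea's perceived (not actual) popularity.
    converts = sum(
        1 for _ in range(N - believers)
        if random.random() < ADOPTION_RATE * perceived_popularity
    )
    believers += converts
    history.append(believers)

print("Believers after each round:", history)
```

Run as-is, the believer count tends to grow slowly at first and then accelerate once the idea appears mainstream, which mirrors the "drowning out" dynamic described above.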
The impact of this cycle can be significant. It can lead to the spread of false information and harmful ideas, even if they are not supported by evidence or logic. This can have serious consequences for individuals, organizations, and even entire societies. For example, misinformation about vaccines can lead to reduced vaccination rates, which can result in the resurgence of previously controlled infectious diseases.
The "Foolish Feedback Cascade" often overshadows, dilutes or drowns out real science and good ideas. It can make it increasingly harder or near impossible to remedy because it creates a self-reinforcing cycle that resists change. The more people that believe in a bad idea, the harder it is to persuade them to change their opinion. Even if presented with factual evidence, they may ignore it or dismiss it, as it does not align with their preconceived notions.
Case Studies
The following case studies are highly controversial and have caused a great deal of friction. I am not taking sides but merely using them as examples of the cascade's impact. In each case, either party could be wrong, or both could be. I hope you, the reader, can see the principle at work rather than getting drawn into a debate about who was right or wrong; either side in each case could see the other as the one caught in the cascade.
Anti-Vaccine Movement
The anti-vaccine movement is an example of the "Foolish Feedback Cascade." It started with a small group of individuals who believed that vaccines could cause autism, despite scientific evidence to the contrary. These individuals shared their opinions online and on social media, often with confidence and conviction, leading more people to join in promoting the bad idea. As the movement grew, it gained momentum and popularity, leading to reduced vaccination rates and the resurgence of previously controlled infectious diseases, such as measles (Orenstein & Ahmed, 2017).
2016 US Presidential Election
The 2016 US Presidential Election is an example of the "Foolish Feedback Cascade." It started with a small group of supporters of then-candidate Donald Trump, who promoted his populist message and unconventional campaign style, often with confidence and conviction. This created a domino effect in which more people joined in promoting Trump's campaign, giving it momentum and popularity. The cycle was reinforced by the Bandwagon effect, where people tend to follow the crowd, and the Echo Chamber effect, where people are exposed only to opinions that align with their own, further reinforcing the cycle of support (Barberá et al., 2015).
The impact of the "Foolish Feedback Cascade" in the 2016 election has been significant, as it led to the election of a candidate who had no previous political experience and who was known for promoting false information and conspiracy theories. This has had serious consequences for American politics and society. The cycle has also continued beyond the election, with Trump's supporters continuing to promote his messaging and conspiracies, often with little regard for factual evidence or logic (Allcott & Gentzkow, 2017).
Cryptocurrency Investment
Cryptocurrency investment is an example of the "Foolish Feedback Cascade." It started with a small group of tech enthusiasts who saw the potential of blockchain technology and of cryptocurrencies as an alternative to traditional currencies. This group started investing in cryptocurrencies with confidence, leading to a self-reinforcing cycle of more people investing and the value of the currencies rising. This cycle was reinforced by the Bandwagon effect, where people tend to follow the crowd, and the Echo Chamber effect, where people are exposed only to opinions that align with their own, further reinforcing the investment cycle (Brière et al., 2015).
The impact of the "Foolish Feedback Cascade" in cryptocurrency investment has been significant, as it led to a speculative bubble that eventually burst, causing significant losses for many investors. The cycle overshot the true value of the currencies and created unrealistic expectations for investors. This has had severe consequences for individual investors and has highlighted the importance of a critical and informed approach to investment decisions. Reference:
Anti-Mask Movement
The Anti-Mask Movement is an example of the "Foolish Feedback Cascade." It started with a small group of individuals who believed that mask-wearing was unnecessary and even harmful, despite scientific evidence to the contrary. These individuals shared their opinions online and on social media, often with confidence and conviction, creating a domino effect in which more people joined in promoting a bad idea. As the movement grew, it gained momentum and popularity, fueled by the Bandwagon effect, where individuals follow the crowd, and the Echo Chamber effect, where people are exposed only to opinions that align with their own, further reinforcing the cycle of misinformation (Jolley & Paterson, 2020).
The impact of the Anti-Mask Movement has been significant, leading to increased COVID-19 transmission rates and prolonging the pandemic. The movement has also spread false information and harmful ideas that are not supported by evidence or logic. This has had serious consequences for individuals, communities, and societies, and has highlighted the importance of critical thinking and evidence-based decision-making in public health (Tversky & Kahneman, 1974).
Agile
Agile is a software development philosophy that emphasizes values and principles to help organizations build better software products by promoting pragmatism and a strong focus on delivering value to customers. It is outlined in the Manifesto for Agile Software Development (Beck et al., 2001). However, its widespread adoption has led to a self-reinforcing cycle of misunderstanding, with individuals often changing the practices to suit their needs and compromising the philosophy's original intent. This has resulted in compromised forms of Agile, referred to as "Agile In Name Only," "frAgile," or "Wagile," and has led to unrealistic expectations and poor outcomes, driven by a saturation of individuals calling themselves Agile experts while lacking proper education and training in its application (Fitzgerald & Stol, 2019).
References
Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Houghton Mifflin.
Nemeth, C. J., & Staw, B. M. (1989). The tradeoffs of social control and innovation in groups and organizations. Advances in Experimental Social Psychology, 22, 175-210.
Hogg, M. A. (1992). The social psychology of group cohesiveness: From attraction to social identity. New York: New York University Press.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498.
Sanna, L. J., & Schwarz, N. (2003). Integrating temporal biases: The interplay of focal thoughts and accessibility experiences. Psychological Science, 14(5), 460-464.
Berelson, B., Lazarsfeld, P. F., & McPhee, W. N. (1954). Voting: A study of opinion formation in a presidential campaign. University of Chicago Press.
Cialdini, R. B. (2001). Influence: Science and practice (4th ed.). Boston, MA: Allyn and Bacon.
Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton University Press.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.
Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236.
Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459-464.
Orenstein, W. A., & Ahmed, R. (2017). Simply put: Vaccination saves lives. Proceedings of the National Academy of Sciences, 114(16), 4031-4033.
Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531-1542.
Hu, S., & Cao, J. (2021). The role of social media in cryptocurrency investment: A perspective of trust and risk. Telematics and Informatics, 61, 101600.
Brière, M., Oosterlinck, K., & Szafarz, A. (2015). Virtual currency, tangible return: Portfolio diversification with bitcoin. Journal of Asset Management, 16(6), 365-373.
Capraro, V., Barcelo, H., & Fung, H. H. (2020). COVID-19 lockdown: The interplay between compliance and reactance. PloS
Jolley, D., & Paterson, J. L. (2020). Pylons ablaze: Examining the role of 5G COVID-19 conspiracy beliefs and support for violence. British Journal of Social Psychology, 59(3), 628-640.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
Beck, K., Beedle, M., Van Bennekum, A., Cockburn, A., Cunningham, W., Fowler, M., ... & Kern, J. (2001). Manifesto for agile software development. Agile Alliance. Available at: https://agilemanifesto.org/
Fitzgerald, B., & Stol, K. J. (2019). The many faces of software development: A research framework for investigating agile adoption. Journal of Systems and Software, 157, 1103
In my view, the case studies demonstrate that the "Foolish Feedback Cascade" is at work whenever one or both parties are wrong, generating heat and friction; regardless of the specific case, the principle still applies. I am willing to acknowledge that I may be trapped in the cascade myself, because I could easily be one of those people standing on Mount Stupid. Yes, these are highly controversial topics, but that is exactly what illustrates the principle.