Disinformation as weaponised ignorance: A hybrid teleological–epistemic framework

Document Type: Original article

Authors

Department of Philosophy, SR.C. Islamic Azad University, Tehran, Iran.

Abstract

Background: Contemporary debates on disinformation are dominated by two influential approaches: Fallis’s functional (teleological) model and Simion’s purely epistemic Disinformation as Ignorance-Generating Content (DIGC) model. Their respective strengths and weaknesses become salient in real-world settings such as the Backfire Effect, the spread of bullshit, and disputes about the epistemic status of Large Language Models (LLMs).
Aims: We aim to (i) critically evaluate the explanatory and classificatory utility of the functional and purely epistemic models across these scenarios, (ii) diagnose key failure modes (especially DIGC’s over-generation and the functional model’s difficulties with non-intentional sources), and (iii) propose a more extensionally adequate framework.
Methodology: We conduct a comparative conceptual analysis of both models and test their classifications against several cases. In particular, we use empirical findings on the Backfire Effect to examine whether a purely consequence-based criterion misclassifies accurate, well-intentioned scientific information. We also incorporate Frankfurt’s distinction between lying and bullshit to refine how epistemic malice is characterised.
Findings: Fallis’s functional model captures complex forms of disinformation (including true and adaptive disinformation) by tying disinformation to a misleading function, but it struggles to classify outputs from non-intentional sources such as autonomous AI. DIGC broadens coverage by removing intentionality and focusing on dispositions to increase ignorance, yet this purely epistemic stance yields an Over-generation Problem: under Backfire conditions, it can wrongly classify accurate and well-intentioned scientific communication as disinformation. To address these limitations, we propose a hybrid teleological–epistemic framework, Functional-Contextual Disinformation (FC-DIGC), which combines DIGC’s consequence criterion with a teleological constraint requiring a misleading function. This synthesis better separates malicious deception (disinformation) from unintended epistemic harm (contextually harmful misinformation) and helps clarify how LLM outputs should be categorised.
Conclusion: A hybrid teleological approach improves extensional adequacy by preventing over-generation while retaining coverage for non-intentional systems. FC-DIGC provides a principled way to distinguish disinformation from contextually harmful misinformation and, by integrating the lying–bullshit contrast, captures a broader spectrum of epistemically motivated malice relevant to contemporary information environments, including AI-mediated communication.

Main Subjects

Humanities & Social Sciences; Epistemology of Information; Disinformation

References

Adler, J.E. (1997). “Lying, deceiving, or falsely implicating”. Journal of Philosophy. 94(9): 435-452. https://doi.org/10.2307/2564617.
Ausili, L. (2025). “The inseparable link between disinformation and attitudes”. Inquiry. 1-8. https://doi.org/10.1080/0020174X.2025.2542393.
Chisholm, R.M. & Feehan, T.D. (1977). “The intent to deceive”. Journal of Philosophy. 74(3): 143-159. https://doi.org/10.2307/2025605.
Dretske, F.I. (1983). “Précis of Knowledge and the Flow of Information”. Behavioral and Brain Sciences. 6(1): 55-90. https://doi.org/10.1017/S0140525X00014631.
Fallis, D. (2015). “What is disinformation?”. Library Trends. 63(3): 401-426. https://doi.org/10.1353/lib.2015.0014.
Fallis, D. (2014). “The varieties of disinformation”. In L. Floridi & P. Illari (Eds.). The Philosophy of Information Quality (pp. 135–161). Springer. https://doi.org/10.1007/978-3-319-07121-3_8.
Fallis, D. (2009). “A conceptual analysis of disinformation”. iConference Proceedings. http://hdl.handle.net/2142/15205.
Fetzer, J.H. (2004). “Disinformation: The use of false information”. Minds and Machines. 14(2): 231-240. https://doi.org/10.1023/B:MIND.0000021683.28604.5b.
Floridi, L. (2011). The Philosophy of Information. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199232383.001.0001.
Frankfurt, H.G. (1988). “On bullshit”. In The Importance of What We Care About: Philosophical Essays (pp. 117–133). Cambridge University Press. https://doi.org/10.1017/CBO9780511818172.011.
Goodwin, J. (2011). “Accounting for the appeal to the authority of experts”. Argumentation. 25(3): 285-296. https://doi.org/10.1007/s10503-011-9219-6.
Grice, H.P. (1989). Studies in the Way of Words. Harvard University Press.
Hájek, A. (2007). “The reference class problem is your problem too”. Synthese. 156: 563-585. https://doi.org/10.1007/s11229-006-9138-5.
Hauswald, R. (2025). “Artificial epistemic authorities”. Social Epistemology. 39: 716-725. https://doi.org/10.1080/02691728.2025.2449602.
Kelp, C. & Simion, M. (2025). “What is information?”. Aristotelian Society Supplementary Volume. 99(1): 189-208. https://doi.org/10.1093/arisup/akaf008.
Kingsbury, J. & McKeown-Green, J. (2009). “Definitions: Does disjunction mean dysfunction?”. Journal of Philosophy. 106(10): 568-585. https://doi.org/10.5840/jphil20091061034.
Kumar, K.P.K. & Geethakumari, G. (2014). “Detecting misinformation in online social networks using cognitive psychology”. Human-centric Computing and Information Sciences. 4(1): 22. https://doi.org/10.1186/s13673-014-0014-x.
Mahon, J.E. (2016). “The definition of lying and deception”. The Stanford Encyclopedia of Philosophy. Winter. E.N. Zalta (ed.). https://plato.stanford.edu/archives/win2016/entries/lying-definition.
Mizrahi, M. (2025). “No epistemic respect for bullshit machines or LLMs”. Social Epistemology Review and Reply Collective. 14(9): 138-146. https://wp.me/p1Bfg0-amy.
Nyhan, B.; Reifler, J.; Richey, S. & Freed, G.L. (2014). “Effective messages in vaccine promotion: A randomized trial”. Pediatrics. 133(4): e835-e842. https://doi.org/10.1542/peds.2013-2365.
Pluviano, S.; Watt, C. & Della Sala, S. (2017). “Misinformation lingers in memory: Failure of three pro-vaccination strategies”. PLoS ONE. 12(7): e0181640. https://doi.org/10.1371/journal.pone.0181640.
Simion, M. (2024a). “Knowledge and disinformation”. Episteme. 21(4): 1208-1219. https://doi.org/10.1017/epi.2023.25.
Simion, M. (2024b). Resistance to Evidence. Cambridge University Press. https://doi.org/10.1017/9781009298537.
Skyrms, B. (2010). Signals: Evolution, Learning, and Information. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199580828.001.0001.
Williamson, T. (2007). The Philosophy of Philosophy. Wiley-Blackwell. https://doi.org/10.1002/9780470696675.
Volume 10, Issue 2
July 2026
Pages 493-512
  • Receive Date: 15 October 2025
  • Revise Date: 13 December 2025
  • Accept Date: 31 December 2025
  • First Publish Date: 13 January 2026
  • Publish Date: 01 July 2026