Why language clouds our ascription of understanding, intention and consciousness

Stuart, S. (2024) Why language clouds our ascription of understanding, intention and consciousness. Phenomenology and the Cognitive Sciences, (doi: 10.1007/s11097-024-09970-1) (Early Online Publication)

Abstract

The grammatical manipulation and production of language is a great deceiver. We have become habituated to accept the use of well-constructed language as an indicator of intelligence, understanding and, consequently, intention, whether conscious or unconscious. But we are not always right to do so, and certainly not in the case of large language models (LLMs) like ChatGPT, GPT-4, LLaMA, and Google Bard. This is a perennial problem, but once one understands why it occurs, it ceases to be surprising that it so stubbornly persists. This paper has three main sections. In the Introduction I say a little about language, its aetiology, and its useful sub-division into natural and cultural. In the second section I explain the current situation with regard to large language models and fill in the background debates which set the problem up as one of increased complexity rather than one of a qualitatively different kind from narrow or specific AI. In the third section I present the case for the missing phenomenological background and why it is necessary for the co-creation of shared meaning in both natural and cultural language, and I conclude this section by presenting a rationale for why this situation arises and will continue to arise. Before any of this, I need to clarify one point: I do not wish to challenge the ascription of artificial general intelligence (AGI) to LLMs; indeed, I think agnosticism is best in this respect. But I do challenge the more serious, and erroneous, ascription of understanding, intention, reason, and consciousness to them. And so I am making two points: an epistemological one about why we fall into error in our ascription of a mental life to LLMs, and an ontological one about the impossibility of LLMs being or becoming conscious.

Item Type: Articles
Status: Early Online Publication
Refereed: Yes
Glasgow Author(s) Enlighten ID: Stuart, Dr Susan
Authors: Stuart, S.
College/School: College of Arts & Humanities > School of Humanities
Journal Name: Phenomenology and the Cognitive Sciences
Publisher: Springer
ISSN: 1568-7759
ISSN (Online): 1572-8676
Published Online: 04 March 2024