Sentience is the ability to experience feelings and sensations.[3] It may not necessarily imply higher cognitive functions such as awareness, reasoning, or complex thought processes. Sentience is an important concept in ethics, as the ability to experience happiness or suffering often forms a basis for determining which entities deserve moral consideration, particularly in utilitarianism.[4]
In Asian religions, the word "sentience" has been used to translate a variety of concepts. In science fiction, the word "sentience" is sometimes used interchangeably with "sapience", "self-awareness", or "consciousness".[5] Some writers differentiate between the mere ability to perceive sensations, such as light or pain, and the ability to perceive emotions, such as fear or grief. The subjective awareness of experiences by a conscious individual is known as qualia in Western philosophy.[5]
"Sentience" was coined by philosophers in the 1630s to denote the ability to feel, derived from Latin sentiens (feeling).[6] In philosophy, different authors draw different distinctions between consciousness and sentience. According to Antonio Damasio, sentience is a minimalistic way of defining consciousness, which otherwise commonly and collectively describes sentience plus further features of the mind, such as creativity, intelligence, sapience, self-awareness, and intentionality (the ability to have thoughts about something). These further features of consciousness may not be necessary for sentience, which is the capacity to feel sensations and emotions.[7]
According to Thomas Nagel in his paper "What Is It Like to Be a Bat?", consciousness can refer to the ability of any entity to have subjective perceptual experiences, or as some philosophers refer to them, "qualia"—in other words, the ability to have states that it feels like something to be in.[8] Some philosophers, notably Colin McGinn, believe that the physical process causing consciousness to happen will never be understood, a position known as "new mysterianism". They do not deny that most other aspects of consciousness are subject to scientific investigation, but they argue that qualia will never be explained.[9] Other philosophers, such as Daniel Dennett, argue that the concept of qualia is not meaningful.[10]
Regarding animal consciousness, the Cambridge Declaration on Consciousness, publicly proclaimed on 7 July 2012 at Cambridge University, states that many non-human animals possess the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states, and can exhibit intentional behaviors.[a] The declaration notes that all vertebrates (including fish and reptiles) have this neurological substrate for consciousness, and that there is strong evidence that many invertebrates also have it.[2]
David Chalmers argues that sentience is sometimes used as shorthand for phenomenal consciousness, the capacity to have any subjective experience at all, but sometimes refers to the narrower concept of affective consciousness, the capacity to experience subjective states that have affective valence (i.e., a positive or negative character), such as pain and pleasure.[11]
The sentience quotient concept was introduced by Robert A. Freitas Jr. in the late 1970s.[12] It defines sentience as the relationship between the information processing rate of each individual processing unit (neuron), the weight/size of a single unit, and the total number of processing units (expressed as mass). It was proposed as a measure for the sentience of all living beings and computers from a single neuron up to a hypothetical being at the theoretical computational limit of the entire universe. On a logarithmic scale it runs from −70 up to +50.
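Freitas's quotient is commonly stated as the base-10 logarithm of an information-processing rate (in bits per second) divided by the mass of the processor (in kilograms); the sketch below illustrates that relationship. The formula and the sample inputs are illustrative assumptions for exposition, not measurements from Freitas's paper.

```python
import math

def sentience_quotient(bits_per_second: float, mass_kg: float) -> float:
    """Sentience quotient (SQ): base-10 log of information-processing
    rate (bits/s) divided by processor mass (kg)."""
    return math.log10(bits_per_second / mass_kg)

# Illustrative, assumed values: a 1 kg system processing 10**13 bits/s
print(sentience_quotient(1e13, 1.0))  # 13.0
```

Because the scale is logarithmic, each unit step corresponds to a tenfold change in processing rate per unit mass, which is how a single number can span everything from a lone neuron near the bottom of the −70 to +50 range to a hypothetical universe-scale computer near the top.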
Eastern religions including Hinduism, Buddhism, Sikhism, and Jainism recognise non-humans as sentient beings.[13] The term sentient beings is translated from various Sanskrit terms (jantu, bahu jana, jagat, sattva) and "conventionally refers to the mass of living things subject to illusion, suffering, and rebirth (Saṃsāra)".[14] It is related to the concept of ahimsa, non-violence toward other beings.[15] In some forms of Buddhism, plants, stones and other inanimate objects are considered to be 'sentient'.[16][17] In Jainism many things are endowed with a soul, jīva, which is sometimes translated as 'sentience'.[18][19] Some things are without a soul, ajīva, such as a chair or spoon.[20] There are different rankings of jīva based on the number of senses it has. Water, for example, is a sentient being of the first order, as it is considered to possess only one sense, that of touch.[21]
Sentience in Buddhism is the state of having senses. In Buddhism, there are six senses, the sixth being the subjective experience of the mind. Sentience is simply awareness prior to the arising of Skandha. Thus, an animal qualifies as a sentient being. According to Buddhism, sentient beings made of pure consciousness are possible. In Mahayana Buddhism, which includes Zen and Tibetan Buddhism, the concept is related to the Bodhisattva, an enlightened being devoted to the liberation of others. The first vow of a Bodhisattva states, "Sentient beings are numberless; I vow to free them."
Sentience has been a central concept in the animal rights movement, tracing back to the well-known writing of Jeremy Bentham in An Introduction to the Principles of Morals and Legislation: "The question is not, Can they reason? nor, Can they talk? but, Can they suffer?"
Richard D. Ryder defines sentientism broadly as the position according to which an entity has moral status if and only if it is sentient.[25] In David Chalmers's more specific terminology, Bentham is a narrow sentientist, since his criterion for moral status is not merely the ability to experience any phenomenal consciousness at all, but specifically the ability to experience conscious states with negative affective valence (i.e., suffering).[11] Animal welfare and rights advocates often invoke similar capacities. For example, the documentary Earthlings argues that while animals do not have all the desires, or the capacity for comprehension, that humans do, they share the desires for food and water, shelter and companionship, freedom of movement, and avoidance of pain.[26][b]
Animal-welfare advocates typically argue that any sentient being is entitled, at a minimum, to protection from unnecessary suffering, though animal-rights advocates may differ on what rights (e.g., the right to life) are entailed by simple sentience. Sentiocentrism describes the theory that sentient individuals are the center of moral concern.
Gary Francione also bases his abolitionist theory of animal rights, which differs significantly from Peter Singer's, on sentience. He asserts that "all sentient beings, humans or nonhuman, have one right: the basic right not to be treated as the property of others."[27]
Andrew Linzey, a British theologian, considers that Christianity should regard sentient animals according to their intrinsic worth, rather than their utility to humans.[28]
In 1997 the concept of animal sentience was written into the basic law of the European Union. The legally binding protocol annexed to the Treaty of Amsterdam recognises that animals are "sentient beings", and requires the EU and its member states to "pay full regard to the welfare requirements of animals".[29]
Nociception is the process by which the nervous system detects and responds to potentially harmful stimuli, leading to the sensation of pain. It involves specialized receptors called nociceptors that sense damage or threat and send signals to the brain. Nociception is widespread among animals, even among insects.[31]
The presence of nociception indicates an organism's ability to detect harmful stimuli. A further question is whether the way these noxious stimuli are processed within the brain leads to a subjective experience of pain.[31] To address that, researchers often look for behavioral cues. For example, "if a dog with an injured paw whimpers, licks the wound, limps, lowers pressure on the paw while walking, learns to avoid the place where the injury happened and seeks out analgesics when offered, we have reasonable grounds to assume that the dog is indeed experiencing something unpleasant." Avoiding painful stimuli unless the reward is significant can also provide evidence that pain avoidance is not merely an unconscious reflex (similarly to how humans "can choose to press a hot door handle to escape a burning building").[30]
Animals such as pigs, chickens, and fish are typically recognized as sentient. There is more uncertainty regarding insects, and findings on certain insect species may not be applicable to others.[31]
Historically, fish were not considered sentient, and their behaviors were often viewed as "reflexes or complex, unconscious species-typical responses" to their environment. Their dissimilarity to humans, including the absence of a direct equivalent of the neocortex in their brains, was used as an argument against sentience.[32] Jennifer Jacquet suggests that the belief that fish do not feel pain originated in response to a 1980s policy aimed at banning catch and release.[33] The range of animals regarded by scientists as sentient or conscious has progressively widened, and now includes animals such as fish, lobsters, and octopuses.[34]
Digital sentience (or artificial sentience) refers to the sentience of artificial intelligences. Whether artificial intelligences can be sentient is a matter of controversy.[35]
The AI research community does not consider sentience (that is, the "ability to feel sensations") an important research goal, unless it can be shown that consciously "feeling" a sensation can make a machine more intelligent than simply receiving input from sensors and processing it as information. Stuart Russell and Peter Norvig wrote in 2021: "We are interested in programs that behave intelligently. Individual aspects of consciousness—awareness, self-awareness, attention—can be programmed and can be part of an intelligent machine. The additional project of making a machine conscious in exactly the way humans are is not one that we are equipped to take on."[36] Indeed, leading AI textbooks do not mention "sentience" at all.[37]
Digital sentience is of considerable interest to the philosophy of mind. Functionalist philosophers consider that sentience is about "causal roles" played by mental states, which involve information processing. In this view, the physical substrate of this information processing does not need to be biological, so there is no theoretical barrier to the possibility of sentient machines.[38] According to type physicalism however, the physical constitution is important; and depending on the types of physical systems required for sentience, it may or may not be possible for certain types of machines (such as electronic computing devices) to be sentient.[39]
The discussion of the alleged sentience of artificial intelligence was reignited in 2022 by claims that Google's LaMDA (Language Model for Dialogue Applications) artificial intelligence system was "sentient" and had a "soul".[40] LaMDA is an artificial intelligence system that creates chatbots—AI programs designed to communicate with humans—by gathering vast amounts of text from the internet and using algorithms to respond to queries in the most fluid and natural way possible. Transcripts of conversations between scientists and LaMDA reveal that the AI system excels at this, providing answers on challenging topics about the nature of emotions, generating Aesop-style fables on cue, and even describing its alleged fears.[41]
Nick Bostrom considers that while LaMDA is probably not sentient, being very sure of it would require understanding how consciousness works, having access to unpublished information about LaMDA's architecture, and finding how to apply the philosophical theory to the machine.[42] He also said about LLMs that "it's not doing them justice to say they're simply regurgitating text", noting that they "exhibit glimpses of creativity, insight and understanding that are quite impressive and may show the rudiments of reasoning". He thinks that "sentience is a matter of degree".[35]
In 2022, philosopher David Chalmers made a speech on whether large language models (LLMs) can be conscious, encouraging more research on the subject. He suggested that current LLMs were probably not conscious, but that the limitations are temporary and that future systems could be serious candidates for consciousness.[43]
According to Jonathan Birch, "measures to regulate the development of sentient AI should run ahead of what would be proportionate to the risks posed by current technology, considering also the risks posed by credible future trajectories." He is concerned that AI sentience would be particularly easy to deny, and that even if it were achieved, humans might continue to treat AI systems as mere tools. He notes that the linguistic behaviour of LLMs is not a reliable way to assess whether they are sentient. He suggests applying theories of consciousness, such as the global workspace theory, to the algorithms implicitly learned by LLMs, but notes that this technique requires advances in AI interpretability to understand what happens inside them. He also mentions other pathways that may lead to AI sentience, such as the brain emulation of sentient animals.[44]
a. ^ Quote: "The absence of a neocortex does not appear to preclude an organism from experiencing affective states. Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Non-human animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates."[2]
b. ^ Quote: "Granted, these animals do not have all the desires we humans have; granted, they do not comprehend everything we humans comprehend; nevertheless, we and they do have some of the same desires and do comprehend some of the same things. The desires for food and water, shelter and companionship, freedom of movement and avoidance of pain."[26]