Cognition is the "mental action or process of acquiring knowledge and understanding through thought, experience, and the senses".[2] It encompasses all aspects of intellectual functions and processes such as: perception, attention, thought, imagination, intelligence, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and computation, problem-solving and decision-making, comprehension and production of language. Cognitive processes use existing knowledge to discover new knowledge.
Cognitive processes are analyzed from very different perspectives within different contexts, notably in the fields of linguistics, musicology, anesthesia, neuroscience, psychiatry, psychology, education, philosophy, anthropology, biology, systemics, logic, and computer science.[3] These and other approaches to the analysis of cognition (such as embodied cognition) are synthesized in the developing field of cognitive science, a progressively autonomous academic discipline.
Etymology
The word cognition dates back to the 15th century, when it meant "thinking and awareness".[4] The term comes from the Latin noun cognitio ('examination', 'learning', or 'knowledge'), derived from the verb cognosco, a compound of con ('with') and gnōscō ('know'). The latter half, gnōscō, is itself a cognate of the Greek verb gi(g)nósko (γι(γ)νώσκω, 'I know' or 'perceive').[5][6]
Early studies
Despite the word cognitive itself dating back to the 15th century,[4] attention to cognitive processes came about more than eighteen centuries earlier, beginning with Aristotle (384–322 BCE) and his interest in the inner workings of the mind and how they affect the human experience. Aristotle focused on cognitive areas pertaining to memory, perception, and mental imagery. He placed great importance on ensuring that his studies were based on empirical evidence, that is, scientific information that is gathered through observation and conscientious experimentation.[7] Two millennia later, the groundwork for modern concepts of cognition was laid during the Enlightenment by thinkers such as John Locke and Dugald Stewart who sought to develop a model of the mind in which ideas were acquired, remembered and manipulated.[8]
During the very early nineteenth century cognitive models were developed both in philosophy—particularly by authors writing about the philosophy of mind—and within medicine, especially by physicians seeking to understand how to cure madness. In Britain, these models were studied in the academy by scholars such as James Sully at University College London, and they were even used by politicians when considering the national Elementary Education Act 1870 (33 & 34 Vict. c. 75).[9]
As psychology emerged as a burgeoning field of study in Europe and gained a following in America, scientists such as Wilhelm Wundt, Hermann Ebbinghaus, Mary Whiton Calkins, and William James offered their contributions to the study of human cognition.[citation needed]
Early theorists
Wilhelm Wundt (1832–1920) emphasized the notion of what he called introspection: examining the inner feelings of an individual. With introspection, the subject had to be careful to describe their feelings in the most objective manner possible in order for Wundt to find the information scientific.[10][11] Though Wundt's contributions are by no means minimal, modern psychologists find his methods too subjective and choose to rely on more objective procedures of experimentation to draw conclusions about the human cognitive process.[citation needed]
Hermann Ebbinghaus (1850–1909) conducted cognitive studies that mainly examined the function and capacity of human memory. Ebbinghaus developed his own experiment in which he constructed over 2,000 syllables made out of nonexistent words (for instance, 'EAS'). He then examined his own ability to learn these non-words. He purposely chose non-words rather than real words to control for the influence of pre-existing experience: meaningful words would have been easier to recall because of what they symbolize.[10][12] Ebbinghaus observed and hypothesized a number of variables that may have affected his ability to learn and recall the non-words he created. One such variable, he concluded, was the amount of time between the presentation of the list of stimuli and its recitation or recall. Ebbinghaus was the first to record and plot a "learning curve" and a "forgetting curve".[13] His work heavily influenced the study of serial position and its effect on memory.[citation needed]
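The forgetting curve is often summarized in later work with a simple exponential-decay model of retention. The Python sketch below is only an illustration of that shape; the "stability" constant and the resulting retention values are assumptions for illustration, not Ebbinghaus's own data or formula.

```python
import math

def retention(t_hours, stability=20.0):
    """Illustrative exponential forgetting curve: the fraction of material
    retained after t_hours, with a hypothetical 'stability' constant."""
    return math.exp(-t_hours / stability)

# Retention drops quickly at first and then levels off, mirroring the
# characteristic shape of a forgetting curve.
for t in (0, 1, 9, 24, 48, 144):
    print(f"after {t:3d} h: {retention(t):.2f}")
```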
Mary Whiton Calkins (1863–1930) was an influential American pioneer in the field of psychology. Her work also focused on human memory capacity. A common theory, called the recency effect, can be attributed to the studies that she conducted.[14] The recency effect, also discussed in the subsequent experiment section, is the tendency for individuals to accurately recollect the final items presented in a sequence of stimuli. Calkins's theory is closely related to the aforementioned study and conclusions of the memory experiments conducted by Hermann Ebbinghaus.[15]
William James (1842–1910) is another pivotal figure in the history of cognitive science. James was quite discontented with Wundt's emphasis on introspection and with Ebbinghaus' use of nonsense stimuli. He instead chose to focus on the human learning experience in everyday life and its importance to the study of cognition. James' most significant contribution to the study and theory of cognition was his textbook Principles of Psychology, which preliminarily examines aspects of cognition such as perception, memory, reasoning, and attention.[15]
René Descartes (1596–1650) was a seventeenth-century philosopher who coined the phrase "Cogito, ergo sum" ("I think, therefore I am"). He took a philosophical approach to the study of cognition and the mind; with his Meditations he wanted readers to meditate along with him and arrive at the same conclusions he did, but through their own free cognition.[16]
Psychology
In psychology, the term "cognition" is usually used within an information-processing view of an individual's psychological functions,[17] as it is in cognitive engineering.[18] In the study of social cognition, a branch of social psychology, the term is used to explain attitudes, attribution, and group dynamics.[17] However, psychological research within the field of cognitive science has also suggested an embodied approach to understanding cognition. Contrary to the traditional computationalist approach, embodied cognition emphasizes the body's significant role in the acquisition and development of cognitive capabilities.[19][20]
Human cognition is conscious or unconscious, concrete or abstract, as well as intuitive (like knowledge of a language) and conceptual (like a model of a language). It encompasses processes such as memory, association, concept formation, pattern recognition, language, attention, perception, action, problem solving, and mental imagery.[21][22] Traditionally, emotion was not thought of as a cognitive process, but much research is now being undertaken to examine the cognitive psychology of emotion; research also focuses on one's awareness of one's own strategies and methods of cognition, which is called metacognition. The concept of cognition has gone through several revisions through the development of disciplines within psychology.[citation needed]
Psychologists initially understood the cognition governing human action as information processing. This movement, known as cognitivism, emerged in the 1950s, after the behaviorist movement had viewed cognition as a form of behavior.[23] Cognitivism approached cognition as a form of computation, viewing the mind as a machine and consciousness as an executive function.[19] However, post-cognitivism began to emerge in the 1990s as developments in cognitive science presented theories highlighting the necessity of cognitive action as embodied, extended, and producing dynamic processes in the mind.[24] Cognitive psychology developed out of these different theories and began exploring these dynamics concerning mind and environment, starting a movement away from the prior dualist paradigms that prioritized cognition either as systematic computation or exclusively as behavior.[19]
Piaget's theory of cognitive development
For years, sociologists and psychologists have conducted studies on cognitive development, i.e. the construction of human thought or mental processes.[citation needed]
Jean Piaget was one of the most important and influential people in the field of developmental psychology. He believed that humans are unique in comparison to animals because we have the capacity to do "abstract symbolic reasoning". His work can be compared to that of Lev Vygotsky, Sigmund Freud, and Erik Erikson, who were also great contributors to the field of developmental psychology. Piaget is known for studying the cognitive development of children; having studied his own three children and their intellectual development, he came to a theory of cognitive development that describes the developmental stages of childhood.[25]
Stage | Age or Period | Description[26] |
---|---|---|
Sensorimotor stage | Infancy (0–2 years) | Intelligence is present; motor activity but no symbols; knowledge is developing yet limited; knowledge is based on experiences/ interactions; mobility allows the child to learn new things; some language skills are developed at the end of this stage. The goal is to develop object permanence, achieving a basic understanding of causality, time, and space. |
Preoperational stage | Toddler and Early Childhood (2–7 years) | Symbols or language skills are present; memory and imagination are developed; non-reversible and non-logical thinking; shows intuitive problem solving; begins to perceive relationships; grasps the concept of conservation of numbers; predominantly egocentric thinking. |
Concrete operational stage | Elementary and Early Adolescence (7–12 years) | Logical and systematic form of intelligence; manipulation of symbols related to concrete objects; thinking is now characterized by reversibility and the ability to take the role of another; grasps concepts of the conservation of mass, length, weight, and volume; predominantly operational thinking; nonreversible and egocentric thinking diminish |
Formal operational stage | Adolescence and Adulthood (12 years and on) | Logical use of symbols related to abstract concepts; Acquires flexibility in thinking as well as the capacities for abstract thinking and mental hypothesis testing; can consider possible alternatives in complex reasoning and problem-solving. |
Beginning of cognition
Studies on cognitive development have also been conducted in children beginning from the embryonic period to understand when cognition appears and which environmental attributes stimulate the construction of human thought or mental processes. Research shows intentional engagement of fetuses with the environment, demonstrating cognitive achievements.[27] However, organisms with simple reflexes cannot cognize the environment alone, because the environment is a cacophony of stimuli (electromagnetic waves, chemical interactions, and pressure fluctuations).[28] Their sensation is too limited by this noise to solve the cue problem: the relevant stimulus cannot overcome the noise magnitude as it passes through the senses (see the binding problem). Fetuses need external help to stimulate their nervous system in choosing the relevant sensory stimulus for grasping the perception of objects.[29] The Shared intentionality approach proposes a plausible explanation of perception development at this early stage. Michael Tomasello introduced the psychological construct of Shared intentionality, highlighting its contribution to cognitive development from birth.[30] This primary interaction provides unaware collaboration in mother-child dyads for environmental learning. Later, Igor Val Danilov developed this notion, expanding it to the intrauterine period and clarifying the neurophysiological processes underlying Shared intentionality.[31] According to the Shared intentionality approach, the mother shares the essential sensory stimulus of the actual cognitive problem with the child.[32] By sharing this stimulus, the mother provides a template for developing the young organism's nervous system.[33]
Recent findings in research on child cognitive development[29][31][34][35][36][37][38][39][40] and advances in inter-brain neuroscience experiments[41][42][43][44][45] have made the above proposition plausible. Based on them, the shared intentionality hypothesis introduced the notion of pre-perceptual communication in the mother-fetus communication model due to nonlocal neuronal coupling.[27][31][33] This nonlocal coupling model refers to communication between two organisms through the copying of adequate ecological dynamics by biological systems sharing one environmental context: a naive actor (the fetus) replicates information from an experienced actor (the mother) through intrinsic processes of these dynamic systems (embodied information), but without interacting through sensory signals.[27][31][33] The mother's heartbeats act as a low-frequency oscillator that modulates relevant local neuronal networks in specific subsystems of both her nervous system and that of the fetus, through the interference of this low-frequency oscillator with gamma activity already exhibited in these networks (in physics, interference is the combination of two or more waveforms to form a resultant wave).[27][31][33] On this account, subliminal perception in the fetus emerges through Shared intentionality with the mother, which stimulates cognition in the organism even before birth.[27][31][33]
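The "interference" referred to above is the superposition of oscillations. The following sketch illustrates only that physical notion; the specific frequencies (a ~1.3 Hz "heartbeat" oscillator and a 40 Hz gamma rhythm) and the simple additive model are assumptions chosen for illustration, not parameters taken from the cited studies.

```python
import numpy as np

fs = 1000                                   # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)                 # two seconds of samples

low = np.sin(2 * np.pi * 1.3 * t)           # hypothetical low-frequency oscillator (~heart rate)
gamma = 0.3 * np.sin(2 * np.pi * 40 * t)    # hypothetical gamma-band activity

combined = low + gamma                      # superposition: the resultant waveform

# The slow component modulates the envelope of the combined signal,
# which is what "interference" of two oscillations means here.
print(combined[:5])
```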
Another crucial question in understanding the beginning of cognition is how the naive nervous system stores memory of the relevant ecological dynamics (i.e., memorizes the ecological conditions of the relevant sensory stimulus) at the molecular level, in an engram. Evidence derived from optical imaging, molecular-genetic, and optogenetic techniques, in conjunction with appropriate behavioural analyses, continues to support the idea that changing the strength of connections between neurons is one of the major mechanisms by which engrams are stored in the brain.[46]
Proposed mechanisms of cognition at this stage include quantum effects[47] and the synchronization of brain structures through electromagnetic interference.[48][27][31][33]
Common types of tests on human cognition
Serial position
The serial-position experiment tests a theory of memory which states that when information is given in a serial manner, we tend to remember information at the beginning of the sequence (the primacy effect) and information at the end of the sequence (the recency effect). Consequently, information given in the middle of the sequence is typically forgotten or not recalled as easily. This paradigm predicts that the recency effect is stronger than the primacy effect, because the most recently learned information is still in working memory when recall is requested, whereas information learned first still has to go through a retrieval process. This experiment focuses on human memory processes.[49]
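The predicted U-shaped recall curve can be illustrated with a toy model in which a primacy boost and a recency boost are added to a baseline recall probability. All of the numbers below are arbitrary illustrations, not experimental data.

```python
def recall_probability(position, list_length, baseline=0.3,
                       primacy=0.4, recency=0.5):
    """Toy model of the serial-position curve: recall probability is a
    baseline plus a primacy boost that fades over the early positions and
    a recency boost that grows toward the end of the list."""
    primacy_boost = primacy * max(0.0, 1 - (position - 1) / 3)
    recency_boost = recency * max(0.0, 1 - (list_length - position) / 3)
    return min(1.0, baseline + primacy_boost + recency_boost)

curve = [recall_probability(p, 15) for p in range(1, 16)]
print([round(x, 2) for x in curve])   # high at both ends, low in the middle
```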
Word superiority
The word superiority effect experiment presents a subject with either a word or a single letter by itself for a brief period of time (e.g., 40 ms), and the subject is then asked to recall the letter that was in a particular location in the word. In theory, the subject should be better able to correctly recall the letter when it was presented in a word than when it was presented in isolation. This experiment focuses on human speech and language.[50]
Brown–Peterson
In the Brown–Peterson task, participants are briefly presented with a trigram and, in one particular version of the experiment, they are then given a distractor task asking them to identify whether each item in a sequence of letter strings is a word or a non-word (due to being misspelled, etc.). After the distractor task, they are asked to recall the trigram from before the distractor task. In theory, the longer the distractor task, the harder it will be for participants to correctly recall the trigram. This experiment focuses on human short-term memory.[51]
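The trial structure can be sketched as follows; the consonant set, the decay constant, and the assumption that recall probability falls roughly exponentially with the length of the filled delay are illustrative choices, not parameters from the original studies.

```python
import random

def brown_peterson_trial(distractor_seconds):
    """Sketch of one trial: present a consonant trigram, impose a filled
    delay (here only simulated), then test recall. The decay constant
    below is arbitrary and purely illustrative."""
    trigram = "".join(random.sample("BCDFGHJKLMNPQRSTVWXZ", 3))
    # Toy assumption: correct recall becomes less likely as the filled
    # delay grows, roughly exponentially.
    p_recall = 0.9 * (0.5 ** (distractor_seconds / 6))
    return trigram, p_recall

for delay in (3, 6, 12, 18):
    trigram, p = brown_peterson_trial(delay)
    print(f"{trigram}: recall probability ~ {p:.2f} after {delay} s of distraction")
```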
Memory span
During the memory span experiment, each subject is presented with a sequence of stimuli of the same kind: words depicting objects, numbers, letters that sound similar, or letters that sound dissimilar. After being presented with the stimuli, the subject is asked to recall the sequence in the exact order in which it was given. In one particular version of the experiment, if the subject recalled a list correctly, the list length was increased by one for that type of material, and decreased by one if it was recalled incorrectly. The theory is that people have a memory span of about seven items for numbers, and about the same for letters that sound dissimilar and for short words. The memory span is projected to be shorter for letters that sound similar and for longer words.[52]
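The adaptive rule described above (lengthen the list after a correct recall, shorten it after an error) can be sketched as a simple staircase procedure. The simulated participant below, whose span hovers around seven items, is a hypothetical stand-in for a human subject.

```python
import random

def adaptive_span(recall_correct, start_length=3, trials=20):
    """Staircase sketch of the memory span procedure: the list grows by
    one item after a correct recall and shrinks by one after an error.
    `recall_correct(length)` stands in for the human participant."""
    length = start_length
    history = []
    for _ in range(trials):
        correct = recall_correct(length)
        history.append((length, correct))
        length = length + 1 if correct else max(1, length - 1)
    return history

# Hypothetical participant whose probability of correct recall drops
# steeply around a span of seven items.
simulated = adaptive_span(lambda n: random.random() < 1 / (1 + 2 ** (n - 7)))
print(simulated)
```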
Visual search
In one version of the visual search experiment, a participant is presented with a window that displays circles and squares scattered across it. The participant is to identify whether there is a green circle in the window. In the feature search, the subject is presented with several trial windows that contain blue squares or circles and either one green circle or no green circle at all. In the conjunction search, the subject is presented with trial windows that contain blue circles or green squares, and is asked to identify whether a green circle is present or absent. In feature searches, reaction time, that is, the time it takes a participant to identify whether a green circle is present or not, is expected not to change as the number of distractors increases. Conjunction searches where the target is absent should have a longer reaction time than conjunction searches where the target is present. The theory is that in feature searches it is easy to spot the target, or to notice its absence, because of the difference in color between the target and the distractors. In conjunction searches where the target is absent, reaction time increases because the subject has to look at each shape to determine whether it is the target, since some of the distractors, if not all of them, are the same color as the target stimulus. Conjunction searches where the target is present take less time because the search stops once the target is found.[53]
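These predictions are commonly summarized as reaction time growing linearly with the number of items in conjunction search but staying flat in feature search, with the target-absent slope roughly twice the target-present slope. The following toy model only illustrates that pattern; the millisecond constants are arbitrary.

```python
def predicted_rt(search_type, set_size, target_present,
                 base_ms=450.0, ms_per_item=30.0):
    """Toy reaction-time model for the predictions described above.
    Feature search: flat RT regardless of set size ("pop-out").
    Conjunction search: RT grows with set size, roughly twice as fast
    when the target is absent (every item must be checked) as when it
    is present (search stops at the target, on average halfway through)."""
    if search_type == "feature":
        return base_ms
    if target_present:
        return base_ms + ms_per_item * set_size / 2   # self-terminating search
    return base_ms + ms_per_item * set_size           # exhaustive search

for n in (4, 8, 16, 32):
    print(n,
          predicted_rt("feature", n, True),
          predicted_rt("conjunction", n, True),
          predicted_rt("conjunction", n, False))
```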
Knowledge representation
The semantic networks of knowledge representation systems have been studied in various paradigms. One of the oldest paradigms is the leveling and sharpening of stories as they are repeated from memory, studied by Bartlett. The semantic differential used factor analysis to determine the main meanings of words, finding that the ethical value of words is the first factor. More controlled experiments examine the categorical relationships of words in free recall. The hierarchical structure of words has been explicitly mapped in George Miller's WordNet. More dynamic models of semantic networks have been created and tested with computational systems such as neural networks, latent semantic analysis (LSA), Bayesian analysis, and multidimensional factor analysis. The meanings of words are studied by all the disciplines of cognitive science.[54]
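The hierarchical ("is-a") structure mapped in WordNet can be explored programmatically, for example through the NLTK interface to WordNet. The sketch below assumes the nltk package and its WordNet corpus are available; the choice of the word "dog" and of the first listed sense is purely illustrative.

```python
# Requires the NLTK package and its WordNet corpus
# (e.g. `pip install nltk`, then download the data as below).
import nltk
from nltk.corpus import wordnet as wn

nltk.download('wordnet', quiet=True)

# Walk up the hypernym ("is-a") hierarchy for one sense of "dog",
# showing the kind of hierarchical structure mapped in WordNet.
synset = wn.synsets('dog')[0]
while synset:
    print(synset.name(), '-', synset.definition())
    hypernyms = synset.hypernyms()
    synset = hypernyms[0] if hypernyms else None
```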
Metacognition
Metacognition is an awareness of one's thought processes and an understanding of the patterns behind them. The term comes from the root word meta, meaning "beyond", or "on top of".[55] Metacognition can take many forms, such as reflecting on one's ways of thinking, and knowing when and how oneself and others use particular strategies for problem-solving.[55][56] There are generally two components of metacognition: (1) cognitive conceptions and (2) cognitive regulation system.[57][58] Research has shown that both components of metacognition play key roles in metaconceptual knowledge and learning.[59][60][58] Metamemory, defined as knowing about memory and mnemonic strategies, is an important aspect of metacognition.[61]
Writings on metacognition date back at least as far as two works by the Greek philosopher Aristotle (384–322 BC): On the Soul and the Parva Naturalia.[62]
Improving cognition
Physical exercise
Aerobic and anaerobic exercise have been studied for their effects on cognition.[63] Some studies report short-term increases in attention span and in verbal and visual memory. However, the effects are transient and diminish over time after cessation of the physical activity.[64] People with Parkinson's disease have also shown improved cognition while cycling, particularly when cycling is paired with other cognitive tasks.[65]
Dietary supplements
Studies evaluating phytoestrogen, blueberry supplementation, and antioxidants showed minor increases in cognitive function after supplementation but no significant effects compared to placebo.[66][67][68] Another study on the effects of herbal and dietary supplements on cognition in menopause showed that soy and Ginkgo biloba supplementation could improve women's cognition.[69]
Pleasurable social stimulation
Exposing individuals with cognitive impairment (e.g., dementia) to daily activities designed to stimulate thinking and memory in a social setting seems to improve cognition. Although the studies are small and larger studies are needed to confirm the results, the effect of social cognitive stimulation seems to be larger than the effects of some drug treatments.[70]
Other methods
Transcranial magnetic stimulation (TMS) has been shown to improve cognition in individuals without dementia one month after a treatment session compared to before treatment, although the effect was not significantly larger than placebo.[71] Computerized cognitive training, which uses a computer-based training regimen for different cognitive functions, has been examined in clinical settings, but no lasting effects have been shown.[72]
See also
- Cognitive Abilities Screening Instrument
- Cognitive biology
- Cognitive computing
- Cognitive holding power
- Cognitive liberty
- Cognitive musicology
- Cognitive psychology
- Cognitive science
- Cognitivism
- Comparative cognition
- Embodied cognition
- Cognitive shuffle
- Information processing technology and aging
- Mental chronometry – i.e., the measuring of cognitive processing speed
- Nootropic
- Outline of human intelligence – a list of traits, capacities, models, and research fields of human intelligence, and more.
- Outline of thought – a list that identifies many types of thoughts, types of thinking, aspects of thought, related fields, and more.
- Shared intentionality
References
Further reading
External links