From Wikipedia, the free encyclopedia
Musical literacy is the reading, writing, and playing of music, as well as an understanding of cultural practice and historical and social contexts.
Music literacy and music education are frequently discussed relationally and causally; however, they are not interchangeable terms, as complete musical literacy also concerns an understanding of the diverse practices involved in music pedagogy and their impact on literacy. Even then, there are those who argue[1] against a relational and causal link between music education and literacy, instead advocating for a solely interactional relationship between social characteristics and music styles. "Musical communications, like verbal ones, must be put in the right contexts by receivers, if their meanings are to come through unobscured,"[2] which is why the pedagogical influence of teaching an individual to become musically literate might be confused with overarching ‘literacy’ itself.
‘Musical literacy’ is likewise not to be confused with ‘music theory’ or ‘musicology.’ These two components are aspects of music education that ultimately act as a means to the end of achieving such literacy. Even then, many scholars[3] debate the relevance of these educational elements to musical literacy at all. The term ‘musicality’ is, again, distinct from the concept of ‘musical literacy,’ as the way in which a musician expresses emotions through performance is not indicative of their music-reading ability.[4]
Given that musical literacy involves mechanical and descriptive processes (such as reading, writing, and playing), as well as a broader cultural understanding of both historical and contemporary practice (i.e. musical interpretation while listening and/or playing), education in these visual, reading/writing, auditory, and kinesthetic areas can work in tandem to achieve literacy as a whole.
Understanding of what the term ‘musical literacy’ encompasses has developed over time as scholars invest in research and debate. A brief timeline — as collated by Csikos & Dohany (2016)[5] — is as follows:
Scholars such as Waller (2010)[13] also delve further into distinguishing the relational benefit of different mechanical processes, stating that "reading and writing are necessary concurrent processes".[14] The experience of learning how to "read to write and write to read"[15] allows students to become both a consumer and producer where "the music was given back to them to form their own musical ideas, as full participants in their musical development".[16]
The mechanical and factual elements of musical literacy can be taught in an educational environment with ‘music theory’ and ‘musicology,’ in order to use these "certain bits of articulate information... [to] trigger or activate the right perceptual sets and interpretive frameworks".[17] The descriptive nature of both teaching how to read and write standard Western notation (i.e. music theory),[18] and reading about the social, political, and historical contexts in which the music was written, as well as the ways in which it was practiced/performed (i.e. musicology),[19] constitutes the visual and reading/writing approaches to learning. While the "factual knowledge and ability components are developed culturally, within a given social context",[20] signs and symbols on printed sheet music are also used for ‘symbolic interaction,’[21] "which enable [the musician] to understand [broader musical] discourse".[22] Asmus Jr. (2004)[23] proposes that "most educators would agree that the ability to perform from musical notation is paramount",[24] and that the only way to become a "better music reader is to read music".[25]
Auditory learning is equally important, however — if not more so, as claimed by Herbst, de Wet & Rijsdijk (2005)[26] — as "neither the ‘extramusical’ nor the ‘purely musical’ content of [any piece of] music can come across for a listener who brings nothing to it from [their] previous experience of related music and of the world".[27] Listening is "through and through contextual: for the music to be heard or experienced is for it to be related to — brought in some fashion into juxtaposition with — patterns, norms, phenomena, facts, lying outside the specific music itself".[28] Auditory-oriented education teaches comprehensive listening and aural perception against the "backdrop of a host of norms associated with the style, genre, and period categories, and the individual compositional corpus".[29] This frames "appropriate reactions and registerings on the order of tension and release, or expectation and fulfillment, or implication and realization during the course of the music[al piece]".[30] It is in this area that conventional classroom education often fails the individual in their acquisition of complete musical literacy: not only have "researchers pointed out that children coming to school do not have the foundational aural experiences with music to the extent that they have had with language",[31] but the "exclusive concentration on reading [and thus lack of listening] has held back the progress of countless learners, while putting many others off completely".[32] It is in this regard that musical literacy operates independently of music education: while the quality of education affects the outcome of an individual's literacy, it does not define it.
Furthermore, the kinesthetic aspect of music education plays a role in the achievement of musical literacy, as "human interaction is mediated by the use of symbols, by interpretation, [and] by ascertaining the meaning of one another’s actions".[33] "The different ways human emotions embody themselves, in gesture and stance, sets of cultural associations carried by particular rhythms, motifs, timbres, and instruments [and] aspects of a composer’s life, work, and setting"[34] form both the musician's understanding of a work's historical context, as well as any new meaning attached to it by its recontextualization in their contemporary musical settings and practices.
These aspects of musical literacy development coalesce into various educational practices that approach these types of visual, auditory, reading/writing, and kinesthetic learning in different ways. Unfortunately, "fluent music literacy is a rarely acquired ability in Western culture"[35] as "many children are failed by the ways in which they are taught to read music".[36] As such, many scholars debate over the best way to approach musical pedagogy.
For many scholars, the acquisition of aural skills prior to learning the conventions of print music — a ‘sound before symbol’[37] approach — serves as the "basis for making musical meaning".[38] Much like pedagogical approaches in language development, Mills & McPherson (2015)[39] observe that "children should become competent with spoken verbal language [i.e. aural skills] before they grapple with written verbal language [i.e. visual/written notation skills]".[40] Others find a ‘language- and speech-based’ approach more effective, but only "after the basic structure and vocabulary of the language has first been established".[41] Gudmundsdottir[42] recommends that the "age of students should be considered when choosing a method for teaching"[43] given the changing receptiveness of a developing brain.
In-field research collated by Gudmundsdottir[44] on this topic notes that:
Moreover, Mills & McPherson[47] conclude that:
Burton[49] found "play-based orientation... appeal[ed] to the natural way children learn[ed]",[50] and that the process of learning how to read, write, and play/verbalise music paralleled the process of learning language.[51] Creating an outlet for the energy of children while using the conceptual framework of other school classes to develop their understanding of print music appears to enrich all areas of brain development.[52] As such, Koopman (1996)[53] is of the opinion that "[the] rich musical experience alone justifies the teaching of music at schools".[54]
Stewart, Walsh & Frith (2004)[55] state that "music reading is an automatic process in trained musicians"[56] whereby the speed of information and psychomotor processing occurs at a high level (Kopiez, Weihs, Ligges & Lee, 2006).[57] The coding of visual information, motor responses, and visual-motor integration[58] make up several processes that occur both dependently and independently of one another; while "the ability to play by ear may have a moderate positive correlation to music reading abilities",[59] studies also demonstrate that concepts of pitch and timing are perceived separately.[60]
The development of pitch recognition also varies within itself depending on the context of the music and what mechanical skills an instrument or setting may require. Gudmundsdottir[61] references Fine, Berry & Rosner[62] when she notes that "successful music reading on an instrument does not necessarily require internal representations of pitch as sight-singing does"[63] and proficiency in one area does not guarantee skill in the other. The ability to link the sound of a note with its printed notation counterpart is a cornerstone in highly developed musical readers[64] and allows them to ‘read ahead’ when ‘sight-reading’ a piece due to such aural recollections.[65] Less-developed readers — or, "button pushers"[66] — contrastingly over-rely on the visual-mechanical processes of musical literacy (i.e., "going directly from the visual image to the fingering required [on the instrument]"),[67] rather than an inclusive auditory/cultural understanding (i.e. how to also listen to and interpret music in addition to the mechanical processes). While musically literate and -illiterate individuals may be equally able to identify singular notes, "the experts outperform the novices in their ability to identify a group of pitches as a particular chord or scale... and instantly translate that knowledge into a motor output".[68]
Contrastingly, "rhythm production is [universally] difficult without auditory coding"[69] as all musicians "rely on internal mental representations of musical metre [and temporal events] as they perform".[70] In the context of reading and writing music in the school classroom, Burton[71] saw that "[students] were making their own sense of rhythm in print"[72] and would self-correct when they realised that their aural perception of a rhythmic pattern did not match what they had transcribed on the manuscript.[73] Shehan (1987)[74] notes that successful strategies for teaching rhythm — much like pitch — benefit from the teachings of language literacy, as "written patterns... associated with aural labels in the form of speech cues... [tend] to be a successful strategy for teaching rhythm reading".[75]
Scholars Mills & McPherson[76] identified stages of development in reading music notation and recommend matching the pedagogical approach to the stage best received by the neurological development/age of a student. For instance, encouraging young beginners to invent their own visual representations of pieces they know aurally provides them with the "metamusical awareness that will enhance their progress toward understanding why staff notation looks and works the way it does".[77] Similarly, for children younger than six years old, translating prior aural knowledge of melodies into fingerings on an instrument (i.e. kinesthetic learning) sets the foundation for introducing visual notation later and maintains the ‘fun’ element of developing musical literacy.[78]
These stages of development in reading music notation are outlined by Mills & McPherson[79] as follows:
There are various schools of thought/pedagogy that translate these principles into practical teaching methods. Many of these pedagogical approaches also attempt to simultaneously address the "deficiency in research that considers the ability to read and write music with musical comprehension [i.e. cultural-historical knowledge in the context of visual and auditory learning] as a developmental domain".[81] One of the most well-known teaching frameworks is the ‘Kodály Method’.
Zoltán Kodály claims that there are four fundamental aspects to a musician that must develop both simultaneously and at the same rate in order to achieve fluent musical literacy; "(1) a well-trained ear, (2) a well-trained intellect, (3) a well-trained heart (aesthetic/emotional understanding), and (4) well-trained hands (technique)".[82] He was one of the first educators to claim that music literacy involved "the ability to read and write musical notation and to read notation at sight without the aid of an instrument... [as well as] a person’s knowledge of and appreciation for a wide range of musical examples and styles".[83]
Kodály's education techniques utilise elements from language and the educational structure of language development to complement pedagogical efforts in the field of musical literacy development. In rhythm, the Kodály method assigns ‘names’ — originally adapted from the Galin-Paris-Chevé system (a French time-name system pioneered by Galin, Paris, and Chevé) — to beat values, correlating the number of beats in a note to the number of syllables in its respective name.
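The beat-to-syllable principle can be sketched as follows. This is a hedged illustration using commonly cited Kodály-style rhythm names (e.g. ‘ta’ for a one-beat note); the article does not specify the exact names, which vary between traditions.

```python
# Illustrative only: common Kodály-style rhythm names paired with beat
# values, showing the principle that a note's beat count matches the
# syllable count of its name.
RHYTHM_NAMES = {
    'crotchet (quarter note)': (1, 'ta'),
    'minim (half note)':       (2, 'ta-a'),
    'semibreve (whole note)':  (4, 'ta-a-a-a'),
}

for note, (beats, name) in RHYTHM_NAMES.items():
    syllables = name.count('-') + 1  # each hyphen-separated part is one syllable
    assert syllables == beats, note
print('beat counts match syllable counts')
```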
Analogous to rhythm, the Kodály method uses syllables to represent the sounds of notes in a scale as a mnemonic device to train singers. This technique was adapted from the teachings of Guido d’Arezzo, an 11th-century monk, who used the tones ‘Ut, Re, Mi, Fa, So, La’ from the ‘Hymn to St. John’ (International Kodály Society, 2014) as syllabic representations of pitch. The idea in contemporary Kodály teaching is that each successive pitch in a musical scale is assigned a syllable:
Scale degree:  1    2    3    4    5    6    7    8/1
Syllable:      Doh  Ray  Me   Fah  Soh  Lah  Te   Doh
This can be applied in ‘Absolute’ (or ‘Fixed-Doh’) form — also known as ‘Solfège’ — where ‘Doh’ is always the pitch C, or in ‘Relative’ (or ‘Movable-Doh’) form — also known as ‘Solfa’ — where ‘Doh’ starts on the first pitch of the scale (i.e. for A Major, ‘Doh’ is ‘A’; for G Major, ‘Doh’ is ‘G’; for E Major, ‘Doh’ is ‘E’; and so on).
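The ‘Movable-Doh’ idea above can be expressed as a minimal sketch: the solfa syllables are bound to scale degrees, not to fixed pitches, so the same labels transfer to any major key. Note spelling is simplified here to sharps only, and the function name is purely illustrative.

```python
# Minimal 'Movable-Doh' sketch: pair each solfa syllable with its note
# name in a chosen major key. Sharps-only spelling, for simplicity.
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F',
              'F#', 'G', 'G#', 'A', 'A#', 'B']
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11, 12]  # semitone offsets of a major scale
SOLFA = ['Doh', 'Ray', 'Me', 'Fah', 'Soh', 'Lah', 'Te', 'Doh']

def movable_doh(tonic):
    """Pair each solfa syllable with its note name in the given major key."""
    root = NOTE_NAMES.index(tonic)
    return [(syl, NOTE_NAMES[(root + step) % 12])
            for syl, step in zip(SOLFA, MAJOR_STEPS)]

# 'Doh' lands on the tonic of whichever key is chosen:
print(movable_doh('G')[0])  # ('Doh', 'G')
print(movable_doh('A')[0])  # ('Doh', 'A')
```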
The work of Sarah Glover, continued by John Curwen, throughout England in the 19th century meant that ‘Moveable-Doh’ solfa became the "favoured pedagogical tool to teach singers to read music".[84] In addition to the auditory-linguistic aid of syllable-to-pitch, Curwen also introduced a kinesthetic element whereby a different hand sign was applied to each tone of the scale.
Csikos & Dohany[85] affirm the popularity of the Kodály method over history and cite Barkoczi & Pleh[86] and Hallam[87] on "the powerfulness of the Kodály method in Hungary... [and] abroad"[88] in the context of achieving musical literacy within the school curriculum.
Methods such as Kodály's, however — which rely on sound to inform the visual element of conventional staff notation — fall short for learners who are unable to see. "No matter how brilliant the ear and how good the memory, literacy is essential for the blind student too",[89] and unfortunately conventional staff notation fails to cater to visually-impaired needs.
To rectify this, enlarged print music or Braille music scores can be supplied to low vision, legally blind, and totally blind individuals so that they can replace the visual aspect of learning with a tactile one (i.e. enhanced kinesthetic learning). According to Conn,[90] in order for blind students to "fully develop their aural skills... fully participate in music... become an independent and lifelong learner... have a chance to completely analyse the music... make full use of [their] own interpretive ability... share [their] composition... [and] gain employment/career path",[91] they must learn how to read and write Braille music.
As in sighted education, teaching students how to read language in braille parallels the teaching of braille musical literacy. Toussaint & Tiger[92] cite Mangold[93] and Crawford & Elliott[94] on the "novel relation between the tactile stimulus (i.e., a braille symbol) and an auditory or vocal stimulus (i.e., the spoken letter name)".[95] This mirrors Kodály's visual (i.e., conventional staff notation) to auditory (i.e., similarly, the spoken letter name) approach in music education.
Despite the pedagogical similarities, however, Braille music literacy is far lower than musical literacy in sighted individuals. In this sense, the comparative percentages of sighted versus blind individuals who are literate in language versus music follow a similar trajectory — for instance, language literacy for sighted year five school students in Australia is at 93.9%[96] compared to a 6.55% rate of HSC students studying music,[97] while language literacy for blind individuals is at approximately 12%.[98] Ianuzzi[99] comments on this disparity when she asks, "How much music would students learn to play if their music teachers couldn't read the notes? Unfortunately, not very many teachers of blind children are fluent in reading and writing Braille themselves."[100]
Although the core of musical literacy is arguably in relation to "extensive and repeated listening",[101] "there is still a need for explicit theories of music reading that would organise knowledge and research about music reading into a system of assumptions, principles, and procedures"[102] that would benefit poor-to-zero vision individuals. It is via the fundamental elements of reading literacy and "the ability to understand the majority of... utterances in a given tradition"[103] that musical literacy can be achieved.[104]
Nonetheless, sighted or not, the various teaching methods and learning approaches required to achieve musical literacy evidence the spectrum of psychological, neurological, multi-sensory, and motor skills functioning within an individual when they come into contact with music. Many fMRI studies have correspondingly demonstrated the impact of music and advanced music literacy on brain development.
Both the processing of music and performance with musical instruments require the involvement of both hemispheres of the brain.[105] Structural differences (i.e. increased grey matter) are found in the brain regions of musical individuals which are both directly linked to musical skills learned during instrumental training (e.g. independent fine motor skills in both hands, auditory discrimination of pitch), and also indirectly linked with improvements in language and mathematical skills.[106]
Many studies demonstrate that "music can have constructive outcomes on our mindsets that may make learning simpler".[107] For instance, young children exhibited a 46% increase in spatial IQ — essential for higher mind capacities involving complex arithmetic and science — after developing aspects of their musical literacy.[108] Such mathematical skills are enhanced in the brain due to the spatial training involved in learning music notation because "understanding rhythmic notation actually requires math-specific skills, such as pattern recognition and an understanding of proportion, ratio, fractions, and subdivision [of note values]".[109]
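The claim about "proportion, ratio, fractions, and subdivision [of note values]" can be made concrete with a small worked example: note values subdivide by halves, and a dot extends a note by half of its own value.

```python
# Illustration of the fraction arithmetic embedded in rhythmic notation.
from fractions import Fraction

crotchet = Fraction(1, 4)                    # quarter note
quaver = crotchet / 2                        # subdivision: eighth note
dotted_crotchet = crotchet * Fraction(3, 2)  # dot adds half the note's value

assert quaver == Fraction(1, 8)
assert dotted_crotchet == Fraction(3, 8)
assert 2 * quaver == crotchet  # two quavers fill one crotchet's time
```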
Superior "dialect capacity, including vocabulary, expressiveness, and simplicity of correspondence"[110] can also be seen in musically literate individuals. This is due to "both music and language processing requir[ing] the ability to segment streams of sound into small perceptual units".[111] Research confirms the relationship between musical literacy and reading and reasoning,[112] as well as non-cognitive skills such as leisure and emotional development,[113] coordination and innovativeness, attention and focus, memory, creativity, self-confidence, and empathetic interpersonal relationships.[114] Due to these various factors and impacts, Williams (1987)[115] is of the opinion that "[musical] literacy gives dignity as well as competence, and is of the utmost importance to self-image and success... [it gives the] great joy of learning... the thrill of participation, and the satisfaction of informed listening".[116]