Audio compact cassettes use magnetic tape of three major types which differ in fundamental magnetic properties, the level of bias applied during recording, and the optimal time constant of replay equalization. Specifications of each type were set in 1979 by the International Electrotechnical Commission (IEC): Type I (IEC I, 'ferric' or 'normal' tapes), Type II (IEC II, or 'chrome' tapes), Type III (IEC III, ferrichrome or ferrochrome), and Type IV (IEC IV, or 'metal' tapes). 'Type 0' was a non-standard designation for early compact cassettes that did not conform to IEC specification.

[Figure: Differences in tape colour of the most common formulations, top to bottom: ferric, ferricobalt, chromium dioxide and metal particle cassettes]
[Figure: Standardized notches for automatic tape selection, top to bottom: Type I (and Type III), Type II, Type IV]

By the time the specifications were introduced, Type I included pure gamma ferric oxide formulations, Type II included ferricobalt and chromium(IV) oxide formulations, and Type IV included metal particle tapes—the best-performing, but also the most expensive. Double-layer Type III tape formulations, advanced by Sony and BASF in the 1970s, never gained substantial market presence.

In the 1980s the lines between the three surviving types blurred. Panasonic developed evaporated metal tapes that could be made to match any of them; metal particle tapes migrated to Type II and Type I, and ferricobalt formulations migrated to Type I. By the end of the decade the performance of the best Type I ferricobalt tapes (superferrics) approached that of Type IV tapes, while the performance of entry-level Type I tapes kept improving until the very end of compact cassette production.[1]

Specifications

[Figure: Coercivity and remanence marked on the wrapper of a 'professional' cassette (TDK AM, ca. 1995); these are typical values for a microferric cassette]
[Figure: MOL, SOL, bias noise and dynamic range marked on the wrapper of a superferric cassette (TDK AR, 1990s); the values approach the limits of ferric tape technology]
[Figure: Frequency response curves of a typical cassette tape showing the effects of different bias settings (after Roberson[2])]

Magnetic properties

Magnetic recording relies on the use of hard ferrimagnetic or ferromagnetic materials. These require strong external magnetic fields to be magnetized, and retain substantial residual magnetization after the magnetizing field is removed.[3] Two fundamental magnetic properties, relevant for audio recording, are:

  • Saturation remanence limits maximum output level and, indirectly, dynamic range of audio recordings.[4] Remanence of audio tapes, referred to quarter-inch tape width, varies from around 1100 G for basic ferric tapes to 3500 G for Type IV tapes;[5] advertised remanence of the 1986 JVC Type IV cassette reached 4800 G.[6]
  • Coercivity is a measure of the external magnetic flux required to magnetize the tape, and an indicator of the necessary bias level. The coercivity of audio tapes varies from 350 Oe to 1200 Oe. High-coercivity particles are more difficult to erase, bias and record, but also less prone to high-frequency losses during recording, and to external interference and self-demagnetization during storage.[5][7][8]

A useful figure of merit of tape technology is the squareness ratio of the hysteresis curve.[9] It is an indicator of tape uniformity and its linearity in analogue recording.[9] An increase in the squareness ratio defers the onset of compression and distortion, and allows fuller utilization of the tape's dynamic range within the limits of remanence.[9][10] The squareness ratio of basic ferric tapes rarely exceeds 0.75, and the squareness ratio of the best tapes exceeds 0.9.[9]
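In standard notation (a textbook definition, not taken from the cited sources), the squareness ratio S compares the remanent magnetization left after saturation, M_r, with the saturation magnetization M_s:

$$ S = \frac{M_r}{M_s}, \qquad 0 < S < 1. $$

A basic ferric tape with S ≈ 0.75 thus retains about three quarters of its saturation magnetization once the magnetizing field is removed, while the best tapes retain more than nine tenths.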

Electromagnetic properties

[Figure: Hysteresis curves of Type I, CrO2 Type II and Type IV tapes.[11] The vertical span is remanence (remaining magnetism), a rough indicator of maximum recording output level; the horizontal span shows coercivity, the flux required to magnetize the tape]

Manufacturers of bulk tape provided extremely detailed technical descriptions of their product, with numerous charts and dozens of numeric parameters. From the end user viewpoint, the most important electromagnetic properties of the tape are:

  • Maximum output levels, usually specified in dB relative to the nominal zero reference level of 250 nWb/m or the 'Dolby level' of 200 nWb/m (a worked example of this level arithmetic follows the list). Often incorrectly called recording levels, these are always expressed in terms of the tape's output, thus taking its sensitivity out of the equation. Performance at low and middle frequencies and at treble frequencies was traditionally characterized by two related but different parameters:
    • Maximum output level (MOL) is relevant at low and middle frequencies. It is usually specified at 315 Hz (MOL315) or 400 Hz (MOL400), and marks the point where the third-harmonic distortion reaches 3%.[12] Further magnetization of the tape is technically possible, but at the cost of unacceptable compression and distortion. For all types of tape, MOL peaks in the 125–800 Hz region and drops off below 125 Hz and above 800 Hz.[13] The maximum output of Type I tape at 40 Hz is 3–5 dB lower than MOL400,[14] while in Type IV tapes it is 6–7 dB lower.[15] As a result, ferric tapes handle bass-heavy music with apparent ease compared to expensive metal tapes. Double-layer Type III ferrichrome formulations were supposed to allow bass frequencies to be recorded deeper into the ferric layer, while keeping the high frequencies in the upper chromium dioxide layer.
    • At treble frequencies the playback head cannot reliably reproduce harmonics of the recorded signal.[16] This makes distortion measurements impossible; instead of MOL, high-frequency performance is characterized by the saturation output level (SOL), usually specified at 10 kHz (SOL10k).[16] Once the tape reaches the saturation point, any further increase in recording flux actually decreases output below the SOL.[16]
  • Noise level, usually understood as bias noise (hiss) of a tape recorded with zero input signal, replayed without noise reduction, A-weighted and referred to the same level as MOL and SOL. The difference between bias noise and the noise of virgin tape is an indicator of tape uniformity. Another important but rarely quantified type of noise is modulation noise, which appears only in the presence of a recorded signal, and which cannot be reduced by Dolby or dbx noise reduction systems.[17]
  • Dynamic range, or signal-to-noise ratio, was usually understood as the ratio between MOL and the A-weighted bias noise level.[16][18] High fidelity audio requires a dynamic range of at least 60–65 dB; the best cassette tapes reached this threshold in the 1980s, at least partially eliminating the need for noise reduction systems. Dynamic range is the most important property of the tape. The higher the dynamic range of the music, the more demanding it is of tape quality; conversely, heavily compressed music sources can do well even with basic, inexpensive tapes.[8]
  • Sensitivity of the tape, referred to that of an IEC reference tape and expressed in dB, was usually measured at 315 Hz and 10 kHz.[19]
  • Stability of playback in time. Low-quality or damaged cassette tape is notoriously prone to signal dropouts, which are absolutely unacceptable in high fidelity audio.[19] For high quality tapes, playback stability is sometimes lumped together with modulation noise and wow and flutter into an integral smoothness parameter.[20]
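As a small worked illustration of the level arithmetic above (with illustrative numbers, not measurements of any particular tape): tape flux is an amplitude-like quantity, so levels in dB follow the 20·log10 rule, and dynamic range is simply MOL minus the A-weighted noise floor when both are referred to the same reference.

```python
from math import log10

def db_re(flux_nwb_per_m: float, ref_nwb_per_m: float = 250.0) -> float:
    """Express a tape flux level in dB relative to a reference flux.

    Flux is a linear (amplitude-like) quantity, so the 20*log10 ratio applies.
    """
    return 20 * log10(flux_nwb_per_m / ref_nwb_per_m)

# The 'Dolby level' of 200 nWb/m sits about 1.9 dB below the
# nominal zero reference of 250 nWb/m:
print(f"Dolby level re 250 nWb/m: {db_re(200.0):+.1f} dB")   # -> -1.9 dB

# Dynamic range as described above: MOL minus A-weighted bias noise,
# both referred to the same reference level (illustrative values).
mol_db = +4.4     # e.g. midrange MOL of a good Type II tape, dB re Dolby level
noise_db = -56.5  # A-weighted bias noise, dB re Dolby level
print(f"Dynamic range: {mol_db - noise_db:.1f} dB")          # -> 60.9 dB
```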

Frequency range, per se, is usually unimportant. At low recording levels (−20 dB referred to nominal level) all quality tapes can reliably reproduce frequencies from 30 Hz to 16 kHz, which is sufficient for high fidelity audio.[16] However, at high recording levels the treble output is further limited by saturation. At the Dolby recording level the upper frequency limit shrinks to a value between 8 kHz for a typical chromium dioxide tape, and 12 kHz for metal tapes; for chromium dioxide tapes, this is partially offset by lower hiss levels.[16] In practice, the extent of the high-level frequency range is not as important as the smoothness of the midrange and treble frequency response.[19]

Standards

[Figure: A mark on a prerecorded chromium dioxide cassette intended for replay as Type I (Decca Records, 1980s)]
[Figure: IEC I compatibility logo (BASF, 1981). These logos, advanced by BASF and the IEC, did not catch on and were soon abandoned]

The original specification for the Compact Cassette was set by Philips in 1962–1963. Of the three then-available tape formulations that matched the company's requirements, the BASF PES-18 tape became the original reference.[21] Other chemical companies followed with tapes of varying quality, often incompatible with the BASF reference. By 1970, a new, improved generation of tapes had firmly established itself on the market and became the de facto reference for aligning tape recorders, so the compatibility problem worsened even further.[21] In 1971 it was tackled by the Deutsches Institut für Normung (DIN), which set the standard for chromium dioxide tapes. In 1978 the International Electrotechnical Commission (IEC) enacted a comprehensive standard on cassette tapes (IEC 60094), and one year later mandated the use of notches for automatic tape type recognition.[21] Since then, the four cassette tape types have been known as IEC I, IEC II, IEC III and IEC IV.[21] The numerals follow the historical sequence in which these tape types were commercialized, and do not imply relative quality or intended purpose.[22]

An integral part of the IEC 60094 standard family is the set of four IEC reference tapes. Type I and Type II reference tapes were manufactured by BASF, Type III reference tapes by Sony, and Type IV reference tapes by TDK.[23] Unlike consumer tapes, which were manufactured continuously over the years, each reference tape was made in a single production batch by the IEC-approved factory.[23][19] These batches were made large enough to meet industry demand for many years.[23] A second run was impossible because chemists were unable to replicate the reference formulation with sufficient precision.[23] From time to time the IEC revised the set of references; the final revision took place in April 1994.[19] The choice of reference tapes, and the role of the IEC in general, has been debated. Meinrad Liebert, designer of Studer and Revox cassette decks, criticized the IEC for failing to enforce the standards and for lagging behind the constantly changing market.[24] In 1987, Liebert wrote that while the market had clearly branched into distinct, incompatible "premium" and "budget" subtypes, the IEC tried in vain to select an elusive "market average"; meanwhile, the industry moved forward, disregarding the outdated references.[24] This, according to Liebert, explained the sudden demand for built-in tape calibration tools, which were almost unheard-of in the 1970s.[24]

From the end user viewpoint, the IEC 60094 defined two principal properties of each tape type:

  • Bias level for each type was set equal to the optimal bias of the relevant IEC reference tape, and sometimes changed when the IEC changed the reference tapes (though the BASF datasheet for the Y348M tape, approved as the IEC Type I reference in 1994, states that its optimal bias is exactly 0.0 dB from the previous reference, the BASF R723DG). The IEC defines reference bias as follows: using the relevant IEC reference tape and heads according to Ref. 1.1, the bias current providing the minimum third harmonic distortion ratio for a 1 kHz signal recorded at the reference level is the reference bias setting. Type II bias ('high bias') equals around 150% of Type I bias, and Type IV bias ('metal bias') around 250% of Type I bias.[25] Real cassette tapes invariably deviate from the references and require fine tuning of bias; recording a tape with improper bias increases distortion and alters frequency response.[26] A 1990 comparative test of 35 Type I tapes showed that their optimal bias levels were within 1 dB of the Type I reference, while Type IV tapes deviated from the Type IV reference by up to 3 dB.[27] Typical cassette deck frequency response curves showing the effects of different bias settings are provided in the relevant figure.
  • Time constant of replay equalization (often shortened to EQ) for Type I tapes equals 120 μs, as per the Philips specification. The time constant for Type II, III and IV tapes is set at a lower value of 70 μs. The purpose of replay equalization is to compensate for high-frequency losses during recording,[28] which, in the case of ferric cassettes, usually begin at around 1–1.5 kHz. The choice of time constant is a somewhat arbitrary decision, seeking the best combination of conflicting parameters: extended treble response, maximum output, minimum noise and minimum distortion.[29] High-frequency roll-off that is not fully compensated in the replay channel may be offset by pre-emphasis during recording.[29] Lower replay time constants decrease the apparent level of hiss (by about 4 dB when stepping down from 120 to 70 μs; see the numeric sketch after this list), but also decrease the apparent high-frequency saturation level, so the choice of time constants was a matter of compromise and debate.[30] "Hard" maximum and saturation levels, in terms of the voltage output of the playback head, remain unchanged; however, the high-frequency voltage level at the output of the replay equalizer decreases with a decrease in time constant.[citation needed] The industry and the IEC decided that it would be safe to decrease the time constant of Type II, III and IV tapes to 70 μs, because they are less prone to high-frequency saturation than contemporary ferric tapes.[29] Many disagreed, arguing that the risk of saturation at 70 μs was unacceptably high.[31] Nakamichi and Studer complied with the IEC, but provided an option for playing Type II and Type IV tapes using the 120 μs setting, with matching pre-emphasis filters in the recording path. A similar pre-emphasis was applied by duplicators of prerecorded chromium dioxide cassettes; although loaded with Type II tape, these cassettes were packaged in Type I shells and were intended to be replayed as Type I tapes.[8]
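The effect of the replay time constant on apparent hiss can be sketched numerically. The model below is a deliberate simplification (a single-zero treble characteristic; the 3180 μs bass time constant and real head losses are ignored), not a full IEC equalizer:

```python
import math

def replay_treble_boost_db(f_hz: float, tau_s: float) -> float:
    """Treble boost of an idealized replay equalizer, relative to midband,
    for a given time constant tau (corner frequency 1/(2*pi*tau))."""
    x = 2 * math.pi * f_hz * tau_s
    return 10 * math.log10(1 + x * x)

# How much quieter hiss replays with the 70 us setting than with 120 us:
for f in (2_000, 5_000, 10_000, 16_000):
    delta = replay_treble_boost_db(f, 120e-6) - replay_treble_boost_db(f, 70e-6)
    print(f"{f:>6d} Hz: {delta:.1f} dB less treble boost at 70 us")

# The difference approaches 20*log10(120/70), about 4.7 dB, at high
# frequencies, consistent with the roughly 4 dB hiss reduction cited above.
```

The same mechanism also cuts the replayed high-frequency signal, which is why 70 μs recordings need more treble pre-emphasis and hence saturate the tape earlier.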

Type I

Type I, or IEC I, ferric or 'normal' cassettes were historically the first, the most common and the least expensive; they dominated the prerecorded cassette market.[8] The magnetic layer of a ferric tape consists of around 30% synthetic binder and 70% magnetic powder: acicular (oblong, needle-like) particles of gamma ferric oxide (γ-Fe2O3), 0.2 μm to 0.75 μm in length.[32] Each particle of this size contains a single magnetic domain.[33] The powder was, and still is, manufactured in bulk by chemical companies specializing in mineral pigments for the paint industry.[32] Ferric magnetic layers are brown in colour; the shade and intensity depend mostly on the size of the particles.

Type I tapes must be recorded with 'normal' (low) bias flux and replayed with a 120 μs time constant. Over time, ferric oxide technology developed continuously, with a new, superior generation emerging roughly every five years.[34] Cassettes of various periods and price points can be sorted into three distinct groups: basic coarse-grained tapes; advanced fine-grained, or microferric, tapes; and highest-grade ferricobalt tapes, with ferric oxide particles encapsulated in a thin layer of a cobalt-iron compound. Ferricobalt tapes are often called 'cobalt-doped', but this is historically incorrect: cobalt doping in the strict sense involves uniform substitution of iron atoms with cobalt.[35] That technology was tried for audio and failed, losing to chromium dioxide.[22] The industry instead chose the far more reliable and repeatable process of cobalt adsorption, the encapsulation of unmodified iron oxide particles in a thin layer of cobalt ferrite.[35]

The remanence and squareness properties of the three groups differ substantially, while coercivity remains almost unchanged at around 380 Oe (360 Oe for the IEC reference tape approved in 1979[36]). Quality Type I cassettes have a higher midrange MOL than most Type II tapes and a slow, gentle MOL roll-off at low frequencies, but less high-frequency headroom than Type II.[13] In practice, this means that ferric tapes have lower fidelity than chrome and metal tapes at high frequencies, but are often better at reproducing the low frequencies found in bass-heavy music.

Basic ferric

[Figure: Sony C60 compact cassette (1974)]

Entry-level ferric formulations are made of pure, unmodified, coarse-grained ferric oxide. The relatively large (up to 0.75 μm in length), irregularly shaped oxide particles have protruding branches, or dendrites; these irregularities prevent tight packing of the particles, reducing the iron content of the magnetic layer and, consequently, its remanence (1300–1400 G) and maximum output level.[37] The squareness ratio is low, around 0.75, resulting in an early but smooth onset of distortion.[37] These tapes, historically labeled and sold as 'low noise', have high levels of hiss and relatively low sensitivity; their optimal bias level is 1–2 dB lower than that of the IEC reference tape.

This group also includes most of the so-called 'Type 0' cassettes, a mixed bag of ferric tapes that meet neither the IEC standard nor the original Philips specification.[25][38] Historically, the informal 'Type 0' denoted early cassettes loaded with tape designed for reel-to-reel recorders.[25] In the 1980s, many otherwise decent and usable basic tapes were effectively demoted to 'Type 0' status when equipment manufacturers began aligning their decks for use with premium ferricobalts (which have much higher sensitivity and bias requirements).[38] In the 21st century, 'Type 0' denotes all sorts of low-quality, counterfeit or otherwise unusable cassettes. They require unusually low bias, and even then only a few of them perform on par with quality Type I tapes.[25] A 'Type 0' tape, if usable at all, is incompatible with Dolby noise reduction: with the Dolby decoder engaged, the tape sounds dull, because its poor sensitivity causes severe Dolby mistracking.[38]

Microferric

In the beginning of the 1970s, gradual technological improvements over the previous decade resulted in a second generation of Type I tapes. These tapes had uniformly needle-shaped, highly orientable particles (HOP) of much smaller size, around 0.25 μm in length, hence the trade term microferrics.[9] Their uniform shape allowed very dense packing, with less binder and more particles per unit volume,[9] and a corresponding rise in remanence to around 1600 G. The first microferric (TDK SD) was introduced in 1971, and in 1973 Pfizer began marketing a patented microferric powder that soon became an industry standard.[39] (Pfizer then had a strong mineral pigment division, with factories in California, Illinois and Indiana; in 1990 it sold its iron-oxide business to Harrisons & Crosfield of the United Kingdom.[40]) The next step was to align the needle-shaped particles in parallel with the flux lines generated by the recording head; this was done either by controlled flow of the liquid magnetic mix over the substrate (rheological orientation)[9] or by applying a strong magnetic field while the binder was curing.[41]

Typical microferric cassettes of the 1980s had less hiss and at least 2 dB higher MOL than basic Type I tapes, at the cost of increased print-through.[42] Noise and print-through are interrelated, and directly depend on the size of the oxide particles. A decrease in particle size invariably decreases noise and increases print-through. The worst combination of noise and print-through occurs in highly irregular formulations containing both unusually large and unusually small particles.[43] Small improvements continued for thirty years, with a gradual rise of the squareness ratio from 0.75 to over 0.9.[9][42] Newer tapes consistently produced higher output with less distortion at the same levels of bias and audio recording signals.[9] The transition was smooth; after the introduction of new, superior tape formulations, manufacturers often kept older ones in production, selling them in different markets or under different, cheaper designations. Thus, for example, TDK ensured that its premium microferric AD cassette was always ahead of the entry-level microferric D, with finer particles and lower noise.[44]

Ferricobalt Type I

The third, and best performing, class of ferric tapes is made of fine ferric particles encapsulated in a thin, 30 Å layer of a cobalt-iron mix similar in composition to cobalt ferrite.[45] The first cobalt-doped cassettes, introduced by 3M in 1971, had exceptionally high sensitivity and MOL for the period, and were an even match for contemporary chromium dioxide tapes,[46] hence the trade name superferrics. Of the many competing cobalt-doping technologies, the most widespread was low-temperature encapsulation of ferric oxide in an aqueous solution of cobalt salts with subsequent drying at 100–150 °C.[45][47] Encapsulated microferric particles retain their needle-like shape and can be tightly packed into uniform anisotropic layers.[45][47] The process was first commercialized in Japan in the early 1970s.[48]

The remanence of ferricobalt cassettes is around 1750 G, resulting in around a 4 dB gain in MOL and a 2–3 dB gain in sensitivity compared to basic Type I tapes; their hiss level is on par with contemporary microferric formulations. The dynamic range of the best ferricobalt cassettes (true superferrics) equals 60–63 dB, and their MOL at low frequencies exceeds that of Type IV tapes. Overall, superferrics are a good match for Type IV, especially for recording acoustic music with a wide dynamic range.[49][38] This was reflected in the price of top-of-the-line superferric tapes like Maxell XLI-S or TDK AR-X, which by 1992 matched the price of 'entry-level' metal tapes.

Type II

[Figure: The TDK KR (Krom) was the company's only chrome tape. In 1974–1975, as soon as its ferricobalt technology was in production, TDK discontinued chrome manufacture altogether]
[Figure: All Type II cassettes made by TDK after 1975 (SA, SA-X, SA-XS shown) were ferricobalts, not chromes]

IEC Type II tapes are intended for recording with high bias (150% of normal) and replay with the 70 μs time constant. All generations of Type II reference tapes, including the 1971 DIN reference that pre-dated the IEC standard, were manufactured by BASF. Type II has historically been known as 'chromium dioxide tape' or simply 'chrome tape', but in reality most Type II cassette tapes do not contain chromium.[50] The "pseudochromes" (including almost all Type IIs made by the Big Three Japanese makers: Maxell, Sony and TDK) are actually ferricobalt formulations optimized for Type II recording and playback settings.[50][51] A true chrome tape may have a distinctive 'old crayon' smell (more precisely, that of oil or wax crayons coloured with chromium-based pigments such as chrome yellow), which is absent in "pseudochromes". Both kinds of Type II tape have, on average, lower high-frequency MOL and SOL, and a higher signal-to-noise ratio, than quality Type I tapes.[52] This is caused by the midrange and treble pre-emphasis applied during recording to match the 70 μs equalization at playback.[52]

Chromium dioxide

In the mid-1960s, DuPont created and patented an industrial process for making fine ferromagnetic particles of chromium dioxide (CrO2). The first CrO2 tapes for data and video appeared in 1968.[41] In 1970, BASF, which would become the main proponent of CrO2, launched its chrome cassette production;[51] in the same year Advent introduced the first cassette deck with chrome capability and Dolby noise reduction. The combination of low-noise CrO2 tape with companding noise reduction brought a revolutionary improvement to compact-cassette sound reproduction, almost reaching the high fidelity level. However, CrO2 tape required a redesign of the bias and replay equalization circuitry. This problem was resolved during the 1970s,[53] but three unsolved issues remained: the cost of making CrO2 powder, the cost of royalties charged by DuPont, and the pollution caused by hexavalent chromium waste.[54][51]

The reference CrO2 tape, approved by the IEC in 1981, is characterized by a coercivity of 490 Oe (high bias) and remanence of 1650 G.[55][48] Retail CrO2 cassettes had coercivities in the range of 400 to 550 Oe.[56] Owing to the very 'clean', uniform shape of the particles, chrome tapes easily attain an almost perfect squareness ratio of 0.90.[48][57] 'True chromes', not modified by the addition of ferric additives or coatings, have very low and euphonic hiss (bias noise), and very low modulation noise at high frequencies.[58][8] Double-layer CrO2 cassettes have the lowest absolute noise of all audio formulations; these cassettes generate less noise at 4.76 cm/s than a ferric tape at 19.05 cm/s.[53] Their sensitivity is usually also very high, but MOL is low, on par with basic Type I tapes. CrO2 tape does not tolerate overload very well: the onset of distortion is sharp and dissonant, so recording levels should be set conservatively, well below MOL.[58] At low frequencies, the MOL of CrO2 tapes rolls off faster than that of ferric or metal tapes, hence the reputation of 'bass shyness'. CrO2 cassettes are best suited to recording dynamic music with rich harmonic content and relatively low bass levels;[58] their dynamic range also suits recording from uncompressed digital sources[34] and music with extended quiet passages.[8] Good ferric tapes may have the same or higher treble SOL, but CrO2 tapes still sound subjectively better owing to lower hiss and modulation noise.[59]

Ferricobalt Type II

[Figure: Sony Chrome compact cassette (1976)]
[Figure: BASF Chrome Extra II cassette (1988)]
[Figure: Frequency response and noise level of the Nakamichi SX Type II cassette tape, tested on a Nakamichi 600 two-head cassette deck]

After the introduction of CrO2 cassettes, Japanese companies began developing a royalty-free alternative to DuPont's patent, based on the already established cobalt doping process.[48] A controlled increase in cobalt content causes an almost linear increase in coercivity, so a Type II "pseudochrome" tape can be made by simply adding around 3% cobalt to a Type I ferricobalt tape.[35] By 1974 the technology was ready for mass production, and TDK and Maxell introduced their classic "pseudochromes" (TDK SA and Maxell UD-XL) while discontinuing their true chrome lines (TDK KR and Maxell CR). By 1976, ferricobalt formulations had taken over the video tape market,[60] and eventually they became the dominant high-performance tape for audio cassettes.[51] Chromium dioxide disappeared from the Japanese domestic market,[51] although chrome remained the tape of choice for high fidelity cassette duplication among the music labels. In consumer markets, chrome coexisted as a distant second with "pseudochromes" until the very end of the cassette era. Ferricobalt technology developed continuously: in the 1980s Japanese companies introduced 'premium' double-layered ferricobalts with exceptionally high MOL and SOL; in the mid-1990s TDK launched the first and only triple-coated ferricobalt, the SA-XS.[61][62]

The electromagnetic properties of Type II ferricobalts are very close to those of their Type I cousins. Owing to the use of 70 μs replay equalization, the hiss level is lower, but so is the treble saturation level. The dynamic range of Type II ferricobalts, according to 1990 tests, lies between 60 and 65 dB. The coercivity of 580–700 Oe and remanence of 1300–1550 G are close to those of the CrO2 reference tape, but the difference is big enough to cause compatibility problems.[50] TDK SA was the informal reference in Japan: TDK advertisements boasted that "more decks are aligned to SA than any other tape", and although there is very little first-hand information on which tapes were actually used at the factories (Japanese manufacturers provided lists of recommended tapes but did not disclose their reference tapes), enough indirect evidence converges on TDK SA. For example, in 1982, when Japanese-owned Harman Kardon sent samples for Dolby certification, they were aligned to the IEC CrO2 reference, yet production copies of the same models were aligned to TDK SA.[63] Since the Japanese already dominated both the cassette and hi-fi equipment markets, the incompatibility further undermined the market share of European-made cassette decks and CrO2 cassettes.[64] In 1987, the IEC resolved the compatibility issue by appointing a new Type II reference tape, U 564 W, a BASF ferricobalt with properties very close to contemporary TDK tapes. With the short-lived 1988 Reference Super, even BASF started manufacturing and selling Type II ferricobalt tapes.[65][66]

Metal particle Type II

The coercivity of an iron-cobalt metal particle mix, precipitated from aqueous solutions, depends on the cobalt content. A change in cobalt content from 0% to 30% causes a gradual rise in coercivity from around 400 Oe (Type I level) to 1300 Oe (Type IV level); alloyed iron-cobalt particles can reach a coercivity of 2200 Oe.[67] This makes it possible to manufacture metal particle tapes conforming to Type II and even Type I biasing requirements.[68]
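As a rough numeric illustration (assuming simple linearity between the endpoints quoted above, which the source describes only as an almost linear rise):

```python
def coercivity_oe(cobalt_fraction: float) -> float:
    """Linearly interpolate coercivity (Oe) of precipitated iron-cobalt
    particles from cobalt content, using the endpoints quoted above:
    0% cobalt -> ~400 Oe (Type I level), 30% -> ~1300 Oe (Type IV level).
    Illustrative only; the real relationship is only approximately linear."""
    if not 0.0 <= cobalt_fraction <= 0.30:
        raise ValueError("interpolation is only meaningful for 0-30% cobalt")
    return 400.0 + (1300.0 - 400.0) * cobalt_fraction / 0.30

print(coercivity_oe(0.03))  # ~490 Oe: about 3% cobalt lands near Type II
print(coercivity_oe(0.30))  # 1300 Oe: Type IV territory
```

Notably, the 3% point lands near the 490 Oe coercivity of the CrO2 reference tape, consistent with the earlier statement that a Type II "pseudochrome" can be made by adding around 3% cobalt to a Type I ferricobalt formulation.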

In practice, only Denon, Taiyo Yuden and, for only a few years, TDK ever attempted making Type II metal tape. These rare, expensive cassettes were characterized by high remanence approaching that of Type IV tapes (2600 G); their coercivity of 800 Oe was closer to Type II than to Type IV, but still quite far from either reference.[69] Independent tests of the 1990 Denon and Taiyo Yuden tapes placed them at the very top of the Type II spectrum, provided the recording deck could cope with their unusually high sensitivity and supply an unusually high bias current.[70]

Type III

[Figure: Agfa Type III cassette]

In 1973, Sony introduced double-layer ferrichrome tapes with a five-micron ferric base coated with one micron of CrO2 pigment.[71][51] The new cassettes were advertised as 'the best of both worlds', combining the good low-frequency MOL of microferric tapes with the good high-frequency performance of chrome tapes.[42][22] The novelty became part of the IEC standard as Type III; the Sony CS301 formulation became the IEC reference.[23] However, the idea failed to attract followers: apart from Sony, only BASF, Scotch and Agfa introduced their own ferrichrome cassette tapes.[72]

These expensive ferrichrome tapes never gained substantial market share, and after the release of metal tapes they lost their perceived exclusivity.[51][42] Their place in the market was taken over by superior and less expensive ferricobalt formulations.[51][42] By 1983, tape deck manufacturers stopped providing an option for recording Type III tapes.[23] Ferrichrome tape remained in the BASF and Sony lineups until 1984[72] and 1988,[73] respectively.

The use of ferrichrome tapes was complicated by conflicting playback recommendations. Officially, they were intended to be played back using 70 μs equalisation, yet the information leaflet that Sony included with each ferrichrome cassette recommended: "If the selector has two positions, NORMAL and CrO2, set it to the NORMAL position" (which applies 120 μs equalisation).[74] The leaflet notes that the high frequency range will be enhanced and that the tone control should be adjusted to compensate. The same leaflet recommends selecting 'Fe-Cr' if the playback machine offers it; on Sony's machines, this automatically selects 70 μs equalisation. The service manual for the Sony TC-135SD, one of the few cassette decks offering a 'Fe-Cr' position, shows the tape type selector switch paralleling the ferrichrome equalisation selection with that of chromium dioxide (70 μs).[75] Neither Sony nor BASF ferrichrome cassettes featured the notches on the back surface that automatically select 70 μs equalisation on machines with an automatic detection system.

Type IV

Metal particle Type IV

[Figure: Top-of-the-line Type IV cassettes were packaged in expensive, precision-engineered shells. The TDK MA-R shell (left) had a rigid alloy frame; the Sony Metal Master (right) had ceramic shell halves and a ceramic tape guide insert]
[Figure: Frequency response and noise analysis of the Nakamichi ZX metal particle Type IV cassette tape, using the Nakamichi LX-5 three-head cassette deck]

Pure metal particles have an inherent advantage over oxide particles: 3–4 times higher remanence, very high coercivity and far smaller particle size, resulting in both higher MOL and SOL values.[76][77] The first attempts to make metal particle (MP) tape, rather than metal oxide particle tape, date back to 1946; viable iron-cobalt-nickel formulations appeared in 1962.[56] In the early 1970s, Philips began development of MP formulations for the Compact Cassette.[64] Contemporary powder metallurgy could not yet produce fine, submicron-size particles, nor properly passivate these highly pyrophoric powders.[78][79] Although the latter problems were soon solved,[78] the chemists could not convince the market of the long-term stability of MP tapes; suspicions of inevitable early degradation persisted until the end of the cassette era.[56] The fears did not materialize,[56] and most metal particle tapes survived decades of storage just as well as Type I tapes; however, signals recorded on metal particle tapes do degrade at about the same rate as on chromium tapes, around 2 dB over the estimated lifetime of the cassette.[80][81]

Metal particle Compact Cassettes, or simply 'metal' tapes, were introduced in 1979 and were soon standardized by the IEC as Type IV.[56][79] They share the same 70 μs replay time constant as Type II tapes, and can be correctly reproduced by any deck equipped with Type II equalization.[19] Recording onto a metal tape requires special high-flux magnetic heads and high-current amplifiers to drive them.[19][79] A typical metal tape is characterized by remanence of 3000–3500 G and coercivity of 1100 Oe, so its bias flux is set at 250% of the Type I level.[42][56][82][19] Traditional glass ferrite heads would saturate their magnetic cores before reaching these levels. "Metal capable" decks had to be equipped with new heads built around sendust or permalloy cores, or the new generation of glass ferrite heads with specially treated gap materials.[83]

Metal particle tapes, particularly top-of-the-line double-coated tapes, have record-high midrange MOL and treble SOL, and the widest dynamic range coupled with the lowest distortion.[84] They were always expensive, almost exclusive, and out of reach of most consumers.[84] They excel at reproducing fine nuances of uncompressed acoustic music, or music with very high treble content, like brass and percussion.[84][8] However, they need a high-quality, properly aligned deck to reveal their potential.[84][8] First-generation metal particle tapes were consistently similar in their biasing requirements, but by 1983 newer formulations had drifted away from each other and from the reference tape.[85]

Metal evaporated

Unlike tapes made by wet coating processes, metal evaporated (ME) media are fabricated by physical deposition of vaporized cobalt or a cobalt-nickel mix in a vacuum chamber.[86] There is no synthetic binder to hold particles together; instead, they adhere directly to the polyester tape substrate.[86][79] An electron beam melts the source metal, creating a continuous directional flow of cobalt atoms towards the tape.[86] The zone of contact between the beam and the tape is blown with a controlled flow of oxygen, which aids the formation of a polycrystalline metal-oxide coating.[86] A massive liquid-cooled rotating drum, which pulls the tape through the contact zone, protects it from overheating.[86]

Metal evaporated coatings, along with barium ferrite, have the highest information density of all rerecordable media.[87] The technology was introduced in 1978 by Panasonic, initially in the form of audio microcassettes, and matured through the 1980s.[87][79] Metal evaporated media established themselves in the analogue (Hi8) and digital (Digital8, DV and MicroMV) videotape markets, and in data storage (Advanced Intelligent Tape, Linear Tape Open).[87] The technology seemed promising for analogue audio recording; however, the very thin metal evaporated layers were too fragile for consumer cassette decks, the coatings were too thin for good MOL,[79] and manufacturing costs were prohibitively high. Panasonic's Type I, Type II and Type IV metal evaporated cassettes, introduced in 1984, were sold for only a few years and in Japan alone, and remained unknown in the rest of the world.[79]

Measured performance characteristics

[Figure: Comparison of typical MOL, SOL, and 0-dB frequency responses for sample Type I, Type II, and Type IV cassette tapes]
[Figure: Frequency response plots of some sample Type I, Type II, and Type IV cassette tapes]

During the many years that cassette decks were popular, audio magazines published comparative measurements of the performance characteristics of the wide variety of tapes available in the marketplace.[88][89][90][91][92][93][94] These measurements typically included parameters such as MOL, SOL, frequency response at 0 dB and −20 dB re Dolby level, signal-to-noise ratio, modulation noise, bias level, and sensitivity. The first figure shows frequency response plots for sample Type I, Type II, and Type IV cassette tapes, comparing their MOL, SOL, and 0-dB performance.

The second figure shows the frequency response of typical Type I, Type II, and Type IV cassette tapes, obtained at a number of different input signal levels using a high-quality Pioneer CT-93 stereo cassette deck from the 1990s.[95][94] For each of the three tape formulations, the record/replay characteristics of the cassette deck were aligned with the relevant IEC reference tape, and each tested tape was measured with the bias and equalization unchanged from that reference position. The record/replay frequency response was tested at four levels: +6 VU, 0 VU, −10 VU and −20 VU (Dolby level corresponds to +3 VU on the CT-93). These plots therefore provide data on the linearity of the different tape formulations at both high and moderate recording levels. Notably, the Type I tape shows +6 VU and 0 VU responses that are much flatter than those of the Type II tape. At +6 VU, the Type II tape displays significant signal level compression across the entire frequency range, reducing to about 2 dB of compression between 80 Hz and 1 kHz.

Some representative measured performance characteristics of a small number of commercially available tape types are presented in the table below.[90][88]

All output levels are in dB re the 400 Hz Dolby level. The 100 Hz, 400 Hz and 1 kHz columns give the maximum output level at 3% harmonic distortion; the 2 kHz, 5 kHz and 10 kHz columns give it at 3% twin-tone intermodulation. S/N is the A-weighted signal-to-noise ratio (dBA); the '0 dB' and '−20 dB' columns give the high-frequency −3 dB point (kHz) at those input levels; Mod. is modulation noise (dB); Bias and Sens. are deviations from the reference (dB). 'n/a' marks values that were not reported.

Tape             Type   100 Hz  400 Hz  1 kHz   2 kHz   5 kHz   10 kHz  S/N    0 dB   −20 dB  Mod.   Bias   Sens.
BASF LH-MI       I      +4.0    +4.8    +5.6    +0.8    −2.4    −8.8    58.3   10.6   n/a     −45.7  +0.4   −0.3
Maxell UR        I      +3.9    +4.3    +4.4    +0.5    −2.5    −9.0    57.0    9.8   n/a     −43.8  −0.5    0.0
Maxell UD-XL I   I      +6.5    +6.8    +6.8    +0.8    −2.0    −8.5    58.8   10.0   23.9    −46.3  +0.1   +0.9
Sony HF          I      +2.0    +2.4    +2.5    −0.9    −4.1    −10.3   54.3    8.9   n/a     −36.1  −0.9   −1.0
TDK D            I      +2.6    +3.5    +4.5     0.0    −3.1    −9.6    55.5    9.3   22.9    −45.4  −0.1   −1.0
TDK AD           I      +3.8    +6.2    +6.2    +1.3    −1.7    −8.2    60.3    9.9   23.2    −44.3  +0.5   −0.3
BASF CR-MII      II     +4.8    +5.4    +4.0    −4.0    −8.3    −12.8   63.0    7.3   n/a     −51.0  +1.0   +1.0
Maxell UD-XL II  II     +4.4    +5.2    +5.1    −2.1    −5.6    −10.6   60.4    9.2   22.1    −48.0  −0.1   +1.7
Memorex CDXII    II     +5.7    +6.3    +6.1    −0.4    −3.1    −6.9    61.2   11.9   n/a     −47.4  +1.3   +2.9
TDK SA           II     +3.4    +4.4    +4.9    −1.9    −5.7    −11.2   60.9    8.9   20.3    −47.2  +0.1   +1.1
TDK SA-X         II     +3.7    +4.4    +3.6    −2.8    −7.3    −11.5   63.2    7.8   23.8    −47.8  +1.0   +1.6
Maxell MX        IV     +8.0    +9.1    +9.5    +2.3    −1.9    −6.8    62.7   12.5   25.0    −50.4  +0.1   +0.8
Sony Metal-ES    IV     +8.8    +10.2   +10.3   +2.1    −2.4    −7.1    66.0   12.5   n/a     −50.8  +0.6   +2.0

References

Bibliography
