Compact Cassette tape types and formulations

Audio compact cassettes use magnetic tape of three major types which differ in fundamental magnetic properties, the level of bias applied during recording, and the optimal time constant of replay equalization. Specifications of each type were set in 1979 by the International Electrotechnical Commission (IEC): Type I (IEC I, 'ferric' or 'normal' tapes), Type II (IEC II, or 'chrome' tapes), Type III (IEC III, ferrichrome or ferrochrome), and Type IV (IEC IV, or 'metal' tapes). 'Type 0' was a non-standard designation for early compact cassettes that did not conform to IEC specification.

By the time the specifications were introduced, Type I included pure gamma ferric oxide formulations, Type II included ferricobalt and chromium(IV) oxide formulations, and Type IV included metal particle tapes—the best-performing, but also the most expensive. Double-layer Type III tape formulations, advanced by Sony and BASF in the 1970s, never gained substantial market presence.

In the 1980s the lines between the three types blurred. Panasonic developed evaporated-metal tapes that could be made to match any of the three IEC types; metal particle tapes migrated to Type II and Type I, and ferricobalt formulations migrated to Type I. By the end of the decade the performance of the best Type I ferricobalt tapes (superferrics) approached that of Type IV tapes, and the performance of entry-level Type I tapes improved gradually until the very end of compact cassette production.

Magnetic properties
Magnetic recording relies on the use of hard ferrimagnetic or ferromagnetic materials. These require strong external magnetic fields to be magnetized, and retain substantial residual magnetization after the magnetizing field is removed. Two fundamental magnetic properties, relevant for audio recording, are:

 * Saturation remanence limits maximum output level and, indirectly, dynamic range of audio recordings. Remanence of audio tapes, referred to quarter-inch tape width, varies from around $1,100 G$ for basic ferric tapes to $3,500 G$ for Type IV tapes; advertised remanence of the 1986 JVC Type IV cassette reached $4,800 G$.
 * Coercivity is a measure of the external magnetic flux required to magnetize the tape, and an indicator of the necessary bias level. The coercivity of audio tapes varies from $350 Oe$ to $1,200 Oe$. High-coercivity particles are more difficult to erase, bias and record, but also less prone to high-frequency losses during recording, and to external interference and self-demagnetization during storage.
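
Tape literature quotes these quantities in CGS units; converting them to SI takes two fixed factors (1 G = 0.1 mT, 1 Oe = 1000/4π A/m). A minimal sketch of the arithmetic, using the remanence and coercivity figures quoted above:

```python
import math

# CGS-to-SI conversion for the magnetic quantities quoted in tape
# datasheets: gauss (remanence) and oersted (coercivity).
def gauss_to_millitesla(b_gauss: float) -> float:
    return b_gauss * 0.1                                # 1 G = 0.1 mT

def oersted_to_ka_per_m(h_oersted: float) -> float:
    return h_oersted * (1000 / (4 * math.pi)) / 1000    # 1 Oe ~= 79.577 A/m

# Figures from the text: remanence of a basic ferric tape, and the
# coercivity range spanning ferric to metal formulations.
print(round(gauss_to_millitesla(1100), 1))   # 110.0 (mT)
print(round(oersted_to_ka_per_m(350), 1))    # 27.9 (kA/m)
print(round(oersted_to_ka_per_m(1200), 1))   # 95.5 (kA/m)
```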

A useful figure of merit of tape technology is the squareness ratio of the hysteresis curve. It is an indicator of tape uniformity and its linearity in analogue recording. An increase in the squareness ratio defers the onset of compression and distortion, and allows fuller utilization of the tape's dynamic range within the limits of remanence. The squareness ratio of basic ferric tapes rarely exceeds 0.75, and the squareness ratio of the best tapes exceeds 0.9.
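
As a quick numeric illustration of this figure of merit (the hysteresis-loop values below are hypothetical, chosen only to land near the 0.75 and 0.9 figures mentioned above):

```python
# Squareness ratio S = Br / Bs: remanent flux density divided by
# saturation flux density, both read off the tape's hysteresis loop.
def squareness_ratio(b_remanent: float, b_saturation: float) -> float:
    return b_remanent / b_saturation

# Hypothetical loop values in gauss, not from any real datasheet:
print(round(squareness_ratio(1100, 1450), 2))  # ~0.76, basic ferric territory
print(round(squareness_ratio(1650, 1800), 2))  # ~0.92, premium formulation
```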

Electromagnetic properties

Manufacturers of bulk tape provided extremely detailed technical descriptions of their product, with numerous charts and dozens of numeric parameters. From the end user viewpoint, the most important electromagnetic properties of the tape are:

 * Maximum output levels, usually specified in dB relative to the nominal zero reference level of $250 nWb/m$ or the 'Dolby level' of $200 nWb/m$. Often incorrectly called recording levels, these are always expressed in terms of the tape's output, thus taking its sensitivity out of the equation. Performance at low and middle frequencies and at treble frequencies was traditionally characterized by two related but different parameters:
 * Maximum output level (MOL) is relevant at low and middle frequencies. It is usually specified at $315 Hz$ (MOL315) or $400 Hz$ (MOL400), and marks the point at which third harmonic distortion reaches 3%. Further magnetization of the tape is technically possible, but at the cost of unacceptable compression and distortion. For all types of tape, MOL peaks in the 125–800 Hz range and drops off below $125 Hz$ and above $800 Hz$. The maximum output of Type I tape at $40 Hz$ is 3–5 dB lower than MOL400, while in Type IV tapes it is 6–7 dB lower. As a result, ferric tapes handle bass-heavy music with apparent ease compared to expensive metal tapes. Double-layer Type III (ferrichrome) formulations were supposed to allow bass frequencies to be recorded deeper into the ferric layer, while keeping the high frequencies in the upper chromium dioxide layer.
 * At treble frequencies the playback head cannot reliably reproduce harmonics of the recorded signal. This makes distortion measurements impossible; instead of MOL, high-frequency performance is characterized by saturation output level (SOL), usually specified at $10 kHz$ (SOL10k). Once the tape reaches saturation point, any further increase in recording flux actually decreases output to below SOL.
 * Noise level, usually understood as bias noise (hiss) of a tape recorded with zero input signal, replayed without noise reduction, A-weighted and referred to the same level as MOL and SOL. The difference between bias noise and the noise of virgin tape is an indicator of tape uniformity. Another important but rarely quantified type of noise is modulation noise, which appears only in the presence of a recorded signal, and which cannot be reduced by Dolby or dbx noise reduction systems.
 * Dynamic range, or signal-to-noise ratio, was usually understood as the ratio between MOL and the A-weighted bias noise level. High fidelity audio requires a dynamic range of at least 60–65 dB; the best cassette tapes reached this threshold in the 1980s, at least partially eliminating the need for noise reduction systems. Dynamic range is the most important property of the tape. The higher the dynamic range of the music, the more demanding it is of tape quality; conversely, heavily compressed music sources can do well even with basic, inexpensive tapes.
 * Sensitivity of the tape, referred to that of an IEC reference tape and expressed in dB, was usually measured at $315 Hz$ and $10 kHz$.
 * Stability of playback in time. Low-quality or damaged cassette tape is notoriously prone to signal dropouts, which are absolutely unacceptable in high fidelity audio. For high quality tapes, playback stability is sometimes lumped together with modulation noise and wow and flutter into an integral smoothness parameter.
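
The dB figures used throughout these parameters are plain ratios of tape flux against the reference level. A short sketch of the conversion (the +4 dB example is illustrative, not taken from any particular datasheet):

```python
import math

IEC_ZERO_NWB_PER_M = 250.0   # nominal zero reference level, nWb/m

def level_db(flux_nwb_per_m: float) -> float:
    """Tape flux expressed in dB relative to the 250 nWb/m reference."""
    return 20 * math.log10(flux_nwb_per_m / IEC_ZERO_NWB_PER_M)

def flux_from_db(level: float) -> float:
    """Inverse: absolute flux for a level given in dB re 250 nWb/m."""
    return IEC_ZERO_NWB_PER_M * 10 ** (level / 20)

print(round(level_db(200), 1))    # Dolby level (200 nWb/m) is about -1.9 dB
print(round(flux_from_db(4.0)))   # an MOL of +4 dB is about 396 nWb/m
```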

Frequency range, per se, is usually unimportant. At low recording levels (−20 dB referred to nominal level) all quality tapes can reliably reproduce frequencies from $30 Hz$ to $16 kHz$, which is sufficient for high fidelity audio. However, at high recording levels the treble output is further limited by saturation. At the Dolby recording level the upper frequency limit shrinks to a value between $8 kHz$ for a typical chromium dioxide tape, and $12 kHz$ for metal tapes; for chromium dioxide tapes, this is partially offset by lower hiss levels. In practice, the extent of the high-level frequency range is not as important as the smoothness of the midrange and treble frequency response.

Standards

The original specification for the Compact Cassette was set by Philips in 1962–1963. Of the three then-available tape formulations that matched the company's requirements, the BASF PES-18 tape became the original reference. Other chemical companies followed with tapes of varying quality, often incompatible with the BASF reference. By 1970, a new, improved generation of tapes had firmly established itself on the market and became the de facto reference for aligning tape recorders, which worsened the compatibility issue even further. In 1971 the problem was tackled by the Deutsches Institut für Normung (DIN), which set the standard for chromium dioxide tapes. In 1978 the International Electrotechnical Commission (IEC) enacted a comprehensive standard on cassette tapes (IEC 60094). One year later the IEC mandated the use of notches for automatic tape type recognition. Since then, the four cassette tape types have been known as IEC I, IEC II, IEC III and IEC IV. The numerals follow the historical sequence in which these tape types were commercialized, and do not imply their relative quality or intended purpose.

An integral part of the IEC 60094 standard family is the set of four IEC reference tapes. Type I and Type II reference tapes were manufactured by BASF, Type III reference tapes by Sony, and Type IV reference tapes by TDK. Unlike consumer tapes, which were manufactured continuously over the years, each reference tape was made in a single production batch by the IEC-approved factory. These batches were made large enough to fill the need of the industry for many years. A second run was impossible, because chemists were unable to replicate the reference tape type formulation with proper precision. From time to time, the IEC revised the set of references; the final revision took place in April 1994. The choice of reference tapes, and the role of the IEC in general, has been debated. Meinrad Liebert, designer of Studer and Revox cassette decks, criticized the IEC for failing to enforce the standards and lagging behind the constantly changing market. In 1987, Liebert wrote that while the market clearly branched into distinct, incompatible "premium" and "budget" subtypes, the IEC tried in vain to select an elusive "market average"; meanwhile, the industry moved forward, disregarding outdated references. This, according to Liebert, explained sudden demand for built-in tape calibration tools that were almost unheard-of in the 1970s.

From the end user viewpoint, the IEC 60094 defined two principal properties of each tape type:

 * Bias level for each type was set equal to the optimal bias of the relevant IEC reference tape, and sometimes changed when the IEC changed the reference tapes (although the BASF datasheet for the Y348M tape, approved as the IEC Type I reference in 1994, states that its optimal bias is exactly 0.0 dB from the previous reference, BASF R723DG). The IEC defines reference bias as follows: "Using the relevant IEC reference tape and heads according to Ref. 1.1, the bias current providing the minimum third harmonic distortion ratio for a 1 kHz signal recorded at the reference level is the reference bias setting." Type II bias ('high bias') equals around 150% of Type I bias; Type IV bias ('metal bias') equals around 250% of Type I bias. Real cassette tapes invariably deviate from the references and require fine tuning of bias; recording a tape with improper bias increases distortion and alters frequency response. A 1990 comparative test of 35 Type I tapes showed that their optimal bias levels were within $1 dB$ of the Type I reference, while Type IV tapes deviated from the Type IV reference by up to $3 dB$. Some typical cassette deck frequency response curves showing the effects of different bias settings are provided in the relevant figure.
 * Time constant of replay equalization (often shortened to EQ) for Type I tapes equals $120 μs$, as per the Philips specification. The time constant for Type II, III and IV tapes is set at a lower value of $70 μs$. The purpose of replay equalization is to compensate for high-frequency losses during recording, which, in the case of ferric cassettes, usually start at around 1–1.5 kHz. The choice of time constant is a somewhat arbitrary decision, seeking the best combination of conflicting parameters: extended treble response, maximum output, minimum noise and minimum distortion. High-frequency roll-off that is not fully compensated in the replay channel may be offset by pre-emphasis during recording. Lower replay time constants decrease the apparent level of hiss (by 4 dB when stepping down from $120 μs$ to $70 μs$), but also decrease the apparent high-frequency saturation level, so the choice of time constants was a matter of compromise and debate. "Hard" maximum and saturation levels, in terms of the voltage output of the playback head, remain unchanged; however, the high-frequency voltage at the output of the replay equalizer decreases with a decrease in time constant. The industry and the IEC decided that it would be safe to decrease the time constant of Type II, III and IV tapes to $70 μs$, because these tapes are less prone to high-frequency saturation than contemporary ferric tapes. Many disagreed, arguing that the risk of saturation at $70 μs$ is unacceptably high. Nakamichi and Studer complied with the IEC, but provided an option for playing Type II and Type IV tapes using the $120 μs$ setting and matching pre-emphasis filters in the recording path. A similar pre-emphasis was applied by duplicators of prerecorded chromium dioxide cassettes; although loaded with Type II tape, these cassettes were packaged in Type I cassette shells and were intended to be replayed as Type I tapes.
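
A replay time constant τ maps directly onto an equalizer corner frequency via f = 1/(2πτ); a minimal sketch showing where the two standardized corners fall:

```python
import math

def corner_frequency_hz(tau_seconds: float) -> float:
    """Corner frequency of a first-order equalizer: f = 1 / (2*pi*tau)."""
    return 1 / (2 * math.pi * tau_seconds)

# The two standardized replay time constants:
for tau_us in (120, 70):        # Type I vs. Type II/III/IV
    f = corner_frequency_hz(tau_us * 1e-6)
    print(f"{tau_us} us -> {f:.0f} Hz")   # 120 us -> 1326 Hz, 70 us -> 2274 Hz
```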

Type I
Type I, or IEC I, ferric or 'normal' cassettes were historically the first, the most common and the least expensive; they dominated the prerecorded cassette market. The magnetic layer of a ferric tape consists of around 30% synthetic binder and 70% magnetic powder — acicular (oblong, needle-like) particles of gamma ferric oxide (γ-Fe2O3), with a length of $0.2 μm$ to $0.75 μm$. Each particle of such size contains a single magnetic domain. The powder was and still is manufactured in bulk by chemical companies specializing in mineral pigments for the paint industry. Ferric magnetic layers are brown; the shade and intensity of the colour depend mostly on the size of the particles.

Type I tapes must be recorded with 'normal' (low) bias flux and replayed with a $120 μs$ time constant. Over time, ferric oxide technology developed continuously, with a new, superior generation emerging around every five years. Cassettes of various periods and price points can be sorted into three distinct groups: basic coarse-grained tapes; advanced fine-grained, or microferric, tapes; and highest-grade ferricobalt tapes, with ferric oxide particles encapsulated in a thin layer of cobalt-iron compound. Ferricobalt tapes are often called 'cobalt-doped'; however, this is historically incorrect. Cobalt doping in the strict sense involves uniform substitution of cobalt for iron atoms. This technology was tried for audio and failed, losing to chromium dioxide. The industry later chose the far more reliable and repeatable process of cobalt adsorption — encapsulation of unmodified iron oxide particles in a thin layer of cobalt ferrite.

The remanence and squareness properties of the three groups substantially differ, while coercivity remains almost unchanged at around $380 Oe$ ($360 Oe$ for the IEC reference tape approved in 1979). Quality Type I cassettes have higher midrange MOL than most Type II tapes, slow and gentle MOL roll-off at low frequencies, but less high-frequency headroom than Type II. In practice, that means that ferric tapes have lower fidelity compared to chrome tapes and metal tapes at high frequencies, but are often better at reproducing the low frequencies found in bass-heavy music.

Basic ferric

Entry-level ferric formulations are made of pure, unmodified, coarse-grained ferric oxide. Relatively large (up to $0.75 μm$ in length), irregularly-shaped oxide particles have protruding branches or dendrites; these irregularities prevent tight packing of particles, reducing the iron content of the magnetic layer and, consequently, its remanence (1300–1400 G) and maximum output level. The squareness ratio is low, around 0.75, resulting in an early but smooth onset of distortion. These tapes, historically labeled and sold as 'low noise', have high levels of hiss and relatively low sensitivity; their optimal bias level is 1–2 dB lower than that of the IEC reference tape.

This group also includes most of the so-called 'Type 0' cassettes — a mixed bag of ferric tapes that do not meet the IEC standard or the original Philips specification. Historically, the informal 'Type 0' denoted early cassettes loaded with tape designed for reel-to-reel recorders. In the 1980s, many otherwise decent and usable basic tapes were effectively demoted to 'Type 0' status when equipment manufacturers began aligning their decks for use with premium ferricobalts (the latter having much higher sensitivity and bias). In the 21st century, 'Type 0' denotes all sorts of low-quality, counterfeit or otherwise unusable cassettes. These require unusually low bias, and even then only a few of them perform on par with quality Type I tapes. A 'Type 0' tape, if it is usable at all, is incompatible with Dolby noise reduction: with the Dolby decoder engaged, the tape sounds dull, because its poor sensitivity causes severe Dolby mistracking.

Microferric
At the beginning of the 1970s, gradual technological improvements over the previous decade resulted in the second generation of Type I tapes. These tapes had uniformly needle-shaped, highly orientable particles (HOP) of much smaller size, around $0.25 μm$ in length, hence the trade term microferrics. Their uniform shape allowed very dense packing of particles, with less binder and more particles per unit volume, and a corresponding rise in remanence to around $1,600 G$. The first microferric (TDK SD) was introduced in 1971, and in 1973 Pfizer began marketing a patented microferric powder that soon became an industry standard. In the 20th century, Pfizer had a strong mineral pigment division, with factories in California, Illinois and Indiana; in 1990 it sold its iron-oxide business to Harrisons & Crosfield of the United Kingdom. The next step was to align the needle-shaped particles in parallel with the flux lines generated by the recording head; this was done by controlled flow of the liquid magnetic mix over the substrate (rheological orientation), or by applying a strong magnetic field while the binder was curing.

Typical microferric cassettes of the 1980s had less hiss and at least $2 dB$ higher MOL than basic Type I tapes, at the cost of increased print-through. Noise and print-through are interrelated, and directly depend on the size of oxide particles. A decrease in particle size invariably decreases noise and increases print-through. The worst combination of noise and print-through occurs in highly irregular formulations containing both unusually large and unusually small particles. Small improvements continued for thirty years, with a gradual rise of squareness ratio from 0.75 to over 0.9. Newer tapes consistently produced higher output with less distortion at the same levels of bias and audio recording signals. The transition was smooth; after the introduction of new, superior tape formulations, manufacturers often kept older ones in production, selling them in different markets or under different, cheaper, designations. Thus, for example, TDK ensured that its premium microferric AD cassette was always ahead of entry-level microferric D, having finer particles and lower noise.

Ferricobalt Type I
The third, and best performing, class of ferric tapes is made of fine ferric particles encapsulated in a thin $30 Å$ layer of cobalt-iron mix, similar in composition to cobalt ferrite. The first cobalt-doped cassettes, introduced by 3M in 1971, had exceptionally high sensitivity and MOL for the period, and were an even match for contemporary chromium dioxide tapes — hence the trade name superferrics. Of many competing cobalt-doping technologies, the most widespread was low-temperature encapsulation of ferric oxide in aqueous solution of cobalt salts with subsequent drying at 100–150°C. Encapsulated microferric particles retain needle-like shape and can be tightly packed into uniform anisotropic layers. The process was first commercialized in Japan in the early 1970s.

The remanence of ferricobalt cassettes is around $1,750 G$, resulting in around $4 dB$ gain in MOL and 2–3dB gain in sensitivity compared to basic Type I tapes; their hiss level is on par with contemporary microferric formulations. The dynamic range of the best ferricobalt cassettes (true superferrics) equals 60–63dB, and the MOL at lower frequencies exceeds the MOL of Type IV tapes. Overall, superferrics are a good match to Type IV, especially in recording acoustical music with a wide dynamic range. This was reflected in the price of top-of-the-line superferric tapes like Maxell XLI-S or TDK AR-X, which by 1992 matched the price of 'entry-level' metal tapes.

Type II
IEC Type II tapes are intended for recording using high (150% of normal) bias and replay with the $70 μs$ time constant. All generations of Type II reference tapes, including the 1971 DIN reference that pre-dated the IEC standard, were manufactured by BASF. Type II has been historically known as 'chromium dioxide tape' or simply 'chrome tape', but in reality most Type II cassette tapes do not contain chromium. These "pseudochromes" (including almost all Type IIs made by the Big Three Japanese makers — Maxell, Sony and TDK) are actually ferricobalt formulations optimized for Type II recording and playback settings. A true chrome tape may have a distinctive 'old crayon' smell, reminiscent of oil or wax crayons made with chromium-based pigments such as chrome yellow; this smell is missing in "pseudochromes". Both kinds of Type II tape have, on average, lower high-frequency MOL and SOL, and a higher signal-to-noise ratio, than quality Type I tapes. This is caused by the midrange and treble pre-emphasis applied during recording to match the $70 μs$ equalization at playback.

Chromium dioxide
In the mid-1960s, DuPont created and patented an industrial process for making fine ferromagnetic particles of chromium dioxide (CrO2). The first CrO2 tapes for data and video appeared in 1968. In 1970, BASF, which would become the main proponent of CrO2, launched its chrome cassette production; in the same year Advent introduced the first cassette deck with chrome capability and Dolby noise reduction. The combination of low-noise CrO2 tape with companding noise reduction brought a revolutionary improvement to compact-cassette sound reproduction, almost reaching the level of high fidelity. However, CrO2 tape required a redesign of the bias and replay equalization circuitry. This problem was resolved during the 1970s, but three unsolved issues remained: the cost of making CrO2 powder, the cost of royalties charged by DuPont, and the pollution caused by hexavalent chromium waste.

The reference CrO2 tape, approved by the IEC in 1981, is characterized by coercivity of $490 Oe$ (high bias) and remanence of $1,650 G$. Retail CrO2 cassettes had coercivity in the range from 400 to $550 Oe$. Owing to the very 'clean', uniform shape of the particles, chrome tapes easily attain an almost perfect squareness ratio of 0.90. 'True chromes', not modified by the addition of ferric additives or coatings, have very low and euphonic hiss (bias noise), and very low modulation noise at high frequencies. Double-layer CrO2 cassettes have the lowest absolute noise among all the audio formulations; these cassettes generate less noise at $4.76 cm/s$ than a ferric tape at $19.05 cm/s$. Their sensitivity is usually also very high, but MOL is low, on par with basic Type I tapes. CrO2 tape does not tolerate overload very well: the onset of distortion is sharp and dissonant, so recording levels should be set conservatively, well below MOL. At low frequencies, the MOL of CrO2 tapes rolls off faster than in ferric or metal tapes, hence the reputation of 'bass shyness'. CrO2 cassettes are best suited to recording dynamic music with rich harmonic content and relatively low bass levels; their dynamic range is a good fit for recording from uncompressed digital sources and for music with extended quiet passages. Good ferric tapes may have the same or higher treble SOL, but CrO2 tapes still sound subjectively better owing to lower hiss and modulation noise.

Ferricobalt Type II

After the introduction of CrO2 cassettes, Japanese companies began developing a royalty-free alternative to DuPont's patent, based on the already established cobalt doping process. A controlled increase in cobalt content causes an almost linear increase in coercivity; thus a Type II "pseudochrome" tape can be made by simply adding around 3% cobalt to a Type I ferricobalt tape. By 1974 the technology was ready for mass production, and TDK and Maxell introduced their classic "pseudochromes" (TDK SA and Maxell UD-XL) while discontinuing their true chrome lines (TDK KR and Maxell CR). By 1976, ferricobalt formulations had taken over the video tape market, and eventually they became the dominant high-performance tape for audio cassettes. Chromium dioxide disappeared from the Japanese domestic market, although chrome remained the tape of choice for high fidelity cassette duplication among the music labels. In consumer markets, chrome coexisted as a distant second with "pseudochromes" until the very end of the cassette era. Ferricobalt technology developed continuously: in the 1980s Japanese companies introduced 'premium' double-layered ferricobalts with exceptionally high MOL and SOL; in the mid-1990s TDK launched the first and only triple-coated ferricobalt, the SA-XS.

The electromagnetic properties of Type II ferricobalts are very close to those of their Type I cousins. Owing to the use of $70 μs$ replay equalization, the hiss level is lower, but so is the treble saturation level. The dynamic range of Type II ferricobalts, according to 1990 tests, lies between 60 and 65 dB. Their coercivity of 580–700 Oe and remanence of 1300–1550 G are close to those of the CrO2 reference tape, but the difference is big enough to cause compatibility problems. TDK SA was the informal reference in Japan. TDK advertisements boasted that "more decks are aligned to SA than any other tape", but there is very little first-hand information on which tapes were actually used at the factories. Japanese manufacturers provided lists of recommended tapes but did not disclose their reference tapes. There is, however, enough indirect information converging on TDK SA: for example, in 1982, when Japanese-owned Harman Kardon sent samples for Dolby certification, they were aligned to the IEC CrO2 reference, yet production copies of the same models were aligned to TDK SA. Since the Japanese already dominated both the cassette and hi-fi equipment markets, the incompatibility further undermined the market share of European-made cassette decks and CrO2 cassettes. In 1987, the IEC resolved the compatibility issue by appointing a new Type II reference tape, U 564 W, a BASF ferricobalt with properties very close to those of contemporary TDK tapes. With the short-lived 1988 Reference Super, even BASF started the manufacture and sale of Type II ferricobalt tapes.

Metal particle Type II
The coercivity of an iron-cobalt metal particle mix, precipitated from aqueous solutions, depends on the cobalt content. A change in cobalt content from 0% to 30% causes a gradual rise in coercivity from around $400 Oe$ (Type I level) to $1,300 Oe$ (Type IV level); alloyed iron-cobalt particles can reach a coercivity of $2,200 Oe$. This makes it possible to manufacture metal particle tapes conforming to Type II and even Type I biasing requirements.

In practice, only Denon, Taiyo Yuden and, for only a few years, TDK ever attempted making Type II metal tape. These rare, expensive cassettes were characterized by high remanence approaching that of Type IV tapes ($2,600 G$); their coercivity of $800 Oe$ was closer to Type II than to Type IV tapes, but still quite far from either type's reference. Independent tests of the 1990 Denon and Taiyo Yuden tapes placed them at the very top of the Type II spectrum — if the recording deck could cope with their unusually high sensitivity and provide an unusually high bias current.

Type III
In 1973, Sony introduced double-layer ferrichrome tapes having a five-micron ferric base coated with one micron of CrO2 pigment. The new cassettes were advertised as 'the best of both worlds' — combining the good low-frequency MOL of microferric tapes with good high-frequency performance of chrome tapes. The novelty became part of the IEC standard, codenamed Type III; the Sony CS301 formulation became the IEC reference. However, the idea failed to attract followers. Apart from Sony, only BASF, Scotch and Agfa introduced their own ferrichrome cassette tapes.

These expensive ferrichrome tapes never gained substantial market share, and after the release of metal tapes they lost their perceived exclusivity. Their place in the market was taken over by superior and less expensive ferricobalt formulations. By 1983, tape deck manufacturers stopped providing an option for recording Type III tapes. Ferrichrome tape remained in the BASF and Sony lineups until 1984 and 1988, respectively.

The use of ferrichrome tapes was complicated by conflicting playback recommendations. Officially, they were intended to be played back using $70 μs$ equalisation. However, the information leaflet that Sony included in each box of ferrichrome cassettes recommended: "If the selector has two positions, NORMAL and CrO2, set it to the NORMAL position" (which applies $120 μs$ equalisation). The leaflet notes that the high frequency range will be enhanced and that the tone control should be adjusted to compensate. The same leaflet recommends that if the playback machine offers an 'Fe-Cr' selection, this should be selected; on Sony's machines, this automatically selects $70 μs$ equalisation. The service manual for the Sony TC-135SD, one of the few cassette decks offering an 'Fe-Cr' position, shows the tape type selector switch paralleling the ferrichrome equalisation selection with that of chromium dioxide ($70 μs$). Neither Sony nor BASF ferrichrome cassettes feature the notches on the back surface that automatically select $70 μs$ equalisation on machines with an automatic detection system.

Metal particle Type IV

Pure metal particles have an inherent advantage over oxide particles: 3–4 times higher remanence, very high coercivity and far smaller particle size, resulting in both higher MOL and SOL values. The first attempts to make metal particle (MP) tape, rather than metal oxide particle tape, date back to 1946; viable iron-cobalt-nickel formulations appeared in 1962. In the early 1970s, Philips began development of MP formulations for the Compact Cassette. Contemporary powder metallurgy could not yet produce fine, submicron-size particles, nor properly passivate these highly pyrophoric powders. Although these problems were soon solved, chemists failed to convince the market of the long-term stability of MP tapes; suspicions of inevitable early degradation persisted until the end of the cassette era. The fears did not materialize, and most metal particle tapes survived decades of storage just as well as Type I tapes; however, signals recorded on metal particle tapes do degrade at about the same rate as on chromium tapes, around 2 dB over the estimated lifetime of the cassette.

Metal particle Compact Cassettes, or simply 'metal' tapes, were introduced in 1979 and were soon standardized by the IEC as Type IV. They share the same $70 μs$ replay time constant as Type II tapes, and can be correctly reproduced by any deck equipped with Type II equalization. Recording onto a metal tape requires special high-flux magnetic heads and high-current amplifiers to drive them. Typical metal tape is characterized by remanence of 3000–3500 G and coercivity of 1100 Oe, thus its bias flux is set at 250% of the Type I level. Traditional glass ferrite heads would saturate their magnetic cores before reaching these levels. "Metal capable" decks had to be equipped with new heads built around sendust or permalloy cores, or the new generation of glass ferrite heads with specially treated gap materials.

Metal particle tapes, particularly top-of-the-line double-coated tapes, have record-high midrange MOL and treble SOL, and the widest dynamic range coupled with the lowest distortion. They were always expensive, almost exclusive products, out of reach of most consumers. They excel at reproducing fine nuances of uncompressed acoustic music, or music with very high treble content, like brass and percussion. However, they need a high-quality, properly aligned deck to reveal their potential. First-generation metal particle tapes were consistently similar in their biasing requirements, but by 1983 newer formulations had drifted away from each other and from the reference tape.

Metal evaporated
Unlike wet coating processes, metal evaporated (ME) media are fabricated by physical deposition of vaporized cobalt or a cobalt-nickel mix in a vacuum chamber. There is no synthetic binder to hold particles together; instead, they adhere directly to the polyester tape substrate. An electron beam melts the source metal, creating a continuous directional flow of cobalt atoms towards the tape. The zone of contact between the beam and the tape is blown with a controlled flow of oxygen, which helps the formation of a polycrystalline metal-oxide coating. A massive liquid-cooled rotating drum, which pulls the tape into the contact zone, protects it from overheating.

Metal evaporated coatings, along with barium ferrite, have the highest information density of all rerecordable media. The technology was introduced in 1978 by Panasonic, initially in the form of audio microcassettes, and matured through the 1980s. Metal evaporated media established themselves in the analogue (Hi8) and digital (Digital8, DV and MicroMV) videotape markets, and in data storage (Advanced Intelligent Tape, Linear Tape-Open). The technology seemed promising for analogue audio recording; however, very thin metal evaporated layers were too fragile for consumer cassette decks, the coatings too thin for good MOL, and manufacturing costs were prohibitively high. Panasonic Type I, Type II and Type IV metal evaporated cassettes, introduced in 1984, were sold for only a few years and only in Japan, remaining unknown in the rest of the world.

Measured performance characteristics




During the many years that cassette decks were popular, many audio magazines published comparative measurements of the performance characteristics of the wide variety of different tapes that were available in the marketplace. These measurements typically included parameters such as MOL, SOL, frequency response at 0 dB and −20 dB re Dolby Level, signal-to-noise ratio, modulation noise, bias level, and sensitivity. The first figure shows frequency response plots for sample Type I, Type II, and Type IV cassette tapes comparing their MOL, SOL, and 0-dB performance.

The second figure shows the frequency response performance of typical Type I, Type II, and Type IV cassette tapes, obtained for a number of different input signal levels, using a high-quality Pioneer CT-93 stereo cassette deck from the 1990s. For each of the three tape formulations, the record/replay characteristics of the cassette deck were aligned with the relevant IEC Reference Tape, and each tested tape was measured with the bias and equalization unchanged from that reference position. The record/replay frequency response was tested at four levels: +6 VU, 0 VU, −10 VU and −20 VU (Dolby Level is marked at +3 VU for the CT-93). Thus, these plots provide data on the linearity of the different tape formulations at both high and moderate recording levels. Notably, the Type I tape shows +6 VU and 0 VU responses that are much flatter than those of the Type II tape. At +6 VU, the Type II tape displays significant amounts of signal level compression across the entire frequency range, reducing to about 2 dB of signal compression between 80 Hz and 1 kHz.
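The compression figures quoted above follow directly from the level-stepped measurement: if the record level rises by some amount but the replayed output rises by less, the shortfall is the compression. A minimal sketch with hypothetical numbers (not taken from the actual measurements):

```python
def compression_db(input_step_db, output_step_db):
    """Shortfall of output growth relative to input growth, in dB."""
    return input_step_db - output_step_db

# Hypothetical example: stepping the record level from 0 VU to +6 VU
# raises the replayed output by only 4 dB on a saturating tape,
# i.e. 2 dB of signal compression.
print(compression_db(6.0, 4.0))  # 2.0
```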

Some representative measured performance characteristics of a small number of commercially available tape types are presented in the table below.