JOSEF DORNER

From 15 MilliMaxwell to 1,200 NanoWebers

Need an explanation? Here's a look at the evolution of fluxivity and level standards.
An EYE-CATCHING HEADLINE to attract or confuse
the reader? Well, certainly not the latter, because one is often
confused enough when trying to understand the data printed on
the specification sheets accompanying many tape recorders. We
find distortion performance and signal-to-noise ratios referred
to 185, 200, 250, or even 370 nWb/m. In some cases, flux values
as high as 1,000 and even 1,040 nWb/m are mentioned, while Europeans
use such odd values as 320 and 514 nWb/m for reference fluxivity.
Where do all these different values come from? Maybe some light
can be shed on this matter by looking back into the history
of magnetic recording.
A STANDARD IS CREATED
Let's go back about 30 years, to a time when German technicians
were already talking about a standard tape flux, well before
their cohorts on the other side of the Atlantic were. Their
definition read something like this: "...for the purpose
of program exchange, a reference value for remanent tape magnetization
has to be established. When using general purpose tapes, this
level shall be approximately 6 dB below maximum output level.
(In reality, the span was only 4 dB at that time - author's note.) Fortissimo
passages shall modulate the tape up to that reference level.
This is of importance for the purpose of program exchange. Only
in applications where program exchange is not a criterion may modulation
up to 3 percent third-harmonic distortion be tolerated;
this is in order to achieve a higher signal-to-noise ratio and
better utilization of the tape. For class 38 (15 ips) the reference
level is set to 200 milliMaxwell and for class 19 (7.5 ips)
to 160 milliMaxwell." (Draft for DIN 45 513).[1] Remember
this dates back to 1955!
In America, all that was known at that time, as far as a recording
standard was concerned, was the calibration tape made by a well-known
manufacturer of magnetic tape recording equipment (Ampex) with
a reference level recorded on it, which was named the "Operating
Level." That Operating Level was used to calibrate the
VU-meter to obtain a 0 VU deflection. By digging a little deeper,
one was able to learn that this Operating Level corresponded
to 1 percent of third-harmonic distortion on the then most widely
used (general purpose) recording tape in the USA.
MOL AND THE VU METER
At this point it may be of interest to note that this general
purpose tape produced 3 percent of third-harmonic distortion
when it was modulated to a point some 6 dB above Operating Level.
Many a studio (particularly some European studios) - where the
VU-meter started to make its appearance in the early '60s -
may have been misled by this fact to think that a VU-meter has
to be operated with a 6 dB lead. By 1966, however, the Deutsche
Industrie Normen (DIN) had already recognized that this was
not quite correct, because it is stated in the explanatory note
accompanying DIN 45 406 that "... on average the lead required
is about 8 dB (8 VU). Deviations from this average by ±5
dB, however, are not exceptional."
If one compares this with the old RETMA TR 105 B standard (1951)
for Audio Facilities for Broadcasting Systems, one can read
the following in section V.2.a: "If a VU-meter is incorporated,
it shall remain as normally connected, and its multiplier shall
remain set for a signal which is 10 dB below standard output
level" (Standard output level is +18 dBm).
Can one not conclude from this that signal peaks, as recorded
on tape, produced flux values up to some 8 to 10 dB beyond the
1 percent distortion level, in other words, far in excess of
the 3 percent distortion point? Yes, because in 1965 the NAB
standard for reel-to-reel recordings had the following to say
in a footnote to section 2.04, which relates to the standard
reference program level: "It is well established that at
least 10 dB margin is required between the sine wave load handling
capacity of a system and the level of program material as measured
by a standard volume indicator." The NAB standard reference
level is described in section 2.03 with a footnote which reads
as follows: "The recording was made... at an output level
8 dB below that which produces 3 percent third-harmonic distortion."
(This is not contradictory to the above statement because it
simply defines a level of tape magnetization which is to serve
as a reference.) So, where do we go from here?
THE AMERICAN REFERENCE FLUX
Fortunately, John McKnight in the United States seemed to have
been bothered by this lack of a precise value for recorded tape
flux. As a consequence, he investigated this situation and prepared
his findings for publication in the Journal of the Audio Engineering
Society.[2] A reference flux of 100 nWb/m is mentioned or suggested
in that investigation, and one reads for the first time 210
nWb/m for the earlier discussed Operating Level and 165 nWb/m
for the NAB Standard Reference Level. Later on, these values
were corrected slightly downward, and from a 1972 data sheet
of a manufacturer of calibration tapes, one can read 185 nWb/m
for the Operating Level and 150 nWb/m for the NAB Reference
Level.
At this point, we should pause to take a closer look at the
units of measurement.
UNITS OF MEASUREMENT
NanoWeber-per-meter is the value of fluxivity that would be
measured if the tape were 1 meter (or approximately 39 3/8 inches)
wide. Reducing this to a more realistic width, namely 1 mm (or
39 mil), the unit became picoWeber-per-millimeter, which was
0.1 milliMaxwell per millimeter in the days before the SI units
came into force. In the case of the NAB Reference Level, the
result is 15 mM/mm, which explains one of the values mentioned
in the title of this article.
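To spell out the arithmetic: 1 milliMaxwell equals 10 picoWebers, so the NAB Reference Level converts as
150 nWb/m = 150 pWb/mm = 15 mM/mm.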
Since we are already doing some calculations, let's look at
the previously mentioned German reference of 200 milliMaxwell
for 1/4-inch tape. If we divide that figure by the nominal metric
width of 1/4-inch tape, which is 6.25 mm (tapes today are 6.3 mm wide),
then we get the figure of 32 mM/mm. Converting this to nanoWebers,
we arrive at the standard 320 nWb/m.
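Written out in full, the chain is
200 mM / 6.25 mm = 32 mM/mm = 320 pWb/mm = 320 nWb/m.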
It may be worth mentioning at this point that in a comparison
of U.S. and European levels one must be aware of the fact that
the ANSI S 4.6 method of measuring remanent flux yields a value
which is lower by 0.8 dB, as compared with a measurement performed
in accordance with DIN 45 520. In practice, this means that
when comparing calibration tapes of U.S. and European origin,
the U.S. tape will yield a higher signal level because what
is 200 nWb/m in the U.S. would measure 220 nWb/m in Europe.
(This also explains the previously cited downward correction
from 165 to 150 nWb/m.)
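As a quick check: 0.8 dB corresponds to a ratio of about 1.1, and indeed 200 nWb/m x 1.1 = 220 nWb/m, while 165 nWb/m / 1.1 = 150 nWb/m.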
STEREO-MONO COMPATIBILITY
After this digression into levels and their history, let's continue
on. Magnetic oxides were improved over the years, making higher
levels of magnetization possible without adversely affecting
distortion performance. This made it feasible to raise the operating
level (0 VU) to 250 nWb/m for the so-called High Output tapes.
In Europe (more precisely in Germany), the advent of stereo
made those exacting engineers reach for their slide rules, because
stereo/mono level compatibility was their goal. Music productions
were already recorded in stereo, yet broadcasts were still in
mono. Such a stereo recording, when played back on a full track
head, did not produce the same signal level as that which resulted
when playing a mono recording; there was some unused, unmagnetized
"land" between the stereo tracks, and left and right
signals were not adding up algebraically. One can live with
reduced cross-talk performance in stereo, so the tracks were
widened until they were spaced only 0.75 mm apart, making each
track 2.75 mm wide. As a result of this, the core sections of
the head spread out at an angle to accommodate the windings.
With this, the Butterfly Head was born (see FIGURE 1).
Figure 1. Butterfly Head
The tape's width was now utilized to the maximum extent possible,
but stereo/mono level compatibility was still not reached. A
few quick calculations and one can see that a stereo recording
has to be modulated to 514 nWb/m in order to produce the same
signal level as that which is obtained from a 320 nWb/m mono
recording when playing the stereo tape on a monophonic reproducer.
Total flux, mono on full-track 1/4-inch (6.25 mm) tape:
320 nWb/m x 6.25 mm = 2,000 pWb
Stereo played on a full-track head, the two 2.75 mm tracks carrying largely uncorrelated signals that add on a power basis (a factor of the square root of 2):
514 nWb/m x 2.75 mm x 1.41 = 2,000 pWb (approximately)
The goal was reached: The fader on the mixing desk did not have
to be moved, regardless of whether mono or stereo recordings
were played! At the time it was a bit strange, perhaps, to see
blank tape appearing on the market which was labelled "stereo,"
though this simply meant that such a tape could be modulated
to the higher stereo level without any increase in distortion.
Stereo/mono compatibility - which is not of much interest anymore
- is thus explained, but what about universal compatibility
of recorded levels in general?
VU VS. PPM AND PEAK FLUXIVITY
In America the VU-meter is still favored while in Europe the
peak program meter (PPM) is predominant. The performance characteristics
of the latter are described and specified in IEC 268-10 and
in DIN 45 506. It is a quick-acting meter, and because of this,
it is also called a "quasi peak-reading meter." However,
as suggested by the word quasi, it is not a true peak-indicating
device. A closer examination of its characteristic behavior
suggests that short modulation peaks may overshoot by 1 to 4
dB.[3]
A graphic presentation (FIGURE 2) of the maximum output level
performance (MOL) of various tapes, including the most modern
oxides, shows how tape performance has improved over the years.
The point of maximum modulation, which is universally considered
to be the level at which the third-harmonic distortion content
measures 3 percent,[4] has shifted gradually to higher flux
values, with 1,200 nWb/m being reached by at least one state-of-the-art
tape. This explains the second figure in the title. Quite a
wide range from the NAB reference of 150 nWb/m via the high-output
operating reference to the German DIN levels for mono and for
stereo, up to the MOL which is possible today.
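Expressed in decibels, the span from 150 to 1,200 nWb/m works out to roughly 18 dB.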
Figure 2. Maximum Output Level (MOL) performance of magnetic recording tapes at 15 ips. O = theoretical peak flux values when aligning a VU-meter or PPM as described in the text. The dashed line (1955) represents the performance of the old U.S. tape if flux values were measured in accordance with DIN.
Attempts to establish round-figure references
have been made repeatedly. For example, there is the EIA standard
RS-400/1972, which contains a reference to CCIR 79-1/1966, at which
time the value of 100 nWb/m was recommended; in more recent
times, one finds 400 nWb/m mentioned in a newer EIA standard.
But all this is of little help to a studio's maintenance engineer
when faced with the decision of how he should calibrate his
level meters. So, in reviewing this historical retrospective, the
conclusion almost suggests itself that 250 nWb/m (or even
320 nWb/m) would be a good reference for calibrating a VU-meter
to its 0 VU reference deflection, as it would allow the modulation
peaks to reach up to 800 or 1,000 nWb/m. In the case of a quasi
peak-reading meter or PPM, however, the 510 nWb/m (or 500 for
simplicity's sake, being twice 250) would be an equally good
reference because its assumed 4 dB overshoot would again result
in a peak magnetization in the range of 800 nWb/m, still well
below the accepted MOL of 3 percent third-harmonic distortion.
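The arithmetic behind these figures: a 10 dB lead above 250 nWb/m corresponds to about 790 nWb/m, and above 320 nWb/m to about 1,010 nWb/m, while a 4 dB overshoot above 510 nWb/m corresponds to about 810 nWb/m - in every case below the 1,200 nWb/m MOL of today's best tape.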
It's up to the individual engineer's discretion, of course,
as to how hard he intends to drive his tape into saturation.
It should be borne in mind, however, that for every dB gained
in signal-to-noise ratio, one must pay with a disproportionate increase
in distortion, a fact which was discovered long ago by a pioneer
in the development of new recording techniques.[5]
Analog recording may still be around for a while, and so it
is hoped that useful conclusions can be drawn from this article
which help to ensure that the inherent quality of analog is
not given away unwisely, as may all too often be the case.
References
1. Krones, Dr. F. "Herstellung und elektroakustische Eigenschaften der AGFA Magnetbänder, Filme und Bezugsbänder." Sonderdruck aus den Forschungslaboratorien der AGFA Leverkusen, Band I (Nov. 1955), Seite 304.
2. McKnight, John G. "Absolute Flux and Frequency Response Characteristics in Magnetic Recording." Preprint 447, 31st AES Convention, Oct. 1966. Published in revised form: Journal of the SMPTE, Vol. 78 (June 1969), pp. 457-472.
3. Silver, Sidney L. "VU-Meters vs. Peak Program Meters." db (Jan. 1980), pp. 46-49.
4. DIN 45 511; IEC Draft 94-5; NAB Magnetic Recording and Reproducing Standards (1965), Section 2.11: Distortion.
5. Langevin, Robert Z. "Intermodulation Distortion in Tape Recording." Journal of the AES, Vol. 11 (July 1963), pp. 270-278.
from: db magazine, July 1984, pp. 36-38
Output Level vs. Flux (taken from the Studer 820 Service Manual)