Informal History of HiFi

You don't know where you're going until you know where you've been.

-- Myself

 

In the Beginning was the Wax Cylinder

The earliest method of sound reproduction was developed by Thomas Edison as an outgrowth of another project: storing the semi-digital Morse code of the telegraph. The device consisted of a metal disk with holes drilled in it to encode the "dits and dahs" of a message for mass distribution. Consider it the first hard drive. While experimenting with this device, Edison accidentally connected its drive motor to a higher voltage than it was designed for. The much higher speed of the disk caused its contact point to vibrate with a whining sound. That mistake with the drive motor's voltage was the beginning of sound reproduction, as improbable as that seems.

One thing about Thomas Edison: he was very weak when it came to theory. His dictum, "Genius is one percent inspiration and ninety-nine percent perspiration," summed up his entire approach. This was the diametrical opposite of Nikola Tesla, whose philosophy was that if it didn't work right the first time, the designer had not done his job. A look at Edison's work shows that he often relied on all sorts of "folk wisdom". What Edison lacked in theoretical understanding, he more than compensated for with sharp observational skills. Very little ever escaped his attention, and Edison drew new ideas from even the most mundane observations. Thus it was with the telegraphic "hard drive" and its humming contact points. What anyone else would have either not noticed, or dismissed and promptly forgotten, Edison not only noticed but developed into an idea: that if a contact point and a metal disk could make sound, then a similar arrangement just might be able to record and reproduce sound, and not just "ones 'n' zeros".

For this new experiment, a cylinder was cut with fine threads and attached to a crank handle, and a carrier on a screw of the same pitch ran over the cylinder. The carrier held a horn that drove a thin metal diaphragm with an attached stylus. While the handle was cranked, sound entering the horn moved the stylus to make indentations in a thin metal foil wrapped around the cylinder. The recording could be played back simply by reversing the process. Except for a minor faux pas that ripped the first sheet of foil, the device worked astonishingly well, and played back Edison's recitation of Mary Had a Little Lamb. Much attention followed, and this became the basis for many a dictating machine. The metal foil cylinders were replaced by hard wax (for "Dictaphones") and later by the shellac platter, the latter evolving into the "Victrola" record player. The recording and playing of music was, of course, purely mechanical. For the next couple of decades, recording involved the musicians playing into huge horns that directly drove the cutter head. Playback was via a steel needle attached to a heavy hollow tone arm that conducted the sound to a wooden horn. The turntable was driven by a powerful coil spring and a clockwork. Sound quality left much to be desired: there was no compensation for frequency response, no compensation for wow and flutter, and the groove had a large pitch to accommodate the large excursions of the cutter head. You didn't get much at either end of the audio band: limited treble and bass, even with the 78rpm standard that would persist for the next half-century.

Audio Goes Electronic: Progress Never so Rapid

The next advance is also indirectly due to Edison. Two years later, Edison began work on the incandescent light. Here, the A Number One problem was the need for something which is conductive, can be brought to a dazzling white heat without melting, and won't instantly burn up. At first, Edison tried platinum wires and foils. Platinum is barely adequate for the task (refractory metals such as molybdenum, tungsten, or osmium weren't available). It could work on the edge of melting and give an adequate light, but there was still the problem of burn-outs in an oxygen atmosphere. To solve that problem, Edison tried working the platinum filaments in a vacuum. The old style mechanical hand pumps could take the pressure down to one millimeter of mercury, good enough for a proof of concept, but still not good enough for a practical lamp with a decent lifetime. To solve this problem, Edison could have used an inert atmosphere. If the bulb was pumped down, baked out, and then filled with hydrogen two or three times, the platinum filament lamp would have worked, since hydrogen is inert to platinum. The poor quality of the piston hand pumps wouldn't have mattered, since three pump-downs with a hydrogen fill would get the original atmosphere down to an effective pressure of 2.3 × 10⁻⁹ mm Hg (more than good enough). However, Edison became fixated on vacuum light bulbs. Since platinum is not so refractory, and even the improved Sprengel vacuum pumps were not very good, Edison turned to graphite filaments made from the pyrolysis of natural fibres, mainly bamboo.
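That pressure figure is simple dilution arithmetic: each pump-down-and-refill cycle reduces the remaining fraction of original air by the same ratio. A quick sketch of the calculation (assuming 760 mm Hg atmospheric pressure and a hand pump that bottoms out at 1 mm Hg -- figures implied, not spelled out, above):

```python
# Residual partial pressure of the original air after flushing a bulb
# with hydrogen. Assumptions: atmospheric pressure is 760 mm Hg, and
# the hand pump bottoms out at 1 mm Hg.
ATMOSPHERE = 760.0  # mm Hg
PUMP_LIMIT = 1.0    # mm Hg

def residual_air(flushes: int) -> float:
    """Air partial pressure (mm Hg) left after `flushes` pump-and-refill
    cycles followed by a final pump-down to the pump's limit."""
    air_fraction = (PUMP_LIMIT / ATMOSPHERE) ** flushes
    return PUMP_LIMIT * air_fraction

print(f"{residual_air(3):.1e} mm Hg")  # 2.3e-09 mm Hg, the figure above
```

Each flush leaves only 1/760th of the previous air behind, so three flushes give (1/760)³ of a millimeter of mercury: hard vacuum performance from a mediocre pump.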

Here, the natural tendency would be to declare the graphite filament lamp a rousing success and be done with it. However, Edison was not one to do that, and so research continued for improvement. There was one remaining problem: graphite evaporation in the high vacuum led to sooty deposits on the glass. Since these lamps ran on DC, Edison discovered that graphite was migrating from the negative end of the filament to the positive end. This led him to try to repel these deposits back to the filament by including a metal plate. Edison observed that current would flow between the filament and the plate if the plate was positive, but not if it was negative. This was called the "Edison Effect", since the electron had not yet been discovered as the source of negative charge. Uncharacteristically, Edison did nothing with this discovery. Of course, since this was still the early 1880s, no one saw any use for what was simply a laboratory curiosity. This is indeed unfortunate, since Maxwell had already done field calculations on a scenario where free negative charge is drawn to a positive target with a grid of negative wires interposed. Maxwell realized that this could give amplification, and even identified what would later become known as "amplification factor". We use the lower case Greek letter mu as its symbol because that's the symbol Maxwell used. Typical for those times, those who theorized, like Maxwell, and those who did, like Edison, didn't talk to each other.

Meanwhile, Hertz was busy discovering radio waves, and Marconi busy exploiting the discovery for practical purposes. This led to wireless telegraphy, and the need for a decent detector.

In 1904, Sir John Ambrose Fleming used a device very similar to Edison's plate in a light bulb for the purpose of detecting radio waves by means of the "Edison Effect". Edison sued Fleming for ripping off his light bulb patents. Of course, he lost, since the purpose of the Fleming Valve was quite different, and Edison had mentioned no such possibilities when he filed for his patents. By this time, the crystal detector had been discovered, and was already in wide use. The Fleming Valve was a bit more complex, since it required a power source for the filament, and it really didn't work all that much better since, like the crystal detector, it was a passive detector without amplification. Two years after that, Lee DeForest added a third electrode to the basic Fleming Valve -- a control grid -- which made an amplifier, as opposed to just a passive detector. Still, DeForest had a long uphill climb, since this "Audion", as he termed it, was even more complex than a Fleming Valve, and DeForest's basic grid leak detector didn't work all that much better, since the original Audion didn't give very much gain. The DeForest Audion would have been totally dead on arrival were it not for Edwin Armstrong. Lee DeForest was strictly a legend in his own mind, and pretty much of a moron -- the blind squirrel who happened to luck into a nut. Armstrong rescued DeForest's invention with an invention of his own: the regenerative detector. This gave the crude Audion enough sensitivity to beat the performance of the much easier and simpler crystal detector by enough of a margin that scientists and engineers took notice. Among those who took that vital interest were Bell Labs and the Radio Corporation of America, the latter being set up as a "skunk works" for the military during WW I. At this time, Armstrong worked out the characteristics for vacuum tubes we are all familiar with today: rp, gm, and mu. To say DeForest was miffed is quite the understatement.
It got no better when DeForest won a lawsuit against Armstrong on a technicality. Armstrong, being the good sport, went to the board of directors of the Institute of Radio Engineers in an attempt to turn over the award it had given him for inventing the regenerative detector to DeForest. By a unanimous vote, the board ruled that Armstrong would keep his award, court ruling or no court ruling. This affront made DeForest, who had already embarrassed himself in court by demonstrating his ignorance of his own invention over some ten hours of testimony, even madder than before.
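The three small-signal parameters Armstrong codified -- plate resistance rp, transconductance gm, and amplification factor mu -- are tied together by the classic identity mu = gm × rp. A minimal sketch, with illustrative numbers roughly in the range of a medium-mu audio triode (not taken from any specific tube's data sheet):

```python
# The classic triode small-signal identity: mu = gm * rp.
# The numbers below are illustrative only, roughly typical of a
# medium-mu audio triode.
gm = 2.6e-3   # transconductance, amps per volt (2.6 mA/V)
rp = 7700.0   # plate resistance, ohms

mu = gm * rp  # amplification factor (dimensionless)
print(round(mu, 1))  # -> 20.0
```

Any two of the three parameters determine the third, which is why a data sheet need only quote two of them at a given operating point.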

None of this bothered Armstrong, who simply went ahead with more and better inventions, including the superheterodyne receiver that took out a lot of the finickiness of operating a regen receiver. This led to a user friendly device with mass appeal.

By the 1920s, commercial AM broadcasting became a reality, as did the use of vacuum tube "line amps" for telephone communications. In neither case was audio quality a high priority. Telephone lines were limited to a high end of 8.0KHz, and AM, at 10KHz, wasn't much better. At this time, preliminary investigations by Bell Labs showed that listeners actually judged any sort of HiFi-quality bandwidth undesirable when the bandwidth of the signal didn't match that of the amp. Singers even originated a singing style called "crooning" specifically to sound good over narrow band AM. The question of sonic quality bothered Armstrong, even if the AM broadcasters didn't care. This led Armstrong to develop FM, with its wider bandwidth and much reduced noise from operating at much higher frequencies. A demonstration got lots of attention and acclaim, but the established AM broadcasters weren't interested in changing how they did business, and so FM didn't take off right away.

There was, however, another industry which developed a serious stake in sonic excellence: the movies. By the late 1920s to early 1930s, the movie industry had progressed from silent films to "talkies". At this time, the major studios controlled everything, from the performers to the directors, and even ran their own theatres. It quickly became obvious that sound quality was a big draw that brought audiences into theatres. The optical sound track was also inherently superior to 78rpm disks. The demand for improved sonic quality from such a big and important industry spurred RCA and Western Electric into competition for the industry's dollars and its need for vastly better sound equipment. In 1932, the 2A3 was introduced. This is one of the premier audio triodes, and has been in production ever since. The 300B came along a few years later. In 1936, the 807 RF final and its audio cousin, the 6L6, were introduced. 1936 was also the year that Armstrong introduced HiFi FM broadcasting.

From spark and crystal to commercial broadcasting and FM in just about twenty years. That's amazing progress when you think about it.

Stereo and the Golden Age of Hi-Fi

Alan Blumlein had patented stereo back in 1931, and 1940 saw the release of Fantasia with multi-channel audio: the first movie to feature it. Western Electric, Decca, EMI, and RCA became major players in the development of stereo. The involvement of the USA in WW II put an end to audio development as engineers went into military projects: the first radar installations and military communications. After the war, the best engineers would segue into the Cold War effort and the up-and-coming aerospace industry. Still, there was one new development in audio: the Williamson of 1948. The Williamson was designed to showcase the new audio final, the KT66.

The Williamson was the first amp to feature high open loop gain and a very large feedback factor. Up till then, most amps didn't use negative feedback at all, or used it sparingly. The unheard-of THD measurements of the Williamson also began the race for lower and lower THD numbers that continues to this very day. That race wasn't necessarily a good thing. However, speaker design was still very much a "black art" with no real engineering going into it, and as a result, the speakers of the 1940s and 1950s tended to have sloppy bass and frequency-dependent misbehaviours. The large feedback factor of the Williamson allowed for a much lower output impedance that could tame those wild, unpredictable speakers. The problem here is that THD figures make for marketing department "braggin' points", but correlate very poorly with listener satisfaction. Norman Crowhurst proposed a system that weighted the harmonic content, placing more emphasis on the nasty, dissonant high order harmonics, and relatively less on the less offensive low order harmonics. The industry would have no part of that. The end result we are all familiar with: sound-alike "Big Box" systems.
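The taming effect of a large feedback factor can be sketched with the standard negative feedback relations: with open loop gain A and feedback fraction β, both the gain and the output impedance are divided by (1 + Aβ). The figures below are illustrative, not the Williamson's actual specifications:

```python
# Standard negative feedback relations. Closed-loop gain and output
# impedance are both reduced by the factor (1 + A*beta).
# Illustrative numbers only -- not the Williamson's actual specs.
A = 1000.0        # open loop voltage gain
beta = 0.02       # feedback fraction
zout_open = 50.0  # open loop output impedance, ohms

loop_factor = 1.0 + A * beta           # 21.0
gain_closed = A / loop_factor          # ~47.6
zout_closed = zout_open / loop_factor  # ~2.4 ohms

print(round(gain_closed, 1), "gain,", round(zout_closed, 2), "ohms out")
```

That drop in output impedance is exactly what damped the sloppy bass of 1940s speakers: the amp's low source impedance electrically brakes the woofer cone instead of letting it flop about.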

Ampex introduced an improved magnetic tape recorder, based on pre-war German research. This deck used tape speeds of 30ips or 15ips, which allowed the making of some excellent masters. To go along with this, Columbia released their twelve inch, 33-1/3rpm "LP" disk made of vinyl -- much more durable and less noisy than shellac disks. RCA came out with a seven inch, 45rpm disk (soon to be dubbed the "single"). Both used the new "microgroove" technology that allowed more playing time than you got with 78s. (At this time, most LPs were for classical and jazz. Rock 'n' Roll was still a niche market, and most Rock was released on mono 45s made from waste vinyl from LP pressings. The Rock LP "concept album" didn't appear until the mid-1960s.) FM finally came into its own as the "good music" radio format. These FM stations were largely subsidized by the more profitable AM franchises. So far as broadcasting goes, FM attained some excellent sonic performance. This period would come to be called the "Golden Age of HiFi". Some aspects of this "golden age" weren't so swell, the worst being the competition for ever lower THD figures. This meant a demand for higher and higher negative feedback factors, requiring higher open loop gains, and the tendency to move away from the more linear medium gain triodes to high gain triodes and small signal pentodes with less favourable distributions of high order harmonics in their distortion spectra. Still, the era produced some enduring products. The humble Dynaco Stereo 70 sold some 500,000 units at a pretty reasonable price, not audiophile expensive. There is still a demand for these, and a good sample will sell now for more than it did then, even in inflation-adjusted dollars. When was the last time you heard of anyone wanting a solid state amp from the early 1970s, or a circa 1985 CD player?

Solid State and the Great Regression

In 1964, the first solid state audio equipment became available. This was sold as the "wave of the future". Back in those days, people still believed in the future as something other than a back-drop for visions of dystopia. Vacuum tubes were "Fred Flintstone" whereas solid state was "George Jetson". The reality was much different from the marketing hype. These solid state amps were not reliable; the germanium power transistors blew at the drop of an electron. The sonic performance was simply gawdawful, and frequently made even worse by the circuitry required to protect those delicate finals. (When tripped, either by a peak transient in the music or by excessively reactive speaker loads, these protect circuits lent a raspy quality to the sound. It sounded as ugly as "raspy" sounds when pronounced.) There has never been an active device as linear as the old fashioned triode, and transistors are far from that standard. (This was recognized as long ago as the early 1950s, and the quest for a truly linear solid state device continues to this very day.) To force the highly nonlinear transistors into linearity required heretofore unheard-of levels of negative feedback. This, in turn, led to an entirely unfamiliar phenomenon: transient intermodulation distortion (TIM). The high feedback factors made the design of a product that's reliably stable very difficult. An amp that tested fine at the factory might go into self-destructive oscillation when connected to the user's randomly chosen speakers. It would take another twenty years to get solid state sonics back to what you could expect from just a run-of-the-mill GAOHF VT amp. The marketing hype machine went into overdrive in order to sell the public these sonic abominations.

There were a couple of good developments to come from this time. Back in 1963, Neville Thiele had come up with a new idea for speaker design. It was his idea to consider the speaker system as a high pass filter, and to apply the same analytical techniques: the second and fourth order Butterworth or Chebyshev characteristics. Since the original idea appeared in some "backwater" Australian magazine, it was largely forgotten until about 1971. When rediscovered, the Thiele method allowed a systematized design for speaker systems. No longer a "black art", speakers could be designed to spec. This made it much easier to design without squirrelly cross-overs and/or any unusual, frequency-dependent surprises. Another positive development was Walter Jung's investigation into the sonic performance of capacitors. Up till that time, no one really gave the matter of passive components much thought. Of course, factors such as ESR, DA, and DF were all well-known. However, this was mainly of interest to designers of VHF and UHF equipment; it wasn't considered relevant to frequencies as low as the audio band. Jung also discovered that capacitors could be microphonic, and that they could even self-excite at their natural resonant frequencies. All of these factors can -- and do -- have an influence (and not for the better) on sonic performance. It does matter what you include in the signal path.
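Thiele's insight -- treat the box-and-driver combination as a high pass filter -- means the low-end rolloff can be computed rather than guessed at. A sketch of the magnitude response of an nth-order Butterworth high pass (the 40Hz cutoff is an arbitrary example, not a figure from the text):

```python
import math

# Magnitude response of an nth-order Butterworth high pass filter,
# the characteristic Thiele applied to vented speaker boxes.
def butterworth_hp_db(f_hz: float, fc_hz: float, order: int) -> float:
    """Response in dB relative to the passband."""
    ratio = (f_hz / fc_hz) ** order
    return 20.0 * math.log10(ratio / math.sqrt(1.0 + ratio * ratio))

fc = 40.0  # illustrative box cutoff frequency, Hz (an assumption)
for f in (20.0, 40.0, 80.0, 160.0):
    print(f"{f:5.0f} Hz: {butterworth_hp_db(f, fc, order=4):+7.1f} dB")
```

At the cutoff frequency the response is down exactly 3dB -- the defining property of the Butterworth alignment -- and an octave below it a fourth-order box is already down about 24dB. Designing to a curve like this is what turned speaker building from black art into engineering.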

Through the 1970s, solid state power levels crept higher and higher, while speaker sensitivity crept lower and lower. Both were necessary, since transistors clip hard and fast, and the feedback required to get linear performance out of them makes the clipping behaviour even worse. You need those enormous power levels to keep your solid state equipment out of clipping, and less sensitive speakers also help to cover up the solid state nastiness. Worse was yet to come.

The Digital Devolution

As with everything else, the CD was introduced with much fanfare and hype. This new format promised perfect digital sound forever. That was nonsense. To be sure, CDs are somewhat more durable than vinyl platters. A CD player can compensate for the occasional glitch caused by a scratch in ways that a record player can not. Beyond that, every analog recording method suffers deterioration. That black fluff you periodically clean from your stylus, the brown gunk you clean off the record heads -- all of that was worn away when you played the LP or tape. That's where your high frequencies went. Eventually, that LP or tape isn't worth listening to due to this slow degradation. With digital, you don't have that problem. So long as the playback can distinguish a "one" from a "zero", the playback will be as good as the original every time.

However, CDs have some other enormous problems. First of all, the promise of no distortion is bogus. There is no such thing as a perfectly linear ADC or DAC. You simply substitute one nonlinear transfer characteristic for another. It's only a question of how nonlinear it is.
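Even a mathematically ideal converter has a staircase transfer characteristic, and its quantization error sets a floor that real converters only degrade further. A sketch of the ideal 16-bit numbers (the 2V full-scale range is an arbitrary assumption):

```python
# Ideal 16-bit quantizer: step size, and the textbook best-case
# signal-to-noise ratio (6.02*N + 1.76 dB for a full-scale sine).
# Real ADCs and DACs only approach these figures; their transfer
# curves are never perfectly linear. The 2V range is an assumption.
BITS = 16
FULL_SCALE = 2.0  # volts, peak to peak

step = FULL_SCALE / (2 ** BITS)  # smallest code step, volts
snr_db = 6.02 * BITS + 1.76      # ideal SNR for a full-scale sine

print(f"step = {step * 1e6:.1f} uV, ideal SNR = {snr_db:.2f} dB")
```

Roughly 30 microvolts per step and a 98dB ceiling -- and that's the theoretical best case, before differential nonlinearity, clock jitter, and the analog stages take their cut.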

The other problem lies with the 44.1/16 CD standard: a sample rate of 44.1KHz and 16 bits per sample. Since it requires a minimum of two samples per cycle to recreate a waveform, the upper frequency limit is 22.05KHz. Any frequencies above that "fold-over frequency" will cause "aliases": frequencies within the passband that weren't part of the original analog signal. Even though 22.05KHz sounds pretty good, in practice it isn't, since there is no such thing as a perfect lowpass filter. The high frequency rolloff has to start much sooner than that if appreciable aliasing is to be avoided. Lowpass filters with sharp skirt selectivity also have some nasty phase behaviours around their cutoff frequencies. This means that CDs lack the upper end that tape and vinyl have. Add to this the fact that sound engineers needed time to relearn how to master for CDs, and it's no wonder that early CDs sounded decidedly inferior to the analog methods they were meant to replace. For a long time, vinyl simply sounded better. Unfortunately, vinyl all but disappeared by the early 1990s. The sample rate was just too low, but the digital technology of the times wasn't up to doing much better. Played through already lousy solid state equipment, it probably didn't make much difference, but played through a really good vacuum tube amp, many of those early, and some later, CDs sound decidedly "off".
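The fold-over arithmetic itself is simple: after sampling, a tone lands at its distance from the nearest multiple of the sample rate, reflected into the 0 to fs/2 passband. A sketch of that standard aliasing relation (the example tones are arbitrary):

```python
# Where a tone lands after sampling at rate FS: it aliases to its
# distance from the nearest multiple of FS, folded into [0, FS/2].
FS = 44_100.0  # CD sample rate, Hz

def alias(f_hz: float) -> float:
    """Apparent frequency of a tone f_hz after sampling at FS."""
    f = f_hz % FS
    return f if f <= FS / 2 else FS - f

print(alias(21_000.0))  # below the fold-over frequency: unchanged
print(alias(25_000.0))  # above it: folds down to 19100.0 Hz
```

A 25KHz ultrasonic component the anti-aliasing filter fails to remove doesn't vanish; it reappears at an audible 19.1KHz, which is exactly why the filter's rolloff has to start well below 22.05KHz.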

Bad as that was, it gets even worse. The MP3 format is what's called a "lossy" file compression technique. For some files, you want the compression to conserve space, but can't afford to lose any data: a text file with random missing letters becomes hard to read, and a binary won't run if some of its bits are missing. With a sound file, it was decided that not all bits were necessary, and so some were thrown out, usually the low level information. It was figured that near-subaudible components, and components at the extreme upper and lower audio frequencies, wouldn't be missed. That turned out to be wrong. It does make a difference, and it now seems that this low level information is critical for determining sound distance and direction (i.e. sound stage) and emotional impact. Losing that information makes MP3s sound lifeless and unemotional. And to think that some people actually expect you to pay for MP3s! Here's an indication of just how much worse sound quality can get: The Death of High Fidelity. Despite the advance of the technology, sound quality is taking great strides backwards. At the rate they're going, it won't be too long before the wind-up Victrola and shellac disks simply sound better. Yuck!

Equally aggravating in this regard is that the listener will hear just how "off" these modern mastering techniques sound, and guess where the blame will fall? If you said the amp, go to the head of the class.

The Japanese DiY Revolution

By 1990, the Japanese audio magazine MJ was running articles about Western Electric 300Bs, RCA 45s, and Ampex 845s, in a rediscovery of a "forgotten" technology that had long been relegated to history. Meanwhile, back in the US of A, the audiophile press had degenerated into a total mess. It was more about getting people out there to spend the $BIG BUX on the latest Hot Item, which invariably promised to be even better than last issue's Hot Item. How sound equipment actually sounded was totally irrelevant. It was -- and is -- total nonsense. Naturally, there was no mention of, let alone comparisons with, GAOHF equipment. Firstly, because forty year old equipment didn't bring in the advertiser dollars, and secondly, because it would be highly embarrassing. So wealthy Japanese who made their fortunes selling gullible Americans mass produced, overpriced, shoddy products took to cruising Akihabara (a district of Tokyo inhabited by quirky individuals like the otaku, and home to small shops dealing in quirky products, including 1950s era American consumer electronics) for vintage electronics.

Not only that, but Japanese DiYers were coming up with their own new designs. By a fortunate turn of events, the Soviet military had decided to stick with vacuum tube technology: vacuum tube equipment isn't destroyed by the EMP of a nuclear detonation, nor do high levels of background radiation interfere with a vacuum tube's operation. The Soviets kept up military production, which subsequently became available to Japanese audiophiles when the Soviet Union imploded, ending the Cold War. Of course, the prices for NOS VTs went through the roof, headed for deep space. The "bug" spread through Asia, landed in Europe, and finally made its way to the USA.

So we're right back to the beginning. The circle is complete.


Copyright © 2009 All rights reserved