Noel Borthwick [Cakewalk]
Great article on The Science Of Sample Rates that discusses the pros and cons of high sample rates.
It's long but well worth the read.
Ok, let me explain further why I don't think this is really such a great article.
Now in 2013, the 16/44.1 converter of a Mac laptop can have better specs and real sound quality than most professional converters from a generation ago, not to mention a cassette deck or a consumer turntable. There’s always room for improvement, but the question now is where and how much?
I've already explained my critique of the above passage in several earlier posts here, but to make it perfectly clear (because the author of the article certainly failed to pick up on this point): anyone with a recent-vintage PC or Mac that has an onboard "High Definition Audio" ("Intel HDA") codec chip is already equipped to play back 24-bit/96k and 24-bit/192k digital audio, and can therefore evaluate for themselves the assertions made later in the article about high sampling rates being "harmful" to audio quality, even if their audio interface lacks 96k or 192k sampling.
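(For anyone who wants to check their own machine: a minimal Python sketch that asks the default output device whether it will open at 96k or 192k. The use of the third-party sounddevice package is my choice for illustration, not anything from the article.)

# Probe the default output device for high-sample-rate support.
# Requires the third-party "sounddevice" package (pip install sounddevice).
import sounddevice as sd

for rate in (44100, 96000, 192000):
    try:
        sd.check_output_settings(samplerate=rate)  # raises if unsupported
        print(rate, "Hz: supported by the default output device")
    except Exception as exc:
        print(rate, "Hz: rejected -", exc)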
Technology always advances and today, external clocking is far more likely to increase distortion and decrease accuracy when compared to a converter’s internal clock. In fact, the best you can hope for in buying a master clock for your studio is that it won’t degrade the accuracy of your converters as you use it to keep them all on the same page.
There are however, occasions when switching to an external clock can add time-based distortion and inaccuracies to a signal that some listeners may find pleasing. That’s a subjective choice, and anyone who prefers the sound of a less accurate external clock to a more accurate internal one is welcome to that preference.
The above-quoted passage shows that the author misunderstood and misconstrued the SOS magazine article "Does Your Studio Need A Digital Master Clock?" to which he linked.
The author's statements that some listeners find the time-based distortion and inaccuracies added by switching to an external clock pleasing, and may even prefer them, apparently stem from these two remarks in the linked SOS review:
SOS
So, although sonic differences may be perceived when using an external clock as compared to running on an internal clock, and those differences may even seem quite pleasant in some situations, this is entirely due to added intermodulation distortions and other clock-recovery related artifacts rather than any real audio benefits, as the test plots illustrate.
Overall, it should be clear from these tests that employing an external master clock cannot and will not improve the sound quality of a digital audio system. It might change it, and subjectively that change might be preferred, but it won’t change things for the better in any technical sense.
What I found seriously wrong with this portion of the article is that the author misunderstood (or simply failed to grasp) the cause of the problem he was writing about (that external clocking may cause some converters to distort), a cause which was explained in the linked SOS article:
SOS
So even though a very good-quality external word clock is being supplied here, the performance of the A-D converter becomes noticeably (and audibly) worse than when running on its internal clock.
This is not an unusual situation by any means, and the reduction in audio quality is not related to the supposed quality of the reference clock source either...
Moreover, the implication is that the A-D converter’s external clock-recovery circuitry has a far more significant effect on the A-D’s performance than the quality or precision of the external reference clock source.
...it is certainly possible to synchronise an A-D to an external clock without affecting its performance, but that it takes a skillfully designed and manufactured clock-recovery system to do it.
Namely, the author failed to grasp that even though the internal clocking accuracy of converters has improved, the distortion most of the converters tested by SOS produced when clocked externally was not actually due to any lower relative accuracy of the external master clocks under test; it was due to deficiencies in the converters' own external clock extraction/recovery (i.e., slaving) circuitry, even when clocked by a more accurate, lower-jitter external master clock!
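(To make the timing-versus-amplitude point concrete: sampling at slightly wrong instants converts clock error directly into signal error, and the damage grows with signal frequency. A rough numpy sketch; the 2 ns RMS jitter figure is an arbitrary illustrative value of mine, not a measurement from SOS.)

import numpy as np

fs = 96_000                    # nominal sample rate, Hz
f = 10_000                     # test tone, Hz
t_ideal = np.arange(fs) / fs   # one second of ideal sample instants

rng = np.random.default_rng(0)
jitter_rms = 2e-9              # hypothetical 2 ns RMS error in the recovered clock
t_jittered = t_ideal + rng.normal(0.0, jitter_rms, size=t_ideal.shape)

clean = np.sin(2 * np.pi * f * t_ideal)
jittery = np.sin(2 * np.pi * f * t_jittered)   # what a jittery ADC captures

err_rms = np.sqrt(np.mean((jittery - clean) ** 2))
print("measured error floor: %.1f dB" % (20 * np.log10(err_rms)))
# Small-signal prediction: 2*pi*f*jitter_rms/sqrt(2), about -81 dB here.
print("predicted:            %.1f dB" % (20 * np.log10(2 * np.pi * f * jitter_rms / np.sqrt(2))))

The point being: the error depends on the timing the converter actually samples with, i.e., its own recovered clock, not on how accurate the reference box upstream is.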
Moreover, I feel the author blew the SOS review's remarks out of proportion. Where SOS stated only that external-clocking-caused distortion might be found pleasant, the author went on to suggest on his own that some listeners might prefer it and were welcome to that preference, and then framed external clocking distortion as one of many subjective choices, while entirely failing to note that the distortion SOS found when using external clocking was always very small and might not even be audible, as the SOS article clearly pointed out:
SOS
It’s important to take on board that in all of the above examples, where there was an increase in noise and distortion when running on an external clock, the change was always very small, and arguably even negligible in some cases. Without superb monitoring conditions these subtle changes might be inaudible, and would certainly be much less significant than, say, a sub-optimally placed microphone as far as the overall quality of a recording is concerned.
The author's casting of the distortion caused by external clocking as a "subjective preference" struck me as a rather bizarre framing of the problem revealed, and made me wonder whether he understood that some people, such as anyone producing for film/video, might actually need to slave to external clocks at all times, solely as a matter of overriding practical necessity rather than out of any subjective preference for the sound of distortion, as SOS had pointed out:
SOS
The only situation where a dedicated master clock unit is truly essential is in systems that have to work with, or alongside, video, such as in music-for-picture and audio-for-video post-production applications. It’s necessary here because there must be a specific integer number of samples in every video picture-frame period, and to achieve that, the audio sample rate has to be synchronised to the picture frame rate. The only practical way to achieve that is to use a master clock generator that is itself sync’ed to an external video reference, or which generates a video reference signal to which video equipment can be sync’ed.
...
Moreover, the audible problems of not synchronising multiple digital devices together correctly are far worse than the very small potential increases in noise and distortion that may result from forcing an A-D to slave to an external reference clock.
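(The arithmetic behind that integer-samples-per-frame requirement is easy to verify; a quick Python check, the frame rates being common examples of my choosing:)

from fractions import Fraction

frame_rates = {"24 fps (film)": Fraction(24),
               "25 fps (PAL)": Fraction(25),
               "29.97 fps (NTSC)": Fraction(30000, 1001)}

for fs in (44100, 48000):
    for name, fps in frame_rates.items():
        spf = Fraction(fs) / fps   # samples per picture frame
        ok = "integer" if spf.denominator == 1 else "NOT integer (%.3f)" % float(spf)
        print("%d Hz @ %-17s %s" % (fs, name + ":", ok))

48kHz at 29.97 fps, for instance, works out to 1601.6 samples per frame, which is exactly why the audio and picture clocks have to be locked to a common reference rather than left free-running.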
In this light, the author's next paragraph:
This is a theme that we find will pop up again and again as we explore the issue of transparency, digital audio, sampling rates, and sound perception in general: Sometimes we do hear real, identifiable differences between rates and formats, even when those differences do not reveal greater accuracy or objectively “superior” sound.
revealed to me that the author, in taking the remarks from the linked SOS external clocking review and shaping them to fit the theme of his article, had missed the real technical significance of the SOS review and misconstrued it. In fairness, the author did point out that converters are more likely to perform better when internally clocked and may distort when externally clocked, but that was the only thing he accurately related from the SOS review.
Namely, the SOS reviewer's remarks that some people might prefer such converter distortion, and his pointing out that it was atonal IM distortion and thus not actually musical, were offered as a (perhaps sarcastic) warning to anyone preferring certain converters for their "warm" distortion feature (e.g., as offered by Lavry among others). Had the author grasped that, he could have ridden that subjective-preference horse home as well, or instead.
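(For anyone wondering why IM distortion is "atonal": unlike harmonic distortion, its products land at sum and difference frequencies that bear no musical relationship to the inputs. A tiny numpy/FFT sketch; the tone pair and the mild second-order nonlinearity are my own arbitrary choices:)

import numpy as np

fs = 48_000
t = np.arange(fs) / fs
f1, f2 = 1_000, 1_300                      # two input tones, Hz
x = 0.4 * np.sin(2 * np.pi * f1 * t) + 0.4 * np.sin(2 * np.pi * f2 * t)
y = x + 0.1 * x ** 2                       # hypothetical 2nd-order nonlinearity

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
print(freqs[spectrum > 1e-4])
# Besides DC, f1, f2 and the harmonics 2*f1 and 2*f2, energy appears at
# f2-f1 = 300 Hz and f1+f2 = 2300 Hz - products unrelated to either tone.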
In summary, the following passage:
There are however, occasions when switching to an external clock can add time-based distortion and inaccuracies to a signal that some listeners may find pleasing. That’s a subjective choice, and anyone who prefers the sound of a less accurate external clock to a more accurate internal one is welcome to that preference.
was not only technically incorrect (the external clocks were more, not less, accurate than the internal clocks, and inaccuracy of the external clocks was not the cause of the problem) but also made it seem that the choice to use external clocking is merely a matter of preferring the distortion it can produce, revealing both a lack of knowledge on the author's part and a misconstruing of the SOS reviewer's remarks.
Next we come to this:
Designers can oversample signals at the input stage of converter and improve the response of filters at that point. When this is done properly, it’s been proven again and again that even 44.1kHz can be completely transparent in all sorts of unbiased listening tests.
The problem I have with this part of the article is that the AES journal "engineering report" to which the author linked did not relate to oversampling, nor did it conclusively prove "that even 44.1kHz can be completely transparent" as the author alleged; and in any event, serious doubts surround the validity of the reported test results.
The test described in the JAES report, which has become known as the "Boston Audio Society Double-Blind Test" (BAS DBT; full text here and further info here), evaluated whether listeners in a double-blind test could discriminate DVD-A/SACD content from the same content passed through a 16/44.1kHz A/D/A "bottleneck" (a CD recorder with realtime monitoring) during playback, as described in the report:
JAES
This engineering report, then, describes double-blind comparisons of high resolution stereo playback with the same two-channel analog signal looped through a 16/44.1 A/D/A chain
It should be noted that there was no actual testing of any 44.1kHz source content (e.g., no CD-DA content); rather, a 16/44.1k A/D-D/A chain (the CD recorder's monitoring function) could be switched into the output path of a DVD-A or SACD player to "degrade" the playback to "CD quality".
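(In software terms, that bottleneck is just a downsample/requantize/upsample round trip. A rough numpy/scipy sketch of the signal path; the polyphase resampler and TPDF-dithered 16-bit truncation are my stand-ins for whatever the CD recorder's converters actually did:)

import numpy as np
from scipy.signal import resample_poly

def bottleneck_16_44(x):
    """Approximate the 16/44.1k loop applied to 96 kHz material."""
    y = resample_poly(x, up=147, down=320)        # 96k -> 44.1k (147/320)
    lsb = 2.0 / 2 ** 16                           # 16-bit step over [-1, 1)
    rng = np.random.default_rng(0)
    dither = (rng.random(y.shape) - rng.random(y.shape)) * lsb  # TPDF, +/-1 LSB
    y = np.round((y + dither) / lsb) * lsb        # requantize to 16 bits
    return resample_poly(y, up=320, down=147)     # back up to 96k

t = np.arange(96_000) / 96_000
hires = 0.5 * np.sin(2 * np.pi * 1_000 * t)       # stand-in "hi-res" tone
degraded = bottleneck_16_44(hires)                # the "CD quality" version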
It's unclear what oversampling of signals the author was referring to. The only reference to transparency I've found in the BAS DBT report is in the introductory paragraph, which cited much earlier blind tests showing that CD-A was "transparent" in comparison to source tapes. If the author was referring to the SACD and DVD-A content used for the testing, then he was possibly confusing oversampling with material recorded at higher sample rates. If he was referring to oversampling that might be happening in the CD recorder's converters, I found no mention of the particular CD recorder employed nor any specifications in the report itself, although the later-added "explanation" webpage indicates that an HHB pro model was used; again, no specifications were given, so possibly the author was assuming its converters employed oversampling (as they likely do).
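(What the article's oversampling remark presumably means, in back-of-envelope terms: raising the rate at the converter's input widens the anti-alias filter's transition band enormously, so the filter can be far gentler. A quick check, assuming a 20 kHz audio passband:)

passband_edge = 20_000.0   # assumed top of the audible band, Hz

for oversample in (1, 4, 8, 16):
    fs = 44_100 * oversample
    transition = fs / 2 - passband_edge   # room between passband and Nyquist
    print("%2dx (fs = %7d Hz): transition band %6.1f kHz wide"
          % (oversample, fs, transition / 1000))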
The BAS DBT report received quite a lot of attention when it first emerged and has since been criticized for a number of reasons, including allegations that the DVD-A and SACD discs used for the test had been produced from older source material never recorded or produced in actual hi-res formats, so the discs contained no true hi-res content, only content corresponding to 16/44.1k, making the test not a true comparison against hi-res source material.
Moreover, despite the controversy and doubts surrounding the validity of the BAS DBT, no follow-up or repeatability testing has ever been conducted, AFAIK, and it thus remains a single isolated and unverified instance; hardly support for "proven again and again" as the author alleges in the article.
OK, that's enough for now. Hopefully it's becoming clearer why I consider that the facetious scientist doesn't understand significant aspects of what he's writing about and, as a consequence, is spewing misinformation in pursuit of his subjective/objective theme.
If in doubt, and assuming you understand binary number precision, read the "32 Bits and Beyond" section of his article about bit depth here (and see if you don't think he should change the "You" in the title to read "I").
More to follow, maybe...