OK, I got bored pretty early in this piece, but they seem to be saying that the digital audio streams delivered from different drives are different. That is possible, even plausible: a properly buffered system with a reliable clock should show no such difference in theory, but dropped bits can happen, as can real-time delays in data delivery and processing. The trouble is that they appear to have based that conclusion on listening to the sound output, and to be comparing audible performance against a CD player as the gold standard, even though that system introduces many more sources of error. That is like measuring the voltage of a battery by putting your tongue on the contacts. With a bit of expense and trouble, they could have actually captured the bits arriving at the D/A stage and compared them exactly in the digital/time domain. Of course, even if they could demonstrate that the actual bits delivered at the actual times were identical in both systems, I have no doubt their conclusions would not have changed.
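The comparison itself is trivial once the streams are captured. A minimal sketch, assuming two time-aligned PCM captures are already in hand (the capture method, e.g. logging the data feeding the D/A, is left out; the buffers here are synthetic):

```python
# Bit-exact comparison of two captured PCM buffers. In practice these would
# come from tapping the digital stream just before the D/A converter.
def first_mismatch(a: bytes, b: bytes):
    """Return the byte offset of the first difference, or None if identical."""
    if len(a) != len(b):
        return min(len(a), len(b))  # one capture is truncated
    for i in range(len(a)):
        if a[i] != b[i]:
            return i
    return None

# Synthetic demo: two identical captures, then one with a single flipped bit.
capture_a = bytes(range(256)) * 4
capture_b = bytearray(capture_a)
print(first_mismatch(capture_a, bytes(capture_b)))  # None: bit-identical
capture_b[300] ^= 0x01  # simulate a single-bit error in one stream
print(first_mismatch(capture_a, bytes(capture_b)))  # 300
```

If the two drives really deliver different bits, this finds the first discrepancy immediately; if it returns None over the whole capture, any audible difference has to come from somewhere other than the data.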