Cactus Music
The results are found here; I'm not sure if they took all results and averaged them, as many people submitted for the same interfaces. Pretty unscientific test, eh? Just made me think we could do something like it here.
http://www.kailuamusicschool.com/tech/round-trip-latency-roundup/
This is the GS thread. Steveo42 posted the link in another thread, so I found myself reading most of it out of curiosity. The problem I have with it is that it shows pretty similar results for most interfaces, which blows the doors off the "buy an RME for low latency" quote we often read on forums. http://forum.cockos.com/showthread.php?t=174445&page=4
I think to make it more scientific and easier to read we would need a baseline of only a few settings.
Like there's no point comparing a digital loopback to analog.
And stick to one clock rate like 48 kHz, since we all understand the figures go down a little as the sample rate goes up.
I'd think if we just did everything at 48 kHz and only tested 32, 64, 128 and 256 sample buffers, that would give enough info.
And we could use Sonar's reported latency or this tool: http://www.oblique-audio.com/free/rtlutility
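For reference, the buffer-only share of round-trip latency is easy to work out from first principles: the signal passes through the buffer twice (once in, once out) at the sample rate. A minimal Python sketch using the 48 kHz rate and the buffer sizes proposed above (note this is only the floor; real RTL adds converter and driver latency on top, which is exactly what RTL Utility measures):

```python
SAMPLE_RATE = 48_000  # Hz, the single clock rate proposed above


def buffer_latency_ms(buffer_samples, sample_rate=SAMPLE_RATE):
    """Theoretical latency contributed by the buffer alone (in + out), in ms.

    Converter and driver 'safety' latency are NOT included, so measured
    RTL will always be higher than this.
    """
    return 2 * buffer_samples / sample_rate * 1000


for size in (32, 64, 128, 256):
    print(f"{size:>4} samples -> {buffer_latency_ms(size):.2f} ms (buffer only)")
```

This also shows why comparing results at mixed sample rates muddies the table: the same 64-sample buffer costs twice as many milliseconds at 48 kHz as at 96 kHz.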
Anybody interested, or should we just use ASIO4ALL and be done with it? :)
From what I have observed on the Internet before, that table looks accurate. RTL for a given interface/driver version/frequency/bit depth/buffer size combination is a constant; the particular computer should not influence it at all. So one single measurement (assuming it is done with a real loopback test) is scientific.
Sonar's reported latency is not scientific. It trusts the information provided by the interface, and some interfaces do not report the truth.
RTL Utility is a "hardware proof"; that result you can trust. (At least one "small guy" DAW has such a thing built in: easy to use and accurate by definition for any interface/mode/driver/etc.)
The problem is that all those RTL numbers say nothing about which buffer size is actually usable, especially not for a particular system. Here statistics are required: for which user/interface combinations does a particular buffer size cause no problems? Unfortunately that depends on every aspect of the setup, from hardware up to the VSTs in use, and that is the reason for the "buy an RME for low latency" quote. Most users have confirmed so far that IF a particular system CAN run with a 64-sample buffer, THEN an RME can be used at 64 samples. Reversed: if an RME cannot work at 64 samples, NO OTHER INTERFACE (with the same bus type) can work at 64 samples on that system/environment (I have not seen a single report breaking that statement). It can happen that some other interfaces can be used with the same settings as well, but they cannot go lower, so they are just "not worse". I must admit the Internet claims modern MOTU and ZOOM units are comparable in terms of stability and RTL, but that can only be proved by time (and given the number of reports about broken MOTU units, it can take quite some time).
I do not have "hi-end" interfaces, but I have observed that I cannot use my VS-20 with the same buffer size as my M-Audio on a system that is identical in both hardware and software. Especially at the low end, manufacturers not only return garbage latency figures but also allow settings they know cannot be used in practice.
Users see 1 ms in the advertisement, see 5 ms in the software (wonder why... but then read the small print in the advertisement...) and, seeing that others work with over 10 ms, think they are working with the best interface in the world. Not many then run an RTL test, but those who do set 64 samples, get back 16 ms and think "something is wrong here, but the Internet claims that is still not so bad for my purpose". When the DAW is constantly crashing at 64, they tend to blame the computer/software, set it to 128/256 and live with that.
It can happen that the same user could happily use an RME at a 64-sample buffer and 6 ms RTL. So people recommend trying one. Sounds logical to me.
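The arithmetic behind those numbers is worth spelling out. A hedged sketch, taking the 16 ms and 6 ms RTL figures from the posts above as illustrative examples rather than measurements: at 64 samples / 48 kHz the buffer itself only explains about 2.67 ms of the round trip, so whatever remains is converter and driver overhead the spec sheet never mentions.

```python
SAMPLE_RATE = 48_000  # Hz


def hidden_overhead_ms(measured_rtl_ms, buffer_samples, sample_rate=SAMPLE_RATE):
    """RTL not explained by the two buffer trips: converters + driver padding."""
    buffer_ms = 2 * buffer_samples / sample_rate * 1000
    return measured_rtl_ms - buffer_ms


# 64 samples at 48 kHz accounts for only ~2.67 ms of round trip...
print(hidden_overhead_ms(16.0, 64))  # budget interface example: ~13.3 ms hidden
print(hidden_overhead_ms(6.0, 64))   # RME example:              ~3.3 ms hidden
```

That gap is why two interfaces set to the "same" 64-sample buffer can behave so differently, and why only a real loopback measurement tells you anything.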