A while back I read an article about what the best sampling rate to use would be. It pointed out some fallacies in setting the sample rate too high.
I remember going through the tests and finding the results for myself.
I think they used some sort of signal generation and nulling at various sample rates, but I can't find the reference. Or maybe it was about measuring up-sampling and down-sampling errors.
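As best I can reconstruct it, the test was something like the following: generate a tone, run it through a resampling round trip, and null it against the original to measure the residual error. This is just my own rough sketch using SciPy's `resample_poly`, not the original author's code:

```python
import numpy as np
from scipy.signal import resample_poly

fs = 44100
t = np.arange(fs) / fs             # 1 second of samples
x = np.sin(2 * np.pi * 1000 * t)   # 1 kHz test tone

# Round trip: upsample 2x (to 88.2 kHz), then downsample back to 44.1 kHz
up = resample_poly(x, 2, 1)
y = resample_poly(up, 1, 2)

# Null test: subtract and measure the residual, trimming filter edge effects
trim = 1000
residual = x[trim:-trim] - y[trim:-trim]
null_db = 10 * np.log10(np.mean(residual**2) / np.mean(x[trim:-trim]**2))
print(f"null depth: {null_db:.1f} dB")
```

With a clean resampler the residual should be far below the signal; a poor resampler (or one run at a mismatched rate) shows up as a much shallower null.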
Does anyone know what I'm talking about and where to find it? I tried a bunch of Google search terms but had no luck.