I am confused as to why a higher sample rate does not provide higher resolution. If I take 12 samples per second, it will sound very grainy and choppy, like looking at a very slow fan. At what point does that graininess and choppiness go away?
So it would seem that if computers were powerful enough and the sample rate were extremely high, there would be higher resolution.
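Here is a quick way to poke at this idea in code (a minimal sketch, assuming Python with NumPy and SciPy; the 2 Hz test tone just stands in for any band-limited audio frequency). It samples the tone at a few rates and rebuilds it with band-limited (FFT) interpolation, then measures how far the rebuilt waveform is from the original:

```python
import numpy as np
from scipy.signal import resample

f_tone = 2.0                     # 2 Hz test tone over a 1-second window
n_fine = 1000                    # dense grid used as the "true" waveform
t_fine = np.arange(n_fine) / n_fine
reference = np.sin(2 * np.pi * f_tone * t_fine)

for fs in (3, 5, 12, 48):        # sample rates in samples per second
    t_s = np.arange(fs) / fs                 # sample instants over one second
    samples = np.sin(2 * np.pi * f_tone * t_s)
    rebuilt = resample(samples, n_fine)      # band-limited (FFT) reconstruction
    err = np.max(np.abs(rebuilt - reference))
    print(f"fs = {fs:2d} Hz -> max reconstruction error {err:.2e}")
```

Once the rate exceeds twice the tone's frequency (here, anything above 4 Hz), the error drops to floating-point noise, and raising the rate further adds nothing; only the 3 Hz case shows the aliased "slow fan" effect.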
Meanwhile, I like 88.2 or 96 kHz. It does not impact my computer much, and it gives me lower latency. But I have to admit, I have not tried 384 kHz or anything higher.
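The latency side of this is just buffer arithmetic: for a fixed buffer length in frames, doubling the sample rate halves the buffer's duration. A small sketch (the 256-frame buffer is an assumption; your interface may use a different size):

```python
buffer_frames = 256              # assumed audio interface buffer size, in frames
for rate in (44_100, 48_000, 88_200, 96_000, 192_000):
    latency_ms = 1000 * buffer_frames / rate   # one buffer's duration in ms
    print(f"{rate:>7} Hz -> {latency_ms:5.2f} ms per buffer")
```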
It is hard to believe that humankind will go on for a million more years and that sample rates will not go up.