Anderton
drewfx1
The sample time does not remotely equal the time resolution. This is a common misunderstanding of how sampling works.
It depends on bit depth as well as sample rate, but the short answer is that 48kHz has a timing resolution FAR finer than 1/48,000th of a second.
Can you explain this? When I zoom in to the sample level, I see a straight line that lasts x microseconds. I understand that gets smoothed when it's reconstructed, but how can data shorter than one sample be encoded into a straight line? What Moorer is saying is that if you have two events 10 microseconds apart, those events cannot be encoded in something that cannot resolve fewer than 20 microseconds.
For a film analogy, if the frame rate is 30 frames per second and you have two different, sequential visual events occurring during the time that one frame occurs, how can those two events play back? I don't see how they could be encoded in a single frame as two separate events.
Rule #1: Never use analogies when trying to understand sampling - they're almost always wrong (in whole or in part) because sampling just isn't intuitive and it doesn't really work the same way other stuff does.
Rule #2: Never argue the analogy, as it inevitably just takes things OT.
Be careful when zooming in - many (most?) DAWs just show a picture that "connects the dots", which is extremely misleading as this has little to do with what a reconstructed signal looks like (or how a sampled signal is reconstructed). It gives you a reasonable picture at low frequencies, but a high frequency sine wave looks nothing like a sine wave - even though your DAC outputs a nice looking sine wave.
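To see how misleading the connect-the-dots picture is, here's a minimal Python sketch (the 18 kHz test frequency, 64-sample window, and midpoint location are arbitrary choices of mine, not from the post): it compares the straight line a "connect the dots" display implies against a truncated band-limited (sinc) reconstruction at a point halfway between two samples.

```python
import math

fs = 48000.0
f = 18000.0          # well below Nyquist, but "fast" relative to the sample grid
N = 64
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

def sinc(x):
    """Normalized sinc, the ideal reconstruction kernel."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t_samples):
    """Band-limited reconstruction at a fractional sample position,
    truncated to the samples we actually have."""
    return sum(s * sinc(t_samples - n) for n, s in enumerate(samples))

t = 31.5                                   # halfway between two samples
true_value = math.sin(2 * math.pi * f * t / fs)
linear = (samples[31] + samples[32]) / 2   # what "connect the dots" would draw

print(abs(linear - true_value))            # large: the straight line is wrong
print(abs(reconstruct(t) - true_value))    # small: sinc recovers the waveform
```

The straight-line picture misses the true waveform between samples by a wide margin at high frequencies, while the (truncated) sinc reconstruction lands very close to it.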
The short answer is that if you move your signal a fraction of a sample in time (at a reasonable bit depth), the sample values will change.
Consider a 12kHz sine wave sampled at 48kHz:
1. Since 12 kHz is exactly 1/4th the sample rate, you get exactly 4 samples per cycle.
2. This means that successive samples are exactly 90° apart.
3. Let's say we take our first sample s1 at 0°. That means s2 is at 90°, s3 is at 180°, s4 is at 270° and so on.
4. Now let's move our signal back .5 samples in time.
5. Now s1 is at 45°, s2 at 135°, s3 at 225°, s4 at 315°, and so on.
Hopefully it's obvious that the sample values are not going to be the same when we are sampling the sine wave at different phases in its cycle.
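The steps above can be sketched in a few lines of Python (the phase values and rounding to four decimals are just for display):

```python
import math

fs = 48000.0
f = 12000.0  # exactly fs/4, so successive samples are 90 degrees apart

def sample(phase_deg, n):
    """n-th sample of the sine, with the waveform shifted by phase_deg."""
    return math.sin(math.radians(phase_deg) + 2 * math.pi * f * n / fs)

original = [round(sample(0, n), 4) for n in range(4)]   # samples at 0°, 90°, 180°, 270°
shifted  = [round(sample(45, n), 4) for n in range(4)]  # half-sample shift: 45°, 135°, 225°, 315°

print(original)  # peaks and zero crossings
print(shifted)   # completely different values for the same waveform
```

Same sine wave, same sample rate; a half-sample shift in time produces an entirely different set of sample values, so the sub-sample timing is clearly encoded.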
So the question becomes, how far can you move the waveform on the x-axis (time) without having (almost) any of the y-axis (sample amplitude) values change?
Again, hopefully it's obvious that with higher resolution on the y-axis (increased bit depth), the amount you can move your signal in time without changing the sample values gets smaller.
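To put rough numbers on that, here's a Python sketch (the 997 Hz frequency, the 0.001-sample shift, and the bit depths are my own arbitrary choices; 997 Hz is used so samples don't land exactly on quantizer rounding boundaries): it counts how many quantized sample values change when the sine is shifted by one thousandth of a sample, at several bit depths.

```python
import math

fs = 48000
f = 997.0       # avoids samples landing exactly on rounding boundaries
shift = 0.001   # 1/1000 of a sample: about 21 nanoseconds at 48 kHz
N = 48          # roughly one cycle's worth of samples

def quantize(shift_samples, bits):
    """Sample the sine with a fractional-sample time offset, then round
    each sample to a signed integer at the given bit depth."""
    full_scale = 2 ** (bits - 1) - 1
    return [round(full_scale * math.sin(2 * math.pi * f * (n + shift_samples) / fs))
            for n in range(N)]

for bits in (8, 16, 24):
    changed = sum(a != b for a, b in zip(quantize(0.0, bits), quantize(shift, bits)))
    print(f"{bits}-bit: {changed} of {N} samples change after a 0.001-sample shift")
```

At higher bit depths, far more samples register the tiny shift; at low bit depths the same shift mostly disappears into the quantization, which is exactly the bit-depth dependence described above.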
An experiment to try:
1. Make a stereo waveform at 44.1/48kHz where every sample in L and R is absolutely identical (i.e. L=R).
2. Upsample by 2x (or higher).
3. Shift L by 1 sample in time at the higher rate.
4. Downsample back to the original SR.
5. Zoom all the way in and compare L and R.
You will find that not every sample in L and R is the same anymore - because you time-shifted L by a fraction of a sample.
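The experiment above can be sketched in Python using SciPy's polyphase resampler for the up/downsampling (SciPy is an assumption on my part, as are the 997 Hz test tone and signal length; resample_poly handles the band-limiting filters and compensates for their delay):

```python
import numpy as np
from scipy.signal import resample_poly

fs = 48000
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 997.0 * t)

left = x.copy()                       # step 1: L and R start exactly identical
right = x.copy()

up_l = resample_poly(left, 2, 1)      # step 2: upsample both channels to 96 kHz
up_r = resample_poly(right, 2, 1)

up_l = np.roll(up_l, 1)               # step 3: shift L by one 96 kHz sample,
                                      # i.e. HALF a sample at 48 kHz

down_l = resample_poly(up_l, 1, 2)    # step 4: back down to 48 kHz
down_r = resample_poly(up_r, 1, 2)

# step 5: compare, ignoring the edges where the roll/filters misbehave
diff = np.max(np.abs(down_l[100:-100] - down_r[100:-100]))
print(diff)  # clearly nonzero: the half-sample delay changed the sample values
```

The half-sample delay survives the round trip back to 48 kHz as a different set of sample values, which is the whole point: sub-sample timing is encoded in the ordinary samples.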