mister happy
Inter Sample Peaking only happens, by its very explicit definition, at one half the sampling frequency. In other words, an Inter Sample Peak in a 44.1kHz data stream can only occur as a blip lasting 1/22050 of a second.
No one can actually hear a 1/22050-second blip.
If the digital over lasts ANY longer than 1/22050 of a second in a 44.1kHz data stream it is, by its very definition, no longer an Inter Sample Peak; it is an "OVER".
Inter Sample Peaks may have been an audible problem on ancient 8kHz phone lines. With contemporary digital audio production, the periodic re-discovery of Inter Sample Peaking as an unrecognized problem seems most likely to broil the groins of *engineers* who don't like math.
I fear you may have misinterpreted something you've read. Two points in your description need correcting:
1. Intersample peaks don't exist in digital audio itself; they only appear in the reconstructed analog signal after the DAC
2. Overs and ISPs are not directly related to one another
But you're right about ISPs not necessarily being a problem - if the amplifier has enough headroom to handle them. Unfortunately, battery-operated players often don't have much headroom, and that's why we'd rather avoid ISPs.
You're also correct in saying that an ISP lasting a few hundredths of a millisecond is not going to be audible. However, you can't assume that an ISP exists for only a single cycle. In practice, a badly-mastered mix is likely to have thousands of ISPs per second, perhaps persisting throughout the entire song. That can definitely be audible on an iPod.
"Overs" are a different animal altogether. Technically, they don't exist either. Digital audio is incapable of exceeding all-ones, so there is nothing "over" that. 0dB is the absolute maximum. It's a completely legitimate value; we only use the word "over" when there are a
consecutive series of 0dB values. There is no universal standard, but 12 in a row is a number employed by some meters. As you say, a blip that short is unlikely to be audible. But I have reviewed bad mixes with hundreds or even thousands of consecutive 0dB samples, and that's a clearly audible mistake that can't be blamed solely on a faulty limiter.
Overs are a much bigger problem than ISPs, because unlike ISPs they result in inharmonic distortion, which has a much lower threshold of audibility than the harmonic distortion caused by ISPs. Why, then, the emphasis on "true peak" ISP detection? Mostly convenience, but there are two real reasons. First, whenever the "true" peak value exceeds 0dB, it may suggest that levels are generally too hot, and therefore that digital clipping may also have occurred. Second, it flags the possibility of analog clipping later in the chain.
Still, an engineer who relies entirely on true peak values while ignoring overs is just being negligent.
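To make the true-peak idea concrete, here is a rough pure-Python sketch of 4x-oversampled true-peak estimation using a windowed-sinc interpolator. Real meters (e.g. those following ITU-R BS.1770) use properly designed polyphase filters; the tap count and Hann window here are arbitrary illustration choices. The test signal is the classic worst case: a sine at fs/4, phased so every stored sample sits at about -3dBFS while the reconstructed waveform peaks at full scale.

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x) / (pi*x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sample_at(x, t, taps=32):
    """Reconstruct the signal value at fractional index t using a
    Hann-windowed sinc kernel (a crude stand-in for a real
    polyphase interpolation filter)."""
    n0 = math.floor(t)
    total = 0.0
    for k in range(n0 - taps, n0 + taps + 2):
        if 0 <= k < len(x):
            d = t - k
            window = 0.5 + 0.5 * math.cos(math.pi * d / (taps + 1))
            total += x[k] * sinc(d) * window
    return total

def true_peak(x, oversample=4, taps=32):
    """Maximum absolute value of the signal evaluated between samples."""
    peak = 0.0
    for n in range(len(x)):
        for i in range(oversample):
            peak = max(peak, abs(sample_at(x, n + i / oversample, taps)))
    return peak

# Sine at fs/4, offset 45 degrees: every sample lands at +/-0.707,
# but the underlying waveform peaks at 1.0 between the samples.
signal = [math.sin(math.pi / 2 * n + math.pi / 4) for n in range(400)]
sample_peak = max(abs(s) for s in signal)  # about 0.707 (roughly -3dBFS)
tp = true_peak(signal)                     # close to 1.0 (roughly 0dBFS)
```

The gap between `sample_peak` and `tp` is exactly what a sample-value meter misses and a true-peak meter reports.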
You're on the right track being suspicious of these limiter tests. They suggest that if a limiter doesn't set the maximum after-conversion peaks to precisely the requested level, it is deficient and perhaps unusable. But that's rarely relevant. If I set my limit to -1.0 dBTP and the limiter gives me -0.3 dBTP, that's not going to have any impact on the subjective quality of my master. To further complicate things, the limiter may have done a fine job, dutifully limiting my true peaks to <= 0dB, and my end user could still end up with ISPs after decoding from a lossy codec.
All that said, I'm still glad that the two limiters I use most are unassailably accurate.