Can anyone school me? I'm going nuts trying to figure out where I'm going wrong with what I thought should be a simple test:
I want to test my audio interface's fidelity by taking a WAV ripped from a CD, sending it out a stereo pair on my interface, and recording it back in through a stereo input pair. The idea is that I'll repeat this 10x or 20x and see what kind of degradation and artifacts I notice when I A/B the first-generation and last-generation recordings.
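For the A/B step, beyond just listening, I was planning to quantify the degradation by nulling the first generation against the last one, roughly along these lines (a minimal Python sketch, assuming both generations are exported as time-aligned 16-bit WAV files; the filenames are just placeholders):

import wave
import numpy as np

def read_wav(path):
    # Assumes 16-bit PCM, as ripped from CD; samples come back interleaved.
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return np.frombuffer(frames, dtype=np.int16).astype(np.float64) / 32768.0

gen_first = read_wav("generation01.wav")   # placeholder filenames
gen_last = read_wav("generation20.wav")

# Trim to a common length and take the difference (the "null" residue).
n = min(len(gen_first), len(gen_last))
residue = gen_first[:n] - gen_last[:n]

def peak_dbfs(x):
    return 20 * np.log10(np.max(np.abs(x)) + 1e-12)

print("peak of null residue: %.2f dBFS" % peak_dbfs(residue))

If the loopback were transparent, that residue would sit way down near the noise floor; a level mismatch between generations shows up immediately as a big residue peak.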
Here is what I did:
1. Ripped a lossless WAV file from a CD to serve as my source audio.
2. Opened Sonar and dragged the .wav file into a track.
3. Configured that track's output to ports 3/4 on my interface.
4. Wired 3/4 out to 3/4 in.
5. Added another track and set it to record from 3/4 in.
6. Recorded a pass from track1 into track2 to create the second generation.
What I expected was that track2 would match track1, at least with respect to levels. All faders and gain are at unity. I use a Focusrite Liquid56, and as far as I can tell its gain is at unity as well.
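To be concrete about "matching", I was going to export both tracks and compare their measured levels, something like this (same caveats as above: a rough sketch with placeholder filenames, assuming 16-bit WAV exports):

import wave
import numpy as np

def load(path):
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
    return np.frombuffer(raw, dtype=np.int16).astype(np.float64) / 32768.0

def rms_dbfs(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)) + 1e-12)

src = load("track1.wav")   # the source track
rec = load("track2.wav")   # the re-recorded track
print("track1 RMS: %.2f dBFS" % rms_dbfs(src))
print("track2 RMS: %.2f dBFS" % rms_dbfs(rec))
print("delta:      %+.2f dB" % (rms_dbfs(rec) - rms_dbfs(src)))

That delta is what I expected to come out at essentially 0 dB.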
What I actually see is that the recorded track is hotter than the source track by a seemingly variable amount. I've tried changing the volume on my source track1 to -1.0 dB and -0.5 dB, and while this seems to give me a similar level on track2, when I play back track2 to repeat the process to track3 for a third generation, the volume is mismatched again.
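What worries me is how this compounds: each pass adds its gain error in dB, so even a small per-generation offset stacks up over 10 or 20 passes. A quick back-of-the-envelope check (the 0.5 dB figure is just a hypothetical per-pass error):

per_pass_error_db = 0.5   # hypothetical per-generation gain error
for passes in (1, 5, 10, 20):
    print("after %2d passes: %+.1f dB" % (passes, passes * per_pass_error_db))

So after 20 generations, even half a dB of error per pass would leave the last recording 10 dB hotter than the source, which would swamp any subtle artifacts I'm actually trying to hear.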
I can't help but think I'm doing something wrong. I haven't been able to record several generations with the volume matched, and I don't understand why.
Have I overlooked something simple?