Don, I think everyone is overthinking this issue.
Raising the sample rate also raises the cutoff frequency for the anti-aliasing filter, that is true.
However, it is also moot, because the final anti-aliasing filter in a modern A/D converter is a digital linear-phase filter, which can be made extremely steep without smearing the signal in time. Shifting the anti-aliasing filter's knee upward isn't necessary, as it would be if we were talking about a conventional reactive (analog) filter.
The reason is that modern converters are oversampled, meaning we're actually sampling at much higher frequencies than 44.1, 48, or even 192 kHz. It's more like 5.6 MHz! That relaxes the analog filter requirements such that its knee is WAY beyond audible frequencies. It's only AFTER that step that we decimate down to 44.1/48/88.2/96/192: at that point we have a DIGITAL signal that can be DIGITALLY band-limited to 20 kHz without artifacts, and then the extraneous samples are simply tossed out.
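If you want to see the oversample-then-decimate idea in miniature, here's a rough Python/NumPy sketch. The rates, tap count, and test tones are made up for illustration, not taken from any real converter's design: band-limit digitally with a linear-phase FIR, then throw away 7 of every 8 samples.

```python
import numpy as np

# Toy oversample-then-decimate chain (illustrative numbers only):
# "oversample" at 352.8 kHz, decimate by 8 down to 44.1 kHz.
fs_high = 352_800
decim = 8
fs_low = fs_high // decim            # 44100

t = np.arange(fs_high) / fs_high     # one second at the high rate
in_band = np.sin(2 * np.pi * 1_000 * t)    # 1 kHz tone: should survive
out_band = np.sin(2 * np.pi * 60_000 * t)  # 60 kHz tone: must be removed
x = in_band + out_band

# Linear-phase FIR low-pass (windowed sinc), knee around 20 kHz.  Because
# the impulse response is symmetric, every frequency is delayed equally --
# steep, but no phase smearing.
ntaps = 511
n = np.arange(ntaps) - (ntaps - 1) / 2
fc = 20_000 / fs_high                # cutoff, normalized to the high rate
h = 2 * fc * np.sinc(2 * fc * n) * np.hamming(ntaps)

y = np.convolve(x, h, mode="same")   # digitally band-limit first...
y_low = y[::decim]                   # ...then toss 7 of every 8 samples

# Inspect the result at the low rate: the 1 kHz tone is intact and the
# 60 kHz tone is gone.  (Had we skipped the filter, 60 kHz would have
# aliased down to an audible 15.9 kHz.)
spec = np.abs(np.fft.rfft(y_low)) / len(y_low)
freqs = np.fft.rfftfreq(len(y_low), d=1 / fs_low)
```

Skipping the `np.convolve` line and decimating the raw signal is a good way to convince yourself why the digital band-limiting step has to come before the sample-dropping step.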
The bottom line is that while you still want wide bandwidth in all your analog components, there is virtually no audible benefit to digitally recording at sample rates greater than 44.1 kHz.
If you can hear a difference, it's due to inadequacies of your hardware. There should be no difference. To insist that there is a difference is to argue against Nyquist himself, whose sampling theorem unequivocally states that any band-limited signal -- and everything we can hear fits within a 20 kHz band -- can be accurately reproduced -- not approximated, but accurately reproduced -- as long as the sample rate is greater than double the highest frequency you need to record.
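The theorem isn't just abstract, either: you can watch ideal reconstruction recover the waveform BETWEEN the samples. Here's a quick sketch (with made-up test values) using the Whittaker-Shannon interpolation formula, which is the sinc-sum reconstruction the theorem guarantees: sample a 1 kHz tone at 44.1 kHz, then evaluate the signal halfway between stored sample points.

```python
import numpy as np

fs = 44_100                  # sample rate, comfortably > 2 x 1 kHz
f = 1_000                    # band-limited test tone
N = 2_000                    # finite window of the (ideally infinite) sum
n = np.arange(N)
samples = np.sin(2 * np.pi * f * n / fs)

# Whittaker-Shannon reconstruction: x(t) = sum_n x[n] * sinc(t - n),
# with t measured in sample units.  Evaluate halfway BETWEEN stored
# samples, well away from the edges of our finite window.
t_mid = np.arange(900, 1100) + 0.5
recon = np.array([np.sum(samples * np.sinc(ti - n)) for ti in t_mid])
truth = np.sin(2 * np.pi * f * t_mid / fs)

max_err = np.max(np.abs(recon - truth))  # tiny: truncation error only
```

The small residual error here comes entirely from truncating the infinite sum to 2000 samples, not from the sampling itself; with the full sum, reconstruction of a band-limited signal is exact.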