Some CCs (and the RPNs) have a proposed meaning (see the tables at
www.midi.org). Some equipment/synths follow that proposal; some set a default mapping to the proposed or to a fixed set of (N)RPNs. Many allow "MIDI Learn" for parameters, in which case (N)RPNs may be unsupported (as ambiguous).
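To make "(N)RPN" concrete: an NRPN is not a single MIDI message but a sequence of ordinary Control Change messages (CC 99/98 select the parameter, CC 6/38 carry the data), as defined by the MIDI 1.0 specification. A minimal sketch:

```python
# Build the raw CC sequence that makes up one complete NRPN.
# Controller numbers 99/98 (NRPN MSB/LSB) and 6/38 (Data Entry MSB/LSB)
# come from the MIDI 1.0 specification.

def nrpn_messages(channel, param, value):
    """Return one NRPN as a list of (status, controller, value) CC triples."""
    status = 0xB0 | (channel & 0x0F)          # Control Change on this channel
    return [
        (status, 99, (param >> 7) & 0x7F),    # NRPN parameter number MSB
        (status, 98, param & 0x7F),           # NRPN parameter number LSB
        (status, 6,  (value >> 7) & 0x7F),    # Data Entry MSB
        (status, 38, value & 0x7F),           # Data Entry LSB
    ]

print(nrpn_messages(0, 130, 8192))
# [(176, 99, 1), (176, 98, 2), (176, 6, 64), (176, 38, 0)]
```

This is also why an NRPN is "ambiguous" for MIDI Learn: the receiver sees four separate CCs and has to guess whether they belong together.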
So a particular parameter in a particular plug-in can be controlled:
1) by automation. That is a special, not MIDI-related (at least not directly) way to control VST parameters. No MIDI track pointing to that plug-in (in case it is an Audio FX) is required for it.
2) by MIDI. A MIDI track is required, and there are 2 variations:
2.a) the MIDI signal (CC, RPN, NRPN, etc.) by which the parameter is controlled is predefined;
2.b) the MIDI signal can be learned within the plug-in. The subset of MIDI messages a plug-in understands is plug-in specific, so some plug-ins can understand NRPNs and others cannot.
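Variation 2.b) can be sketched in a few lines. This is an illustrative model of plug-in-side "MIDI Learn", not any particular plug-in's API: while in learn mode, the parameter binds to whatever CC the user wiggles, and afterwards only that CC moves it. All names here are hypothetical.

```python
# Illustrative sketch of plug-in-side "MIDI Learn" for one parameter.

class LearnableParam:
    def __init__(self, name):
        self.name = name
        self.cc = None            # bound CC number; None = not yet learned
        self.value = 0.0

    def learn(self, cc_number):
        # In a real plug-in this is triggered by the user enabling
        # "learn" mode and moving a knob on the controller.
        self.cc = cc_number

    def on_midi_cc(self, cc_number, cc_value):
        # Messages for other controllers are simply ignored.
        if self.cc == cc_number:
            self.value = cc_value / 127.0   # scale 0..127 to 0.0..1.0

attack = LearnableParam("attack")
attack.learn(74)                  # user wiggles CC 74 in learn mode
attack.on_midi_cc(74, 127)
print(attack.value)               # 1.0
```

A plug-in built this way naturally understands only the message kinds its learn routine listens for, which is exactly why NRPN support varies between plug-ins.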
From the user perspective, (1) and (2) can be controlled "with the mouse" or with a MIDI controller. But the routes from the MIDI controller to the parameter are different:
1) for automation, Sonar has 3 approaches:
1.I) "Remote control..." (e.g. in the Synth Rack)
1.II) Control Surface "ACT Dynamic mapping" (so-called "ACT Learn")
1.III) Control Surface direct automation parameter steering (e.g. used by MackieControl)
2) for the MIDI way, the host is involved only as a MIDI provider; it has no influence on what the delivered MIDI message will do in the plug-in (play a note, switch it off, control attack, etc.)
All methods have time resolution sufficient for any purpose. But it is up to the plug-in to support fast parameter changes. Some do not expect more than one change of the same parameter inside one audio buffer frame and can produce very strange results. But one buffer is usually no longer than 3-5 ms, so that is rarely a limitation.
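The 3-5 ms figure is easy to check: buffer duration is just buffer size divided by sample rate, shown here for typical values.

```python
# Duration of one audio buffer at common buffer sizes and sample rates.

def buffer_ms(frames, sample_rate):
    return 1000.0 * frames / sample_rate

print(round(buffer_ms(128, 44100), 2))   # 2.9  ms
print(round(buffer_ms(256, 48000), 2))   # 5.33 ms
```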
Note that transferring one NRPN through a standard MIDI cable takes ~4 ms. If that "worked fine", the other methods should be fine too.
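The ~4 ms figure follows directly from the MIDI 1.0 wire format: 31250 baud, 10 bits per byte on the wire (start + 8 data + stop), and a full NRPN consisting of four 3-byte CC messages.

```python
# Back-of-the-envelope check of the "~4 ms per NRPN over a MIDI cable" figure.

BAUD = 31250                     # classic MIDI baud rate
BITS_PER_BYTE = 10               # 1 start + 8 data + 1 stop bit
nrpn_bytes = 4 * 3               # four Control Change messages, 3 bytes each

ms = 1000.0 * nrpn_bytes * BITS_PER_BYTE / BAUD
print(round(ms, 2))              # 3.84
```

(With running status the sender may omit repeated status bytes, so the real figure can be slightly lower; ~4 ms is the worst case.)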