My main creative project (Kicksville) took the idea of using Sonar for backing tracks and expanded on it to a degree I haven't seen anywhere else. The basic process was no different from most, though: I made stems from the original projects, loaded 'em into a master project, and pressed play when it was time. We used a single large project instead of the playlist feature because we needed to keep timecode consistent for the video and lighting worlds.
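To make the timecode point concrete, here's a minimal sketch (not our actual setup, and the set list numbers are made up) of why one continuous timeline matters: in a single project, each song's SMPTE start time is just its cumulative offset, so video and lighting gear never see timecode jump back to zero between songs the way it would if each song were its own project in a playlist.

```python
def to_smpte(seconds: float, fps: int = 30) -> str:
    """Convert a position in seconds to an HH:MM:SS:FF timecode string (non-drop)."""
    frames = int(round(seconds * fps))
    f = frames % fps
    s = (frames // fps) % 60
    m = (frames // (fps * 60)) % 60
    h = frames // (fps * 3600)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# Hypothetical set list, song lengths in seconds.
song_lengths = [225.0, 190.0, 260.0]

# Each song starts where the previous one ended -- timecode never resets.
start = 0.0
for length in song_lengths:
    print(to_smpte(start))  # 00:00:00:00, 00:03:45:00, 00:06:55:00
    start += length
```

Anything chasing that timecode (video playback, lighting cues) can key off absolute positions and stay locked for the whole show.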
The big difference was how deeply we integrated Sonar into the production design. How we did that is a loooong story, but the short version is that Sonar controlled everything: playback, timecode and controller info to video & lighting, scene changes on remote VST hosts, and full dynamic automation of the two audio consoles (real-time control of faders, pan, routing, on/off, and send levels on every one of the 72-ish inputs). The tech was part of the performance, so we even ran a split of my main computer screen so the audience could see Sonar doing its thang.
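For the curious, here's a hedged sketch of what "automating a console from the DAW" can look like at the wire level. The post doesn't say which protocol our consoles actually spoke, so this assumes plain MIDI Control Change messages, and the helper names (`db_to_cc`, `fader_cc`) are mine, not anything from Sonar:

```python
def db_to_cc(db: float, floor_db: float = -60.0) -> int:
    """Map a fader level in dB (floor_db .. 0 dB) onto a 7-bit MIDI CC value."""
    db = max(floor_db, min(0.0, db))          # clamp to the usable fader range
    return round((db - floor_db) / (0.0 - floor_db) * 127)

def fader_cc(channel: int, cc_number: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI Control Change message (status 0xB0 | channel)."""
    if not 0 <= channel < 16:
        raise ValueError("MIDI channel must be 0-15")
    return bytes([0xB0 | channel, cc_number & 0x7F, value & 0x7F])

# e.g. snap one input's fader (CC 7 on MIDI channel 0) to unity gain:
msg = fader_cc(0, 7, db_to_cc(0.0))  # -> b'\xb0\x07\x7f'
```

In a DAW like Sonar you'd draw this as an automation envelope on a MIDI track and let the sequencer stream the CC data out a port during playback; the point is just that "fader moves" are ultimately a stream of tiny messages like the one above, one per input per parameter.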
If anyone wants more of the boring technical details, I'm happy to talk about the geekery involved, but ultimately the playback side of it was pretty simple. And most importantly, in ten years of doing these shows, we've never had a problem.
Here's an example. BTW, the audio on this video is the 2-track FOH feed from our on-stage console, recorded back into Sonar during the show. In other words, this is the exact audio that went to the PA, not something mixed in post.
https://youtu.be/QOWNVf2p00s