An interface simulator is extremely hard to write. Even the digital part of a computer is not easy to simulate (check QEMU, for example, to get an idea of how that goes... you can imagine my frustration when, after simulating one ARM-based device, verifying that the simulator works perfectly with the original Linux kernel/distribution, and writing my own Linux disk driver for a newer kernel, I found that the real hardware refused to cooperate with my driver).
But there are two tests which, if implemented, would almost solve the problem:
1) A tester that investigates how much jitter is acceptable for an interface, with a particular driver in a particular mode. A kind of audio-loop-based latency utility (these already exist) that fills the WDM/ASIO/etc. buffers with jitter while putting load on the system at the same time (disk + RAM + video). People who already have the hardware could then run it and publish the results.
2) A virtual audio interface that measures the jitter coming from the client (the DAW).
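A minimal sketch of the measurement side of (2), assuming the virtual device simply logs the wall-clock time at which it receives each buffer from the DAW (the function name and all numbers are hypothetical, just to show the idea):

```python
# Hypothetical sketch: estimate client-side jitter from the timestamps at
# which a virtual audio interface received each buffer from the DAW.
# Jitter here = worst deviation of the inter-buffer interval from the
# nominal buffer period (e.g. ~1.45 ms for 64 samples at 44.1 kHz).

def max_jitter_ms(timestamps_ms, nominal_period_ms):
    """Worst-case deviation of inter-buffer intervals from the nominal period."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return max(abs(d - nominal_period_ms) for d in deltas)

# Example: buffers expected every 1.45 ms, but one arrives ~1 ms late.
arrivals = [0.0, 1.45, 2.90, 5.35, 5.80]
print(max_jitter_ms(arrivals, 1.45))  # close to 1.0 ms
```

A real implementation would of course hook the driver's buffer-switch callback instead of a list, but the arithmetic is the same.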
So, taking the results from real-life latency tests (these exist on the Internet for almost all interfaces), the results from (1) published the same way, and the result from (2) measured on your system with your projects, it should be easy to predict what you will achieve with different devices.
Hypothetical example: let's say an interface is known to produce a 5 ms loop time at 16-bit/64-sample buffer size, but it tolerates less than 1 ms of jitter with such settings; 10 ms at 16-bit/128 with 2 ms tolerance, and so on. You measure that Sonar with your project produces up to 1.5 ms of jitter. Then you know you will be unable to run at 5 ms with this interface, but you will be able to run at 10 ms.
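Once both data sets exist, the prediction step itself is trivial. A sketch using the hypothetical numbers from the example above (the profile table and all figures are made up for illustration):

```python
# Hypothetical interface profile from test (1), published by owners:
# (buffer size in samples, round-trip latency in ms, max tolerated jitter in ms)
interface_profile = [
    (64, 5.0, 1.0),
    (128, 10.0, 2.0),
    (256, 20.0, 4.0),
]

def smallest_usable_latency(profile, measured_jitter_ms):
    """Lowest-latency setting whose jitter tolerance covers the DAW's jitter.

    measured_jitter_ms comes from test (2), run on your own system
    with your own projects. Returns None if no setting tolerates it.
    """
    for buffer_size, latency_ms, tolerance_ms in sorted(profile, key=lambda p: p[1]):
        if measured_jitter_ms <= tolerance_ms:
            return buffer_size, latency_ms
    return None

# Sonar measured at up to 1.5 ms jitter: the 64-sample setting (1 ms
# tolerance) is out, but the 128-sample setting (2 ms tolerance) works.
print(smallest_usable_latency(interface_profile, 1.5))  # (128, 10.0)
```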
Cactus Music
Or buy a Mac :)
Or a PC built to run a DAW and tested with a particular interface (there are many reports in this forum that this works).
In the end, a PC for $1000 + an RME for $1000 will work no worse than a Mac for $1800 + "an interface" for $200.