Re-thinking MIDI is a good thing. MIDI was invented over 30 years ago for hardware synthesizers. There was no way the originators could have predicted USB MIDI, let alone Bluetooth MIDI or 2.67 billion devices incorporating MIDI by the end of this year. In fact, MIDI seemed like it was going to become solely a control medium after ADAT came out and digital audio became affordable. Then VST came along, which used MIDI, but the original virtual instruments were so unresponsive that they were more of a curiosity than anything else.
Then computer power increased exponentially, and people could no longer find vintage hardware synthesizers. Virtual instruments started doing emulations, and that was their cue to come back into vogue (rhymes with "Moog"). When instruments like Kontakt came along that beat the living crap out of systems, but the computers could actually cope, MIDI became super-relevant again.
While some DAWs have added MIDI features, like what Cubase has done with expression, all DAWs -- not just SONAR -- need to consider the implications of how computers, virtual instruments, and control affect workflow with MIDI. We're no longer just sending MIDI data via an output port to a piece of hardware.
If all this thread does is point fingers at problems, real or especially imagined, and opine that Cakewalk's engineers are dysfunctional idiots who probably can't tie their shoelaces without help, it won't contribute any more than previous threads with a similar theme. But if it makes an actual attempt to analyze workflow and come up with solutions on how best to handle MIDI in the 21st century, then it will definitely have value. Of course, as with all re-assessments of workflow, it's essential to take into account a multiplicity of applications, as well as the Law of Unintended Consequences.
Should MIDI clips be like audio clips, in that they stake out a claim on a certain space on the timeline, and silence is treated the same as audio silence? Is lack of data the MIDI equivalent of silence? Should clips shorten automatically if data is removed, or retain their space on the timeline? How do you handle parallel streams of data? Should MIDI clips incorporate clip automation like audio, automation controllers in lanes or tracks, or graphical editing in strips? Or all three? And if all three, how would you display them? The solutions to these questions may appear easy, but they're not...
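To make the clip-extent question concrete, here's a minimal sketch (in Python, with hypothetical names -- this isn't any DAW's actual data model) of a MIDI clip that distinguishes the space it claims on the timeline from the span its data actually covers. The `shrink` flag shows the two answers to "should clips shorten when data is removed": give up the footprint, or keep it like an audio clip would.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: times are in beats, names are made up.

@dataclass
class NoteEvent:
    start: float      # beat position within the clip
    duration: float
    pitch: int
    velocity: int

@dataclass
class MidiClip:
    timeline_start: float
    timeline_length: float               # the clip's claimed span on the timeline
    events: List[NoteEvent] = field(default_factory=list)

    def data_extent(self) -> float:
        """Span actually covered by events; 0.0 means 'MIDI silence' (no data)."""
        return max((e.start + e.duration for e in self.events), default=0.0)

    def remove_events_after(self, t: float, shrink: bool = False) -> None:
        """Delete events starting at or after t. With shrink=True the clip
        gives up its timeline footprint; with shrink=False it keeps the
        claimed space, the way an audio clip holds silence."""
        self.events = [e for e in self.events if e.start < t]
        if shrink:
            self.timeline_length = self.data_extent()

clip = MidiClip(timeline_start=0.0, timeline_length=8.0,
                events=[NoteEvent(0.0, 1.0, 60, 100),
                        NoteEvent(4.0, 1.0, 64, 100)])
clip.remove_events_after(2.0, shrink=False)
print(clip.timeline_length, clip.data_extent())   # 8.0 1.0 -- clip still claims 8 beats
```

Either policy is defensible; the point is that "clip length" and "data length" become two separate numbers the moment you treat lack of data as silence, and the UI has to pick which one the user grabs when they drag a clip edge.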