The Band19
As the teraflops increase, this becomes less of an issue. ... And therefore, IMO, "optimization" is a footnote. It's like driving down the autobahn in the right lane with your blinker stuck on. People are going to come flying past you in the left lane @ 200+ mph...
Great point. Today's machines are fast beyond anything we could have imagined back in the 90s, when PCs were just becoming commonplace. So a lot of software runs acceptably fast without much effort spent on optimization.
Also, there's another factor. Any piece of software, with rare exceptions, is going to spend an inordinate amount of time in a small percentage of the code. 5% of the code accounting for 80% or more of the cycles consumed is far from uncommon. So in many cases the code base needs to settle down before it's even worth optimizing. In the case of FF, version one of Pro Q worked just fine speed-wise. Since much of the code was already solid when they enhanced it to version two, they had something that was not a moving target: code that they could reliably analyze to see where the bulk of the cycles were being consumed.
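To illustrate the "find the hot 5% first" workflow described above, here's a minimal sketch using Python's built-in cProfile. The function names (`hot_loop`, `cold_path`, `run`) are hypothetical stand-ins, not anything from Pro Q's code; `hot_loop` plays the role of the small fraction of code eating most of the cycles.

```python
import cProfile
import io
import pstats

def hot_loop():
    # Deliberately heavy work: this stands in for the ~5% of code
    # that consumes most of the cycles.
    total = 0
    for i in range(200_000):
        total += i * i
    return total

def cold_path():
    # Cheap setup code; optimizing this first would be wasted effort.
    return sum(range(100))

def run():
    cold_path()
    return hot_loop()

# Profile one run and print the top entries by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
run()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)  # hot_loop should dominate the listing
```

The point is that the profile, not intuition, tells you where the cycles go, and a profile is only trustworthy once the code has stopped being a moving target.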