CakeAlexS
Yup, I don't recommend overclocking at all unless your PC is a dedicated Quake workstation; that's asking for trouble regardless of who does it (unless that person can mind-meld with chips like Spock did with silicon-based lifeforms).
Overclocking is an interesting topic.
A study by Microsoft Research (https://research.microsoft.com/pubs/144888/eurosys84-nightingale.pdf) found that overclocked machines are significantly more prone to hardware errors.
That matches common sense, but my own experience points a different way.
Every CPU has a tolerance, and the unlocked CPUs need to be tested to find their limits.
When I built my system I overclocked it to 4.8 GHz and ran various tests. The common tools are Prime95, Linpack, IntelBurnTest, and so on.
There were also stress tests by the motherboard manufacturer (ASUS) that were helpful.
Of course the cooling system is the most important thing here, so do your homework.
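For anyone curious what these stress tools are actually doing, here is a toy sketch in the same spirit: peg every core with floating-point work and check the results for consistency. This is only an illustration (the `burn` function and iteration count are made up for this example); real stability testing should use the dedicated tools named above, which exercise the CPU far harder.

```python
# Toy stress loop: every core runs identical floating-point work; on a
# stable system the results must match exactly, and a mismatch would
# suggest a computation error under load (the kind of fault an unstable
# overclock produces).
import math
from multiprocessing import Pool, cpu_count

def burn(iterations):
    # Repeated transcendental math to keep the FPU busy.
    acc = 0.0
    for i in range(1, iterations + 1):
        acc += math.sin(i) * math.cos(i)
    return acc

if __name__ == "__main__":
    with Pool(cpu_count()) as pool:
        results = pool.map(burn, [200_000] * cpu_count())
    # Identical inputs, so identical outputs are required.
    assert all(r == results[0] for r in results)
    print("all cores consistent:", results[0])
```

Real tools add targeted instruction mixes, memory pressure, and long runtimes; the principle (do known work, verify the answer) is the same.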
After several hours of testing, and periodic tests over time, I concluded that 4.8 GHz was a bit too high. I never saw errors, but the CPU temperature reached 80C. Supposedly, Sandy Bridge would throttle at 72C, but mine didn't seem to do that.
So from the start I chose 4.4 GHz as a compromise, and recently I backed it down to 4.3 GHz. (The BCLK was at 103%; I changed it to 100% and left the CPU clock unchanged.) This was to rule out a possible problem I was chasing. A CPU temp of 70C is my limit.
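For readers unfamiliar with BCLK: on Sandy Bridge the core clock is the base clock times the CPU multiplier, and the base clock also drives other subsystems (DMI, PCIe, SATA), so running BCLK at 103 overclocks more than just the CPU. Dropping back to 100 removes that variable while the multiplier keeps the core clock where you want it. A quick sketch of the arithmetic (the multiplier values here are illustrative, not my exact BIOS settings):

```python
# Sandy Bridge core clock = base clock (BCLK) x CPU multiplier.
# Multiplier values below are illustrative examples only.
def core_clock_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

print(core_clock_mhz(100, 43))  # 4300 MHz (4.3 GHz) at stock 100 MHz BCLK
print(core_clock_mhz(103, 42))  # 4326 MHz with a raised BCLK -- similar core
                                # clock, but everything else on BCLK runs 3% fast
```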
As I write this, I am stress testing with 8 instances of Reaktor running a very complex ensemble. The CPU temp is 57C, and GUI performance is fine with no delays, even as I compose this posting. I'm running this test overnight to check for a possible hardware problem.
Notice that all 8 cores are running evenly. In Windows 7 this was not so. In Windows 8 it was much better, more even, but still CPU 1 got hit the most.
In Windows 8.1 the thread and process distribution was very even, and everything seems to respond much smoother.
I am very impressed with 8.1 and wonder why so few others seem to have jumped on board.
There are a few things that I find stressful. If I run an Acronis image backup with maximum compression, temperatures climb to extremes when I back up terabytes.
But the backups work. I've restored several times due to the inevitable problems and difficulties of debugging software issues.
I have never, ever, so far, "knock on wood", seen any hardware problems or things related to overclocking. But I do expect that someday, before its time, my system will die, so I am always vigilant.
But why overclock an audio system? Most overclockers are gamers chasing extra performance.
For me it is Reaktor. I make a lot of Reaktor ensembles and use many long chains of processing elements. These take up enormous amounts of CPU.
Reaktor is not multiprocessor-aware, so I have to split the processing among DAW tracks so that the other cores help out.
A while back I used Live as my DAW; I had Live 4, 5, 6, 7, and 8. Eventually I needed 64-bit processing, as my samples outgrew the 32-bit address space. The 64-bit build of Live 8 took a long time to arrive, with many, many bugs along the way.
So I went back to Cakewalk, the tool I used back in the DOS and Windows 3.1 era.
It has a few MIDI channel assignment flaws that I overcome with Plogue Bidule.
Sonar X3c is very nice, but I think it still has a few thread flaws that cause problems.
We will see how this evolves.