2017/09/14 06:42:34
Sanderxpander
If you mean something like the Quadro cards then those are more for AutoCAD rendering and such. They're even more over specced and pricier than the gaming cards, not to mention largely obsoleted by modern fast "regular" CPU/GPU combos. If you mean something else I'm intrigued!
2017/09/14 09:03:36
Kev999
Sanderxpander
If you mean something like the Quadro cards then those are more for AutoCAD rendering and such. They're even more over specced and pricier than the gaming cards, not to mention largely obsoleted by modern fast "regular" CPU/GPU combos. If you mean something else I'm intrigued!

 
No, I mean Matrox. They make cards for 2D applications rather than 3D.
2017/09/14 13:08:14
Sanderxpander
Cool, thanks!
2017/09/29 08:06:38
HMusikk
Just a little update from my side. I have ended up with this spec:
GIGABYTE GA-X99P-SLI Intel X99 ATX
Intel Core i7-6850K Processor
Noctua NH-U9S CPU Cooler
Corsair Vengeance DDR4 3200MHz 16GB
Samsung 850 EVO 500GB 2.5" SSD
Samsung 850 EVO 1TB 2.5" SSD
WD Desktop Black 1TB 3.5"
MSI GeForce GT 730 2GB
Corsair RM750x 750W PSU
Samsung 34" LED Curved C34F791
Microsoft Windows 10 Pro 64-bit
 
(Hopefully it will be a good DAW for at least as many years (9) as the DAW I have now.)
2017/09/29 11:44:45
pb7r47sz
Gunnar,
I see that you have changed your mind about a 4 core CPU and gone to a 6 Core.  Cakewalk takes full advantage of all the cores.  However, I do see a few conflicts with this new spec out. 
The Noctua U9S cooler should handle this chip at stock speeds, but overclocking would not be recommended.  A more powerful cooler is needed for overclocking.  Which brings me to the 3200 MHz memory.  Although your MB will handle the "extreme memory profile" it will do so by increasing the core voltage and the CPU will run at a higher frequency and run hotter.  A dog chasing its tale affect is created when increasing the memory frequency.  The stock frequency of 2133 is ample for DAW.  If you want to use this memory frequency, you need to rethink your cooling solution.
The next conflict I see is between the monitor and graphics card.  The Samsung C34F791 looks like an impressive monitor that should serve you well.  However, the GT 730 graphics card will not light up all those pixels.  The graphics card resolution should be equal or better than the monitor resolution.  A minimum of an Nvidia GTX 1070 would be needed.  You can check out graphics card specs and recommended resolutions at game-debate.com in the Hardware tab.  As a side note, the 750 watt power supply selected should handle the 1070 or even a 1080 graphics card.
  
2017/09/30 07:59:18
HMusikk
Forest,
Thank you very much for your feedback. I will definitely change the memory to 2133!
About the GT 730 graphics (I have to ask because I don’t know anything about this stuff 😊). The resolution in the specification says: 4096 x 2160 at 60 Hz. I thought this was sufficient for the monitor with a resolution at 3440x1440. But there is obviously something I'm missing here?
2017/09/30 13:18:25
pb7r47sz
Gunnar,
I was a little melodramatic when I said the video card would not light all the pixels; I'm just not sure what day it would finish lighting them.  I have read in many forums how disappointed people were when they found out their old video card would not handle 4K even though the specs said it would.  Here is what I do know.  If you are familiar with the movies, 24 frames are played each second to fool the human eye into thinking things are moving fluidly.  Gaming has set 30 frames per second (fps) as a minimum.  For your monitor, the video card has to build a 3440 x 1440 set of pixels at least 24 times a second to fool your eye.  The GT 730 series cards' recommended resolution for playing video games is 1366 x 768 to get above 30 fps.  This card is roughly equivalent to Intel's integrated GPU, which has been benchmarked at 23 fps at 1080p.  So your monitor images would be jerky with blotchy pixels as the video card tries to build each image.  The other thing to watch out for is the refresh frequency that the video card and monitor will support at a specific resolution.  You will only find this information in the manual.  Next, to carry 4K between a video card and monitor, both the card's output and the cable need to support either HDMI 2.0 or DP 1.2.  This is why I go to game-debate.com, since they have done all the legwork.  Lastly, a video card that is not straining makes less fan noise for your microphones to pick up.
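To put rough numbers on the fill-rate concern above, here is a quick back-of-the-envelope sketch. The resolutions and frame rates are the ones quoted in this thread; "pixels per second" here is just raw pixel count, not a real GPU benchmark:

```python
# Raw pixel-throughput comparison for the resolutions discussed above.
# This is only pixel arithmetic, not an actual GPU benchmark.

def pixels_per_second(width, height, fps):
    """Pixels a GPU must produce each second at a given resolution and frame rate."""
    return width * height * fps

# GT 730's recommended gaming resolution at the 30 fps minimum
baseline = pixels_per_second(1366, 768, 30)

# The Samsung C34F791 ultrawide at the 24 fps "movie" rate
ultrawide = pixels_per_second(3440, 1440, 24)

print(f"1366x768 @ 30 fps:  {baseline:,} px/s")
print(f"3440x1440 @ 24 fps: {ultrawide:,} px/s")
print(f"ratio: {ultrawide / baseline:.1f}x")
```

Even at movie frame rates, the ultrawide asks for nearly four times the pixel throughput of the card's recommended gaming resolution, which is the gist of the concern for 3D workloads.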
2017/10/02 17:15:42
abacab
The argument about frame rates only applies to 3D rendering, not the 2D GUI used in a DAW.  If the card can support the resolution of the monitor, it should be good to go.
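In the same spirit, comparing raw pixel counts shows how much headroom the card's maximum supported output leaves for this monitor. The figures below are the ones quoted earlier in the thread (GT 730 spec-sheet maximum of 4096 x 2160, Samsung C34F791 at 3440 x 1440):

```python
# Monitor resolution vs. the GT 730's maximum supported output,
# using the numbers quoted earlier in the thread.

monitor = 3440 * 1440    # Samsung C34F791 ultrawide
card_max = 4096 * 2160   # GT 730 spec-sheet maximum (4K @ 60 Hz)

print(f"monitor:  {monitor:,} pixels")
print(f"card max: {card_max:,} pixels")
print(f"monitor uses {monitor / card_max:.0%} of the card's maximum output")
```

The monitor needs only a little over half the pixels the card can drive, so for a 2D desktop the resolution itself is not the bottleneck.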
2017/10/02 18:19:01
pb7r47sz
Thanks for clearing up my confusion, abacab.  So Gunnar, the MSI video card model N730K-2GD5LP/OC will be maxed out at 4K, while the monitor you spec'd out is more like 2.4K.  So you have your answer: it is adequate for a DAW.  Another potential forum wives' tale killed before it could spread.  Thanks again, abacab.  Good luck with your build, Gunnar.
2017/10/02 19:16:24
abacab
I see that the i7-6850K does not support processor-based graphics, so a GPU would be a must in that case.  But just for the record, I believe that Intel has had 4K support for integrated graphics since the 6th generation (Skylake).
 
I am running two monitors with my 3rd-gen Intel HD 2500 Ivy Bridge (full HD 1920x1080 support), which works great.  I tested a GT 7xx series GPU and found no advantage to it.  I suppose if you needed higher resolution, or extra monitor ports that your motherboard didn't support, that would be a good reason for a discrete GPU in a DAW (or if your CPU doesn't support integrated graphics at all).