Choosing a Gaming CPU October 2013: i7-4960X, i5-4670K, Nehalem and Intel Update
by Ian Cutress on October 3, 2013 10:05 AM EST

Sleeping Dogs
Sleeping Dogs is a strenuous game with a pretty hardcore benchmark that scales well with additional GPU power when SSAO is enabled. The team at Adrenaline.com.br is supreme for making an easy to use benchmark GUI, allowing a numpty like me to charge ahead with a set of four 1440p runs with maximum graphical settings.
One 7970
With one AMD GPU, Sleeping Dogs is similar across the board.
Two 7970s
On dual AMD GPUs, there seems to be a small kink for those systems running x16+x4 lane allocations, although this is a minor difference.
Three 7970s
Between an i7-920 and an i5-4430 we get a 7 FPS difference, almost 10%, showing the improvement across CPU generations. In fact, at this level anything above the i7-920 gives 70+ FPS, but the hex-core Ivy Bridge-E takes the top spot at ~81 FPS.
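For reference, the "almost 10%" figure is just the relative difference between the two results. A quick sketch (the 72 and 79 FPS values are assumptions consistent with the 7 FPS gap, not exact chart readings):

```python
def pct_gain(base_fps, new_fps):
    """Relative FPS gain of new_fps over base_fps, in percent."""
    return (new_fps - base_fps) / base_fps * 100.0

# Assumed tri-GPU numbers: i7-920 ~72 FPS, i5-4430 ~79 FPS (7 FPS apart).
i7_920, i5_4430 = 72.0, 79.0
print(f"{pct_gain(i7_920, i5_4430):.1f}% gain")  # prints "9.7% gain"
```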
One 580
Just 0.4 FPS separates the Core 2 Duo and Haswell. For one NVIDIA GPU, the CPU does not seem to matter(!)
Two 580s
The same holds with dual NVIDIA GPUs, where less than ~3% separates the top and bottom results.
Sleeping Dogs Conclusion
While the NVIDIA results did not change much between different CPUs, the AMD results suggest that any modern processor hits the high notes when it comes to multi-GPU Sleeping Dogs.
tim851 - Thursday, October 3, 2013 - link
You know, once you go quad-GPU, you're spending so much money already that not going with Ivy Bridge-E seems stupid. In the same vein, I'd argue that a person buying two high-end graphics cards should just pay 100 bucks more to get the 4770K and some peace of mind.
Death666Angel - Thursday, October 3, 2013 - link
I'd gladly take an IVB-E, even hex core, but that damned X79 makes me throw up when I just think about spending that much on a platform. :/

von Krupp - Thursday, October 3, 2013 - link
It's not that bad. I picked up an X79 ASRock Extreme6 for $220, which is around what you'll pay for the good Z68/Z77 boards, and I still got all of the X79 features.

cpupro - Sunday, October 6, 2013 - link
"I'd gladly take a IVB-E, even hex core, but that damned X79 makes me throw up when I justthink about spending that much on a platform. :/"
And be screwed.
"von Krupp - Thursday, October 03, 2013 - link
It's not that bad. I picked up an X79 ASRock Extreme6 for $220, which is around what you'll pay
for the good Z68/Z77 boards and I still got all of the X79 features."
Tell that to owners of the original, not-so-cheap Intel motherboard, the DX79SI. They need to buy a new motherboard for an IVB-E CPU; there was no UEFI update like other manufacturers provided.
HisDivineOrder - Thursday, October 3, 2013 - link
Not if they actually bought one when it was more expensive, then waited until these long cycles allowed you to go and buy a second one on the cheap (i.e., a 670 when they were $400, then another when they were $250).

althaz - Thursday, October 3, 2013 - link
Except that you might need the two or four graphics cards to get good enough performance, whereas there's often no real performance benefit to more than four cores (for gaming). Take Starcraft 2, a game which can bring any CPU to its knees: the game is run on one core, with AI and some other stuff offloaded to a second core. This is a fairly common way for games to work, as it's easier to make them this way.
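The two-thread pattern the comment describes can be sketched with a minimal example: a main loop thread that hands AI work to a single worker thread over a queue. All names here are hypothetical stand-ins, not anything from an actual game engine:

```python
import queue
import threading

ai_tasks = queue.Queue()    # main thread -> AI thread
ai_results = queue.Queue()  # AI thread -> main thread

def ai_worker():
    """Runs on the second core: drains AI tasks until a None sentinel arrives."""
    while True:
        task = ai_tasks.get()
        if task is None:  # sentinel: shut down
            break
        unit, target = task
        ai_results.put((unit, f"path to {target}"))  # stand-in for real pathfinding
        ai_tasks.task_done()

worker = threading.Thread(target=ai_worker, daemon=True)
worker.start()

for tick in range(3):  # stand-in for the main game loop on core one
    ai_tasks.put((f"unit{tick}", (tick, tick)))
    # ... simulation, input handling, and rendering would run here ...

ai_tasks.join()     # wait until the AI thread has processed everything
ai_tasks.put(None)  # tell the worker to exit
worker.join()
print(ai_results.qsize())  # prints 3: all AI results computed off the main thread
```

With only one busy gameplay thread and one helper, extra cores beyond the second sit mostly idle, which is the commenter's point about diminishing returns past four cores.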
Jon Tseng - Thursday, October 3, 2013 - link
<sigh> it was so much easier back in the day when you could just overclock a Q6600 and job done. :-p

JlHADJOE - Thursday, October 3, 2013 - link
You can still do the same thing today with the 3930K/4930K. Back in the day the Q6600 was basically the 2nd-tier HEDT SKU, much like the 4930K is today, perhaps even higher considering the $851 launch price.
rygaroo - Thursday, October 3, 2013 - link
I still run an O.C. Q6600 :) but my GPU just died (8800GTS 512MB). Do you suspect that the lack of FPS in Civ V for the Q9400 is due more to the motherboard limitation of PCIe 1.1, or more to the shortcomings of an old architecture? I don't want to spend a lot of money on a new high-end GPU if my Q6600 would be crippling it... but my mobo has PCIe 2.0 x16, so it's not a real apples-to-apples comparison with the shown Q9400 results.

JlHADJOE - Friday, October 4, 2013 - link
I tested for that in the FFIV benchmark. I had PrecisionX running and logging stuff in the background while I ran the benchmark. It turned out the biggest FPS drops coincided with the lowest GPU utilization, and that pretty much nailed the fact that my Q6600 @ 3.0 was severely bottlenecking the game.
Tried it again with CPU-Z, and indeed the FPS drops aligned with high CPU usage.
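The diagnostic the commenter describes is worth spelling out: if the lowest-FPS moments line up with *low* GPU utilization, the GPU is sitting idle waiting on the CPU, which is the classic CPU-bottleneck signature. A minimal sketch of that analysis, using made-up sample data in place of a real PrecisionX log export:

```python
# (fps, gpu_util_percent) pairs -- illustrative data, not real log output.
samples = [
    (60, 99), (58, 98), (31, 55), (29, 50), (61, 97), (33, 58),
]

threshold = 40  # assumed cutoff: call anything under 40 FPS a "drop"
drops  = [util for fps, util in samples if fps < threshold]
steady = [util for fps, util in samples if fps >= threshold]

def avg(xs):
    return sum(xs) / len(xs)

print(f"GPU util during drops: {avg(drops):.0f}%")   # low  -> GPU starved, CPU-bound
print(f"GPU util otherwise:    {avg(steady):.0f}%")  # high -> GPU doing the work
```

If utilization stayed near 100% during the drops instead, the GPU itself would be the limit and a faster CPU would not help.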