The Intel Core i9-9900KS Review: The 5 GHz Consumer Special
by Dr. Ian Cutress on October 31, 2019 10:45 AM EST

Test Bed and Setup
As per our processor testing policy, we take a premium category motherboard suitable for the socket, and equip the system with a suitable amount of memory running at the manufacturer's maximum supported frequency. This is also typically run at JEDEC subtimings where possible. It is noted that some users are not keen on this policy, stating that sometimes the maximum supported frequency is quite low, that faster memory is available at a similar price, or that the JEDEC speeds can hold back performance. While these comments make sense, ultimately very few users apply memory profiles (either XMP or other), as they require interaction with the BIOS; most users will fall back on JEDEC supported speeds. This includes home users as well as businesses that might want to shave a cent or two from the cost, or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.
| Test Setup | |
| :--- | :--- |
| Intel 9th Gen | Intel Core i9-9900KS |
| Motherboard | MSI Z390 Gaming Edge AC (A.60 BIOS) |
| CPU Cooler | TRUE Copper |
| DRAM | Corsair Vengeance 2x8 GB DDR4-2666 |
| GPU | Sapphire RX 460 2GB (CPU Tests), MSI GTX 1080 Gaming 8G (Gaming Tests) |
| PSU | Corsair AX860i |
| SSD | Crucial MX200 1TB |
Many thanks to...
We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.
235 Comments
Agent Smith - Friday, November 1, 2019 - link
Only a one year warranty with this CPU, reduced from 3yrs. So it's marginally faster, uses more power, offers no gaming advantages, and its price hike doesn't justify the performance gain and warranty disadvantage over the 9900K... and the 3950X is about to arrive. Mmm?
willis936 - Friday, November 1, 2019 - link
Counter-Strike really needs to be added to the benchmarks. It's just silly how useless these gaming benchmarks are. There is virtually nothing that separates any of the processors. How can you recommend it for gaming when your data shows that a processor half the price is just as good? Test the real scenarios that people would want to use this chip for.

Xyler94 - Friday, November 1, 2019 - link
It's more because you need a specific set of circumstances these days to see a difference in gaming that's more than margin of error. You need at least a 2080, but preferably a 2080 Ti.
You need absolutely nothing else running on the computer other than OS, Game and launcher
You need the resolution to be set at 1080p
You need the quality to be at medium to high.
Then you can see differences. CS:GO shows nice differences... but there's no monitor in the world that can display 400 to 500 FPS, so yeah... AnandTech still uses a 1080, which is hardly taxing to any modern CPU, and that's why you see no differences.
willis936 - Friday, November 1, 2019 - link
CS:GO is a proper use case. It isn't intense, graphically, and people regularly play at 1440p120. Shaving milliseconds off input-to-display latency matters. I won't go into an in-depth analysis as to why, but imagine a human response time has a Gaussian distribution and whoever responds first wins. Even if the mean response time is 150 ms, if the standard deviation is 20 ms and your input-to-display latency is 50 ms, then there are gains to cutting 20, 10, even 5 ms off of it. And yes, more fps does reduce input latency, even in cases where the monitor refresh rate is lower than the fps.
https://youtu.be/hjWSRTYV8e0
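willis936's argument above can be checked with a quick Monte Carlo sketch: model both players' reaction times as Gaussian (mean 150 ms, standard deviation 20 ms, the figures from the comment), add each player's input-to-display latency, and see how often the lower-latency player reacts first. The function name and parameters here are illustrative, not from any benchmark suite.

```python
import random

def win_prob(latency_a_ms, latency_b_ms, mu=150.0, sigma=20.0,
             trials=100_000, seed=0):
    """Estimate how often player A responds before player B.

    Each player's total response time = Gaussian reaction time
    (mean mu, std dev sigma) + fixed input-to-display latency.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        a = rng.gauss(mu, sigma) + latency_a_ms
        b = rng.gauss(mu, sigma) + latency_b_ms
        if a < b:
            wins += 1
    return wins / trials

if __name__ == "__main__":
    # Equal latency: coin flip, ~0.50.
    print(win_prob(50, 50))
    # A 10 ms latency advantage (40 ms vs 50 ms): analytically
    # Phi(10 / (20 * sqrt(2))) ~ 0.64, a real edge over many rounds.
    print(win_prob(40, 50))
```

Under these assumptions, a 10 ms latency cut moves the duel from a 50/50 to roughly a 64/36 proposition, which is the sense in which "even 5 ms" matters when games are decided over hundreds of engagements.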
Xyler94 - Tuesday, November 5, 2019 - link
If you visually can't react fast enough, it doesn't matter how quickly the game can take an input; you're still limited by the information presented to you. 240 Hz is the fastest you can go, and 400 FPS vs 450 FPS isn't gonna win you tournaments.

CS:GO is not a valid test, as there's more to gaming than FPS. Input lag is more about the drivers and peripherals, and there's even lag between your monitor and GPU to consider. But go on, pretend 50 FPS at 400+ makes that huge of a difference.
solnyshok - Friday, November 1, 2019 - link
No matter what GHz, buying a 14nm/PCIe 3 chip/mobo just before 10nm/PCIe 4 comes to the market... Seriously? Wait another 6 months.

mattkiss - Friday, November 1, 2019 - link
10nm/PCIe 4 isn't coming to desktop next year, where did you hear that?

eek2121 - Friday, November 1, 2019 - link
The 3700X is totally trolling Intel right now.

RoboMurloc - Friday, November 1, 2019 - link
I dunno if anyone mentioned it yet, but the KS has additional security measures to mitigate exploits, which are probably causing the performance regressions.

PeachNCream - Friday, November 1, 2019 - link
I expect I will never own an i9-9900KS or a Ryzen 7 3700X, but it is interesting to see how close AMD's 65W 8 core chip gets to Intel's 127+W special edition CPU in terms of performance in most of these benchmarks.