AMD Ryzen 7 1800X reviewed: Zen is an amazing workstation chip with a 1080p gaming Achilles heel

After years of waiting, AMD’s Ryzen has finally arrived. The company has spent years in the proverbial desert, struggling with Bulldozer improvements while simultaneously designing its new Zen architecture. It’s no exaggeration to say AMD’s future as a PC company depends on Ryzen’s success.

AMD has positioned Ryzen aggressively, with price points that compare extremely favorably against Intel’s Core i7 family and HEDT desktop parts, but there have always been questions about how well Ryzen would perform. It’s been years since AMD fielded a high-performance CPU design, and the company doesn’t have the same market share it once commanded. If the company is serious about pushing Ryzen into workstations, consumer PCs, and servers, it’s got a very high bar to clear.

Today’s review will focus on the Ryzen 7 1800X’s performance, rather than rehashing Ryzen’s design or architecture. Anyone with questions is invited to peruse those articles or ask in comments below. Before discussing Ryzen’s performance, however, we need to talk about its launch. Normally, a manufacturer gives us 7-10 days at minimum; the more significant the product, the longer the review window. AMD bucked this trend by launching Ryzen less than a week after we received our hardware kits. Because I’d previously committed to attend Nvidia’s GTX 1080 Ti launch at GDC, I only had 60 hours to test the Ryzen 7 1800X.

AMD plans to hit Intel where it hurts — its wallet.

As short as our review window was, I may have gotten lucky. In at least a few cases, reviewers didn’t receive their kits until 24-48 hours before the launch. I’ve run enough benchmarks on the Ryzen 7 1800X to feel comfortable characterizing this article as a review, but it’s also something of a hot mess. I suspect you’ll see wide variation in reported benchmarks and experiences; don’t be surprised if different people have different results.

The limited testing time was exacerbated by the motherboard AMD shipped us for testing. Asus has a well-deserved reputation for quality, but my Crosshair VI Hero threw so many errors that AMD ultimately concluded I might have a bad board, not just a wonky BIOS. To be clear: I was scarcely the only reporter to experience problems or see odd performance, but our board seems to have been at the low end of the bell curve. Both Asus and AMD worked with us extensively and their assistance was much appreciated, but extensive troubleshooting still cut deeply into my test time. After switching to a Gigabyte Aorus X370 motherboard, I retested our game benchmarks, beginning around midnight. While the Gigabyte motherboard did improve the situation slightly and was markedly more stable, it didn’t resolve the gaming Achilles heel I referred to in the title (we’ll be exploring that issue in greater detail below).


As a result, our tests are not as thorough as we would have liked. We redesigned our CPU benchmark suite in preparation for Ryzen, but didn’t have time to run every CPU through every new test. Because we needed to test in parallel, not every CPU could be benchmarked with the same cooler or the same SSD. We chose the tests we did partly to ensure that these differences would not meaningfully impact our results, and our future deep dives into the chip will standardize on common hardware once again.

With those caveats in mind, let’s check the numbers.

Test setup:

We’ve expanded our CPU test suite since the Core i7-7700K launched, but didn’t have time to run every test on every processor. Our Intel Core i7-6700K, 7700K, 6900K, and 6950X all used 32GB of G-Skill DDR4-3200 (F4-3200C14Q-32GTZ) clocked at that frequency. Neither of our Ryzen testbeds, however, was capable of running four DIMMs at these clocks. Since we’d already tested the Intel systems, we had to make a choice: 16GB of DDR4-3200 or 32GB of DDR4-2133. Since none of our benchmarks require >16GB of RAM and AMD isn’t using quad-channel memory for Ryzen, we opted for 16GB.

All of our GPU and 3D benchmarks were run using a GeForce GTX 1070, at 1920×1080. While this isn’t considered an enthusiast resolution anymore, the point of these tests is to stress the CPU, not the GPU. All of our testbeds ran Windows 10 with the latest patches and updates installed. All of our GPU benchmarks were performed with Nvidia’s ForceWare 376.88.

We’re going to split our benchmark results into two groups: workstation and application tests, and 3D benchmarks. The workstation and content creation tests are up first. Our test results and analysis are in the slideshow below. As always, you can click on any slide to expand it in a new window.

In workstation and computation tests, Ryzen is a force to be reckoned with. Even when it doesn’t match Intel in raw performance, its performance-per-dollar gives it a huge advantage over its much larger, more expensive rival. That said, there’s a narrow case to be made for chips like the Core i7-7700K, particularly in lightly threaded workloads. If you’re still dealing with single-threaded or dual-threaded applications, the Core i7-7700K can still deliver the best performance per dollar. In most cases, however, the Ryzen 7 1800X is in a class of its own. That’s also true for gaming — but in very different contexts.

Results like these are guaranteed to raise questions, and we’ve spoken to AMD extensively over the past few days to explore the issue. According to AMD, there are three issues collectively contributing to these problems. First, Ryzen’s SenseMi technology and Precision Boost are extremely fine-grained controls that offer significantly finer granularity than any previous AMD solution, which means BIOS implementations of these features are new and not necessarily working at 100% efficiency yet. Second, AMD has been out of the high-performance market for so long that virtually no software is written explicitly for, or optimized to perform well on, AMD CPUs. Ryzen puts AMD on a far better footing, but software patches don’t arrive overnight. Third, some games are far more sensitive to the differences between AMD and Intel CPUs than others. We happened to pick a test suite with more of these slowdowns than most, and even then we don’t see the problem in every test (Vulkan, for example, runs quite well).

The other reason AMD missed the problem is that it chose 1440p as its minimum test resolution, figuring that no one with a $500 CPU would still be gaming at 1080p. I can understand that argument even if I don’t normally find it persuasive (I prefer to keep a lower resolution to allow CPU performance to shine through). According to AMD, the difference in performance between itself and Intel is much reduced at 1440p and completely eliminated at 4K. I haven’t had the opportunity to verify those figures yet, but it does make sense — as resolution rises, the bottleneck in the system moves from the CPU to the GPU. If you game at 1440p or above, these results may not have much bearing on your experience.
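The CPU-to-GPU bottleneck shift can be illustrated with a toy frame-time model: per-frame CPU work is roughly resolution-independent, while GPU work scales with pixel count, and the frame rate is limited by whichever is slower. The numbers below are purely illustrative assumptions, not measured figures from our testing.

```python
# Toy model: frame time is the slower of CPU work (resolution-independent)
# and GPU work (scales with pixel count). All numbers are hypothetical.
def frame_rate(cpu_ms, gpu_ms_per_mpixel, width, height):
    mpixels = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpixel * mpixels)
    return 1000.0 / frame_ms

# Assumed figures: a slower CPU needs 12 ms per frame, a faster one 8 ms;
# the GPU needs 4 ms per megapixel.
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    slow = frame_rate(12.0, 4.0, w, h)
    fast = frame_rate(8.0, 4.0, w, h)
    print(f"{w}x{h}: slower CPU {slow:.0f} fps, faster CPU {fast:.0f} fps")
```

With these assumed numbers, the CPUs diverge at 1080p (CPU-bound), but by 1440p and 4K the GPU term dominates and both deliver identical frame rates, which is the pattern AMD describes.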

I had a number of conversations with AMD on the game performance question, as well as discussions of board stability in general. Having tested a second motherboard, I think many of my stability and performance concerns were driven, at least in part, by faulty hardware. That said, it’s a bit odd that Ryzen is relatively weak in gaming while being such a vast improvement over the FX-9590 and offering extremely strong application/workstation performance. It’s possible that AMD’s heavy reliance on multi-threading, while effective in non-gaming tests, hurt the chip in game tests. This last, however, is merely speculation on my part.

Last but not least, here’s a touch of icing for the proverbial cake.

(Prime95 power consumption results)

There is one caveat to be aware of. Remember, all of our Intel rigs used 32GB of DDR4-3200, while the AMD systems could only use 16GB of the same RAM. While this is undoubtedly having an impact on the results, it’s not going to tilt them in some crazy direction; 16GB of RAM doesn’t consume 42W of power, and that’s how much it would have to draw to bring the 1800X and 6900K into line with one another.
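A quick back-of-the-envelope check supports that point. A DDR4 DIMM under load typically draws somewhere in the low single-digit watts; the figure below is a deliberately generous assumption, not a measurement from our testbeds.

```python
# Sanity check: could the missing 16GB of DDR4 explain a 42W power gap?
watts_per_dimm = 5.0    # generous assumed upper bound for a DDR4 DIMM under load
dimms_removed = 2       # 32GB (4x8GB) on Intel vs 16GB (2x8GB) on the Ryzen rig
max_ram_delta_w = watts_per_dimm * dimms_removed
required_gap_w = 42.0   # the gap cited in the text
print(max_ram_delta_w, required_gap_w)  # even a generous RAM estimate falls far short
```

Even assuming an unusually power-hungry 5W per DIMM, the two absent modules account for perhaps 10W, a quarter of the 42W that would be needed to equalize the 1800X and 6900K results.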

Conclusions

We still have plenty of questions about Ryzen and its performance, and we’ll be revisiting these topics in days to come. How strong Ryzen is, in an absolute sense, depends on where your interests lie. Evaluated strictly as a gaming chip, Ryzen is a good (but not great) option, due to its significant deficits against Intel. Evaluated as a workstation processor or 3D rendering solution, it’s extraordinary. The formal list price on the Core i7-6900K is $1,089 to $1,109. List price on the Ryzen 7 1800X is $500. I still think AMD should’ve held off launching until its partners had a bit more time to improve their boards, but that’s a decision made well above my pay grade.

If you’re unhappy about Ryzen’s gaming weaknesses relative to Intel, I’d suggest taking a walk down memory lane. When AMD began to gain ground on Intel in the late 1990s, it didn’t leap from budget chip manufacturing to the Athlon 64 X2 in a single bound. The K6 and K6-2 were adequate chips for budget-conscious gamers and Windows desktop software, but they weren’t going head-to-head with Intel’s Pentium IIs and winning. It took years, and multiple product iterations, before the Athlon 64 was ready to tackle Intel in the server room. Given how far AMD’s market share has fallen, it will take the company at least a year to rebuild any serious presence.

I’ve said for years that Ryzen didn’t need to beat Intel in every particular, it simply needed to offer a viable alternative at a good price. Ryzen more than does that. It may not be the best chip at everything, but it’s more than good enough to help AMD win back some badly-needed market share. Even more importantly, this is a CPU core that AMD can scale up and out as the need arises. It’s great to have competition back in the CPU market.


ExtremeTech
