Today we're taking a look at CPU performance in Battlefield 2042, and this may well be the most difficult benchmark we've ever done. The problem faced when trying to test a multiplayer game like Battlefield 2042 is that it's extremely difficult to get accurate comparative data. Testing one or two hardware configurations isn't too difficult or that time consuming... play the game on the same map under the same conditions for a few minutes, do that three times to record the average, and you have a pretty good idea of how they compare. It might not be an exact apples to apples comparison, but it's certainly ballpark.

But testing 20+ configurations to compare a wide range of CPUs is a massive undertaking, and long story short, it's taken me 7 straight days of doing nothing but trying to load into a 128-player conquest match on the same map (and succeeding!).

For testing we've used the Orbital map, and of course that map wasn't always available in the rotation, so we had to wait for it to cycle into use. This, along with a number of other factors, meant that I was only able to test 3 or 4 CPUs per day. We've also included separate 60-second testing on the same map using bots, which is a more controlled environment as I have a fixed number of players in the game and they're all active. It isn't as CPU demanding as there are fewer players, plus the AI load is different. But we're more confident in the accuracy of that data because it's a more controlled test.

The 128-player conquest data is based on three minutes of gameplay, and because the number of players in the server can change, as well as what the players are actually doing, there's more variance, but the 3-run average helps to address this. Just please be aware the margin for error is higher when compared to our more controlled testing, and I was certainly seeing a bigger run-to-run variance.
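The 3-run averaging described above can be sketched in a few lines. This is a minimal illustration of the idea, reporting the mean alongside the worst run-to-run deviation so the extra variance of live matches stays visible; the run results below are invented, not taken from the article's data.

```python
# Sketch of 3-run averaging: report mean fps plus the worst deviation
# from that mean, so run-to-run variance isn't hidden by the average.
# The run results are invented for illustration.

def summarize_runs(runs_fps):
    mean = sum(runs_fps) / len(runs_fps)
    spread = max(abs(r - mean) for r in runs_fps)  # worst deviation from mean
    return mean, spread

mean, spread = summarize_runs([96.0, 101.0, 103.0])
print(f"{mean:.1f} fps +/- {spread:.1f}")
```

A larger spread relative to the mean is exactly the "bigger run-to-run variance" flagged above for live 128-player matches.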

It'southward likewise worth noting that during our testing, the game received a patch, and Nvidia released an updated GPU commuter. We used the GeForce RTX 3090 for testing as information technology was typically faster than the Radeon RX 6900 XT in this game, every bit we discovered in the GPU criterion a calendar week ago.

Neither of these updates affected the results. We believe the Nvidia driver mostly addressed DLSS-related issues, while the game patch mostly addressed stability and bug fixes. The test was verified with GeForce Game Ready Driver 496.76 WHQL drivers, using the latest version of the game. All configurations used 32GB of dual-rank, dual-channel DDR4-3200 CL14 memory, and we've also included some memory results using a few different configurations for good measure.

For now, let's start with the CPU testing…

Benchmarks

Typically, we test CPU performance at lower resolutions such as 1080p to help remove any GPU bottlenecks, though with Battlefield 2042 multiplayer that's not really necessary, as you'll see soon when we test at 1440p.

But here at 1080p, we see that for the best performance you'll want a 12th-gen Intel processor, though we're talking about an 8% performance advantage for the 12900K over the 5950X. The 12600K was also 9% faster than the 5800X, though the 1% low and 0.1% low data was comparable. What's really interesting to note though is that despite the game being very CPU demanding -- at least by normal gaming standards -- the 5950X was only 5% faster than the 5600X when comparing the average frame rate, and up to 14% faster for the 1% low.
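The 1% and 0.1% "low" figures cited throughout are derived from captured frame times. As a minimal sketch, here's one common way to compute them: average the slowest 1% (or 0.1%) of frames and convert to fps. Note that some capture tools report the percentile frame time instead, and the frame-time values below are invented for illustration.

```python
# Sketch: deriving average fps and percentile "lows" from a capture of
# per-frame times in milliseconds. One common convention is to average
# the slowest 1% / 0.1% of frames, then convert that to fps.

def fps_metrics(frame_times_ms):
    """Return (average fps, 1% low fps, 0.1% low fps)."""
    worst_first = sorted(frame_times_ms, reverse=True)  # slowest frames first

    def low_fps(fraction):
        # Average the slowest `fraction` of frames, then convert to fps.
        n = max(1, int(len(worst_first) * fraction))
        avg_ms = sum(worst_first[:n]) / n
        return 1000.0 / avg_ms

    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return 1000.0 / avg_ms, low_fps(0.01), low_fps(0.001)

# Invented example: mostly ~10 ms frames (100 fps) with occasional 25 ms spikes.
capture = [10.0] * 2970 + [25.0] * 30
avg, low1, low01 = fps_metrics(capture)
print(f"avg {avg:.0f} fps, 1% low {low1:.0f} fps, 0.1% low {low01:.0f} fps")
```

This is why a game can average near 100 fps yet still feel rough: the lows capture the stutters that the average hides.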

Battlefield 2042 utilizes 8 cores when available, so with the 5950X half the cores essentially did nothing. In the case of Zen 3, the 8 cores aren't maxed out, so the 5950X saw utilization in 128-player games of around 30 to 40%.

With the 5800X, utilization was more in the range of 70-80%. So the only reason the 5950X was a few frames faster would be the slight increase in frequency, as things like cache capacity are the same per CCD.

The 5600X was similar despite being a 6-core/12-thread CPU, as the game didn't max out the 5800X, meaning a fast 6-core Zen 3 processor is still fine, though it's right on the edge with utilization often locked at 100%. Despite this heavy utilization, the game didn't stutter, at least no more than what was witnessed using higher core count Zen 3 processors. But it does mean you're right on the edge with this part, and slower 6-core/12-thread processors will start to see a decline in performance, assuming your GPU is capable of driving over 100 fps using your desired quality settings.

So for the newer CPU architectures it's less about core count and more about the IPC that particular architecture offers. There is a little more variance on Intel's side as L3 cache capacity increases from Core i5 to i7 and from i7 to i9.

The difference between the 11th-gen models is minimal, and we're only talking about 6 cores vs 8 cores. With 10th-gen we do see as much as a 15% variation between the 10600K and 10900K, and this is largely due to the L3 cache capacity.

AMD's Zen 2 processors mixed it up with Intel 10th-gen, and it was good to see parts like the Ryzen 5 3600X neck and neck with the Core i5-10600K. As we go down towards the Zen+ parts you can see how these older Ryzen CPUs are starting to show their age. The 2700X struggled despite being an 8-core processor, with 0.1% lows of 40 fps, and while the 79 fps average was still respectable, the 5800X was 43% faster here.

Modern 4-core/8-thread CPUs can still technically play Battlefield 2042, but you can expect a lot more noticeable stuttering than what you'd get on equivalent 6 and 8-core models. What can't deliver playable performance are 4-core/4-thread CPUs such as the Core i3-8350K; the game was essentially broken on this CPU, providing nothing but constant stuttering.

Looking over these numbers, the most surprising part is that even when throwing a significant amount of processing power at the game, it's difficult to push over 100 fps in large 128-player matches. We'll talk more about this towards the end of the article.

The 1440p results are interesting as they reflect the GPU testing much more closely, where the CPU limits were largely removed. So these results are more GPU limited when using high-end CPUs such as Intel 12th-gen or AMD Ryzen 5000 series. For the rest of the CPUs, the performance figures are about the same. For example, the Core i9-11900K dropped from 113 fps on average to 110 fps.

This explains why a lot of Battlefield players haven't been able to improve performance by lowering the resolution or reducing quality settings: they're just not GPU limited, but they're also probably not always CPU limited in the way they think they are.

Testing with Bots

Now this data is based on a custom bot match using nothing but the game's AI, and when compared to the 128-player results we just saw, CPU utilization for a part like the Ryzen 5 5600X dropped by ~15%. That's enough to boost GPU performance by around 30%, and improve 1% lows by a massive 50%, though interestingly 0.1% lows remained much the same, at least for the 5600X.

It's a similar story with older 6-core/12-thread processors like the 2600X, where the average frame rate increased by 20% while the 1% low was boosted by an incredible 59%.

But it's not just the mid to low-end CPUs that benefit massively from this lighter workload. The 12900K's average frame rate jumped by 33% with a 94% increase in 1% low performance. It's interesting to see such a massive change in performance from what is only a very small difference in utilization. But of course, the CPU load is likely very different, which is why utilization figures on their own can be quite misleading.

Jumping to 1440p, we go GPU bound, and this sees processors from the 10600K up all delivering similar average frame rates, though the 12th-gen CPUs were much better when looking at the 0.1% low performance.

Memory Benchmarks

Given that we're heavily CPU limited in Battlefield 2042, it makes sense that memory would play a key role when it comes to performance, and it sure does. That said, we saw no improvement when moving to DDR5-6000 with the 12900K, and in fact we saw a slight frame rate regression. That's a real shame and yet another blow to the current state of DDR5.

Moving on to the Ryzen 9 5950X results, I installed some budget DDR4-3000 CL18 single-rank memory, and here we see that the low-latency DDR4-3200 memory boosts performance by 18%, with a 15% improvement to 1% lows. That's a significant difference given DDR4-3000 and 3200 are similar, at least in terms of frequency. There is, of course, a big difference between CL18 and CL14 timings.
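The gap between those two kits is clearer in nanoseconds than in timings. As a quick back-of-the-envelope calculation (our own, not from the test data): DDR transfers twice per clock, so the first-word CAS latency in ns is the CAS cycle count divided by the memory clock, i.e. `cas * 2000 / MT/s`.

```python
# Back-of-the-envelope CAS latency for the two kits compared above.
# DDR's "MT/s" rating is twice the clock, so:
#   latency_ns = cas_cycles * 2000 / transfer_rate_mts

def cas_latency_ns(transfer_rate_mts, cas):
    return cas * 2000.0 / transfer_rate_mts

print(cas_latency_ns(3200, 14))  # DDR4-3200 CL14 -> 8.75 ns
print(cas_latency_ns(3000, 18))  # DDR4-3000 CL18 -> 12.0 ns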

Using the same memory configurations we saw up to a 21% improvement with the 8700K, and although this is a completely different CPU architecture, it makes sense that the more CPU limited you are, the bigger the boost higher-quality memory can give you. So if you have an 8700K and you're only able to drive around 70 fps with a relatively high-end GPU, tuning your memory could lead to noticeable performance improvements.

What We Learned

As many gamers have noticed, Battlefield 2042 is a very CPU demanding game, and pushing past the 100 fps barrier can be a real challenge. But is this a failure on the developer's part, is the game heavily unoptimized, and can it be fixed?

As we see it, the problem Battlefield 2042 faces is that of all modern games. Yes, it's very CPU intensive, but gamers shouldn't be concerned about the percentage utilization of their CPUs in Battlefield, as other aspects of the CPU might be limiting performance, such as cache performance or memory latency, which aren't included in that figure. If they changed the game to utilize the CPU more and bump up that number on high-end CPUs, the load on the CPU itself would be increased, which would cause performance problems on lower-end CPUs. The game seems very demanding on multiple aspects of the CPU, and optimizing for multiple areas could be hard, but that doesn't mean it's unoptimized overall.

So if you think it's bad now, many gamers would have no chance of achieving playable performance if the game was maxing out modern 8 or 12-core processors, for example. If we look at the official system requirements and focus on the AMD processors (the Intel recommendations are rubbish), we see that the minimum spec is a Ryzen 5 1600, and based on the testing we've seen here from parts like the R5 2600X, which is only marginally faster, that recommendation makes sense; it's an absolute minimum.

The recommended spec calls for at least a Ryzen 7 2700X, and this is where you want to be at a minimum. But it has to be said, while this CPU was still a bit overwhelmed, the game was perfectly playable, so the recommendation makes sense. Had the developer utilized the CPU more heavily, the recommended spec would become something like the 5800X, and at that point very few could enjoy the game.

So while gamers are often quick to criticize the developer by blurting out generic terms like unoptimized, the truth is it's a lot more complicated than that. And Battlefield 2042 does have a lot going on: they've doubled the player count, and that's basically shown quad-core processors the door while putting the heat on previous generation 6-core/12-thread processors. The game also features an advanced destruction system, weather effects, and so on.

At this point, we don't think the level of CPU usage we're seeing is unjustified or suggests the game is poorly optimized (referring to CPU/GPU... there have been other complaints). Could more be done to optimize the game? Probably, but would that radically change performance without compromising on player numbers or effects? I doubt it.

It's a fine balance between making the game playable for the majority of the fan base, and adding new features like increasing the player count to make the game more exciting. You can't just do more while requiring less, and I think that's what a lot of gamers were expecting.

Finally, if you're in the (small?) group of gamers who are willing to give Battlefield 2042 a second chance, what's the best CPU to go for? If you're on the AM4 platform, the Ryzen 7 5800X looks to be the best option, though as far as we can tell, the cheaper 5600X works just fine.

The 5800X has come down in price, and at $390 it offers a better price-per-core ratio than the 5600X. Given that and the added headroom, it's probably the way to go. For Intel owners it depends on what you have. For anything older or slower than the Core i5-10600K, the upgrade to 12th-gen with the 12600K or above is going to net you ~30% greater performance, which in Battlefield 2042 is a significant improvement.

Right now the Core i5-12600K does look like the perfect CPU for Battlefield 2042; throw it on the MSI Z690 Pro-A or Gigabyte Z690 UD and you have a powerful $500 combo. And of course, DDR5 isn't required; in fact, you're best off avoiding it for now.

Speaking of memory, because this is a very CPU sensitive game, memory does influence performance more than in other games. For those gaming at 4K this will be less of an issue, but for those trying to drive as many frames as possible at lower resolutions, tightening up timings and increasing the frequency will dramatically improve performance.

As for how much RAM you require? Not a lot. 16GB is plenty, as total system usage when playing Battlefield 2042 never exceeded 12 GB in our testing. Mostly, it hovered around 10 GB, and that was with 32 GB installed in our testbed. The only time you're going to creep well over that is when running out of VRAM, and the game does use a lot of VRAM with the ultra quality settings, so you'll want at least an 8GB graphics card, or more for playing at 1440p and higher.

Circling back to our earlier GPU testing, if we look at the 1080p data you only need a GeForce RTX 3060 Ti or Radeon RX 6700 XT when using a high-end CPU, and that's because the CPU will be the main performance-limiting component, not the GPU. For 1440p, the RTX 3080 or 6800 XT will be required, and our 1440p CPU and GPU data is very similar, with just a 10% variation in performance between the two different test methods.

That's Battlefield 2042's CPU and system performance in a nutshell.
Bring a big CPU because you're going to need it...

Shopping Shortcuts:
  • Intel Core i9-12900K on Amazon
  • Intel Core i7-12700K on Amazon
  • Intel Core i5-12600K on Amazon
  • AMD Ryzen 9 5950X on Amazon
  • AMD Ryzen 9 5900X on Amazon
  • AMD Ryzen 7 5800X on Amazon
  • Intel Core i9-11900K on Amazon
  • Intel Core i7-11700K on Amazon
  • Nvidia GeForce RTX 3070 Ti on Amazon
  • AMD Radeon RX 6800 XT on Amazon