Red Dead Redemption 2 is finally here on PC, and it has a ton of graphics settings to play with. It also has stability issues and requires a fast CPU (or a workaround to help eliminate stuttering)—it’s been a rough launch on PC for many players. Your PC might be in need of an upgrade to run it well, in other words, and Black Friday deal season should be an ideal time for some holiday purchases. There’s also the important choice between the DX12 and Vulkan graphics APIs, not to mention performance across popular CPUs and GPUs. It’s a lot to cover, so let’s get to it.
A word on our sponsor
As our partner for these detailed performance analyses, MSI provided the hardware we needed to test Red Dead Redemption 2 on a bunch of different AMD and Nvidia GPUs, multiple CPUs, and several laptops. See below for the full details, along with our Performance Analysis 101 article. Thanks, MSI!
Looking at the PC features, the list of graphics settings is good if perhaps a bit overkill (see below). Resolution support is good—I was able to select widescreen, ultrawide, and doublewide resolutions, as well as old school 4:3 stuff like 1024×768. Except RDR2 doesn’t properly handle those in fullscreen mode.
Borderless windowed and windowed modes are fine, but in fullscreen mode RDR2 just stretches a standard 16:9 image to the chosen resolution. It’s not a good look. Any change in aspect ratio also requires a game restart to work properly (otherwise the UI layout is off). It’s not terrible, but RDR2 gets a yellow on aspect ratios. The FOV, meanwhile, can be adjusted for both first- and third-person cameras, but the range is quite limited, so it’s another yellow.
Controller support and remapping the controls get green happy faces. The latter is pretty much required unless you have extra fingers and appendages, or like the brain-bending camera-relative horse controls for some reason.
RDR2 stuttering
Even if RDR2 is officially component agnostic, it hasn’t been without problems. I’ve tested the game on quite a few different CPUs and GPUs, and helped James put together this guide on how to eliminate RDR2 stuttering. Rockstar says a potential fix for the stuttering should roll out today.
Official mod support isn’t really a thing, but there are already mods available for the singleplayer campaign, and more are likely to show up. Rockstar is taking the same stance as with GTA5: no mods for multiplayer or you might get banned, but for singleplayer use mods are generally okay. Just be careful if you’re toying with mods and then launch Red Dead Online—you’ll want to remove any extra files first.
One final piece of good news is that, like Grand Theft Auto 5, Red Dead Redemption 2 is officially component agnostic. Whether your graphics card comes from AMD or Nvidia, or you’re running a gaming CPU from AMD or Intel, RDR2 generally doesn’t care—Rockstar doesn’t have a stake in the hardware vendor game. That doesn’t mean all CPUs and GPUs are guaranteed to run flawlessly (more on this in a moment), and I encountered quite a few crash-to-desktop events in the testing that I’ve completed. But at least it wasn’t specifically designed to favor one component vendor.
Red Dead Redemption 2 settings overview
Like GTA5, RDR2 has no presets for graphics. The game will attempt to auto-detect settings that it deems appropriate for your hardware, but you’ll almost certainly end up wanting to tweak things. There’s a slider labeled “Quality Preset Level” that might seem like a good starting point, but it has 21 tick marks, many of which apparently overlap, and nebulous targets: ‘favor performance,’ ‘balanced,’ and ‘favor quality.’ The issue is that one PC’s ‘balanced’ settings won’t always be the same as another PC’s, and many of the advanced settings (which are locked by default) get set to different values depending on your CPU and GPU.
Figuring out exactly how to reliably benchmark RDR2 took a bit of trial and error, but I’ve got that sorted out now. The simple solution is to manually configure every setting for each hardware combination I test, after cranking the variable “Quality Preset Level” to minimum—annoying, but it could be worse. I’m starting with the most popular graphics cards, and I’ll add other cards and CPUs over the coming days.
I’ve standardized on the Vulkan API, which generally gives higher average fps. Since I’m using a fast CPU for the graphics card testing, at least I’m not getting hit with massive stutters that require workarounds (see above boxout). DX12 can in some cases deliver more stable framerates (ie, better minimum fps), but Vulkan is the default and I’ve got some additional API test results below for those who are interested. If you’re using Vulkan and performance seems bad, try DX12—and vice versa.
Also note that 3GB cards can’t even attempt to run with all settings at ultra, and 2GB cards are limited to low on many settings. If you have an older GPU with only 1GB or 1.5GB VRAM, I’m not sure what will happen, but you’ll probably be locked into whatever settings the game decides to use. GTA5 has an “ignore memory limits” option, but RDR2 has no equivalent (presumably to mitigate instability) and won’t allow you to exceed your GPU’s VRAM.
If you just stick with the default settings on a 1080p display (or drop to 1080p on a higher resolution monitor), you’ll probably be okay. And by okay, I mean that if you have at least a GTX 1060 / RX 570 or faster graphics card with 4GB VRAM, you can probably run RDR2 at 1080p and get 30-60 fps. Unless your CPU is a problem, or some other software is a problem, or the game keeps crashing, or … you get the point. I haven’t had too many problems on my test PCs, but then I’ve been running benchmarks more than playing the game. Regardless, if you want to know how the various GPUs and CPUs stack up, that’s what I’m here for.
Running through Red Dead Redemption 2’s graphics settings, there are about 40 different options to adjust. As a baseline, I drop the Quality Preset Level to minimum, then unlock the advanced graphics settings and set everything there to minimum. Then I set the main options to maximum quality but leave MSAA off. That’s the starting point for my “ultra quality” settings and the performance charts above. The difference between maximum and minimum on the Quality Preset Level when customizing all the other settings isn’t massive, but it’s measurable, and it’s best to be sure.
You can see screenshots of my test settings in the above gallery, if you’re interested. Let me also define my low, medium, and high benchmark settings while I’m here. Low is simple: take the settings above, but set everything in the top section to low/off/minimum. Medium and high use the medium and high values for the primary settings, with 2x and 8x anisotropic filtering, respectively (and MSAA and FXAA off). Again, images of all of these are in the above gallery.
There. Now we can finally talk about what settings actually matter, as well as performance.
Most of the settings only cause a minor dip in framerates. Reflection Quality and Volumetrics Quality are the two major settings to adjust if you’re looking to improve framerates. Global Illumination Quality, Shadow Quality, Screen Space Ambient Occlusion, and Texture Quality can also provide a modest boost to performance—though lower resolution textures are very noticeable and I’d leave them at ultra or at least high on any GPU with 4GB or more VRAM. I’d also leave SSAO on medium or higher if possible.
In terms of advanced settings that can reduce performance, enabling 4x MSAA causes more than a 25 percent drop in framerates. Enable MSAA at your own peril—your GPU almost certainly can’t handle it. Unless you’re reading this in 2025, in which case, I hope your RTX 5080 or RX 8700 or whatever is awesome. In contrast, TAA is more than sufficient and causes an imperceptible 1 percent dip. Likewise, enabling 4x Reflection MSAA causes performance to drop around 8 percent, and it’s not an effect you’re likely to notice while playing.
Elsewhere, setting Parallax Occlusion Mapping Quality to ultra can cause a modest 4-5 percent drop in performance. Depending on your GPU, the Tree and Grass LOD settings can also drop performance a few percent—but setting trees to max makes them look nicer and is probably worth the hit. And finally, setting Soft Shadows to ultra reduces performance a few percent but is worth considering.
The remaining advanced settings mostly cause a very small (1-2 percent at most) dip in performance—it adds up if you crank everything to max, but individually the various options don’t matter much. The same goes for every setting I didn’t specifically call out: that’s 22 different settings that each cause less than a 1-2 percent change in performance. Maybe some of those settings matter more on an older or slower GPU, as I only checked the 2060 and 5700, but if you’re at that point you should probably just drop the resolution or use resolution scaling first.
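To put the “it adds up” point in perspective, here’s a rough back-of-the-envelope sketch. Individual performance costs compound multiplicatively rather than adding up linearly, so even sub-1-percent hits become noticeable in bulk. The 0.7 percent per-setting cost below is an assumed illustrative figure, not something I measured:

```python
# Small per-setting fps costs compound multiplicatively. Assuming each of
# the 22 minor settings costs roughly 0.7 percent on its own (an
# illustrative figure, not a measurement):
per_setting_hit = 0.007   # 0.7 percent fps cost per setting
num_settings = 22

remaining = (1 - per_setting_hit) ** num_settings
total_loss_pct = (1 - remaining) * 100

print(f"Combined fps loss: {total_loss_pct:.1f}%")  # ~14.3 percent
```

That lands in the same ballpark as the roughly 15 percent hit from maxing the advanced settings, which is why the individual deltas look negligible in isolation.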
Overall, going from my ultra to low ‘presets’ improves performance substantially—more than double the fps in my testing—while increasing all of the advanced settings from the defaults (the ‘maximum’ setting at the bottom of the settings charts) causes a modest 15 percent loss of performance. No GPU is currently able to maintain a steady 60 fps at 4K ultra, and even 1440p ultra is a stretch, so I’ve only tested those resolutions at high quality settings.
Red Dead Redemption 2 system requirements
The official RDR2 system requirements are pretty tame. Rockstar lists some relatively old hardware for its minimum recommendation, but given the amount of crashing and other problems users have reported, you should probably err on the side of higher-end components. Rockstar also doesn’t state what level of performance you should expect, and I’m guessing it’s 30 fps at 1080p low with the minimum setup, while the recommended PC hardware is probably aiming for 30 fps or more at 1080p high. Either way, you’re going to need a lot of storage space.
Minimum PC specifications:
- OS: Windows 7 SP1
- Processor: Intel Core i5-2500K / AMD FX-6300
- Memory: 8GB
- Graphics Card: Nvidia GeForce GTX 770 2GB / AMD Radeon R9 280 3GB
- Storage Space: 150GB
Recommended PC specifications:
- OS: Windows 10 April 2018 Update (v1803 or later)
- Processor: Intel Core i7-4770K / AMD Ryzen 5 1500X
- Memory: 12GB
- Graphics Card: Nvidia GeForce GTX 1060 6GB / AMD Radeon RX 480 4GB
- Storage Space: 150GB
Those specs don’t look too bad, but the CPU specs in particular are suspect—or at least, you’ll probably need to use the above workaround to eliminate lengthy stalls and stuttering. Hopefully things get sorted out over the next week or so with another patch, but right now having a PC that greatly exceeds the minimum specs is a good idea. Especially if you’re hoping for a smooth 1080p high at 60 fps, in which case you’re probably looking at an RX 5700 or RTX 2060 Super with a 6-core/12-thread CPU or better.
Red Dead Redemption 2 graphics card benchmarks
That brings us to actual performance, and I continue to use my standard testbed for graphics cards (see the boxout to the right). Red Dead Redemption 2 includes its own benchmark tool, which was used for all of the benchmark data. The built-in benchmark runs through five scenes, the first four of which are fairly static and don’t really represent areas of the game where slowdowns are likely to occur or matter. Each lasts about 20-25 seconds and none are particularly demanding, while the final sequence is a 130 second robbery followed by a horse ride through town, with some shooting—a much better test sequence that’s more representative of play.
I’m collecting frametimes from the last portion, using FrameView (an Nvidia variant of PresentMon). Each GPU is tested multiple times to verify the results, though variability between runs is relatively small. Needless to say, I’ve watched the benchmark a few too many times already. At one point, Arthur fires off up to 14 shots from his six-shooter without reloading—because he’s overclocked I guess. Anyway, the benchmark only looks at performance in one area of the game. Other areas will perform better, some will perform worse, but it at least gives a reasonable baseline measurement of the performance you can expect.
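For anyone who wants to crunch their own logs, this is roughly what that analysis looks like. The sketch below assumes a PresentMon-style CSV with an `MsBetweenPresents` column of frametimes in milliseconds; the exact column name and file layout are assumptions you’d adjust to match your own capture:

```python
import csv
import statistics

def fps_stats(csv_path, frametime_col="MsBetweenPresents"):
    """Average fps and '97th percentile minimum' fps from a
    PresentMon-style frametime log (times in milliseconds)."""
    with open(csv_path, newline="") as f:
        frametimes = [float(row[frametime_col]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / statistics.mean(frametimes)
    # The slowest 3 percent of frames define the 97th percentile minimum:
    # sort frametimes ascending and take the value 97 percent of the way up.
    cutoff = sorted(frametimes)[int(len(frametimes) * 0.97)]
    return avg_fps, 1000.0 / cutoff
```

Note that a big gap between the two numbers means a small fraction of frames are taking far longer than the rest, which is exactly the stuttering behavior discussed throughout this article.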
All of the discrete GPU testing is done using an overclocked Intel Core i7-8700K with an MSI MEG Z390 Godlike motherboard, using MSI graphics cards. AMD Ryzen CPUs are (or at least will be) tested on MSI’s MEG X570 Godlike, except the 2400G which uses an MSI B350I board since I need something with a DisplayPort connection. MSI is our partner for these videos and provides the hardware and sponsorship to make them happen, including three gaming laptops: the GL63 with RTX 2060, GS75 Stealth with RTX 2070 Max-Q, and GE75 Raider with RTX 2080.
I used the presets I defined earlier, along with the latest AMD and Nvidia drivers available at the time for testing: AMD 19.11.1 and Nvidia 441.12, both of which are game ready for Red Dead Redemption 2. I will eventually test—or at least try to test—Intel and AMD integrated graphics at 720p low. I’m not holding my breath that the former will work.
At low / minimum quality, Red Dead Redemption 2 looks okay, but the texture quality is really poor and the world in general looks very bland and blurry. There’s still plenty of geometry and objects to pretty things up, and distant surfaces look okay, but anything close to the camera starts to look like it has textures from the original Deus Ex. There’s a massive difference between low, medium, and high texture quality, and a modest difference between high and ultra.
Even at minimum quality settings, performance is nothing special. The GTX 1060, in both 3GB and 6GB variants, can average 60 fps, and so can the RX 570, but anything slower is going to struggle. Cards like the GTX 1050 only hit 40 fps, and with 2GB of VRAM, many settings can’t go any higher anyway. Minimum fps on many GPUs also falls well below 60, and Rockstar’s engine is definitely not built to hit high framerates. The fastest cards can just barely break 144 fps, but dips into the sub-100 fps range are plenty common.
AMD GPUs do better on both averages and minimums for a change. The RX 570 for instance nearly stays above 60 fps, with a 58 fps minimum in the benchmark. The 1060 6GB in contrast has 97th percentile minimums of just 48 fps. After the sketchy launch performance for AMD GPUs in Ghost Recon Breakpoint and The Outer Worlds, both AMD promoted games, I wouldn’t have expected AMD to come back swinging in RDR2. Then again, the game has been out for a year on consoles, which use AMD hardware, and it uses low-level APIs that have traditionally favored AMD. Either way, it’s a nice change of pace for Team Red (Dead).
Bumping everything up to medium quality (except the advanced options, as noted earlier), performance drops about 15-20 percent on the slower cards, while the fastest cards are still mostly CPU limited. Even the medium quality textures still don’t look great up close, but there’s a definite improvement vs. minimum quality overall.
AMD GPUs continue to lead their closest Nvidia counterparts as well—the 570 is 13 percent faster than the 1060 6GB, and 29 percent faster than the 1060 3GB. For reference, there are many games where the 1060 3GB actually comes out ahead of the 570. It’s a bit ironic to see AMD GPUs perform this well in a game that’s supposedly vendor agnostic, and perhaps drivers and patches will change things, but this is how things stand right now.
Of the cards I’ve tested so far, the RX 570 and GTX 1060 6GB still clear 60 fps averages, though minimum fps is far below that. A wide gap between the average and 97th percentile framerates usually indicates plenty of stuttering/micro-stuttering and framerate dips, which is definitely happening in RDR2. To smooth things out, you’ll want at least GTX 1660 Ti (1070 should also suffice) or RX Vega 56 level hardware—and a fast CPU, but more on that below.
Switching to high quality settings drops performance another 20-25 percent relative to medium, unless you’re on an ultra-fast card like a 2080 Ti. This is probably as high as most people should go on current hardware, reserving ultra quality for the future. It’s not like the slight change in ultra quality reflections is really noticeable.
Minimums on the AMD GPUs also start to look a bit less consistent, and the newer Nvidia Turing and AMD Navi architectures offer some clear advantages. Notice how the GTX 1650 beats the 1060 3GB and comes relatively close to the 1060 6GB? The same goes for AMD’s RX 5700 series compared to the Vega and Radeon VII.
Hitting 60 fps gets a bit more difficult, with the RX 590 and GTX 1660 Ti getting there but, again, with relatively poor minimum fps. To get a steady 60 fps (for minimums as well as averages), you’re looking at the RX 5700 or RTX 2060 Super—the vanilla RTX 2060 falls just a hair short. These are also the last settings where I can test the 4GB cards, as ultra quality requires a bit too much VRAM. Actually, I can still do 1440p at high quality on 4GB cards, but first let’s look at 1080p ultra.
Ultra quality is simply too demanding for today’s graphics cards. The difference in visual fidelity is also pretty small—slightly better textures, lighting, shadows, etc. And the settings aren’t even fully maxed out in my tests, as there are several advanced options that can still be cranked up and drop performance another 15 percent.
Sure, the RTX 2080 Ti can still handle 1080p ultra at more than 60 fps, and a handful of other GPUs will average 60 fps as well, but minimums are going to be lower. Otherwise there’s not much to say here. If you want to try pushing one or two options to ultra, that’s fine. Just leave reflections and volumetrics quality at high or even medium, because you don’t really need them. The discernible difference between each level is minimal.
1440p at high settings is actually less demanding than 1080p at ultra, which makes it potentially viable for the high-end cards. The problem is maintaining 60 fps at 1440p, as usual. AMD’s minimum fps are generally worse than the top Nvidia cards, though the 6GB 2060 also looks pretty weak. Of the cards I’ve tested so far, only the 2070 Super and 2080 Ti keep minimums above 60 (which means the 2080 and 2080 Super should also suffice). Average fps still favors AMD on most matchups, however.
If you’re only looking to hit 30 fps, the GTX 1660 Ti and above should be fine, and maybe even the RX 590. The RX 570 4GB does average 40 fps, but the dips into the low 20s are definitely noticeable and something I wouldn’t recommend. Nvidia’s 1060 likewise feels very choppy—maybe 1440p medium would be okay, but high quality isn’t.
Finally, 4K at high quality is as far as I tried to push things. Ultra quality drops performance about 25-35 percent, depending on your GPU, which means nothing comes close to averaging 60 fps at maximum quality and 4K in RDR2. But 4K high still looks crisp and clean, and at least one GPU—the RTX 2080 Ti—can average more than 60 fps. But that average comes with a 43 fps minimum indicating plenty of dips, unfortunately.
I’m reminded of my early GTA5 testing, where I maxed out everything including the advanced settings. Back then, the GTX 980 Ti was the king of the graphics cards, but 4K and max quality on a single GTX 980 Ti simply wasn’t going to cut it. In fact, GTA5 at maxed settings (including 4xMSAA) plugged along at just 24 fps on the then-fastest GPU.
So if you’re looking at RDR2 and wondering how not even the RTX 2080 Ti can handle 4K at maximum quality, this isn’t really anything new. Of course, multi-GPU support was still more of a thing back in 2015, whereas SLI and CrossFire support is practically gone these days. Note that because RDR2 uses Vulkan or DirectX 12, support for multi-GPU has to be explicitly coded into the game, and that hasn’t happened (and probably never will).
Maybe a patch or driver update will improve things, but right now nobody is going to be playing RDR2 at 4K, maximum quality, and 60 fps.
After all the initial testing results, I also wanted to show some of my API testing numbers. The above gallery represents a lot of benchmarks, all to basically reach the conclusion that there’s no clear winner between the APIs. Some cards at some settings do better with Vulkan, others with DX12. But regardless of which API you choose, RDR2 has bugs and performance anomalies that need to be squashed.
Red Dead Redemption 2 CPU benchmarks
I’ve already mentioned stuttering on CPUs with lower core and thread counts, but it’s now time to show just how bad things can get. As usual, I’m testing with the fastest consumer graphics card currently available, the RTX 2080 Ti, in order to show as much of a difference between the CPUs as possible. I’ve also run the Core i3-8100 with and without the stuttering workaround (limiting RDR2.exe to 98 percent of the CPU).
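For reference, a related DIY variant of that workaround is pinning RDR2.exe to all but one logical core, which similarly stops the game from saturating every thread. This is a sketch of the mask arithmetic only, not Rockstar’s official fix, and the function name is my own:

```python
def all_but_one_core_mask(logical_cores: int) -> int:
    """Affinity bitmask covering every logical core except the last.

    Each bit in the mask enables one logical core; clearing the top bit
    leaves one thread free for the OS and background processes.
    """
    if logical_cores < 2:
        raise ValueError("need at least two logical cores")
    return (1 << (logical_cores - 1)) - 1

# An 8-thread CPU like the i3-8100's bigger siblings: use 7 of 8 threads.
print(hex(all_but_one_core_mask(8)))  # 0x7f
```

On Windows you’d apply the resulting mask with something like `(Get-Process RDR2).ProcessorAffinity = 0x7F` in an elevated PowerShell while the game is running. Rockstar’s patch should make this sort of fiddling unnecessary.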
I’ll be adding Ryzen CPU testing soon enough, and we’re expecting an impending patch that will hopefully fix a lot of the problems seen so far. Until then, here are the CPU testing results, with some CPUs tested in both DirectX 12 and Vulkan modes:
The Core i9-9900KS ends up at the top of the charts, which is no real surprise, followed by the overclocked i7-8700K. The stock 8700K isn’t much slower, though minimum framerates do tend to fluctuate a bit more. And then there’s a cliff drop to the i5-8400.
Much of the drop in performance from the 8700K to the 8400 comes from the lack of Hyper-Threading on the latter. I’ve tried several tweaks to improve the i5-8400 results, which are prone to stuttering at times, but I haven’t found a magic bullet yet. It’s weirdly inconsistent: at 1080p ultra it actually beat the overclocked 8700K in DX12 mode, but it’s not clear why it does well in certain combinations and poorly in others.
That stands in contrast to the i3-8100, where minimum fps is so bad using the default settings as to render RDR2 almost entirely unplayable. But the stuttering fix does wonders, relatively speaking, and while minimum fps is still well below 60, it’s better than the alternative of not playing at all. Maybe.
Keep in mind that the Core i3-8100 is going to perform very similarly to any of Intel’s previous-generation Core i5 parts. It has four cores and no Hyper-Threading, which doesn’t sit well with RDR2. A stuttering fix is pretty much required for anyone using this sort of CPU. Rockstar says an official one is on the way.
Red Dead Redemption 2 laptop benchmarks
What about laptops? Considering the problems with CPUs with fewer cores and threads, I was a bit worried about how RDR2 would perform on the GL63. It has a 4-core/8-thread Core i5-8300H mobile CPU, with lower clocks than you’ll typically see from the desktop i5-8400. The other two laptops have 6-core/12-thread Core i7-8750H processors, which should be less of a concern.
Turns out, my misgivings were mostly unwarranted. There were no massive stalls on the GL63, or any of the other laptops. On the other hand, the lower clockspeeds and fewer threads definitely won’t help performance.
The mobile RTX cards aren’t able to keep up with the desktop models. Part of that is because the mobile GPUs are clocked lower (especially the Max-Q variants), but the slower mobile CPUs are certainly a factor. Keeping minimum fps above 60 is going to be difficult on midrange gaming laptops, but if you go whole hog on something like the GE75 you should be okay.
The GL63 does fall a bit behind the GS75, but it’s difficult to say whether that’s the CPU or the GPU slowing it down. The 2060 and 2070 Max-Q usually perform about the same, but the latter also has 8GB VRAM, which can help.
All three of my test laptops from MSI are also equipped with 32GB of system RAM. That normally wouldn’t matter for games, but RDR2 seems to push beyond the level of hardware I consider sufficient, so I wanted to check. As a quick test, I slapped an additional 16GB of RAM into my desktop and retested the 2080 Ti. At least in my testing, the extra system RAM didn’t appear to be a factor, though it might help some during longer play sessions.
Parting thoughts
Thanks again to MSI for providing the hardware for our testing of Red Dead Redemption 2. To put things bluntly, this has been a bungled launch on PC. The cynics among us will point at the delayed Steam release as proof that Rockstar knew the PC launch of RDR2 was premature. Even if Rockstar didn’t know, it’s surprising to see big stability problems in a marquee PC game as good looking as RDR2.
Check back next month when the Steam release arrives, and I won’t be surprised if the stability and performance woes are a thing of the past. Also, it would be lovely if Rockstar just axed the first four scenes from the built-in benchmark; all they’re doing for me (and others) is doubling the amount of time it takes to run my tests.
For now, RDR2 generally needs a good graphics card for 1080p at high settings, but just as important is a CPU with sufficient cores and threads. I’m working to test AMD’s Ryzen parts as well, and early indications are that the 6-core/12-thread and higher models are going to do fine in RDR2. Old AMD parts like the FX series wouldn’t be my first pick, but then they never were—I haven’t had an FX PC around for testing since the first Ryzen CPUs shipped, and I don’t miss it at all.
Overall, AMD’s graphics cards do quite well in RDR2, all things considered. Minimum fps at higher settings drops off, but for 1080p medium or high, particularly on midrange GPUs, the Radeon models definitely hold up better than the previous gen GTX cards.
That’s how things stand right now, though I expect RDR2 will get patched in the coming weeks to improve performance and stability. That means these benchmarks may be more of a snapshot in time rather than something to look back on for months to come, but I’ll cross that bridge when I come to it.