

Far Cry 6 uses an entirely new DirectX 12 based renderer (presumably switched to so they could implement DXR ray tracing, which isn't possible with DirectX 11), and it was primarily developed by Ubisoft Toronto. Far Cry 3, Far Cry 4, Far Cry Primal, Far Cry 5, and Far Cry New Dawn all used the same DirectX 11 based renderer (improved over time in terms of features and capabilities, of course) and were all primarily developed by Ubisoft Montreal.

One thing we do know for sure is that the older DirectX 11 renderer simply did not handle VRAM allocation the same way the new DirectX 12 one does, so in all likelihood Far Cry 6 would have had zero texture streaming problems (with or without the HD Texture Pack, regardless of the amount of available VRAM) had they kept using it. We'll never know for sure, of course. It's also possible Ubisoft Montreal could have made the switch to DirectX 12 without introducing the issues we're seeing; they were the ones who handled the DirectX 12 transition for the Assassin's Creed series (going from Odyssey to Valhalla) and managed not to introduce any major new bugs in the process. I really do think this is a combination of "fairly new, unoptimized renderer" and "studio that has relatively little experience with the series as a whole."
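For anyone wondering what "did not handle VRAM allocation the same way" means in practice: under DirectX 11 the driver decides what stays resident in VRAM and pages things in and out largely behind the game's back, while under DirectX 12 the game itself is expected to query its memory budget and keep its texture streaming under that number. Here's a rough sketch (my own illustration, not anything from Far Cry's actual code) of the DXGI budget query a D3D12 title would typically build its streaming decisions around; picking adapter 0 and the plain printf output are just placeholders.

```cpp
// Minimal sketch: querying the VRAM budget a D3D12-era texture streamer has to live under.
// Under D3D11 the driver largely managed residency for you; under D3D12 the app tracks this itself.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    // First adapter only, for illustration; a real engine enumerates and picks the GPU it renders on.
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    // The OS-provided budget can shrink at runtime (other apps, display changes, etc.),
    // so a D3D12 streamer is expected to re-check it and evict or downgrade mips on its own.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return 1;

    std::printf("VRAM budget: %llu MB, currently used: %llu MB\n",
                info.Budget / (1024 * 1024),
                info.CurrentUsage / (1024 * 1024));
    return 0;
}
```

If the streamer's bookkeeping or eviction logic is off under that model, you get exactly the kind of never-loading or constantly-degrading textures people are reporting, and no amount of extra VRAM necessarily fixes it.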

Honestly, I disagree about it being some kind of AMD-related marketing ploy. Once again, Far Cry 5 was also an AMD-sponsored title, and it didn't have these problems. I think what we're seeing is just a classic case of sheer technical incompetence.
