But most importantly, I have noticed that reviewers who revisit older video cards in modern gaming benchmarks against current-generation cards almost always use cards with pathetic stock memory and core clocks. That is especially unfair to any high-end card that offers "big numbers" of CUDA cores and massive memory bandwidth but ships with somewhat low, safe clocks.
The phrase "NVIDIA is gimping video cards" is nothing new. I am not sure when it all started, but certainly over the last three years this theory, that NVIDIA deliberately decreases the performance of older-generation video cards through new driver updates in favor of the new generation, has been talked about more than ever before. So what's the problem here? NVIDIA gimping drivers? Even though I keep that theory in mind, I think the real matter is that only the performance of current-architecture cards is improved with driver releases, while old-architecture cards are left as they were. Many people have also noticed how much faster AMD cards have become over the years: the GTX 780, which was on par with the R9 290 in 2013, is nowadays always behind it. The poor performance of the GTX 780 has been noted and tested by Techspot, Gamers Nexus, Tech Yes City, and others. You will notice that in some modern titles it barely equals a GTX 1050 Ti. Yet the GTX 780, old as it is, offers 2304 CUDA cores and massive memory bandwidth, so in theory it should still max out many modern titles at 1080p.

Why the GTX 780, and not the GTX 580 or GTX 980? For starters, the GTX 580, even the 3 GB model, is clearly inferior to the GTX 1050, so nobody should really care how it performs in modern titles: despite its massive memory bandwidth, it has only 512 CUDA cores, a minuscule number to offer anything potent today. The GTX 980, on the other hand, is not old enough to be a target of obvious gimping, as it always outperforms the GTX 1060 3 GB and either equals the GTX 1060 6 GB or falls just short of it. Not being convinced by the research results already out there on the internet, I decided to investigate this matter myself by testing the GeForce GTX 780, an insanely popular video card and the usual suspect in this particular gimping debate.

#DIFFERENCE BETWEEN THE VANISHING OF ETHAN CARTER AND REDUX DRIVER#

A user commented below that you can get the console by going to SteamLibrary\SteamApps\common\The Vanishing of Ethan Carter Redux\EthanCarter\Binaries\Win64\EthanCarter-Win64-Console.exe, then pressing Tab, which lists all the quality settings. From there, use hmd pd for pixel density, stat fps for the framerate, and hmd sp for screen clarity. The console appeared with some new settings, such as PixelDensity=1.000000; I changed it to 1.500000 and am seeing a pretty big increase in image quality. I had to set the Engine.ini to read-only because it kept defaulting back. I feel I can make out, in stereo, trees that are very far away that I couldn't before, though now I'm starting to skip frames a little. Wow, if you have the GPU power, open up the Engine.ini file under "app data", then EthanCarter, and add ScreenPercentage=200, then turn off antialiasing; you can see pretty damn far in stereo.
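For reference, the tweaks described above would land in Engine.ini roughly like this. This is only a sketch: the exact file location and the `[SystemSettings]` section name are assumptions based on typical Unreal Engine config layout, so check where your install actually keeps Engine.ini before editing.

```ini
; Engine.ini (location varies; often under the user's app data folder,
; then EthanCarter - treat the exact path as an assumption)
[SystemSettings]
; Supersample: render at 2x resolution and downsample. Very GPU-heavy,
; so pair it with antialiasing turned off as suggested above.
ScreenPercentage=200
; VR pixel density: 1.0 is native; 1.5 sharpens distant detail
; at the cost of framerate.
PixelDensity=1.500000
```

After saving, marking the file read-only (right-click > Properties > Read-only, or `attrib +r Engine.ini` from a command prompt) keeps the game from resetting the values, as noted above.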