Frame generation is fucking huge.
Especially since it works best at high frame rates. Like, if you were playing at 30fps, doubling to 60 might show a perceptible difference, because of how long the gap between real frames is.
But going from 60 to 120, it's still 50% "fake" frames, yet the time between "real" frames is much smaller, allowing more frequent corrections to what the "fake" frames are predicting.
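The timing math above is easy to sketch. This is just illustrative arithmetic (the function name is made up, not any real frame-gen API): it shows how long an interpolated frame has to "hold" before the next real frame corrects it, assuming 2x frame generation.

```python
# Hypothetical illustration: how often the interpolator gets a real
# (rendered) frame to correct against, at different base frame rates.
def real_frame_interval_ms(base_fps: float) -> float:
    """Time between real frames, i.e. how long the generated
    frames must stay plausible before the next correction."""
    return 1000.0 / base_fps

for base in (30, 60):
    output = base * 2  # 2x frame generation doubles the displayed rate
    print(f"{base} fps base -> {output} fps shown, "
          f"correction every {real_frame_interval_ms(base):.1f} ms")
```

At a 30fps base the fake frames have to survive ~33 ms between corrections; at 60fps it's only ~17 ms, which is why artifacts are harder to spot at higher base rates.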
So while it won't help a bad computer run anything, it can help a "mid" computer make what it can run look a lot better, because you can crank up a bunch of options and still maintain the fps you were getting without it.
I play Cyberpunk 2077 at 130 fps on Overdrive at 1440p and it's awesome and smooth with an RTX 4070. I play on a Hisense 55" TV at 11 feet on my sofa and I don't notice any artifacts. It's just way better than classic SSAO and the other screen-space "hacks" raster uses to approximate global illumination.