Currently trying to refund the new Indiana Jones game because it’s unplayable without raytracing cri. My card isn’t even old, it’s just that 8GB of VRAM is apparently the absolute minimum, so my mobile 3060 is now useless. I miss when I could play new games back in 2014 on my shitty AMD card at 20fps; yeah, it didn’t look great, but developers still included a very low graphics option for people like me. Now you need to be upgrading every 2 years just to keep up.

  • peppersky [he/him, any]@hexbear.net

    The new Indiana Jones is actually pretty decently optimized; I run it at 1080p, all high/ultra settings, on my RTX 3060 12GB with DLAA enabled, at a mostly locked 60fps. It’s leagues better than any UE5 game, it’s just the hard VRAM requirements that suck.

    I feel like a lot of the issues with game graphics nowadays come down to GPU prices having been ridiculously inflated over the last decade because of crypto/AI. It’s not surprising that devs follow the newest trends and technologies when it comes to graphics, but the hardware demands of raytracing, global illumination, and the like are just too high for the GPU performance per dollar you can get in 2024. I just recently upgraded from an AMD RX 480 to a used Nvidia RTX 3060 12GB (which seemed to be the best bang for the buck; an RTX 4060 would have been much more expensive for not a lot more performance), and that upgrade gets you maybe double the performance in your games, from a GPU that is nearly five years newer (and no VRAM upgrade at all if you get the base model). These cards simply shouldn’t cost as much as they do. If you don’t have unlimited money to spend, you are going to have a much worse experience today compared to half a decade or a decade ago.