You can enable either Nvidia’s DLSS or AMD’s FSR via the settings menu.
I’m glad they found a way, but at the same time - what the hell? Why is it OK for game devs of this magnitude to have a hardcoded hardware list? Look for feature support, not a string that is easy to manipulate outside your control!
The problem in this case is that they automatically trigger XeSS, which isn’t bad in itself (unless it can’t be deactivated, which sounds like the case here).
The GPU does support XeSS, but it crashes on Linux. If they just added a toggle/cmd flag to disable the feature, changing the vendorId wouldn’t be necessary.
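Just to illustrate the idea: here is a rough sketch in plain C++ of "user toggle plus runtime capability check instead of a hardcoded vendor-ID match". The UserSettings struct and TryCreateXeSSContext probe are made-up stand-ins, not the game's real code or the actual XeSS SDK entry points.

// A minimal sketch, not the game's actual code: gate XeSS behind a user
// toggle plus a runtime probe instead of a hardcoded vendor-ID match.
#include <cstdio>

// Hypothetical stand-ins. A real game would read the toggle from its settings
// menu or a command-line flag, and the probe would call whatever the XeSS SDK
// uses to create a context and report whether that succeeded.
struct UserSettings { bool xessEnabled = true; };

bool TryCreateXeSSContext()
{
    // Placeholder: return true only if the XeSS runtime actually initializes.
    return false;
}

bool ShouldUseXeSS(const UserSettings& s)
{
    // The anti-pattern being criticised above is roughly:
    //   if (adapterDesc.VendorId == 0x8086) enableXeSS();  // 0x8086 = Intel's PCI vendor ID
    if (!s.xessEnabled) return false;           // respect the user's toggle / "-noxess" style flag
    if (!TryCreateXeSSContext()) return false;  // probe failed, fall back to FSR or native rendering
    return true;
}

int main()
{
    UserSettings settings;  // imagine this came from the options screen
    std::printf("XeSS %s\n", ShouldUseXeSS(settings) ? "enabled" : "disabled");
}

The point is just that the decision hinges on a setting the user controls and on whether the feature actually initializes, not on a vendor string that can be spoofed.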
Could the game developers simply add this toggle for XeSS?
LinUX iS nOt A sUpPoRtEd PlaTfOrm
I am playing Baldur’s Gate 3 (can’t say enough great things about this game) and they have a toggle in the game settings for upscaling options. And DLSS runs great on my Linux PC. I thought I heard Larian say they are trying to get XeSS in too.
I imagine Larian care. Especially since they’re pushing Steam Deck support.
The reason this is a “supported platform” issue is that the developers of Hogwarts Legacy know their supported platforms support XeSS, so any work beyond “just turn it on” is additional work for no gain.
Could Intel fix the issue with XeSS on Linux?
Is the problem with Intel, Linux, or the game itself?
The game runs on Linux when XeSS is disabled, and everything else works fine. The only factor causing it to fail is XeSS.
If XeSS causes the issue in any game that activates it, then it is most likely an Intel issue.
I would bet money that Intel’s dev rel team worked closely with Avalanche to add XeSS support to sell more Intel GPUs.
Most likely the Hogwarts devs said, "sure, do whatever you want on your own hardware, just don’t you dare break anything on any other platform while we’re trying to ship." The easiest way to green-light this and know nothing else would be affected would be to hard-code everything behind Intel’s vendor IDs.
So this probably isn’t a case of Intel working around a game dev’s code; it’s probably a case of Intel working around its own code.
IIRC, with an Nvidia card DXVK will spoof an AMD card in a lot of games because otherwise the game will try to interact with the Windows Nvidia drivers which aren’t there.
You remember correctly. From the DXVK conf file:
# Report Nvidia GPUs as AMD GPUs by default. This is enabled by default
# to work around issues with NVAPI, but may cause issues in some games.
#
# Supported values: True, False
# dxgi.nvapiHack = True
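And if you want to override that for a specific game (or spoof a different vendor entirely, which is basically what people are doing to dodge the XeSS crash), DXVK also reads a dxvk.conf placed next to the game’s executable, or a file pointed to by DXVK_CONFIG_FILE. Something along these lines should work, though double-check the option names against your DXVK version:

# dxvk.conf next to the game exe (or wherever DXVK_CONFIG_FILE points)
# Report the real Nvidia GPU instead of pretending it's AMD:
dxgi.nvapiHack = False
# Or spoof an arbitrary PCI vendor ID (1002 = AMD) so a game doesn't
# auto-enable vendor-specific features:
# dxgi.customVendorId = 1002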
Funny story. I was trying to get ray tracing working under Wine for a few days and finally found the solution (I needed to download the nvlibs zip from GitHub and run the installer).
A couple of weeks later I went back into Wine and it was broken. After another three days of struggling, I decided to redownload nvlibs and run the installer, and that’s when I noticed it only symlinks the needed libraries into the WINEPREFIX. Me, being the resource miser I am, had removed the folder from ~/Downloads when I thought I was done with it …