this post was submitted on 26 Oct 2023
741 points (95.7% liked)
Linux Gaming
15516 readers
76 users here now
Discussions and news about gaming on the GNU/Linux family of operating systems (including the Steam Deck). Potentially a $HOME
away from home for disgruntled /r/linux_gaming denizens of the redditarian demesne.
This page can be subscribed to via RSS.
Original /r/linux_gaming pengwing by uoou.
founded 2 years ago
you are viewing a single comment's thread
"works fine" is very different from "is equivalently optimized."
Valve has done a lot of work to get games running well on the Steam Deck, and that likely carries over to other AMD GPUs. So it makes total sense that Valve would optimize the Proton translation layer's DirectX calls differently for the AMD driver than for the NVIDIA driver (or rather, in a way the AMD driver handles better). A big issue in GPU optimization is keeping the GPU busy, so perhaps the AMD driver, combined with Valve's patches to the DirectX-to-Vulkan layer, improves utilization. That could translate to a modest performance improvement even in well-optimized games (perhaps 5-10%, probably not more than 20%).
I don't know if that's what's going on here, but it's a plausible explanation.
I can see why you'd think that, but what you fail to understand is that Valve is not the only one working on Proton, and Valve didn't even write DXVK. Those are free and open-source efforts, and Valve even pays external devs to contribute to that software. I'm telling you that DXVK itself is not going to boost graphical performance because it literally cannot: it adds extra instructions that your GPU has to perform in order to send out frames.
DirectX-to-Vulkan translation is exactly that: translation. It receives DirectX calls and translates them to Vulkan. For one, it has overhead; two, if the game is optimized, it is already running at max performance on Windows, and using DXVK will slow GPU time down because the GPU has to perform more calculations. No scheduler will save you from that, not even the Linux one, because this isn't something the scheduler handles.
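To make that overhead point concrete, here's a toy sketch in Python (not real DXVK code, and purely illustrative): every DirectX-style call the layer receives becomes one or more Vulkan-style commands, so the translation step is always extra work on top of what a native backend would do.

```python
# Toy sketch (NOT real DXVK): each "DirectX-style" draw call is
# translated into several "Vulkan-style" commands, so the layer
# always does strictly more work per call than a native backend.

class VulkanBackend:
    """Stand-in for a Vulkan driver: just counts commands issued."""
    def __init__(self):
        self.commands = []

    def cmd(self, name, *args):
        self.commands.append((name, args))

class D3DToVulkanShim:
    """Stand-in for a DXVK-like layer: one D3D-style call -> several commands."""
    def __init__(self, backend):
        self.backend = backend

    def draw_indexed(self, index_count, pipeline_state):
        # A single DirectX-style DrawIndexed has no one-to-one Vulkan
        # equivalent: state must be resolved into a pipeline bind plus a draw.
        self.backend.cmd("bind_pipeline", pipeline_state)
        self.backend.cmd("draw_indexed", index_count)

backend = VulkanBackend()
shim = D3DToVulkanShim(backend)
for frame in range(3):
    shim.draw_indexed(index_count=36, pipeline_state="opaque")

# 3 D3D-style calls became 6 backend commands: the translation itself
# is pure overhead, which per-driver tuning tries to minimize.
print(len(backend.commands))  # 6
```

How cheap that extra work is in practice is exactly where driver-specific tuning can matter.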
Two things:
DirectX -> Vulkan isn't a direct translation since the APIs aren't 1:1, so there's going to be some tuning in how APIs are mapped, and the tuning can differ depending on the GPU driver you're using.
It's the same with processors: you can optimize a compiler to work better on AMD vs Intel or vice versa (look at Intel C++ Compiler benchmarks for an example of that), even though they use the exact same instruction set, because the microarchitectures differ. The way the instruction stream maps onto the microarchitecture can impact performance significantly (something like 10% is possible, depending on the benchmark).
GPU drivers are complicated, and there are a lot of areas where the interaction between the driver, software, and system services can be optimized. AMD's drivers are open source, which helps with those optimization efforts. Then you throw in a big, well-funded, and motivated company like Valve funding development (both through salaries and donations) and you end up with AMD GPUs getting extra attention for things like DXVK.
So I would expect AMD on Linux to perform better vs NVIDIA on Linux when compared to AMD vs NVIDIA on Windows. As in, the performance difference on Linux vs Windows would be more favorable for AMD cards than NVIDIA ones because AMD on Linux gets more attention than NVIDIA on Linux. I don't expect the same for compute, since NVIDIA invests heavily in that space on Linux, so it's not an inherent advantage of the platform (e.g. the scheduler discussion), but a question of where optimization efforts are focused.
Alright look, I'm not going to argue about who said what because we both know what we said and it is unrelated to the topic at hand.
The reason the Windows AMD driver is bad is not performance; it is the very same reason the proprietary driver is bad on Linux: horrible reliability.
There are circumstances where they trade blows and circumstances where they perform similarly. If you really want to compare the two based on the OS alone, you need to compare equivalent drivers, which means the proprietary one.
We're already not doing an apples to apples comparison here because we're comparing WINE+DXVK vs DirectX. Comparing the OS itself isn't that interesting, at least from an end-user perspective, what is interesting is comparing the typical user experience on both platforms. As in, no tinkering with stuff, just installing in the most obvious way.
Valve is optimizing for that typical user experience on their Steam Deck, and that translates to the desktop fairly well. They're not really doing the same on Windows, so it's interesting to compare devs+manufacturers optimizing stuff on Windows vs the community+Valve optimizing stuff on Linux.
Why wouldn't comparing the OS itself be interesting? It is literally the foundation of everything you see on screen.
You also can't just compare WINE+DXVK to DirectX, because you can actually use DXVK on Windows too. If the video were titled "DirectX vs DXVK", that would be totally fair, but it's not; it's called "Windows vs Linux". I'm simply saying that the vast majority of games are not going to see a 17% increase in GPU performance; your biggest boost is going to come from CPU-bound games, because that is the truth.
The only time you'll see a game's GPU performance come out ahead on Linux is when the game has a native version, and even then only if that version is actively developed; many native ports are not, and are even a few versions behind.
Because regular users aren't going to be changing drivers based on the game, or doing a ton of system-level configuration to get a bit better performance.
So it should be defaults vs defaults.
If we want to compare OSes, we should do targeted benchmarks (Phoronix does a ton of those). There are far more interesting ways to compare schedulers than running games, and the same is true for disk performance, GPU overhead, etc.
How many people actually do that though? I'm guessing not many.
"Windows vs Linux" is comparing the default experiences on both systems, and that's interesting for people who are unlikely to change the defaults (i.e. most people).
That's just not true, as evidenced by this video. If you take the typical setup on Windows vs the typical setup on Linux, it seems you get a 17% average performance uplift on Linux on these games.
That doesn't mean Linux is 17% faster than Windows, nor does it mean you should expect games to run 17% better on Linux, it just means Linux is competitive and sometimes faster with the default configuration. And that's interesting.
Linux does not have a default configuration; that's why we have over 600 distros. If you want a baseline "default configuration", Fedora would be the way to go, which he did not use.
Yes, he got a performance uplift of 17% on average in these games; the point he is trying to make is that you can get this in every game on Linux, which is not true.
Most of those games are also CPU bound, an area where Linux is going to destroy Windows. Once again, I am referring to GPU performance specifically, as that is the general point OP makes with these posts.
Sure, but each distro has a default configuration, and distros don't vary that much in terms of performance with those default configurations for playing games. If there is a consistent performance difference, it'll likely be something like 1-2%, which should be within run-to-run variance and not really impact the results.
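For what it's worth, the "within run-to-run variance" point can be sketched with a toy calculation; the FPS numbers below are invented for illustration, not real benchmark data:

```python
# Sketch: with typical run-to-run noise, a ~1% distro-to-distro
# difference is hard to distinguish from variance. Numbers are made up.
import statistics

distro_a = [118, 121, 120, 119, 122]  # five runs of the same game
distro_b = [120, 119, 123, 121, 122]

mean_a = statistics.mean(distro_a)
mean_b = statistics.mean(distro_b)
diff_pct = 100 * (mean_b - mean_a) / mean_a

# Spread across all runs, expressed as a percent of the overall mean.
all_runs = distro_a + distro_b
spread_pct = 100 * statistics.stdev(all_runs) / statistics.mean(all_runs)

print(f"difference: {diff_pct:.1f}%, run-to-run spread: ~{spread_pct:.1f}%")
```

With these invented numbers the measured gap (~0.8%) is smaller than the run-to-run spread (~1.3%), so you couldn't call either distro faster from this data.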
And anyone who assumes that an average across 10 games represents the difference they'll see on average in their own games doesn't understand statistics: 10 games is not enough to be a representative sample, especially since they weren't even randomly selected to begin with. It's still an interesting result.
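The small-sample problem is easy to quantify. Here's a sketch using made-up per-game uplift numbers, deliberately chosen to average 17% like the video's headline figure (these are not the video's actual data):

```python
# Hypothetical per-game uplifts in percent; illustrative numbers only,
# chosen to average 17% like the headline figure. With n = 10 and this
# much game-to-game scatter, the confidence interval on the mean is wide.
import math
import statistics

uplifts = [5, 30, -3, 22, 11, 40, 2, 18, 25, 20]  # n = 10 games
n = len(uplifts)
mean = statistics.mean(uplifts)
sd = statistics.stdev(uplifts)  # sample standard deviation
t_95 = 2.262                    # t critical value for 95% CI, df = 9
margin = t_95 * sd / math.sqrt(n)

print(f"mean uplift: {mean:.1f}%, 95% CI: +/-{margin:.1f}%")
```

With this scatter the 95% confidence interval is roughly 17% +/- 9.5%, before even considering that the games weren't randomly sampled.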
You're being hyperbolic here.
The differences, all else being equal, should be pretty small most of the time unless there's a hardware driver issue (e.g. when Intel's new p-core vs e-core split came out, Windows had much better support).
If we're seeing a huge difference, more is going on than just a "better" scheduler or more efficient kernel or whatever. It's much more likely Windows is using DirectX and Linux is using DXVK or something. The bigger the gap, the less likely it's the kernel that's doing it.
As someone who has used Linux exclusively for ~15 years, these kinds of benchmarks are certainly exciting. However, we need to be careful to not read too much into them.
That may be true, but the de facto default today is Proton Experimental on Steam with a recent Linux kernel. That's pretty much the same across all distros.
Yup, the difference between Ubuntu, Fedora, and Arch or whatever isn't going to be all that big, assuming you're using each distribution's default kernel and running with Steam's provided runtime. You might get 1-2% here and there, but that's pretty much within run-to-run variance anyway.
Those aren't all the factors that play a role in game performance.
For instance, what fork of the kernel are they using? Are they using zram? What graphics driver are they using? Gamescope? Gamemode? All of those things affect performance of a game to varying degrees.
Also, Proton Experimental is definitely not the default on any system; that would be Proton 8.