• 0 Posts
  • 144 Comments
Joined 1 year ago
Cake day: June 18th, 2023

  • I basically have one primary criterion for choosing operating systems: I want the one that gets in my way the least while I do the things I want to do (whether that’s something productive or entertainment). I don’t care that I’m using Linux; it just happens that Linux (or rather, a particular Linux distro) is currently better at getting out of my way than Windows (or macOS, or any other OS).

    I’ve been evaluating Linux on my desktop maybe once a year, and until recently Windows always won in terms of getting out of my way. I was using Windows 10 LTSC IoT before (because guess what: it got in my way less than regular Windows 10/11), and it was honestly pretty good, but what finally tipped the scales for me was Microsoft letting an update add unwanted entries to my start menu and re-enable the stupid search field in the taskbar.

    So I re-evaluated different Linux distributions last year and eventually landed on Fedora. Together with swapping my Nvidia RTX 3080 for a Radeon RX 7800 XT for better Linux compatibility (especially with Wayland), and with Valve’s Proton getting better and better, that led to me using a Linux distro full-time on my desktop starting January 1st, 2024.

    I stuck with Fedora for a few months and then landed on openSUSE Tumbleweed (after some annoyances with SELinux and a few other things, iirc, around the Fedora 40 update). Tumbleweed, or rather the fact that it’s bleeding edge, has had its fair share of issues in the last few days (with some big releases like Mesa 24.1, Plasma 6.1 and a few other packages being relatively buggy). This made me think about switching to a more stable distro like Debian or openSUSE Leap (I know there’s also Slowroll, but some of Tumbleweed’s issues roll over into it as well), but then again I pretty much always have fairly recent hardware in my PC, which usually demands a somewhat recent kernel and other packages.

    If I find tomorrow that Windows gets in my way less than what I’m currently using, I’ll consider switching to Windows. Or macOS. Or Debian. Or FreeBSD. Etc.







  • They don’t owe you anything in the sense that you don’t have to purchase their product, that is correct.

    Also yeah, the idea of cheating didn’t even come to my mind. We used to do that a lot back in the day :D - but to be fair, trainers aside, games often actively supported cheats out-of-the-box, and I don’t think From Software’s games do. It’s probably still trivial to cheat on the PC version, but on console, it might not be feasible.

    I totally get the feeling of accomplishment that comes with playing games on high difficulties, and I do play quite a few games that way, but then again I also enjoy lower levels of challenge at times.

    It’s still a very valid complaint that difficulty levels aren’t a thing. Adding them wouldn’t change anything for anyone who enjoys the current default difficulty, and it might make the game more enjoyable for other players.


  • Elden Ring sold more than 20 million copies; that’s quite a big “niche” if you ask me.

    I’m not sure how lowering the health pool or damage per hit of bosses (a very trivial example of how difficulty could easily be reduced; there’s a rough sketch of what I mean at the end of this comment) affects the story of this game. And even if it made the game less authentic to some players, they could just play it at the default difficulty…?

    There is just absolutely no reason (other than maybe ego problems, but then just add an achievement for each difficulty level) why more difficulty options would make the game worse for players who enjoy the current difficulty setting, as they can simply stick to the default. Those players would have the exact same experience they have now, and others who struggled with, or just didn’t enjoy, the grind of the default difficulty could turn it down a notch and enjoy the game.
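    To make the “trivial” part concrete: this is a minimal sketch with completely made-up numbers and names (not anything FromSoftware actually does), showing how a difficulty setting could be little more than a pair of multipliers applied when a boss spawns, with the default preset leaving everything exactly as it is today:

    ```cpp
    #include <cstdio>

    // Hypothetical difficulty presets; all numbers are invented for illustration.
    struct DifficultyPreset {
        const char* name;
        float bossHealthScale;   // multiplier on the boss health pool
        float bossDamageScale;   // multiplier on damage taken per boss hit
    };

    constexpr DifficultyPreset kPresets[] = {
        {"story",   0.6f, 0.5f},
        {"default", 1.0f, 1.0f},   // identical to the game as shipped today
        {"hard",    1.4f, 1.5f},
    };

    struct Boss {
        float maxHealth;
        float hitDamage;
    };

    // Applied once when a boss is spawned; players on "default" see no change at all.
    Boss SpawnBoss(float baseHealth, float baseDamage, const DifficultyPreset& d) {
        return Boss{baseHealth * d.bossHealthScale, baseDamage * d.bossDamageScale};
    }

    int main() {
        for (const auto& preset : kPresets) {
            Boss b = SpawnBoss(4000.0f, 250.0f, preset);
            std::printf("%-8s health=%.0f damage=%.0f\n", preset.name, b.maxHealth, b.hitDamage);
        }
    }
    ```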





  • Porting games to a different architecture is normally quite a bit more involved than just recompiling them, especially when architecture-agnostic code wasn’t a design goal of the original game code (there’s a rough sketch of the kind of thing I mean at the end of this comment). No, Valve couldn’t release all their games running natively on ARM tomorrow; the process would take more time.

    But even if Valve were to recompile all their games for ARM, many other studios wouldn’t follow suit just because a few gaming handhelds would benefit from it. The market share of these devices wouldn’t be big enough to justify the cost. Very few of the games that run on the Steam Deck are actually native Linux versions; studios just rarely bother porting their games over.

    I’m not saying ARM chips can’t be faster or otherwise better (more efficient) at running games, but it just doesn’t make sense to release an ARM-based handheld intended for “PC” gaming in the current landscape of games.

    Apple can force an architecture transition comparatively easily because they control the software and the hardware. If Apple decided tomorrow to only sell RISC-V based Macs and abandon ARM, developers for the platform would have to release RISC-V builds of their software, because as current Macs got replaced over time, at some point nobody would be able to run their software natively anymore. Valve does not control the full hardware and software stack of the PC market, so they’d have a very hard time trying to force such a move. If Valve released an ARM-based gaming handheld, other manufacturers would keep offering x86-based handhelds with newer and newer CPUs (new x86 hardware will keep being developed for the foreseeable future), and instead of Valve forcing developers to port their games to native ARM, they’d probably just lose market share to those other handhelds, because people naturally buy the device that runs current games best right now.

    In a “perfect world” where all games already supported ARM natively, an ARM-based handheld for PC gaming could obviously work. That simply isn’t the world we live in right now, though. Sure, we could ramble on about “if this and that”, but that’s just not the reality.
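    As a rough illustration of why “just recompile it” often isn’t enough (a toy sketch, not code from any actual game): a lot of performance-sensitive game code reaches straight for x86 SIMD intrinsics, and those headers and intrinsics simply don’t exist for ARM compilers, so every hot path like this needs a hand-written NEON (or portable) equivalent before an ARM build even compiles, let alone performs well:

    ```cpp
    #include <cstdio>
    #include <cstddef>

    #if defined(__x86_64__) || defined(_M_X64)
    #include <immintrin.h>   // x86 SSE intrinsics; there is no ARM version of this header

    // x86-only hot path (count assumed to be a multiple of 4 for brevity).
    // An ARM compiler rejects every intrinsic in here, so "just recompile" fails.
    void AddVec(const float* a, const float* b, float* out, size_t count) {
        for (size_t i = 0; i < count; i += 4)
            _mm_storeu_ps(out + i, _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    }

    #elif defined(__aarch64__)
    #include <arm_neon.h>    // the NEON equivalent has to be written and tested separately

    void AddVec(const float* a, const float* b, float* out, size_t count) {
        for (size_t i = 0; i < count; i += 4)
            vst1q_f32(out + i, vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
    }

    #else
    // Portable fallback: correct everywhere, but not what hand-tuned game code ships with.
    void AddVec(const float* a, const float* b, float* out, size_t count) {
        for (size_t i = 0; i < count; ++i) out[i] = a[i] + b[i];
    }
    #endif

    int main() {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
        AddVec(a, b, out, 4);
        std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    }
    ```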


  • As you said yourself, it’s not the same thing. Proton can occasionally beat Windows because Vulkan might be more efficient at certain things than DirectX (the same goes for other APIs being translated into other API calls). All of this happens at a much higher level of abstraction than CPU instruction sets (rough sketch at the end of this comment).

    If Qualcomm actually managed to somehow accurately (!) run x86 code faster on their ARM hardware compared to native x86 CPUs on the same process node and around the same release date, it would mean they are insanely far ahead (or, depending on how you look at it, Intel/AMD insanely far behind).

    And as I said, any efficiency gains at idle won’t matter for gaming scenarios, as neither the CPU nor the GPU is ever idle during gameplay.

    With all that being said: I think Qualcomm did a great job, and ARM on laptops (outside of Apple) might finally be here to stay. But these chips won’t replace x86 laptops anytime soon, and it’ll take even longer to make a dent in the PC gaming market, where DIY suddenly becomes very relevant. So I don’t think (“PC”) gaming handhelds should move to ARM anytime soon.
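    To illustrate the difference in abstraction level (toy types, not how Proton’s DirectX-to-Vulkan layer is actually implemented): API translation maps one well-defined, high-level call onto a roughly equivalent call in another API, which leaves the translation layer a lot of freedom to be clever about batching and caching, whereas an instruction-level translator has to faithfully reproduce every single machine instruction:

    ```cpp
    #include <cstdint>
    #include <cstdio>

    // Toy stand-ins, not real D3D11/Vulkan types; the point is the shape of the mapping.
    struct D3D11DrawInstancedArgs {          // what a game asks a D3D11-style API for
        uint32_t vertexCountPerInstance;
        uint32_t instanceCount;
        uint32_t startVertexLocation;
        uint32_t startInstanceLocation;
    };

    struct VulkanDrawArgs {                  // what the translator hands to a Vulkan-style API
        uint32_t vertexCount;
        uint32_t instanceCount;
        uint32_t firstVertex;
        uint32_t firstInstance;
    };

    // One high-level call maps onto one high-level call; everything around it (batching,
    // state caching, reordering) is up to the translation layer, which is where the
    // occasional wins over Windows can come from.
    VulkanDrawArgs TranslateDraw(const D3D11DrawInstancedArgs& d3d) {
        return {d3d.vertexCountPerInstance, d3d.instanceCount,
                d3d.startVertexLocation, d3d.startInstanceLocation};
    }

    int main() {
        VulkanDrawArgs v = TranslateDraw({36, 1, 0, 0});   // e.g. one cube, one instance
        std::printf("draw(%u, %u, %u, %u)\n",
                    v.vertexCount, v.instanceCount, v.firstVertex, v.firstInstance);
    }
    ```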




  • If both AMD/Intel and Qualcomm do a good job with their core design and the same process node is used, I don’t see how a translation layer can be any faster than a CPU natively supporting the architecture. Any efficiency advantages ARM supposedly has over x86 architecturally will vanish in such a scenario.

    I actually think the efficiency of these new Snapdragon chips is a bit overhyped, especially under sustained load scenarios (like gaming). Efficiency cores won’t do much for gaming, and their iGPU doesn’t seem like anything special.

    We need a lot more testing with proper test setups. Currently, reviewers mostly test these chips against other chips in completely different devices, with different thermal solutions and at different levels of power draw (TDP won’t help you much, as it basically never matches actual power draw). Keep in mind the Snapdragon X Elite can be configured for up to “80W TDP”.

    Burst performance from a Cinebench run doesn’t tell the whole story, and comparing battery runtimes while watching YouTube videos on supposedly similar laptops doesn’t even come close to representing battery life in a gaming scenario.

    Give it a few years/generations and then maybe, but currently I’m pretty sure the 7840U comfortably stomps the X Elite in gaming scenarios with both configured to a similar level of actual power draw. And the 7840U/8840U is AMD’s outgoing generation; their new (horribly named) chips should improve performance per watt by quite a bit.