
  • X2 “Elite Extreme”, probably measured under ideal conditions, vs. the base M4 chip in a real-world device. Sure, nice single-core results, but Apple will likely counter with the M5 (the A19 Pro already reaches around 4,000, and the M chips can probably clock a bit higher). And the M4 Pro and Max already score as high or higher in multi-core, in the real world, in a 14-inch laptop.

    It doesn’t “crush” the M4 series at all and we’ll see how it’ll perform in a comparable power/thermal envelope.

    I don’t hate what Qualcomm is doing here, but these chips only work properly under Windows, the Windows app ecosystem still hasn’t embraced ARM all that much, and from what I’ve heard, Windows’ x64-to-ARM translation layer isn’t as good as Rosetta 2. Linux support is pretty horrible, especially at launch.


  • In practice, passkeys use system authentication if you use the implementation bundled with iOS/Android - and sure, that can be Face ID if it’s set up, or another form of biometric authentication. Both operating systems also have APIs that allow password managers to provide their own implementation of passkeys, and in that case you have to authenticate with your password manager - most of them support system authentication (biometrics) as well, but this could also be a master password or a hardware key (which works very similarly to a passkey, by the way).
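
    For a rough idea of what that looks like from a website’s point of view, here’s a minimal browser-side WebAuthn sketch (the native iOS/Android credential-provider APIs follow the same pattern under different names; example.com and the surrounding setup are placeholders, not anything from the article):

    ```typescript
    // Minimal sketch of a site requesting a passkey sign-in via WebAuthn.
    // The browser/OS hands this off to whichever credential provider holds the
    // passkey - the system keychain or a third-party password manager - and that
    // provider decides how to verify the user (Face ID, fingerprint, master password, ...).
    async function signInWithPasskey(): Promise<PublicKeyCredential | null> {
      const assertion = await navigator.credentials.get({
        publicKey: {
          // In a real flow the challenge comes from the server, not the client.
          challenge: crypto.getRandomValues(new Uint8Array(32)),
          rpId: "example.com",            // placeholder relying-party ID
          userVerification: "required",   // "verify the user somehow" - the provider picks how
          allowCredentials: [],           // empty = let the user pick any passkey for this site
        },
      });
      return assertion as PublicKeyCredential | null;
    }
    ```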

    I’d argue if you don’t assume that whatever system you’re using is reasonably secure/private, you probably shouldn’t enter any passwords on that system either. This isn’t a passkeys vs. passwords problem.



  • Yeah, but the old display supports VRR via VESA Adaptive-Sync. Nvidia supports that as well, though I’m not sure whether their mobile GPUs do for built-in displays.

    If it is supported, I don’t see any advantage of G-Sync over standard VRR.

    If it isn’t, that’s a shame. It’s pretty wasteful to have to buy the same display with different firmware just to get adaptive sync working.



  • A Google search preview from the official Barcelona Asus store “Asus by MacMan” has accidentally revealed what many feared: the ROG Xbox Ally X will retail for €899, while the standard model sits at €599.

    Not sure how this would translate to USD 1,000, to be honest. Prices in euros usually include taxes, which is what, 21% in Spain? So minus taxes, the €599 model would translate to about $550 (taxes not included).
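
    Back-of-envelope version of that, assuming an exchange rate of roughly 1.10 USD/EUR (both the VAT rate and the exchange rate are approximations):

    ```typescript
    // Rough conversion of the rumored €599 price to a pre-tax USD figure.
    const priceEurInclVat = 599;   // rumored EU price, VAT included
    const vatRate = 0.21;          // Spanish IVA (other EU countries differ slightly)
    const eurToUsd = 1.10;         // assumed exchange rate, changes daily

    const priceEurExVat = priceEurInclVat / (1 + vatRate);  // ≈ €495
    const priceUsdExTax = priceEurExVat * eurToUsd;         // ≈ $545, i.e. roughly $550
    console.log(`~€${priceEurExVat.toFixed(0)} ex VAT ≈ $${priceUsdExTax.toFixed(0)}`);
    ```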

    This still doesn’t undercut the Steam Deck, which I feel it should, considering it’s likely using the same APU and the Deck is a couple of years old at this point. But it’s not as bad as the headline/article makes it sound.


  • There’s no official pricing and no reliable sources out there, so I’m going by rumors.

    With your calculation, you have to keep in mind that the Switch 2 cards have to roughly match microSD Express speeds, so a more accurate comparison would be these, but they aren’t available in 64 GB sizes.

    All I’ve heard is that they’re expensive, and with the larger sizes often required for Switch 2 games, it’s an even bigger problem than with the Switch (1). These key cards exist for a reason. And I’d bet Nintendo takes a margin on these instead of only requiring the publisher to cover the manufacturing costs.



  • For someone owning both devices and actually trying to decide which version to get: both are decent in portable mode, with the Switch 2 taking the lead in docked mode (as the Deck doesn’t increase its power limits in docked mode at all). So I’d probably get the Switch 2 version if I didn’t have a desktop PC to go with my Deck, but I do, so my “docked” experience (playing on my PC) is vastly superior anyway, with the Deck handling the portable part.

    As a technical comparison, though, it’s somewhat inaccurate, I think. Yes, it’s certainly impressive that the Switch 2 can run this game in portable mode, likely consuming less than 10 watts for the entire system, while producing okay graphics. And it’s clear that DLSS does a lot of heavy lifting here, but:

    • The 8.9 watts figure is likely somewhat inaccurate because it’s based on approximate battery life while playing the game. Even if the game is played from 100% to 0%, there are still inaccuracies because the specific battery likely won’t hold exactly 19.3 Wh. Instead it’ll likely be a bit higher than that when brand new, and a bit lower after hundreds of cycles (see the quick sketch after this list).
    • The Switch 2 clearly consumes less power than the Deck needs to achieve “playable” framerates in Cyberpunk 2077, but that doesn’t tell us much about the efficiency of just the SoC. I’d assume the Deck requires a bit more juice for its OLED screen and also more for the rest of the system, for example the standard NVMe drive it uses. The “approximately 9 watts consumption” comparison they’re doing makes it look like the Switch 2 is around 3 times as efficient, but that’s not how efficiency curves work. You’re comparing the Deck at a power level that’s probably the sweet spot of the Switch 2’s efficiency curve but well below the Deck’s own.
    • Game settings are (currently?) impossible to match. Some can be matched, while others sit somewhere in between presets on the Switch 2, or are even “lower than low”, for example some models/geometry. I assume these changes have a large enough performance impact that CDPR thought they were worth implementing just for the Switch 2.
    • Scene-specific pixel counting wasn’t really done, so it’s not possible to say which device renders more “real” pixels (even though DLSS certainly seems to make the most out of these pixels).
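
    To illustrate the first point, here’s a quick sketch of how much the derived wattage moves when the real battery capacity deviates from the 19.3 Wh rating (the capacity values are illustrative assumptions, not measurements):

    ```typescript
    // The ~8.9 W figure is (rated capacity) / (observed runtime). If the actual
    // capacity differs from the rating, the same runtime implies a different draw.
    const ratedCapacityWh = 19.3;
    const reportedDrawW = 8.9;
    const observedRuntimeH = ratedCapacityWh / reportedDrawW;  // ≈ 2.17 h implied by that figure

    // Illustrative capacities: a bit above the rating when new, below after many cycles.
    for (const actualCapacityWh of [20.0, 19.3, 17.5]) {
      const avgDrawW = actualCapacityWh / observedRuntimeH;
      console.log(`${actualCapacityWh.toFixed(1)} Wh -> ~${avgDrawW.toFixed(1)} W average draw`);
    }
    ```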

    I still think the Switch 2 is very impressive in terms of performance in portable mode, certainly more than I expected when hearing about the rumored Ampere architecture and the Samsung manufacturing process.

    It also shows that something comparable to DLSS (likely FSR 4) would be hugely beneficial to PC handhelds, so I hope the Deck 2 will properly support it. It’s sad that AMD’s Z2 series doesn’t, but I hope Valve is cooking up another custom chip with AMD soon.



  • I wouldn’t say we’re over-reliant on Steam, but maybe on Valve to some extent.

    If Valve suddenly stopped all their work on/around Linux, that’d certainly affect Proton and also things like the open AMD GPU drivers. Sure, others would likely continue the work (it’s not like Valve is doing it all alone now anyway), but Valve certainly brings a lot of expertise and also commercial interest.