• 0 Posts
  • 68 Comments
Joined 1 year ago
Cake day: June 25th, 2023


  • Regardless of the reason, the end result is the same: new users are left with the idea that the terminal is essential for using Linux.

    You can say that you set up a distro without using the terminal all you want, but as long as new users don’t know how to do that, my point still stands. Frankly, the fact that you even thought to bring up that point feels to me like further proof that experienced users are highly dismissive of the new user experience.



  • As a recent Linux user, I can say that he’s got a point, but he’s making the wrong point. What I’ve learned is that technically, you don’t have to use the terminal. But as a new user, you’re never made aware that there are non-terminal options. Every time you try installing a program, or really doing anything, the first response in any article or forum is generally going to be to open up the terminal and start typing. Linux is in a weird spot because there are so many desktop environments that the only way to make a tutorial that works on all distributions is to tell the user to use the terminal. Yet by doing so, you are pushing away new users who will begin to think that Linux is too technical for normal use.

    I see many experienced users dismiss new users’ concerns because “you don’t actually need the terminal,” but I don’t think these people really understand that while that’s technically true, the new user experience has been a constant stream of tutorials and articles that basically state the exact opposite. I’m not sure what a good solution would be, but I do think that experienced users need to acknowledge that just because new users identify the wrong problem doesn’t mean that there isn’t a problem at all.








  • It’s confusing because both AMD and Nvidia use the same name to refer to both frame gen and upscaling.

    Upscaling: the GPU renders the game at a low resolution (eg, 720p), and then (semi) smartly guesses what’s in the pixels that weren’t rendered. You get improved framerates because the GPU is doing less work per frame. The downside is that the image is typically a bit blurrier, and depending on how the GPU guesses the missing pixels, you might also get ghosting, which is where moving objects leave a smear trail behind them. The general consensus is that if you plan to use an upscaler, you should only use its highest quality mode. Any lower and the blurring becomes too significant.

    Use when:

    • your GPU isn’t powerful enough to drive your monitor at its native resolution (ie you were going to run the game at a lower resolution anyways)
    • your game isn’t running as fast as you’d like, but turning down the settings would result in too noticeable of a drop in visual quality
    • your game doesn’t support your monitor’s native resolution (common in older games)

    Do not use when:

    • you could turn down the settings and still be satisfied with the visual quality
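    To make the “render low, guess the rest” idea concrete, here’s a toy sketch in plain Python (hypothetical names, nothing like a real GPU upscaler) that blows a tiny “frame” up by duplicating pixels - the dumb end of the spectrum that even FSR 1 improves on:

```python
# Toy illustration of spatial upscaling: render a small frame,
# then fill in the missing pixels by copying neighbors
# (nearest-neighbor). Real upscalers make far smarter guesses,
# but the shape of the problem is the same.

def upscale_nearest(frame, factor):
    """Upscale a 2-D list of pixel values by an integer factor."""
    out = []
    for row in frame:
        big_row = []
        for px in row:
            big_row.extend([px] * factor)            # duplicate horizontally
        out.extend([big_row[:] for _ in range(factor)])  # and vertically
    return out

low_res = [[1, 2],
           [3, 4]]                        # the "720p" frame the GPU actually rendered
high_res = upscale_nearest(low_res, 2)    # the larger frame shown on screen
# high_res == [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

    The smarter the guess at the missing pixels (edge-aware filtering, temporal data from previous frames), the less blur and ghosting - which is essentially the difference between the dumb and smart upscalers listed below.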

    Frame gen: GPU renders a frame, holds on to the frame, renders the next frame, and then guesses at what happened between the two frames. The framerate is improved because the GPU is inserting an entirely guessed frame in between every rendered frame. The downside is that because the GPU has to hold on to a frame, the latency is increased. More specifically, the time between when you move your mouse and when your camera moves will be increased with frame gen.

    Use when:

    • your game isn’t latency-sensitive (eg puzzle games, strategy games, some adventure games)
    • you have a high refresh rate monitor (higher refresh rates typically lead to less added latency)

    Do not use when:

    • your frame rate (without frame gen) is below 60 fps (added latency becomes too noticeable)
    • your game is latency-sensitive (eg competitive multiplayer games)
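    The hold-a-frame mechanic can be sketched the same way (toy Python, not any vendor’s actual algorithm): the guessed frame can’t exist until the *next* real frame has been rendered, which is exactly where the added latency comes from. Here the “guess” is just a pixel-wise average:

```python
# Toy frame generation: for every pair of rendered frames, insert
# one interpolated frame between them. The generator must wait for
# the *next* rendered frame before the current one can be shown,
# which is the source of the added input latency.

def interpolate(a, b):
    """Guess the in-between frame as a pixel-wise average."""
    return [(x + y) / 2 for x, y in zip(a, b)]

def with_frame_gen(rendered):
    out = []
    for prev, nxt in zip(rendered, rendered[1:]):
        out.append(prev)                    # the real frame (shown late)
        out.append(interpolate(prev, nxt))  # the guessed frame
    out.append(rendered[-1])
    return out

frames = [[0, 0], [10, 20], [20, 40]]   # three rendered frames
print(with_frame_gen(frames))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

    Note that the output has nearly double the frames, but none of the guessed frames reflect your inputs - which is why the added latency matters more than the raw framerate number suggests.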

    Terminology:

    • AMD FSR 1: semi-dumb upscaler

    • AMD RSR: literally just FSR 1

    • AMD FSR 2: semi-smart upscaler

    • AMD FSR 3: very slightly smarter upscaler than FSR 2, and comes with semi-smart frame generation

    • AMD AFMF: literally just the frame generation part of FSR 3, but slightly dumber

    • nVidia DLSS 1: semi-dumb upscaler

    • nVidia NIS: driver-level semi-dumb upscaler (the rough equivalent of AMD’s RSR)

    • nVidia DLSS 2: semi-smart upscaler

    • nVidia DLSS 3: smarter upscaler than DLSS 2, and comes with semi-smart frame generation

    • Intel XeSS: semi-smart upscaler





  • I can kinda see it both ways. The two systems aren’t necessarily mutually exclusive - I think the old system works more intuitively for pipes with low volumes of fluid, and the new system works more intuitively for pipes that are full or near full.

    I hope that the developers can mix the two systems, so that pipes use the old system when they’re empty or near empty and switch to the new system when they’re full or near full.
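    A hybrid like that could be as simple as a fill-level threshold. A toy sketch (invented numbers and names, no relation to the game’s actual code) where each tick picks a flow model based on how full the source pipe is:

```python
# Toy hybrid pipe update: below a fill threshold, use the "old"
# pressure-difference style of gradual flow; at or above it, switch
# to a "new" bulk-transfer style. Purely illustrative numbers.

CAPACITY = 100.0
THRESHOLD = 0.8   # switch models at 80% full

def step(pipe_a, pipe_b):
    """Move fluid from pipe_a toward pipe_b for one tick."""
    fill = pipe_a / CAPACITY
    if fill < THRESHOLD:
        flow = (pipe_a - pipe_b) * 0.1               # old model: gentle equalization
    else:
        flow = min(pipe_a, CAPACITY - pipe_b) * 0.5  # new model: bulk push
    return pipe_a - flow, pipe_b + flow

print(step(40.0, 0.0))   # nearly empty: old model -> (36.0, 4.0)
print(step(90.0, 0.0))   # nearly full: new model -> (45.0, 45.0)
```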




  • Yeah, sure! This happens to be my field of research.

    So I was referring to this particular paper, which unfortunately (to my knowledge) didn’t get much follow-up.

    Tangentially, there is a lot of other evidence that circadian rhythms have evolved in part to deal with differences in microbial pathogens during the day vs. at night. However, whether it’s because the composition of bacteria in the atmosphere is different, or because animals are more likely to get themselves exposed to pathogens when they’re foraging, or a mix of both, is unclear. My favorite paper that demonstrates this effect is this one, where the circadian clock affects how strongly the immune system responds to bacteria in the lungs. I’ll also include the seminal paper here that first kickstarted the idea that immunology is fundamentally circadian, although frankly I didn’t like how the paper was written. It looked at how mice responded to Salmonella infection during the day vs. at night and found a difference in immune response that then led to a difference in how severe the infection became.



  • You’re right, I can’t give medical advice. But having an abnormally long or short circadian day is a known thing - these are called circadian diseases. It’s not really my specialty, so I can’t comment too much on it, but my understanding is that many of them are genetic. These genetic variations can cause the circadian clock to run slower or faster than normal (which happens to be adjacent to what I study, so I can talk about it in excruciating detail if desired).
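    To put numbers on “runs slower or faster than normal”: an internal period that isn’t exactly 24 hours accumulates phase drift against the solar day, and the arithmetic is trivial (toy Python; the 24.5 h period is an invented example, not a diagnosis):

```python
# Cumulative drift of an internal clock whose period isn't 24 h.
# Each solar day the clock falls behind (or runs ahead) by
# (period - 24) hours; without entrainment, the drift adds up.

def drift_after(days, period_hours):
    """Hours of phase drift after `days` solar days."""
    return days * (period_hours - 24.0)

print(drift_after(7, 24.5))   # a 24.5 h clock is 3.5 h late after a week
print(drift_after(7, 23.5))   # a 23.5 h clock is 3.5 h early (drift of -3.5)
```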

    Familial Advanced Sleep Phase Syndrome (FASP) is one such genetic circadian disease that gets a lot of attention in the circadian field, but you almost certainly don’t have it, since FASP makes your clock’s period shorter than 24 hours, whereas you seem to imply that yours runs longer.

    The key thing to remember is that the circadian clock is not psychological. There is an actual, physical, molecular clock running in your brain and in nearly all the cells in your body. If this clock has imperfections, then that will directly lead to consequences in your circadian rhythms and your sleep cycle. The circadian clock is a real thing that people with the right equipment can measure and read. It wouldn’t even be particularly hard - just a blood sample or a swab would be sufficient. To be honest, I myself would like to study your cells to see if there really is anything out of place, but that would probably break so many research and ethics rules.

    Anyways, to answer your question, I would recommend getting a medical opinion - it might be worth specifically bringing up that you suspect you have a circadian disease. I’m not too sure about treatment options, since my impression has generally been that we don’t really have treatments for circadian diseases. But it’s not really my specialty, so maybe there are. My memory is that melatonin is a masking cue, which basically means that it makes you sleepy but doesn’t actually shift your circadian clock (which probably explains your poor experience with melatonin).