![](https://lemmy.world/pictrs/image/444ec5b0-bfdf-4eae-a860-a1aff1d4d247.png)
![](https://programming.dev/pictrs/image/8140dda6-9512-4297-ac17-d303638c90a6.png)
I figured they would just run sfc /scannow
and then sit staring at their screen bewildered when it inevitably does nothing.
Just found this article about it that seems to fundamentally misunderstand it in every single way. I didn’t know it was even possible to be this clueless. Either that, or it’s AI.
Thor from Pirate Software (a game studio) does this. He has a dead man’s switch set up so that if he doesn’t log into a specific server for a year, the source code to his game will be automatically published.
You could do the same thing. Just grab a super cheap server that checks the last login date and sends out emails.
It’s been 4 years since I built my last one, but I still think it holds true.
I’ve heard Intel chips still run hot, especially the 14th Gen i9. However, I came across this article by Puget Systems (a system integrator who mainly deals with professional workstations rather than gaming rigs) who found that decreasing the PL1 (Power Limit 1, the sustained power limit) from 253W to 125W was a good enough performance/heat tradeoff that it’s the default configuration they ship to their customers.
On the other hand, they still do mention that tasks such as UE light baking, V-Ray, Cinebench, and Blender saw gains of 10-18% when using the higher power limit, which seems much more like what OP’s workload is. Puget then goes on to recommend a CPU with a higher core count, like a Threadripper PRO, for those kinds of workloads, so perhaps OP really would be better off going AMD for their workstation.
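For anyone wanting to experiment with this on Linux, one way to cap PL1 at runtime is the RAPL powercap sysfs interface (a sketch; domain paths and availability vary by platform, and this is separate from the BIOS setting an integrator like Puget would change):

```shell
# Inspect the long-term (PL1) constraint of the package RAPL domain,
# then cap it at 125 W. Values are in microwatts; run as root.
cat /sys/class/powercap/intel-rapl:0/constraint_0_name            # usually "long_term"
cat /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw  # current PL1
echo 125000000 > /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw
```

Note this doesn’t persist across reboots, which is arguably a feature while testing.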
I’m not sure you got the point that the article was trying to get across.
Competition and interop are great things, but that’s not the problem. The problem is in fundamental design choices that completely change how users interact with systems.
Let’s look at controllers. Should we standardise on tracked controllers or hand tracking? There are benefits and drawbacks to both.
Controllers allow one hand to have many different inputs (be they buttons/sticks/touch/etc) and provide haptic feedback, while hand tracking does not.
Hand tracking allows better immersion as your hands in-headset will more closely match your real hands, which is something you don’t always get with controllers.
Not all applications/games can be made to work well with both of these input methods, if at all.
VRChat for sure. You can easily spend so much time in there.
Probably off-by-one errors
You can code in Notepad in the same way you can eat off the floor with your hands. Using better tools is a nicer experience.
As for performance, when one of the world’s most popular editors runs on Electron, it’s not hard to see why performance could be an issue when working on large projects on older hardware.
I’ve never personally had an issue with VSCode’s performance, but I’m also fortunate enough to be in a position where I can afford a relatively modern machine. Many others have to make do with what they have, which is why Zed might appeal to them.
implying that any developer actually reads warnings
Same here, to a certain extent.
I was referring only to Linux’s lack of bullshittery in comparison to Windows, nothing else.
Far easier to do too. I did one of each last month and there’s no question that the Windows setup experience is terrible in comparison.
I played on release and then stopped after a while. I left a positive review hoping the game would improve over time.
I came back a year later and literally nothing had changed. In fact, it had gotten worse. The same bugs existed as when I first played, and my controls had now broken (Index controllers now cannot cast spells??) and there was no fix or workaround available online.
Given the amount of money and time they had to fix and test things, my review is now negative.
🧜‍♀️ Mermaid + 🔥 Fire = 🚨 Siren
Clever…
I want to like Forgejo but the name is really terrible.
Is it “forj-joe”? Nah, that double-J sound is way too awkward.
Do you then merge the J sounds to make “forjo”? If so, why not just call it that?
Is it maybe “for-geh-joe”? That seems the most likely to me, but then that ignores the “build < forge” marketing on their website.
I know it’s pretty inconsequential, but it feels weird using a tool that you don’t even know how to pronounce the name of.
Seems like a “haha JS bad” kind of joke, but OP seems to forget that Python is in a similar boat.
You at least have to know that it’s a meme format. Otherwise it just looks like someone complaining about async with a bad crop.
Interestingly, this JXL loads in Boost, but the one in the post doesn’t. Perhaps it’s because it’s inside a comment?
I would say finding that the bug is in a library is worse than finding it in your own code.
If it’s your own code, you just fix it.
If it’s in a library, you then have to go and search for existing issues. If there isn’t one, you spend time filing one, potentially preparing a minimal reproducible example. And if you don’t do that (or the library is unmaintained), you have to consider downgrading to a version without the bug, potentially losing functionality, or switching to another library entirely and rewriting all the code that used the old one to work with the new one.
Yeah, I’d take my own bugs over library bugs any day.
Not to mention VSCodium already exists.