It is insanely interesting to me whenever I come across details in old file formats that were included specifically to work around hardware limitations. The wide knowledge required to be aware of all these wild factors is amazing.
As you can tell, I’m fun at parties.
“To contrast, the human brain apparently can’t remember a simple piece of information like not getting attached to their companion cube. I think we know who would be better at a party, the punchcard.”
My dad converted old assembly programs into COBOL for spending money in uni - his textbooks were full of cast-offs.
MicroSD cards also don’t look nearly as badass if woven into a skirt.
C++ is pretty sweet.
It definitely has its issues - don’t get me wrong, but it’s pretty sweet.
Hrm, give me a moment to check the ACLs, I’ll be able to resolve all these complex conflicting rules shortly…
Never mind, it was easier to just globally disable SELinux so I did that. Your system should be more secure now.
It’s really easy, just throw an error if you detect a program will cause a halt. I don’t know why these engineers refuse to just patch it.
And a master copy is used to produce slaves - though "slave" isn’t widespread in version control, it’s still quite present in databases. And it all comes from the same master/slave naming habit.
Absolutely fucking correct, and that’s the part that I hate much more than not being given proper credit.
Pshaw… just write it in raw HTML. It’s an incredibly legible markup language. I talk to my spouse in HTML just to stay sharp.
If only he had a briefcase of XSLTs to make sure the XML was safe first.
Uh… from Caml? Because OCaml’s object support is the least surprising part of the language.
I’d suggest choosing a mature language with a large number of utilities/libraries available - Java, Python, and Rust spring to mind - but the graphical shit is really what you’d want to lean hard on a library for. I don’t know enough to say for certain, but it sounds like most of your work will be defining objects and how they interact… off-the-shelf solutions can’t really help with that.
For those of us inclined to fishnets… Tim Curry is also a fair reference.
But yes - Haskell Curry has lent his name to both Haskell and currying.
The sibling comment said drugs, which may be effective for some people, but I’d actually just highlight “leisure” - being able to afford to explore where your mind takes you is a luxury that pays off massively for your mental health. I have wanderlust and I’m a programmer; sometimes my legs want to move and, with my understanding boss, I can go out into the world and walk along the beaches or through the forest while I ponder problems. This is a huge boon for my mental health and is something most employees can’t afford due to monetary stresses and toxic employers.
I absolutely agree, and I consider LLM results to be “neat” but never trusted - if I think I should bake spaghetti squash at 350 I might ask an LLM, and I’ll only go find real advice if our suggested temperatures vary.
But some people have wholly bought into the “it’s a magic knowledge box” bullshit - you’ll see opinions here on lemmy that generative AI can make novel creations that indicate true creativity… you’ll see opinions from C-level folks that LLMs can replace customer service wholesale, chomping at the bit to downsize call centers. Companies need to be careful about deceiving these users, and those that feed into the mysticism really need to be stopped.
No, I don’t think so - it’s just a dick move to go out of your way to sabotage someone. If they’re fucking up, just point to their existing mistakes - don’t waste time contriving new ones.
You’re moving the goalposts, though - prior to AI, being an expert reference on the internet was expensive and dangerous, since you could potentially be held liable; as such, a lot of topic areas simply lacked expert reference sources. Google has declared itself an expert reference on every topic by deploying Gemini - it isn’t, and this will end badly for them.
One of my proudest university moments was getting a 50% on an exam. I built this absolutely fucking glorious solitaire implementation in Java as a first-year student that dove deep into how image buffers work, and I stayed up all night doing it. I got 100% on the project and 0% on the presentation that I slept through (my prof did offer me some extra credit, which I took advantage of).
Never have I ever felt more validated in preferring to be a code monkey with zero client interaction than in that moment - I produced unimpeachably perfect results and completely fucked the communication side. (Thankfully, I’ve worked through a lot of my social anxiety, but I’m still strongly in the introvert camp.)