Thin concrete slabs are extremely brittle.
Is it? It’s rather expensive and would you really know, if the data is gone or corrupted?
You’d have to download every single file at regular intervals and check it. That’s not really low complexity.
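For what it’s worth, the periodic check itself isn’t much code. A sketch of the checksum-manifest idea, stdlib only (the file layout and helper names are made up):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large files don't blow up memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict:
    """Record a checksum for every file under root (do this once, at backup time)."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in root.rglob("*") if p.is_file()}

def verify(root: Path, manifest: dict) -> list:
    """Return the files whose current checksum no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if sha256_of(root / name) != digest]
```

The annoying part isn’t the code, it’s that you still have to actually run it on a schedule, against every copy, forever.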
But what actually is “archival”?
Like, what technology does a normal person have access to that counts as at least enthusiast-level archival?
Magnetic tape, optical media, flash, HDD all rot away, potentially within frighteningly short timeframes and often with subtle bitrot.
Why exactly does MS gaming employ over 20,000 people?
And when people started writing books instead of memorizing epic poems.
Had to work with a fixed string format years ago. Absolute hell.
Something like 200 variables, all encoded as fixed-length strings concatenated together. The output was the same format.
…and some genius before me used + concatenation instead of StringBuilder or anything dignified, so it ran about as well as Lt. Dan.
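A Python sketch of the dignified version of that pattern (the field width is made up): build the fixed-width fields and join once, instead of repeatedly concatenating onto a growing string, which copies the whole thing every iteration.

```python
def encode_record(values, width=8):
    """Encode values as fixed-width fields, joined in one pass.

    Repeated `s = s + field` copies the growing string on every
    iteration (quadratic overall); building a list and joining once
    is linear. Same idea as StringBuilder in Java.
    """
    fields = [str(v).ljust(width)[:width] for v in values]
    return "".join(fields)
```

With 200 fields per record and millions of records, the difference stops being academic very quickly.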
And there are some truly magic tools.
XSDs are far from perfect, but waaay more powerful than JSON Schema.
XSLT has its problems, but completely transforming a document into a completely different structure with just a bit of text is awesome. I had to rewrite a relatively simple XSLT in Java and it came out to something like 10 times the lines.
Python caches bytecode, so the translation happens only once.
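You can watch that cache happen with the stdlib py_compile module (the module name below is invented): CPython byte-compiles the source once and stashes the result under __pycache__/, and later imports reuse the .pyc instead of re-parsing.

```python
import py_compile
from pathlib import Path

def compile_and_locate(source: Path) -> Path:
    """Byte-compile a source file and return the cached .pyc path."""
    return Path(py_compile.compile(str(source), doraise=True))
```

Normally you never call this yourself; the import machinery does it on first import and silently skips the compile step on every run after that.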
Java loads everything immediately and keeps it in memory. All beans, all connections, etc. That takes up a ton of memory.
Of course, but I’m not productive in it.
If I have to do everything myself, it will take more time to get it done. The trade-off is of course always control/speed vs convenience, but C is definitely too inconvenient for me.
Not that limited. Limited means an old thin client, not a microcontroller. I already set up a small web server on a pi pico with mpy, so it’s quite impressive. But from what I understand, the interop with “MacroPython” is not that great.
Did you use mpy for x86 devices? Are the limitations worth it?
But that would mean either using Graal/native image or going full Scala, right?
I only used Scala for Gatling, where it’s obviously very Java-y.
There’s nothing to really grow. It’s mostly just small helpers. Aggregate sensor data, pull data from A and push it to B every hour, a small dashboard, etc.
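The “aggregate sensor data” kind of helper really is just a few lines. A hypothetical sketch, stdlib only:

```python
from collections import defaultdict
from statistics import mean

def aggregate(readings):
    """Average readings per sensor; readings are (sensor_id, value) pairs."""
    by_sensor = defaultdict(list)
    for sensor, value in readings:
        by_sensor[sensor].append(value)
    return {sensor: mean(values) for sensor, values in by_sensor.items()}
```

Which is exactly why there’s nothing to grow: the whole job fits in one function plus a cron entry.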
C is too involved for my case; I want to be productive, after all.
Rust is already rather low level, though there are some cool looking frameworks.
Train nerds are a weird bunch.
Please never change.
And DBAs. I’m currently working on a project where I said from the very start: I can set up this DB in k8s and get it to work decently, but I have neither the knowledge nor the time to get it right. Please give me someone who knows how this works.
No, don’t worry, it’ll be fine, we don’t need that, this kuverneles thing I keep hearing about handles that!!!
Six months of hard contact with the enemy on production later:
Well, we’re currently looking for someone who actually knows how DBs work, because we have one of those issues that would cost a proper DBA 5 minutes and me 5 months.
To be fair, a lot of roles simply disappeared over the years.
Developers today are much more productive than 30 years ago, mostly because someone automated the boring parts away.
A modern developer can spin up a simple CRUD app, including infrastructure, in a day or so. That’s much, much more productive than in 1995. We just cram a lot more of the world into software, so we need 20x the number of developers we needed back then.
It’s really weird, though, that nobody has really created a language/tool to bridge these two worlds. It’s always just generating one representation from the other, mostly badly.
I’d argue that for many problems, a graphical view of the system can help with reasoning. But there simply is nothing in that regard.
and I could make a death ray out of my home wifi box and a wok.
I mean, you could. Do you happen to have a small nuclear reactor and about 400 liters of liquid helium?
It’s absolutely not inherently wrong or implausible to assume that the constant and rather direct exposure over decades causes cancer.
Old-timey radio operators definitely died earlier; they had much higher cancer rates. Granted, completely different levels of radiation, but radiation damage is stochastic. If there is an effect at all, even low doses will cause thousands of new cases, simply because there are something like 7 billion phone users.
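The arithmetic behind that claim, with a deliberately made-up risk figure just to show the scale:

```python
# Back-of-the-envelope for a stochastic effect: a tiny per-person excess
# risk times billions of exposed people is still a large absolute number.
# The risk value is hypothetical; the point is the scale, not the epidemiology.
users = 7_000_000_000          # roughly the number of phone users
excess_cases_per_million = 1   # hypothetical excess lifetime risk
expected_extra_cases = users // 1_000_000 * excess_cases_per_million
print(expected_extra_cases)    # 7000
```

So even a one-in-a-million effect would be thousands of real people, which is exactly why “probably negligible per person” and “worth studying at population scale” are both true.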
Doing proper studies on that is hard, but absolutely necessary.
Well, obviously, you just have to put a sticker with a geometric pattern on it to turn the bad radiation into good radiation!
(I wish that was a joke, but you can actually buy those)
And who does that?
I think you don’t really get my point. I’m not arguing that there are no ways to archive data. I’m arguing that there are no technologies available to the average Joe.
It is hardly a good strategy to basically set up half a datacenter at home.