I know a Google engineer who was saying they're having to update their codebases to handle > 16 exabytes of storage, if you can imagine. But yeah, that's storage, not RAM.
I started in C and switched to C++. It's easy to think that the latter sort of picked up where the former left off, and that since the advent of C++11, it's unfathomably further ahead. But C continues to develop and occasionally gets a new feature of its own. One example I can think of is the restrict keyword, which allows for certain optimizations. AFAIK it's not included in the C++ standard to date, though most compilers support it in some non-standard way because of its usefulness. (With Rust, the language design itself obviates the need for such a keyword, which is pretty cool.)
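To illustrate, a minimal sketch (in C99 you'd write plain restrict; __restrict is the non-standard spelling that GCC, Clang, and MSVC all accept in C++):

    // The restrict promise: dst and src never alias, so the compiler
    // can keep loads in registers and vectorize without re-checking
    // for overlap after every store.
    void scale(float* __restrict dst, const float* __restrict src, int n) {
        for (int i = 0; i < n; ++i) {
            dst[i] = src[i] * 2.0f;
        }
    }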
Another feature added to C was the ability to initialize a struct with something like FooBar fb = {.foo=1, .bar=2};. I've seen modern C code that gives you something close to keyword args like in Python using structs. As of C++20, they sort of added this, but with the restriction that the named fields have to come in the same order as they were originally defined in the struct, which is a bit annoying.
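For anyone who hasn't run into them, here's a minimal sketch of the C++20 version and its ordering restriction:

    // C++20 designated initializers: C99's feature, with strings attached.
    struct FooBar {
        int foo;
        int bar;
    };

    int main() {
        FooBar ok = {.foo = 1, .bar = 2};     // fine: declaration order
        // FooBar bad = {.bar = 2, .foo = 1}; // error in C++20: out of
        //                                    // order (plain C99 allows it)
        return ok.foo + ok.bar;
    }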
Overall though, C++ is way ahead of C in almost every respect.
If you want to see something really trippy, though, have a look at all the crazy stuff that’s happened to FORTRAN. Yes, it’s still around and had a major revision in 2018.
I guess the MAC address guy is up next. 48 bits may not go so far if every light bulb is going to want its own.
Imagine if you were the guy who made the call on IPv4 addresses…
Falsehoods About Time
Having a background in astronomy, I knew going into programming that time would be an absolute bitch.
Most recently, I thought I could code a script that could project when Easter would land every year to mark it on office timesheets. After spending an embarrassing amount of…er…time on it, I gave up and downloaded a table of pre-calculated dates. I suppose at some point, assuming the code survives that long, it will have a Y2K-style moment, but I didn’t trust my own algorithm over the table. I do think it is healthy, if not essential, to not trust your own code.
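For the curious, there is a published computus, the anonymous Gregorian algorithm (Meeus/Jones/Butcher), which is roughly what my attempt should have looked like. A sketch, and I'd still double-check it against the table:

    #include <cstdio>

    // Anonymous Gregorian computus (Meeus/Jones/Butcher). Valid for
    // years on the Gregorian calendar; month comes out as 3 or 4.
    void easter(int year, int* month, int* day) {
        int a = year % 19;
        int b = year / 100, c = year % 100;
        int d = b / 4, e = b % 4;
        int f = (b + 8) / 25;
        int g = (b - f + 1) / 3;
        int h = (19 * a + b - d - g + 15) % 30;
        int i = c / 4, k = c % 4;
        int l = (32 + 2 * e + 2 * i - h - k) % 7;
        int m = (a + 11 * h + 22 * l) / 451;
        *month = (h + l - 7 * m + 114) / 31;
        *day = (h + l - 7 * m + 114) % 31 + 1;
    }

    int main() {
        int month, day;
        easter(2024, &month, &day);
        std::printf("Easter 2024: %d/%d\n", month, day); // 3/31
    }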
Falsehoods About Text
I’d like to add “Splitting at code-point boundary is safe” to your list. Man, was I ever naive!
So the next captcha will be a list of AI-generated statements and you have to decide which are batshit crazy?
“Recall uses Copilot+ PC advanced processing capabilities to take images of your active screen every few seconds,”
Seems like a lot of extra disk thrashing that would shorten the life expectancy of an SSD? Like it would be considerably more than your usual background chatter of daemons writing to log files and whatnot. Unless I'm misunderstanding this?
We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
This was a struggle for me going from hobbyist programmer to working at a company. I tried to tone it down. Really. But eventually I got “promoted” to having my own office with a suspiciously thick door. Hmm…
I live in the path of totality and the local tourism office is projecting anywhere from 70K to half a million visitors. It's insane! Also, I read that Niagara Falls, which is obviously used to seeing a lot of tourists, has nevertheless declared a preemptive state of emergency. And there are advisories to be very careful if you're driving on highways at the time of totality, as there will inevitably be idiots who stop suddenly to gawk and burn out their eyeballs.
Mind you, it could all be a bust given the current weather forecast is for Monday to be cloudy across the whole region. I guess it’ll still be cool to see everything go dark for a few minutes though.
True story. I was looking for an answer to an obscure problem and found it in a 10-year-old stackoverflow post. Then I looked more closely at the author…
Hey! Me from 10 years ago, stop being such a smart ass! It’s obnoxious.
1st reaction: lmao
2nd reaction: hey wait, this is pure genius!
There should be a law that whenever this happens, the changes must be highlighted in bold.
I’m just going to leave this picture of a wampa from the planet Hoth here for no particular reason.
There is an issue with templated code where the implementation does have to be in the header as well, though that is not the case here. C++20 introduced modules which I guess were meant to sort out this mess, but it has been a rocky road getting them to be supported by compilers.
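For reference, a minimal sketch of what modules look like on paper (file extensions and build flags vary by compiler, which is part of the rocky road; .cppm is Clang's convention, MSVC uses .ixx):

    // math.cppm -- module interface unit
    export module math;

    export int square(int x) { return x * x; }

    // main.cpp -- the consumer: no header, no include guards, one import
    import math;

    int main() { return square(7); }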
Looks like we’ve got a Java programmer here taking C++ for a spin.
Good Lord, if the US nuclear arsenal is that antiquated, I shudder to think of where the Russians are at. Please don’t short-circuit and accidentally launch…
There is bounds checking, but it’s opt-in. I often enable it on debug builds.
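Assuming we're talking about C++ standard containers, a quick sketch of the opt-in flavors:

    #include <cstdio>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};

        // int oops = v[10];  // operator[]: unchecked, undefined behavior

        try {
            int checked = v.at(10);  // at(): the opt-in checked access
            (void)checked;
        } catch (const std::out_of_range& e) {
            std::printf("caught: %s\n", e.what());
        }

        // Debug builds can also make operator[] itself checked, e.g.
        // libstdc++ with -D_GLIBCXX_ASSERTIONS.
        return 0;
    }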
Ah, I think I found it. I need to go:
{
  "format_on_save": "off"
}
You can always combine integer operations in smaller chunks to simulate something that’s too big to fit in a register. Python even does this transparently for you, so your integers can be as big as you want.
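Here's the idea in miniature: a 128-bit add built from two 64-bit halves with a manual carry, which is essentially what Python's bignums do under the hood, just with arbitrarily many limbs:

    #include <cstdint>
    #include <cstdio>

    // A 128-bit unsigned integer stored as two 64-bit "limbs".
    struct U128 {
        std::uint64_t hi, lo;
    };

    // Add limb by limb, propagating the carry by hand.
    U128 add(U128 a, U128 b) {
        U128 r;
        r.lo = a.lo + b.lo;
        std::uint64_t carry = (r.lo < a.lo) ? 1 : 0;  // wrapped => carry out
        r.hi = a.hi + b.hi + carry;
        return r;
    }

    int main() {
        U128 x{0, UINT64_MAX};  // 2^64 - 1
        U128 y{0, 1};
        U128 z = add(x, y);     // 2^64: hi = 1, lo = 0
        std::printf("hi=%llu lo=%llu\n",
                    (unsigned long long)z.hi, (unsigned long long)z.lo);
    }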
The fundamental problem that forced the move to 64-bit was needing to address more than 4 GB of RAM. It's kind of similar to the problem with the Internet, where 4 billion unique IPv4 addresses fall rather short of what we need. IPv6 has a host of improvements, but the massively expanded address space is what gets talked about the most, since that's what is desperately needed.
Going back to RAM though, it's sort of interesting that at the lowest levels, memory is accessed in chunks larger than 8 bits, and that's been the case for a long time now. CPUs have to maintain the illusion that an 8-bit byte is the smallest addressable unit of memory, since software would break badly were that not the case. But it's somewhat amusing to me that we still shouldn't really need more than 32 bits to address RAM at the lowest levels, even with the 16 GB I have in my laptop right now: 16 GB is 2^34 bytes, but counted in, say, 8-byte chunks that's only 2^31 addresses, which fits comfortably in 32 bits. I've worked with 32-bit microcontrollers where the byte size is > 8 bits, and yeah, you can have plenty of addressable memory in there if you wanted.
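The back-of-the-envelope numbers, in code form (the 8-byte granularity is just my illustrative assumption, not any particular machine's):

    #include <cstdint>
    #include <cstdio>

    int main() {
        std::uint64_t ram = 16ull << 30;     // 16 GiB = 2^34 bytes
        std::uint64_t byte_addrs = ram;      // 2^34: needs 34 bits
        std::uint64_t word_addrs = ram / 8;  // 2^31 8-byte words:
                                             // fits easily in 32 bits
        std::printf("byte-granular addresses:   %llu\n",
                    (unsigned long long)byte_addrs);
        std::printf("8-byte-granular addresses: %llu\n",
                    (unsigned long long)word_addrs);
    }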