Recently, I had a conversation with a junior developer on my team. Let’s call him Alan. We were talking about a new notification feature that would send reminder e-mails to potentially thousands of people if they had forgotten to enter certain data in the last month or so. Alan was confident that the code he’d written was correct. “I’ve tested it well,” he said…
I don’t really get why people use any representation other than ms/seconds since the epoch for anything except displaying that time to the end user. Having time just be a single number with no time-zone shenanigans makes writing logic like that so much easier.
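To make that concrete, here is a minimal sketch of the kind of logic this comment means, applied to the reminder scenario from the intro. The field name `last_entry_ts` and the 30-day cutoff are made up for illustration; the point is that the check is plain integer arithmetic on epoch seconds:

```python
import time

THIRTY_DAYS = 30 * 24 * 60 * 60  # cutoff in seconds; the exact window is an assumption

def needs_reminder(last_entry_ts, now_ts=None):
    """Return True if the user's last entry is more than ~30 days old.

    Both values are Unix timestamps (seconds since the epoch), so the
    comparison is a single subtraction with no time zones involved.
    """
    if now_ts is None:
        now_ts = int(time.time())
    return now_ts - last_entry_ts > THIRTY_DAYS

# A last entry 45 days ago is overdue:
print(needs_reminder(int(time.time()) - 45 * 24 * 60 * 60))  # True
```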
Epochs aren’t that simple either.
First of all, local wall-clock time can be relevant, so you have to store time-zone information somewhere anyway (see the sketch right after this comment).
Epochs are also somewhat iffy when it comes to leap years and leap seconds: Unix time pretends every day has exactly 86,400 seconds, so elapsed-time arithmetic across a leap second is subtly off.
And finally: write me an SQL query that retrieves all entries submitted in 2022 using just epochs.
Timezones are annoying as fuck, don’t get me wrong, but simply ignoring them isn’t a solution either.
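As a minimal sketch of the “store timezone information somewhere anyway” point above: keep the instant as an epoch value plus the IANA zone it happened in, so the user’s local wall-clock time can be reconstructed later. The record layout here is a hypothetical example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical record: the instant as Unix seconds, plus the zone the
# user was in when it was created.
record = {"ts": 1650000000, "tz": "Europe/Berlin"}

instant = datetime.fromtimestamp(record["ts"], tz=timezone.utc)
local = instant.astimezone(ZoneInfo(record["tz"]))
print(local.isoformat())  # 2022-04-15T07:20:00+02:00 -- what the user's clock showed
```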
I don’t really remember SQL; does it prevent you from using a range of values? I can understand why leap seconds would be an issue.
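For what it’s worth, a range query is exactly how this is usually done; the catch is the one the earlier comment hints at: turning “2022” into two epoch boundaries forces you to pick a time zone first. A self-contained sketch with a hypothetical `entries` table, assuming UTC boundaries:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, submitted_at INTEGER)")
conn.executemany(
    "INSERT INTO entries (submitted_at) VALUES (?)",
    [(1609459200,), (1650000000,), (1700000000,)],  # early 2021, mid 2022, late 2023
)

# SQL has no problem with ranges; the work is computing the boundaries.
# "All of 2022" only becomes two epoch values once a zone is chosen --
# these are the UTC ones, and a user in UTC+13 would need different numbers.
start = int(datetime(2022, 1, 1, tzinfo=timezone.utc).timestamp())  # 1640995200
end = int(datetime(2023, 1, 1, tzinfo=timezone.utc).timestamp())    # 1672531200

rows = conn.execute(
    "SELECT id, submitted_at FROM entries"
    " WHERE submitted_at >= ? AND submitted_at < ?",
    (start, end),
).fetchall()
print(rows)  # [(2, 1650000000)] -- only the 2022 entry
```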