No, that is literally exactly what they just complained about.
> And it's annoying that you need to read some footnote to figure out what exactly it means; it's annoying that it is basically a code-phrase that you just need to know that it's not supposed to be taken literally.
Even to the point of deferring the real explanation to a secondary paragraph.
It seems you're arguing based on the assumption that real time is exactly 365x24x60x60 seconds every year or exactly 366x24x60x60 seconds on leap years. It's not.
The problem is that we have a very precise definition of a second in terms of atomic physics: a fixed number of cycles of the radiation from the caesium-133 hyperfine transition (often misremembered as being about radioactive decay, but it's a resonance frequency, not a decay statistic). Arguably the real problem is that this is over-defined.
There's a good case that a second is really what it had always been defined as, right up until 1967: 1/60th of a minute, which is 1/60th of an hour, which is 1/24th of a day. That's what UNIX seconds are, 86400 of them in one day. And we have a pretty good idea what a solar day is, and have been building calendars around it for thousands of years.
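To make that concrete, here's a minimal Python sketch (the names are just illustrative, not from any particular library) of how POSIX time treats every day as exactly 86400 seconds:

    SECONDS_PER_DAY = 24 * 60 * 60        # 86400; leap seconds simply don't exist here

    def split_unix_time(ts: int) -> tuple[int, int]:
        # days since 1970-01-01 and the second within that day
        return divmod(ts, SECONDS_PER_DAY)

    days, second_of_day = split_unix_time(1_700_000_000)
    print(days, second_of_day)            # 19675 80000  ->  2023-11-14, 22:13:20 UTC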
But if you want to base your times on the caesium standard, then you can do that, but you have to accept that it no longer corresponds neatly to a solar day any more. The length of a day fluctuates by a few milliseconds in either direction, which we largely just ignore because over time, that mostly cancels out.

Personally, I don't think leap seconds should ever have been introduced. Since they were introduced, the earth has mostly been rotating "slower" than the idealised 86400-atomic-second day, so we've added leap seconds. But more recently it's been rotating "faster", and we're at the point where we'd need a negative leap second. Maybe really, we should have just left it alone and over a longer period it'd all have averaged out anyway.
But what's interesting is that apart from the meddling with leap seconds, we've decided that a "typical day" has exactly 86400 seconds where a second is some constant time, even though that isn't true of the reality of our planet. Some days are too short when defined this way, some days are too long. But on average, this 86400 seconds is pretty much right.
And arguably, any day that needs a leap second isn't "wrong"; the problem is actually that we over-defined the second before we realised that the length of the solar day wasn't constant. I wouldn't advocate trying to redefine what a second is again, because having a constant-duration second is incredibly useful for defining all the other derived SI units. But with that usefulness, you also need to be aware that it's not the same as the traditional timekeeping second.
But in any case, except for leap seconds, all the world's time systems agree on 86400 seconds per day. So, can you make the case for why you think UNIX time in particular is a problem? And what would you rather have instead?
Because sometimes when you measure time you want to know what fraction of the day it is, and sometimes when you measure time you want an objective measure of the passage of time.
Quite often, we want to use something originally recorded using the former to calculate the latter. It is most convenient to have a second that is of fixed duration. Which is kind of exactly why the tz database exists in the first place.
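As a rough illustration of those two kinds of measurement in code (Python here, purely as an example):

    import time
    from datetime import datetime
    from zoneinfo import ZoneInfo     # tz database lookups, bundled with Python 3.9+

    # "What fraction of the day is it?" -- a wall-clock reading in a zone
    wall = datetime.now(ZoneInfo("Europe/London"))
    print(wall.isoformat())

    # "How much time actually passed?" -- a monotonic duration, immune to zone rules
    start = time.monotonic()
    time.sleep(0.25)                  # stand-in for the thing being measured
    print(f"{time.monotonic() - start:.3f} s elapsed")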
Except it's made more complicated because most systems that use unixtime also sync over NTP, and many NTP providers smear the leap, since essentially nothing in computing supports 23:59:60 or 23:59:61 or a repeated 23:59:59. So on the day of the leap, the recorded time for events doesn't match standard time. Which is why the unsmear library exists (among others, probably).
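For a feel of what smearing does, here's a toy sketch of a 24-hour linear smear; real deployments (Google's, Amazon's) choose their own window and shape:

    def smear_fraction(t_since_leap: float, window: float = 86_400.0) -> float:
        """Fraction of a positive leap second already folded into smeared clocks.

        t_since_leap: seconds relative to the leap instant (negative = before it).
        The ramp runs from window/2 before the leap to window/2 after it.
        """
        return min(1.0, max(0.0, (t_since_leap + window / 2) / window))

    print(smear_fraction(-50_000))    # 0.0  (well before the smear window)
    print(smear_fraction(0))          # 0.5  (halfway through, at the leap itself)
    print(smear_fraction(50_000))     # 1.0  (leap fully absorbed)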
Note that TAI (International Atomic Time) really is a running count of actual SI seconds since it was first synchronised in 1958. It is what UTC is defined from, and it's currently 37 seconds ahead of UTC.
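A sketch of what that offset looks like in practice, using a deliberately truncated, illustrative leap-second table (the real one is published by the IERS):

    # (Unix timestamp when the offset took effect, TAI-UTC in seconds)
    LEAP_TABLE = [
        (63_072_000, 10),        # 1972-01-01: TAI-UTC set to 10 s
        (1_483_228_800, 37),     # 2017-01-01: the most recent leap second so far
    ]

    def tai_minus_utc(unix_ts: int) -> int:
        offset = 0
        for since, delta in LEAP_TABLE:
            if unix_ts >= since:
                offset = delta
        return offset

    print(tai_minus_utc(1_700_000_000))   # 37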
All that is to say... Calling unixtime "seconds since epoch" is a forgivable sin in terms of the practicalities of communication, but it's not really defensible as a matter of being a factual description of reality. The truth is that the new definition of a second was agreed upon before Unix came along, and when we're measuring time in seconds we don't typically care about the solar day or sidereal day. Further, there is no practical way to construct a computer or clock (barring a sundial) that supports the original dynamic definition of time divisions. I can't even imagine how relativistic time corrections for GPS satellites would have to work. It would be the longitude problem all over again.
>it's not really defensible as a matter of being a factual description of reality
Is it a factual description of reality? "Seconds since epoch" is an almost entirely abstract idea, given that neither seconds nor epochs exist in the universe. The only way it's connected to reality is that time moves forwards, so "since" has meaning. So it seems to me that someone who says "seconds since epoch" can choose to give the words any meaning they like as long as everyone understands what is meant.
So to me this is as relevant a complaint as saying that the special case of graphs shouldn't be called "trees" because branches in real trees sometimes rejoin. It's a metaphor. We're dealing with entirely human-made concepts with barely any input from the real world, we can use any words we like.
It's only as abstract as "meters of distance". Which is to say, not in any practical sense until we introduce general or special relativity. The fact that it's synthetic and not a natural unit doesn't mean it's abstract or variable. It means it's arbitrary. It could be any value, but it must remain static.
That's also why we don't use the king's body dimensions as the standard of measurement anymore. We don't need to recalibrate the entire nation's standard of "foot," "inch," "cubit," or whatever when the king grows. That's not useful.
Just because the inch was the width of an adult man's thumb doesn't mean the purpose of the inch is to define thumbs and should always follow from that. That's not why we wanted the measuring unit in the first place. It's the same for seconds.
That's why I said: Sometimes you want to know what time of day it is, and sometimes you want to know how long something took. That is to say, a datetime and a timespan. Now, you could use a different unit of measurement for those two, but in practical terms it's stupid.
Meters and seconds are both abstract and arbitrary. Abstract because time and space aren't divided. A measure of length isn't concrete like a count of atoms in a cup is. Surely we can agree that those two things have different degrees of concreteness. Yes, the phrase "a second" can be translated to a real equivalent, but the "second" is a human idea, unlike the atom.
Distance can absolutely be concrete. We can specify the distance traveled by light in a vacuum, or, if you'd rather, a specified number of Planck units. Similarly, time can be defined by counting cycles of radiation at a given frequency. Which is basically how the modern SI second and meter are derived: the second from a fixed atomic transition frequency, and the meter from the second via the speed of light.
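Put in numbers (Python again, just to show the chain of definitions; the variable names are mine):

    CS133_HYPERFINE_HZ = 9_192_631_770   # cycles of the Cs-133 transition per second (exact)
    SPEED_OF_LIGHT_M_S = 299_792_458     # metres per second (exact, by definition)

    # The second is fixed by counting caesium cycles; the metre then falls out of c:
    metre_in_nanoseconds_of_light = 1 / SPEED_OF_LIGHT_M_S * 1e9
    print(f"light crosses one metre in {metre_in_nanoseconds_of_light:.4f} ns")   # ~3.3356 ns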
In the end, all you're doing is finding different ways to dissect physical constants. But you're still using those constants. The constant itself doesn't change because the Earth rotates slower today than it did yesterday; the planet simply takes more time to turn through the same angle about its axis.
Which is why I say distance and time are not really variable or imprecise (better terms than abstract, which simply has too many nebulous meanings) until you start to introduce relativity.
In fact, time is a fundamental element of many physical constants. The Planck constant, the gravitational constant, and the speed of light are all fundamentally based on time and distance. And since the speed of light relates to mass and energy, whatever units you pick need to be consistent. If we say that 1 second is always 1/86,400th of a solar Earth day (instantaneously, I suppose?) then you've immediately made three of the known physical constants dependent on how quickly the Earth spins. And you could do that and recalculate all of physics every day if you wanted to. But it's kind of stupid. It would be like re-graduating a tape measure every day you build a house. In theory it's just fine. Practically it's really not.
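A quick back-of-the-envelope for that point, assuming a hypothetical "day-based second" of 1/86,400 of whatever today's day happens to be:

    C_PER_SI_SECOND = 299_792_458     # metres per SI second, exact today

    def c_in_day_based_seconds(day_length_in_si_seconds: float) -> float:
        # metres per "day-second", if a second were 1/86,400 of *this* particular day
        day_second = day_length_in_si_seconds / 86_400
        return C_PER_SI_SECOND * day_second

    # A day running 2 ms long would shift the numerical value of c by about 7 m/s:
    print(c_in_day_based_seconds(86_400.002) - C_PER_SI_SECOND)   # ~6.94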