this post was submitted on 14 Dec 2023
214 points (95.7% liked)

Programmer Humor

Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] davel@lemmy.ml 30 points 6 months ago (2 children)
[–] CanadaPlus@futurology.today 4 points 6 months ago (1 children)
  1. Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.
  2. The system clock will never be set to a time that is in the distant past or the far future.

Does this come up? I feel like if you're doing retrocomputing you assume a certain level of responsibility for your software breaking.

  1. Ok, but the duration of one minute on the system clock will be pretty close to the duration of one minute on most other clocks.
  2. Fine, but the duration of one minute on the system clock would never be more than an hour.
  3. You can’t be serious.

You can't be, can you? Ditto on that being the user's problem. My thing also isn't portable onto a Zuse Z2 or a billiard-ball computer you built in your garage.

There's some weird shit in the crowdsourced ones. I don't even know where to start.

[–] Redjard@lemmy.dbzer0.com 5 points 6 months ago (1 children)

Ever heard of standby and the like? What do you reckon that does to programs calculating with time at that exact moment?

[–] CanadaPlus@futurology.today 4 points 6 months ago

I... Actually don't know.

The real time clock continues to move in real time under reasonable conditions. If it's in a weird year it's either because you've decided to run a disk you found in a cave, left by the Ancient Ones, or you're cheating at Animal Crossing.

I'm a little unclear on how the rest of the clocks typically work together. If your program is drawing from one that gets stopped for a while, I guess yeah, a minute could totally be weeks long, and I'm in the picture as a falsehood believer.

[–] AVincentInSpace@pawb.social 1 points 5 months ago

if that person who wrote all these could provide examples for why literally any of them are wrong, instead of just resorting to the standard "falsehoods programmers believe" fare of "you believe this? ha. it is wrong. therefore I am smarter than you", I would very much appreciate it

[–] ExLisper@linux.community 24 points 6 months ago

OMG, it's so trivial. What you do is when T2 happens you send an atomic clock back in time to T1 and start counting till T2 happens again. If T1 and T2 happen in different locations you send two entangled clocks and collapse the state on T2 clock when the event happens measuring the exact moment on T1. How is this an issue?

[–] alcoholicorn@hexbear.net 15 points 6 months ago (3 children)

Only problem is accepting dates in anything except YYYYMMDD, or Unix timestamps if you need more precision.

[–] snowe@programming.dev 14 points 6 months ago (2 children)

Neither of those will solve the problem in the comic.

[–] fiah@discuss.tchncs.de 7 points 6 months ago* (last edited 6 months ago)

you know what will solve those problems though? blame them on someone else. "oh yeah that bug, yeah sorry the package we're using messed it up, there's a PR for that"

[–] azertyfun@sh.itjust.works 6 points 6 months ago* (last edited 6 months ago) (3 children)

EDIT: NVM I'm a goddamn idiot, Unix Time's handling of leap seconds is moronic and makes everything I said below wrong.


Unix Time is an appropriate tool for measuring time intervals, since it does not factor in leap seconds or any astronomical phenomenon and is therefore monotonically increasing... If T1 and/or T2 are given in another format, then it can get very hairy to do the conversion to an epoch time like Unix time, sure.

The alt-text pokes fun at the fact that, due to relativity, time moves at different speeds at astronomical scales. However, I would argue that this is irrelevant, as the comic itself talks about "Anyone who's worked on datetime systems", vanishingly few of whom ever have to account for relativity (the only non-research use-case being GPS AFAIK).
While the comic is funny, if:

  • Your time source is NTP or GPS
  • "event 1" and "event 2" both happen on Earth
  • You're reasonably confident that the system clock is functioning properly

(All of which are reasonable assumptions for any real use-case)
Then ((time_t) t2) - ((time_t) t1) is precise well within the error margin of the available tools. Expanding the problem space to take into account relativistic phenomena would be a mistake in almost every case and you're not getting the job.
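
That epoch-subtraction approach can be sketched in a few lines of Python (the event timestamps here are made up for illustration):

```python
# Interval between two events, both recorded as Unix timestamps
# (UTC-relative seconds since 1970-01-01T00:00:00Z).
t1 = 1702597500  # hypothetical "event 1"
t2 = 1702600950  # hypothetical "event 2"

# The subtraction works the same no matter where on Earth each event happened,
# because both values are relative to the same epoch.
elapsed = t2 - t1
print(elapsed)  # 3450 seconds
```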

[–] CanadaPlus@futurology.today 2 points 6 months ago* (last edited 6 months ago)

Clock misalignment comes up pretty frequently in some networking and networking-esque applications. Otherwise, yeah, the edge cases are indeed on the edge.

Subsecond precision comes up often in common applications too, but you can just expand out to milliseconds or whatever.

[–] mormegil@programming.dev 1 points 6 months ago (1 children)

When you're saying Unix time does not include leap seconds, you are making exactly the wrong conclusion. Unix time is not a monotonically increasing number of seconds since the Epoch, because it excludes those seconds which are marked as leap seconds in UTC. I.e. the time between now and the Epoch was larger than the current Unix time shows (by exactly the number of leap seconds in between). See e.g. https://en.wikipedia.org/wiki/Unix_time#Leap_seconds
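
The gap is easy to demonstrate with Python's stdlib, which implements the POSIX (leap-second-free) encoding:

```python
import calendar

# Unix time of midnight UTC just before and just after the 2016-12-31 leap second.
before = calendar.timegm((2016, 12, 31, 0, 0, 0))
after = calendar.timegm((2017, 1, 1, 0, 0, 0))

# Unix time says the day was exactly 86400 s long, even though 86401 SI seconds
# actually elapsed (23:59:60 UTC existed). The leap second is simply not counted.
print(after - before)  # 86400
```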

[–] azertyfun@sh.itjust.works 1 points 6 months ago

Aight I'm just dumb then. Now the question is who the fuck thought this was a good idea? Probably someone so naive they thought it'd make time conversions easy...

[–] snowe@programming.dev 0 points 6 months ago (1 children)

Unix time fails to work for the 'simple' case of timezones entirely. It's not meant for timezone based data and therefore unixtime in one timezone subtracted from unix time in another timezone will most likely give completely incorrect results. Even in the same timezone it will give incorrect results, see the 'simple' case of a country jumping across the international date line. Typically they skip entire days, none of which unix time will account for, as that would require not just time zone data, but location data as well.

[–] azertyfun@sh.itjust.works 3 points 6 months ago

You misunderstand what Unix Time is. It's the number of seconds since 1970-01-01T00:00+00:00. It's always relative to UTC. And the number of seconds since epoch is always the same regardless of where you are on Earth.

As I write this it's 1702600950 for you, for me, and in Sydney. Timezones (and DST, and leap seconds, and other political fuckery) only play a role once you want to convert 1702600950 into a "human" datetime. It corresponds to 2023-12-15 00:46:02 UTC and 2023-12-14 16:46:02 PST (and the only sane and reliable way to do the conversion is to use a library which depends on the tzdata).
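
A quick Python sketch of that point, using a round timestamp close to the one above (zoneinfo reads the system tzdata):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

ts = 1702598400  # one instant, same number everywhere on Earth

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
la = datetime.fromtimestamp(ts, tz=ZoneInfo("America/Los_Angeles"))

print(utc.isoformat())  # 2023-12-15T00:00:00+00:00
print(la.isoformat())   # 2023-12-14T16:00:00-08:00 (PST)
print(utc == la)        # True: same instant, two human representations
```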

[–] codapine@lemm.ee 10 points 6 months ago (1 children)

ISO 8601, it's the only way.

[–] tetris11@lemmy.ml 13 points 6 months ago

The past is the past. Everything that happened before time t_now should be set to Inf. I thank you for your ears.

[–] amda@feddit.nl 7 points 6 months ago (1 children)

Or anyone who has worked with general relativity

[–] CanadaPlus@futurology.today 2 points 6 months ago* (last edited 6 months ago)

Actually, while mathematically heavy, it's easy to measure in GR, assuming you've got a metric solved. (If you don't, you're fucked. That shit is intractable to the point where you can name every exact solution on one page, and inexact solutions can just be lies.) However, you may have to ask additional questions about what sort of time you want, which probably stems from why you need it.

[–] xmunk@sh.itjust.works 5 points 6 months ago

T2 - T1 = 'a while'

[–] sebsch@discuss.tchncs.de 5 points 6 months ago (1 children)

I mean, as long you only need the delta in milliseconds it's easy. Just count the milliseconds from 1970 to the event. The problem starts when you want to have a human readable representation.

It's calendars that suck, not time.

[–] mormegil@programming.dev 6 points 6 months ago

Well... unless you measure the number of [milli]seconds using something like time_t, which lies because of leap seconds. I.e. even such a seemingly simple interface, in fact, includes a calendar.

[–] CanadaPlus@futurology.today 4 points 6 months ago* (last edited 6 months ago)

Re: The mouseover text, is there a standard frame of reference for really general space stuff? I propose a frame comoving with the CMB and reaching the center of the Earth at Epoch 0 if not.

[–] Amaltheamannen@lemmy.ml 3 points 6 months ago

Sounds like a distributed systems problem. While the exact time between events may be impossible to determine (you can't guarantee clocks are synchronized), you can use a logical clock to get a causal ordering.
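
A minimal Lamport-clock sketch in Python makes "logical clock" concrete (the names here are illustrative, not from any particular library):

```python
class LamportClock:
    """Causal ordering of events without assuming synchronized wall clocks."""

    def __init__(self):
        self.time = 0

    def tick(self):
        """Local event: advance the counter."""
        self.time += 1
        return self.time

    def send(self):
        """Stamp an outgoing message with the current logical time."""
        return self.tick()

    def receive(self, msg_time):
        """Merge the sender's clock on receipt, so receive > send."""
        self.time = max(self.time, msg_time) + 1
        return self.time


a, b = LamportClock(), LamportClock()
a.tick()               # local event on A -> 1
stamp = a.send()       # A sends a message -> 2
b.receive(stamp)       # B receives it -> 3; B's event "happens after" A's send
print(a.time, b.time)  # 2 3
```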

[–] neuracnu@lemmy.blahaj.zone 3 points 6 months ago (2 children)

.NET has a TimeSpan data type specifically for this sort of thing.

[–] BehindTheBarrier@programming.dev 3 points 6 months ago

After using it, coming to Python and not having a super easy way to work with dates is a pain.

But DateTime in .NET has horrible timezone support. It's essentially either local timezone, no timezone, or UTC. And the UTC part is somewhat rough. There's DateTimeOffset and the like, but they too just don't make working with timezones easy.

[–] CanadaPlus@futurology.today 1 points 6 months ago* (last edited 6 months ago)

I'm guessing it's not alone. Every time format should come with a distance function and order function, or equivalent. In real life, that could mean something like subtraction.

Unfortunately, "should" isn't always enough. Optimally there's also type structure to the return of the function so you can't mix up seconds and days, or calendar and (one of the) standard length days.

[–] tias@discuss.tchncs.de 3 points 6 months ago* (last edited 6 months ago) (1 children)

I feel like this is a solved and simple problem as long as there are no relativistic effects. Just make sure t1 and t2 are represented as seconds since a known reference time, e.g. Unix epoch, and make sure that measure is accurate. You don't need to bring the Gregorian calendar into it, use TAI represented as an integer.
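
Outside of a full TAI setup, the single-machine version of that idea is a monotonic clock: a uniformly ticking counter that ignores calendars, time zones, and wall-clock adjustments entirely. A Python sketch:

```python
import time

# For measuring elapsed time on one machine, a monotonic clock sidesteps
# NTP steps, DST changes, and manual clock edits: it only ever moves forward.
t1 = time.monotonic_ns()
total = sum(range(100_000))  # stand-in for the work being timed
t2 = time.monotonic_ns()

elapsed_s = (t2 - t1) / 1e9
print(elapsed_s >= 0)  # True: monotonic time never goes backwards
```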

[–] jol@discuss.tchncs.de 3 points 6 months ago (1 children)

Until you need to decide how many months are between t1 and t2, and then all answers are wrong.

[–] tias@discuss.tchncs.de 2 points 6 months ago* (last edited 6 months ago)

To do that you first need to choose a calendar and a time zone, then convert to that representation. It can be done, but you need a good implementation that understands the entire history of what has transpired w.r.t. date conventions in that location and culture. For timestamps in the future it is impossible to do correctly, since you can't know how date conventions will change in the future.

However, I should add that as far as mathematical operations go, calculating the number of months between t1 and t2 is an entirely different thing than the duration of time that passed between those timestamps. Even if it is expressed similarly in the English language, semantically it's something else. It's like asking "how many kilometers did your car go" vs "how many houses did the car pass on the way".
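
A small Python illustration of that distinction (the month-counting convention here is one arbitrary choice among several defensible ones):

```python
from datetime import datetime


def months_between(d1, d2):
    # One possible convention: count calendar-month boundaries crossed,
    # ignoring the day of month entirely.
    return (d2.year - d1.year) * 12 + (d2.month - d1.month)


a = datetime(2023, 1, 31)
b = datetime(2023, 3, 1)

print(months_between(a, b))  # 2 "months" apart on the calendar...
print((b - a).days)          # ...but only 29 days of elapsed time
```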