Nah, they will do what they always do. Change some system environment variables to move the zero date on until after they would have retired.
Nobody wants to touch the original code, it was developed in the 1970s
In this thread: mostly people that don’t know how timekeeping works on computers.
This is already something that we’re solving for. At this point, it’s like 90% or better, ready to go.
See: https://en.m.wikipedia.org/wiki/Year_2038_problem
Timekeeping, commonly, is stored as a binary number that represents how many seconds have passed since midnight (UTC) on January 1st, 1970. The year 10,000 doesn’t fall on a power-of-two number of seconds after the epoch (1970-01-01T00:00:00Z), i.e. 2^x seconds for some integer x, so it isn’t an overflow boundary; any discrepancy in the use of “year” as a 4-digit number vs. a 5-digit number is entirely a display issue (front end). The thing that does the actual processing, storing and evaluation of time gives absolutely no fucks about what “year” it is, because the current datetime is a binary number representing the seconds since epoch.
Whether that is displayed to you correctly or not doesn’t matter in the slightest. The machine will keep functioning even if you see some weird shit, like the year reading “99 100” because some lazy person decided to hard-code “99” as the first two digits, then take the current year, subtract 9900, and display whatever is left over (so 9999 leaves “99” and 10000 leaves “100”), meaning the date shown becomes 99 concatenated with the last two (now three) leftover digits.
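If you want to see it concretely, here’s a quick Python sketch (standard library only, nothing from this thread) showing that the counter sails right past year 10000 and only the formatting layer complains:

```python
from datetime import datetime, timezone

# Last second that still fits in a 4-digit year.
last_4digit = int(datetime(9999, 12, 31, 23, 59, 59, tzinfo=timezone.utc).timestamp())
year_10000 = last_4digit + 1            # first second of year 10000

print(last_4digit)   # 253402300799
print(year_10000)    # 253402300800 -- just the next integer
print(2**63 - 1)     # 9223372036854775807 -- nowhere near the counter's limit

# Only the display layer cares: Python's own datetime stops at year 9999,
# so asking it to *render* that perfectly ordinary number is what fails.
try:
    datetime.fromtimestamp(year_10000, tz=timezone.utc)
except (ValueError, OverflowError, OSError) as err:
    print("formatting refused it:", err)
```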
I get that it’s a joke, but the joke isn’t based on any technical understanding of how timekeeping works in technology.
The whole Y2K thing was a bunch of fear-mongering horse shit. For most systems, the year would have shown as “19-100”, 1900, or simply “00” (or some variant thereof).
More of a front-end issue actually; almost all time is just stored as the number of seconds since 00:00:00 Jan 1, 1970.
And it’s represented as a 64-bit value, which covers over 500 billion years.
64-bit value
… About that… https://en.wikipedia.org/wiki/Year_2038_problem
That’s the 32-bit timestamp
Actual programmers wondering why this joke doesn’t mention 65535…
“We’re being short-sighted”
Lol Picard maneuver. Pretty sure your opinion wasn’t asked for.
Oh, just start at 0000 again and designate that as 10,000. Files didn’t start until like 1979 anyways, and there can’t be many left, and even if it is a problem, now you have 2000 years to not worry about it.
I wonder how Voyagers’ code represents time
It just counts up, according to this answer.
We’re being short-sighted
Tell that to the billionaires speed-running terraforming this planet into a barren wasteland.
The Butlerian Jihad will have happened by then.
Y10K.
The trick is to unplug our computer a few seconds before midnight on December 31st, 9999 and then plug in the wire again
Good news! We’ll be extinct long before this happens. One less thing to worry about!
Seems hyperbolic to assume we will be extinct by 9999.
Sure, we’re heading for a climate crisis, but I don’t think all humans will be dead; just the poorest.
That has forever been the fallacy.
The poor won’t die in the apocalypse leaving only the rich behind. The poor will die, and the rich will be faced with the harsh reality that they needed an army of poor working under them to sustain themselves, leading them to all die within the generation.
That’s true until it isn’t. Automation is on its way. Marching ever onward.
The factory I work in built a new building this year that employs a quarter as many workers as the next newest one and does 2.5x the output.
In 9999, this meme will be problematic because it assumes the entire galaxy conforms to an Earth-based calendar system.
Well the USA is on Earth so obviously the earth calendar is the default.
Still set by London 😂
Awww shit, time to rewatch my favourite Jike Mudge movie starring Lon Rivingston: Space Office (9999).
Haha, I can’t believe this guy has the job of manually changing all the dates in the company’s database, this place sucks. I bet the past was way better.
What about the year TREE(3)?
Ugh, I definitely don’t have the bandwidth to support anything beyond the year Graham’s number.
“How many years is that?”
“At least THIS many.” (holds up four Knuth up-arrow notation fingers)
I don’t think the year 10000 is a problem. There is a real “year 2038 problem” that affects systems storing Unix time in a signed int32, but it’s mostly solved already. The next problem will be in year 33000 or something like that.
I’ve been curious about that myself. On one hand, it still seems far away. On the other hand, it’s a bit over 13 years away now and I have gear actively in use that’s older than that today.
Is there an ELI5?
A common method of storing dates is the number of seconds since midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).
A 32-bit signed integer means it can store numbers from −2^31 through 2^31 − 1 (the subtracted one comes from zero effectively counting as a positive number for these purposes). 2^31 − 1 seconds added to Jan 1, 1970 gets you to Jan 19, 2038.
The solution is to jump to 64-bit integers, but as with Y2K, there’s a lot of old systems that need to be updated to 64-bit integers (and no, they don’t necessarily have to have 64-bit CPUs to make that work). For the most part, this has been done already. That would put the date out to 292,277,026,596 CE. Which is orders of magnitude past the time for the sun to turn into a red giant.
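If it helps to see it, here’s a rough Python sketch (standard library only, nothing specific to any particular system) of both halves of that: where the 32-bit counter tops out, and where it lands if it wraps:

```python
from datetime import datetime, timezone
import ctypes

max_int32 = 2**31 - 1                                    # 2,147,483,647
print(datetime.fromtimestamp(max_int32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00  <- the last second a signed 32-bit counter can hold

wrapped = ctypes.c_int32(max_int32 + 1).value            # one second later, if the field stays 32 bits
print(wrapped)                                           # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00  <- where the wrapped value lands
```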
Maybe it’s not LI5, but I certainly enjoy your explanation for including several important facts and context. I respect your skill and knowledge, dear internet stranger.
Unix computers store time in seconds that have passed since January 1st, 1970. Once there have been too many seconds since 1970, it starts breaking. “Signed” is a way to store negative numbers in binary. The basics of it are: when the leftmost bit is a 1, it’s a negative number (and then you do some other things to the rest of the number so that it acts like a negative number). So once the counter reads 0111…111 seconds since 1970, one more second flips it to 1000…000, which the computer sees as a huge negative number.
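Roughly, in code (a toy Python sketch of a 32-bit two’s-complement counter, not anything a real kernel runs):

```python
# Toy model of a 32-bit two's-complement seconds counter at the rollover point.
largest  = 0b01111111_11111111_11111111_11111111    # sign bit 0 -> +2,147,483,647
one_more = (largest + 1) & 0xFFFFFFFF               # add a second; sign bit flips to 1

# Reinterpret that same bit pattern as a signed number, the way the hardware does.
as_signed = one_more - 2**32 if one_more & 0x80000000 else one_more

print(largest)     #  2147483647   (03:14:07 UTC, 19 Jan 2038)
print(as_signed)   # -2147483648   (suddenly a date back in 1901)
```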
it’s mostly solved already
I wish I believed this. Or I guess I agree that it is solved in most software, but there is lots of commonly used software where it isn’t. One broken bit of software can fairly easily take down a whole site or OS.
Try to create an event in 2040 in your favourite calendar. There is a decent chance it isn’t supported. I would say most calendar servers support it, but the frontends often don’t or vice-versa.
Well, I looked at a Year 10000 problem less than 2 hours ago. We’re parsing logs to extract the timestamp and for that, we’re using a regex which starts with:
\d{4}-\d{2}-\d{2}
So, we assume there to always be 4 digits for the year. Can’t use it if you live in the year 10000 or beyond, nor in the year 999 or before.
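For what it’s worth, one possible loosening (hypothetical, using Python’s re module, and assuming the timestamp sits at the start of the line like in your pattern):

```python
import re

strict  = re.compile(r"\d{4}-\d{2}-\d{2}")   # what we use today: exactly 4 year digits
relaxed = re.compile(r"\d{4,}-\d{2}-\d{2}")  # hypothetical tweak: 4 *or more* digits

print(bool(strict.match("2025-06-01 12:00:00 INFO ok")))        # True
print(bool(strict.match("10000-01-01 00:00:03 ERROR uh oh")))   # False
print(bool(relaxed.match("10000-01-01 00:00:03 ERROR uh oh")))  # True
# (Year 999 and earlier would still need padding or yet another pattern.)
```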
Just start over at year 0000 AT (after ten thousand)
The ISO time standard will certainly need to be redone
Do you think so? Surely, it’s able to handle dates before the year 999 correctly, so I’d also expect it to handle years beyond 10000. The
\d{4}
is just our bodged assumption, because, well, I have actually never seen a log line with a year that wasn’t 4 digits…

Kinda?
Each date and time value has a fixed number of digits that must be padded with leading zeros.
To represent years before 0000 or after 9999, the standard also permits the expansion of the year representation but only by prior agreement between the sender and the receiver.[21] An expanded year representation [±YYYYY] must have an agreed-upon number of extra year digits beyond the four-digit minimum, and it must be prefixed with a + or − sign[22] instead of the more common AD/BC (or CE/BCE) notation; by convention 1 BC is labelled +0000, 2 BC is labeled −0001, and so on.[23]
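So, if I’m reading that right, an expanded representation with two agreed-upon extra digits would look something like this (quick Python sketch, the helper name is made up):

```python
def iso_expanded_year(year: int, extra_digits: int = 2) -> str:
    # Made-up helper: mandatory sign + (4 + extra_digits) zero-padded digits.
    width = 4 + extra_digits
    sign = "+" if year >= 0 else "-"
    return f"{sign}{abs(year):0{width}d}"

print(iso_expanded_year(10000))   # +010000
print(iso_expanded_year(0))       # +000000  (1 BC, by the convention quoted above)
print(iso_expanded_year(-1))      # -000001  (2 BC)
```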
Oh wow, I really expected the standard to just say that however many digits you need are fine, because you know, maths. But I guess, this simplifies handling all kinds of edge cases in the roughly 7975 years we’ve still got.
Luckily I’ll be retired by then.
It’s going to be significantly more than the year 33000 before we run out of 64-bit epoch timestamps.
The max value for signed 64-bit epoch values is more than 292 billion years away, or 20 times the age of the universe itself.
So yeah, we’re basically solid forever with 64-bit
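Back-of-the-envelope check (Python, with rough figures for the year length and the universe’s age):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~31.6 million seconds

max_64bit_seconds = 2**63 - 1
years_of_range = max_64bit_seconds / SECONDS_PER_YEAR
print(f"{years_of_range:.2e} years")           # ~2.92e+11, i.e. ~292 billion years

AGE_OF_UNIVERSE_YEARS = 13.8e9                 # rough figure
print(years_of_range / AGE_OF_UNIVERSE_YEARS)  # ~21 times
```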
There are so many problems there is an entire Wikipedia page dedicated to them.
I’m pretty certain most of my work inevitably ends up being related to a time issue
Yes, there are random systems using every kind of smart or brain-dead option out there.
But the 2038 problem impacts the previous standard, and the current one will take ages to fail. (No, it’s not 33000, unless you are using some variant of the standard that counts nanoseconds instead of seconds. Those usually have more bits nowadays, but some odd older systems do it on the same 64 bits from the standard.)
It’s a UX problem rather than a date format problem at that point. Many form fields require exactly 4 digits.