• Lucy :3@feddit.org · 139 points · 9 days ago

    Programmers in 292,271,023,045 after uint64_t isn’t enough for the unix timestamp anymore:

    • Agent641@lemmy.world · 17 points · 8 days ago

      Programmers dealing with the timezones of asymmetric period binary and trinary star systems once we go interstellar 💀

      • Lucy :3@feddit.org · 8 points · 8 days ago

        Don’t worry, we’ll be extinct soon, hopefully. Maybe even before int32_t runs out. Unfortunately not soon enough to stop humanity’s impact on Earth before the worst damage is done.

        • BlanketsWithSmallpox@lemmy.world · 4 points · 8 days ago

          I’ll let you in on a secret.

          Humanity and the animals that we like will get through just fine.

          Humans in general and the vast majority of biodiversity will be fucked if it ever happens.

          I firmly believe it won’t. Too many good people in the world doing far more than the shitty ones.

  • Rusty@lemmy.ca · 95 points · 9 days ago

    I don’t think year 10000 is a problem. There is a real “year 2038 problem” that affects systems storing Unix time as a signed int32, but it’s mostly solved already. The next problem will be in year 33000 or something like that.

      • marcos@lemmy.world · 14 points · 9 days ago

        Yes, there are random systems using every kind of smart or brain-dead option out there.

        But the 2038 problem impacts the previous standard, and the current one will take ages to fail. (No, it’s not 33000, unless you are using some variant of the standard that counts nanoseconds instead of seconds. Those usually have more bits nowadays, but some odd older systems do it on the same 64 bits from the standard.)

    • Ephera@lemmy.ml · 22 points · 9 days ago

      Well, I looked at a Year 10000 problem less than 2 hours ago. We’re parsing logs to extract the timestamp and for that, we’re using a regex which starts with:

      \d{4}-\d{2}-\d{2}
      

      So we assume the year always has exactly 4 digits. The pattern can’t cope if you live in the year 10000 and beyond, nor in the year 999 and before.
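
      A quick sketch of the fix, assuming Python's `re` (the variable names here are made up): allowing four *or more* year digits keeps the match working past the year 9999.

```python
import re

# Original assumption: exactly four year digits.
strict = re.compile(r"\d{4}-\d{2}-\d{2}")

# Relaxed: four or more digits also covers the year 10000 and beyond.
relaxed = re.compile(r"\d{4,}-\d{2}-\d{2}")

print(strict.fullmatch("10000-01-01"))         # None -- the strict pattern fails
print(bool(relaxed.fullmatch("10000-01-01")))  # True
```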

        • Ephera@lemmy.ml · 5 points · 9 days ago

          Do you think so? Surely, it’s able to handle dates before the year 999 correctly, so I’d also expect it to handle years beyond 10000. The \d{4} is just our bodged assumption, because well, I have actually never seen a log line with a year that wasn’t 4 digits…

          • itslilith@lemmy.blahaj.zone · 8 points · 9 days ago

            Kinda?

            Each date and time value has a fixed number of digits that must be padded with leading zeros.

            To represent years before 0000 or after 9999, the standard also permits the expansion of the year representation but only by prior agreement between the sender and the receiver.[21] An expanded year representation [±YYYYY] must have an agreed-upon number of extra year digits beyond the four-digit minimum, and it must be prefixed with a + or − sign[22] instead of the more common AD/BC (or CE/BCE) notation; by convention 1 BC is labelled +0000, 2 BC is labeled −0001, and so on.[23]

            • Ephera@lemmy.ml · 5 points · 9 days ago

              Oh wow, I really expected the standard to just say that however many digits you need are fine, because you know, maths. But I guess, this simplifies handling all kinds of edge cases in the roughly 7975 years we’ve still got.

    • Pennomi@lemmy.world · 12 points · 9 days ago

      It’s a UX problem rather than a date format problem at that point. Many form fields require exactly 4 digits.

    • GissaMittJobb@lemmy.ml · 7 points · 9 days ago

      It’s going to be significantly more than the year 33000 before we run out of 64-bit epoch timestamps.

      The max value for signed 64-bit epoch values is more than 292 billion years away, or about 20 times the age of the universe itself.

      So yeah, we’re basically solid forever with 64-bit
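
      That figure is easy to sanity-check (a quick Python sketch, using the Julian year of 365.25 days):

```python
# Largest value a signed 64-bit seconds counter can hold.
max_i64 = 2**63 - 1

# Convert seconds to years using the Julian year (365.25 days).
seconds_per_year = 365.25 * 24 * 60 * 60
years = max_i64 / seconds_per_year

print(f"{years:.3e}")  # about 2.923e+11, i.e. roughly 292 billion years
```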

      • frezik@midwest.social · 1 point · 9 days ago

        33,000 would come from other programs that store the year as a 16-bit signed int. Year 32,768, to be precise.
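
        The wraparound is easy to demonstrate; a Python sketch using `ctypes` to mimic a C `int16_t` (ctypes does no overflow checking, so the bits wrap just as they would in C):

```python
import ctypes

# A year stored in a 16-bit signed integer, as some older formats do.
year = ctypes.c_int16(32767)  # the largest representable year

# One more year and the value wraps to the negative extreme.
year.value += 1
print(year.value)  # -32768
```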

    • kevincox@lemmy.ml · 5 points · 9 days ago

      it’s mostly solved already

      I wish I believed this. I guess I agree that it is solved in most software, but there is lots of commonly used software where it isn’t. One broken bit of software can fairly easily take down a whole site or OS.

      Try to create an event in 2040 in your favourite calendar. There is a decent chance it isn’t supported. I would say most calendar servers support it, but the frontends often don’t or vice-versa.

    • toddestan@lemm.ee · 3 points · 8 days ago

      I’ve been curious about that myself. On one hand, it still seems far away. On the other hand, it’s a bit over 13 years away now and I have gear actively in use that’s older than that today.

      • frezik@midwest.social · 13 points · 9 days ago

        A common method of storing dates is the number of seconds since midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

        A 32-bit signed integer means it can store numbers from −2^31 through 2^31 − 1 (subtracting one comes from zero effectively taking a slot on the positive side). 2^31 − 1 seconds added to Jan 1, 1970 gets you to Jan 19, 2038.

        The solution is to jump to 64-bit integers, but as with Y2K, there’s a lot of old systems that need to be updated to 64-bit integers (and no, they don’t necessarily have to have 64-bit CPUs to make that work). For the most part, this has been done already. That would put the date out to 292,277,026,596 CE. Which is orders of magnitude past the time for the sun to turn into a red giant.
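
        The arithmetic checks out (a quick Python sketch):

```python
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
max_i32 = 2**31 - 1  # largest signed 32-bit value, in seconds

# Adding the maximum second count lands on the rollover moment.
print(epoch + timedelta(seconds=max_i32))  # 2038-01-19 03:14:07+00:00
```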

        • pfm@scribe.disroot.org · 2 points · 8 days ago

          Maybe it’s not LI5, but I certainly enjoy your explanation for including several important facts and context. I respect your skill and knowledge, dear internet stranger.

        • gandalf_der_12te@discuss.tchncs.de · 1 point · 8 days ago

          midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

          Well, not so much: as far as I remember, the first end-user computers became available around 1971 or 1972, and the Internet also underwent rapid development in that period, so the date has a certain reasoning to it.

      • teije9@lemmy.blahaj.zone · 8 points · 9 days ago

        Unix computers store time as the number of seconds that have passed since January 1st, 1970. Once there have been too many seconds since 1970, it starts breaking. ‘Signed’ is a way to store negative numbers in binary. The basics of it: when the leftmost bit is a 1, it’s a negative number (and then you do some other things to the rest of the number so that it acts like a negative number). So when the counter reads 01111111 (the largest positive value), one more second makes it 10000000, which a computer sees as a negative number.
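
        The sign flip can be shown by reinterpreting the same bits, sketched here in Python with `struct`:

```python
import struct

max_i32 = 2**31 - 1  # binary 01111111 11111111 11111111 11111111

# Add one second, then reinterpret the same 32 bits as signed.
raw = struct.pack("<I", (max_i32 + 1) & 0xFFFFFFFF)
wrapped = struct.unpack("<i", raw)[0]
print(wrapped)  # -2147483648
```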

    • JackbyDev@programming.dev · 1 point · 8 days ago

      I don’t think it will be a problem because it’s 8,000 years away lol, but people do store time in ISO 8601 strings.

  • Gork@lemm.ee · 56 points · 9 days ago

    There might be a new calendar year system by then. Probably some galactic dictator who says that the beginning of their rule is now Year Zero.

    Year Zero of the Glorious Zorg Empire!

    • ERROR: Earth.exe has crashed@lemmy.dbzer0.com · 14 points · 9 days ago

      Lol China used to use “Year 1” right after the Xinhai Revolution.

      It’s “民国” (ROC) followed by the year number.

      Example: 民国一年, ROC Year One (aka 1912).

      (ROC stands for Republic of China, btw.)

      Then the communists kicked the KMT out, and I think the ROC government in exile in Taiwan stopped using it.

      • randint@lemmy.frozeninferno.xyz · 3 points · 8 days ago

        and I think the ROC government in exile in Taiwan stopped using it.

        Actually it is still used. It’s everywhere in legal documents, government documents and stuff. Though people more commonly say 2024 instead of 民國113年.

      • Donkter@lemmy.world · 33 points · 9 days ago

        That has forever been the fallacy.

        The poor won’t die in the apocalypse leaving only the rich behind. The poor will die, and the rich will be faced with the harsh reality that they needed an army of poor working under them to sustain themselves, leading them to all die within the generation.

        • DogWater@lemmy.world · 5 points · 8 days ago

          That’s true until it isn’t. Automation is on its way, marching ever onward.

          The factory I work in built a new building this year that employs a quarter as many workers as the next-newest one and does 2.5x the output.

          • GenosseFlosse@feddit.org · 2 points · 8 days ago

            You still need loaders, drivers, and retailers to get anything to the customer. A lot of rich ski and holiday towns can’t staff their stores and cafés, because the employees can’t afford rent in the same towns, so they face a similar issue…

            • explodicle@sh.itjust.works · 2 points · 8 days ago

              Amazon is already testing robotic loaders, self driving trucks are already in development, and vending machines retail everything in Japan.

              • GenosseFlosse@feddit.org · 1 point · 7 days ago

                Maybe, but there are still a lot of invisible people involved to get the food all the way to your table. And small suppliers cannot afford to switch their whole operation to robots.

    • dutchkimble@lemy.lol · 6 points · 8 days ago

      The trick is to unplug our computer a few seconds before midnight on December 31st, 9999 and then plug in the wire again

  • chetradley@lemm.ee · 34 points · 9 days ago

    In 9999, this meme will be problematic because it assumes the entire galaxy conforms to an Earth-based calendar system.

  • MystikIncarnate@lemmy.ca · 33 points · 8 days ago

    In this thread: mostly people that don’t know how timekeeping works on computers.

    This is already something that we’re solving for. At this point, it’s like 90% or better, ready to go.

    See: https://en.m.wikipedia.org/wiki/Year_2038_problem

    Timekeeping is commonly stored as a binary number that represents how many seconds have passed since midnight (UTC) on January 1st, 1970. Since the year 10,000 isn’t some power of two (2^x) seconds away from the epoch (1970-01-01T00:00:00Z), any discrepancy between “year” as a 4-digit number vs. a 5-digit number is entirely a display issue (front end). The thing that does the actual processing, storing, and evaluation of time gives absolutely no fucks about what “year” it is, because the current datetime is a binary number representing the seconds since the epoch.

    Whether that is displayed to you correctly or not doesn’t matter in the slightest. The machine will function even if you see some weird shit, like the year reading “99100” because some lazy person hard-coded the display to show “99” as the first two digits, followed by the current year minus 9900 (so the remainder is “99” for the year 9999 and “100” for the year 10000, and the date becomes “99” concatenated with the two, now three, digits left over).

    I get that it’s a joke, but the joke isn’t based on any technical understanding of how timekeeping works in technology.

    The whole Y2K thing was a bunch of fear-mongering horse shit. For most systems, the year would have shown as “19-100”, 1900, or simply “00” (or some variant thereof).

    Edit: the image in the OP is also a depiction of me reading replies. I just can’t even.

    • FooBarrington@lemmy.world · 17 points · 8 days ago

      My brother in Christ, there’s more to time than just storing it. Every datetime library I’ve ever used only documents formatting/parsing support up to four year digits. If they suddenly also supported five digits, I guarantee it will lead to bugs in handling existing dates, as not all date formats could still be parsed unambiguously.

      It won’t help you if time is stored perfectly, while none of your applications support it.

      Regarding Y2K, it wasn’t horse shit - thousands upon thousands of developer hours were invested to prevent these issues before they occurred. Had they not done so, a bunch of systems would have broken, because parsing time isn’t just about displaying 19 or 20.

      • JackbyDev@programming.dev · 5 points · 8 days ago

        The comment you’re replying to is really frustrating to me. It annoys me when people are so arrogant but also wrong. Do they live in a perfect world where nobody stores dates as ISO 8601 strings? I’ve seen that tons of times. Sometimes, it may even be considered the appropriate format when using stuff like JSON based formats.

        • FooBarrington@lemmy.world · 2 points · 8 days ago

          I’m 100% with you - it’s the dangerous level of knowledge where someone understands the technical background for the most part, but is lacking real world experience. Reminds me of the blog posts titled “Misconceptions programmers have about X” - almost everything we touch in IT is complicated if you get deep enough.

          But their style of commenting really jibes with Lemmy on technical topics. I can’t count the number of posts where people proudly shout fundamentally wrong explanations of current AI models, yet any corrections are downvoted to oblivion. It’s not as bad on non-AI topics, but I can’t imagine anyone in the field reading the GP’s comment and agreeing…

      • friendlymessage@feddit.org · 1 point · 8 days ago

        I would hope that these kinds of parsers are not used in critical applications that could actually lead to catastrophic events, that’s definitely different to Y2K. There would be bugs, yes, but quite fixable ones.

        Regarding Y2K, it wasn’t horse shit - thousands upon thousands of developer hours were invested to prevent these issues before they occurred. Had they not done so, a bunch of systems would have broken, because parsing time isn’t just about displaying 19 or 20.

        “There’s no glory in prevention.” I guess it’s hard to grasp nowadays that mankind at some point actually tried to stop catastrophes from happening, and succeeded.

        • FooBarrington@lemmy.world · 5 points · 8 days ago

          Even if such parsers aren’t used directly in critical systems, they’ll surely be used in the supply chains of critical systems. Your train won’t randomly derail, but disruptions in the supply chain can cause repair parts not to be delivered, that kind of thing.

          And you can be certain such parsers are used in almost every application dealing with datetimes that hasn’t been specifically audited or secured. 99% of software is held together with duct tape.

          • friendlymessage@feddit.org · 1 point · 8 days ago

            True. But I wouldn’t see this as extremely more critical than the hundreds of other issues we encounter daily in software. Tbh, I’d be glad if some of the software I have to use daily had more duct tape on it…

            • FooBarrington@lemmy.world · 4 points · 8 days ago

              I think you might be underestimating the potential impact.

              Remember the Crowdstrike Windows BSOD? It caused billions in damages, and it’s the absolute best case scenario for this kind of issue. Our potential Y10K bug has a bunch of additional issues:

              • you don’t just have to patch one piece of software, but potentially all software ever written that’s still in use, a bunch of which won’t have active maintainers
              • hitting the bug won’t necessarily cause crashes (which are easy to recognize), it can also lead to wrong behavior, which will take time to identify. Now imagine hundreds of companies hitting the bug in different environments, each with their own wrong behavior. Can you imagine the amount of continuous supply chain disruptions?
              • fixes have to be thought about and implemented per-application. There’s no panacea, so it will be an incredible amount of work.

              I really don’t see how this scenario is comparable to anything we’ve faced, beyond Y2K.

    • friendlymessage@feddit.org · 16 points · 8 days ago

      Y2K was definitely not only fear-mongering. Windows systems did not use Unix timestamps, many embedded systems didn’t either, and neither did COBOL programs. So your explanation isn’t relevant to this problem specifically, and these systems were absolutely affected by Y2K because they stored time differently. The reason we didn’t have a catastrophic event was the preventative action taken.

      Nowadays you’re right, there will be no Y10K problem, mainly because storage is not the issue it was in the 60s and 70s when the affected systems were designed. Back then every bit of storage was precious and therefore omitted when not necessary. Nowadays there’s no issue even for embedded systems in setting aside 64 bits for timekeeping, which moves the problem to 292277026596-12-04 15:30:08 UTC (with one-second precision), and by then we just add another bit to double the range, or are dead because the sun exploded.

      • chiliedogg@lemmy.world · 3 points · 8 days ago

        The Microsoft Zune had a Y2K9 bug, a lingering leap-year clock issue from the extra day in February 2008, that caused them to crash HARD on Jan 1, 2009. I remember it being a pretty big PITA getting it back up and running.

      • LovableSidekick@lemmy.world · 1 point · 7 days ago

        Not a storage problem but still a possible problem in UIs and niche software that assumes years have 4 digits or 4 characters. But realistically if our civilization is even still around then AI will be doing all that for us and it won’t be an issue humans even notice.

        • AppaYipYip@lemmy.world · 2 points · 7 days ago

          This would make a great short story: for some reason the AI didn’t realize there was going to be a date issue and didn’t properly update itself, causing it to crash. The problem is that it had been self-sufficient for so long that no humans know how to restart it or fix the issue, causing society to have a technology blackout for the first time in centuries.

          • LovableSidekick@lemmy.world · 1 point · 7 days ago

            Yes, it’s kind of a familiar sci-fi trope - a supercomputer that has no built-in recovery mechanism in spite of being vitally important. Like the Star Trek episode where they made smoke come out of a robot’s head by saying illogical things.

    • JackbyDev@programming.dev · 11 points · 8 days ago

      You need to qualify your statement about Y2K being fear mongering. People saying all technology would stop (think planes crashing out of the sky) were clearly fear mongering or conspiracy theorists. People saying certain financial systems needed to be updated so loans didn’t suddenly go from the year 1,999 to 19,100 or back to 1900 were not fear mongering. It’s only because of a significant amount of work done by IT folks that we have the luxury of looking back and saying it was fear mongering.

      Look at this Wikipedia page for documented errors. One in particular was at a nuclear power plant. They were testing their fix but accidentally applied the new date to the actual equipment. It caused the system to crash. It took seven hours to get back up and they had to use obsolete equipment to monitor conditions until then. Presumably if the patch wasn’t applied this would happen at midnight on January 1st 2000 too.

      Y2K was a real problem that needed real fixes. It just wasn’t an apocalyptic scenario.
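
      The 19,100 figure comes from C's `struct tm`, whose `tm_year` field counts years since 1900; buggy code printed a literal “19” in front of that counter. A minimal sketch of the mistake:

```python
# tm_year counts years since 1900, so in the year 2000 it holds 100.
tm_year = 2000 - 1900

print("19" + str(tm_year))  # buggy string concatenation: 19100
print(1900 + tm_year)       # correct arithmetic: 2000
```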

      • dx1@lemmy.world · 5 points · 8 days ago

        Planes crashing out of the sky wouldn’t have been inconceivable. Say you have two air traffic control systems that are synchronizing - one handles dates with a modulo 100 (00-99, i.e. 1900-1999), another handles them in epoch time. All of a sudden the two reported time + positions of two different planes don’t match up by a century, and collision projection software doesn’t work right. I’ve seen nastier bugs than that, in terms of conceptual failure.

        At no point is that a theory about a “conspiracy” either, IDK why you’re bandying that term around.

        • JackbyDev@programming.dev · 1 point · 8 days ago

          At no point is that a theory about a “conspiracy” either, IDK why you’re bandying that term around.

          Conspiracy is probably the wrong term. What I mean is that some (keyword: some) predictions were quite extreme and apocalyptic. See the fringe group response section for examples of what I was trying to convey.

          The New York Times reported in late 1999, “The Rev. Jerry Falwell suggested that Y2K would be the confirmation of Christian prophecy – God’s instrument to shake this nation, to humble this nation. The Y2K crisis might incite a worldwide revival that would lead to the rapture of the church. Along with many survivalists, Mr. Falwell advised stocking up on food and guns”.

          That’s what I meant by the sort of “conspiratorial” response. Maybe I should reword my post to make it more clear?

      • MystikIncarnate@lemmy.ca · 3 points · 8 days ago

        You’re spot on. The vast majority of news coverage and “hype” from the general public relating to Y2K was all horse shit, but there were critical systems that did have issues and needed some work.

        For the most part, the whole 19100 issue was a display bug and likely wouldn’t have caused problems, and the same goes for 1900. Those are the examples people generally saw at banks and whatnot; it would look weird, but mostly wouldn’t create any actual problems. It would just be confusing for a while until the system caught up.

        I think there’s a few examples of companies missing the January 1st deadline and ending up with stuff marked as January 1900 for a bit. Otherwise they didn’t have any significant issues.

        Anything that involves a legally binding agreement would be critical though. Since the date is part of the agreement terms, it would need to be correct, and shown correctly.

        Unless the “bug” literally crashed the system (which, it really should not have in most cases), like in your example, or it was connected to a legal contract, then it really wasn’t that big of a problem.

        The media, and people in general kept going on about it like they knew what the technical problem was, and it was always just conjecture and banter that made people worry unnecessarily.

        What I’m trying to say is that Y2K was something that needed to be fixed but the likelihood that it would affect any singular person in society was very small. Those that were going to be affected, generally knew who they were and they were taking the steps required to fix the problem.

        • JackbyDev@programming.dev · 1 point · 8 days ago

          I remember reading a primary source about someone’s experience fixing Y2K stuff. I wish I could find it or remember more. The funniest part was that they actually got everything to work, but on January 1st when they tried to get into work their badge didn’t work! The system on their badge reader was broken due to Y2K lol.

    • SwingingTheLamp@midwest.social · 8 points · 8 days ago

      I first heard about the Y2K bug in about 1993 from a programmer who was working on updating systems. These were all older systems, often written in COBOL, which did not use epoch time, and in fact didn’t reference system time at all. They’d be doing math on data entered by users, and since they were written back when every byte of memory was precious (and nobody expected that the program would still be in use after 30 years), they’d be doing math on two-digit years. It would certainly be a problem to calculate people’s ages, loan terms, payments due, et cetera, and get negative numbers.

      Heck, I remember reading a story about a government system that marked the residents of Hartford, CT as dead, because the last letter of the city name overflowed into the next column, flagging them as ‘d’ for deceased. Y2K was definitely a real problem.

    • SapphironZA@sh.itjust.works · 6 points · 8 days ago

      Y2K was not fear-mongering. There were a great many systems, in industrial, finance, and infrastructure applications, that definitely needed to be addressed: you know, the things that keep modern infrastructure running. Of course there were consumer-facing companies that took advantage of it, but that was small in comparison.

      It ended up not being a disaster, because it was taken seriously.

    • boonhet@lemm.ee · 4 points · 8 days ago

      Lmao, I actively work with shortdates in a database because I have no control over how things are stored. It needs to be solved before 100 years have passed since the epoch, but at some point before then it’ll be fun to figure out whether “58” in a date of birth means 1958 or 2058.
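
      A common workaround is a pivot year, though it only postpones the ambiguity. A sketch (the function name and the pivot of 70 are arbitrary choices for illustration, not anything from the database above):

```python
def expand_year(two_digit: int, pivot: int = 70) -> int:
    """Map a two-digit year to a full year using a pivot heuristic."""
    # Values at or above the pivot are read as 19xx, the rest as 20xx.
    # "58" comes out as 2058 -- right for an expiry date near the epoch
    # rollover, wrong for a 1958 date of birth.
    return 1900 + two_digit if two_digit >= pivot else 2000 + two_digit

print(expand_year(58), expand_year(70))  # 2058 1970
```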

    • JcbAzPx@lemmy.world · 4 points · 6 days ago

      Y2K wasn’t entirely fear mongering horse shit. There were quite a few important cogs in our digital infrastructure that were using code that would not work past 1999. It was necessary to terrify corporate ownership into paying to fix the code, otherwise they would have never done it.

    • Skates@feddit.nl · 3 points · 8 days ago

      … any discrepancies in the use of “year” as a 4 digit number vs a 5 digit number, are entirely a display issue (front end).

      That’s exactly how I read the meme. It would still require a change.

      Whether that is displayed to you correctly or not, doesn’t matter in the slightest. The machine will function even if you see some weird shit,

      I’m not sure if this is some nihilistic stuff, or you really think this. Of course nothing actually matters. The program will still work even if the time is uint32 instead of uint64. The machine of course will still work as well. Shit, your life will go on. The earth continues to spin and this will for sure not cause the heat death of the universe. But aside from actual crashes and some functionality bugs, UI issues should be the ones you worry about the most. If your users are a bank and they need to date the contracts, and you only offer 3 digits for the year? I think you’ll agree with me that if users don’t like using your program, it’s a useless program.

  • SapphironZA@sh.itjust.works · 28 points · 8 days ago

    Nah, they will do what they always do: change some system environment variables to move the zero date on until after they would have retired.

    Nobody wants to touch the original code; it was developed in the 1970s.

    • ddplf@szmer.info · 11 points · 8 days ago

      Look at this fucking piece of shit code, oh right, it’s been written by a homo sapiens sapiens. No wonder they collapsed soon after.

  • Jamablaya@lemmy.world · 25 points · 8 days ago

    Oh, just start at 0000 again and designate that as 10,000. Files didn’t start until like 1979 anyways, and there can’t be many left; even if it is a problem, now you have 2,000 years to not worry about it.

  • BmeBenji@lemm.ee · 24 points · 9 days ago

    We’re being short-sighted

    Tell that to the billionaires speed-running terraforming this planet into a barren wasteland.

  • Zozano@lemy.lol · 17 points · 9 days ago

    Awww shit, time to rewatch my favourite Jike Mudge movie starring Lon Rivingston; Space Office (9999).

    Haha, I can’t believe this guy has the job of manually changing all the dates on the company’s database, this place sucks. I bet the past was way better.

    • yannic@lemmy.ca · 1 point · 7 days ago

      Ditto for the Y6239 problem, for what must be a dozen pieces of software that use the Hebrew calendar, when it switches to five-digit years.

      • dx1@lemmy.world · 2 points · 8 days ago

        Years

        YYYY

        ±YYYYY

        ISO 8601 prescribes, as a minimum, a four-digit year [YYYY] to avoid the year 2000 problem. It therefore represents years from 0000 to 9999, year 0000 being equal to 1 BC and all others AD, similar to astronomical year numbering. However, years before 1583 (the first full year following the introduction of the Gregorian calendar) are not automatically allowed by the standard. Instead, the standard states that “values in the range [0000] through [1582] shall only be used by mutual agreement of the partners in information interchange”.[20]

        To represent years before 0000 or after 9999, the standard also permits the expansion of the year representation but only by prior agreement between the sender and the receiver.[21] An expanded year representation [±YYYYY] must have an agreed-upon number of extra year digits beyond the four-digit minimum, and it must be prefixed with a + or − sign[22] instead of the more common AD/BC (or CE/BCE) notation; by convention 1 BC is labelled +0000, 2 BC is labeled −0001, and so on.[23]

        If you’re being handed a string 2022424-12-19T14:44:39Z and told it’s ISO 8601, you should be able to figure it out. Really, a decent parser should be able to recognize that on its own (just use {4,} instead of {4} in the regex). It does mean that non-hyphenated YYYYMMDD shouldn’t be used (I typically never see dates encoded that way), but even if you did, you’d just do (\d{4,})(\d{2})(\d{2}).
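
        For what it's worth, the greedy quantifier does make even the non-hyphenated form parse cleanly, since the regex engine leaves exactly four digits for month and day (a quick sketch):

```python
import re

# Year takes four or more digits greedily; month and day get the rest.
pattern = re.compile(r"(\d{4,})-?(\d{2})-?(\d{2})")

print(pattern.fullmatch("10000-01-01").groups())  # ('10000', '01', '01')
print(pattern.fullmatch("100000101").groups())    # ('10000', '01', '01')
```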

        • JackbyDev@programming.dev · 1 point · 8 days ago

          I get your point, but in the same way that people “shouldn’t” have been using two digits for year storage, there are certainly many parsers of ISO 8601 that don’t match the spec. In 8,000 years I don’t think this will be a problem though lol. I don’t think we can really perceive what technology might be like that far in the future. But hypothetically, if year 10,000 were a few days away and this were year 9,999, I would suspect we’d see at least some problems come January.

          As an example, YAML 1.2 changed its Boolean representation back in 2009, but in 2022 people still complain about the 1.1 version. (Caveat: I don’t know if this person actually ran into a “real” problem or only a hypothetical one.)

          • JcbAzPx@lemmy.world · 1 point · 8 days ago

            I mean, that’s exactly what programmers in the '70s thought. That there would be no way in hell that their crap code would still be in use going onto 2000.

            Thing is, copy/paste is always going to be easier than writing new code and that’s only going to get worse as chat bots start coding for us.