Why were people so scared of Y2K? How could dates changing to 2000 affect computers?

  1. 2 years ago
    Anonymous

    back in the day it was common practice everywhere to refer to the year using only two digits (like, say, '85, '92, etc), and there were cases of programs being written internally like this as well, only expecting two digits for the year. so when 2000 came along, some programs rolled over to "00", meaning the year apparently went backwards, which can confuse software that was designed on the assumption that the year only ever moves forward
    there were also cases of the year becoming "100", which breaks things in different ways. i remember seeing pictures of websites where the date was listed as "January 1, 19100", because the code just appended the (now three-digit) year counter to an assumed "19"
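
    a minimal C sketch of that second failure mode (hypothetical example code, but tm_year really is "years since 1900" in the standard library, which is exactly how pages ended up printing "19100"):

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        /* tm_year counts years since 1900, so it reached 100 in the year 2000 */
        /* broken pattern: assume the century is "19" and paste the digits together */
        printf("broken:  January 1, 19%d\n", t->tm_year);      /* "January 1, 19100" in 2000 */

        /* the fix: treat it as a number and add 1900 */
        printf("correct: January 1, %d\n", t->tm_year + 1900);
        return 0;
    }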

    • 2 years ago
      Anonymous

      also, i know that sounds like a dumb mistake to make, but remember it had been the 20th century for almost 100 years at that point, and for the entirety of electronic computing, so it's not surprising that the everyday habit of two-digit years seeped into some programs
      it only seems silly now because we've since lived through a millennium change, and the scare that was y2k

    • 2 years ago
      Anonymous

      https://i.imgur.com/jMhWtQq.png

      Why were people so scared of Y2K? How could dates changing to 2000 affect computers?

      The dangerous part is when an important piece of infrastructure or something similar runs on a clock/timer that wasn't Y2K compliant.

      >bridge is supposed to go up automatically at 1/1/(20)00, oops it's 1/1/00, bridge doesn't go up

    • 2 years ago
      Anonymous

      similarly, we saw the same issue, albeit to a significantly lesser degree, when browsers got to version 100 and some websites weren't prepared for a three-digit version number in the user-agent string
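
      a minimal sketch of that bug class (the UA string and the parsing code are made up for illustration, not taken from any real site): read only two digits of the major version and "Chrome/100" becomes version 10:

      #include <stdio.h>
      #include <string.h>

      int main(void) {
          const char *ua = "Mozilla/5.0 (X11; Linux x86_64) Chrome/100.0.4896.60 Safari/537.36";
          int major = 0;

          const char *p = strstr(ua, "Chrome/");
          if (p != NULL) {
              /* broken pattern: the format string only accepts two digits */
              sscanf(p, "Chrome/%2d", &major);
          }
          printf("parsed major version: %d\n", major);   /* prints 10, not 100 */
          return 0;
      }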

      first reply, best reply

  2. 2 years ago
    Anonymous

    And nothing happened.

    Year 2038 will be a fun one too (32-bit int overflow)

    • 2 years ago
      Anonymous

      now this one is more fundamental, since you don't even need to have made any kind of oversight or taken a lazy shortcut for it to frick you over
      the limit is baked into a very widespread time-counting format

      people have already exploited that, too, like with the iphone 1970 hoax

    • 2 years ago
      Anonymous

      How are we gonna solve this one, bros?

      • 2 years ago
        Anonymous

        change all the code to 64-bit and don't worry about it for some millennia

      • 2 years ago
        Anonymous

        upgrade to software using 64-bit time instead
        >(2^31) seconds = 68.0511039 years
        >(2^63) seconds = 292,277,266 millennia
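
        a minimal sketch of what that rollover looks like (this forces 32-bit values on purpose; on a system whose time_t is still a signed 32-bit counter, the wrap happens by itself in 2038):

        #include <stdio.h>
        #include <stdint.h>
        #include <time.h>

        int main(void) {
            /* largest value a signed 32-bit seconds counter can hold:
               2^31 - 1 seconds after the epoch = 2038-01-19 03:14:07 UTC */
            time_t last_good = (time_t)INT32_MAX;

            /* one second later a 32-bit counter wraps to -2^31, which
               decodes as a date back in December 1901 */
            time_t wrapped = (time_t)INT32_MIN;

            printf("last 32-bit second: %s", ctime(&last_good));
            printf("one second later:   %s", ctime(&wrapped));   /* December 1901 on a wrapped system */
            return 0;
        }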

        • 2 years ago
          Anonymous

          change all the code to 64-bit and don't worry about it for some millennia

          This. And we better start soon, because everyone knows some Unix(-like) systems out there, even ones connected to the Internet, are running very outdated OS versions. If we keep rolling out systems using the original 32-bit Unix time, they might actually still be in service in 2038.

      • 2 years ago
        Anonymous

        Upgrade from debian stable to testing.

    • 2 years ago
      Anonymous

      Year 2038 is to the computer what year 2000 is to the human being.
      Pretty sure some heavy shit's going to go down in 2038.
      Goddamn those homosexuals for not using an unsigned 32-bit integer, which would've pushed this problem back to 2106 or something.

      • 2 years ago
        Anonymous

        using an unsigned int would have been moronic: it would have meant they couldn't record dates of past events, because the epoch was set right around the time the format was created
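
        a minimal sketch of what the sign bit buys you (the timestamp is just an illustrative value; assumes a Unix-ish system where negative time_t values work):

        #include <stdio.h>
        #include <time.h>

        int main(void) {
            /* with a signed counter, moments before 1970 are simply negative:
               165 days before the epoch = 1969-07-20 00:00 UTC */
            time_t before_epoch = (time_t)-14256000;
            printf("signed:   %s", ctime(&before_epoch));   /* a date in July 1969 */

            /* the same bit pattern in an unsigned 32-bit counter can't mean
               "before 1970" at all; it decodes to roughly the year 2105 */
            unsigned int as_unsigned = (unsigned int)(-14256000);
            printf("unsigned: %u seconds after 1970\n", as_unsigned);
            return 0;
        }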

        • 2 years ago
          Anonymous

          also, i really doubt anybody there really expected people to still be using it 68 years later

          • 2 years ago
            Anonymous

            >also, i really doubt
            Dude this computer thing had just started and everyone was punching these cards LOL. I do not think they were thinking anything or understood that this computer thing would be big YO.

            If DOS was the dinosaur age then these times were right after the big bang, before any heavy particles had started to form. Only hydrogen atoms everywhere.

          • 2 years ago
            Anonymous

            >also, i really doubt

            >Extra byte for past dates, most use of time functions are actually for logging things.
            handling past dates separately sounds like it would be a pita, especially for things that need to support both past and future dates (which is a lot of things, including databases, which were an extremely popular job for computers early on)
            reserving a bit for the sign was a perfectly sensible thing to do, and i'm sure they figured it would be replaced long before 68 years had passed; this was implemented before home computers were a thing

            >this was implemented before home computers were a thing
            See above.

        • 2 years ago
          Anonymous

          Extra byte for past dates, most use of time functions are actually for logging things.

          >also, i really doubt anybody there really expected people to still be using it 68 years later

          Not unlikely, outside of consumer products (which are the critical ones), as you'd know if you weren't a homosexual.

          • 2 years ago
            Anonymous

            computers weren't a consumer product in the early 1970s

          • 2 years ago
            Anonymous

            Which makes it even more unforgivable, if they didn't expect a standard to outlast the hardware of its day.
            Remember, it's a standard, not the hardware itself.
            There are very likely ISO and DIN standards in use that are at least a century old.

            >Extra byte for past dates, most use of time functions are actually for logging things.
            handling past dates separately sounds like it would be a pita, especially for things that need to support both past and future dates (which is a lot of things, including databases, which were an extremely popular job for computers early on)
            reserving a bit for the sign was a perfectly sensible thing to do, and i'm sure they figured it would be replaced long before 68 years had passed; this was implemented before home computers were a thing

            Nope, just a bit somewhere that tells the code to interpret the value as negative, easy.
            68 years is not that long, they could've anticipated this.
            But 292 million millennia is fine, go ahead and take the bit.

          • 2 years ago
            Anonymous

            was it really a standard at that point though?
            it's easy to consider "unix" a (set of) standard(s) now, but in the early '70s it was just one of many operating systems, and the time format was probably already set in stone before they even considered porting it to anything besides a PDP

          • 2 years ago
            Anonymous

            >68 years is not that long, they could've anticipated this.
            See above.

            My theory is that they were thinking something like this
            >And then we make Unix time 2.0, which uses the year 2000 as its year zero

            I think during that time, when most shit was printed out of the computer and there were, what, 30 computers in the world, simply changing the format was considered normal.

            So a full stop and restart of all computers under the new system would have been acceptable.

            Punch cards, paper printouts. Also memory was expensive YO!

          • 2 years ago
            Anonymous

            >Extra byte for past dates, most use of time functions are actually for logging things.
            handling past dates separately sounds like it would be a pita, especially for things that need to support both past and future dates (which is a lot of things, including databases, which were an extremely popular job for computers early on)
            reserving a bit for the sign was a perfectly sensible thing to do, and i'm sure they figured it would be replaced long before 68 years had passed; this was implemented before home computers were a thing

      • 2 years ago
        Anonymous

        >unsigned 32-bit integer
        Are you fricken insane?
        Signed integers can let you detect overflows, since if the year comes out negative it means something fricked up.

        >which would've pushed this problem back to 2106 or something.
        Or change to 64-bit, which lets you push the problem back to the year
        >Using a signed 64-bit value introduces a new wraparound date that is over twenty times greater than the estimated age of the universe: approximately 292 billion years from now
        Yeah, that one.

        How are we gonna solve this one, bros?

        Move from 32-bit to 64-bit unix time.

        I expect more hardware to fail in 2038, however.

        • 2 years ago
          Anonymous

          >Signed integers can let you detect overflows
          Wow.
          Unlike signed overflow, unsigned wraparound is actually fully defined by the standard, by the way.

          • 2 years ago
            Anonymous

            Learn C and talk to my gayot.
            It helps.

          • 2 years ago
            Anonymous

            You suffer from Dunning-Kruger.
            >The problem with this is that according to the C standard, signed integer overflow is undefined behavior. In other words, according to the standard, as soon as you even cause a signed overflow, your program is just as invalid as if you dereferenced a null pointer. So you can't cause undefined behavior, and then try to detect the overflow after the fact, as in the above post-condition check example.
            https://stackoverflow.com/questions/3944505/detecting-signed-overflow-in-c-c
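
            a minimal sketch of the point being made (generic example code, not taken from the linked answer): with signed ints you have to check before adding, because the add-then-test version has already hit undefined behaviour by the time you test:

            #include <stdio.h>
            #include <limits.h>

            /* returns 1 and stores a+b in *out if it fits, 0 if it would overflow;
               the check happens BEFORE the addition, so no signed overflow (UB) occurs */
            static int add_checked(int a, int b, int *out) {
                if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
                    return 0;
                *out = a + b;
                return 1;
            }

            int main(void) {
                int sum;
                if (!add_checked(INT_MAX, 1, &sum))
                    printf("overflow caught safely\n");
                if (add_checked(2000000000, 100000000, &sum))
                    printf("sum = %d\n", sum);
                return 0;
            }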

    • 2 years ago
      Anonymous

      >And nothing happened.
      people worked around the clock to fix it in time, you moron

      >If only we could change the date on these devices and see in advance what will happen.
      You can't do that to a live banking mainframe numbnuts

      • 2 years ago
        Anonymous

        >what is a development environment

    • 2 years ago
      Anonymous

      >And nothing happened.
      There were incidents.
      https://en.wikipedia.org/wiki/Year_2000_problem#Documented_errors

      >On 1 January 1999, taxi meters in Singapore stopped working, while in Sweden, incorrect taxi fares were given.[47]
      >On 28 December 1999, 10,000 card swipe machines issued by HSBC and manufactured by Racal stopped processing credit and debit card transactions.[16] The stores relied on paper transactions until the machines started working again on 1 January.[48]
      >In Japan, NTT Mobile Communications Network (NTT Docomo), Japan's largest cellular operator, reported that some models of mobile telephones were deleting new messages received, rather than the older messages, as the memory filled up.[51]

    • 2 years ago
      Anonymous

      >Why were people so scared of Y2K?
      There were many bad programmers also back then.
      >How could dates changing to 2000 affect computers?
      Rollover would have landed us in the year 19101 or worse. Back then I was on a team auditing our old products, and there were components that just refused to boot up past the year 2000. We fixed the problem by setting the local clock to 1970.

      >And nothing happened.
      That is wrong.

      >Year 2038 will be a fun one too (32bits int overflow)
      We also had the GPS week 1024 rollover, which caused havoc as well. It could happen again.

  3. 2 years ago
    Anonymous

    If only we could change the date on these devices and see in advance what will happen.
    They wanted to practice scare tactics on the population at large, then they did 9/11, now we have even more sophisticated scare tactics like masks and vaccines.

  4. 2 years ago
    Anonymous

    congratulations, zoomer. you just discovered the Y2K bug. you've been browsing the aesthetics wikia, haven't you?

  5. 2 years ago
    Anonymous

    In the years leading up to it, manufacturers put Y2K compatible stickers on all kinds of tech, even shit that had no chance of being affected by it, like 56k modems.

  6. 2 years ago
    Anonymous

    Same reason people were afraid of 5G radiation.

    Tech illiterate boomers and normie morons.

  7. 2 years ago
    Anonymous

    >How could dates changing to 2000 affect computers?
    ZOOM ZOOM does not understand integer overflow.

    >How
    Order these numbers from biggest to smallest

    01
    15
    08
    25
    39
    80
    99
    00

    Now only use 2 digits for the year

    Tell me which is bigger
    98 or 99
    Which is bigger
    01 or 00
    Which is bigger
    99 or 00

    1999 + 1 = 2000
    or
    99 + 1 = 00
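
    a minimal sketch of how that bites in practice (made-up example, but the comparison is the whole bug): stored with only two digits, the year 2000 looks smaller and older than 1999:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *last_year = "99";   /* 1999 */
        const char *this_year = "00";   /* 2000, stored with only two digits */

        /* string comparison: "00" sorts before "99", so 2000 looks older than 1999 */
        if (strcmp(this_year, last_year) < 0)
            printf("'%s' sorts before '%s'\n", this_year, last_year);

        /* arithmetic: the elapsed time comes out as -99 years instead of 1 */
        printf("elapsed years: %d\n", 0 - 99);
        return 0;
    }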

    Fun fact: it messed with systems that did not upgrade, and you will have the same thing in 2038.

    The end of unix time is coming.

    >How
    Zoom zooms do not understand technology.

    • 2 years ago
      Anonymous

      I'm not a zoomer, I'm 43

  8. 2 years ago
    Anonymous

    God I wish Y2k would have ruined everything.

  9. 2 years ago
    Anonymous

    It was a psyop by tech companies to sell support contracts to the computer illiterate.

  10. 2 years ago
    Anonymous

    If you have an old Mac and set the year to 22, it thinks it's 1922. I think DOS 6.22 was Y2K compliant but older versions (not sure which ones) weren't.

  11. 2 years ago
    Anonymous

    Dates are hilariously mishandled in CS in general. Take all the bad habits of lazy programmers you know now and set them back a few decades, when all of this was newly discovered. Keep in mind that modern computing still mostly relies on counting seconds from the Unix epoch in 1970.

    Examples:
    >Print year? Easy, just concatenate the string "19" and a variable limited to the range 00-99
    >1999 rolls over to 1900

    >Print year? Easy, just concatenate the string "19" and a counter that keeps incrementing past 99
    >1999 rolls over to 19100

    Space was at a premium back in the days when floppy disks were more prevalent than CDs. We've always had the habit of carrying forward or porting legacy systems instead of outright replacing them. Look at point-of-sale systems if you want an example; that shit's written in COBOL or Pascal.
