• guy@lemmy.world

    The IEC changing the definition of 1 KB from 1024 bytes to 1000 bytes was a terrible idea that’s given us this whole mess. Sure, it’s nice and consistent with scientific prefixes now… except it’s far from consistent in actual usage. So many things still treat it as a binary prefix, following the JEDEC standard. Just like KiB is unambiguously 1024 bytes, I really think they should’ve introduced another new unambiguous unit, e.g. KoB, that’s always 1000 bytes, and deprecated the poorly defined KB altogether
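
    To make the discrepancy concrete, here’s a minimal Python sketch (my own illustration, not part of any standard) showing why a drive marketed as “1 TB” under the decimal definition shows up as roughly 931 GB in an OS that reports sizes in binary units:

    ```python
    # "1 TB" as disk vendors (and SI/IEC decimal prefixes) define it: 10^12 bytes
    marketed_bytes = 1 * 1000**4

    # The same byte count expressed in binary units (2^30 and 2^40 bytes)
    gib = marketed_bytes / 1024**3
    tib = marketed_bytes / 1024**4
    print(f"{marketed_bytes} bytes = {gib:.2f} GiB = {tib:.3f} TiB")
    # -> 1000000000000 bytes = 931.32 GiB = 0.909 TiB
    ```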

    • sudoku@programming.dev

      M stands for mega, an SI prefix that existed long before the computer data it’s now used to label. MB meaning 1,000,000 bytes was always the correct definition; it’s just that someone decided they could somehow change it.

      • guy@lemmy.world

        Consistency with proper scientific prefixes is nice to have, but consistency within the computing industry itself is really important, and now we have neither. In this industry, binary arithmetic was central, and powers of 2 were far more useful. Yes, they really should’ve picked a different prefix to begin with. But the IEC’s attempt to correct it retroactively has failed: it’s a mess that’s far from actually standardised now
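
        For what it’s worth, the two conventions also drift further apart at every step up the prefix ladder, which is part of why this matters more now than it did at kilobyte scale. A quick sketch (my own illustration):

        ```python
        # Ratio of the binary reading (1024^n) to the decimal reading (1000^n)
        for n, prefix in enumerate(["K", "M", "G", "T"], start=1):
            print(f"{prefix}B: binary/decimal = {1024**n / 1000**n:.2%}")
        # -> KB: 102.40%, MB: 104.86%, GB: 107.37%, TB: 109.95%
        ```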

      • barsoap@lemm.ee

        B and b have never been SI units; the closest is Bq. So if people hadn’t been insisting that it’s confusing, no one would’ve been confused.