• sudoku@programming.dev · 1 year ago

    M stands for mega, an SI prefix that has existed far longer than the computer data it labels. MB meaning 1,000,000 bytes was always the correct definition; it’s just that someone decided they could somehow change it.
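    The difference between the two definitions is easy to see in a quick sketch (Python; the 500 MB file size is just a made-up example):

```python
# Decimal (SI) prefixes vs binary (IEC) prefixes for the same byte count.
SI = {"kB": 10**3, "MB": 10**6, "GB": 10**9}
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30}

size = 500_000_000  # a file marketed as "500 MB"

print(size / SI["MB"], "MB")     # 500.0 MB (decimal definition)
print(size / IEC["MiB"], "MiB")  # ~476.84 MiB (binary definition)
```

    The ~4.6% gap per prefix step is exactly why a drive labelled in decimal units looks "smaller" to an OS that reports binary units.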

    • guy@lemmy.world · 1 year ago

      Consistency with the proper scientific prefixes is nice to have, but consistency within the computing industry itself is really important, and now we have neither. In this industry, binary arithmetic was central and powers of 2 were much more useful, so yes, they really should have picked a different prefix to begin with. But the IEC’s attempt to correct it retroactively has failed: it’s a mess that’s far from actually standardised now.

    • barsoap@lemm.ee · 1 year ago

      B and b have never been SI units; the closest is Bq (becquerel). So if people hadn’t kept insisting that it’s confusing, no one would have been confused.