There was a time when this debate was bigger. It seems the world has shifted towards architectures and tooling that do not allow dynamic linking, or make it harder. This compromise makes life easier for the maintainers of the tools / languages, but it takes choice away from the user / developer. But maybe that’s not important? What are your thoughts?

  • Synthead@lemmy.world · 4 points · 1 year ago

    It seems the world has shifted towards architectures and tooling that do not allow dynamic linking, or make it harder.

    In what context? In Linux, dynamic links have always been a steady thing.

    • ck_@discuss.tchncs.de · 9 points · 1 year ago

      We could argue semantics here (I don’t really want to), but tools like Docker / containers, Flatpak, Nix, etc. essentially create a sort of soft static link: the software is dynamically linked, but the shared libraries are not actually shared at all beyond the boundary of the defining scope.

      So while it’s semantically true that dynamic libraries are still used, the execution environments are becoming increasingly static, which defeats much of the point of shared libraries.

      • uis@lemmy.world · 1 point · 1 year ago

        but tools like Docker / containers, Flatpak, Nix, etc. essentially create a sort of soft static link: the software is dynamically linked, but the shared libraries are not actually shared at all beyond the boundary of the defining scope.

        This garbage practice was imported from Windows.

    • ck_@discuss.tchncs.de · 4 points · 1 year ago

      In Linux, dynamic links have always been a steady thing.

      Hot take: this is only still the case because GNU libc cannot easily be statically linked.
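For context on that hot take, here is a sketch (assuming a glibc toolchain) of the classic pain point: glibc’s name-service functions such as `getaddrinfo()` go through NSS, which `dlopen()`s plugins like `libnss_files.so` at runtime. Linking this program with `gcc -static` therefore still emits a warning that the “static” binary needs matching shared glibc pieces at runtime; musl, by contrast, links such code fully statically.

```c
/* Builds and runs fine dynamically; with `gcc -static` the linker
   warns because getaddrinfo() pulls in NSS, which loads shared
   plugins at runtime even from a "statically linked" binary. */
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>

int main(void) {
    struct addrinfo hints, *res = NULL;
    memset(&hints, 0, sizeof hints);
    hints.ai_family = AF_INET;

    /* This is the call that drags NSS (and thus dlopen) into play. */
    int err = getaddrinfo("localhost", NULL, &hints, &res);
    if (err != 0) {
        printf("getaddrinfo failed: %s\n", gai_strerror(err));
        return 1;
    }
    puts("resolved localhost via NSS");
    freeaddrinfo(res);
    return 0;
}
```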