• fmstrat@lemmy.nowsci.com
      link
      fedilink
      English
      arrow-up
      14
      ·
      6 days ago

      The problem is “I need function, library with 1000 functions has function, include.” Library’s 823rd function turns out to have a vulnerability.

    • I Cast Fist@programming.dev
      link
      fedilink
      arrow-up
      12
      ·
      6 days ago

      “Yes, I’d like a wheel. I don’t want to invent it. Why, of course, give me the full package of wheel, axle, rotor, engine, fuel tank, windshield, mirrors, tire, front panel, brakes. This wheel will be great for me manually spinning cotton!”

    • dejected_warp_core@lemmy.world
      link
      fedilink
      arrow-up
      4
      ·
      6 days ago

      You say that, but I’ve watched the JS community move from one framework and tool suite to the next quite rapidly. By my recollection, I’ve seen a wholesale change in popular tooling at least four times in the last decade. Granted, that’s not every developer’s trajectory through all this, but (IMO) that’s still a lot.

      • bleistift2@sopuli.xyz
        link
        fedilink
        English
        arrow-up
        1
        ·
        6 days ago

        But changing frameworks is not why node_modules is so large. You don’t import Angular and Vue.

        • dejected_warp_core@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          6 days ago

          I agree. Rather, it’s that each one of those frameworks is substantial on its own. Plus, the churn of going from framework to framework makes it less practical to compress and bundle all this stuff into fixed versions on a slower schedule (e.g. like Ubuntu packages do). I think all of that contributes to bloat.

    • onnekas@sopuli.xyz
      link
      fedilink
      arrow-up
      22
      ·
      6 days ago

      Why not import all code ever created by humankind, just in case we might need some of it?

      • squaresinger@lemmy.world
        link
        fedilink
        arrow-up
        13
        ·
        6 days ago

        I want to build a kick scooter. For that I need some wheels. So I import the well-known semi-truck framework. From that framework I take some huge wheels. They are too large and too many, but I guess I can make do with them.

        But I need to attach the wheels to one another, so I import the bridge-building-library, because they have steel bars in there.

        Lastly, to attach all of that together, I import the NASA space ship framework because there’s a hand welder in there that was deprecated years ago, but it’s still rotting away in there because some important products still require the hand welder class for some entirely unrelated use cases.

      • pinball_wizard@lemmy.zip
        link
        fedilink
        arrow-up
        8
        ·
        6 days ago

        …and then we can grind all the code ever created by humankind into a fine paste, and write a clever algorithm to regurgitate it as a squishy code slurry in response to questions about problems that the standard libraries already solved.

  • dejected_warp_core@lemmy.world
    link
    fedilink
    arrow-up
    13
    ·
    edit-2
    6 days ago

    I used to struggle with this, until I realized what’s really going on. To do conventional web development, you have to download a zillion node modules so you can:

    • Build one or more “transpilers” (e.g. Typescript, Sass support, JSX)
    • Build linters and other SAST/DAST tooling
    • Build packaging tools, to bundle, tree-shake, and minify your code
    • Use shims/glue to hold all that together
    • Use libraries that support the end product (e.g. React)
    • Furnish multiple versions of dependencies in order for each tool to have its own (stable) graph

    All this dwarfs any code you’re going to write by multiple orders of magnitude. I once had a node_modules tree that clocked in at over 1.5GB of source code. What I was writing would have fit on a floppy disk.
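
    That tooling list shows up in a project’s package.json as a devDependencies block that easily outweighs the runtime dependencies. A hypothetical sketch (the package names are real, but the versions and the particular selection are illustrative, not any specific project’s manifest):

    ```json
    {
      "dependencies": {
        "react": "^18.0.0",
        "react-dom": "^18.0.0"
      },
      "devDependencies": {
        "typescript": "^5.0.0",
        "sass": "^1.70.0",
        "eslint": "^8.0.0",
        "webpack": "^5.0.0",
        "webpack-cli": "^5.0.0",
        "terser-webpack-plugin": "^5.0.0"
      }
    }
    ```

    Each of those devDependencies pulls in its own transitive graph, which is where the gigabytes come from.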

    That said, it’s kind of insane. The problem is that there are no binary releases, nor fully vendored/bundled packages. The entire toolchain source, except nodejs and npm, is downloaded in its entirety on every such project you run.

    In contrast, if you made C++ or Rust developers rebuild their entire toolchain from source on every project, they’d riot. Or, they would re-invent binary releases that weekend.

  • TootSweet@lemmy.world
    link
    fedilink
    English
    arrow-up
    18
    ·
    7 days ago

    Be the change you want to see in the world, people. Don’t use any Node (or Rust or Python or Java or whatever) modules that have more dependencies than they absolutely, positively, 100%, for real have to. It’s really not that hard. It doesn’t have to be this way.

    • InvalidName2@lemmy.zip
      link
      fedilink
      arrow-up
      13
      ·
      7 days ago

      Which sounds like great, practical advice in a theoretical perfect world!

      But the reality of the situation is that professionals are usually balancing a myriad of concerns and considerations, using both objective and subjective evaluations of what’s required of us. Quite often, inefficiency (whether in the form of programmatic complexity, disk storage, or otherwise) has a relatively low priority compared to everything else we need to achieve if we want happy clients and a paycheck.

      • kautau@lemmy.world
        link
        fedilink
        arrow-up
        6
        ·
        6 days ago

        Lol yeah working in enterprise software for a long time, it’s more like:

        1. Import what you think you need, let the CI do a security audit, and let your senior engineers berate you if you import a huge unnecessary library when you only need one thing
        2. Tree shake everything during the CI build so really the only code that gets built for production is what is being used
        3. Consistently audit imports for security flaws and address them immediately (again, a CI tool)
        4. CI

        Basically, just have a really good set of teams working on CI, in addition to the backend/frontend/UX/security/infrastructure/whatever other teams you have
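
        As a sketch, the audit and tree-shake steps above might look like this in a CI config (a hypothetical GitHub Actions job; the tool choices and script names are assumptions, not what any particular shop runs):

        ```yaml
        jobs:
          build:
            runs-on: ubuntu-latest
            steps:
              - uses: actions/checkout@v4
              # reproducible install from the lockfile
              - run: npm ci
              # step 3: fail the build on known-vulnerable imports
              - run: npm audit --audit-level=high
              # step 2: production build; the bundler tree-shakes unused exports
              - run: npm run build
        ```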

      • TootSweet@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        7 days ago

        Saying “we can’t in practice reduce the complexity of our dependency tree because we need happy clients and a pay check” is like saying “we can’t in practice turn on the propeller because we need to get this airplane off the ground”.

        • boonhet@sopuli.xyz
          link
          fedilink
          arrow-up
          15
          ·
          edit-2
          6 days ago

          Clients don’t care much about the dependency graph. They do care about delivering on time and sometimes not reinventing a bunch of wheels is crucial for that.

            • marlowe221@lemmy.world
              link
              fedilink
              English
              arrow-up
              6
              ·
              6 days ago

              Amen.

              I have sorted out so many JS dependency tangles for my team members, both front end AND back end, that I am loath to import anything I have any choice about.

              I will rewrite some stuff before I import it…

  • mesa@piefed.social
    link
    fedilink
    English
    arrow-up
    18
    ·
    7 days ago

    Very true.

    Python feels like that sometimes too, except with a much larger standard library, which is much better than node modules.

      • PhilipTheBucket@piefed.social
        link
        fedilink
        English
        arrow-up
        20
        ·
        edit-2
        7 days ago

        I sort of have a suspicion that there is some mathematical proof that, as soon as it becomes quick and easy to import an arbitrary number of dependencies into your project along with their dependencies, the size of the average project’s dependency tree starts to follow an exponential growth curve, increasing every year without limit.

        I notice that this stuff didn’t happen with package managers + autoconf/automake. It was only once it became super-trivial to do from the programmer’s side that the growth curve started. I’ve had trivial projects pull in thousands of dependencies recursively, because it’s easier to do that than to take literally one hour implementing a little modified-file watcher function or something.
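
        For scale, that hour-long hand-rolled watcher really is small. A minimal polling sketch in Node (this is the naive rescan approach, not the OS-event APIs a real library would use; all names here are made up for illustration):

        ```javascript
        const fs = require("fs");
        const path = require("path");

        // Take a snapshot of { filename: mtimeMs } for every file in dir.
        function snapshot(dir) {
          const out = {};
          for (const name of fs.readdirSync(dir)) {
            const st = fs.statSync(path.join(dir, name));
            if (st.isFile()) out[name] = st.mtimeMs;
          }
          return out;
        }

        // Pure diff helper: which files are new or have a different mtime?
        function changedFiles(prev, next) {
          return Object.keys(next).filter((f) => prev[f] !== next[f]);
        }

        // Poll every intervalMs, invoking onChange with the changed names.
        function watch(dir, onChange, intervalMs = 500) {
          let prev = snapshot(dir);
          return setInterval(() => {
            const next = snapshot(dir);
            const changed = changedFiles(prev, next);
            if (changed.length > 0) onChange(changed);
            prev = next;
          }, intervalMs);
        }
        ```

        Twenty-odd lines, zero dependencies; the trade-off is that polling burns cycles and misses sub-interval changes, which is the argument for a library in the first place.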

        • CameronDev@programming.dev
          link
          fedilink
          arrow-up
          14
          ·
          7 days ago

          It’s certainly more painful to collect dependencies with cmake, so it’s not worth doing if you can hand-roll your own easily enough.

          The flip side is that by using a library, it theoretically means it should be fairly battle-tested code, and should be using appropriate APIs. File watching has a bunch of different OS-specific APIs that could be used, in addition to the naive “read everything periodically” approach, so while you could knock something together in an hour, the library should be the correct approach. Sadly, at least in Rust land, there are a ton of badly written libraries to wade through… 🤷

          • PhilipTheBucket@piefed.social
            link
            fedilink
            English
            arrow-up
            10
            ·
            7 days ago

            Yeah. I have no idea what the answer is, just describing the nature of the issue. I come from the days when you would maybe import like one library to do something special like .png reading or something, and you basically did all the rest yourself. The way programming gets done today is wild to me.

            • CameronDev@programming.dev
              link
              fedilink
              arrow-up
              3
              ·
              edit-2
              6 days ago

              I’m not sure it’s a problem in and of itself, but I agree it definitely enables a problem. Between “is-even” and vibe coding, modern software engineering is in a very sorry state.

              • PhilipTheBucket@piefed.social
                link
                fedilink
                English
                arrow-up
                4
                ·
                6 days ago

                Yeah. I feel like in a few years when literally nothing works or is maintainable, people are going to have a resurgent realization of the importance of reliability in software design, that just throwing bodies and lines of code at the problem builds up a shaky structure that just isn’t workable anymore once it grows beyond a certain size.

                We used to know that, and somehow we forgot.

      • BehindTheBarrier@programming.dev
        link
        fedilink
        arrow-up
        3
        ·
        edit-2
        6 days ago

        At least Rust compiles down to only what is used. I don’t know if JS has any of that, but at least with Rust the final program doesn’t ship tons of bloat.

        • CameronDev@programming.dev
          link
          fedilink
          arrow-up
          2
          ·
          6 days ago

          Yes and no. The linker does nicely trim a lot of the fat, but Rust binaries are still pretty chonky. It’s good chonky (debug info, etc.), and static compilation doesn’t help, but they are quite fat.

          It also doesn’t help compile times that you have to build all this extra stuff, only to throw most of it away.

  • Munrock ☭@lemmygrad.ml
    link
    fedilink
    arrow-up
    8
    ·
    6 days ago

    Except in the picture on the left, someone’s actually reading it.

    Something’s gone wrong if you’re looking in the node_modules folder.

    • deadbeef79000@lemmy.nz
      link
      fedilink
      arrow-up
      6
      ·
      edit-2
      6 days ago

      You get books like that for voluminous stuff like parliament debate transcripts for an entire parliamentary term.

      They’re generally one-off or only a handful printed and kept as archival records.

      Almost no one would ever need the physical book; it exists as a physical tome to cite/reference.