Sorry Python but it is what it is.

    • @[email protected]
      link
      fedilink
      102 years ago

      Pip stores everything inside of some random txt file that doesn’t differentiate between packages and dependencies.

          • Farent · 7 points · 2 years ago

            Isn’t it called a requirements.txt because it’s used to export your project requirements (dependencies), not all packages installed in your local pip environment?

          • @[email protected]
            link
            fedilink
            82 years ago

            Yes, but this file is created by you and not pip. It’s not like package.json from npm. You don’t even need to create this file.

            • @[email protected]
              link
              fedilink
              42 years ago

              Well, if the file had to be created by hand, that would be very cumbersome.

              But what is sometimes done to create it automatically is running

              pip freeze > requirements.txt

              inside your virtual environment.

              You said I don’t need to create this file? How else will I distribute my environment so that it can be easily used? There are a lot of other standards, like setup.py etc., so it’s only one possibility. But the fact that there are multiple competing standards shows that how pip handles this is kind of bad.
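
              As a rough sketch, the usual workflow around that file looks something like this (the package name is just an example):

              # create and activate an isolated environment for the project
              python -m venv .venv
              source .venv/bin/activate

              # install what the project actually needs
              pip install requests

              # snapshot the exact versions so the environment can be reproduced
              pip freeze > requirements.txt

              # later, on another machine, recreate it
              pip install -r requirements.txt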

              • @[email protected]
                link
                fedilink
                English
                22 years ago

                If you try to keep your dependencies low, it’s not very cumbersome. I usually do that.

                A setup.py/pyproject.toml can replace requirements.txt, but it is for creating packages and does way more than just installing dependencies, so they are not really competing.

                For scripts which have just 1 or 2 packages as dependencies, it’s also usual to just tell people to run pip install .
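
                To be concrete, once a project ships a setup.py or pyproject.toml, a minimal sketch of installing it together with its dependencies is (run from the project root):

                # install the project plus everything it declares as a dependency
                pip install .

                # or, while developing, an editable install
                pip install -e .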

              • Vash63 · 3 points · 2 years ago

                I work with python professionally and would never do that. I add my actual imports to the requirements and if I forget I do it later as the package fails CI/CD tests.

      • SSUPII · 5 points · 2 years ago

        Honestly it’s a simple and straightforward solution. What’s wrong with it?

        • @[email protected]
          link
          fedilink
          2
          edit-2
          2 years ago

          If newer versions are released and the dependencies change, you would still install the old dependencies. And if the dependencies are not stored, you can’t reproduce the exact same environment.

    • @[email protected]OP
      link
      fedilink
      English
      82 years ago

      cargo just works, it’s great and everyone loves it.

      npm has a lot of issues but in general does the job. When docs say do ‘npm install X’ you do it and it works.

      pip is a mess. In my experience doing ‘pip install X’ will maybe install something but it will not work because some dependencies will be screwed up. Using it to distribute software is pointless.

      • @[email protected]
        link
        fedilink
        252 years ago

        I use pip extensively and have zero issues.

        npm pulls in a million dependencies for even the simplest functionality.

        • @[email protected]OP
          link
          fedilink
          English
          52 years ago

          It probably works for your own local project. After using it for a couple of days to install some 3rd-party tool, my conclusion is that it has no idea about dependencies. It just downloads some dependencies at random versions and then it never works. Completely useless.

        • qaz · 5 points · 2 years ago

          You’ve never had broken dependencies?

          • @[email protected]
            link
            fedilink
            52 years ago

            Nope. I know mixing pip with Python packages installed through your system’s package manager can be a problem, but that’s why I containerize everything.
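
            A minimal sketch of what I mean by containerizing (the image tag and file names are only examples):

            # run pip and the app inside an official Python image instead of the host Python
            docker run --rm -v "$PWD":/app -w /app python:3.12 \
                sh -c "pip install -r requirements.txt && python main.py"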

    • @[email protected]
      link
      fedilink
      5
      edit-2
      2 years ago

      The only time I ever interacted with python packaging was when packaging for nixos. And I can tell you that the whole ecosystem is nuts. You have like ten package managers each with thirty different ways to do things, none of which specify dependencies in a way that can be resolved without manual input because y’all have such glorious ideas as implementing the same interface in different packages and giving each the same name and such. Oh and don’t get me started on setup.py making http requests.

    • TunaCowboy · 39 points · 2 years ago

      This is programmer humor, 95% of the people here still get defeated by semicolons, have never used a debugger, and struggle to exit vim.

      • Fushuan [he/him] · 15 points · 2 years ago

        Sometimes I wish there was a community for more advanced users, where the concept of deciding on the best build toolchain per project is not a major hurdle. Venvs? Nbd. Pipenv? Nbd. Conda/mamba/micromamba? Nbd. Pure pip? Oh boy, I hope it’s a simple one, but I’ll manage. Maven? Fml, but sure. Npm? Sure. “Complex” git workflows? No problem.

        Idk, that’s just setting up the work environment; if your brain gets squeezed by that, I’m not sure you’ll then be able to actually code whatever is being asked of you. Some people…

        But yeah, this is a newbie space so I guess that we have to ignore some noise.

        • jelloeater · 1 point · 2 years ago

          Seriously, I usually use Poetry these days for most projects. Shit just works, builds well, and lets me distribute my code on PyPI just fine. Everything in one pyproject.toml.
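
          For anyone curious, the basic flow is roughly this (project and package names are placeholders):

          # one-time setup of a new project; this creates the pyproject.toml
          poetry new myproject
          cd myproject

          # add a dependency; Poetry resolves it and pins it in poetry.lock
          poetry add requests

          # build wheels/sdist and publish to PyPI
          poetry build
          poetry publish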

    • @[email protected]OP
      link
      fedilink
      English
      172 years ago

      This article someone linked is not 14 years old and it perfectly describes the mess python and pip are: https://chriswarrick.com/blog/2023/01/15/how-to-improve-python-packaging/

      My favorite part is:

      Most importantly: which tool should a beginner use? The PyPA has a few guides and tutorials, one is using pip + venv, another is using pipenv (why would you still do that?), and another tutorial that lets you pick between Hatchling (hatch’s build backend), setuptools, Flit, and PDM, without explaining the differences between them

      But yes, following old blog posts is the issue.

            • @[email protected]
              link
              fedilink
              52 years ago

              Friend, while I appreciate the time and effort on the docs, it has a rather tiny section on one of the truly worst aspects of pip (and the only one that really guts usability): package conflicts.

              Due to the nature of Python as an interpreted language, there is little that you can check in advance via automation around “can package A and package B coexist peacefully with the lowest common denominator of package X”? Will it work? Will it fail? Run your tool/code and hope for the best!

              Pip is a nightmare with larger, sprawling package solutions (i.e. a lot of the ML work out there). But even with the freshest of venv creations, things still go remarkably wrong rather quickly in my experience. My favorite is when someone, somewhere in the dependency tree forgets to lock their version, which ends up blossoming into a ticking time bomb before it abruptly stops working.

              Hopefully, your experiences have been far more pleasant than mine.

      • @[email protected]
        link
        fedilink
        22 years ago

        If you’re using a manually managed venv, you need to remember to activate it, or to use the appropriate Python.
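
        Roughly, either of these does the job (the venv path and script name are just examples):

        # option 1: activate the venv for the shell session
        source .venv/bin/activate
        python app.py

        # option 2: skip activation and call the venv's interpreter directly
        .venv/bin/python app.py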

        That really doesn’t seem like a big ask.

        I’ve been using python professionally for like 10 years and package management hasn’t really been a big problem.

        If you’re doing professional work, you should probably be using docker or something anyway. Working on the host machine is just asking for “it works on my machine what do you mean it doesn’t work in production?” issues.

        • @[email protected]OP
          link
          fedilink
          English
          12 years ago

          No, actually most devs don’t use docker like that. Not java devs, not JS devs, not rust devs. That is because maven, npm and cargo manage dependencies per project. You use it for python exactly because pip does it the wrong way and python has big compatibility issues.

    • @[email protected]
      link
      fedilink
      42 years ago

      I have to agree. I maintain and develop packages in Fortran/C/C++ that use Python as a user interface, and in my experience pip just works.

      You only need to throw together a ≈30 line setup.py and a 5 line bash script and then you never have to think about it again.
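
      The bash script really is trivial; a hypothetical version of it is just:

      #!/usr/bin/env bash
      # hypothetical helper: make a fresh venv and install the package into it,
      # which builds the compiled extensions via setup.py
      set -e
      python3 -m venv .venv
      .venv/bin/pip install --upgrade pip
      .venv/bin/pip install .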

      • Scribbd · 4 points · 2 years ago

        If we talk about solutions: python has plenty. Which might be overwhelming to the user.

        I use Direnv to manage my python projects. I just have to add layout pyenv 3.12.0 on top and it will create the virtual environment for me. And it will set my shell up to use that virtual environment as I enter that directory. And reset back to default when I leave the directory.

        But you could use pipenv, poetry, pdm, conda, mamba for your environment management. Pip and python do not care.
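
        For reference, the whole per-project setup is an .envrc like this (pick whatever Python version you need), plus a one-time direnv allow in that directory:

        # .envrc in the project root; direnv creates and activates a pyenv-backed
        # virtual environment whenever you enter the directory
        layout pyenv 3.12.0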

  • luky · 18 points · 2 years ago

    I will get hated for this, but: cargo > composer > pip > npm

  • @[email protected]
    link
    fedilink
    32
    edit-2
    2 years ago

    I don’t know what cargo is, but npm is the second worst package manager I’ve ever used after nuget.

    • Pxtl · 3 points · 2 years ago

      What’s wrong with nuget? I have to say I like the “I want latest” / “no, all your dependencies are pinned; if you want to update to latest you gotta decide to do it” workflow. I can think of some bad problems when you try to do fancy things with it, but for the basic case of “I just want to fetch my program’s dependencies” it’s fine.
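
      Concretely, that workflow with the dotnet CLI looks something like this (package and version are just examples):

      # add a dependency; the chosen version gets pinned in the project file
      dotnet add package Newtonsoft.Json --version 13.0.3

      # nothing changes until you explicitly check what's outdated and bump it
      dotnet list package --outdated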

      • Lucky · 2 points · 2 years ago

        I’m guessing they only used it 10 years ago, when it was very rough around the edges. It didn’t integrate well with the old .NET Framework because it conflicted with how web.config managed dependencies, and it had poor integration with VS. It was quite bad back then… but so was .NET Framework in general. Then they rebuilt it from the ground up with dotnet core and it’s been rock solid since.

        Or they just hate Microsoft; it’s a common motif to shit on anything Microsoft does regardless of the actual product.

        • Pxtl · 2 points · 2 years ago

          Imho the VS integration has always been good; it’s the web.config that’s always been a trash fire, and that’s not new.

          • Lucky · 1 point · 2 years ago

            The project I’m on right now originally had nuget.exe saved in source control because they had to run it manually through build scripts; it wasn’t built into VS until VS2012.

    • Lucky · 11 points · 2 years ago

      I’ve never had an issue with nuget, at least since dotnet core. My experience has it far ahead of npm and pip

      • @[email protected]
        link
        fedilink
        English
        9
        edit-2
        2 years ago

        I’ll second this. I would argue that .Net Core’s package/dependency management in general is way better than Python or JavaScript. Typically it just works and when it doesn’t it’s not too difficult to fix.

        • @[email protected]
          link
          fedilink
          22 years ago

          It’s also much faster to install packages than npm or pip since it uses a local package cache and each package generally only has a few DLL files inside.

  • @[email protected]
    link
    fedilink
    232 years ago

    Memes like this make me ever more confused about my own software workflow. I’m in engineering, so you can already guess my coding classes were pretty surface level, at least at my uni and CC.

    Conda is what I like to use for data science, but I still barely understand how to maintain a package manager. I’m lowkey a bot when it comes to using non-GUI programs, and tbh that paradigm shift has been hard after 18 years of no CLI usage.

    The memes are pretty educational though

    • @[email protected]
      link
      fedilink
      382 years ago

      Try not to learn too much from memes, they’re mostly wrong. Conda is good, if you’re looking for something more modern (for Python) I’d suggest Poetry

      • @[email protected]
        link
        fedilink
        52 years ago

        Never heard of Poetry, but I’ll check it out tonight! I pretty much exclusively coded in Python and Julia up until I got out of uni. I learned, after a couple of months of insanity, that swapping kernels, init systems and distributions and learning everything about file systems only leads to further insanity and hindered productivity.

        Something something recommending someone who doesn’t know what a shell is to use emacs and make a Lua/Neovim config. Thanks for the tip!

        • @[email protected]
          link
          fedilink
          English
          22 years ago

          100% this. I remember really really trying to get the hang of them and eventually just giving up and doing it manually every time. I somehow always eventually mess something up or god forbid someone who isn’t me messes it up and I end up spending 4 hours dependency hunting. Venv and pip while still annoying are at least reliable and dead simple to use.

          However, a container is now my preferred way of sharing software for at least the past 6 years.

          • @[email protected]
            link
            fedilink
            12 years ago

            Yup. A container is slow to rebuild, but it’s at least the most robust option. This is my preferred way to share Python code when there are system dependencies involved.

  • ɐɥO · 37 points · 2 years ago

    npm is just plain terrible. It has never worked for me on the first try without doing weird stuff.

  • Tóth Alfréd · 16 points · 2 years ago

    What about CPAN?

    You can’t even use it without the documentation of the program that you want to install because some dependencies have to be installed manually, and even then there’s a chance of the installation not working because a unit test would fail.

  • Skull giver · 54 points · 2 years ago

    Cargo is a pretty good tool, but it’ll happily fill up your disk with cached copies of the crates you’ve downloaded, somewhere in your user folder. Using a modern fs like BTRFS with extent deduplication helps to save space, but it doesn’t solve the problem.

    pip and npm are practically equally bad, at least the maintained versions. If you’re using old versions that haven’t been supported for years (Python 2 etc.) then you’re in for a world of pain with both.

    There are much worse alternatives. Anaconda, for example, takes half an hour to resolve dependencies while also autoloading itself into your shell, adding up to half a second (I timed this! I thought zsh was bugged!) of latency between shell prompts.

    The worst tool is probably what many C(++) projects seem to do: use Git as a dependency manager by including entire git repos as submodules. I’m pretty annoyed at having to keep multiple versions of tokio around when building a Rust project, but at least I’m not downloading the entire commit history for boost every time I clone a project!
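
    (A side note: when you do have to deal with those submodule-heavy projects, shallow clones at least avoid pulling every dependency's full history; the URL below is just a placeholder.)

    # clone a project and its submodules with depth 1 instead of full history
    git clone --recurse-submodules --shallow-submodules https://example.com/project.git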

    • @[email protected]
      link
      fedilink
      52 years ago

      Python virtual environments feel really archaic. It’s by far the worst user experience I’ve had with any kind of modern build system.

      Even a decade ago in Haskell, you only had to type cabal sandbox init once, rather than source venv/bin/activate every time you cd into the project dir.

      I’m not really a python guy, but having to start touching a python project at work was a really unpleasant surprise.

    • @[email protected]
      link
      fedilink
      102 years ago

      cached copies of crates that you downloaded

      Meh, what else is it supposed to do, delete sources all the time? Then people with slow connections will complain.

      Also, size-wise that’s actually not even much (though they could take the care to compress it); what actually takes up space with Rust is compile artifacts, per workspace. Have you heard of kondo?

      • @[email protected]
        link
        fedilink
        12 years ago

        Idk, maybe you can share the common packages across projects. (That can never go wrong, right? /s)

            • @[email protected]
              link
              fedilink
              22 years ago

              You can globally share compile artifacts by setting a global target directory in the global Cargo config.

              In $HOME/.cargo/config.toml:

              [build]
              target-dir = "/path/to/dir"
              

              The only problems I had when I did it were some cargo plugins and some dependencies with build.rs files that expected the target folder in its usual location.

    • @[email protected]
      link
      fedilink
      62 years ago

      I actually vastly prefer this behavior. It allows me to jump to (readable) source in library code easily in my editor, as well as experiment with different package versions without having to redownload, and (sort of) work offline too. I guess, I don’t really know what it would do otherwise. I think Rust requires you to have the complete library source code for everything you’re using regardless.

      I suppose it could act like NPM, and keep a separate copy of every library for every single project on my system, but that’s even less efficient. Yes, I think NPM only downloads the “built” files (if the package uses a build system & is properly configured), but it’s still just minified JS source code most of the time.

      • @[email protected]
        link
        fedilink
        62 years ago

        With python and virtualenv you can also keep the entire source of your libraries in your project.

  • @[email protected]
    link
    fedilink
    102 years ago

    The only time I’ve had issues with pip is when using it to install the xonsh shell, but that’s not really pip’s fault since that’s a very niche case, and I wouldn’t expect any language’s package manager to handle installing something so fundamental anyways.

    • @[email protected]
      link
      fedilink
      52 years ago

      It’s all fun and games until the wheel variant you need for your hardware acceleration package conflicts with that esoteric math library you planned on using.

      • @[email protected]
        link
        fedilink
        12 years ago

        This isn’t a pip issue though. Either these packages work together and the packaging is wrong, or they don’t work together.