At the top of a requirements-*.in file are lines with -c and -r flags, each followed by the path to another requirements-*.in file. These paths are relative (URLs are also possible, but ignored here).

Say we have docs/requirements-pip-tools.in:

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this file would produce docs/requirements-pip-tools.txt
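
With pip-tools installed, the compile step might look like this (a minimal sketch; -o names the output file, and the -r/-c paths inside the .in file resolve relative to that file’s directory):

pip-compile -o docs/requirements-pip-tools.txt docs/requirements-pip-tools.in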

But there is confusion as to which flag to use. It’s non-obvious.

constraint

A constraints file supports a subset of the features of a requirements file. It is intended to restrict package versions. It does not necessarily install the package (it might not)!

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
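
To see the two flags side by side at install time (file names are illustrative):

pip install -r requirements.txt -c constraints.txt

Everything in requirements.txt is installed; entries in constraints.txt only pin versions and never cause an install on their own.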

Personal preference

  • always organize requirements files in folder(s)

  • don’t prefix requirements files with requirements-; I’m only doing it here for clarity

  • DRY principle applies; split out constraints which are shared (see the folder sketch below).
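
For example, the layout implied by the example above (illustrative, not a prescription):

requirements/
    requirements-prod.in
    requirements-pins-base.in
    requirements-pins-cffi.in
docs/
    requirements-pip-tools.in
    requirements-pip-tools.txt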

  • Michal@programming.dev:

    Requirements are literally the packages your project requires to run, down to a specific version if you wish.

    Constraints specify what version of a package to install IF the package is required by your requirements, or by a transitive requirement (required by packages you require). If a package is not required, the constraint is not used.

    I tend to use a requirements file to list the direct dependencies of my project and their versions. Constraints are useful to pin down transitive dependencies to make sure they’re not accidentally upgraded (repeatable builds). Also, if a third-party package drops a requirement, you don’t have to worry that it will still be installed just because it’s still in your constraints. It simply won’t be installed.
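
    To illustrate (requests is real; the pinned versions are hypothetical):

    # requirements.txt -- direct dependencies only
    requests

    # constraints.txt -- transitive pins, applied only if something requires them
    urllib3==2.2.1
    idna==3.7

    pip install -r requirements.txt -c constraints.txt

    If requests ever dropped urllib3, the pin would simply go unused rather than keep urllib3 installed.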

  • spoonbill@programming.dev:

    Constraints are useful for restricting the build dependencies of your dependencies, especially if they follow PEP 518.
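
    For instance, a sketch (this relies on pip honoring the PIP_CONSTRAINT environment variable inside isolated PEP 518 build environments; the pinned version is hypothetical):

    # constraints-build.txt -- pin a build backend that broke a build
    setuptools<69

    PIP_CONSTRAINT=constraints-build.txt pip install .

    Plain -c does not reach into the isolated build environment; the environment variable does.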

    • logging_strict@programming.dev (OP):

      Was working under the assumption that everyone considered constraints (-c) a non-negotiable, required feature.

      If you only have requirements (-r) in a centralized pyproject.toml, then how do you tackle multiple specific dependency hell issues without causing a huge amount of interconnected clutter?

        • logging_strict@programming.dev (OP):

          Within the context of resolving dependency conflicts, poetry decided pyproject.toml is a great place to put requirements.

          This is what people know.

          pyproject.toml or venv management should otherwise never come into the conversation.

          My personal opinion is: venv, pip, pyenv, pip-tools, and tox are sufficient to manage venvs.

          venvs are not required to manage requirement files. They’re a convenience so dev tools are accessible.

          Currently the options are: poetry or uv.

          With honorable mention to pip-compile-multi, which locks dependencies.

          poetry and uv manage venvs… Why?

          • spoonbill@programming.dev:

            I was asking why you need to have a centralized pyproject.toml file, which is apparently why you need constraint files? Most people don’t have this workflow, so they are not even aware of constraint files, much less see them as a must-have.

            • logging_strict@programming.dev (OP):

              I totally agree with you. So I’m not the best champion of the poetry approach. Someone else would need to step forward, even as devil’s advocate, and champion poetry. Even if tongue in cheek. Anyone?

              Normally, there is no connection between constraint files and pyproject.toml.

              Python appears to be forever stuck with plain text requirement|constraint files. So putting them into pyproject.toml is just adding an extra layer of complexity.

              • spoonbill@programming.dev:

                If most people prefer pyproject.toml over requirements.txt, even if it does not support everything you need, isn’t it more likely that you will have to change your workflow rather than Python remaining stuck with requirements.txt?

          • Eager Eagle@lemmy.world:

            Are you really asking why use 1 tool instead of 5?

            venvs and dependency management are such interconnected concepts, I don’t even know how you could sustainably handle them separately.

            • logging_strict@programming.dev (OP):

              UNIX philosophy: one tool that does one thing well.

              Best to have a damn good reason when breaking this principle (e.g. vendoring), or be funded by Money McBags.

              requirements files are requirements files, not venvs. They may install into a venv, but they are not venvs themselves. The only things a venv provides that are of interest to your requirements files are the relative folder path (e.g. ‘.venv’) and the Python interpreter path. Nothing more. When using tox, the Python version is hardcoded, so you only need to provide the relative folder path.
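
              For instance, a sketch of how tox consumes a requirements file (the env name and paths are illustrative; the venv itself is tox’s business):

              [testenv:docs]
              basepython = python3.9
              deps = -r docs/requirements-pip-tools.txt
              commands = sphinx-build docs docs/_build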

              The venv management tools we have are sufficient. The problem is not the venv, it’s managing the requirements files.

              Your 1 tool sucks just as much as my 5 tools when it comes to managing requirement files. None of them do the job.

              • Eager Eagle@lemmy.world:

                The Python ecosystem has been trying this multiple-tools approach for decades and consistently delivering a worse experience than languages that pack most things into one tool.

                Rust is bliss to use, largely thanks to cargo, which takes care of build, dependencies, locking, tests, publishing, etc. You say do one thing and do it well; in my experience these tools often do one thing in a mediocre way, while forcing users to understand which of dozens of possible tools to combine, and how, in a development environment that keeps changing. It’s messy, slow, error prone, and requires constant developer attention.

                • logging_strict@programming.dev (OP):

                  Most languages don’t support packages containing multiple languages (C/C++, Cython, and Python). So the Python situation is much more complex.

                  distutils

                  setuptools is complex

                  pip is complex

                  requirements files are complex

                  space aliens wrote pytest (and pluggy)

                  publishing and dependencies are super centralized, depending on pypi.org way too much.

                  Comparing Rust vs Python is nonsense. Rust is a stricter compiler on top of C; it only has to deal with legacy C libraries. It has it very, very easy.

    • logging_strict@programming.dev (OP):

      My position is it’s not messy enough.

      Let’s start off by admitting what the goal is.

      We all want to avoid dependency hell.

      Our primary interest is not merely cleaning up the mess of requirements files.

      Cleaning up the mess results in some unintended consequences:

      1. noise
      2. complexity
      3. confusion

      noise

      All the requirements information is in one place. Sounds great, until you want to tackle and document very specific issues.

      Like when Sphinx dropped support for py39: myst-parser restricted the Sphinx upper-bound version, fixed it in a commit, but did not create a release.

      Or cffi, where every single commit just blows our minds, adding support for things we all want. So we want to set a lower-bound cffi version.

      My point being, these are all specific issues and should be dealt with separately. And when an issue is no longer relevant, you know exactly what to remove. Zero noise.
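
      A sketch of what such split-out pin files might look like (the version bounds are hypothetical):

      # requirements-pins-base.in -- myst-parser capped Sphinx; drop once a release lands
      sphinx<8

      # requirements-pins-cffi.in -- want features from recent cffi commits
      cffi>=1.17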

      complexity

      When things go horribly wrong, the wrapper gets in the way. So now you have to deal with both the wrapper and the issue. So there is a learning curve, an API interface to learn, and increased required know-how.

      The simple answer here is, do not do that.

      confusion

      When a dependency hell issue arises, we have to deal with it and find ourselves drawn to the poetry or uv documentation. The issue has nothing to do with either. But we look towards them to see how others solve it, in the poetry or uv way.

      The only know-how that should be needed is what’s in the pip docs.

      What’s your suggestion?

      I would prefer to deal with dependency hell before it happens. To do this, the requirements files are broken up, so they are easier to deal with.

      Centralizing everything into pyproject.toml does the opposite.

      Rather than dealing with the issue beforehand, get to deal with it good and hard afterwards.

      • logging_strict@programming.dev (OP):

        Woah! I was giving you the benefit of the doubt. You blow my mind.

        The locking is very, very specific to apps and the dev environment.

        But lacking constraints is like cutting off an arm.

        • spoonbill@programming.dev:

          My only use case so far has been fixing broken builds when a package has build dependencies that don’t actually work (e.g. a dependency of a dependency breaks stuff). Not super common, but it happens.

        • logging_strict@programming.dev (OP):

          That’s a loaded question. I would like to avoid answering at the moment. It would lead to a package release announcement, which this post is not; I’m not prepared to write one right now.

          Instead, here is an admittedly unsatisfactory response, which I apologize for.

          I wish to have the option to, later, take it back and give the straight, exact answer which your question deserves.

          My use case is your use case and everyone else’s use case.

          Avoiding dependency hell while keeping things easily manageable. Breaking up complexity into the smallest pieces possible. And having a CLI tool to fix what’s fixable while reporting on what’s not.

          My preference is to do this beforehand.

  • logging_strict@programming.dev (OP):

    A package’s requirements are left unlocked

    An app’s requirements are locked

    This doesn’t excuse app devs if a requirements.in file is not provided.

    e.g. pip freeze > requirements.txt and forget

    This produces a lock file, including indirect packages. The information about which packages are direct dependencies is lost if a requirements.in is not provided.
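
    A sketch of the difference (package names and versions are illustrative):

    pip freeze > requirements.txt captures everything in the venv:

    certifi==2024.7.4
    charset-normalizer==3.3.2
    idna==3.7
    requests==2.32.3
    urllib3==2.2.1

    Nothing marks which of these was asked for directly. A requirements.in preserves that:

    # requirements.in -- direct dependencies only
    requests

    pip-compile requirements.in then regenerates the full lock file, annotating each pin with which package pulled it in (# via requests).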