Add dependencies to pyproject.toml
#1417
Right now we're mostly only supporting conda-forge based installs, although I see little harm in doing this. We mostly provide requirements.txt for people either building from source (which we mostly discourage) or for people running in colab.

My understanding, based on the last time I read about these packaging changes, is that pyproject actually can work fine alongside requirements.txt - one doesn't need to pick one or the other, and the tooling will generally use both correctly if both are present. Is that no longer the case?

I think we probably don't need to focus as much on legacy compatibility though, so this is probably a good thing (even though I'm not sure how high-priority it should be, given that it's unclear to me whether it actually affects anything). I'll put this on the todo list.
totally understand not wanting to take on another packaging modality due to maintenance burden, but hear me out! Even if you didn't want to upload to pypi, just adding the deps to `pyproject.toml` would be enough.
Typically when people use both, they put the "abstract" package specifications (either just the bare package name, or the package with some generous minimum bound) in `pyproject.toml`, and keep the fully pinned "concrete" versions in `requirements.txt`.
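As a sketch of that split (the package names and bounds here are illustrative, not CaImAn's actual requirements):

```toml
# pyproject.toml -- "abstract" specs: bare names or generous lower bounds
[project]
name = "caiman"
dependencies = [
    "numpy>=1.20",
    "scipy",
]
# the fully pinned "concrete" versions (e.g. numpy==1.26.4) stay in
# requirements.txt for reproducible dev/CI environments
```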
Most build backends don't read `requirements.txt`, or require additional config to do so - hatch, for example, needs a plugin to read `requirements.txt`.
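For hatch specifically, one such config sketch uses the third-party `hatch-requirements-txt` plugin to mark dependencies as dynamic and source them from `requirements.txt` (exact keys per that plugin's docs):

```toml
[build-system]
requires = ["hatchling", "hatch-requirements-txt"]
build-backend = "hatchling.build"

[project]
name = "caiman"
dynamic = ["dependencies"]

[tool.hatch.metadata.hooks.requirements_txt]
files = ["requirements.txt"]
```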
basically, not having deps specified in pyproject.toml makes it impossible to build off of CaImAn. Even without being on PyPI, if the project had its deps specified it would be possible for us to add it as a dependency as a git repo. Without that, the only thing we really can do is copy/paste, add the GPL-2 license in our repo, and add a note that we have literal-included part of caiman -- which is both awkward and also deprives y'all of the much-due credit from dependency graphs.

If I build an sdist/wheel of caiman right now, the dependencies aren't included (i.e. they're not read from `requirements.txt`).

There aren't any deps in the conda environment that actually require conda - i.e. with the exception of changing "opencv" to "opencv-python", they translate directly.

Then, having done that, i'm able to build a wheel and an sdist, and with one further command i would be able to upload the package to pypi (but i won't, because i don't want to take the name ;)

Side benefit: the deps get managed in one standard place.

Anyway, tl;dr: copy/paste ~30 lines and you make your package available to the non-conda python world! would be happy to PR
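For instance, once the deps are declared, a downstream project could depend on CaImAn straight from the git repo with a PEP 508 direct reference (the downstream project name here is hypothetical):

```toml
# hypothetical downstream pyproject.toml
[project]
name = "our-downstream-tool"
dependencies = [
    "caiman @ git+https://github.com/flatironinstitute/CaImAn.git",
]
```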
Fair enough; this stuff seems to have changed a lot over the last few years, and the guidance hasn't always been clear. I see no harm in the change. I'd accept a PR, but I'm also happy to do it myself; whatever you prefer on that front.

I think I'd probably want to avoid optional dependency groups, in order to keep it simple and also to make it clear to our direct users that we still prefer they use conda (at least, provided we don't need to revisit that decision - recently there have been some headaches on that front relating to packaging issues for tensorflow and pytorch).

(we certainly don't want to encourage vendoring - we've suffered when we've done that ourselves, for much smaller dependencies)

Whatever your decision, thanks for bringing it to our attention.
Just thought I'd pop in quickly to mention that this is easier said than done; #1391 does some of this.
Hello!
I see a previous issue on packaging re: adding to pypi - #509 - but this is sufficiently distinct, and the python packaging world has changed dramatically since 2019.
Currently the `pyproject.toml` file doesn't specify any dependencies; they are just listed in `requirements.txt`.

Moving the requirements from `requirements.txt` to `pyproject.toml` would be essentially a no-effort change that allows the package to be installed via pip/any standards-compliant package manager, and if there are differences in the GPU requirements, those could be added as an `extra`/`optional` dependency group.

Doing that should also allow you to make a PyPI-ready package for free, even if you just put source distributions up there and don't want to worry about building wheels for various platforms and versions. Most importantly, it allows people to build downstream tools on top of CaImAn. Currently that's impossible unless downstream packages add all the requirements manually, and then each such package is on the hook for maintaining a parallel set of deps. Similarly, not being on PyPI prevents people from building downstream tools except via conda, which is not really ideal for making a generalizable python tool.
Should be no additional maintenance burden (and in fact a lesser one, since you can single-source the deps in a format that all python package tooling will support for the indefinite future).
I can PR if y'all are strapped for time; I tried it locally and it works exactly as expected. :)