I am building a Python library that will be deployed as a package on PyPI. I'm using Setuptools as the build system. The library is supposed to work with TensorFlow as a dependency, so it is straightforward to add it to the project:
[project]
...
dependencies = [
"tensorflow >= 2.11.0",
]
Now, the problem is that my library works with both tensorflow and tensorflow-gpu, and I would like to deliver a package that has tensorflow as the default dependency while letting the user select a different "flavor" (in my case tensorflow-gpu, to take advantage of the GPU for tensor computations).
I've gone through the Setuptools documentation but couldn't find any help.
As a possible solution, I thought that the [project.optional-dependencies] section could solve my problem, for example by setting a field as follows:
[project.optional-dependencies]
gpu = ["tensorflow-gpu >= 2.11.0"]
I could then run pip install myLibrary[gpu] to also include the optional dependencies. However, the required dependencies (i.e. tensorflow) would always be installed, and that is not the behavior I'm expecting.
The other, more drastic way of solving this would be to ship two different packages (myLibrary and myLibrary-gpu), but I don't think that is an elegant way of managing this situation, especially because the code inside the library is exactly the same for both versions.
Any help on that would be very much appreciated!
I ran into this same problem, and it was hard to find an elegant solution. This is the best one I have so far: copy the code below into setup.py and modify it so that it appends the right TensorFlow package to the dependency list depending on the subprocess output. Since Python 3 ships with subprocess, no extra dependency is needed.
Note: this is not a general solution, because running nvidia-smi may not be possible for everyone due to permission issues.
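Here is a minimal sketch of what that setup.py could look like, assuming GPU detection is done by calling nvidia-smi and reusing the package name and version pin from the question (myLibrary, TensorFlow >= 2.11.0); adapt the detection logic and metadata to your project:

# setup.py -- sketch: pick the TensorFlow flavor at install/build time
import subprocess

from setuptools import setup, find_packages


def gpu_available() -> bool:
    # Treat a successful nvidia-smi run as "an NVIDIA GPU driver is present".
    # This is an assumption: the binary may be missing or not runnable
    # (e.g. due to permissions), in which case we fall back to CPU TensorFlow.
    try:
        subprocess.run(
            ["nvidia-smi"],
            check=True,
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return True
    except (subprocess.CalledProcessError, FileNotFoundError, OSError):
        return False


# Append the matching TensorFlow package to the dependency list.
install_requires = (
    ["tensorflow-gpu >= 2.11.0"] if gpu_available() else ["tensorflow >= 2.11.0"]
)

setup(
    name="myLibrary",  # hypothetical name taken from the question
    version="0.1.0",
    packages=find_packages(),
    install_requires=install_requires,
)

Keep in mind that this decision is made when the package is built or installed from source, so it is best suited to sdist-based installs where setup.py actually runs on the user's machine.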