Package Addons on conda-forge
This will likely come after we finalize our build automation and toolchain, but I wanted to start an issue for visibility. When the time comes we will enable Anaconda installs using conda-forge.
cc @kyleabeauchamp
As a starting point, I made a WIP PR to get a conda package for TF2 nightly: https://github.com/conda-forge/tensorflow-feedstock/pull/68
Can I help with anything?
Hi @jjhelmus -- we're starting the process of packaging tensorflow/addons in conda and wanted to ask a few questions to see if you could guide us.
- Is conda-forge the best channel for us?
- It looks like there is a conda-forge/tensorflow and anaconda/tensorflow. Which dependency should we build against? (I think the conda-forge docs say to avoid external channel dependencies?)
- Do you have an idea what the lag time is for tensorflow packages to show up on conda? (i.e. how long after tf2 release will we be able to build our conda package which depends on tf2?) It looks like conda-forge/tensorflow is still 1.13
- Will there be issues with GPU support given the state of NVIDIA licensing (which doesn't accommodate OSS projects like conda-forge)? I'm 90% sure the answer is yes.
- We're definitely in a strange place with wanting to support both CPU and GPU work. I think this is a very good question that we need to resolve.
- The lag time for conda-forge is about an hour after the pull request is merged. However, the review cycle can take up to a week.
PS I still think it would be good to see if @jjhelmus agrees.
As a conda-forge maintainer (not part of core though), I would answer:
- Yes, it is. This is where nearly all community-led conda package development happens, except for specialized things like bioconda.
- For CUDA in conda-forge and the legal problems see https://github.com/conda-forge/conda-forge.github.io/issues/63 and https://github.com/conda-forge/conda-forge.github.io/issues/687
- Review cycles can take a week on complicated PRs, but normally, once CI has passed, it's more around 1-2h during European and US daytime.
- conda-forge is the largest community-led packaging community and would make an ideal channel to use and contribute to (note that I am on the conda-forge core team so I might be biased). Another option is to create your own channel and publish packages there. Some organizations like `pytorch` and `rapids` find this flexibility useful. Using your own channel will require you to set up the necessary infrastructure for producing packages, and users may find it harder to find your packages than if they were on conda-forge.
- This depends on how addons interacts with tensorflow. If the interaction is via Python only then either channel should be fine. If you make use of the C/C++ components then you will need to take care to match the ABI used by the channel (see the sketch after this list). With the newer 1.13 builds conda-forge's linux-64 package is built from source using the conda compilers. The osx-64 and win-64 packages are repackaged wheels. The packages in `defaults` have been built from source using the conda-provided compilers for a number of versions. Only `defaults` provides a GPU package at this time. The package name is the same, `tensorflow`, but the `tensorflow-gpu` metapackage can be used to select the GPU variant. Please avoid using the `anaconda` channel; it is a mixture of the two subchannels which make up `defaults`, but without any subchannel priority, which is needed for reliable solutions.
- For conda-forge packages, the time between a release and conda packages depends entirely on volunteers. For `defaults` the time depends on how the package is prioritized by Anaconda, Inc. Typically packages are built within 2 weeks, but this depends greatly on the difficulty of the build and other team priorities. Given the large number of changes in TF 2.0, I would not be surprised if the lag for that release is longer than normal.
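For anyone who wants to check the ABI point above concretely, here is a minimal sketch (assuming TensorFlow is already installed in the active environment and exposes the standard `tf.sysconfig` helpers):

```python
# Minimal sketch: inspect the compile/link flags of the installed TensorFlow
# package to see which C++ ABI it expects. Custom ops (like those in Addons)
# must be built with the same -D_GLIBCXX_USE_CXX11_ABI value, otherwise
# linking against libtensorflow_framework.so can break at load time.
import tensorflow as tf

compile_flags = tf.sysconfig.get_compile_flags()
print("Compile flags:", compile_flags)
print("Link flags:   ", tf.sysconfig.get_link_flags())

abi_flags = [flag for flag in compile_flags if "_GLIBCXX_USE_CXX11_ABI" in flag]
print("ABI flag(s):  ", abi_flags)
```

Running this against the package you actually link to is probably the quickest way to see whether the `defaults` and conda-forge builds expect the same ABI.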
Thanks for all the feedback, everyone. So to answer one question: Addons does indeed utilize the pre-compiled libtensorflow_framework.so for our C++ op linking.
Just want to verify that my understanding of the above information is correct:
- It seems like packaging in conda-forge will be ideal
- We should link to the `defaults` channel's `tensorflow`, making sure we match the ABI that TF was compiled against
- Should we also encourage users to install `cudatoolkit` if they want GPU support? Currently the plan is to link to `tensorflow-gpu` and use CPU fallback in case it is not available (a sketch of this fallback idea follows the list). In 2.1 these packages will be consolidated.
- We are blocked on this issue until a full TF2 release is made and a subsequent package is distributed on conda
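To make the fallback item above concrete, here is a hypothetical sketch of the idea; the shared-object name and op name are illustrative only, not the actual Addons loader:

```python
# Hypothetical sketch of the "CPU fallback" idea: prefer the compiled custom-op
# kernels when the shared object can be loaded, and otherwise fall back to
# stock TF ops that work on any install. The file name "_rotate_ops.so" and
# the op name are illustrative, not the real Addons layout.
import tensorflow as tf

try:
    _rotate_ops = tf.load_op_library("_rotate_ops.so")
    _HAVE_CUSTOM_KERNELS = True
except (tf.errors.NotFoundError, OSError):
    _rotate_ops = None
    _HAVE_CUSTOM_KERNELS = False

def rot90(images):
    """Rotate a batch of images by 90 degrees."""
    if _HAVE_CUSTOM_KERNELS:
        # Fast path: compiled kernel linked against libtensorflow_framework.so
        # (illustrative call -- the real op name would differ).
        return _rotate_ops.rot90(images)
    # Fallback path: built from stock TF ops, runs on a CPU-only install too.
    return tf.image.rot90(images)
```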
Should we add a "WIP" or "blocked" label or something like that?
This is now unblocked as TF2 is on conda.
Hi all, what's the status of this?
> Hi all, what's the status of this?
This is quite a bit more difficult than we originally suspected. A more complete investigation can be found in SIG IO, which has the same goal: https://github.com/tensorflow/io/issues/676
A more immediate fix would be the completion of https://github.com/tensorflow/community/pull/133 so that you can have a conda-installed TF and a pip-installed TFA.
Okay, I see, many thanks. The problem is rather that we have a downstream package, zfit, which we would also like to publish on conda-forge (or rather, which we do, but adding TFA as a dependency breaks this).
Given our availability, we have decided to let the community package Addons on Conda. There are several channels: https://anaconda.org/search?q=Addons+tensorflow
Can we maybe reopen this? The problem with all these channels is that they are not in conda-forge. In practice, many people have conda-forge added to their default channels. Packages in conda-forge cannot just depend on other channels. So to have a downstream package in conda-forge, in reality, the upstream package is also needed in conda-forge.
Since there are many channels, it would be good to let the community move it together. Can we maybe use this thread to organize this?
For those interested, I opened a PR to add this to conda-forge: https://github.com/conda-forge/staged-recipes/pull/16888
Hey all
I managed to get this to build and run the test suite; there are only a handful of tests that fail with some off-by-one errors. I'd really appreciate some input from the maintainers about these (and you'd be very welcome to co-maintain the conda-forge package! 😊)
```
=========================== short test summary info ============================
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-float32]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-float64]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-int32]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-uint8]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-int64]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-eager_mode-float16]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-float32]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-float64]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-int32]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-uint8]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-int64]
FAILED tensorflow_addons/image/tests/transform_ops_test.py::test_rotate_even[cpu-tf_function-float16]
==== 12 failed, 1872 passed, 572 skipped, 318 warnings in 396.17s (0:06:36) ====
```
Here's a CI run that has the full logs.
Sample error
```
__________________ test_rotate_even[cpu-tf_function-float16] ___________________

dtype = tf.float16

    @pytest.mark.usefixtures("maybe_run_functions_eagerly")
    @pytest.mark.parametrize("dtype", _DTYPES)
    def test_rotate_even(dtype):
        image = tf.reshape(tf.cast(tf.range(36), dtype), (6, 6))
        image_rep = tf.tile(image[None, :, :, None], [3, 1, 1, 1])
        angles = tf.constant([0.0, np.pi / 4.0, np.pi / 2.0], tf.float32)
        image_rotated = transform_ops.rotate(image_rep, angles)
        np.testing.assert_equal(
            image_rotated.numpy()[:, :, :, 0],
            [
                [
                    [0, 1, 2, 3, 4, 5],
                    [6, 7, 8, 9, 10, 11],
                    [12, 13, 14, 15, 16, 17],
                    [18, 19, 20, 21, 22, 23],
                    [24, 25, 26, 27, 28, 29],
                    [30, 31, 32, 33, 34, 35],
                ],
                [
                    [0, 3, 4, 11, 17, 0],
                    [2, 3, 9, 16, 23, 23],
                    [1, 8, 15, 21, 22, 29],
                    [6, 13, 20, 21, 27, 34],
                    [12, 18, 19, 26, 33, 33],
                    [0, 18, 24, 31, 32, 0],
                ],
                [
                    [5, 11, 17, 23, 29, 35],
                    [4, 10, 16, 22, 28, 34],
                    [3, 9, 15, 21, 27, 33],
                    [2, 8, 14, 20, 26, 32],
                    [1, 7, 13, 19, 25, 31],
>                   [0, 6, 12, 18, 24, 30],
                ],
            ],
        )
E       AssertionError:
E       Arrays are not equal
E
E       Mismatched elements: 2 / 108 (1.85%)
E       Max absolute difference: 1.
E       Max relative difference: 0.04761905
E        x: array([[[ 0.,  1.,  2.,  3.,  4.,  5.],
E              [ 6.,  7.,  8.,  9., 10., 11.],
E              [12., 13., 14., 15., 16., 17.],...
E        y: array([[[ 0,  1,  2,  3,  4,  5],
E              [ 6,  7,  8,  9, 10, 11],
E              [12, 13, 14, 15, 16, 17],...

tensorflow_addons/image/tests/transform_ops_test.py:261: AssertionError
```
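If it helps with triage, here is a small diagnostic sketch, not part of the test suite; my assumption is that the off-by-one differences come from interpolation rounding, and the 90-degree case is easy to check because it has an exact reference in `np.rot90`:

```python
# Diagnostic sketch: compare the 90-degree rotation from the Addons op against
# np.rot90 (an exact reference) to see whether the off-by-one mismatch also
# affects the exact rotations or only the interpolated 45-degree case.
import numpy as np
import tensorflow as tf
from tensorflow_addons.image import transform_ops

image = tf.reshape(tf.cast(tf.range(36), tf.float16), (6, 6))
image_rep = tf.tile(image[None, :, :, None], [3, 1, 1, 1])
angles = tf.constant([0.0, np.pi / 4.0, np.pi / 2.0], tf.float32)
rotated = transform_ops.rotate(image_rep, angles).numpy()[:, :, :, 0]

diff_90 = np.abs(rotated[2].astype(np.float64) - np.rot90(image.numpy()).astype(np.float64))
print("max |diff| for the 90-degree rotation:", diff_90.max())
```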
@seanpmorgan can we re-open this issue maybe?
@kyleabeauchamp could you maybe re-open this issue since there is progress by @h-vetinari?
Hi @h-vetinari, thank you very much for your work on this. I've re-opened this issue seeing the progress you've made on the conda-forge PR. Given our maintainer bandwidth on this repo, we'll be unable to support this ourselves, but we're happy to onboard someone who would like to own this feature. We know this is a commonly requested feature. Please keep us updated on the progress.
@h-vetinari any news on this?
> @h-vetinari any news on this?
Not so far, sorry. I didn't get any feedback from the maintainers here on the errors I noted above. But I can try to revive the PR and cross my fingers that those are gone with a newer version.
Ah, sorry, I had misremembered the situation from https://github.com/conda-forge/staged-recipes/pull/16888. Letting Bazel find a pre-installed (i.e. conda-forge-packaged) tensorflow looked to be extremely painful, and I didn't try to bash my head against this particular wall. Happy to receive help/guidance on this, of course!
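For context, the pieces of information the build would need from a pre-installed TF can at least be queried from Python via the same `tf.sysconfig` helpers mentioned further up; wiring these values into the Bazel workspace is the part that proved painful (a sketch, not a solution):

```python
# Sketch: locate an already-installed TensorFlow (e.g. the conda-forge package)
# from Python. Feeding these paths/flags into the Bazel build used by Addons
# is the hard part described above.
import tensorflow as tf

print("Header dir: ", tf.sysconfig.get_include())   # C++ headers of the installed TF
print("Library dir:", tf.sysconfig.get_lib())        # contains libtensorflow_framework.so
print("Compile:    ", tf.sysconfig.get_compile_flags())
print("Link:       ", tf.sysconfig.get_link_flags())
```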
TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision: TensorFlow Addons Wind Down
Please consider sending feature requests / contributions to other repositories in the TF community with similar charters to TFA: Keras, Keras-CV, and Keras-NLP.