NVTabular
MultiGPU TensorFlow uses Keras
Updated the multi-GPU TensorFlow example to use tf.keras, since it is easier to use
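The multi-GPU tf.keras pattern this PR moves the example toward can be sketched roughly as follows. This is a minimal illustrative sketch of `tf.distribute.MirroredStrategy` with a Keras model, not code taken from the notebook itself; the layer sizes and the toy data are made up for illustration.

```python
import numpy as np
import tensorflow as tf

# One replica per visible GPU; falls back to a single CPU device if none are found.
strategy = tf.distribute.MirroredStrategy()

# The model and optimizer must be created inside the strategy scope so that
# their variables are mirrored across all replicas.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

# model.fit() then shards each global batch across the replicas automatically,
# which is what makes the Keras route simpler than a hand-rolled training loop.
x = np.random.rand(32, 8).astype("float32")
y = np.random.randint(0, 2, size=(32, 1)).astype("float32")
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```

The main simplification over a custom `tf.distribute` loop is that gradient aggregation and batch sharding are handled by `fit()` itself.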
GitHub pull request #1321 of commit be3d3a91138c649ca494570c7f47c43597efe1e1, no merge conflicts.
Running as SYSTEM
Setting status of be3d3a91138c649ca494570c7f47c43597efe1e1 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3926/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse be3d3a91138c649ca494570c7f47c43597efe1e1^{commit} # timeout=10
Checking out Revision be3d3a91138c649ca494570c7f47c43597efe1e1 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f be3d3a91138c649ca494570c7f47c43597efe1e1 # timeout=10
Commit message: "update keras"
> git rev-list --no-walk 3f3c3bb0d722224730030d2a45ccf7001a6efb3f # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins502869898063893300.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.6.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.1)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
WARNING: Skipping nvtabular as it is not installed.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+39.gbe3d3a9 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+39.gbe3d3a9 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+39.gbe3d3a9 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+39.gbe3d3a9 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so ->
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.8.0+39.gbe3d3a9 is already the active version in easy-install.pth
Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.8.0+39.gbe3d3a9
Searching for protobuf==3.19.1
Best match: protobuf 3.19.1
Adding protobuf 3.19.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.1
Best match: numba 0.54.1
Adding numba 0.54.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pandas==1.3.3
Best match: pandas 1.3.3
Processing pandas-1.3.3-py3.8-linux-x86_64.egg
pandas 1.3.3 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Processing dask-2021.7.1-py3.8.egg
dask 2021.7.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for setuptools==59.6.0
Best match: setuptools 59.6.0
Adding setuptools 59.6.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for pytz==2021.3
Best match: pytz 2021.3
Adding pytz 2021.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.2
Best match: packaging 21.2
Adding packaging 21.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.11.1
Best match: fsspec 2021.11.1
Adding fsspec 2021.11.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==2.4.7
Best match: pyparsing 2.4.7
Adding pyparsing 2.4.7 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.8.0+39.gbe3d3a9
Running black --check
All done! ✨ 🍰 ✨
169 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:516:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1592 items / 1 skipped / 1591 selected
tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 19%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 23%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 25%]
................................................... [ 28%]
tests/unit/framework_utils/test_torch_layers.py . [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py .................... [ 35%]
tests/unit/graph/ops/test_selection.py ... [ 35%]
tests/unit/inference/test_ensemble.py . [ 35%]
tests/unit/inference/test_export.py . [ 35%]
tests/unit/inference/test_graph.py . [ 35%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 36%]
tests/unit/loader/test_dataloader_backend.py ...... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
................................................................. [ 53%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
.................................. [ 68%]
tests/unit/ops/test_lambda.py .... [ 69%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
.. [ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
.......................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
........................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py ......................... [100%]
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 7 warnings
tests/unit/inference/test_export.py: 1 warning
tests/unit/loader/test_tf_dataloader.py: 54 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/ops/test_categorify.py: 1 warning
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_join.py: 1 warning
tests/unit/ops/test_normalize.py: 28 warnings
tests/unit/ops/test_ops.py: 4 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:86: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(
tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)
tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(
tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(
tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(
tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:521: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_tools.py::test_cat_rep[None-1000]
tests/unit/test_tools.py::test_cat_rep[distro1-1000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (3) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_tools.py::test_cat_rep[None-10000]
tests/unit/test_tools.py::test_cat_rep[distro1-10000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (30) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/inference/test_ensemble.py::test_workflow_tf_e2e_config_verification[parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/graph/ops/workflow.py:66: UserWarning: TF model expects int32 for column x_nvt, but workflow is producing type float64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:79: UserWarning: TF model expects int32 for column name-cat, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:79: UserWarning: TF model expects int32 for column name-string, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:278: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:278: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:278: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/loader/test_tf_dataloader.py::test_nested_list
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (2) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (25) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (35) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.3.3-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
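The SettingWithCopyWarning above comes from assigning through a chained-indexing slice, where pandas cannot tell whether the write will reach the original frame. A small self-contained sketch of the pattern and the usual `.copy()` fix (the DataFrame here is made up for illustration):

```python
import warnings
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

# Chained indexing on a slice is what typically raises SettingWithCopyWarning:
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    sub = df[df["a"] > 1]
    sub["b"] = 0.0  # writes to a copy; the original df is untouched here

# The usual fix: take an explicit copy (or assign through df.loc directly),
# which makes the intent unambiguous and silences the warning.
sub = df[df["a"] > 1].copy()
sub["b"] = 0.0
print(df["b"].tolist())   # original values preserved
print(sub["b"].tolist())  # copy modified
```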
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 24 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/warnings.html
---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 79 166 29 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 274, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 68 2 22 1 94% 161, 165
nvtabular/graph/graph.py 55 1 36 1 98% 47
nvtabular/graph/node.py 278 46 154 15 80% 49, 76-84, 107, 165, 254, 264-265, 312, 330, 339, 349-354, 359, 381, 390-398, 406-409, 422-423, 431, 432->415, 438-441, 445, 479-484, 503
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 11 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 19 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 13 1 2 0 93% 52
nvtabular/graph/ops/subtraction.py 10 2 2 0 83% 28-29
nvtabular/graph/schema.py 120 6 59 5 94% 38, 65, 167, 174, 199, 202
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 122 8 58 11 89% 45, 61->68, 64->66, 75, 92->97, 95->97, 118->133, 124-127, 169->185, 177, 181
nvtabular/graph/selector.py 78 0 40 0 100%
nvtabular/graph/tags.py 16 0 2 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 55 0 10 0 100%
nvtabular/inference/graph/node.py 3 0 2 0 100%
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 29 6 12 1 78% 12-13, 18, 35, 39, 45
nvtabular/inference/graph/ops/tensorflow.py 41 11 14 1 75% 47-60
nvtabular/inference/graph/ops/workflow.py 36 0 10 1 98% 65->62
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 290 139 106 7 55% 139-172, 217-261, 360-368, 394-410, 463-473, 522-562, 568-584, 588-655, 662->665, 665->661, 682->681, 747, 753-772, 778-802, 809
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 53 53 22 0 0% 27-125
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 92 92 56 0 0% 27-177
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 31 31 10 0 0% 26-67
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 74 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 43 168 28 85% 48-49, 268, 270, 283, 308-322, 446->520, 451-454, 459->469, 476->474, 477->481, 494->498, 509, 520->529, 580-581, 582->586, 634, 762, 764, 766, 772, 776-778, 780, 840-841, 875, 882-883, 889, 895, 992-993, 1111-1116, 1122, 1134-1135
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 590 48 218 30 88% 34-35, 58, 80->156, 87, 101, 113-127, 140-153, 176, 205-206, 223->248, 234->248, 285-293, 313, 319, 337->339, 353, 371->381, 374, 423->435, 427, 549-554, 592-597, 713->720, 781->786, 787-788, 908, 912, 916, 922, 954, 971, 975, 982->984, 1092->exit, 1096->1093, 1103->1108, 1113->1123, 1128, 1150, 1177
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 16 154 11 95% 27-28, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 168 20 58 7 88% 66, 83, 97, 311, 339, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 0 4 0 100%
nvtabular/ops/bucketize.py 37 10 20 3 70% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 655 69 352 47 87% 251, 253, 271, 275, 283, 291, 293, 320, 341-342, 378, 389->393, 397-404, 486-487, 511-516, 619, 715, 732, 777, 855-856, 871-875, 876->840, 894, 902, 909->exit, 933, 936->939, 988->986, 1048, 1053, 1074->1078, 1080->1035, 1086-1089, 1101, 1105, 1109, 1116, 1121-1124, 1202, 1204, 1274->1296, 1280->1296, 1297-1302, 1342, 1359->1364, 1363, 1373->1370, 1378->1370, 1385, 1388, 1396-1406
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 121 27 40 5 74% 19-20, 29-30, 82->exit, 112, 138, 202-203, 212-214, 222-238, 255->258, 259, 269
nvtabular/ops/data_stats.py 56 2 24 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 33 1 12 1 96% 73->75, 98
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 91 12 40 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 128 6 82 5 94% 72, 83, 93->95, 105->110, 137, 141-146
nvtabular/ops/hash_bucket.py 40 2 22 2 94% 73, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 17 3 87% 53, 66, 81, 91
nvtabular/ops/join_external.py 92 18 38 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 7 38 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 20 6 80% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 85 29 42 0 63% 21-22, 141-155, 163-185
nvtabular/ops/logop.py 19 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 89 10 22 1 88% 89, 97-98, 104, 137-138, 160-161, 165, 176
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 41 3 24 3 91% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 154 11 68 4 91% 168->172, 176->185, 233-234, 237-238, 250-256, 347->350, 363
nvtabular/ops/value_counts.py 30 0 6 1 97% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 50 7 22 1 81% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 200 17 84 10 90% 28-29, 47, 176, 182->196, 208-210, 323, 338-339, 357-358, 374, 450, 466-468, 481
TOTAL 8380 1634 3503 366 78%
Coverage XML written to file coverage.xml
Required test coverage of 70% reached. Total coverage: 78.25%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:613: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1583 passed, 10 skipped, 721 warnings in 1447.55s (0:24:07) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins295705578428510254.sh
Click to view CI Results
GitHub pull request #1321 of commit 1c5aa3327ab4e886c2791b714bf8697df7a33272, no merge conflicts.
Running as SYSTEM
Setting status of 1c5aa3327ab4e886c2791b714bf8697df7a33272 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3941/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse 1c5aa3327ab4e886c2791b714bf8697df7a33272^{commit} # timeout=10
Checking out Revision 1c5aa3327ab4e886c2791b714bf8697df7a33272 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f 1c5aa3327ab4e886c2791b714bf8697df7a33272 # timeout=10
Commit message: "Merge branch 'main' into multigpu_tensorflow_movielens"
> git rev-list --no-walk 93d984f22b0aba812f713cfa6fd44cfaee836783 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins2666342371048549055.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.6.0)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.1)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.8.0+7.gb459467
ERROR: Exception:
Traceback (most recent call last):
File "/var/jenkins_home/.local/lib/python3.8/site-packages/pip/_internal/cli/base_command.py", line 164, in exc_logging_wrapper
status = run_func(*args)
File "/var/jenkins_home/.local/lib/python3.8/site-packages/pip/_internal/commands/uninstall.py", line 97, in run
uninstall_pathset = req.uninstall(
File "/var/jenkins_home/.local/lib/python3.8/site-packages/pip/_internal/req/req_install.py", line 670, in uninstall
uninstalled_pathset = UninstallPathSet.from_dist(dist)
File "/var/jenkins_home/.local/lib/python3.8/site-packages/pip/_internal/req/req_uninstall.py", line 533, in from_dist
assert (
AssertionError: Egg-link /var/jenkins_home/workspace/nvtabular_tests/nvtabular does not match installed location of nvtabular (at /nvtabular)
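The AssertionError above means pip found a develop-mode `nvtabular.egg-link` pointing at a different directory than the recorded install location, so the uninstall aborted; the follow-up build then failed because a bare `rm` treats the already-missing egg-link as an error. A hedged sketch of an idempotent cleanup (using a temp directory in place of the real site-packages; the paths are illustrative):

```python
from pathlib import Path
import tempfile

# Stand-in for site-packages, so the sketch is self-contained.
site = Path(tempfile.mkdtemp())
link = site / "nvtabular.egg-link"
link.write_text("/var/jenkins_home/workspace/nvtabular_tests/nvtabular\n")

# Remove the stale egg-link; missing_ok makes this safe to re-run,
# unlike the bare `rm` that failed in the build above.
link.unlink(missing_ok=True)
link.unlink(missing_ok=True)  # idempotent: second call also succeeds
print(link.exists())
```

After the stale link (and any matching `easy-install.pth` entry) is gone, a fresh `pip install -e .` can recreate them consistently.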
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins699321966361151304.sh
rerun tests
Click to view CI Results
GitHub pull request #1321 of commit 1c5aa3327ab4e886c2791b714bf8697df7a33272, no merge conflicts.
Running as SYSTEM
Setting status of 1c5aa3327ab4e886c2791b714bf8697df7a33272 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3957/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse 1c5aa3327ab4e886c2791b714bf8697df7a33272^{commit} # timeout=10
Checking out Revision 1c5aa3327ab4e886c2791b714bf8697df7a33272 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f 1c5aa3327ab4e886c2791b714bf8697df7a33272 # timeout=10
Commit message: "Merge branch 'main' into multigpu_tensorflow_movielens"
> git rev-list --no-walk 93d984f22b0aba812f713cfa6fd44cfaee836783 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins4249857804844571299.sh
rm: cannot remove '/var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link': No such file or directory
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins6722771596716163527.sh
Click to view CI Results
GitHub pull request #1321 of commit c39484903a677c7f3cbf93a028cf1485bc87c2e8, no merge conflicts.
Running as SYSTEM
Setting status of c39484903a677c7f3cbf93a028cf1485bc87c2e8 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3982/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse c39484903a677c7f3cbf93a028cf1485bc87c2e8^{commit} # timeout=10
Checking out Revision c39484903a677c7f3cbf93a028cf1485bc87c2e8 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f c39484903a677c7f3cbf93a028cf1485bc87c2e8 # timeout=10
Commit message: "Merge branch 'main' into multigpu_tensorflow_movielens"
> git rev-list --no-walk 8bf584ae9109316fbd89e39612e4c54f5cfc3a49 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins4957069612185695176.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
Downloading setuptools-60.0.4-py3-none-any.whl (952 kB)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.8.1)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.8.0+7.gb459467
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+47.gc394849 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+47.gc394849 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+47.gc394849 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+47.gc394849 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so ->
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.8.0+47.gc394849 is already the active version in easy-install.pth
Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.8.0+47.gc394849
Searching for protobuf==3.19.1
Best match: protobuf 3.19.1
Adding protobuf 3.19.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.5.0
Best match: tensorflow-metadata 1.5.0
Processing tensorflow_metadata-1.5.0-py3.8.egg
tensorflow-metadata 1.5.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/tensorflow_metadata-1.5.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.62.3
Best match: tqdm 4.62.3
Processing tqdm-4.62.3-py3.8.egg
tqdm 4.62.3 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages/tqdm-4.62.3-py3.8.egg
Searching for numba==0.54.1
Best match: numba 0.54.1
Adding numba 0.54.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pandas==1.3.5
Best match: pandas 1.3.5
Processing pandas-1.3.5-py3.8-linux-x86_64.egg
pandas 1.3.5 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Processing dask-2021.7.1-py3.8.egg
dask 2021.7.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg
Searching for googleapis-common-protos==1.54.0
Best match: googleapis-common-protos 1.54.0
Adding googleapis-common-protos 1.54.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for setuptools==59.7.0
Best match: setuptools 59.7.0
Adding setuptools 59.7.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.3
Best match: pytz 2021.3
Adding pytz 2021.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.2
Best match: toolz 0.11.2
Processing toolz-0.11.2-py3.8.egg
toolz 0.11.2 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/toolz-0.11.2-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.3
Best match: msgpack 1.0.3
Processing msgpack-1.0.3-py3.8-linux-x86_64.egg
msgpack 1.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/msgpack-1.0.3-py3.8-linux-x86_64.egg
Searching for cloudpickle==2.0.0
Best match: cloudpickle 2.0.0
Processing cloudpickle-2.0.0-py3.8.egg
cloudpickle 2.0.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/cloudpickle-2.0.0-py3.8.egg
Searching for click==8.0.3
Best match: click 8.0.3
Processing click-8.0.3-py3.8.egg
click 8.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/click-8.0.3-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.3
Best match: packaging 21.3
Adding packaging 21.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.11.1
Best match: fsspec 2021.11.1
Adding fsspec 2021.11.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==3.0.6
Best match: pyparsing 3.0.6
Adding pyparsing 3.0.6 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.8.0+47.gc394849
Running black --check
All done! ✨ 🍰 ✨
169 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
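The I1101 note above suggests allow-listing the C-extension module so pylint can introspect it at run time. A minimal configuration fragment for that (a sketch, assuming pylint >= 2.7 where `extension-pkg-allow-list` is available, and a `.pylintrc` at the repo root) would be:

```ini
# .pylintrc -- let pylint import and introspect the numpy.random.mtrand
# C extension so members like RandomState resolve without I1101
[MASTER]
extension-pkg-allow-list=numpy.random.mtrand
```

Note that allow-listing means pylint will actually import the module during analysis, so it should only be used for trusted packages.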
Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1593 items / 1 skipped / 1592 selected
tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 19%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 23%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 25%]
................................................... [ 28%]
tests/unit/framework_utils/test_torch_layers.py . [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py .................... [ 35%]
tests/unit/graph/ops/test_selection.py ... [ 35%]
tests/unit/inference/test_ensemble.py . [ 35%]
tests/unit/inference/test_export.py . [ 35%]
tests/unit/inference/test_graph.py . [ 35%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 36%]
tests/unit/loader/test_dataloader_backend.py ...... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
................................................................. [ 52%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
.................................. [ 68%]
tests/unit/ops/test_lambda.py .... [ 68%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
.. [ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
.......................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
............................................................ [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py ......................... [100%]
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 7 warnings
tests/unit/inference/test_export.py: 1 warning
tests/unit/loader/test_tf_dataloader.py: 54 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/ops/test_categorify.py: 1 warning
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_join.py: 1 warning
tests/unit/ops/test_normalize.py: 28 warnings
tests/unit/ops/test_ops.py: 4 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:86: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(
tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)
tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(
tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(
tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(
tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:521: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_tools.py::test_cat_rep[None-1000]
tests/unit/test_tools.py::test_cat_rep[distro1-1000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (3) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_tools.py::test_cat_rep[None-10000]
tests/unit/test_tools.py::test_cat_rep[distro1-10000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (30) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/inference/test_ensemble.py::test_workflow_tf_e2e_config_verification[parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/graph/ops/workflow.py:74: UserWarning: TF model expects int32 for column x_nvt, but workflow is producing type float64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-cat, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-string, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:303: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:303: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:303: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/loader/test_tf_dataloader.py::test_nested_list
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (2) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (25) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (35) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 24 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/warnings.html
---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 79 166 29 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 274, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 72 2 22 1 95% 192, 196
nvtabular/graph/graph.py 55 1 36 1 98% 47
nvtabular/graph/node.py 263 43 142 16 80% 49, 76-84, 107, 137, 226, 236-237, 284, 302, 311, 321-326, 331, 353, 362-370, 379, 380->375, 394-395, 403, 404->387, 410-413, 417, 451-456, 475
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 16 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 22 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 16 1 2 0 94% 62
nvtabular/graph/ops/subtraction.py 20 2 4 0 92% 49-50
nvtabular/graph/schema.py 120 6 59 5 94% 38, 65, 167, 174, 199, 202
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 122 8 58 11 89% 45, 61->68, 64->66, 75, 92->97, 95->97, 118->133, 124-127, 169->185, 177, 181
nvtabular/graph/selector.py 78 0 40 0 100%
nvtabular/graph/tags.py 16 0 2 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 55 0 10 0 100%
nvtabular/inference/graph/node.py 3 0 2 0 100%
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 29 6 12 1 78% 12-13, 18, 35, 39, 45
nvtabular/inference/graph/ops/tensorflow.py 45 11 14 1 76% 34-47
nvtabular/inference/graph/ops/workflow.py 37 0 10 1 98% 73->70
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 285 143 100 7 52% 157-194, 238-286, 379-387, 416-432, 485-495, 544-584, 590-606, 610-677, 684->687, 687->683, 704->703, 753, 759-778, 784-808, 815
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 74 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 43 168 28 85% 48-49, 268, 270, 283, 308-322, 446->520, 451-454, 459->469, 476->474, 477->481, 494->498, 509, 520->529, 580-581, 582->586, 634, 762, 764, 766, 772, 776-778, 780, 840-841, 875, 882-883, 889, 895, 992-993, 1111-1116, 1122, 1134-1135
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 590 48 218 30 88% 34-35, 58, 80->156, 87, 101, 113-127, 140-153, 176, 205-206, 223->248, 234->248, 285-293, 313, 319, 337->339, 353, 371->381, 374, 423->435, 427, 549-554, 592-597, 713->720, 781->786, 787-788, 908, 912, 916, 922, 954, 971, 975, 982->984, 1092->exit, 1096->1093, 1103->1108, 1113->1123, 1128, 1150, 1177
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 16 154 11 95% 27-28, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 168 20 58 7 88% 66, 83, 97, 311, 339, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 0 4 0 100%
nvtabular/ops/bucketize.py 37 10 20 3 70% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 655 68 352 47 87% 251, 253, 271, 275, 283, 291, 293, 320, 341-342, 389->393, 397-404, 486-487, 511-516, 619, 715, 732, 777, 855-856, 871-875, 876->840, 894, 902, 909->exit, 933, 936->939, 988->986, 1048, 1053, 1074->1078, 1080->1035, 1086-1089, 1101, 1105, 1109, 1116, 1121-1124, 1202, 1204, 1274->1296, 1280->1296, 1297-1302, 1342, 1359->1364, 1363, 1373->1370, 1378->1370, 1385, 1388, 1396-1406
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 121 27 40 5 74% 19-20, 29-30, 82->exit, 112, 138, 202-203, 212-214, 222-238, 255->258, 259, 269
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 33 1 12 1 96% 73->75, 98
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 91 8 40 4 86% 63-67, 93, 121, 151, 164
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 128 6 82 5 94% 72, 83, 93->95, 105->110, 137, 141-146
nvtabular/ops/hash_bucket.py 40 2 22 2 94% 73, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 17 3 87% 53, 66, 81, 91
nvtabular/ops/join_external.py 92 18 38 7 76% 20-21, 114, 116, 118, 135-161, 177->179, 216->227, 221
nvtabular/ops/join_groupby.py 101 5 38 4 94% 108, 115, 124, 131->130, 215-216
nvtabular/ops/lambdaop.py 39 6 20 6 80% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 85 29 42 0 63% 21-22, 141-155, 163-185
nvtabular/ops/logop.py 19 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 89 6 22 1 92% 89, 104, 137-138, 165, 176
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 41 3 24 3 91% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 154 9 68 4 92% 168->172, 176->185, 233-234, 250-256, 347->350, 363
nvtabular/ops/value_counts.py 32 0 6 1 97% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 50 7 22 1 81% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 201 15 84 10 91% 28-29, 47, 177, 183->197, 209-211, 324, 339-340, 375, 451, 467-469, 482
TOTAL 8415 1641 3493 368 78%
Coverage XML written to file coverage.xml
Required test coverage of 70% reached. Total coverage: 78.23%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:613: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1584 passed, 10 skipped, 721 warnings in 1492.19s (0:24:52) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins9078128104487811893.sh
Click to view CI Results
GitHub pull request #1321 of commit f809d4a6a8ec908efa09b1d5daf587646f4424d6, no merge conflicts.
Running as SYSTEM
Setting status of f809d4a6a8ec908efa09b1d5daf587646f4424d6 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3996/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse f809d4a6a8ec908efa09b1d5daf587646f4424d6^{commit} # timeout=10
Checking out Revision f809d4a6a8ec908efa09b1d5daf587646f4424d6 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f f809d4a6a8ec908efa09b1d5daf587646f4424d6 # timeout=10
Commit message: "Merge branch 'main' into multigpu_tensorflow_movielens"
> git rev-list --no-walk 560c304ae69bac49803547a064f901947fdddf20 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins3617402477249850827.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
Downloading setuptools-60.2.0-py3-none-any.whl (953 kB)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.8.0+7.gb459467
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+51.gf809d4a -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+51.gf809d4a -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+51.gf809d4a -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+51.gf809d4a -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so ->
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.8.0+51.gf809d4a is already the active version in easy-install.pth
Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.8.0+51.gf809d4a
Searching for protobuf==3.19.1
Best match: protobuf 3.19.1
Adding protobuf 3.19.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.5.0
Best match: tensorflow-metadata 1.5.0
Processing tensorflow_metadata-1.5.0-py3.8.egg
tensorflow-metadata 1.5.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/tensorflow_metadata-1.5.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.62.3
Best match: tqdm 4.62.3
Processing tqdm-4.62.3-py3.8.egg
tqdm 4.62.3 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages/tqdm-4.62.3-py3.8.egg
Searching for numba==0.54.1
Best match: numba 0.54.1
Adding numba 0.54.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pandas==1.3.5
Best match: pandas 1.3.5
Processing pandas-1.3.5-py3.8-linux-x86_64.egg
pandas 1.3.5 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Processing dask-2021.7.1-py3.8.egg
dask 2021.7.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg
Searching for googleapis-common-protos==1.54.0
Best match: googleapis-common-protos 1.54.0
Adding googleapis-common-protos 1.54.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for setuptools==59.7.0
Best match: setuptools 59.7.0
Adding setuptools 59.7.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.3
Best match: pytz 2021.3
Adding pytz 2021.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.2
Best match: toolz 0.11.2
Processing toolz-0.11.2-py3.8.egg
toolz 0.11.2 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/toolz-0.11.2-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.3
Best match: msgpack 1.0.3
Processing msgpack-1.0.3-py3.8-linux-x86_64.egg
msgpack 1.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/msgpack-1.0.3-py3.8-linux-x86_64.egg
Searching for cloudpickle==2.0.0
Best match: cloudpickle 2.0.0
Processing cloudpickle-2.0.0-py3.8.egg
cloudpickle 2.0.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/cloudpickle-2.0.0-py3.8.egg
Searching for click==8.0.3
Best match: click 8.0.3
Processing click-8.0.3-py3.8.egg
click 8.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/click-8.0.3-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.3
Best match: packaging 21.3
Adding packaging 21.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.11.1
Best match: fsspec 2021.11.1
Adding fsspec 2021.11.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==3.0.6
Best match: pyparsing 3.0.6
Adding pyparsing 3.0.6 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.8.0+51.gf809d4a
Running black --check
All done! ✨ 🍰 ✨
172 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1595 items / 1 skipped / 1594 selected
tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 19%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 23%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 25%]
................................................... [ 28%]
tests/unit/framework_utils/test_torch_layers.py . [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py .................... [ 35%]
tests/unit/graph/ops/test_selection.py ... [ 35%]
tests/unit/inference/test_ensemble.py ... [ 35%]
tests/unit/inference/test_export.py . [ 35%]
tests/unit/inference/test_graph.py . [ 35%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 36%]
tests/unit/loader/test_dataloader_backend.py ...... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
................................................................. [ 53%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
.................................. [ 68%]
tests/unit/ops/test_lambda.py .... [ 69%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
.. [ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
.......................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
............................................................ [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py ......................... [100%]
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 7 warnings
tests/unit/inference/test_export.py: 1 warning
tests/unit/loader/test_tf_dataloader.py: 54 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/ops/test_categorify.py: 1 warning
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_join.py: 1 warning
tests/unit/ops/test_normalize.py: 28 warnings
tests/unit/ops/test_ops.py: 4 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:86: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(
tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)
tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(
tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(
tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(
tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:521: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_tools.py::test_cat_rep[None-1000]
tests/unit/test_tools.py::test_cat_rep[distro1-1000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (3) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_tools.py::test_cat_rep[None-10000]
tests/unit/test_tools.py::test_cat_rep[distro1-10000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (30) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-cat, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-string, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/loader/test_tf_dataloader.py::test_nested_list
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (2) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (25) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (35) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 24 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/warnings.html
---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 79 166 29 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 274, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 72 2 22 1 95% 196, 200
nvtabular/graph/graph.py 55 1 36 1 98% 47
nvtabular/graph/node.py 284 55 151 19 77% 49, 73-81, 135, 224, 234-235, 282, 300, 309, 319-324, 329, 331, 337, 351, 361-372, 377->380, 391-399, 408, 409->404, 423-424, 432, 433->416, 439-442, 446, 473, 480-485, 504
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 16 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 22 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 16 1 2 0 94% 62
nvtabular/graph/ops/subtraction.py 20 2 4 0 92% 53-54
nvtabular/graph/schema.py 126 7 59 5 94% 38, 65, 160, 176, 183, 208, 211
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 122 8 58 11 89% 45, 61->68, 64->66, 75, 92->97, 95->97, 118->133, 124-127, 169->185, 177, 181
nvtabular/graph/selector.py 78 0 40 0 100%
nvtabular/graph/tags.py 16 0 2 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 57 0 26 0 100%
nvtabular/inference/graph/graph.py 27 0 14 0 100%
nvtabular/inference/graph/node.py 15 2 4 0 89% 26-27
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 32 6 12 1 80% 13-14, 19, 36, 40, 49
nvtabular/inference/graph/ops/tensorflow.py 48 11 16 1 78% 34-47
nvtabular/inference/graph/ops/workflow.py 30 0 4 0 100%
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 285 143 100 7 52% 156-192, 236-284, 377-385, 414-430, 483-493, 542-582, 588-604, 608-675, 682->685, 685->681, 702->701, 751, 757-776, 782-806, 813
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 74 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 43 168 28 85% 48-49, 268, 270, 283, 308-322, 446->520, 451-454, 459->469, 476->474, 477->481, 494->498, 509, 520->529, 580-581, 582->586, 634, 762, 764, 766, 772, 776-778, 780, 840-841, 875, 882-883, 889, 895, 992-993, 1111-1116, 1122, 1134-1135
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 590 48 218 30 88% 34-35, 58, 80->156, 87, 101, 113-127, 140-153, 176, 205-206, 223->248, 234->248, 285-293, 313, 319, 337->339, 353, 371->381, 374, 423->435, 427, 549-554, 592-597, 713->720, 781->786, 787-788, 908, 912, 916, 922, 954, 971, 975, 982->984, 1092->exit, 1096->1093, 1103->1108, 1113->1123, 1128, 1150, 1177
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 16 154 11 95% 27-28, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 168 20 58 7 88% 66, 83, 97, 311, 339, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 0 4 0 100%
nvtabular/ops/bucketize.py 37 10 20 3 70% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 658 68 352 47 87% 252, 254, 272, 276, 284, 292, 294, 321, 342-343, 390->394, 398-405, 487-488, 521-526, 629, 725, 742, 787, 865-866, 881-885, 886->850, 904, 912, 919->exit, 943, 946->949, 998->996, 1058, 1063, 1084->1088, 1090->1045, 1096-1099, 1111, 1115, 1119, 1126, 1131-1134, 1212, 1214, 1284->1306, 1290->1306, 1307-1312, 1352, 1369->1374, 1373, 1383->1380, 1388->1380, 1395, 1398, 1406-1416
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 123 27 40 5 74% 19-20, 29-30, 82->exit, 112, 147, 211-212, 221-223, 231-247, 264->267, 268, 278
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 33 1 12 1 96% 73->75, 98
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 91 14 40 4 80% 63-67, 75-80, 93, 121, 150->152, 162-165
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 128 8 82 6 93% 72, 83, 93->95, 105->110, 137, 142, 148-153
nvtabular/ops/hash_bucket.py 40 2 22 2 94% 73, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 17 3 87% 53, 66, 81, 91
nvtabular/ops/join_external.py 95 18 38 7 77% 20-21, 115, 117, 119, 136-162, 178->180, 226->237, 231
nvtabular/ops/join_groupby.py 104 5 38 4 94% 109, 116, 125, 132->131, 225-226
nvtabular/ops/lambdaop.py 39 6 20 6 80% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 85 29 42 0 63% 21-22, 141-155, 163-185
nvtabular/ops/logop.py 19 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 89 6 22 1 92% 89, 104, 137-138, 165, 176
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 41 7 24 4 83% 47, 64-69, 88-90
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 157 9 68 4 92% 169->173, 177->186, 243-244, 260-266, 357->360, 373
nvtabular/ops/value_counts.py 32 0 6 1 97% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 50 7 22 1 81% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 201 15 84 10 91% 28-29, 47, 177, 183->197, 209-211, 324, 339-340, 375, 451, 467-469, 482
TOTAL 8496 1668 3530 372 78%
Coverage XML written to file coverage.xml
Required test coverage of 70% reached. Total coverage: 78.14%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:613: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1586 passed, 10 skipped, 720 warnings in 1502.52s (0:25:02) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins4419381298562569054.sh
Click to view CI Results
GitHub pull request #1321 of commit 8459d2faf28cbe9504b56a466ca09fd762e2888e, no merge conflicts.
Running as SYSTEM
Setting status of 8459d2faf28cbe9504b56a466ca09fd762e2888e to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/4036/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse 8459d2faf28cbe9504b56a466ca09fd762e2888e^{commit} # timeout=10
Checking out Revision 8459d2faf28cbe9504b56a466ca09fd762e2888e (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f 8459d2faf28cbe9504b56a466ca09fd762e2888e # timeout=10
Commit message: "Fix typo"
> git rev-list --no-walk a1a6d9eb1c782c5310ec2d1acff9743bdaec62bf # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins8788338168660742415.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
Downloading setuptools-60.5.0-py3-none-any.whl (958 kB)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.8.0+7.gb459467
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+52.g8459d2f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+52.g8459d2f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+52.g8459d2f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+52.g8459d2f -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so ->
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.8.0+52.g8459d2f is already the active version in easy-install.pth
Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.8.0+52.g8459d2f
Searching for protobuf==3.19.1
Best match: protobuf 3.19.1
Adding protobuf 3.19.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.5.0
Best match: tensorflow-metadata 1.5.0
Processing tensorflow_metadata-1.5.0-py3.8.egg
tensorflow-metadata 1.5.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/tensorflow_metadata-1.5.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.62.3
Best match: tqdm 4.62.3
Processing tqdm-4.62.3-py3.8.egg
tqdm 4.62.3 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages/tqdm-4.62.3-py3.8.egg
Searching for numba==0.54.1
Best match: numba 0.54.1
Adding numba 0.54.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pandas==1.3.5
Best match: pandas 1.3.5
Processing pandas-1.3.5-py3.8-linux-x86_64.egg
pandas 1.3.5 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Processing dask-2021.7.1-py3.8.egg
dask 2021.7.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg
Searching for googleapis-common-protos==1.54.0
Best match: googleapis-common-protos 1.54.0
Adding googleapis-common-protos 1.54.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for setuptools==59.7.0
Best match: setuptools 59.7.0
Adding setuptools 59.7.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pytz==2021.3
Best match: pytz 2021.3
Adding pytz 2021.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.2
Best match: toolz 0.11.2
Processing toolz-0.11.2-py3.8.egg
toolz 0.11.2 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/toolz-0.11.2-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.3
Best match: msgpack 1.0.3
Processing msgpack-1.0.3-py3.8-linux-x86_64.egg
msgpack 1.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/msgpack-1.0.3-py3.8-linux-x86_64.egg
Searching for cloudpickle==2.0.0
Best match: cloudpickle 2.0.0
Processing cloudpickle-2.0.0-py3.8.egg
cloudpickle 2.0.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/cloudpickle-2.0.0-py3.8.egg
Searching for click==8.0.3
Best match: click 8.0.3
Processing click-8.0.3-py3.8.egg
click 8.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/click-8.0.3-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for packaging==21.3
Best match: packaging 21.3
Adding packaging 21.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for fsspec==2021.11.1
Best match: fsspec 2021.11.1
Adding fsspec 2021.11.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Searching for pyparsing==3.0.6
Best match: pyparsing 3.0.6
Adding pyparsing 3.0.6 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Finished processing dependencies for nvtabular==0.8.0+52.g8459d2f
Running black --check
All done! ✨ 🍰 ✨
172 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1595 items / 1 skipped / 1594 selected
tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 19%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 23%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 25%]
................................................... [ 28%]
tests/unit/framework_utils/test_torch_layers.py . [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py .................... [ 35%]
tests/unit/graph/ops/test_selection.py ... [ 35%]
tests/unit/inference/test_ensemble.py ... [ 35%]
tests/unit/inference/test_export.py . [ 35%]
tests/unit/inference/test_graph.py . [ 35%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 36%]
tests/unit/loader/test_dataloader_backend.py ...... [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
........................................................ [ 46%]
tests/unit/ops/test_categorify.py ...................................... [ 48%]
................................................................. [ 53%]
tests/unit/ops/test_column_similarity.py ........................ [ 54%]
tests/unit/ops/test_fill.py ............................................ [ 57%]
........ [ 57%]
tests/unit/ops/test_hash_bucket.py ......................... [ 59%]
tests/unit/ops/test_join.py ............................................ [ 62%]
........................................................................ [ 66%]
.................................. [ 68%]
tests/unit/ops/test_lambda.py .... [ 69%]
tests/unit/ops/test_normalize.py ....................................... [ 71%]
.. [ 71%]
tests/unit/ops/test_ops.py ............................................. [ 74%]
.......................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 78%]
........................................................................ [ 82%]
........................................................................ [ 87%]
....................................... [ 89%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
............................................................ [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py ......................... [100%]
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 7 warnings
tests/unit/inference/test_export.py: 1 warning
tests/unit/loader/test_tf_dataloader.py: 54 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/ops/test_categorify.py: 1 warning
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_fill.py: 24 warnings
tests/unit/ops/test_join.py: 1 warning
tests/unit/ops/test_normalize.py: 28 warnings
tests/unit/ops/test_ops.py: 4 warnings
tests/unit/ops/test_target_encode.py: 21 warnings
tests/unit/workflow/test_workflow.py: 30 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:86: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(
tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)
tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(
tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(
tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(
tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:521: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_tools.py::test_cat_rep[None-1000]
tests/unit/test_tools.py::test_cat_rep[distro1-1000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (3) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/test_tools.py::test_cat_rep[None-10000]
tests/unit/test_tools.py::test_cat_rep[distro1-10000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (30) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-cat, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:89: UserWarning: TF model expects int32 for column name-string, but workflow is producing type int64. Overriding dtype in NVTabular workflow.
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/inference/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/inference/triton/ensemble.py:301: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(
tests/unit/loader/test_tf_dataloader.py::test_nested_list
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (2) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (25) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[False]
tests/unit/loader/test_tf_dataloader.py::test_sparse_tensors[True]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (35) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))
tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/dataframe/core.py:6778: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 24 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/warnings.html
---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 79 166 29 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 274, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 72 2 22 1 95% 196, 200
nvtabular/graph/graph.py 55 1 36 1 98% 47
nvtabular/graph/node.py 284 55 151 19 77% 49, 73-81, 135, 224, 234-235, 282, 300, 309, 319-324, 329, 331, 337, 351, 361-372, 377->380, 391-399, 408, 409->404, 423-424, 432, 433->416, 439-442, 446, 473, 480-485, 504
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 16 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 22 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 16 1 2 0 94% 62
nvtabular/graph/ops/subtraction.py 20 2 4 0 92% 53-54
nvtabular/graph/schema.py 126 7 59 5 94% 38, 65, 160, 176, 183, 208, 211
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 122 8 58 11 89% 45, 61->68, 64->66, 75, 92->97, 95->97, 118->133, 124-127, 169->185, 177, 181
nvtabular/graph/selector.py 78 0 40 0 100%
nvtabular/graph/tags.py 16 0 2 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 57 0 26 0 100%
nvtabular/inference/graph/graph.py 27 0 14 0 100%
nvtabular/inference/graph/node.py 15 2 4 0 89% 26-27
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 32 6 12 1 80% 13-14, 19, 36, 40, 49
nvtabular/inference/graph/ops/tensorflow.py 48 11 16 1 78% 34-47
nvtabular/inference/graph/ops/workflow.py 30 0 4 0 100%
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 285 143 100 7 52% 156-192, 236-284, 377-385, 414-430, 483-493, 542-582, 588-604, 608-675, 682->685, 685->681, 702->701, 751, 757-776, 782-806, 813
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 8 74 11 93% 111, 114, 150, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 43 168 28 85% 48-49, 268, 270, 283, 308-322, 446->520, 451-454, 459->469, 476->474, 477->481, 494->498, 509, 520->529, 580-581, 582->586, 634, 762, 764, 766, 772, 776-778, 780, 840-841, 875, 882-883, 889, 895, 992-993, 1111-1116, 1122, 1134-1135
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 590 48 218 30 88% 34-35, 58, 80->156, 87, 101, 113-127, 140-153, 176, 205-206, 223->248, 234->248, 285-293, 313, 319, 337->339, 353, 371->381, 374, 423->435, 427, 549-554, 592-597, 713->720, 781->786, 787-788, 908, 912, 916, 922, 954, 971, 975, 982->984, 1092->exit, 1096->1093, 1103->1108, 1113->1123, 1128, 1150, 1177
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 16 154 11 95% 27-28, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 168 20 58 7 88% 66, 83, 97, 311, 339, 350, 365-367, 396-398, 408-416, 419-422
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 17 0 4 0 100%
nvtabular/ops/bucketize.py 37 10 20 3 70% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 658 68 352 47 87% 252, 254, 272, 276, 284, 292, 294, 321, 342-343, 390->394, 398-405, 487-488, 521-526, 629, 725, 742, 787, 865-866, 881-885, 886->850, 904, 912, 919->exit, 943, 946->949, 998->996, 1058, 1063, 1084->1088, 1090->1045, 1096-1099, 1111, 1115, 1119, 1126, 1131-1134, 1212, 1214, 1284->1306, 1290->1306, 1307-1312, 1352, 1369->1374, 1373, 1383->1380, 1388->1380, 1395, 1398, 1406-1416
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 123 27 40 5 74% 19-20, 29-30, 82->exit, 112, 147, 211-212, 221-223, 231-247, 264->267, 268, 278
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 33 1 12 1 96% 73->75, 98
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 91 14 40 4 80% 63-67, 75-80, 93, 121, 150->152, 162-165
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 128 8 82 6 93% 72, 83, 93->95, 105->110, 137, 142, 148-153
nvtabular/ops/hash_bucket.py 40 2 22 2 94% 73, 106->112, 118
nvtabular/ops/hashed_cross.py 36 4 17 3 87% 53, 66, 81, 91
nvtabular/ops/join_external.py 95 18 38 7 77% 20-21, 115, 117, 119, 136-162, 178->180, 226->237, 231
nvtabular/ops/join_groupby.py 104 5 38 4 94% 109, 116, 125, 132->131, 225-226
nvtabular/ops/lambdaop.py 39 6 20 6 80% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 85 29 42 0 63% 21-22, 141-155, 163-185
nvtabular/ops/logop.py 19 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 89 6 22 1 92% 89, 104, 137-138, 165, 176
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 41 7 24 4 83% 47, 64-69, 88-90
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 157 9 68 4 92% 169->173, 177->186, 243-244, 260-266, 357->360, 373
nvtabular/ops/value_counts.py 32 0 6 1 97% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 50 7 22 1 81% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 201 15 84 10 91% 28-29, 47, 177, 183->197, 209-211, 324, 339-340, 375, 451, 467-469, 482
TOTAL 8496 1668 3530 372 78%
Coverage XML written to file coverage.xml
Required test coverage of 70% reached. Total coverage: 78.14%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:613: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1586 passed, 10 skipped, 720 warnings in 1474.29s (0:24:34) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins1792757220000546090.sh
Click to view CI Results
GitHub pull request #1321 of commit e9bd538fd1e826a58c1b9891dfbc76aac1ef0274, no merge conflicts.
Running as SYSTEM
Setting status of e9bd538fd1e826a58c1b9891dfbc76aac1ef0274 to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/4037/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
> git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
> git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
> git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1321/*:refs/remotes/origin/pr/1321/* # timeout=10
> git rev-parse e9bd538fd1e826a58c1b9891dfbc76aac1ef0274^{commit} # timeout=10
Checking out Revision e9bd538fd1e826a58c1b9891dfbc76aac1ef0274 (detached)
> git config core.sparsecheckout # timeout=10
> git checkout -f e9bd538fd1e826a58c1b9891dfbc76aac1ef0274 # timeout=10
Commit message: "Merge branch 'main' into multigpu_tensorflow_movielens"
> git rev-list --no-walk 8459d2faf28cbe9504b56a466ca09fd762e2888e # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins8808612084982961916.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.3.1)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
Downloading setuptools-60.5.0-py3-none-any.whl (958 kB)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.8.0+7.gb459467
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+62.ge9bd538 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+62.ge9bd538 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+62.ge9bd538 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.8.0+62.ge9bd538 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so ->
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.8.0+62.ge9bd538 is already the active version in easy-install.pth
Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.8.0+62.ge9bd538
Searching for packaging==21.3
Best match: packaging 21.3
Adding packaging 21.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for protobuf==3.19.1
Best match: protobuf 3.19.1
Adding protobuf 3.19.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.5.0
Best match: tensorflow-metadata 1.5.0
Processing tensorflow_metadata-1.5.0-py3.8.egg
tensorflow-metadata 1.5.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/tensorflow_metadata-1.5.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.62.3
Best match: tqdm 4.62.3
Processing tqdm-4.62.3-py3.8.egg
tqdm 4.62.3 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin
Using /usr/local/lib/python3.8/dist-packages/tqdm-4.62.3-py3.8.egg
Searching for numba==0.54.1
Best match: numba 0.54.1
Adding numba 0.54.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for pandas==1.3.5
Best match: pandas 1.3.5
Processing pandas-1.3.5-py3.8-linux-x86_64.egg
pandas 1.3.5 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/pandas-1.3.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.7.1
Best match: distributed 2021.7.1
Processing distributed-2021.7.1-py3.8.egg
distributed 2021.7.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.7.1-py3.8.egg
Searching for dask==2021.7.1
Best match: dask 2021.7.1
Processing dask-2021.7.1-py3.8.egg
dask 2021.7.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg
Searching for pyparsing==3.0.6
Best match: pyparsing 3.0.6
Adding pyparsing 3.0.6 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.54.0
Best match: googleapis-common-protos 1.54.0
Adding googleapis-common-protos 1.54.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.3
Best match: numpy 1.20.3
Adding numpy 1.20.3 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for setuptools==59.7.0
Best match: setuptools 59.7.0
Adding setuptools 59.7.0 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.3
Best match: pytz 2021.3
Adding pytz 2021.3 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.2
Best match: toolz 0.11.2
Processing toolz-0.11.2-py3.8.egg
toolz 0.11.2 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/toolz-0.11.2-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.3
Best match: msgpack 1.0.3
Processing msgpack-1.0.3-py3.8-linux-x86_64.egg
msgpack 1.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/msgpack-1.0.3-py3.8-linux-x86_64.egg
Searching for cloudpickle==2.0.0
Best match: cloudpickle 2.0.0
Processing cloudpickle-2.0.0-py3.8.egg
cloudpickle 2.0.0 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/cloudpickle-2.0.0-py3.8.egg
Searching for click==8.0.3
Best match: click 8.0.3
Processing click-8.0.3-py3.8.egg
click 8.0.3 is already the active version in easy-install.pth
Using /usr/local/lib/python3.8/dist-packages/click-8.0.3-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for fsspec==2021.11.1
Best match: fsspec 2021.11.1
Adding fsspec 2021.11.1 to easy-install.pth file
Using /usr/local/lib/python3.8/dist-packages
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file
Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth
Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.8.0+62.ge9bd538
Running black --check
All done! ✨ 🍰 ✨
174 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
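As the pylint hint above suggests, the `c-extension-no-member` message can be silenced via configuration rather than code changes. A sketch of the relevant `.pylintrc` fragment (an assumption for illustration — the project's actual pylint config file and section may differ):

```ini
# .pylintrc fragment (hypothetical): allow runtime introspection of the
# numpy.random.mtrand C extension so pylint can see RandomState
[MASTER]
extension-pkg-allow-list=numpy.random.mtrand
```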
Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.7) or chardet (3.0.4) doesn't match a supported version!
  warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 1641 items / 3 skipped / 1638 selected
tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 18%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 19%]
tests/unit/test_tf4rec.py . [ 19%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 23%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 24%]
................................................... [ 27%]
tests/unit/framework_utils/test_torch_layers.py . [ 27%]
tests/unit/graph/test_base_operator.py .... [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py ..................... [ 34%]
tests/unit/graph/ops/test_selection.py ... [ 34%]
tests/unit/inference/test_graph.py . [ 34%]
tests/unit/inference/test_inference_ops.py .. [ 34%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 35%]
tests/unit/loader/test_dataloader_backend.py ...... [ 35%]
tests/unit/loader/test_tf_dataloader.py ..........F
=================================== FAILURES ===================================
________________ test_tf_gpu_dl[cpu-False-True-1-parquet-0.01] _________________
tmpdir = local('/tmp/pytest-of-jenkins/pytest-50/test_tf_gpu_dl_cpu_False_True_0')
paths = ['/tmp/pytest-of-jenkins/pytest-50/parquet0/dataset-0.parquet', '/tmp/pytest-of-jenkins/pytest-50/parquet0/dataset-1.parquet']
use_paths = True, device = 'cpu', cpu_true = False
dataset = <nvtabular.io.dataset.Dataset object at 0x7faff8759bb0>
batch_size = 1, gpu_memory_frac = 0.01, engine = 'parquet'
@pytest.mark.parametrize("gpu_memory_frac", [0.01, 0.06])
@pytest.mark.parametrize("engine", ["parquet"])
@pytest.mark.parametrize("batch_size", [1, 10, 100])
@pytest.mark.parametrize("use_paths", [True, False])
@pytest.mark.parametrize("cpu_true", [False, True])
@pytest.mark.parametrize("device", ["cpu", 0])
def test_tf_gpu_dl(
    tmpdir, paths, use_paths, device, cpu_true, dataset, batch_size, gpu_memory_frac, engine
):
    cont_names = ["x", "y", "id"]
    cat_names = ["name-string"]
    label_name = ["label"]
    if engine == "parquet":
        cat_names.append("name-cat")
    columns = cont_names + cat_names
    conts = cont_names >> ops.FillMedian() >> ops.Normalize()
    cats = cat_names >> ops.Categorify()
    workflow = nvt.Workflow(conts + cats + label_name)
    workflow.fit(dataset)
tests/unit/loader/test_tf_dataloader.py:255:
nvtabular/workflow/workflow.py:216: in fit
    results = dask.compute(stats, scheduler="synchronous")[0]
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/base.py:568: in compute
    results = schedule(dsk, keys, **kwargs)
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:560: in get_sync
    return get_async(
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:503: in get_async
    for key, res_info, failed in queue_get(queue).result():
/usr/lib/python3.8/concurrent/futures/_base.py:437: in result
    return self.__get_result()
/usr/lib/python3.8/concurrent/futures/_base.py:389: in __get_result
    raise self._exception
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:545: in submit
    fut.set_result(fn(*args, **kwargs))
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:237: in batch_execute_tasks
    return [execute_task(a) for a in it]
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:237: in <listcomp>
    return [execute_task(a) for a in it]
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:228: in execute_task
    result = pack_exception(e, dumps)
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/local.py:223: in execute_task
    result = _execute_task(task, data)
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/core.py:121: in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/optimization.py:969: in __call__
    return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args)))
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/core.py:151: in get
    result = _execute_task(task, cache)
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/core.py:121: in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
../../../.local/lib/python3.8/site-packages/dask-2021.7.1-py3.8.egg/dask/utils.py:35: in apply
    return func(*args, **kwargs)
nvtabular/workflow/workflow.py:462: in _transform_partition
    output_df = node.op.transform(selection, input_df)
/usr/lib/python3.8/contextlib.py:75: in inner
    return func(*args, **kwds)
nvtabular/ops/fill.py:107: in transform
    df[col] = df[col].fillna(self.medians[col])
/usr/local/lib/python3.8/dist-packages/cudf/core/series.py:2659: in fillna
    return super().fillna(
/usr/local/lib/python3.8/dist-packages/cudf/core/frame.py:1334: in fillna
    copy_data[name] = copy_data[name].fillna(value[name], method)
self = <cudf.core.column.numerical.NumericalColumn object at 0x7fb000412f40>
[
975,
996,
975,
1040,
1007,
978,
987,
999,
991,
1022,
...
1019,
1041,
985,
947,
1035,
1009,
984,
1036,
972,
1035
]
dtype: int64
fill_value = 999.5, method = None, dtype = None, fill_nan = True
def fillna(
    self,
    fill_value: Any = None,
    method: str = None,
    dtype: Dtype = None,
    fill_nan: bool = True,
) -> NumericalColumn:
    """
    Fill null values with *fill_value*
    """
    if fill_nan:
        col = self.nans_to_nulls()
    else:
        col = self
    if method is not None:
        return super(NumericalColumn, col).fillna(fill_value, method)
    if fill_value is None:
        raise ValueError("Must specify either 'fill_value' or 'method'")
    if (
        isinstance(fill_value, cudf.Scalar)
        and fill_value.dtype == col.dtype
    ):
        return super(NumericalColumn, col).fillna(fill_value, method)
    if np.isscalar(fill_value):
        # cast safely to the same dtype as self
        fill_value_casted = col.dtype.type(fill_value)
        if not np.isnan(fill_value) and (fill_value_casted != fill_value):
            raise TypeError(
                f"Cannot safely cast non-equivalent "
                f"{type(fill_value).__name__} to {col.dtype.name}"
            )
E TypeError: Cannot safely cast non-equivalent float to int64
/usr/local/lib/python3.8/dist-packages/cudf/core/column/numerical.py:380: TypeError
------------------------------ Captured log call -------------------------------
ERROR nvtabular:workflow.py:464 Failed to transform operator <nvtabular.ops.fill.FillMedian object at 0x7fb0180bd700>
Traceback (most recent call last):
  File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py", line 462, in _transform_partition
    output_df = node.op.transform(selection, input_df)
  File "/usr/lib/python3.8/contextlib.py", line 75, in inner
    return func(*args, **kwds)
  File "/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py", line 107, in transform
    df[col] = df[col].fillna(self.medians[col])
  File "/usr/local/lib/python3.8/dist-packages/cudf/core/series.py", line 2659, in fillna
    return super().fillna(
  File "/usr/local/lib/python3.8/dist-packages/cudf/core/frame.py", line 1334, in fillna
    copy_data[name] = copy_data[name].fillna(value[name], method)
  File "/usr/local/lib/python3.8/dist-packages/cudf/core/column/numerical.py", line 380, in fillna
    raise TypeError(
TypeError: Cannot safely cast non-equivalent float to int64
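The safe-cast check shown in the cudf source above can be reproduced outside cudf. A minimal sketch (the helper `safe_fill_value` is hypothetical, not part of cudf or NVTabular) of why a median like 999.5 cannot fill an int64 column: the fill value is cast to the column dtype, and if the cast round-trip is lossy the fill is rejected instead of silently truncating.

```python
import numpy as np

def safe_fill_value(fill_value, dtype):
    # Cast the scalar to the column dtype, mirroring the check quoted above
    casted = dtype.type(fill_value)
    # A lossy cast (999.5 -> 999) is rejected rather than silently truncated
    if not np.isnan(fill_value) and casted != fill_value:
        raise TypeError(
            f"Cannot safely cast non-equivalent "
            f"{type(fill_value).__name__} to {dtype.name}"
        )
    return casted

print(safe_fill_value(1000.0, np.dtype("int64")))  # exact: allowed
try:
    safe_fill_value(999.5, np.dtype("int64"))      # lossy: raises TypeError
except TypeError as exc:
    print(exc)
```

This is why `FillMedian` fails on an integer column here: the computed median (999.5) is a float with no exact int64 representation.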
=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 1 warning
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 7 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
  warn(NumbaPerformanceWarning(msg))
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:86: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_dask_nvt.py: 2 warnings
tests/unit/test_io.py: 52 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:375: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(
tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(
tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
  warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)
tests/unit/test_io.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(
tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(
tests/unit/test_io.py::test_parquet_lists[2-Shuffle.PER_PARTITION-0]
tests/unit/test_io.py::test_parquet_lists[2-Shuffle.PER_PARTITION-1]
tests/unit/test_io.py::test_parquet_lists[2-Shuffle.PER_PARTITION-2]
tests/unit/test_io.py::test_parquet_lists[2-None-0]
tests/unit/test_io.py::test_parquet_lists[2-None-1]
tests/unit/test_io.py::test_parquet_lists[2-None-2]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:868: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(
tests/unit/test_io.py: 20 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:521: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
tests/unit/test_tools.py::test_cat_rep[None-1000]
tests/unit/test_tools.py::test_cat_rep[distro1-1000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (3) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
  warn(NumbaPerformanceWarning(msg))
tests/unit/test_tools.py::test_cat_rep[None-10000]
tests/unit/test_tools.py::test_cat_rep[distro1-10000]
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (30) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
  warn(NumbaPerformanceWarning(msg))
tests/unit/loader/test_tf_dataloader.py::test_nested_list
/usr/local/lib/python3.8/dist-packages/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (2) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
  warn(NumbaPerformanceWarning(msg))
-- Docs: https://docs.pytest.org/en/stable/warnings.html
---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 124 166 35 60% 37-39, 42-46, 51-53, 59-69, 76-77, 107, 110, 112, 116-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 207-210, 223-226, 231-234, 245, 248, 265-266, 274, 278-280, 286, 303, 311, 318, 346-362, 371-377, 382, 397, 408-411, 416, 432, 438, 445-448, 462-464, 466, 468, 472-483, 523, 540, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 9 34 7 85% 51->53, 64, 71->76, 75, 109, 118-120, 129-131
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 95 7 36 6 89% 103-108, 143->148, 157->161, 169->173, 174, 227, 231
nvtabular/graph/graph.py 44 5 24 0 87% 84-89
nvtabular/graph/node.py 282 77 151 20 69% 49, 63, 73-81, 86->89, 133, 136, 202-218, 221-240, 283, 301, 316-321, 326, 328, 334, 348, 358-369, 374->377, 388-396, 405, 406->401, 420-421, 429, 430->413, 436-439, 443, 470, 477-482, 501
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 18 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 20 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 15 1 2 0 94% 60
nvtabular/graph/ops/subtraction.py 21 11 4 0 48% 26-27, 36, 45-51, 54-55
nvtabular/graph/schema.py 129 8 65 7 92% 38, 64, 156, 161->exit, 165, 178, 185, 210, 213, 218->217
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 122 11 58 13 87% 45, 61->68, 64->66, 75, 92->97, 95->97, 118->133, 121-122, 124-127, 129, 169->185, 177, 181
nvtabular/graph/selector.py 88 2 48 1 98% 121, 158
nvtabular/graph/tags.py 16 0 2 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 57 42 26 0 20% 39-103, 107-118
nvtabular/inference/graph/graph.py 27 4 14 2 80% 42, 50-57
nvtabular/inference/graph/node.py 15 9 4 0 42% 22-23, 26-27, 31-36
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 32 6 12 1 80% 13-14, 19, 36, 40, 49
nvtabular/inference/graph/ops/tensorflow.py 50 18 16 2 64% 34-47, 79-83, 92-95
nvtabular/inference/graph/ops/workflow.py 30 1 4 1 94% 49
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 285 147 100 9 51% 89-93, 156-192, 236-284, 301-305, 377-385, 414-430, 483-493, 542-582, 588-604, 608-675, 682->685, 685->681, 702->701, 751, 757-776, 782-806, 813
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 9 74 12 92% 111, 114, 150, 226, 401, 411, 428->431, 439, 443->445, 445->441, 450, 452
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 43 168 28 85% 48-49, 268, 270, 283, 308-322, 446->520, 451-454, 459->469, 476->474, 477->481, 494->498, 509, 520->529, 580-581, 582->586, 634, 762, 764, 766, 772, 776-778, 780, 840-841, 875, 882-883, 889, 895, 992-993, 1111-1116, 1122, 1134-1135
nvtabular/io/dataset_engine.py 31 2 6 1 92% 48, 74
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 591 50 218 34 88% 35-36, 59, 81->161, 92, 106, 118-132, 145-158, 181, 210-211, 228->253, 239->253, 247->253, 284->300, 290-298, 318, 324, 342->344, 358, 376->386, 379, 432, 440, 554-559, 597-602, 718->725, 786->791, 792-793, 913, 917, 921, 927, 959, 976, 980, 987->989, 1097->exit, 1101->1098, 1108->1113, 1118->1128, 1133, 1155, 1182, 1186
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 30 154 15 91% 27-28, 93, 98-99, 126, 143, 158-160, 294, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 509-518, 589-590, 624-628, 633
nvtabular/loader/tensorflow.py 168 40 58 7 77% 55-57, 66, 83, 92-97, 311, 339, 350, 365-367, 390-400, 404, 408-416, 419-422, 425-429
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 39 26 3 50% 28-30, 33-39, 114, 119, 124-130, 154-166, 169, 174-179, 182-187, 190-191
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 15 3 2 0 82% 33, 37, 41
nvtabular/ops/bucketize.py 38 20 20 2 38% 52-54, 58->exit, 59-64, 71-87, 90, 93-94
nvtabular/ops/categorify.py 658 147 350 78 73% 252, 254, 272, 276, 280, 284, 288, 292, 294, 298, 321, 324-329, 342-343, 372-376, 390->394, 398-405, 433, 447->450, 451, 456, 459, 482-483, 490-498, 560-565, 597, 624->626, 627, 628->630, 634, 636, 645, 724, 726->729, 732, 749, 758-763, 794, 828, 872-873, 888-892, 893->857, 911, 919, 926-927, 944-945, 950, 953->956, 982, 1002-1020, 1036, 1055->1057, 1060, 1062-1065, 1070, 1073, 1075->1078, 1083->1052, 1091-1098, 1099->1101, 1103-1106, 1118, 1122, 1126, 1133, 1138-1141, 1219, 1221, 1283, 1291->1314, 1297->1314, 1315-1320, 1338, 1342-1350, 1353, 1364-1372, 1379, 1385->1390, 1389, 1395, 1398, 1403-1417, 1438-1446
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 122 86 38 0 24% 19-20, 29-30, 73-79, 82-89, 93-115, 126-127, 130-135, 139, 143, 169-198, 207-208, 217-219, 227-243, 252-277, 281-284, 288-289
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 39 20 14 1 42% 60->63, 70-79, 84, 87-92, 95, 98, 101-102
nvtabular/ops/dropna.py 8 3 2 0 70% 39-41
nvtabular/ops/fill.py 65 12 26 6 76% 52-54, 62-66, 72, 102, 106, 108, 134
nvtabular/ops/filter.py 20 3 8 3 79% 49, 56, 60
nvtabular/ops/groupby.py 110 11 72 9 85% 72, 83, 93->95, 105->110, 122, 129, 138->137, 208, 236, 242-249
nvtabular/ops/hash_bucket.py 43 22 22 2 38% 69, 73, 82-93, 98-102, 105-116, 120, 124
nvtabular/ops/hashed_cross.py 37 22 17 1 33% 52, 58-69, 74-79, 82, 87-92
nvtabular/ops/join_external.py 96 19 34 11 72% 20-21, 114, 116, 118, 131, 138, 142-145, 150-151, 156-157, 205-206, 220-227
nvtabular/ops/join_groupby.py 112 20 47 9 76% 106, 108, 115, 121-124, 131-134, 139-141, 172-175, 176->170, 219-220, 235-236
nvtabular/ops/lambdaop.py 46 6 22 6 82% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 86 39 42 5 45% 21-22, 67-68, 74, 86-94, 105, 121->127, 142-156, 164-186
nvtabular/ops/logop.py 21 2 6 1 89% 48-49
nvtabular/ops/moments.py 69 1 24 1 98% 71
nvtabular/ops/normalize.py 93 27 22 3 67% 72, 77, 82, 89, 126-128, 134-142, 148, 155-159, 162-163, 167, 176, 180
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 29 3 14 3 86% 45, 70-72
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 175 126 76 0 20% 165-206, 209-212, 215, 224-225, 228-241, 244-251, 254-257, 261, 264-267, 270-271, 274-275, 278-282, 286-365, 369-397, 407-416
nvtabular/ops/value_counts.py 34 20 6 0 40% 37-53, 56, 59, 62-64, 67
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 7 94% 25-26, 124-127, 137-139, 161-162, 313, 323, 347->346, 349
nvtabular/tools/dataset_inspector.py 50 7 22 1 81% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 106 43 48 8 54% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 201 15 84 10 91% 28-29, 47, 177, 183->197, 209-211, 324, 339-340, 375, 451, 467-469, 482
TOTAL 8389 2288 3515 448 70%
Coverage XML written to file coverage.xml
FAIL Required test coverage of 70% not reached. Total coverage: 69.56%
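For context on the coverage gate reported above: a fail-under threshold like this 70% is commonly declared in the project's coverage configuration. A hedged sketch of what such a fragment typically looks like (the exact file and section used by NVTabular may differ, e.g. setup.cfg or .coveragerc):

```toml
# pyproject.toml fragment (hypothetical location for this project)
[tool.coverage.report]
fail_under = 70
```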
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [1] tests/unit/inference/test_ensemble.py:32: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [1] tests/unit/inference/test_export.py:8: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [8] tests/unit/test_io.py:613: could not import 'uavro': No module named 'uavro'
!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!
===== 1 failed, 586 passed, 11 skipped, 299 warnings in 609.85s (0:10:09) ======
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins5465786226495981951.sh
@bschifferer I liked this change, but I'm not sure we ever got it to a point where we could merge it. Should we keep this open and revive it, or close it and re-submit a version built on main?