
Open deep learning compiler stack for CPU, GPU and specialized accelerators

Results: 636 tvm issues

Now that we've published the apache-tvm wheel, we can tell folks to use pip to install TVM rather than building from source (a quick check is sketched below). This issue tracks updating the docs: - [...
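
A quick check of the wheel-based install, assuming the package published on PyPI is named apache-tvm as the issue states:

```python
# Assumes the wheel published on PyPI is named "apache-tvm", per the issue above:
#     pip install apache-tvm
import tvm

# Confirm the installed package imports and report its version.
print(tvm.__version__)
```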

Hi, I'm attempting to run a model via RPC and TVMC but I'm running into an issue: ``` ValueError: Cannot pass argument 1, type Map is not supported by RPC...

type: bug
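
For context, a minimal sketch of the typical RPC flow, assuming an RPC server is already running; the host, port, and library name are placeholders, and this does not by itself reproduce the Map-argument error above:

```python
from tvm import rpc

# Connect to a running RPC server (host and port are placeholders).
remote = rpc.connect("127.0.0.1", 9090)

# Upload a previously built library and load it on the remote device.
remote.upload("compiled_lib.tar")
rlib = remote.load_module("compiled_lib.tar")

# As the traceback above shows, Map values cannot be marshalled as remote
# function arguments; only RPC-serializable argument types are accepted.
```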

Using this test script:
```python
import tvm
import tempfile
import tvm.script.tir as T

@T.prim_func
def bad_message(
    B: T.Buffer[(64), "float32"],
    B_pack: T.Buffer[(8, 8), "float32"],
):
    with T.block("root"):
        for jo in range(8):
            ...
```

type: bug
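
For reference, a minimal sketch of a well-formed TensorIR packing function over the same buffer shapes; the function, block, and loop-variable names are illustrative, and the exact `T.Buffer` annotation form depends on the TVM version in use:

```python
import tvm.script.tir as T

@T.prim_func
def pack(B: T.Buffer[(64,), "float32"], B_pack: T.Buffer[(8, 8), "float32"]):
    # Copy a flat 64-element buffer into an 8x8 packed layout.
    for jo, ji in T.grid(8, 8):
        with T.block("pack"):
            vjo, vji = T.axis.remap("SS", [jo, ji])
            B_pack[vjo, vji] = B[vjo * 8 + vji]
```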

This is a global tracking issue for landing the meta schedule. The RFC can be found [here](https://github.com/apache/tvm-rfcs/blob/main/rfcs/0005-meta-schedule-autotensorir.md). ## Steps The steps are numbered following TensorIR (#7527). ### [M3a] Core infrastructure...

type:rfc-tracking

Support GREATER quantization operation conversion as part of issue #9187

This updates the TensorFlow version used in TVM CI to 2.9.1, whose official packages support more platforms. This PR updates the Docker...

There are a few test cases here that can be parameterized via tvm.testing.parameter (see the sketch below). At least these, but please check the rest of the file too: test_forward_add_layer_params, test_forward_multiply_layer_params, test_forward_concat_layer_params

type:ci
actionable
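
A minimal sketch of the `tvm.testing.parameter` pattern being asked for; the fixture name, values, and test body are illustrative and not taken from the file in question:

```python
import tvm.testing

# tvm.testing.parameter declares a pytest fixture; every test that accepts
# the fixture by name is run once per value.
layer_op = tvm.testing.parameter("add", "multiply", "concat")

def test_forward_layer_params(layer_op):
    # Illustrative body only: a real test would build and run the layer.
    assert layer_op in ("add", "multiply", "concat")
```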

This issue is to track progress for `Name mangling in IRModules`.
PR: #12066
RFC discussion: https://github.com/apache/tvm-rfcs/pull/84
Forum: https://discuss.tvm.apache.org/t/pre-rfc-name-mangling-in-irmodules/12944/7

type:rfc-tracking

We have seen a couple of bugs due to microTVM being presumed ON in config.cmake. Namely, you currently get Python errors when importing TVM if USE_MICRO is OFF. We should have a...

When I try to import DLRM (a PyTorch model) into Relay, I encounter the error: ImplementationError: The following operators are not implemented: ['aten::embedding_bag']. PyTorch is 1.7.0 and TVM is the latest.

status: help wanted
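
For context, a sketch of the usual PyTorch-to-Relay import flow; the model and input shapes are placeholders, and a model that actually contains `aten::embedding_bag` would still fail at the import step with the error quoted above:

```python
import torch
from tvm import relay

# Trace a small placeholder model; DLRM itself would need its real inputs.
model = torch.nn.Linear(16, 4).eval()
example_input = torch.randn(1, 16)
traced = torch.jit.trace(model, example_input)

# Import the TorchScript module into Relay. Unsupported operators such as
# aten::embedding_bag are reported at this step.
shape_list = [("input0", (1, 16))]
mod, params = relay.frontend.from_pytorch(traced, shape_list)
print(mod)
```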