
[Relax][Torch] AssertionError: Unsupported function types ['mean.default']

Open · locnd182644 opened this issue 2 weeks ago · 1 comment

Summary

An error occurs when creating a module from an exported_program that contains torch.mean without a dim argument.
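For context, torch.mean(x) with no dim lowers to the mean.default overload, which reduces over all elements to a single scalar, while torch.mean(x, dim=...) lowers to mean.dim. A torch-free sketch of the two behaviours (pure Python, for illustration only; the function names are mine, not PyTorch's):

```python
# Pure-Python illustration of the two torch.mean overloads (no torch needed).
# mean.default: reduce over ALL elements; mean.dim: reduce along one axis.

def mean_default(matrix):
    """Full reduction, like torch.mean(x): one scalar for the whole tensor."""
    flat = [v for row in matrix for v in row]
    return sum(flat) / len(flat)

def mean_dim0(matrix):
    """Reduction along dim 0, like torch.mean(x, dim=0): one value per column."""
    n_rows = len(matrix)
    return [sum(col) / n_rows for col in zip(*matrix)]

x = [[1.0, 2.0], [3.0, 4.0]]
print(mean_default(x))  # 2.5
print(mean_dim0(x))     # [2.0, 3.0]
```

The importer maps each overload name separately, which is why mean.dim can be supported while mean.default is not.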

Reproduce

  • Module:
import torch
from torch import nn
from torch.export import export as torch_export
from tvm.relax.frontend.torch import from_exported_program

class MeanModule(nn.Module):
    def forward(self, x):
        return torch.mean(x)  # no dim argument → lowers to mean.default
...
# Export → Relax
ep = torch_export(m, (x,))
mod = from_exported_program(ep)
  • Error log:
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
Cell In[2], line 13
     11 # Export → Relax
     12 ep = torch_export(m, (x,))
---> 13 mod = from_exported_program(ep)
     15 mod.show()
     17 target = "llvm"

File ~/Programming/tvm/python/tvm/relax/frontend/torch/exported_program_translator.py:1783, in from_exported_program(exported_program, keep_params_as_input, unwrap_unit_return_tuple, no_bind_return_tuple, run_ep_decomposition)
   1780 if run_ep_decomposition:
   1781     exported_program = exported_program.run_decompositions()
-> 1783 return ExportedProgramImporter().from_exported_program(
   1784     exported_program,
   1785     keep_params_as_input,
   1786     unwrap_unit_return_tuple,
   1787     no_bind_return_tuple,
   1788 )

File ~/Programming/tvm/python/tvm/relax/frontend/torch/exported_program_translator.py:1642, in ExportedProgramImporter.from_exported_program(self, exported_program, keep_params_as_input, unwrap_unit_return_tuple, no_bind_return_tuple)
   1639 nodes: List[fx.Node] = exported_program.graph.nodes
   1641 # Find all the missing function types
-> 1642 self._check_unsupported_func_type(nodes)
   1644 with self.block_builder.function(
   1645     name=func_name, params=list(inputs_vars.values()).copy(), attrs=func_attrs
   1646 ):
   1647     output = None

File ~/Programming/tvm/python/tvm/relax/frontend/torch/base_fx_graph_translator.py:182, in BaseFXGraphImporter._check_unsupported_func_type(self, nodes)
    174 def _check_unsupported_func_type(self, nodes: List[fx.Node]):
    175     missing_func_types = list(
    176         {
    177             node.target.__name__
   (...)    180         }
    181     )
--> 182     assert not missing_func_types, f"Unsupported function types {missing_func_types}"

AssertionError: Unsupported function types ['mean.default']

Proposed fix:

  • Add "mean.default" to create_convert_map in the ExportedProgramImporter class.

locnd182644 · Dec 10 '25 09:12

Summary of Changes

Hello @locnd182644, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a specific conversion issue within the Relax frontend for PyTorch's exported_program. Previously, torch.mean calls that did not explicitly specify a dimension (dim) would cause an assertion failure during the translation process. The changes ensure that these default mean operations are now correctly mapped and handled, improving the robustness of the exported_program importer.

Highlights

  • Bug Fix: Resolved an AssertionError that occurred when converting exported_program containing torch.mean operations without a specified dim argument to Relax.
  • Conversion Map Update: The create_convert_map in ExportedProgramImporter has been updated to explicitly recognize and handle mean.default (i.e., torch.mean() without dim).
  • New Test Case: Added a new test case, MeanWithoutDim, verifying that torch.mean() without dim is correctly translated by the exported_program frontend.

gemini-code-assist[bot] · Dec 10 '25 09:12

cc @tvm-bot rerun

tlopex · Dec 12 '25 16:12