
ONNX Support

Open davideboschetto opened this issue 8 years ago • 15 comments

Hi there.

I see no discussions about ONNX, an open source framework to share models among different deep learning libraries. I don't know if this is better implemented at the Keras level or at the backend level (CNTK already supports it), but I think it's worth discussing!

davideboschetto avatar Nov 30 '17 07:11 davideboschetto

As of now, ONNX is a talking point used by large companies that doesn't seem to address an existing problem and doesn't seem to have actual users. This may change in the future, but this is the current situation.

Keras takes a very pragmatic approach to development, implementing features that match user needs rather than abstract PR-driven strategies. If ONNX becomes significant in the future, we will add support for it. But while its purpose is to serve as a corporate PR talking point, we will not invest development effort into it.

I encourage you to look at NNEF and OpenVX. Should Keras support them? Why? Why not? Is ONNX more relevant than these?

I would also point out that there is an existing NN exchange format that works across TF, CNTK, Theano, browsers, the JVM, and even MXNet to an extent: Keras savefiles. Importantly, Keras savefiles already have lots of users.

fchollet avatar Nov 30 '17 19:11 fchollet

Now that we have this, https://cloudblogs.microsoft.com/opensource/2020/01/21/microsoft-onnx-open-source-optimizations-transformer-inference-gpu-cpu/

Can we reopen this issue?

hrisheekeshr avatar Feb 19 '20 10:02 hrisheekeshr

there is an existing NN exchange format that works across TF, CNTK, Theano, browsers, the JVM, and even MXNet to an extent: Keras savefiles. Importantly, Keras savefiles already have lots of users.

Is there any documentation on how to use Keras savefiles in the latest version of Keras?

Ark-kun avatar Aug 07 '20 09:08 Ark-kun

I think reopening the issue might be the right choice! In the meantime, ONNX has provided this converter: https://github.com/onnx/tensorflow-onnx. It only works with saved_model or .meta, though (with the usual drawbacks of freezing the graph and exporting from there).

davideboschetto avatar Aug 07 '20 12:08 davideboschetto

cc @sachinprasadhs This should be reopened.

innat avatar Dec 31 '23 12:12 innat

@innat can you say more about what you had in mind?

Something that goes directly from keras.ops to the ONNX format?

And how would that compare to going to ONNX through a backend? E.g. set the torch backend, use torch's ONNX tools to export to ONNX?
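
Roughly, I imagine that second path looking something like this untested sketch (paths and shapes are placeholders, and it assumes a torch-backend Keras model can be traced like a regular torch.nn.Module):

```python
# Untested sketch: the torch backend must be set before keras is imported,
# and torch.onnx.export must be able to trace the Keras model.
import os
os.environ["KERAS_BACKEND"] = "torch"

import torch
import keras

model = keras.models.load_model("my_model.keras")  # placeholder path

dummy_input = torch.randn(1, 224, 224, 3)  # placeholder input shape
torch.onnx.export(model, (dummy_input,), "my_model.onnx")
```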

mattdangerw avatar Jan 04 '24 02:01 mattdangerw

@mattdangerw

Something that goes directly from keras.ops to the ONNX format?

Sounds good. Here is jax2xla, a NumPy-backend ONNX interpreter: google/jaxonnxruntime.

And how would that compare to going to ONNX through a backend? E.g. set the torch backend, use torch's ONNX tools to export to ONNX?

I think this suits better. However,

  • If it is ensured that a Keras model written with the JAX backend can run out of the box with the torch backend, then the torch backend can be set and torch.onnx used to export.
  • AFAIK, there is no direct jax2onnx conversion. To achieve this, one can first convert with jax2tf and then with tf2onnx (see the sketch below).
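
For the second point, here is a rough, untested sketch of that two-step path (the function, shapes, and file names are placeholders, and whether tf2onnx can handle all the ops jax2tf emits is a separate question):

```python
# Untested sketch: JAX fn -> tf.function via jax2tf -> ONNX via tf2onnx.
import jax.numpy as jnp
import tensorflow as tf
import tf2onnx
from jax.experimental import jax2tf

def jax_fn(x):
    return jnp.sin(x) * 2.0  # placeholder JAX computation

spec = [tf.TensorSpec([None, 32], tf.float32)]

# native_serialization=False keeps plain TF ops in the graph, which tf2onnx is
# more likely to understand (parameter availability depends on the JAX version).
tf_fn = tf.function(jax2tf.convert(jax_fn, native_serialization=False),
                    input_signature=spec)

onnx_model, _ = tf2onnx.convert.from_function(
    tf_fn, input_signature=spec, output_path="jax_fn.onnx"
)
```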

innat avatar Jan 04 '24 19:01 innat

Thanks all for the inputs.

I think the most approachable way to do this so far is either going from a TF SavedModel to ONNX, or from the torch backend (less mature on the Keras side) to ONNX. For any Keras model, a user can probably follow this workflow (a rough sketch follows the list):

  1. Build/train/test the Keras model in any backend.
  2. Save the model in the .keras format.
  3. Reload the Keras model with the TF backend.
  4. Export the Keras model with the TF backend to a TF SavedModel.
  5. Convert the SavedModel to ONNX.
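
A rough, untested sketch of steps 2-5 (model name and paths are placeholders; the last step uses tf2onnx's CLI):

```python
# Untested sketch of steps 2-5; the TF backend must be set before importing keras.
import os
os.environ["KERAS_BACKEND"] = "tensorflow"

import keras

# Steps 2-3: reload a model that was saved in the .keras format
# (possibly built/trained under a different backend).
model = keras.models.load_model("my_model.keras")

# Step 4: export a TF SavedModel via Keras 3's model.export().
model.export("my_saved_model")

# Step 5: convert the SavedModel to ONNX, e.g. with tf2onnx's CLI:
#   python -m tf2onnx.convert --saved-model my_saved_model --output my_model.onnx
```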

Since Keras is a high-level framework, we could also choose to export directly to ONNX, but this would take significant effort to implement all the op mappings, and I think leveraging the TF-to-ONNX path is probably the lowest-cost approach at this moment.

The Keras side should probably provide a guide/example for how to do this.

I believe this feature will be important since it opens up the way for users to leverage more downstream systems like NVIDIA TensorRT for inference, etc.

@fchollet for more inputs.

qlzh727 avatar Jan 07 '24 17:01 qlzh727

Discussed this in the team meeting.

In general, we will leverage each backend's own way to export to ONNX, e.g. TF has its own way to convert a TF SavedModel to ONNX, and the same goes for JAX and PyTorch. Keras will not support direct export to ONNX at this stage (e.g. implementing the ONNX op set). We will provide documentation, and potentially APIs, to help users convert to ONNX.

qlzh727 avatar Jan 11 '24 18:01 qlzh727

Adding Neel since this is related to saving/exporting.

qlzh727 avatar Jan 11 '24 18:01 qlzh727