[Good First Issue][TF FE]: Support MatrixDiagV3 operation for TensorFlow
Context
The OpenVINO component responsible for the support of TensorFlow models is called the TensorFlow Frontend (TF FE). TF FE converts a model represented in the TensorFlow opset to a model in the OpenVINO opset.
In order to infer TensorFlow models containing the MatrixDiagV3 operation with OpenVINO, TF FE needs to be extended with support for this operation.
What needs to be done?
For MatrixDiagV3 operation support, you need to implement the corresponding loader in the TF FE op directory and register it in the dictionary of loaders. One loader is responsible for the conversion (or decomposition) of one type of TensorFlow operation.
Here is an example of a loader implementation for the TensorFlow Einsum operation:
```cpp
OutputVector translate_einsum_op(const NodeContext& node) {
    auto op_type = node.get_op_type();
    TENSORFLOW_OP_VALIDATION(node, op_type == "Einsum", "Internal error: incorrect usage of translate_einsum_op.");
    auto equation = node.get_attribute<std::string>("equation");
    OutputVector inputs;
    for (size_t input_ind = 0; input_ind < node.get_input_size(); ++input_ind) {
        inputs.push_back(node.get_input(input_ind));
    }
    auto einsum = make_shared<Einsum>(inputs, equation);
    set_node_name(node.get_name(), einsum);
    return {einsum};
}
```
In this example, translate_einsum_op converts TF Einsum into OV Einsum. The NodeContext object passed into the loader packs all information about the inputs and attributes of the Einsum operation. The loader retrieves the equation attribute using the NodeContext::get_attribute() method, prepares the input vector, creates an Einsum operation from the OV opset, and returns a vector of its outputs.
The responsibility of a loader is to parse the operation attributes, prepare the inputs, and express the TF operation via a sub-graph of OV operations. The Einsum example demonstrates a resulting sub-graph with a single operation. In PR https://github.com/openvinotoolkit/openvino/pull/19007 you can see an operation decomposed into a multi-node sub-graph.
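As a starting point, a MatrixDiagV3 loader can follow the same structure. The sketch below is only illustrative: the translator name translate_matrix_diag_v3_op, the way the k, num_rows, num_cols, and padding_value inputs and the align attribute are retrieved, and the registration line are assumptions to adapt; the actual decomposition into OV operations still has to be designed, and includes/namespaces are omitted as in the Einsum example above.

```cpp
// Hypothetical skeleton for a MatrixDiagV3 loader (not a complete decomposition).
OutputVector translate_matrix_diag_v3_op(const NodeContext& node) {
    auto op_type = node.get_op_type();
    TENSORFLOW_OP_VALIDATION(node,
                             op_type == "MatrixDiagV3",
                             "Internal error: incorrect usage of translate_matrix_diag_v3_op.");

    // MatrixDiagV3 inputs: diagonal, k, num_rows, num_cols, padding_value.
    auto diagonal = node.get_input(0);
    auto k = node.get_input(1);
    auto num_rows = node.get_input(2);
    auto num_cols = node.get_input(3);
    auto padding_value = node.get_input(4);

    // Optional attribute controlling how super- and sub-diagonals are aligned.
    auto align = node.get_attribute<std::string>("align", "RIGHT_LEFT");

    // TODO: express MatrixDiagV3 semantics via a sub-graph of OV opset operations,
    // then name the last node with set_node_name(node.get_name(), last_node) and
    // return its outputs. The check below is only a placeholder that keeps the
    // skeleton well-formed until the decomposition is implemented.
    TENSORFLOW_OP_VALIDATION(node, false, "MatrixDiagV3 translation is not implemented yet.");
    return {};
}

// The new loader also has to be registered in the dictionary of loaders
// (see the existing entries in the TF FE op table), along the lines of:
//   {"MatrixDiagV3", CreatorFunction(translate_matrix_diag_v3_op)},
```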
Once you are done with the implementation of the translator, you need to implement the corresponding layer test test_tf_MatrixDiagV3.py and put it into the tests/layer_tests/tensorflow_tests directory. Example of how to run a layer test:
```bash
export TEST_DEVICE=CPU
cd openvino/tests/layer_tests/tensorflow_tests
pytest test_tf_Shape.py
```
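For orientation, a new layer test could be structured roughly as follows. This is only a sketch modeled on the existing tests in tests/layer_tests/tensorflow_tests: the class and method names, the parameter values, the input tensor naming, and the pytest fixtures (ie_device, precision, ir_version, temp_dir, use_legacy_frontend) are assumptions that should be aligned with the current test helpers and with the MatrixBandPart test referenced in the hint below.

```python
# Hypothetical test_tf_MatrixDiagV3.py sketch; align names and fixtures with existing layer tests.
import numpy as np
import pytest
import tensorflow as tf
from common.tf_layer_test_class import CommonTFLayerTest


class TestMatrixDiagV3(CommonTFLayerTest):
    def _prepare_input(self, inputs_info):
        # Generate random data for the 'diagonal' input of the graph.
        diagonal_shape = inputs_info['diagonal:0']
        return {'diagonal:0': np.random.rand(*diagonal_shape).astype(np.float32)}

    def create_matrix_diag_v3_net(self, diagonal_shape, k, num_rows, num_cols, padding_value):
        # Build a TF graph with a single MatrixDiagV3 node.
        tf.compat.v1.reset_default_graph()
        with tf.compat.v1.Session() as sess:
            diagonal = tf.compat.v1.placeholder(tf.float32, diagonal_shape, 'diagonal')
            tf.raw_ops.MatrixDiagV3(diagonal=diagonal, k=k, num_rows=num_rows,
                                    num_cols=num_cols, padding_value=padding_value)
            tf.compat.v1.global_variables_initializer()
            tf_net = sess.graph_def
        return tf_net, None

    @pytest.mark.parametrize('params', [
        dict(diagonal_shape=[5], k=0, num_rows=-1, num_cols=-1, padding_value=0.0),
        dict(diagonal_shape=[3, 4], k=1, num_rows=-1, num_cols=-1, padding_value=0.0),
    ])
    @pytest.mark.nightly
    def test_matrix_diag_v3(self, params, ie_device, precision, ir_version, temp_dir,
                            use_legacy_frontend):
        self._test(*self.create_matrix_diag_v3_net(**params), ie_device, precision, ir_version,
                   temp_dir=temp_dir, use_legacy_frontend=use_legacy_frontend)
```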
Hint
Check how MatrixBandPart operation support was implemented here: https://github.com/openvinotoolkit/openvino/pull/23082
Example Pull Requests
- https://github.com/openvinotoolkit/openvino/pull/19007
Resources
- What is OpenVINO?
- How to Build OpenVINO
- Developer documentation for TensorFlow Frontend
- Contribution guide - start here!
- Intel DevHub Discord channel - engage in discussions, ask questions and talk to OpenVINO developers
Contact points
- @openvinotoolkit/openvino-tf-frontend-maintainers
- @rkazants in GitHub
- rkazants in Discord
Ticket
No response
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hi @RaffaelloFornasiere, any update on this task?
Hi @RaffaelloFornasiere, any update on this task?
Hi @rkazants, thank you for reaching out. I appreciate your patience regarding the task.
I'm working on it, although I must admit I've encountered some challenges. I'm currently in the process of understanding how to complete it by studying other examples, such as the matrix_band_part one you suggested. However, I'm still facing some difficulties, since this is quite a new thing for me.
I'm also managing this task alongside work commitments and my master's thesis. While I may not have full-time availability, I'm dedicated to completing it to the best of my ability. I'll keep you updated on my progress.
Hello @RaffaelloFornasiere, thanks for the update! Can we help you with any of these challenges? We're here to answer questions.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hello @anzr299, are you still working on that issue? Do you need any help?
Hi, I would like to unassign myself. I am focusing on a smaller subset of problems currently. Sorry for the trouble.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.