
[compiler] Revisit tflite interpreter

Open mhs4670go opened this issue 2 years ago • 4 comments

This patch revisits the tflite interpreter.

  1. Refactor the way the tflite binary is exported.
  2. Support the operators included in the ResNet network.

Self evaluation:

  1. Build test: [X]Passed [ ]Failed [ ]Skipped
  2. Run test: [X]Passed [ ]Failed [ ]Skipped

Related: #1879
Signed-off-by: seongwoo [email protected]

mhs4670go avatar May 19 '22 06:05 mhs4670go

:memo: TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #1912. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments from reviewers quickly. Your PR must pass all verification processes of cibot before reviewers start the review process. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.

taos-ci avatar May 19 '22 06:05 taos-ci

> The flatten layer erases dimension information. Before the flatten, we must transpose back to NCHW, because the weights in the layers after the flatten layer take the NCHW format.

Well, the flatten layer inherits from the reshape layer and changes the dimension in the reshape layer to x, 1, 1, width (in_dim.getFeatureLen()). So the dimension is changed to x, 1, width, 1 in the Exporter. Regarding the NCHW issue in the following layer: all the weights in the layers will be changed to channel-last order, but for the input and output tensors we only change their dimensions (NCHW --> NHWC), because only the dimensions of the input/output tensors and the weight data matter for tflite inference. So we don't need to care about the transpose anymore if the input is transposed at the input layer, and there will be no bottleneck during the inference.
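
For illustration, here is a minimal numpy sketch (not nntrainer code; the OIHW/OHWI layout names are assumptions based on the usual channel-first convention and tflite's CONV_2D filter layout) of the two changes described above: conv weights are physically reordered to channel-last, while input/output tensors only have their dimension metadata swapped.

```python
import numpy as np

# Conv weight: physically reorder channel-first (O, I, H, W) data into the
# channel-last (O, H, W, I) layout expected on the tflite side.
w_oihw = np.arange(8 * 3 * 3 * 3, dtype=np.float32).reshape(8, 3, 3, 3)
w_ohwi = np.ascontiguousarray(w_oihw.transpose(0, 2, 3, 1))

# Input/output tensors: only the dimension metadata changes (NCHW -> NHWC);
# the data itself is transposed once, at the input layer.
n, c, h, w = 1, 3, 32, 32
nhwc_dim = (n, h, w, c)

print(w_ohwi.shape)  # (8, 3, 3, 3): same data, channel-last order
print(nhwc_dim)      # (1, 32, 32, 3)
```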

jijoongmoon avatar May 24 '22 01:05 jijoongmoon

> all the weights in the layers will be changed to channel-last order.

As far as I know, this statement does not hold for fc layers. I ran into the exact same situation when I was building golden tests.

1. conv(channel_first) -> flatten -> fc

2. transpose -> conv(channel_last) -> flatten -> fc

will not produce the same result, even if the conv weight is reordered accordingly, when the fc weights are identical,

but

3. transpose -> conv(channel_last) -> transpose -> flatten -> fc is identical to 1. when the fc weights do not change.

All of them will run because the dimensions match, but the values in 1 and 2 are different.
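
For reference, the claim above can be reproduced with a small numpy sketch (the naive conv helpers, "valid" padding, and toy shapes are illustrative assumptions, not the golden-test code):

```python
import numpy as np

def conv_cf(x, w):
    """Naive channel-first conv; x: (C, H, W), w: (O, I, kH, kW), valid padding."""
    o_ch, _, kh, kw = w.shape
    _, h, w_dim = x.shape
    out = np.zeros((o_ch, h - kh + 1, w_dim - kw + 1))
    for o in range(o_ch):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[o, i, j] = np.sum(x[:, i:i + kh, j:j + kw] * w[o])
    return out

def conv_cl(x, w):
    """Channel-last conv expressed via the channel-first kernel."""
    return conv_cf(x.transpose(2, 0, 1), w).transpose(1, 2, 0)

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8, 8))        # channel-first input (C, H, W)
w = rng.standard_normal((4, 3, 3, 3))     # conv weight (O, I, kH, kW)
fc = rng.standard_normal((4 * 6 * 6, 5))  # fc weight shared by all pipelines

# 1. conv(channel_first) -> flatten -> fc
y1 = conv_cf(x, w).reshape(-1) @ fc

# 2. transpose -> conv(channel_last) -> flatten -> fc
y2 = conv_cl(x.transpose(1, 2, 0), w).reshape(-1) @ fc

# 3. transpose -> conv(channel_last) -> transpose -> flatten -> fc
y3 = conv_cl(x.transpose(1, 2, 0), w).transpose(2, 0, 1).reshape(-1) @ fc

print(np.allclose(y1, y2))  # False: flatten order differs (HWC vs CHW)
print(np.allclose(y1, y3))  # True: transpose before flatten restores the order
```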

zhoonit avatar May 24 '22 13:05 zhoonit

@zhoonit Yes. On second thought, you are right. A transpose is required before the flatten, and then the fc can be done without transposing its weights. We might need a kind of realizer to check and modify the GraphRepresentation.
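
As a rough sketch of that idea, assuming a hypothetical graph representation (GraphNode and the op-name strings below are illustrative, not the actual nntrainer GraphRepresentation API), such a realizer could walk the exported graph and insert an NHWC -> NCHW transpose in front of every flatten node:

```python
from dataclasses import dataclass, field

@dataclass
class GraphNode:
    op: str
    attrs: dict = field(default_factory=dict)

def insert_transpose_before_flatten(graph):
    """Return a new node list with a transpose inserted before each flatten."""
    realized = []
    for node in graph:
        if node.op == "flatten":
            # Restore channel-first order so the following fc sees the
            # flatten order its weights were trained against.
            realized.append(GraphNode("transpose", {"perm": (0, 3, 1, 2)}))
        realized.append(node)
    return realized

graph = [GraphNode("conv2d"), GraphNode("flatten"), GraphNode("fully_connected")]
print([n.op for n in insert_transpose_before_flatten(graph)])
# ['conv2d', 'transpose', 'flatten', 'fully_connected']
```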

jijoongmoon avatar May 31 '22 13:05 jijoongmoon

This pull request was merged through #1994.

DonghakPark avatar Nov 08 '22 01:11 DonghakPark