swift-models
Transformer model cannot handle a batch_size larger than 1 during inference.
When trying to do inference with more than one seed sentence, or simply duplicating the input tokens like this:
// Convert the Python token list to [Int32].
var tokarr: [Int32] = Array<Int>(pytok)!.map { Int32($0) }
// Duplicate the sequence to form a batch of 2.
tokarr = Array<[Int32]>(repeating: tokarr, count: 2).flatMap { $0 }
// Reshape the flat scalars into a [2, seqLen] tensor.
tokens = Tensor<Int32>(shape: [2, Int32(tokarr.count/2)], scalars: tokarr)
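For reference, a possibly more direct way to build the same [2, seqLen] batch (a sketch assuming the toolchain provides Tensor(stacking:), which may not be available in the 0.2 release) is:

// Sketch: stack two copies of the single-sequence tensor along a new
// leading batch dimension. `tokarr` here is the original, un-duplicated array.
let single = Tensor<Int32>(tokarr)               // shape: [seqLen]
let batched = Tensor(stacking: [single, single]) // shape: [2, seqLen]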
Running inference with this batched input fails at line 113 of Model.swift:
key: state.key.concatenated(with: input.key, alongAxis: 1)
This is the error message:
Thread 1: Fatal error: ConcatOp : Dimensions of inputs should match: shape[0] = [12,0,64] vs. shape[1] = [24,2,64]
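The shapes suggest the per-layer attention cache was allocated for a batch of 1: [12, 0, 64] looks like headCount (12) x 0 time steps x headDim (64), while the incoming keys for a batch of 2 collapse to 24 along the first axis. A minimal sketch of a batch-aware empty cache (the helper name and parameters are hypothetical, not the repo's actual API):

import TensorFlow

// Hypothetical helper: allocate an empty key/value cache whose leading
// dimension is batchSize * headCount, so that concatenation along axis 1
// (the time axis) sees compatible shapes for any batch size.
func emptyAttentionCache(batchSize: Int, headCount: Int, headDim: Int)
    -> (key: Tensor<Float>, value: Tensor<Float>) {
    let shape = TensorShape([batchSize * headCount, 0, headDim])
    return (key: Tensor<Float>(zeros: shape), value: Tensor<Float>(zeros: shape))
}

// With batchSize = 2, headCount = 12, headDim = 64 this yields a [24, 0, 64]
// cache, which concatenates cleanly with the [24, 2, 64] input keys.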
The Python version supports a batch_size larger than 1.
Branch: stable; Swift for TensorFlow 0.2 release (2019-03-02).
Xcode Version 10.1 (10B61)
macOS High Sierra (Version 10.13.6)