Konquer your Mind
Still waiting for it.
I checked the tf.keras documentation and found no example of how to use the Wrapper layer.
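In case it helps others, here is a minimal sketch of subclassing `tf.keras.layers.Wrapper`. The `ScaleWrapper` class and its `scale` argument are purely illustrative, not part of tf.keras:

```python
import tensorflow as tf

class ScaleWrapper(tf.keras.layers.Wrapper):
    """Illustrative wrapper: multiplies the wrapped layer's output by a constant."""
    def __init__(self, layer, scale=2.0, **kwargs):
        super().__init__(layer, **kwargs)
        self.scale = scale

    def call(self, inputs):
        # The wrapped layer is available as self.layer and is built lazily here.
        return self.layer(inputs) * self.scale

# Deterministic demo: Dense with an all-ones kernel and no bias,
# so each output unit is sum(inputs) * scale.
inner = tf.keras.layers.Dense(3, kernel_initializer="ones", use_bias=False)
out = ScaleWrapper(inner, scale=2.0)(tf.ones((1, 4)))  # each unit: 4.0 * 2.0
```

Built-in subclasses such as `tf.keras.layers.TimeDistributed` and `tf.keras.layers.Bidirectional` follow the same pattern.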
Thanks for sharing. I'll try it.
When training with the LMDB I built myself, the accuracy always plateaus at 0.48 and won't go any higher. I suspect there is a problem with the generated data. I'd like to discuss this issue; my email is [email protected]. I hope we can get in touch.
One difference between my implementation and the description in the original paper is how the hidden layers are normalized: I used layer normalization so I could train with small batches that fit in...
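To illustrate why layer normalization helps here: it computes statistics over the feature axis of each individual sample, so it behaves identically for batch size 1 and large batches, whereas batch normalization relies on batch statistics. A quick sketch:

```python
import tensorflow as tf

# Layer normalization normalizes each sample over its feature axis,
# so a "batch" of a single sample is handled without any issue.
ln = tf.keras.layers.LayerNormalization(axis=-1)
x = tf.random.normal((1, 8))      # batch size 1
y = ln(x)
mean = float(tf.reduce_mean(y))   # per-sample mean is driven toward 0
```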
The dataset is not mine; I only implemented the algorithm. Please ask the original author about correcting the dataset.
You can clear the contents of the plugin list at the bottom of /data3/huyan/liheng/tmp/AdaSeq/AdaSeq/experiments/eBay/231125005305.028950/output_best/configuration.json, making it an empty list. But there may still be other problems when running.
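If you'd rather do this programmatically, here is a sketch. The top-level key name "plugin" is assumed from the comment above; check the actual structure of your configuration.json first:

```python
import json
import os
import tempfile

def clear_plugins(cfg_path):
    """Empty the top-level 'plugin' list in a JSON config (key name assumed)."""
    with open(cfg_path) as f:
        cfg = json.load(f)
    cfg["plugin"] = []                 # replace the plugin list with an empty list
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=2)

# Demo on a throwaway file mimicking the structure.
path = os.path.join(tempfile.mkdtemp(), "configuration.json")
with open(path, "w") as f:
    json.dump({"model": "ner-model", "plugin": ["some-plugin"]}, f)
clear_plugins(path)
with open(path) as f:
    result = json.load(f)["plugin"]
```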
I don't have the dataset anymore; the project was done at my former company, and the dataset is still there.
You should maintain a buffer of the time sequence and feed it into the model; the buffer serves as a FIFO of the online frames.
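A minimal sketch of such a FIFO buffer using `collections.deque`; the window length and frame shape are hypothetical, pick whatever your model expects:

```python
from collections import deque

import numpy as np

WINDOW = 16                     # hypothetical sequence length the model expects
buf = deque(maxlen=WINDOW)      # FIFO: the oldest frame is dropped automatically

def push_frame(frame):
    """Append a new online frame; return a model-ready batch once the buffer is full."""
    buf.append(frame)
    if len(buf) < WINDOW:
        return None             # not enough temporal context yet
    return np.stack(buf)[None, ...]   # shape (1, WINDOW, *frame.shape)

# Simulate 20 incoming frames; the batch always holds the latest 16.
batch = None
for t in range(20):
    batch = push_frame(np.full((3,), t, dtype=np.float32))
```

Each call after warm-up yields the most recent `WINDOW` frames, which you would pass to the model, e.g. `model.predict(batch)`.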
You can feed a data sequence of any length as long as you set the 'sequence_lengths' entry of the input dictionary correctly.
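For example, variable-length sequences can be zero-padded into one batch, with 'sequence_lengths' recording the true lengths. The exact key names in `feed` are assumptions about the model's input dictionary:

```python
import numpy as np

# Two sequences of different lengths (5 and 3 steps, 2 features each).
seqs = [np.ones((5, 2), np.float32), np.ones((3, 2), np.float32)]
lengths = np.array([len(s) for s in seqs], dtype=np.int32)

# Zero-pad everything to the longest sequence in the batch.
batch = np.zeros((len(seqs), lengths.max(), 2), dtype=np.float32)
for i, s in enumerate(seqs):
    batch[i, : len(s)] = s

# Key names here are illustrative; match them to the model's input dict.
feed = {"inputs": batch, "sequence_lengths": lengths}
```

The model can then mask out the padded steps using the lengths, so padding does not affect the result.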