IdEmbedding does not support PyTorchEngine
IdEmbedding userEmbedding = new IdEmbedding.Builder()
        .setDictionarySize(userCount)
        .setEmbeddingSize(64)
        .build();
userEmbedding.initialize(manager, DataType.FLOAT32, usersND.getShape());
userEmbedding.forward(ps, new NDList(manager.create(new int[] {1, 2, 3, 4})), true).singletonOrThrow();
Exception in thread "main" java.lang.UnsupportedOperationException: Not supported!
        at ai.djl.ndarray.BaseNDManager.invoke(BaseNDManager.java:285)
        at ai.djl.nn.transformer.MissingOps.gatherNd(MissingOps.java:31)
        at ai.djl.nn.transformer.IdEmbedding.forwardInternal(IdEmbedding.java:72)
        at ai.djl.nn.AbstractBlock.forward(AbstractBlock.java:121)
        at ai.djl.nn.Block.forward(Block.java:122)
@dxjjhm Thanks for reporting this issue. Currently DJL has only implemented gather for the MXNet engine. We will prioritize adding it for PyTorch.
Do you have a method to get a batch of embeddings from a batch of input ids in the meantime?
Using the NDArrayEx interface, I have already found a solution, as follows.
NDArray inputUserEmbedding = userInput.getNDArrayInternal()
        .embedding(userInput, userEmbeddingTable, SparseFormat.DENSE)
        .singletonOrThrow();
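For context, here is a minimal, self-contained sketch built around that workaround. It mirrors the NDArrayEx#embedding call shown above; the dictionary size, variable names, and the randomly initialized embedding table are illustrative assumptions, and the exact embedding signature may differ between DJL versions.

import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;
import ai.djl.ndarray.types.SparseFormat;

public class EmbeddingLookupSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            int userCount = 1000;   // hypothetical dictionary size
            int embeddingSize = 64;

            // Embedding table: one 64-dimensional row per user id (randomly initialized here).
            NDArray userEmbeddingTable =
                    manager.randomNormal(new Shape(userCount, embeddingSize));

            // A batch of user ids to look up.
            NDArray userInput = manager.create(new int[] {1, 2, 3, 4});

            // Row lookup via the engine-internal embedding op instead of gather.
            NDArray inputUserEmbedding = userInput.getNDArrayInternal()
                    .embedding(userInput, userEmbeddingTable, SparseFormat.DENSE)
                    .singletonOrThrow();

            System.out.println(inputUserEmbedding.getShape()); // expected (4, 64)
        }
    }
}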
gather has been implemented in PyTorch.