
Official PyTorch implementation of LiLT: A Simple yet Effective Language-Independent Layout Transformer for Structured Document Understanding (ACL 2022)

30 LiLT issues

@jpWang, first of all, congratulations to all the authors on this great paper, a milestone work; it truly justifies the title **SIMPLE yet EFFECTIVE**. Question 1: From the paper...

FUNSD, lilt-roberta-en-base: `return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)` raises `RuntimeError: CUDA error: device-side assert triggered` at step 143/2000 (7%)...

Hello, I am a graduate student at Donghua University. It is an honor to read such an excellent paper and try to reproduce it. While running your example code, I ran into some problems: ![image](https://user-images.githubusercontent.com/107447700/210551412-34bb5fb8-55c0-4f21-80e0-dd5d4149ed25.png) I hope you can take a look when you have time. Thank you very much!

Thank you very much for open-sourcing this work; interestingly, both my undergraduate and master's advisors are from SCUT (South China University of Technology). Back to the topic: I cannot download the preprocessed data from OneDrive. Could you provide another way to get it, such as Google Drive, or send it to my email [email protected]? Thanks!

Hi, LiLT processes a maximum of 512 tokens. Is there a good option to get a comparable, commercially usable model that can process more tokens? It is of course...
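A common workaround for the 512-token cap (used by many issue reporters, not an official LiLT feature) is to split long documents into overlapping windows and run the model on each chunk. A minimal sketch in plain Python, assuming RoBERTa-style special-token ids (`cls_id=0`, `sep_id=2` are illustrative defaults):

```python
def chunk_token_ids(token_ids, max_len=512, stride=128, cls_id=0, sep_id=2):
    """Split a long token-id sequence into overlapping windows that each
    fit a 512-token encoder. `stride` tokens are shared between adjacent
    windows so entities straddling a boundary appear whole in one window."""
    body = max_len - 2          # reserve two slots for the special tokens
    step = body - stride        # advance by window size minus the overlap
    chunks = []
    for start in range(0, max(len(token_ids) - stride, 1), step):
        window = token_ids[start:start + body]
        chunks.append([cls_id] + window + [sep_id])
    return chunks
```

For token classification, predictions from the overlapping regions are typically merged by keeping the prediction from the window where the token is farthest from an edge; the bounding-box inputs would be chunked with the same indices.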

Hi @jpWang, thanks for your repo. I have used it for my project: extracting keys and values from documents with complicated layouts. 1. The NER model looks good 2....

Hello, could you let me know, when I have a custom dataset, how to organize it into the expected format? And do you recommend any tutorial for this step? Thank...

Hi :) I'm confused about the pretraining process when I change the language model. I would like to use LiLT with a Korean RoBERTa model that is already pretrained on a Korean dataset. According...

Hi, I wanted to run inference with `LiLT` with model parameters in `half` (`float16`) dtype on CPU (I tried it on GPU and it worked). As I'm using Transformers from Hugging...
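For context on this issue: depending on the PyTorch version, some CPU kernels are not implemented for `float16`, which is why `.half()` inference that works on GPU can fail on CPU. A minimal sketch (a toy layer, not LiLT itself) showing the usual fallback, `bfloat16`, which has broader CPU kernel coverage:

```python
import torch

# Toy stand-in for a model layer; LiLT itself is not required to
# demonstrate the dtype behaviour.
layer = torch.nn.Linear(4, 4)
x = torch.randn(1, 4)

# float16 matmul on CPU may raise "not implemented for 'Half'" on some
# PyTorch versions; bfloat16 is the safer low-precision dtype on CPU.
with torch.inference_mode():
    out = layer.to(torch.bfloat16)(x.to(torch.bfloat16))
```

With `transformers`, the same idea is passing `torch_dtype=torch.bfloat16` when loading the model for CPU inference instead of `torch.float16`.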

Hi, I'm using Hugging Face libraries to run `LiLT`. How can I decrease inference time? Which code should I use? I've already tried `BetterTransformer` (`Optimum`) and `ONNX`...
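Beyond `BetterTransformer` and `ONNX`, one generic CPU speedup that applies to any transformer encoder is dynamic int8 quantization of the `Linear` layers. A hedged sketch on a toy model (the model here is illustrative, not the LiLT checkpoint):

```python
import torch

# Toy encoder-like model standing in for a transformer; in practice you
# would pass the loaded Hugging Face model instead.
model = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.ReLU(),
                            torch.nn.Linear(8, 8))

# Replace Linear layers with dynamically quantized int8 versions;
# weights are stored in int8 and activations are quantized on the fly.
qmodel = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

with torch.inference_mode():
    y = qmodel(torch.randn(1, 8))
```

Accuracy should be re-checked after quantization; for token classification the drop is often small, but it is model- and dataset-dependent.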