LongLM
Memory usage
Dear Author, hello. I have a question about memory usage. I ran the original PyTorch implementation on 8×V100 GPUs, but when the inference length reaches 8k tokens, I run out of memory. Could you share the memory usage of SelfExtend in this setting?
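For reference, here is a minimal sketch (not from this repo) of how I am measuring peak GPU memory during long-context generation with PyTorch/Transformers. The checkpoint name, prompt length, and generation settings are placeholders for my own setup, not the repo's code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

# Build an ~8k-token input to reproduce the reported setting.
prompt_ids = torch.randint(
    low=0, high=tokenizer.vocab_size, size=(1, 8192), device=model.device
)

# Reset peak-memory counters on every visible GPU before generation.
for i in range(torch.cuda.device_count()):
    torch.cuda.reset_peak_memory_stats(i)

with torch.no_grad():
    model.generate(prompt_ids, max_new_tokens=64)

# Report peak allocated memory per GPU, in GiB.
for i in range(torch.cuda.device_count()):
    peak = torch.cuda.max_memory_allocated(i) / 1024**3
    print(f"GPU {i}: peak {peak:.2f} GiB")
```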