Huarong

Results: 8 comments by Huarong

This issue is serious. Why hasn't the author fixed it? @liuwons

> There are 3 consecutive commits and they addressed the issue with tensor parallel in inference. The intermediate size was changed from 13696 to 14336 such that the MLP layers...

> If I understand correctly, the intermediate size in your models is still 13696: no need to change it if tensor parallel is not needed.
>
> `!` is token...
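For context, a minimal sketch of the constraint tensor parallelism puts on the MLP intermediate size, as I understand the thread; the helper name and the exact divisibility rule are my own illustration, not the repository's code:

```
def shard_intermediate_size(intermediate_size, tp_size):
    # Each tensor-parallel rank holds an equal slice of the MLP weight along
    # the intermediate dimension, so the size must divide evenly (illustrative).
    if intermediate_size % tp_size != 0:
        raise ValueError(
            f"intermediate_size={intermediate_size} is not divisible by tp_size={tp_size}"
        )
    return intermediate_size // tp_size
```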

Could not parse name from QianfanChatEndpoint either.
serialized: {'lc': 1, 'type': 'not_implemented', 'id': ['langchain_community', 'chat_models', 'baidu_qianfan_endpoint', 'QianfanChatEndpoint'], 'repr': "QianfanChatEndpoint(client=, qianfan_ak=SecretStr('**********'), qianfan_sk=SecretStr('**********'), temperature=0.1, model='ERNIE-Bot-4')"}
kwargs: {'invocation_params': {'endpoint': None, 'model': 'ERNIE-Bot-4', '_type':...
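For reference, a minimal sketch of the kind of setup that produces the serialized repr above, assuming the langchain_community Qianfan integration; the credential values are placeholders:

```
from langchain_community.chat_models import QianfanChatEndpoint

# Placeholders for real Qianfan credentials; normally read from the environment.
chat = QianfanChatEndpoint(
    model="ERNIE-Bot-4",
    temperature=0.1,
    qianfan_ak="<your-access-key>",
    qianfan_sk="<your-secret-key>",
)
```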

@guanxingke you can use what @MarkWuNLP has mentioned above. Instead, I reconstructed the code into a class with two methods.

```
def save(self, path):
    with open(path, 'wb') as f:
        pickle.dump(self, f)
```
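The comment is cut off above; a minimal sketch of the full class, assuming the second method is a matching `load` via pickle (the class name is a placeholder):

```
import pickle


class Checkpointable:  # placeholder name for the reconstructed class
    def save(self, path):
        # Serialize the whole object to disk with pickle.
        with open(path, 'wb') as f:
            pickle.dump(self, f)

    @staticmethod
    def load(path):
        # Restore a previously saved object from disk.
        with open(path, 'rb') as f:
            return pickle.load(f)
```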

I need this feature too. I have about 500 GB of training data in a directory, consisting of about 500 files. It would be helpful if multiple files could be scanned at...
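In the meantime, a minimal workaround sketch for streaming examples from many files, assuming plain text files in a single directory; the directory path and file pattern are placeholders:

```
import glob
import os


def iter_training_lines(data_dir, pattern="*.txt"):
    # Yield lines from every matching file, one file at a time, so the
    # ~500 files never need to be concatenated or held in memory at once.
    for path in sorted(glob.glob(os.path.join(data_dir, pattern))):
        with open(path, encoding="utf-8") as f:
            for line in f:
                yield line.rstrip("\n")


# Usage: stream every training example without merging the files first.
for line in iter_training_lines("/path/to/training_data"):
    pass  # feed each line into the training pipeline
```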