
StarWhisper: LLM for Astronomy

14 StarWhisper issues

Hi YuYang, congratulations on your great work! It would be really nice if you could upload the model to the Hugging Face Hub. This would help model discovery and integration with...

Your model notes that NGC7714 is a spiral galaxy. That's cool, but it is actually also a merging galaxy. Your generated images of the galaxy look artistic rather than scientific....

Hello, is there an error in the provided example code? On my side, after loading finishes, the model has no `chat` attribute.
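One likely cause of the missing `chat` attribute (an assumption, since the issue gives no traceback): ChatGLM-style checkpoints define `chat` in custom modeling code, which `AutoModel` only uses when loaded with `trust_remote_code=True`; without it, a generic transformers class is returned. A minimal defensive check, with a hypothetical helper name, might look like:

```python
def ensure_chat(model):
    """Return the model's `chat` method, or raise with a hint if it is absent.

    Hypothetical helper; the suggested fix (trust_remote_code=True) is a
    common cause for ChatGLM-style models, not confirmed for StarWhisper.
    """
    if not hasattr(model, "chat"):
        raise AttributeError(
            "model has no `chat` method; try AutoModel.from_pretrained("
            "path, trust_remote_code=True) so the custom model class is used"
        )
    return model.chat

# Illustrative usage (path/to/model is a placeholder):
#   model = AutoModel.from_pretrained("path/to/model", trust_remote_code=True)
#   reply, history = ensure_chat(model)(tokenizer, "What is NGC 7714?", history=[])
```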

Hello, thank you very much for your work. I have recently been trying to feed domain textbooks and literature to the model, and I would like to ask a few questions about incremental pretraining. Many thanks!

① How do you process these documents into a training-data format? Also, textbooks and papers may contain other modalities such as tables and images; how are those handled and fed to the LLM (or a VLM?)? ![image](https://github.com/Yu-Yang-Li/StarWhisper/assets/17664669/c39fe75e-1dfa-4236-8a59-71c350ca0af0)

② After OCR has finished recognizing the textbooks, how are the "knowledge base" and the "trainable data" constructed? How do the two differ in how they are processed? ![image](https://github.com/Yu-Yang-Li/StarWhisper/assets/17664669/acb829b4-ff65-4c0d-904f-238108e8f93d)

③ Finally, among pretraining, fine-tuning, and a local knowledge base, which step was most important in your work? In other words, how well does an open-source LLM (such as ChatGLM) perform if it is used for RAG directly, without any training to broaden its knowledge? Or is fine-tuning or pretraining required to reach the expected results?
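For question ① above, one common approach (a sketch under assumptions, not the StarWhisper authors' actual pipeline) is to split OCR'd text into chunks and write them as JSON Lines, one `{"text": ...}` record per line, a format many pretraining frameworks accept:

```python
import json

def chunks_to_jsonl(chunks, path):
    """Write text chunks as JSON Lines, one {"text": ...} object per line.

    The {"text": ...} schema is an assumption; check the schema your
    training framework expects before using this.
    """
    with open(path, "w", encoding="utf-8") as f:
        for chunk in chunks:
            # ensure_ascii=False keeps Chinese text readable in the file
            f.write(json.dumps({"text": chunk}, ensure_ascii=False) + "\n")

# Illustrative usage with two short chunks:
chunks_to_jsonl(["NGC 7714 is a merging galaxy.", "第二段示例文本"], "corpus.jsonl")
```

Tables and figures would need separate handling (e.g. linearizing tables to text, or pairing images with captions for a VLM), which this sketch does not cover.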