Abdurrahman Beyaz

7 comments by Abdurrahman Beyaz

I think the model shared is a GPT2LMHeadModel; however, the tokenizer is an OpenAIGPTTokenizer lol!
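For what it's worth, a minimal sketch of pairing the checkpoint with the matching tokenizer class in `transformers` ("gpt2" is a placeholder, since the shared checkpoint isn't named here):

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# A GPT2LMHeadModel checkpoint should be paired with a GPT-2 tokenizer,
# not an OpenAIGPTTokenizer; "gpt2" stands in for the actual checkpoint.
model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

inputs = tokenizer("Hello world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0]))
```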

I'm getting the same error while calling `ChatVectorDBChain`; the error I'm getting is for `chain_type` = `refine`, `map_reduce`, or `map_rerank`. The code:

```python
from langchain.vectorstores.weaviate import Weaviate
from langchain.llms import OpenAI
...
```
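For reference, a rough sketch of the kind of call that triggers this, assuming the pre-0.1 langchain API implied by the snippet; the Weaviate URL, index name, and text key are placeholders, not values from the original comment:

```python
import weaviate
from langchain.vectorstores.weaviate import Weaviate
from langchain.llms import OpenAI
from langchain.chains import ChatVectorDBChain

# Placeholder Weaviate connection and schema; swap in your own instance.
client = weaviate.Client("http://localhost:8080")
vectorstore = Weaviate(client, "Document", "content")

# The default chain_type ("stuff") works; the error reportedly appears
# when passing "refine", "map_reduce", or "map_rerank" instead.
qa = ChatVectorDBChain.from_llm(OpenAI(temperature=0), vectorstore, chain_type="refine")
result = qa({"question": "What does the document say?", "chat_history": []})
```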

> I think there is a different reason for this. The original google-files seem to have a slightly different format and the parser for the binary file reads one byte...

so I did the implementation myself, and I'm sharing it with you: https://gist.github.com/alabrashJr/d71cf74bc9713bb0a5bb12ccd331a405
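For context, this is the kind of byte-at-a-time parsing the quoted comment refers to; a minimal sketch of reading a GoogleNews-style word2vec `.bin` file (my own illustration, not the gist's code):

```python
import numpy as np

def load_word2vec_bin(path, limit=None):
    """Read a GoogleNews-style word2vec binary file."""
    with open(path, "rb") as f:
        # Header line, e.g. b"3000000 300\n": vocab size and vector dimension.
        vocab_size, dim = map(int, f.readline().split())
        if limit is not None:
            vocab_size = min(vocab_size, limit)
        vectors = {}
        for _ in range(vocab_size):
            # The word is read one byte at a time, up to the separating space;
            # some files also prepend a newline to each word, which is skipped.
            word = bytearray()
            while True:
                ch = f.read(1)
                if ch == b" ":
                    break
                if ch != b"\n":
                    word.extend(ch)
            vec = np.frombuffer(f.read(4 * dim), dtype=np.float32)
            vectors[word.decode("utf-8", errors="replace")] = vec
    return vectors
```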

I attempted to run it on my Apple M1 Pro, but unfortunately, I encountered an error that I was unable to resolve despite my efforts. I followed the instructions provided...

> I attempted to run it on my Apple M1 Pro, but unfortunately, I encountered an error that I was unable to resolve despite my efforts. I followed the instructions...