
[WIP] rewrite langchain integration

Open Wingie opened this issue 1 year ago • 0 comments

iver":{"name":"PyMongo|Motor","version":"4.3.3|3.1.1"},"os":{"type":"Linux","name":"Linux","architecture":"aarch64","version":"5.15.49-linuxkit"},"platform":"CPython 3.10.6.final.0|asyncio"}}}
serge-serge-1  | INFO:     172.19.0.1:40168 - "GET /chat/7ec288fb-1100-4c77-bbfd-2cc7dd8ee939 HTTP/1.1" 200 OK
serge-serge-1  | INFO:     172.19.0.1:40162 - "GET /chat/ HTTP/1.1" 200 OK
serge-serge-1  | INFO:llama_index.token_counter.token_counter:> [query] Total LLM token usage: 0 tokens
serge-serge-1  | INFO:llama_index.token_counter.token_counter:> [query] Total embedding token usage: 0 tokens
serge-serge-1  | INFO:serge.utils.generate:None
serge-serge-1  | INFO:     172.19.0.1:52206 - "GET /chat/7ec288fb-1100-4c77-bbfd-2cc7dd8ee939/question?prompt=can+you+rewrite+it+in+perl%3F HTTP/1.1" 200 OK
serge-serge-1  | INFO:     172.19.0.1:52206 - "GET /chat/7ec288fb-1100-4c77-bbfd-2cc7dd8ee939 HTTP/1.1" 200 OK

The LangChain integration is sort of working (it works outside Docker but not inside it). I'm trying to work out something that lets me compare the prompt with the index against the prompt without it, because I also want to support https://github.com/lastmile-ai/llama-retrieval-plugin eventually.
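A rough sketch of the kind of comparison I mean, not the actual serge code: run the same prompt once as-is and once with retrieved index context prepended, then look at both answers side by side. `generate` and `retrieve_context` here are hypothetical stand-ins for the llama call (serge.utils.generate) and the llama_index query step.

```python
# Sketch only: compare an answer from the raw prompt against one from the
# prompt augmented with retrieved index context. The two callables are
# hypothetical hooks, not real serge or llama_index APIs.

from typing import Callable


def compare_with_and_without_index(
    prompt: str,
    generate: Callable[[str], str],          # hypothetical: wraps the LLM call
    retrieve_context: Callable[[str], str],  # hypothetical: queries the index
) -> dict:
    """Run the same prompt twice and return both answers for side-by-side review."""
    plain_answer = generate(prompt)

    context = retrieve_context(prompt)
    augmented_prompt = (
        "Use the following context to answer.\n"
        f"Context:\n{context}\n\n"
        f"Question: {prompt}"
    )
    indexed_answer = generate(augmented_prompt)

    return {
        "prompt": prompt,
        "without_index": plain_answer,
        "with_index": indexed_answer,
    }
```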

Wingie · Mar 30 '23 19:03