generative_ai_with_langchain
Build large language model (LLM) apps with Python, ChatGPT and other models. This is the companion repository for the book on generative AI with LangChain.
Book errata, Chapter 3, Setting up Docker: "Build the Docker image from the Dockerfile in this repository: `docker build -t langchain_ai`" should read "Build the Docker image from the...
The following models were deprecated by OpenAI on January 4th (a possible substitution is sketched below):
* text-ada-001
* text-babbage-001
* text-curie-001
* text-davinci-001
* text-davinci-002
* text-davinci-003
* davinci-instruct-beta
* curie-instruct-beta
* code-search-ada-code-001
* ...
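If a book example references one of these completion models, a minimal sketch of the usual fix is to point the same LLM wrapper at `gpt-3.5-turbo-instruct`, OpenAI's completion-style replacement; the exact import path depends on the LangChain version pinned in requirements.txt:

```
from langchain.llms import OpenAI

# text-davinci-003 and the other completion models listed above are gone;
# gpt-3.5-turbo-instruct is the completion-style model OpenAI suggests instead.
llm = OpenAI(model_name="gpt-3.5-turbo-instruct", temperature=0)
print(llm("Explain LangChain in one sentence."))
```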
In `chat_with_retrieval/app.py`, an error occurs if `use_flare = True`, and another if `use_moderation = True`. In `chat_with_documents.py`: FlareChain and ConversationalRetrievalChain have different input and output names. For FlareChain, `user_input` and `response`...
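As a rough sketch of the mismatch (assuming the FlareChain keys named above, the `question`/`answer` keys that ConversationalRetrievalChain uses, and the import paths of the pinned LangChain version), the app would have to dispatch on the chain type when calling it:

```
from langchain.chains import ConversationalRetrievalChain, FlareChain

def ask(chain, query, chat_history=None):
    """Call either chain with the input/output keys it expects."""
    if isinstance(chain, FlareChain):
        # FlareChain reads "user_input" and writes "response".
        return chain({"user_input": query})["response"]
    # ConversationalRetrievalChain reads "question" (plus "chat_history")
    # and writes "answer".
    return chain({"question": query, "chat_history": chat_history or []})["answer"]
```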
Please note that the code on GitHub and the book refer to [particular versions of LangChain](https://github.com/benman1/generative_ai_with_langchain/blob/main/requirements.txt#L3). Because LangChain is changing so rapidly, if you are using different versions the code...
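A quick way to confirm that your environment matches the pinned requirement (the exact version string is in requirements.txt) is, for example:

```
# Check the installed LangChain version against the pin in requirements.txt
import langchain
print(langchain.__version__)
```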
Hello, I am trying to use the scripts in the chat_with_retrieval folder. I created a Python environment with the requirements. Unfortunately the Streamlit app is crashing (please see the attached image)...
Hi, can you suggest a working alternative to the text-summarization example given in your book that does not use ChatOpenAI?
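One possible sketch of an alternative reuses the Chapter 3 HuggingFaceHub model in LangChain's summarize chain instead of ChatOpenAI. This assumes `HUGGINGFACEHUB_API_TOKEN` is set and the pinned LangChain version is installed; the repo_id, chain type, and sample document are only illustrative:

```
from langchain.llms import HuggingFaceHub
from langchain.chains.summarize import load_summarize_chain
from langchain.docstore.document import Document

# Hosted open-source model instead of ChatOpenAI.
llm = HuggingFaceHub(
    repo_id="google/flan-t5-xxl",
    model_kwargs={"temperature": 0.5, "max_length": 256},
)

# "stuff" puts the whole text into one prompt; use "map_reduce" for long texts.
chain = load_summarize_chain(llm, chain_type="stuff")
docs = [Document(page_content="LangChain is a framework for building LLM applications...")]
print(chain.run(docs))
```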
Docker build gives this error:
```
docker build -t langchain_ai .
...
Collecting lanarky==0.7.16 (from -r requirements.txt (line 27))
  Downloading lanarky-0.7.16-py3-none-any.whl.metadata (6.7 kB)
ERROR: Ignored the following versions that require...
```
Hi, the following code snippet from Chapter 3:
```
from langchain.llms import HuggingFaceHub

llm = HuggingFaceHub(
    model_kwargs={"temperature": 0.5, "max_length": 64},
    repo_id="google/flan-t5-xxl"
)
prompt = "In which country is Tokyo?"
completion ...
```
I am working on Windows 11. After executing the code:
```
from transformers import pipeline
import torch

generate_text = pipeline(
    model="aisquared/dlite-v1-355m",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
    framework="pt"
)
generate_text("In this chapter, we'll discuss...
```
I am using conda on Windows 10.
```
Retrieving notices: ...working... done
Collecting package metadata (repodata.json): done
Solving environment: failed

ResolvePackageNotFound:
  - readline=8.2
  - ncurses=6.4
```
Then I tried to install readline and...