
docs: optimize the syntactic expression of graph usage

Open Undertone0809 opened this issue 9 months ago • 7 comments

I have been using langgraph in some production projects. When dealing with particularly complex logic, it is easy to misspell node names when they are declared as plain strings. We could improve node declaration by using an enumeration for node names. Ref: https://github.com/langchain-ai/langgraph/pull/255

So far I have only changed some of the existing examples. If you agree with this style, I can help update the node declarations across all the docs.

from langgraph.graph import StateGraph, END
from enum import Enum

class NodeType(str, Enum):
    PLAN = "planner"
    EXECUTE = "agent"
    REPLAN = "replan"

# PlanExecute, plan_step, execute_step, replan_step and should_end come from
# the existing example that this snippet modifies.
workflow = StateGraph(PlanExecute)

# Add the plan node
workflow.add_node(NodeType.PLAN, plan_step)

# Add the execution step
workflow.add_node(NodeType.EXECUTE, execute_step)

# Add a replan node
workflow.add_node(NodeType.REPLAN, replan_step)

workflow.set_entry_point(NodeType.PLAN)

# From plan we go to agent
workflow.add_edge(NodeType.PLAN, NodeType.EXECUTE)

# From agent, we replan
workflow.add_edge(NodeType.EXECUTE, NodeType.REPLAN)

workflow.add_conditional_edges(
    NodeType.REPLAN,
    # Next, we pass in the function that will determine which node is called next.
    should_end,
    {
        # If `should_end` returns True we finish, otherwise we loop back to the execute node.
        True: END,
        False: NodeType.EXECUTE,
    },
)

# Finally, we compile it!
# This compiles it into a LangChain Runnable,
# meaning you can use it as you would any other runnable
app = workflow.compile()
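
One quick note on compatibility (a rough sketch, not tied to any particular langgraph version): because NodeType mixes in str, its members are real strings and compare equal to their underlying values, so code that still passes plain node names keeps working.

# NodeType members are str instances and compare equal to their values,
# so they interoperate with code that still uses plain strings for node names.
assert isinstance(NodeType.PLAN, str)
assert NodeType.PLAN == "planner"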

Undertone0809 · Apr 30 '24 10:04

Recently, while developing an agent communication framework, I studied the design patterns involved and put forward a suggestion on syntax optimization in the discussion linked below. Could you give me some feedback?

Link: https://github.com/langchain-ai/langchain/discussions/20420

Undertone0809 · Apr 30 '24 10:04

I think rather than updating all the existing notebooks, we could add a "how-to" showing how to use enums for node names, to make them easier to manage.

hinthornw · May 07 '24 16:05

This looked cute, so I tried it, but it breaks serialization in LangServe streaming events (RemoteRunnable.astream_events): the NodeType member ends up as a key inside the data object of on_chain_stream events, and WellKnownLCSerializer.dumps doesn't like that.
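
Roughly, the shape of the problem (a sketch only, not the exact event internals): the chunk inside the on_chain_stream data is keyed by the enum member rather than a built-in str, something like:

# Sketch of the offending shape: the node-name key is the NodeType member,
# not a plain str, and that key is what the serializer trips over.
chunk = {NodeType.EXECUTE: {"response": "..."}}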

ClaudiaJ · May 07 '24 18:05

I think rather than updating all the existing notebooks, we could add a "how-to" showing how to use enums for node names, to make them easier to manage.

I agree with you: that way we do not have to change all the notebooks, and a dedicated how-to can single out and emphasize this point. I will work on that.

Undertone0809 · May 07 '24 19:05

This looked cute, so I tried it, but it breaks serialization in LangServe streaming events (RemoteRunnable.astream_events): the NodeType member ends up as a key inside the data object of on_chain_stream events, and WellKnownLCSerializer.dumps doesn't like that.

Thanks for the feedback; this is something I hadn't noticed. Could you provide more detailed code to help me understand the problem more clearly? It will help us think it through more fully.

Undertone0809 · May 07 '24 19:05

This looked cute, so I tried it, but it breaks serialization in LangServe streaming events (RemoteRunnable.astream_events): the NodeType member ends up as a key inside the data object of on_chain_stream events, and WellKnownLCSerializer.dumps doesn't like that.

Thanks for the feedback; this is something I hadn't noticed. Could you provide more detailed code to help me understand the problem more clearly? It will help us think it through more fully.

It's basically as you had shared: compile the graph to a Runnable and hand it to LangServe's add_routes to expose it as a route:

demo_server.py
from fastapi import FastAPI
from langserve import add_routes

# your example compiled workflow
from example import app as workflow

app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="Spin up a simple api server using Langchain's Runnable interfaces",
)

add_routes(app, workflow, path="/example")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)

then on client side I put together this little utility REPL:

demo_remote_runnable.py
import pprint
import asyncio
from enum import StrEnum  # StrEnum requires Python 3.11+
from uuid import uuid4

from langserve import RemoteRunnable
from langchain_core.messages import AIMessage

class Commands(StrEnum):
    NEW_THREAD = "new thread"
    OLD_THREAD = "prev thread"
    DEBUG_ENABLE = "debug"

class Modes(StrEnum):
    STREAM = "stream_async"
    GENERATE_EVENTS = "generate_events"

async def prompt():
    while True:
        try:
            _prompt = input("Human: ")
        except EOFError:
            break
        if _prompt == "q":
            print("Exiting...")
            break

        yield _prompt

async def run():
    runner = RemoteRunnable(
        url=f"http://localhost:8000/example",
    )

    # NOTE: default ON for purpose of example and also I haven't finished this next part
    debug: bool = True

    user_id = str(uuid4())
    thread_id = str(uuid4())
    prev_thread: str = ""

    async for user_input in prompt():
        if user_input == Commands.NEW_THREAD:
            prev_thread = thread_id
            thread_id = str(uuid4())
            continue
        elif user_input == Commands.OLD_THREAD:
            prev_thread, thread_id = (thread_id, prev_thread)
            continue
        elif user_input == Commands.DEBUG_ENABLE:
            debug = not debug
            continue

        print("...")

        try:
            async for event in runner.astream_events(
                input={"input": user_input},
                config={
                    "configurable": {
                        "user_id": user_id,
                        "thread_id": thread_id,
                    },
                    "run_name": "Test Run",
                    "metadata": {"remote_meta": "test"},
                },
                version="v1",
                include_names=["/example"],
            ):
                if debug:
                    pprint.pp(event)

                # TODO: format the event stream into something more fun and exciting to look at in a demo video :/

        except Exception as e:
            print(f"Whoopsy doodle, something broke :(\n{e}")

        print("\n---")

if __name__ == "__main__":
    asyncio.run(run())
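
One workaround that should sidestep this, going back to the example graph above (just a sketch, untested against LangServe): keep the enum as a namespace for the names, but hand the graph the plain string values, so only built-in str keys ever appear in the streamed event payloads.

# Workaround sketch: pass the underlying plain strings (.value) to the graph so
# the enum member itself never becomes a key in streamed event chunks.
workflow.add_node(NodeType.PLAN.value, plan_step)
workflow.add_node(NodeType.EXECUTE.value, execute_step)
workflow.add_node(NodeType.REPLAN.value, replan_step)
workflow.set_entry_point(NodeType.PLAN.value)
workflow.add_edge(NodeType.PLAN.value, NodeType.EXECUTE.value)
workflow.add_edge(NodeType.EXECUTE.value, NodeType.REPLAN.value)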

ClaudiaJ · May 07 '24 19:05

Interesting. cc @eyurtsev - seems like something that could be fixable in langserve?

hinthornw · May 09 '24 17:05