Separate agent controller and server via EventStream

Open rbren opened this issue 1 year ago • 4 comments

Architecture refactor progress:

  • ~~rearrange directory structure~~
  • separate server and controller via event stream (this PR)
  • separate controller and runtime via event stream
  • utilize event stream for agent history

The main goal here is to start creating separation between the server and the agent controller. Ideally they should be able to communicate over a network connection--though we're not quite there.

Now, when there's a request to change agent state (init, start, pause, stop), a ChangeAgentStateAction gets put into the event stream. Once the agent controller has responded, it puts an AgentStateChangedObservation into the event stream (sketched below).

Two exceptions to this flow:

  • on init events, the server actually instantiates the controller first
  • on start events, the server passes the task to the controller first (I think we can make this case smoother in the future)
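
To make the flow concrete, here's a minimal sketch of the pattern (illustrative only; the real EventStream, action/observation classes, and controller live under opendevin/ and have different signatures):

import asyncio
from dataclasses import dataclass


@dataclass
class ChangeAgentStateAction:
    agent_state: str  # the desired state, e.g. 'running' or 'paused'


@dataclass
class AgentStateChangedObservation:
    agent_state: str  # the state the controller actually ended up in


class EventStream:
    """Toy pub/sub stream: every subscriber sees every event."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    async def add_event(self, event):
        for callback in list(self._subscribers):
            await callback(event)


class AgentController:
    def __init__(self, stream):
        self.state = 'stopped'
        self.stream = stream
        stream.subscribe(self.on_event)

    async def on_event(self, event):
        # The controller only reacts to requested state changes...
        if isinstance(event, ChangeAgentStateAction):
            self.state = event.agent_state
            # ...and reports the resulting state back as an observation.
            await self.stream.add_event(AgentStateChangedObservation(self.state))


async def main():
    stream = EventStream()
    AgentController(stream)

    # The server never calls the controller directly; it just watches the stream.
    async def server_listener(event):
        if isinstance(event, AgentStateChangedObservation):
            print('server sees agent state:', event.agent_state)

    stream.subscribe(server_listener)
    # (The init/start exceptions above are the places where the server still
    # touches the controller directly, e.g. to pass the initial task.)
    await stream.add_event(ChangeAgentStateAction('running'))
    await stream.add_event(ChangeAgentStateAction('paused'))


asyncio.run(main())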

Some other stuff in here:

  • I changed "TaskState" to "AgentState" in most places. IMO task state is about how the agent is going about checking boxes off its current task, whereas it's the agent that's running/paused/stopped/etc.
  • I merged the commands for changing state with the states themselves. Now you just send your desired state, e.g. pause is now paused, and start/resume are now running (rough example below)
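
As a rough example of that last point (the enum members shown are just the states mentioned in this thread; the real AgentState enum likely has more):

from enum import Enum


class AgentState(str, Enum):
    RUNNING = 'running'   # what 'start' / 'resume' used to request
    PAUSED = 'paused'     # what 'pause' used to request
    STOPPED = 'stopped'


# Before, the client sent a command verb like {'command': 'pause'}.
# Now it just names the state it wants the agent to be in:
desired = AgentState.PAUSED
print(f'requesting agent state: {desired.value}')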

rbren avatar May 02 '24 22:05 rbren

Running on this branch with

export WORKSPACE_DIR=/Users/myuser/code/workspace
docker run \
--pull=always \
-e LLM_API_KEY \
-e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
-v $WORKSPACE_DIR:/opt/workspace_base \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 2000:3000 \
--add-host host.docker.internal=host-gateway \
ghcr.io/opendevin/opendevin:1538-merge

I am getting

requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff8969da60>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 143, in _run
    finished = await self.step(i)
               ^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/controller/agent_controller.py", line 268, in step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/monologue_agent/agent.py", line 238, in step
    resp = self.llm.completion(messages=messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 112, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3129, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3027, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2200, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8965, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8774, in exception_type
    raise ServiceUnavailableError(
          ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/exceptions.py", line 157, in __init__
    super().__init__(
  File "/app/.venv/lib/python3.12/site-packages/openai/_exceptions.py", line 82, in __init__
    super().__init__(message, response.request, body=body)
                              ^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'request'
22:17:59 - opendevin:INFO: agent_controller.py:196 - Setting agent state from AgentState.RUNNING to AgentState.STOPPED

Although ollama works when I try:

❯ docker exec -it 1d5288890a08 curl http://host.docker.internal:11434
Ollama is running

I have been using it on the main branch without any problem. It also seems like the "Pause the agent task" and "Restart a new agent task" buttons don't work.

isavita avatar May 03 '24 22:05 isavita

Hmm thanks for the report @isavita. This looks like the issue described here: https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting#unable-to-connect-to-llm

Are you sure you're using the exact same environment variables when you run main vs. my branch? You'll need to add things like LLM_BASE_URL (for Ollama that's likely http://host.docker.internal:11434, matching your curl test above).

rbren avatar May 05 '24 01:05 rbren

In the console it works well. With the UI, I think this one might be relevant: I ran with CodeAct, gave it a task, and it wrote a file. OK. Then I said, "run it". (I'm not actually sure if this happens on main; I tend to think it doesn't. I ran simply with make build / make run.)

INFO:     connection open
INFO:     127.0.0.1:55631 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     127.0.0.1:55631 - "GET /api/messages HTTP/1.1" 200 OK
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 240, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/middleware/errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/middleware/cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/routing.py", line 373, in handle
    await self.app(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/starlette/routing.py", line 94, in app
    await func(session)
  File "/Users/enyst/Library/Caches/pypoetry/virtualenvs/opendevin-EKSafskD-py3.12/lib/python3.12/site-packages/fastapi/routing.py", line 348, in app
    await dependant.call(**values)
  File "/Users/enyst/repos/devin/opendevin/server/listen.py", line 45, in websocket_endpoint
    await session_manager.loop_recv(sid, agent_manager.dispatch)
  File "/Users/enyst/repos/devin/opendevin/server/session/manager.py", line 35, in loop_recv
    await self._sessions[sid].loop_recv(dispatch)
  File "/Users/enyst/repos/devin/opendevin/server/session/session.py", line 37, in loop_recv
    await dispatch(self.sid, action, data)
  File "/Users/enyst/repos/devin/opendevin/server/agent/manager.py", line 35, in dispatch
    await self.sid_to_agent[sid].dispatch(action, data)
  File "/Users/enyst/repos/devin/opendevin/server/agent/agent.py", line 91, in dispatch
    action_obj = action_from_dict(action_dict)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/enyst/repos/devin/opendevin/events/action/__init__.py", line 50, in action_from_dict
    raise AgentMalformedActionError(
opendevin.core.exceptions.AgentMalformedActionError: 'action['action']='user_message'' is not defined. Available actions: dict_keys(['kill', 'run', 'run_ipython', 'browse', 'read', 'write', 'recall', 'think', 'talk', 'finish', 'delegate', 'add_task', 'modify_task', 'change_agent_state', 'push'])
INFO:     connection closed

That said, this refactoring is a beauty. ❤️

enyst avatar May 05 '24 03:05 enyst

@rbren thank you! I pulled the newest version of this branch and everything worked perfectly with ollama/llama3:8b-instruct-q8_0.

log details
❯ export WORKSPACE_DIR=/Users/isavita/code/workspace
docker run \
--pull=always \
-e SANDBOX_USER_ID=$(id -u) \
-e LLM_API_KEY \
-e WORKSPACE_MOUNT_PATH=$WORKSPACE_DIR \
-v $WORKSPACE_DIR:/opt/workspace_base \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 2000:3000 \
--add-host host.docker.internal=host-gateway \
ghcr.io/opendevin/opendevin:1538-merge
1538-merge: Pulling from opendevin/opendevin
Digest: sha256:e318edd445b5377e44161e0831932662d4fae794333208e1042b08cc22a419b6
Status: Image is up to date for ghcr.io/opendevin/opendevin:1538-merge
Docker socket group id: 0
INFO:     Started server process [33]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO:     127.0.0.1:59772 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO:     127.0.0.1:59772 - "GET /index.html HTTP/1.1" 200 OK
INFO:     127.0.0.1:59772 - "GET /assets/index-DZ5GKr_e.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:59778 - "GET /assets/index-DD3lkNKs.css HTTP/1.1" 200 OK
10:12:13 - opendevin:ERROR: auth.py:33 - Invalid token
10:12:13 - opendevin:INFO: listen.py:77 - Invalid or missing credentials, generating new session ID: 78ed30d6-0ad4-421c-b4df-7359297ea5b5
INFO:     127.0.0.1:59772 - "GET /api/auth HTTP/1.1" 200 OK
INFO:     127.0.0.1:59778 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO:     127.0.0.1:59772 - "GET /locales/en-US/translation.json HTTP/1.1" 404 Not Found
INFO:     ('127.0.0.1', 59779) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI3OGVkMzBkNi0wYWQ0LTQyMWMtYjRkZi03MzU5Mjk3ZWE1YjUifQ.JaDAkpOx--X295rTQUYRrHL-3_NldhbqvxdqeSUaKQY" [accepted]
Starting loop_recv for sid: 78ed30d6-0ad4-421c-b4df-7359297ea5b5
INFO:     connection open
INFO:     127.0.0.1:59778 - "GET /api/refresh-files HTTP/1.1" 200 OK
INFO:     127.0.0.1:59778 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     127.0.0.1:59772 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     127.0.0.1:59778 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     127.0.0.1:59780 - "GET /api/agents HTTP/1.1" 200 OK
INFO:     127.0.0.1:59772 - "GET /favicon-32x32.png HTTP/1.1" 200 OK
10:12:14 - opendevin:INFO: agent.py:156 - Creating agent CodeActAgent using LLM gpt-3.5-turbo
10:12:14 - opendevin:INFO: llm.py:71 - Initializing LLM with model: gpt-3.5-turbo
10:12:14 - opendevin:INFO: ssh_box.py:67 - SSHBox is running as opendevin user with USER_ID=501 in the sandbox
10:12:15 - opendevin:INFO: ssh_box.py:370 - Container stopped
10:12:15 - opendevin:WARNING: ssh_box.py:382 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
10:12:15 - opendevin:INFO: ssh_box.py:391 - Mounting workspace directory: /Users/isavita/code/workspace
10:12:15 - opendevin:INFO: ssh_box.py:412 - Container started
10:12:16 - opendevin:INFO: ssh_box.py:428 - waiting for container to start: 1, container status: running
10:12:16 - opendevin:INFO: ssh_box.py:191 - Connecting to opendevin@localhost via ssh. If you encounter any issues, you can try `ssh -v -p 59787 opendevin@localhost` with the password '29a039bd-8c39-432a-baad-686ab38bf1e2' and report the issue on GitHub. If you started OpenDevin with `docker run`, you should try `ssh -v -p 59787 opendevin@localhost` with the password '29a039bd-8c39-432a-baad-686ab38bf1e2 on the host machine (where you started the container).
10:12:17 - opendevin:INFO: mixin.py:24 - Copied files from [/Users/isavita/git/OpenDevin/opendevin/runtime/plugins/jupyter] to [/opendevin/plugins/jupyter] inside sandbox.
10:12:17 - opendevin:INFO: mixin.py:32 - Initializing plugin [jupyter] by executing [/opendevin/plugins/jupyter/setup.sh] in the sandbox.
...
JupyterKernelGateway started with PID: 92
Execution server started with PID: 93
Jupyter kernel ready.
10:12:20 - opendevin:INFO: mixin.py:24 - Copied files from [/Users/isavita/git/OpenDevin/opendevin/runtime/plugins/swe_agent_commands] to [/opendevin/plugins/swe_agent_commands] inside sandbox.
10:12:20 - opendevin:INFO: mixin.py:32 - Initializing plugin [swe_agent_commands] by executing [/opendevin/plugins/swe_agent_commands/setup_default.sh] in the sandbox.
10:12:21 - opendevin:INFO: mixin.py:40 - Plugin swe_agent_commands initialized successfully
...
10:12:21 - opendevin:INFO: mixin.py:50 - Sourced ~/.bashrc successfully
10:12:21 - opendevin:INFO: browser_env.py:38 - Starting browser env...
10:12:25 - opendevin:INFO: browser_env.py:51 - Browser env started.
10:13:07 - opendevin:INFO: agent.py:156 - Creating agent CodeActAgent using LLM ollama/llama3:8b-instruct-q8_0
10:13:07 - opendevin:INFO: llm.py:71 - Initializing LLM with model: ollama/llama3:8b-instruct-q8_0
10:13:07 - opendevin:INFO: ssh_box.py:67 - SSHBox is running as opendevin user with USER_ID=501 in the sandbox
10:13:07 - opendevin:INFO: ssh_box.py:370 - Container stopped
10:13:07 - opendevin:WARNING: ssh_box.py:382 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
10:13:07 - opendevin:INFO: ssh_box.py:391 - Mounting workspace directory: /Users/isavita/code/workspace
10:13:08 - opendevin:INFO: ssh_box.py:412 - Container started
...
JupyterKernelGateway started with PID: 91
Execution server started with PID: 92
Jupyter kernel ready.
10:13:13 - opendevin:INFO: mixin.py:24 - Copied files from [/Users/isavita/git/OpenDevin/opendevin/runtime/plugins/swe_agent_commands] to [/opendevin/plugins/swe_agent_commands] inside sandbox.
10:13:13 - opendevin:INFO: mixin.py:32 - Initializing plugin [swe_agent_commands] by executing [/opendevin/plugins/swe_agent_commands/setup_default.sh] in the sandbox.
10:13:14 - opendevin:INFO: mixin.py:40 - Plugin swe_agent_commands initialized successfully
:Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: flake8 in /usr/local/lib/python3.10/dist-packages (7.0.0)
Requirement already satisfied: mccabe<0.8.0,>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from flake8) (0.7.0)
Requirement already satisfied: pycodestyle<2.12.0,>=2.11.0 in /usr/local/lib/python3.10/dist-packages (from flake8) (2.11.1)
Requirement already satisfied: pyflakes<3.3.0,>=3.2.0 in /usr/local/lib/python3.10/dist-packages (from flake8) (3.2.0)
10:13:14 - opendevin:INFO: mixin.py:50 - Sourced ~/.bashrc successfully
10:13:14 - opendevin:INFO: browser_env.py:38 - Starting browser env...
10:13:17 - opendevin:INFO: browser_env.py:51 - Browser env started.


==============
STEP 0

10:13:54 - PLAN

## Context
Advent of Code is an annual series of programming challenges that take place in December. Each day, a new problem is released, and participants must write code to solve it.

## Task
--- Day 1: The Tyranny of the Rocket Equation ---
Santa has become stranded at the edge of the Solar System while delivering presents to other planets! To accurately calculate his position in space, safely align his warp drive, and return to Earth in time to save Christmas, he needs you to bring him measurements from fifty stars.

Collect stars by solving puzzles. Two puzzles will be made available on each day in the Advent calendar; the second puzzle is unlocked when you complete the first. Each puzzle grants one star. Good luck!

The Elves quickly load you into a spacecraft and prepare to launch.

At the first Go / No Go poll, every Elf is Go until the Fuel Counter-Upper. They haven't determined the amount of fuel required yet.

Fuel required to launch a given module is based on its mass. Specifically, to find the fuel required for a module, take its mass, divide by three, round down, and subtract 2.

For example:

For a mass of 12, divide by 3 and round down to get 4, then subtract 2 to get 2.
For a mass of 14, dividing by 3 and rounding down still yields 4, so the fuel required is also 2.
For a mass of 1969, the fuel required is 654.
For a mass of 100756, the fuel required is 33583.
The Fuel Counter-Upper needs to know the total fuel requirement. To find it, individually calculate the fuel needed for the mass of each module (your puzzle input), then add together all the fuel values.

What is the sum of the fuel requirements for all of the modules on your spacecraft?

## Constraints
- You should implement the solution in Python.
- I will provide you with the input data as a file with the name 'input.txt'.
- The program should print the solution of the task to standard output (stdout) and you should provide me with the code file and the solution output.

## Instructions
1. You should break down the task into smaller sub-tasks and solve each sub-task using TDD(Test-Driven Development) approach.
2. You should write unit tests based on the task examples.
3. You should save and run the tests and the task solution to make sure they work as expected.

## Completion Criteria
- The program should print the correct answer to the task.
- You have run the task solution program and provide the answer to me.

10:15:00 - ACTION
I'm excited to help you with this Advent of Code challenge!

To get started, I'll break down the task into smaller sub-tasks and solve each one using TDD (Test-Driven Development) approach.

First, let's define the function that calculates the fuel required for a given mass. We can start by writing a test for this function:
<code>
import unittest

def calculate_fuel(mass):
    # TO DO: implement the calculation logic here
    pass

class TestCalculateFuel(unittest.TestCase):
    def test_calculate_fuel(self):
        self.assertEqual(calculate_fuel(12), 2)
        self.assertEqual(calculate_fuel(14), 2)
        self.assertEqual(calculate_fuel(1969), 654)
        self.assertEqual(calculate_fuel(100756), 33583)

if __name__ == '__main__':
    unittest.main()
</code>
Now, let's implement the `calculate_fuel` function:
<code>
def calculate_fuel(mass):
    fuel = mass // 3 - 2
    return max(fuel, 0)
</code>
This implementation uses integer division (`//`) to divide the mass by 3 and then subtracts 2. We also use the `max` function to ensure that the result is not negative.

Next, let's write a test for the main task:
<code>
class TestMain(unittest.TestCase):
    def test_main(self):
        with open('input.txt', 'r') as f:
            masses = [int(line.strip()) for line in f]
        total_fuel = sum(calculate_fuel(mass) for mass in masses)
        self.assertEqual(total_fuel, ???)
</code>
Here, we read the input data from `input.txt` and calculate the total fuel required by summing up the results of `calculate_fuel` for each mass.

Finally, let's run the tests to ensure that everything works as expected:
<code>
if __name__ == '__main__':
    unittest.main()
</code>
Once we've written and tested all the code, we can run it with the input data to get the solution:
<code>
with open('input.txt', 'r') as f:
    masses = [int(line.strip()) for line in f]
total_fuel = sum(calculate_fuel(mass) for mass in masses)
print(total_fuel)
</code>
This will output the correct answer to the task.

Please provide me with the input data file `input.txt` and I'll run the code to get the solution.

isavita avatar May 05 '24 09:05 isavita

Thanks all!

@enyst it looks like your issue was a bad LLM response (it set action: user_message), so probably unrelated. But it does look like it broke the loop instead of sending the error back... 👀
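
Roughly, the lookup that raised the AgentMalformedActionError above looks like this (a simplified sketch, not the actual code in opendevin/events/action/__init__.py; only the 'run' entry is shown):

class AgentMalformedActionError(Exception):
    pass


class CmdRunAction:
    def __init__(self, command=''):
        self.command = command


# Illustrative registry; the real one maps all the keys listed in the error above.
ACTION_TYPE_TO_CLASS = {
    'run': CmdRunAction,
    # 'browse': ..., 'change_agent_state': ..., etc.
}


def action_from_dict(action_dict):
    action_type = action_dict.get('action')
    if action_type not in ACTION_TYPE_TO_CLASS:
        # 'user_message' is not a registered key, hence the exception in the log.
        raise AgentMalformedActionError(
            f"'{action_type}' is not defined. "
            f'Available actions: {list(ACTION_TYPE_TO_CLASS)}'
        )
    return ACTION_TYPE_TO_CLASS[action_type](**action_dict.get('args', {}))


# A payload like the one the UI sent would raise immediately:
# action_from_dict({'action': 'user_message', 'args': {}})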

rbren avatar May 05 '24 12:05 rbren

Turns out the bad action was being sent by the UI--maybe a merge issue? It sent me down a huge rabbit hole of fixes for the interactive "waiting for user message" stuff. But everything seems to be working well now!

rbren avatar May 05 '24 14:05 rbren