LLMStack
[v0.0.7] LocalAI without TLD is not accessible
Describe the bug
Unable to add LocalAI to the docker-compose stack and test it, since the provider URL field requires a TLD.
To Reproduce
- Start the stack
- Create example Website chatbot
- Preview function and ask something
Expected behavior
LLMStack should be able to communicate with a local LocalAI instance at a URL like http://local-ai:8080
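A quick sanity check that the container itself is reachable on the docker network (a sketch, assuming the requests package and LocalAI's OpenAI-compatible /v1/models route):

```python
# Run from a container on the same docker network as local-ai.
# LocalAI's OpenAI-compatible API lists the models it serves at /v1/models.
import requests

resp = requests.get("http://local-ai:8080/v1/models", timeout=5)
print(resp.status_code, resp.json())
```

If this returns 200, the issue is only that LLMStack rejects the URL during validation, not that the host is unreachable.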
Version v0.0.7
Environment
DISTRIB_DESCRIPTION="Linux Mint 21.2 Victoria"
Docker version 24.0.5, build ced0996
Docker Compose version v2.20.3
Screenshots
Additional context
llmstack-007-rqworker-1 | return self.__get_result()
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
llmstack-007-rqworker-1 | raise self._exception
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/asgiref/sync.py", line 306, in main_wrap
llmstack-007-rqworker-1 | result = await self.awaitable(*args, **kwargs)
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/code/common/utils/crawlers.py", line 75, in run_playwright
llmstack-007-rqworker-1 | html_content = await page.content()
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/async_api/_generated.py", line 9142, in content
llmstack-007-rqworker-1 | return mapping.from_maybe_impl(await self._impl_obj.content())
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_page.py", line 462, in content
llmstack-007-rqworker-1 | return await self._main_frame.content()
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_frame.py", line 415, in content
llmstack-007-rqworker-1 | return await self._channel.send("content")
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 61, in send
llmstack-007-rqworker-1 | return await self._connection.wrap_api_call(
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 482, in wrap_api_call
llmstack-007-rqworker-1 | return await cb()
llmstack-007-rqworker-1 | ^^^^^^^^^^
llmstack-007-rqworker-1 | File "/usr/local/lib/python3.11/site-packages/playwright/_impl/_connection.py", line 97, in inner_send
llmstack-007-rqworker-1 | result = next(iter(done)).result()
llmstack-007-rqworker-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^
llmstack-007-rqworker-1 | playwright._impl._api_types.Error: Unable to retrieve content because the page is navigating and changing the content.
Just to clarify, since I didn't see it in the OP: when using the LocalAI provider with a base URL that includes localhost, the error shown in the logs is:
INFO 2024-01-12 08:07:41,631 coordinator Actor _inputs1 has no dependencies. Sending BEGIN message
ERROR 2024-01-12 08:07:41,633 __init__ Exception occurred while processing
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/llmstack/common/blocks/llm/__init__.py", line 42, in process
return self._process(self.parse_validate_input(input), self.configuration)
File "/usr/local/lib/python3.10/dist-packages/llmstack/common/blocks/llm/openai.py", line 136, in _process
http_input = HttpAPIProcessorInput(
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for HttpAPIProcessorInput
url
URL host invalid, top level domain required (type=value_error.url.host)
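For context, this looks like pydantic v1's HttpUrl validation, which rejects any host that has no top-level domain. A minimal sketch that reproduces the same message (the exact field type used by HttpAPIProcessorInput is my assumption):

```python
# Pydantic v1: HttpUrl sets tld_required, so a bare hostname such as
# "local-ai" fails with "URL host invalid, top level domain required".
from pydantic import BaseModel, HttpUrl, ValidationError

class Input(BaseModel):  # hypothetical stand-in for HttpAPIProcessorInput
    url: HttpUrl

try:
    Input(url="http://local-ai:8080/v1/chat/completions")
except ValidationError as exc:
    print(exc)
```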
Please use the provider config and the OpenAI provider to work with OpenAI-compatible endpoints. See https://docs.trypromptly.com/providers for more.
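For reference, LocalAI exposes an OpenAI-compatible API under /v1, so any OpenAI-style client pointed at it should work once the provider is configured. A rough sketch outside LLMStack (assuming the openai Python package >= 1.0 and a placeholder model name):

```python
# Talk to LocalAI through the standard OpenAI client by overriding the base URL.
# The api_key value is arbitrary when LocalAI runs without API keys configured.
from openai import OpenAI

client = OpenAI(base_url="http://local-ai:8080/v1", api_key="sk-local")
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # replace with a model your LocalAI instance serves
    messages=[{"role": "user", "content": "Hello from LLMStack"}],
)
print(resp.choices[0].message.content)
```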