AGiXT
AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, a...
When trying to run the docker container, I'd get: `Error response from daemon: Ports are not available: exposing port TCP 0.0.0.0:5000 -> 0.0.0.0:0: listen tcp 0.0.0.0:5000: bind: address already in...
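That error means some other process on the host already holds port 5000 (on macOS, the AirPlay Receiver commonly binds it). One workaround, sketched as a hypothetical docker-compose.yml fragment (the service name here is illustrative, not necessarily the project's actual one), is to remap the host side of the port binding:

```yaml
# Hypothetical docker-compose.yml fragment: remap the host port so the
# container's port 5000 no longer collides with whatever already holds it.
services:
  backend:            # illustrative service name
    ports:
      - "5001:5000"   # host 5001 -> container 5000
```

The container still listens on 5000 internally; only the host-side port changes, so the web interface would need to point at 5001 instead.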
Hi, great project. It is exactly what the autonomous-agent space is lacking: a way to get rid of the dependency on OpenAI or other commercial AI providers. Based on my own...
Apparently the agents are set up to expect only the response text back from the API; however, text-gen-ui, using the Gradio API, currently sends back...
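A minimal sketch of the mismatch: Gradio's HTTP API wraps outputs in a `{"data": [...]}` JSON envelope, while the agent only wants the generated text. A small normalizing helper (the function name here is hypothetical, not part of the project) could unwrap it:

```python
# Sketch: unwrap a Gradio API reply so the agent sees only the generated
# text. The "data" envelope matches Gradio's JSON reply format; the
# helper itself is a hypothetical illustration.

def extract_response(payload: dict) -> str:
    """Return just the generated text from a Gradio-style API reply."""
    data = payload.get("data", [])
    if not data:
        raise ValueError("Gradio reply contained no data")
    # The first element of "data" holds the model's text output.
    return str(data[0])
```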
Backend support has been added for Chains in the API. We will need the chain functionality added to the front end as well. # What are Chains? Chains will...
API info here: https://docs.searxng.org/dev/search_api.html The Google API is limited: it requires a key, you only get 100 queries, and then you have to pay. Other options exist but face many of the same...
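Per the linked SearxNG docs, the search API is a plain GET on `/search` with a `q` parameter, and `format=json` can be enabled in the instance's settings. A sketch of building such a request URL (the instance URL below is a placeholder):

```python
import urllib.parse

# Placeholder instance URL; substitute a real SearxNG deployment.
SEARXNG_URL = "https://searx.example.org"

def build_search_url(query: str, fmt: str = "json") -> str:
    """Build a SearxNG search-API URL.

    Note: format=json must be enabled in the instance's settings,
    per docs.searxng.org/dev/search_api.html.
    """
    params = urllib.parse.urlencode({"q": query, "format": fmt})
    return f"{SEARXNG_URL}/search?{params}"
```

Unlike the Google API, a self-hosted SearxNG instance needs no key and has no per-query billing.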
Backend API support has been added for custom prompts. # What are Custom Prompts? Custom prompts are essentially what give an agent its initial mindset. They are how we tell...
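The idea can be sketched as a template with placeholders that get filled in before the prompt is sent to the model. The placeholder names below ({agent_name}, {task}) are illustrative, not AGiXT's actual variable set:

```python
# Illustrative custom prompt: the template text defines the agent's
# initial mindset; placeholders are substituted per request.
CUSTOM_PROMPT = (
    "You are {agent_name}, an autonomous agent.\n"
    "Your current objective: {task}\n"
    "Respond with the next step only."
)

def render_prompt(template: str, **variables: str) -> str:
    """Fill the template's placeholders before sending it to the model."""
    return template.format(**variables)
```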
Version: ghcr.io/josh-xt/agent-llm-backend:v1.0.7 Reproduction:
- docker compose up --build
- Navigate to web interface
- Type a task into the "Provide agent with objective" field and submit
- Backend unexpectedly exits...
Running provider in Docker container using OpenAI API Key. Using 1.0.7 ![image](https://user-images.githubusercontent.com/4380009/234183418-d4d2f3b8-f779-41de-a3de-f77b0bf5624d.png)
Connection refused when trying to use Agent-LLM with oobabooga. env file, important parts:
# =========================
# AI PROVIDER CONFIG
# =========================
AI_PROVIDER=oobabooga
AI_MODEL=vicuna
AI_TEMPERATURE=0.2
MAX_TOKENS=2000
# AI PROVIDER: CUSTOM (e.g.,...
Theme is set to state, not LocalStorage