RaGe
Here's the finished workflow: https://github.com/foragerr/OpenDevin/actions/runs/8577986362/workflow

How to use it:
1. Set up a repo secret called `OPENAI_API_KEY` containing a valid OpenAI key
2. Create an issue containing sufficient context in the title...
PR: https://github.com/OpenDevin/OpenDevin/pull/803
Caution: adding labels to issues is not access-controlled; anyone would be able to add a label and generate spurious PRs and, more importantly, use up OpenAI credits. Perhaps the trigger...
This is completed in #803
How does the backend send the value of `LLM_MODEL` to the frontend? Are you thinking that's a new endpoint?
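To make the question concrete, here's a minimal sketch of what such an endpoint could look like, assuming a FastAPI backend; the route name `/api/config` and the response shape are my assumptions, not OpenDevin's actual API:

```python
# Hypothetical sketch: a small config endpoint the frontend could call on load
# to learn which model the backend is configured with.
import os

from fastapi import FastAPI

app = FastAPI()

@app.get("/api/config")  # assumed route name, not an existing OpenDevin endpoint
def get_config():
    # Expose only non-secret settings; never return the API key itself.
    return {"LLM_MODEL": os.environ.get("LLM_MODEL", "gpt-3.5-turbo")}
```

The frontend would then fetch this once at startup instead of having the model name baked into its build.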
Are you folks making backend and frontend separate containers, or jamming them into one container?
Not to take away from @jrruethe's excellent work, but I feel like this whole OpenDevin-in-a-container setup could be a lot simpler:
1. Is docker on docker...
Are you doing Docker-in-Docker by mapping the Docker socket from the host into the OpenDevin container?
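If so, the usual pattern is to mount the host's Docker socket into the OpenDevin container and talk to the host daemon through it, so sandbox containers run as siblings on the host rather than truly nested. A rough sketch using the Python Docker SDK (assumes the container was started with `-v /var/run/docker.sock:/var/run/docker.sock` and has the `docker` package installed; the image and command below are just placeholders):

```python
# Sketch: talking to the HOST Docker daemon from inside the container via the
# mounted socket. Containers started here run as siblings on the host.
import docker

client = docker.DockerClient(base_url="unix://var/run/docker.sock")
print(client.ping())  # True if the host daemon is reachable through the socket

# e.g. spin up a throwaway sandbox container (placeholder image/command):
output = client.containers.run("alpine:latest", "echo sandbox-ok", remove=True)
print(output.decode().strip())
```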
Sounds like you'll eventually want to support multiple deployment modes:
1. If you're an OpenDevin developer: something very close to what's available today (Makefiles, poetry, pnpm, etc.)
2. ...
Plan:
- [x] Stage 1: Create a Docker image that combines the UI and backend into one process and exposes a single port, but still relies on an external Docker-based...
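For Stage 1, a rough sketch of how one process could serve both the API and the built UI on a single port (assuming a FastAPI backend and a pre-built frontend; the paths and route names below are placeholders, not OpenDevin's actual layout):

```python
# Sketch: one process, one port. API routes are registered first; the built
# frontend is mounted at "/" so everything is served by the same server.
import os

from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

@app.get("/api/config")  # placeholder API route (see the config sketch above)
def get_config():
    return {"LLM_MODEL": os.environ.get("LLM_MODEL", "gpt-3.5-turbo")}

# Mount the pre-built frontend last so the API routes above take precedence.
app.mount("/", StaticFiles(directory="frontend/dist", html=True), name="ui")

# Run with: uvicorn app:app --host 0.0.0.0 --port 3000
```

The Docker image would then only need to build the frontend, copy the dist folder next to the backend, and expose that one port.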