pgalko
Hi Alex, I will consider adding this functionality. In the meantime, if you have a CSV file on Dropbox you can add this data as custom. Just go to "Custom...
Good point. What it does during that gap is develop a new version of the code, incorporating the fix. We can easily enable a stream to the terminal by just changing...
BambooAI can run in a while loop with memory enabled natively, so you should not need to enclose it in a for loop. Basically, if you run it with...
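As a rough illustration, here is a minimal sketch of relying on BambooAI's own conversational loop instead of an external for loop. The file name and the `max_conversations` keyword argument are assumptions based on the README and may differ in your installed version.

```python
import pandas as pd
from bambooai import BambooAI

# Load the dataframe you want to analyse (hypothetical file name).
df = pd.read_csv("my_data.csv")

# Instantiate the agent. max_conversations is an assumed keyword
# argument controlling how much conversation memory is retained.
bamboo = BambooAI(df, max_conversations=4)

# Calling the converse method with no argument starts BambooAI's own
# interactive loop: it keeps prompting for questions and carries the
# conversation memory forward, so no external for/while loop is needed.
bamboo.pd_agent_converse()
```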
Unfortunately the library currently does not support Azure OpenAI models. It is on the TODO list though. It should be a relatively easy fix, I just need to find some...
If you want to explore open-source coding models without a GPU, the best way to go about it is to use Google Colab and one of the GPTQ models. At this point...
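For example, a Colab cell along these lines loads a GPTQ-quantized coding model onto Colab's free GPU via the transformers/auto-gptq stack. The model repo and package list are illustrative assumptions, not a BambooAI-specific recommendation.

```python
# In Colab, install the GPTQ tooling first (versions are examples):
# !pip install -q transformers optimum auto-gptq accelerate

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/WizardCoder-15B-1.0-GPTQ"  # example GPTQ coding model

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" places the quantized weights on the Colab GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that returns the n-th Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```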
I am glad you like it :-) The high token usage is due to a few things. Please see the breakdown below. 1. To ensure that the models respond accurately...
I forgot to mention one more thing. You can set 'exploratory=False'. BambooAI will skip the breakdown of the question into a task list and go straight to code generation....
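A minimal sketch of what that would look like, assuming `exploratory` is passed to the constructor as in the README (the question string is just an example):

```python
from bambooai import BambooAI

# df is a pandas DataFrame loaded earlier, as in the sketch above.
# exploratory=False skips the task-list breakdown step and goes
# straight to code generation, which reduces token usage.
bamboo = BambooAI(df, exploratory=False)
bamboo.pd_agent_converse("Plot the monthly average of the 'sales' column")
```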
@Murtuza-Chawala great to hear that :-). I frequently use OpenAI CI as a benchmark. The CI is much faster and its output formatting is so much nicer, but often bamboo...
@Murtuza-Chawala I have identified the issue that was leading to excessive token usage. The problem stemmed from the way the default example code was incorporated into the prompt template. Specifically,...
Yes, that is correct. You should see far fewer error corrections, hence reduced token usage.