BlockAGI
It seems to consume a lot of tokens
Maybe it's just an impression, but I find that BlockAGI consumes a lot of tokens compared to AgentGPT or SuperAGI.
Yes, unfortunately the way BlockAGI works is quite inefficient at the moment. A lot of tokens are spent on formatting instructions that tell the model how to respond, rather than on the task itself.
Goals for future iterations include reducing this overhead and cutting down on repetition in the narration tasks.
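If you want to see how much of your own prompt budget goes to this kind of overhead, here is a minimal sketch (not BlockAGI's actual code) that compares the token cost of a hypothetical formatting instruction against the task content, using OpenAI's tiktoken tokenizer. The prompt strings are made up for illustration.

```python
# Minimal sketch: measure how much of a prompt is instruction/formatting
# overhead versus actual task content. The prompt text below is
# hypothetical, not taken from BlockAGI.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by GPT-3.5/4 models

# Hypothetical pieces: a response-format instruction plus the real task.
formatting_instructions = (
    "You are a research agent. Respond ONLY in the following JSON format: "
    '{"thoughts": "...", "plan": ["..."], "action": {"tool": "...", "input": "..."}} '
    "Do not include any other text."
)
task = "Summarize recent developments in battery technology."

overhead = len(enc.encode(formatting_instructions))
payload = len(enc.encode(task))
print(f"instruction overhead: {overhead} tokens")
print(f"task content:         {payload} tokens")
print(f"overhead share:       {overhead / (overhead + payload):.0%}")
```

Since instructions like this are resent on every agent step, the overhead multiplies with the number of iterations, which is why the totals can grow much faster than in tools with leaner prompts.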