Tomáš Dvořák
Great work! I can confirm that the issue has been fixed in v0.0.53.
Can I see such an HTML table so I can debug it?
If your table looks like this:

```
A       C
A1  B1  C1
A1      C1
```

then you can do this:

```typescript
const data = await tableParser(page, { selector: '#table-overview', asArray:...
```
I am closing this for being stale.
We reworked the instrumentation package for BeeAI. Is this issue still present?
The `start/finish` events should not be present in the latest version `0.1.10`.
Can you clarify a little bit, @codefromthecrypt?
We support multiple providers and internally rely on LiteLLM, so we can't drop it. We could do that only for the OpenAI adapter, but I'm not sure if it...
Great, @Abiji-2020! First, you need to define the model template in `src/adapters/shared/llmChatTemplates.ts`. Then you have to update the presets for BAM and WatsonX, and finally update the examples.
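To make the first step more concrete, here is a minimal sketch of what a chat-template entry could look like. This is only an illustration: the interface name, the `renderMessage` signature, and the `<|role|>` token format are assumptions for the example, not the actual shapes in `llmChatTemplates.ts` — check the existing templates in that file for the real structure.

```typescript
// Hypothetical shape of a chat template (illustrative only; the real
// interface lives in src/adapters/shared/llmChatTemplates.ts).
interface ChatTemplate {
  // Renders a single message into the model's expected prompt format.
  renderMessage(role: string, content: string): string;
}

// Example template using a Llama-style role delimiter (assumed format).
const exampleTemplate: ChatTemplate = {
  renderMessage(role, content) {
    return `<|${role}|>\n${content}\n`;
  },
};

// Render a short conversation into a single prompt string.
const prompt = [
  exampleTemplate.renderMessage("system", "You are a helpful assistant."),
  exampleTemplate.renderMessage("user", "Hello!"),
].join("");

console.log(prompt);
```

Once a template like this is registered, the presets for BAM and WatsonX would reference it by name so each provider knows how to serialize chat messages for the new model.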