Cole Murray

Results: 20 comments by Cole Murray

Before

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from PIL import Image

# Load the model
model = AutoModelForCausalLM.from_pretrained(
    "vikhyatk/moondream2",
    revision="2025-01-09",
    trust_remote_code=True,
    # Uncomment for GPU acceleration & pip install accelerate
    ...
```

Following up on this: I see two paths forward here. We could a) use https://github.com/jamesmbourne/aws4-axios and implement it as an interceptor, as seen in @jarruda's commit, or b) implement using system...
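
For reference, option (a) would look roughly like the sketch below. This assumes the current aws4-axios interceptor API (the options shape has changed between major versions), placeholder region/service values, and credentials resolved from the default AWS provider chain.

```typescript
import axios from "axios";
import { aws4Interceptor } from "aws4-axios";

// Attach a SigV4 signing interceptor to a dedicated axios instance.
// Region and service here are placeholders; credentials come from the
// default AWS credential provider chain unless passed explicitly.
const client = axios.create();

client.interceptors.request.use(
  aws4Interceptor({
    options: {
      region: "us-east-1",
      service: "execute-api",
    },
  })
);

// Every request made through `client` is now signed with SigV4.
client
  .get("https://example.execute-api.us-east-1.amazonaws.com/prod/items")
  .then((res) => console.log(res.status));
```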

Looking at the list of prices (https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json), this would be broken for OpenAI as well: none of the OpenAI models have a provider/ prefix. I think it would make sense...
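
To illustrate the key-format mismatch: a lookup against model_prices_and_context_window.json that assumes every entry is keyed as provider/model will miss the OpenAI entries, which are keyed bare (e.g. "gpt-4o" rather than "openai/gpt-4o"). A hedged sketch of a lookup that tolerates both forms (`getPrice` and the `prices` object are hypothetical names, not litellm API):

```typescript
type PriceEntry = { input_cost_per_token?: number; output_cost_per_token?: number };

// `prices` stands in for the parsed model_prices_and_context_window.json.
function getPrice(prices: Record<string, PriceEntry>, model: string): PriceEntry | undefined {
  // Try the name as given (e.g. "vertex_ai/gemini-1.5-pro"),
  // then fall back to the bare model name, which is how OpenAI models are keyed.
  if (prices[model]) return prices[model];
  const bare = model.includes("/") ? model.split("/").pop()! : model;
  return prices[bare];
}
```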

I've made a pull request in litellm that will resolve this: https://github.com/BerriAI/litellm/pull/5688

There also appears to be a bug: the secretsManager permission is being applied to the logArns rather than to a specific secret ARN or "*".
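
For clarity, the intended statement would scope the Secrets Manager action to the secret's ARN (or "*"), not to the log group ARNs. A hypothetical sketch of the corrected shape, with a placeholder ARN and action that should be checked against the actual deployment code:

```typescript
// Hypothetical corrected statement: the resource should be the secret ARN
// supplied to the deployment, not the CloudWatch log ARNs.
const secretsManagerStatement = {
  Effect: "Allow",
  Action: ["secretsmanager:GetSecretValue"],
  Resource: ["arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret-abc123"],
  // or Resource: ["*"] if no specific secret ARN is available at synth time
};
```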

Fix here: https://github.com/SWE-agent/SWE-ReX/pull/260/files

@carlosejimenez when available, can you take a look?

@klieret @carlosejimenez any thoughts on this?

langfuse supports OTel, so preferably we could instrument with OTel and have langfuse as one of N different OTel processors: https://langfuse.com/docs/opentelemetry/get-started
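
A minimal sketch of that shape with the OpenTelemetry JS SDK, where Langfuse is just one OTLP exporter among the registered span processors. The endpoint path, basic-auth header format, and key placeholders are assumptions to verify against the Langfuse OTel docs, and older SDK versions register processors via `addSpanProcessor` instead of the constructor option used here.

```typescript
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor, ConsoleSpanExporter } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Langfuse exposes an OTLP/HTTP endpoint; auth is HTTP basic with the
// project's public/secret key pair (assumed format, verify against the docs).
const langfuseExporter = new OTLPTraceExporter({
  url: "https://cloud.langfuse.com/api/public/otel/v1/traces",
  headers: {
    Authorization: "Basic " + Buffer.from("pk-lf-...:sk-lf-...").toString("base64"),
  },
});

// Langfuse is one of N span processors; other backends can sit alongside it.
const provider = new NodeTracerProvider({
  spanProcessors: [
    new BatchSpanProcessor(langfuseExporter),
    new BatchSpanProcessor(new ConsoleSpanExporter()),
  ],
});

provider.register();
```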