Nitan Alexandru Marcel

Results 125 comments of Nitan Alexandru Marcel

> Omg this PR couldn't be bigger hihi. I'm not done

Can someone tell me what's a token limit? Because I don't get any =))) ![Screenshot from 2024-09-13 00-06-18](https://github.com/user-attachments/assets/1e28b750-7cc3-481d-a10a-d4049dd5fb13)

Oh, this one -_-. I was close enough tho:

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens. However, you requested 8226 tokens...
```
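The error above means the request exceeded the model's context window (8226 requested vs. an 8192 limit). A minimal sketch of pre-trimming the message list before the call; the helper names and the 4-characters-per-token heuristic are assumptions for illustration, not the model's real tokenizer:

```python
# Hypothetical helpers: estimate token usage and drop the oldest
# non-system messages until the conversation fits the context window.
def approx_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token. A real implementation
    # would use the provider's tokenizer instead.
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict], limit: int = 8192) -> list[dict]:
    trimmed = list(messages)
    total = sum(approx_tokens(m["content"]) for m in trimmed)
    # Keep the system prompt at index 0; drop the oldest turns first.
    while total > limit and len(trimmed) > 1:
        dropped = trimmed.pop(1)
        total -= approx_tokens(dropped["content"])
    return trimmed
```

This only avoids the 400 by discarding history; actual chunking (splitting one oversized input across several calls) is a separate problem, as noted below.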

> @nitanmarcel before you get too far..
> - Re-Implement OpenAI
> - Re-implement Anthropic
> - Re-Implement Llama
> - Re-Implement Bedrock
> - Re-Implement Groq
> - Re-Implement Google
> - Re-Implement NousResearch
>
> All of these can just be served...

Would be something like:

```
@process_response(processor=function_to_convert_response)
def unsupported_model_call(...):
```

> Would be something like:
>
> ```
> @process_response(processor=function_to_convert_response)
> def unsupported_model_call(...):
> ```

Though it implies that it supports the same tool format. Can create a pre_processor argument...
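A minimal sketch of what that decorator could look like, assuming the `processor`/`pre_processor` wiring floated above (the exact signatures are an assumption; `unsupported_model_call` here is a stand-in, not the real provider call):

```python
import functools

def process_response(processor, pre_processor=None):
    """Hypothetical decorator: optionally convert the arguments before
    the wrapped call, then convert the raw response after it."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if pre_processor is not None:
                # pre_processor rewrites args/kwargs into the tool
                # format the unsupported model actually expects.
                args, kwargs = pre_processor(*args, **kwargs)
            raw = func(*args, **kwargs)
            # processor converts the raw response into the common format.
            return processor(raw)
        return wrapper
    return decorator

@process_response(processor=lambda r: r.upper())
def unsupported_model_call(prompt):
    # Stand-in for the real model call.
    return f"echo: {prompt}"
```

The `pre_processor` hook is what handles the "same tool format" caveat: per-model input conversion stays next to the call site instead of leaking into shared code.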

> Can someone tell me what's a token limit? Because I don't get any =)))
>
> ![Screenshot from 2024-09-13 00-06-18](https://github.com/user-attachments/assets/1e28b750-7cc3-481d-a10a-d4049dd5fb13)

Anyway, I have this to figure out. The chunking...

> > > @nitanmarcel before you get too far..
> > > - Re-Implement OpenAI
> > > - Re-implement Anthropic
> > > - Re-Implement Llama
> > > - Re-Implement Bedrock
> > > - Re-Implement Groq
> > > - Re-Implement Google
> > > - Re-Implement NousResearch
> > >
> > > All of these can...

> Yeah, I've used instructor. But with litellm, you don't need to parse anything raw. I have no interest in maintaining transformations for so many models when it already exists,...