Uri Peled

Results: 9 issues by Uri Peled

I set the population size to 256, but after some generations it changes slightly to 257 or 255, and my fitness function only works if the population size...
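A common cause of this (a guess, since the issue text is truncated) is survivor selection that appends offspring without trimming back to the fixed size. Below is a minimal sketch of a generation step that keeps the size invariant; the `fitness`, `crossover`, and `mutate` callables are hypothetical placeholders:

```python
import random

POP_SIZE = 256  # intended fixed population size

def next_generation(population, fitness, crossover, mutate):
    """One GA step that keeps the population size exactly POP_SIZE.

    Drift to 255/257 typically happens when offspring are appended
    without trimming survivors; sorting and slicing clamps the size.
    """
    # Produce offspring
    offspring = []
    while len(offspring) < POP_SIZE:
        parent1, parent2 = random.sample(population, 2)
        offspring.append(mutate(crossover(parent1, parent2)))

    # Elitist survivor selection: merge and keep the best POP_SIZE
    merged = population + offspring
    merged.sort(key=fitness, reverse=True)
    return merged[:POP_SIZE]  # size is now invariant
```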

```python
from django.db import models
from rest_framework import serializers  # import implied by the ModelSerializer below

class Location(models.Model):
    city = models.CharField(max_length=100)

class Store(models.Model):
    name = models.CharField(max_length=100)
    location = models.ForeignKey(Location, on_delete=models.CASCADE, null=True)

class LocationSerializer(serializers.ModelSerializer):
    class Meta:
        model = Location
        fields = ('id',...
```

You can see [here](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb) the openai-cookbook example for using functions with [openai-python](https://github.com/openai/openai-python). The biggest problem with adding support for it in [OpenAIClient](https://github.com/uripeled2/llm-client-sdk/blob/main/llm_client/llm_api_client/openai_client.py) through [chat_completion](https://github.com/uripeled2/llm-client-sdk/blob/main/llm_client/llm_api_client/openai_client.py#L41) is that OpenAI functions return an object...

enhancement
help wanted
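For context, here is a minimal sketch of the problem, using the pre-1.0 openai-python interface that the cookbook notebook above demonstrates (the `get_current_weather` schema is the cookbook's example): when the model decides to call a function, the assistant message carries a structured `function_call` object rather than plain text, so `chat_completion` can no longer simply return strings.

```python
import json
import openai

# Example function schema (from the cookbook's weather example)
functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]

response = openai.ChatCompletion.create(  # pre-1.0 openai-python interface
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Tel Aviv?"}],
    functions=functions,
    function_call="auto",
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # A structured object, not text: a name plus JSON-encoded arguments
    name = message["function_call"]["name"]
    arguments = json.loads(message["function_call"]["arguments"])
else:
    text = message["content"]
```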

@aharonYK think you can take it? I think it will be similar to https://github.com/uripeled2/llm-client-sdk/issues/23

enhancement
good first issue

We can add an abstract method to BaseLLMClient like: `def list_models(**kwargs) -> list[str]`. We would then need to implement it in the different clients; we can add static constants...

enhancement
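A minimal sketch of what that could look like (the constant and the client implementation shown are illustrative assumptions, not the repo's actual code):

```python
from abc import ABC, abstractmethod

class BaseLLMClient(ABC):
    @abstractmethod
    def list_models(self, **kwargs) -> list[str]:
        """Return the names of the models this client can serve."""
        raise NotImplementedError()

# Hypothetical static constant backing one implementation
OPENAI_KNOWN_MODELS = ["gpt-3.5-turbo", "gpt-4"]

class OpenAIClient(BaseLLMClient):
    def list_models(self, **kwargs) -> list[str]:
        # A real client might instead query the provider's models endpoint
        return OPENAI_KNOWN_MODELS
```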

Implement a MosaicML BaseLLMAPIClient; find out more in the MosaicML API docs [here](https://docs.mosaicml.com/en/latest/inference.html#api-reference). You can find the steps for adding a BaseLLMAPIClient at the end of the [README.md](https://github.com/uripeled2/llm-client-sdk#readme). (A rough client skeleton is sketched under the Cohere issue below.)

enhancement
good first issue

Implement a Cohere BaseLLMAPIClient; find out more in the Cohere API docs [here](https://docs.cohere.com/reference/about). You can find the steps for adding a BaseLLMAPIClient at the end of the [README.md](https://github.com/uripeled2/llm-client-sdk#readme)

enhancement
good first issue
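For this issue (and the MosaicML one above), the client would follow roughly the shape below. This is only a sketch under stated assumptions: the `text_completion` signature is modeled on the pattern the repo's README describes, the class omits the real BaseLLMAPIClient plumbing, and the payload/response fields follow Cohere's v1/generate endpoint; verify all of these against the actual sources.

```python
import aiohttp

COHERE_GENERATE_URL = "https://api.cohere.ai/v1/generate"  # Cohere generate endpoint

class CohereClient:  # sketch only; the real class would subclass BaseLLMAPIClient
    def __init__(self, api_key: str, session: aiohttp.ClientSession,
                 default_model: str = "command"):
        self._api_key = api_key
        self._session = session
        self._default_model = default_model

    async def text_completion(self, prompt: str, model: str | None = None,
                              **kwargs) -> list[str]:
        payload = {"model": model or self._default_model, "prompt": prompt, **kwargs}
        headers = {"Authorization": f"Bearer {self._api_key}"}
        async with self._session.post(COHERE_GENERATE_URL, json=payload,
                                      headers=headers) as response:
            response.raise_for_status()
            body = await response.json()
            # v1/generate returns {"generations": [{"text": ...}, ...]}
            return [generation["text"] for generation in body["generations"]]
```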

I recently published a package, [llm-client](https://github.com/uripeled2/llm-client-sdk), that can be very helpful for adding support for running other LLM models, including OpenAI, Google, AI21, HuggingfaceHub, Aleph Alpha, Anthropic, and local models...

Do you have any plans to add configuration options that would allow the use of custom LLMs in future versions? I recently published a package [llm-client](https://github.com/uripeled2/llm-client-sdk) that can be very...