Uri Peled
Do you have, or know of, a better implementation? Can you explain or show me how you checked it?
If you only want re-assigning, you don't need this library; just use ModelSerializer.
@xalien10 I don't understand your solution.
I recently published a package, [llm-client](https://github.com/uripeled2/llm-client-sdk), that can be very helpful for adding support for other LLM providers, including OpenAI, Google, AI21, HuggingfaceHub, Aleph Alpha, Anthropic, local models...
You can use `winner = p.run(eval_genomes, num_of_generations)`.
Current code:

```python
# Check Col
for i in range(0, len(bo)):
    if bo[i][pos[1]] == num and pos[1] != i:
        return False
```

should be:

```python
# Check Col
for i in range(0, len(bo)):
    if bo[i][pos[1]]...
```
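For context, here is a minimal sketch of the full validity check this snippet comes from; the function name `valid` and the board/position layout (`bo` as a list of rows, `pos` as `(row, col)`) are assumptions based on the common Sudoku-solver tutorial structure. The point of the fix is that the column loop iterates over rows `i`, so the current cell must be skipped by comparing the row index (`pos[0] != i`), not the column index.

```python
def valid(bo, num, pos):
    """Return True if `num` can be placed at pos=(row, col) on board `bo`."""
    # Check row: iterate over columns, skip the cell itself via its COLUMN index.
    for j in range(len(bo[0])):
        if bo[pos[0]][j] == num and pos[1] != j:
            return False
    # Check column: iterate over rows, so the cell itself must be skipped
    # via its ROW index (pos[0] != i), not pos[1].
    for i in range(len(bo)):
        if bo[i][pos[1]] == num and pos[0] != i:
            return False
    # Check the 3x3 box containing pos.
    box_x = pos[1] // 3
    box_y = pos[0] // 3
    for i in range(box_y * 3, box_y * 3 + 3):
        for j in range(box_x * 3, box_x * 3 + 3):
            if bo[i][j] == num and (i, j) != pos:
                return False
    return True
```

With the original `pos[1] != i` comparison, a value already present in the column could be missed (or the cell itself wrongly counted) whenever the row index happened to equal the column index of `pos`.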
Check out llm-client, an open-source Python package for seamless integration with LLMs 🌟 https://github.com/uripeled2/llm-client-sdk