Allow models other than OpenAI for auto embeddings
Is your feature request related to a problem? Please describe. At the moment it is only possible to use OpenAI for auto embeddings. It would be great if we could use other sentence-transformer models, for example https://huggingface.co/danielheinz/e5-base-sts-en-de or https://huggingface.co/intfloat/multilingual-e5-base.
Describe the solution you'd like A convenient way would be to specify the model and all other required settings for the auto embeddings in the config, something like:
[ai.autoEmbeddings.custom]
name = 'my_custom_auto_embedding'
synchPeriodMinutes = 5
model = 'intfloat/multilingual-e5-base'
num_dimensions = 768
schema = 'public'
table = 'some_table'
column = 'some_vector_column'
query = 'somegraphqlquery'
mutation = 'somegraphqlmutation'
cpu = 128
memory = 256
gpu = ? # not sure if that is possible
If that is not possible, a model-selection dropdown in the dashboard would also be fine.
Describe alternatives you've considered An alternative would be to run a txtai API Docker container via nhost's "Run" service, create an edge function with a scheduler, and create/update the embeddings myself (roughly as in the sketch below).
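For illustration, here is a minimal sketch of that do-it-yourself alternative: a scheduled nhost serverless function that fetches rows without an embedding through Hasura, asks the txtai container for vectors, and writes them back with a GraphQL mutation. Everything specific in it is an assumption, not nhost or txtai API guarantees: the environment variable names, the table/column names (some_table, some_vector_column), the txtai /transform endpoint shape, and the Hasura vector scalar should all be checked against the actual deployment.

```typescript
// Hypothetical sketch: keep a pgvector column in sync via a txtai container.
// All names (env vars, table, columns, endpoint) are placeholders.
import { Request, Response } from 'express'

const GRAPHQL_URL = process.env.NHOST_GRAPHQL_URL ?? ''         // assumed Hasura GraphQL endpoint
const ADMIN_SECRET = process.env.NHOST_ADMIN_SECRET ?? ''       // assumed admin secret
const TXTAI_URL = process.env.TXTAI_URL ?? 'http://txtai:8000'  // assumed txtai "Run" service URL

// Small helper that sends a GraphQL request with the Hasura admin secret.
async function gql<T>(query: string, variables: Record<string, unknown>): Promise<T> {
  const res = await fetch(GRAPHQL_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-hasura-admin-secret': ADMIN_SECRET,
    },
    body: JSON.stringify({ query, variables }),
  })
  const json = await res.json()
  if (json.errors) throw new Error(JSON.stringify(json.errors))
  return json.data as T
}

export default async function handler(_req: Request, res: Response) {
  // 1. Fetch rows that do not have an embedding yet (hypothetical table/columns).
  const { some_table } = await gql<{ some_table: { id: string; content: string }[] }>(
    `query MissingEmbeddings($limit: Int!) {
       some_table(where: { some_vector_column: { _is_null: true } }, limit: $limit) {
         id
         content
       }
     }`,
    { limit: 100 }
  )

  for (const row of some_table) {
    // 2. Ask txtai for an embedding of the row's content.
    //    Endpoint and response shape assumed from txtai's API docs; verify locally.
    const embeddingRes = await fetch(
      `${TXTAI_URL}/transform?text=${encodeURIComponent(row.content)}`
    )
    const embedding: number[] = await embeddingRes.json()

    // 3. Store the vector back into the (hypothetical) pgvector column.
    //    pgvector accepts the "[0.1,0.2,...]" text form, which JSON.stringify produces.
    await gql(
      `mutation SetEmbedding($id: uuid!, $embedding: vector!) {
         update_some_table_by_pk(
           pk_columns: { id: $id }
           _set: { some_vector_column: $embedding }
         ) {
           id
         }
       }`,
      { id: row.id, embedding: JSON.stringify(embedding) }
    )
  }

  res.status(200).json({ updated: some_table.length })
}
```

The scheduling part would then just be whatever periodically calls this function's URL, for example a Hasura cron trigger. It works, but it reimplements exactly what auto embeddings already do for OpenAI, which is why first-class support for other models would be preferable.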