M Pearson
Did anyone figure out the solution to this? I am hitting this when I try to put content into tabs. If the dash_tables are rendered directly on the main page,...
Same issue here.
If you need a typical user to test-run this, I'm happy to do so. I'm currently fighting to get it linked up and working in Jupyter with a local Ollama...
Ran it with success (yay!) on our system, but got some weird results. Our data input is a set of transcripts of presentations on subsidence. Many of the labels were...
> @mepearson To me, it actually seems like the stop token is not correctly initialized/chosen since it should have stopped after `Study Results Presentation` which seems like a decent topic...
So this may have something to do with how the prompts are formatted, or with the integration of the Ollama models? I ran it again, but this time also with an OpenAI representation model...
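For reference, here's roughly what the OpenAI run looked like (a minimal sketch, assuming a recent BERTopic release where the representation model takes an `openai` client; the model name, API key, and the `docs` list are placeholders):

```python
# Minimal sketch of the OpenAI representation run (placeholder model name, key, and docs).
import openai
from bertopic import BERTopic
from bertopic.representation import OpenAI

docs = ["transcript one ...", "transcript two ..."]  # placeholder documents

client = openai.OpenAI(api_key="sk-...")  # placeholder key
representation_model = OpenAI(client, model="gpt-3.5-turbo", chat=True)

topic_model = BERTopic(representation_model=representation_model)
topics, probs = topic_model.fit_transform(docs)
```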
Sorry for the confusion - that was just a couple of columns of output from rows 1:5 of the `topic_model.get_topic_info()` dataframe, presented in a more readable form. So it's the output from...
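In case it helps, this is roughly the snippet used to pull those columns out (just a sketch; the exact column names returned by `get_topic_info()` depend on the BERTopic version):

```python
# Show rows 1:5 of the topic info table, restricted to a couple of columns.
info = topic_model.get_topic_info()
print(info.loc[1:5, ["Topic", "Name"]])
```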
It may also be that I need to grab more code from the branch. As of now I'm just using the `langchain_rep.py` file, which runs but may not run _correctly_...
@MaartenGr - yes, OpenAI is working great and is what I expected to get. It's when I use our internal Ollama models with the new langchain connection that things get...
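For context, this is roughly how the local Ollama model is being wired into the LangChain representation (a minimal sketch of how I understand the branch is meant to be used, not its exact code; the `langchain_community` Ollama wrapper, the `load_qa_chain` helper, and the model name are my assumptions):

```python
# Rough sketch: local Ollama model -> LangChain chain -> BERTopic representation.
# Model name is a placeholder; assumes the LangChain representation class
# takes a ready-made chain, as in the documentation examples.
from langchain_community.llms import Ollama
from langchain.chains.question_answering import load_qa_chain
from bertopic import BERTopic
from bertopic.representation import LangChain

llm = Ollama(model="llama2", temperature=0)      # local Ollama model
chain = load_qa_chain(llm, chain_type="stuff")   # simple "stuff" QA chain

representation_model = LangChain(chain)
topic_model = BERTopic(representation_model=representation_model)
```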