Jeff Tang

Results 47 comments of Jeff Tang

> @jeffxtang it depends. Generally, if it is possible in native Android and iOS, then it is theoretically possible in React Native.
>
> I did a quick check on...

> > > @jeffxtang it depends. Generally, if it is possible in native Android and iOS, then it is theoretically possible in React Native.
> > > I did a...

> The query from index.query(...) still wants me to use OpenAI API key? I am not sure how to avoid this. Looking for some ideas

As @imeckr said, and I...
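For reference, here is a minimal sketch of one way to run LlamaIndex without an OpenAI API key by pointing it at a local Llama model; it assumes a recent llama-index release with the `Settings` object, the `llama-index-llms-ollama` and `llama-index-embeddings-huggingface` packages installed, an Ollama server already serving a Llama model, and a placeholder `data` folder:

```python
# Sketch: run LlamaIndex queries without an OpenAI API key by swapping in
# a locally served LLM (via Ollama) and a local embedding model.
# Package choices and the "data" folder are assumptions for illustration.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

# Use a local Llama model instead of the default OpenAI LLM.
Settings.llm = Ollama(model="llama2", request_timeout=120.0)
# Use a local embedding model instead of OpenAI embeddings.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Newer releases use a query engine rather than calling index.query(...) directly.
response = index.as_query_engine().query("What does this document say about X?")
print(response)
```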

Depending on the characteristics of your historical data, traditional ML models such as regression, KNN, and decision trees (you can use [scikit-learn](https://scikit-learn.org/stable/auto_examples/index.html) to quickly train and test with your data),...
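As a rough illustration of how quickly scikit-learn lets you try a few such models, here is a minimal sketch on synthetic data (the features and target below are just placeholders for your historical data):

```python
# Sketch: quickly train and compare a few traditional ML models with scikit-learn.
# The synthetic data below stands in for your historical data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                                   # placeholder features
y = X @ np.array([1.5, -2.0, 0.7, 0.0]) + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (LinearRegression(), KNeighborsRegressor(), DecisionTreeRegressor()):
    model.fit(X_train, y_train)
    # R^2 score on the held-out split, just to get a quick sense of each model.
    print(type(model).__name__, round(model.score(X_test, y_test), 3))
```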

There's no mention of a preferred format for Llama 3. According to the [Llama 3 model card prompt format](https://llama.meta.com/docs/model-cards-and-prompt-formats/meta-llama-3/), you just need to follow the new Llama 3 format there...
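For context, the format in that model card is built from a few special tokens; here is a minimal sketch of assembling a single-turn prompt that way (the system and user strings are just placeholders):

```python
# Sketch: assemble a single-turn prompt in the Llama 3 chat format described
# in the model card linked above. The system/user strings are placeholders.
def build_llama3_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(build_llama3_prompt("You are a helpful assistant.", "Summarize the retrieved context."))
```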

I'm not aware of such a preference for Llama 3, but with some of the automated RAG evaluation frameworks (there are quite a few nice open source ones), it should be easy to compare...
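As a toy stand-in for what those frameworks do, the comparison boils down to running the same eval set under each candidate format and scoring the answers; the sketch below uses simple keyword matching, and `answer_with_format` is a hypothetical hook for your RAG pipeline:

```python
# Toy sketch of the comparison idea: a real RAG evaluation framework would
# replace keyword matching with proper metrics (faithfulness, relevance, etc.).
# answer_with_format() is a hypothetical hook that runs your RAG pipeline
# with a given prompt/chunk format and returns the generated answer.
eval_set = [
    {"question": "When was the product launched?", "keywords": ["2021"]},
    {"question": "Who maintains the repo?", "keywords": ["Jane", "Doe"]},
]

def score_format(format_name, answer_with_format):
    hits = 0
    for item in eval_set:
        answer = answer_with_format(format_name, item["question"])
        hits += all(k.lower() in answer.lower() for k in item["keywords"])
    return hits / len(eval_set)

# for fmt in ("format_a", "format_b"):
#     print(fmt, score_format(fmt, answer_with_format))
```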

Are you looking for a demo app that shows how to use Llama in Colab? If so, you can just upload the notebook HelloLlamaCloud.ipynb to Colab:

Thanks @RichmondAlake for the changes and the screenshots. I tried following the steps under the Atlas UI (there are 10!) but got lost - I had already logged in and completed the survey, then...

@nsubordin81 Thanks for all the updates. Running Llama 2 70B locally with llama.cpp used to be slow, to say the least (I haven't checked out their latest update). How fast were you able to...
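For anyone who wants to time this themselves, here is a minimal sketch using the llama-cpp-python bindings with a quantized GGUF file; the model path, thread count, and GPU layer count are placeholders, and throughput will vary a lot with quantization and hardware:

```python
# Sketch: time local Llama 2 70B generation through the llama-cpp-python
# bindings. The GGUF path, thread count, and GPU layer count are placeholders.
import time
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-70b-chat.Q4_K_M.gguf",  # placeholder quantized model
    n_ctx=2048,
    n_threads=8,
    n_gpu_layers=0,  # raise this if you can offload layers to a GPU
)

start = time.time()
out = llm("Q: What is quantization in one sentence? A:", max_tokens=64)
elapsed = time.time() - start

text = out["choices"][0]["text"]
n_tokens = out["usage"]["completion_tokens"]
print(text.strip())
print(f"{n_tokens} tokens in {elapsed:.1f}s ({n_tokens / elapsed:.2f} tok/s)")
```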

@monaalsh Sorry for the late reply. Did you run TGI on your local machine? Can you try the steps [here](https://github.com/facebookresearch/llama-recipes/blob/main/demo_apps/llama-on-prem.md#setting-up-tgi-with-llama-2)?
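Once TGI is up per those steps, a minimal sketch of querying it from Python looks like this; the host/port and generation parameters are assumptions, so adjust them to match your TGI launch settings:

```python
# Sketch: query a locally running TGI server's /generate endpoint.
# The URL and parameters are assumptions; match them to your TGI setup.
import requests

resp = requests.post(
    "http://127.0.0.1:8080/generate",
    headers={"Content-Type": "application/json"},
    json={
        "inputs": "What is good about Llama 2?",
        "parameters": {"max_new_tokens": 64, "temperature": 0.7},
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["generated_text"])
```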