Did you know you can use Local LLMs with MATLAB?


David on 13 Sep 2024 (Edited on 18 Sep 2024)

Local large language models (LLMs), such as llama, phi3, and mistral, are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™!
Read about it in the repository on GitHub.
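
If you already have Ollama running locally and the repository on your MATLAB path, basic usage looks roughly like the sketch below. It uses the repository's ollamaChat interface; the model name ("mistral" here) is whatever you've pulled with Ollama, so treat the details as assumptions and check the repository's documentation for the current API.

% Rough sketch: chat with a local model served by Ollama.
% Assumes Ollama is running on its default local port and the model
% has already been pulled, e.g. with `ollama pull mistral`.
chat = ollamaChat("mistral");
txt = generate(chat, "Explain the Fast Fourier Transform in two sentences.");
disp(txt)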
David on 13 Sep 2024
I've been 'playing' with Ollama since it was released. It's great to pull down new or updated models to see what they can do. Typically, models of 7 billion parameters or less run well on my personal laptop with 16 GB of RAM. While I haven't found one that's good enough for coding, a couple, such as Mistral and Llama3, can serve several helpful use cases, such as summarization and brainstorming. Summarization, for example, is only a few lines, as shown in the sketch below.
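
% Hedged sketch of the summarization use case, again assuming the
% repository's ollamaChat interface; report.txt is a placeholder
% for your own input file.
chat = ollamaChat("llama3");                 % any model you've pulled locally
longText = fileread("report.txt");           % placeholder input file
prompt = "Summarize the following text in three bullet points:" + newline + longText;
disp(generate(chat, prompt))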
Hans Scharler on 13 Sep 2024
Wow, this is awesome. I hadn't thought about local models. They are getting more and more capable.
