Local large language models (LLMs), such as llama, phi3, and mistral, are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™!
Read about it here:
Local LLMs with MATLAB
This is such exciting news that I can't think of a better introduction than simply sharing this development with you.
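As a quick sketch of what this enables, here is how chatting with a local model might look. This assumes the `ollamaChat` interface from the Large Language Models with MATLAB repository is on your path and that an Ollama server is running locally with the model already pulled; the model name and prompt are illustrative:

```matlab
% Assumes an Ollama server is running locally and the model has
% been pulled beforehand, e.g. `ollama pull mistral` at a terminal.
chat = ollamaChat("mistral");

% Ask the local model for a short summary (one of the use cases
% local models handle well, such as summarization).
prompt = "Summarize in one sentence why running LLMs locally " + ...
    "can be useful for engineers.";
response = generate(chat, prompt);
disp(response)
```

Because everything runs on your own machine, no API key or internet connection is needed once the model has been downloaded.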
2 Comments
I've been 'playing' with Ollama since it was released. It's great to pull down new or updated models to see what they can do. Typically, models of 7 billion parameters or fewer run well on my personal laptop with 16 GB of RAM. While I haven't found one that's good enough for coding, there are a couple, such as Mistral and Llama 3, that can serve several helpful use cases, such as summarization and brainstorming.
Wow, this is awesome. I hadn't thought about local models. They are getting more and more capable.