Ollama

A tool for running open-source LLMs locally. A model can be started with a single command, which downloads it on first use and opens an interactive prompt, for example:

$ ollama run mixtral

$ ollama run llama3
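Besides the interactive CLI, a running Ollama server also exposes a local HTTP API (by default on port 11434), which programs can call directly. The sketch below, using only the Python standard library, shows one way to query the `/api/generate` endpoint; the model name and prompt are placeholders, and it assumes `ollama serve` is running with the model already pulled.

```python
import json
import urllib.request

# Ollama's default local API endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON reply instead of a
    stream of partial responses.
    """
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server):
#   generate("llama3", "Why is the sky blue?")
```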