Similar todos
test OpenHermes 2.5 Mistral LLM
Download LM Studio
Run openllm dollyv2 on a local Linux server
try loading #monkeyisland in my own local LLM
Playing with llama2 locally and running it for the first time on my machine
✏️ wrote about running Llama 3.1 locally through Ollama on my Mac Studio. micro.webology.dev/2024/07/24…
work on setting up the system locally #labs
Starting up local server #olisto
field model installed #klimy
📝 prototyped an llm-ollama plugin tonight. the models list works and it talks to the right places; prompts need more work.
ollama is worth using if you have an M1/M2 Mac and want a speedy way to access the various llama2 models.