Similar todos
✏️ I wrote and published my notes on using the Ollama service micro.webology.dev/2024/06/11…
✏️ wrote about running Llama 3.1 locally through Ollama on my Mac Studio. micro.webology.dev/2024/07/24…
⬆️ upgraded ollama and tried out some new features
worked on my Ollama CLI tool to add history, load sessions, and format logs as markdown #research
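A hypothetical sketch of the "format logs as markdown" piece: it assumes a saved session is just a JSON list of role/content messages, which is a guess about the tool's storage format, not a description of it.

```python
# Hypothetical sketch: render a saved chat session as a markdown transcript.
# The session file layout (a JSON list of {"role", "content"} dicts) is assumed.
import json
from pathlib import Path

def session_to_markdown(session_path: str) -> str:
    messages = json.loads(Path(session_path).read_text())
    lines = ["# Ollama session log", ""]
    for msg in messages:
        lines.append(f"**{msg['role']}**:")
        lines.append("")
        lines.append(msg["content"])
        lines.append("")
    return "\n".join(lines)

print(session_to_markdown("session.json"))
```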
🤖 spent my evening writing a better console for some more advanced Ollama + Llama 3.1 projects. #research
try client-side, web-based Llama 3 in JS #life webllm.mlc.ai/
🤖 played with Ollama's tool calling with Llama 3.2 to create a calendar management agent demo #research
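A minimal sketch of what tool calling through a local Ollama server looks like, using the /api/chat endpoint directly. The add_event function and its schema are hypothetical stand-ins for whatever the calendar demo actually registered.

```python
# Sketch of Ollama tool calling against the local REST API (/api/chat).
# The add_event tool below is a hypothetical calendar tool, not the demo's real one.
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint

tools = [{
    "type": "function",
    "function": {
        "name": "add_event",  # hypothetical calendar tool
        "description": "Add an event to the calendar",
        "parameters": {
            "type": "object",
            "properties": {
                "title": {"type": "string"},
                "date": {"type": "string", "description": "ISO date, e.g. 2024-10-01"},
            },
            "required": ["title", "date"],
        },
    },
}]

payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Schedule lunch with Sam next Tuesday"}],
    "tools": tools,
    "stream": False,
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120).json()

# If the model decided to call a tool, the calls show up on the assistant message.
for call in resp["message"].get("tool_calls", []):
    fn = call["function"]
    print(fn["name"], json.dumps(fn["arguments"]))
```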
read up on LLM embeddings to start building something new with Ollama
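For context, a quick sketch of pulling embeddings from a local Ollama server via its /api/embeddings endpoint; nomic-embed-text is just an assumed embedding model here, not necessarily the one this note is about.

```python
# Sketch: get an embedding vector for a piece of text from a local Ollama server.
import requests

def embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    resp = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]

vec = embed("notes on running Llama 3.1 locally")
print(len(vec))  # dimensionality depends on the embedding model
```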
🤖 played with Aider, and it's mostly working with Ollama + Llama 3.1 #research
⚙️ updated #dotfiles to add more ollama download options to make it easier to sync
I used Claude 3.5 Projects + Artifacts to help refactor most of the Ollama + Llama 3.1 #research project. I would call it a writing bot, but I'm not building it to automate writing. It's mostly a content wrapper around the chat interface, but it's good at generating code and building off of chat history.
🤖 more work with Ollama and Llama 3.1, building a story writer as a good-enough demo. #research
created several personal AI agents via ollama #leifinlavida
prototype a simple autocomplete using local llama2 via Ollama #aiplay
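A rough sketch of the autocomplete idea: send the text typed so far to a local llama2 model through Ollama's /api/generate endpoint and keep only a short continuation. The parameter choices below are guesses, not the original prototype's settings.

```python
# Sketch: naive autocomplete by asking a local llama2 model to continue the prefix.
import requests

def autocomplete(prefix: str, model: str = "llama2") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": prefix,
            "stream": False,
            "options": {"num_predict": 20, "temperature": 0.2},  # short, conservative completion
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(autocomplete("def fibonacci(n):"))
```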
Made mockups for demo #olisto
📝 prototyped an llm-ollama plugin tonight. The models list works and it talks to the right places; prompts need more work.
Researched frontend options for site #olisto
Wrote my weekly report in #olisto