Similar todos
installed Cody in Cursor so that I can use Llama 3.1 and Gemma 2 via Ollama #astronote #leifinlavida
check out Llama 3.1 #life
trying out cursor #knifegeek
🤖 got llama-cpp running locally 🐍
✏️ wrote about running Llama 3.1 locally through Ollama on my Mac Studio. micro.webology.dev/2024/07/24…
try client-side, web-based Llama 3 in JS #life webllm.mlc.ai/
switch #therapistai to Llama 3.1
#thecompaniesapi ran my phi3-128k flow using llama3.1 and the results are mind-blowing; it's insane how well Llama preserves context and original purpose even when supplied with thousands of tokens. Also shipped multiple hotfixes in the robot UI; about to merge a month of work and then hop on fine-tuning
explore cursor composer 🤓
got llamacode working locally and it's really good
get cursor pro #life
Playing with llama2 locally and running it for the first time on my machine
Explore Llama 3 8B for embeddings
Giving Cursor.sh a test drive #chores
prototype a simple autocomplete using local llama2 via Ollama #aiplay
🤖 played with Aider and it's mostly working with Ollama + Llama 3.1 #research
realize #therapistai with Llama3-70B actually understands WTF is going on now
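Several of these todos revolve around talking to a locally running model through Ollama. As a rough sketch of what that looks like, here is a minimal Python client for Ollama's `/api/generate` endpoint, assuming the default server on `localhost:11434` and a pulled model; the model name and prompt are illustrative, not taken from any of the todos above:

```python
import json
import urllib.request

# Default endpoint exposed by a local `ollama serve` instance.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    # Minimal non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    # POSTs the prompt to the local Ollama server and returns the reply text
    # from the "response" field of the JSON body.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Live call requires a running server with the model pulled, e.g.:
#   ollama pull llama3.1
#   python this_script.py
# print(generate("llama3.1", "Summarize Llama 3.1 in one sentence."))
print(build_payload("llama3.1", "hello"))
```

The same request shape works for any model these todos mention (llama2, gemma2, phi3) by swapping the `model` string, which is what makes the Ollama workflow convenient for this kind of experimentation.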