Similar todos
installed Cody in Cursor so that I can use Llama 3.1 and Gemma 2 via Ollama #astronote #leifinlavida

check out Llama 3.1 #life

trying out cursor #knifegeek

🤖 got llama-cpp running locally 🐍

✏️ wrote about running Llama 3.1 locally through Ollama on my Mac Studio. micro.webology.dev/2024/07/24…

finally install Ollama with Llama 3 #life
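For entries like the one above, the install usually comes down to a couple of commands. A minimal sketch, assuming a Linux or macOS machine and Ollama's official install script (the exact model tag may differ on your setup):

```shell
# install the Ollama runtime via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# download the Llama 3 weights, then start an interactive chat session
ollama pull llama3
ollama run llama3
```

`ollama run` also starts the local server on port 11434 if it isn't already running, which is what editor integrations talk to.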

try client side web based Llama 3 in JS #life webllm.mlc.ai/

switch #therapistai to Llama 3.1

#thecompaniesapi run my phi3-128k flow using llama3.1 and the results are mind-blowing; it's insane how good Llama is at preserving context and the original purpose even when supplied with thousands of tokens. Also shipped multiple hotfixes in the robot UI; about to merge a month of work and then hop on fine-tuning

get amazed by cursor, late to the party? 🎉

explore cursor composer 🤓

got llamacode working locally and it's really good

get cursor pro #life

Playing with llama2 locally and running it for the first time on my machine

Explore llama 3 8b for embeddings
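Exploring Llama 3 8B for embeddings with a local Ollama server can be sketched in a few lines. This assumes Ollama's documented `/api/embeddings` endpoint on the default port 11434 and a pulled `llama3:8b` model; the helper names are mine:

```python
import json
from urllib import request

# default local Ollama embeddings endpoint (assumed; adjust if your server differs)
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_embedding_request(model: str, prompt: str) -> dict:
    """Build the JSON payload the Ollama embeddings endpoint expects."""
    return {"model": model, "prompt": prompt}

def embed(text: str, model: str = "llama3:8b") -> list:
    """POST the text to a locally running Ollama server; return the embedding vector."""
    payload = json.dumps(build_embedding_request(model, text)).encode()
    req = request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["embedding"]
```

Calling `embed("hello world")` against a running server returns one float vector per input, which you can feed into any cosine-similarity search.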

Giving Cursor.sh a test drive #chores

prototype a simple autocomplete using local llama2 via Ollama #aiplay
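A simple autocomplete prototype like the one above can lean on Ollama's documented `/api/generate` endpoint: send the text before the cursor as the prompt and cap the completion length. A minimal sketch, assuming a local server on the default port and a pulled `llama2` model (function names are illustrative):

```python
import json
from urllib import request

# default local Ollama text-generation endpoint (assumed; adjust to your setup)
OLLAMA_GENERATE = "http://localhost:11434/api/generate"

def build_completion_request(prefix: str, model: str = "llama2") -> dict:
    """Build the generate payload: stream=False returns one JSON object
    instead of chunks, and a small num_predict keeps completions short."""
    return {
        "model": model,
        "prompt": prefix,
        "stream": False,
        "options": {"num_predict": 16},  # short suggestions suit autocomplete
    }

def autocomplete(prefix: str, model: str = "llama2") -> str:
    """Return the model's continuation of the text before the cursor."""
    data = json.dumps(build_completion_request(prefix, model)).encode()
    req = request.Request(
        OLLAMA_GENERATE, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Wiring `autocomplete()` to an editor keystroke with a short debounce is enough for a workable local prototype.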

🤖 played with Aider and it's mostly working with Ollama + Llama 3.1 #research

try groq.com/ #fajarsiddiq

realize #therapistai with Llama3-70B actually understands WTF is going on now