Similar todos
ollama is worth using if you have an M1/M2 Mac and want a speedy way to access the various Llama 2 models.
livecode even more WebRTC stuff
quick livecode; get useAuth almost working as a lib
livecode some basic gatsby setup #serverlesshandbook
livecode some research #learnwhileyoupoop
livecode some progress #threadcompiler
get a NoCode client #labs
#thecompaniesapi ran my phi3-128k flow using llama3.1 and the results are mind-blowing; it's insane how good Llama is at preserving context and original purpose even when supplied with thousands of tokens. Also shipped multiple hotfixes in the robot UI; about to merge a month of work and then hop on fine-tuning
livecode progress on my little CMS #codewithswiz
more fun with Llama 2 and figuring out how to better control/predict stable output
livecode messing around with Remix
livecode messing around with nextjs #codewithswiz