Similar todos

Release #writeas Linux app

set up linux VM for testing

Got whatdo and lets-play running locally

setup local ssl environment #pocketpager

setup local #chocolab dev environment

#contracting Install Swoole on local environment and staging server

Shipped BoltAI v1.13.6, use AI Command with local LLMs via Ollama 🥳 #boltai

get #remoteok working on M1X local dev

Run Telegram client in cloud #telepost

got llamacode working locally and it's really good

test on linux #menubar

Over the past three and a half days, I've dedicated all my time to developing a new product that leverages a locally running Llama 3.1 for real-time AI responses. It's now available on Macs with M-series chips: completely free, local, and incredibly fast. Get it: Snapbox.app #snapbox

setup local dev env #sanderfish

setup local dev env #chime

setup linode server for client #freelancegig

hide Linux user on #nomads /open

setup pm2 on server #myoutfit

get #inflationchart running on local dev M1X

get #ideasai running on local dev M1X

ember local install