Jay
@jayair
Founder @SST_dev
The juice is not the fruit. You try something, you like it. You want it more. But you only want the good parts. So you spend all your time trying to squeeze the essence out of the thing. Now you have this new concentrated thing. What you think you wanted all along. And you…
the term vibe coding has done more harm than good
i wish we'd stop calling it vibe coding

that term roughly translates to "your head of marketing typing random shit into a chatbot and not really knowing if the output is correct"

that's not what i do
It works harder if you call it daddy, just saying
quick saturday release - opencode 0.3.77 has github integration

in any github issue you can run `/opencode fix me daddy` and it'll run and submit a PR if there are any code changes

get started with `opencode github install`
you can't push to prod on Friday cos it's too close to the weekend

but you can't push to prod on Thursday cos it's too close to Friday

also can't push to prod on Wednesday cos it's too close to Thursday

probably can't push to prod on Tuesday cos it's too close to Wednesday…
claude code shipped subagents today so i guess we gotta too

...and it's done - available in opencode 0.3.65

i made a subagent to teach me a lesson if i get too cocky - you all know i need it
Every time I see a session share link from OpenCode I'm amazed others don't make this kind of behavior more commonplace. Such good UX
new qwen coder model is available in opencode via openrouter

quick test worked well, link in reply
Cmd+Esc to launch opencode for the current file

Cmd+Opt+K to inject the selected file/lines into the prompt
opencode now integrates with vscode, cursor, windsurf, etc

it'll be aware of what file you're in and you can easily inject selected lines into the prompt

start opencode inside your editor and it'll install the extension
opencode making a pong game in vite+react using (4bit) qwen/qwen3-235b-a22b-2507 locally, served by lmstudio. It used like 130GB of RAM, 0 issues with tool calls. This is completely usable locally now. Whether it's at claude level or not, idk yet, but I've no doubt we'll be…
in opencode v0.3.53 we implemented message queuing

so you can send a message while opencode is working and it'll update the LLM on the next opportunity - basically lets you re-steer it in a different direction

should feel pretty natural
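the idea behind message queuing can be sketched roughly like this - a toy Python sketch, not opencode's actual implementation (opencode is TypeScript, and the `Session` class and method names here are made up for illustration):

```python
from collections import deque

class Session:
    """Toy sketch: user messages that arrive while the agent is busy
    are queued, then injected into the conversation at the next
    opportunity between LLM steps."""

    def __init__(self):
        self.queue = deque()   # messages waiting to be delivered
        self.history = []      # the conversation sent to the LLM
        self.busy = False      # True while the agent is mid-task

    def send(self, text):
        if self.busy:
            # agent is working: don't interrupt, queue for later
            self.queue.append(text)
        else:
            self.history.append(("user", text))

    def next_turn(self):
        # called between agent steps: flush anything the user queued
        while self.queue:
            self.history.append(("user", self.queue.popleft()))

s = Session()
s.busy = True
s.send("actually, use TypeScript instead")  # queued, not delivered yet
s.next_turn()                               # now it reaches the LLM
```

the point of the design is that you never interrupt an in-flight tool call; the re-steer lands at the next natural boundary, which is why it "should feel pretty natural"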
in v0.3.45 you can finally....copy text

can you believe we went this long without implementing it

btw this was supposed to be adam's task but instead i had to spend my sunday afternoon implementing it because he is a lazy ass
we updated models.dev to include vercel ai gateway so you can use it with opencode

- single api key, access many models
- pass through pricing, no markup
- if you have a team plan you get $5 a month in credit
I would be more annoyed with AWS cloning other products but honestly they are bad at doing that too.
extremely good news - something like opencode is now usable on tier 2
We've increased Tier 1-4 rate limits for Claude Sonnet 4 on the Anthropic API:

Tier 1: 20K → 30K ITPM, 8K OTPM
Tier 2: 40K → 450K ITPM, 16K → 90K OTPM
Tier 3: 80K → 800K ITPM, 32K → 160K OTPM
Tier 4: 200K → 2M ITPM, 40K → 400K OTPM
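rough math on why the tier 2 bump matters for an agent like opencode - note the 30K input-tokens-per-turn figure is my own ballpark assumption (coding agents resend large contexts on every call), not a number from the announcement:

```python
# back-of-envelope: agent turns per minute under an input-tokens-per-minute cap
# ASSUMPTION: ~30,000 input tokens per agent turn is a made-up ballpark,
# not an official figure
def turns_per_minute(itpm: int, tokens_per_turn: int = 30_000) -> int:
    return itpm // tokens_per_turn

old_tier2 = turns_per_minute(40_000)    # roughly 1 turn a minute before
new_tier2 = turns_per_minute(450_000)   # roughly 15 turns a minute after
```

under that assumption tier 2 goes from about one agent turn a minute to about fifteen, which is why something like opencode is now usable there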
The more impressive stat here is the 200 releases. 7 releases a day, for 27 days.
it's been 27 days since we launched opencode and in that time it hit 13K github stars, 200K downloads and 200 releases

it's the second fastest growing project i've been a part of

first fastest was the line to bang ur mom