Faisal Amir
@urmauur
UI Engineer - Making pretty things do useful stuff
I'm glad to announce our latest research: 🍎 Lucy. Lucy is a model trained specifically to optimize machine-generated task vectors. Let's say you have a reasoning model where everything happens inside: <think>... etc ...</think> Essentially, the moment the <think> tag ends,…
Introducing Lucy: a 1.7B model that Googles for you. It's an agentic-search model that can even run on your phone.
- Agentic search on tap
- Lucy calls tools (<think></think>-aware)
- Fits in your pocket: runs on CPU or mobile
Under the hood:
- Built on @Alibaba_Qwen's Qwen3-1.7B…
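Not from the original thread, but as an illustration of "runs on CPU": a minimal sketch of loading a small quantized GGUF build with llama-cpp-python. The filename below is a placeholder; point it at whatever quantized file you actually download.

```python
# CPU-only sketch using llama-cpp-python (pip install llama-cpp-python).
# The GGUF path is a placeholder for whichever quantized build you download.
from llama_cpp import Llama

llm = Llama(
    model_path="./lucy-1.7b-q4_k_m.gguf",  # hypothetical filename
    n_ctx=4096,   # context window
    n_threads=4,  # CPU threads; tune for your device
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Find me the latest Qwen3 release notes."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```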
what a time to be alive: a local 4B model (Jan-nano) + a Web Search tool is already amazing for getting the info you need with no overhead 🚀
Jan is an open source alternative to ChatGPT that runs 100% offline on your computer
BOOOM! transformers now has a baked-in HTTP server with an OpenAI-spec-compatible API. Launch it with `transformers serve` and connect your favorite apps. Here I'm running @jandotai with local transformers and hot-swappable models. There is preliminary tool-call support as well!
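For anyone who wants to try it from code, here's a minimal sketch of talking to a locally running `transformers serve` instance through its OpenAI-compatible endpoint. The port (8000) and model name are assumptions; adjust them to your setup.

```python
# Minimal sketch (not from the original post): call a local `transformers serve`
# instance via its OpenAI-compatible API. Port and model name are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

resp = client.chat.completions.create(
    model="Qwen/Qwen3-4B",  # placeholder: whichever checkpoint the server loads
    messages=[{"role": "user", "content": "Say hello from my local server."}],
)
print(resp.choices[0].message.content)
```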
Open-source models can do Deep Research too. This video shows a full research report created by Jan-nano. To try it:
- Get Jan-nano from Jan Hub
- In settings, turn on MCP Servers and @serperapi
- Paste your Serper API key
Your deep research assistant is ready.
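For context (not part of the original post), the web-search step boils down to a request against Serper's search endpoint with your API key, which is roughly what an MCP search tool does on the model's behalf. A hedged sketch, assuming the standard google.serper.dev API shape:

```python
# Sketch of a direct Serper search call, approximately what a web-search MCP tool
# performs behind the scenes. Requires the `requests` package and SERPER_API_KEY set.
import os
import requests

resp = requests.post(
    "https://google.serper.dev/search",
    headers={"X-API-KEY": os.environ["SERPER_API_KEY"], "Content-Type": "application/json"},
    json={"q": "Jan-nano deep research"},
    timeout=30,
)
resp.raise_for_status()
for hit in resp.json().get("organic", [])[:3]:
    print(hit["title"], "-", hit["link"])
```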
@jandotai It works wonderfully, performance is better, and it deserves its spot at the top on GitHub. The result shown compares running with the CPU's integrated GPU turned on versus off.
Right now, VSCode Dark+ is my favorite code block theme on the @jandotai app 🚀

Just saw Menlo/Jan-nano trending on the Hugging Face landing page. Amazing work, @huggingface @jandotai @menloresearch

Jan v0.6.0 is here: It's a whole new vibe! Jan's been redesigned to be faster, cleaner, and easier to use. It also lays the groundwork for upcoming tool use. You can now create assistants with custom instructions and settings. Update your Jan or download the latest.
Meet Jan-nano, a 4B model that outscores DeepSeek-v3-671B using MCP. Built on Qwen3-4B with DAPO fine-tuning, it handles:
- real-time web search
- deep research
Model + GGUF: huggingface.co/collections/Me…
To run it locally:
- Install Jan Beta: jan.ai/docs/desktop/b…
- Download…
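Once the model is downloaded, you can also reach it through Jan's local OpenAI-compatible API server. A minimal sketch, assuming the server is enabled, listening on localhost:1337, and exposing the model under an id like "jan-nano" (all three are assumptions; check Jan's Local API Server settings):

```python
# Sketch: query Jan-nano through Jan's local OpenAI-compatible server.
# Port 1337 and the "jan-nano" model id are assumptions; confirm them in Jan's settings.
import requests

resp = requests.post(
    "http://localhost:1337/v1/chat/completions",
    json={
        "model": "jan-nano",  # placeholder id for the downloaded model
        "messages": [{"role": "user", "content": "What is DAPO fine-tuning?"}],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```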
MCP Servers are now in 👋 Jan Beta!
Where do I enable MCP? Settings > MCP Servers: toggle or pick one.
Tool use for local models: Settings > Model Providers > llama.cpp > Models > Edit > switch "Tool" on/off.
This is a beta build - get it here: discord.com/channels/11071…
Hey to the new folks 👋 Quick heads-up: Jan's part of a bigger update; soon you'll be able to try out MCPs + build your own assistants. To test the beta version, join our Discord - link's in the profile.
Jan's beta drops soon, with the new design and a few features we think you'll like. Join us on Discord to try it early; the first announcement goes there. 🤙 discord.gg/Exe46xPMbK
Search across threads is coming. Built by @ivanleomk, who dropped a quick video demo so you can see it in action.
New local API server logs are on the way. Logs won't disappear when you close Jan. If something breaks, the history's still there when you come back.