Dani Yogatama
@DaniYogatama
CEO @RekaAILabs, Associate Professor @CSatUSC
Grok 4 dropped some impressive numbers, but its live search feature is still terribly bad in our evals! Barely any improvement over Grok 3, and still the worst of the big players by far. Oh, and it's also the second most expensive after Claude now!
our reka research summarizes it best
🎉 Big news! We've raised $110M from new and existing investors, including @nvidia & @Snowflake This funding reinforces our position at the forefront of AI innovation, with exciting releases like Reka Vision, Reka Research & Reka Flash 3.1 Read more 👇 reka.ai/news/reka-secu…
🚀 New demo alert! Build your own event finder with Reka Research 🧠🔍 ✅ Structured output to cleanly fill UI elements 🎯 Domain-specific search with web_search config Try it now 👉 github.com/reka-ai/api-ex…
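The two features the demo calls out — structured output and domain-restricted search — can be sketched as a request payload. Everything below (the model name, `response_format`, `web_search`, `allowed_domains`) is an illustrative assumption, not the documented Reka API; check docs.reka.ai and the linked repo for the team's actual examples.

```python
import json

# Hypothetical request payload for an event-finder built on Reka Research.
# All field names (model, response_format, web_search, allowed_domains)
# are assumptions for illustration -- see docs.reka.ai for the real schema.
payload = {
    "model": "reka-research",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Find AI meetups in San Francisco this week."}
    ],
    # Structured output: request JSON matching a schema so results can
    # cleanly fill UI elements (event cards, dates, links).
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "events",
            "schema": {
                "type": "object",
                "properties": {
                    "events": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "title": {"type": "string"},
                                "date": {"type": "string"},
                                "url": {"type": "string"},
                            },
                            "required": ["title", "date", "url"],
                        },
                    }
                },
                "required": ["events"],
            },
        },
    },
    # Domain-specific search: restrict the agent's web search to
    # event-listing sites.
    "web_search": {"allowed_domains": ["meetup.com", "eventbrite.com"]},
}

print(json.dumps(payload, indent=2))
```

Sending it would be an ordinary authenticated POST with your API key in the `Authorization` header; the payload construction above is the part the demo is about.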
the reason the US is behind on open source models is structural. 1. new labs are not backed by hyperscalers and large vcs. they chose to continue to invest in openai and anthropic instead, which adopted a closed source strategy. deepseek, moonshot, and minimax are amazing. they…

Reka Research is our AI agent that scours the web to answer your toughest questions. Ready to unlock its full potential? Learn directly from the team who built it!
everyone claims they are trying to build superintelligence. very few are exploring what the missing ingredients are. it's more or less the same thing repeated in several places.
You'd think the big AI labs—OpenAI, Anthropic, Google, xAI—would be neck and neck on price/performance by now. Not even close for web search models! Some wild surprises from our evals: 1⃣ Claude is shockingly overpriced. Sonnet 4 costs 4-5x more than GPT-4o and Gemini Flash (all…
looking for an agent which excels at questions that require dozens of sources and delivers accurate responses with reasoning traces in a few minutes? Reka Research is here for you. start building today: docs.reka.ai/quick-start
3% of developers are using or considering using Reka. Not bad considering how much more PR some of the names around us get!
Developers consider an average of 4.7 LLM families with OpenAI GPT/o, Google Gemini and Anthropic Claude being the most popular, and DeepSeek as the top open-weights choice
top-tier performance WITHOUT top-tier pricing. The cost-performance bar just got higher AND cheaper! 🚀
a quick recap on what we released last week at @RekaAILabs - > reka flash 3.1 (open-sourced) - post-trained with some cool RL tricks > reka quant - a quantization library that achieves near-lossless quantization to 3.5-bit. it's open-sourced :) > reka research agent -…
holy shit they one-shotted this. for context: perplexity, o4 deep research, and grok deep search all fail. exa recently got it working after failing on this for months. extremely bullish! great work reka
🚀 Meet Reka Research––agentic AI that 🤔 thinks → 🔎 searches → ✏️ cites across the open web and private docs to answer your questions. 🥇 State-of-the-art performance, available now via our API and Playground!
What a week! Recap of everything we released: ⚡️Reka Flash 3.1⚡️: Open source 21B reasoning model 🗜️Reka Quant🗜️: Open source quantization library 🔎Reka Research🔎: Agentic search-augmented chat 👁️Reka Vision👁️: Visual understanding & search platform More details in thread 👇
Very excited to lead the continued pre-training (Math, Coding & Long-Context) and long-reasoning cold start & RL of this model. Proud moment seeing it go open source!🚀(1/2)
📢 We are open sourcing ⚡Reka Flash 3.1⚡ and 🗜️Reka Quant🗜️. Reka Flash 3.1 is a much improved version of Reka Flash 3 that stands out on coding due to significant advances in our RL stack. 👩💻👨💻 Reka Quant is our state-of-the-art quantization technology. It achieves…
aitor is our quantization expert and so much more. he's the reason you only need 9.25GB to run reka flash 3.1. also check out reka quant library if you want to apply the method to other open source models out there.
For our third day of releases we are open sourcing some of our building blocks! I'm particularly happy to be open-sourcing RekaQuant 🗜️, part of our internal quantization stack that I led last year. Short thread on our approach to quantization 🧵1/n
we've recently updated our website. check it out (reka.ai) and try our solutions at app.reka.ai
open sourcing our building blocks reka flash 3.1 and reka quant. read more here.
better and cheaper is good. sign up now to get free API credits docs.reka.ai/quick-start

use our state-of-the-art research agent via api or try here app.reka.ai/research