Josh Gordon
@random_forests
Open source ML @ Google NYC
New function calling guide with Gemma 3 & KerasHub -- really nice to be able to run locally.
Function calling guide for Keras!
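Here's roughly what the local workflow looks like (a minimal sketch, not the guide's code; the Gemma 3 preset name and the JSON tool-call format below are assumptions, and the guide may use a different prompt template):

```python
# Minimal sketch of prompt-based function calling with Gemma 3 in KerasHub.
# Assumptions: the "gemma3_instruct_1b" preset name and the JSON tool-call format.
import json
import re

import keras_hub

model = keras_hub.models.Gemma3CausalLM.from_preset("gemma3_instruct_1b")

def get_weather(city: str) -> str:
    # Hypothetical local tool the model can "call".
    return f"It is sunny in {city}."

TOOLS = {"get_weather": get_weather}

prompt = (
    "You can call a tool by replying with JSON like "
    '{"tool": "get_weather", "args": {"city": "..."}}.\n'
    "Available tools: get_weather(city).\n"
    "User: What's the weather in Paris?\n"
)

reply = model.generate(prompt, max_length=256)

# Grab the braced JSON blob in the reply (assumes it's the only one) and
# dispatch it to the matching local function.
match = re.search(r"\{.*\}", reply, re.DOTALL)
if match:
    call = json.loads(match.group(0))
    print(TOOLS[call["tool"]](**call["args"]))
```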
KerasHub now includes HGNetV2! We’re excited to bring the high-efficiency, high-accuracy HGNetV2 image classification backbone into KerasHub’s model family. Model details and quickstart notebook are available on Kaggle: kaggle.com/models/keras/h… #keras #kerashub #HGNet
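Loading it is a one-liner (a minimal sketch; the preset string below is a placeholder, the real preset names are on the Kaggle model page):

```python
# Quickstart-style sketch for HGNetV2 in KerasHub.
# The preset name "hgnetv2_b4" is a placeholder, not a confirmed preset id.
import numpy as np
import keras_hub

classifier = keras_hub.models.ImageClassifier.from_preset("hgnetv2_b4")

# A random image stands in for real data; KerasHub presets bundle their own
# preprocessing (resizing / rescaling).
images = np.random.uniform(0, 255, size=(1, 224, 224, 3)).astype("float32")
preds = classifier.predict(images)
print(preds.shape)  # (1, num_classes)
```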
The JAX team is hosting a dinner / networking event during ICML on Thursday. Join us for an evening of food, drinks, and discussion of all things JAX. @SingularMattrix and other JAX team members will be attending. Please register early as capacity is limited. RSVP:…
🎬 Generate videos with the Gemini CLI
Add:
🧑💻 GenMedia MCP servers for Imagen, Veo & Chirp
📝 A GEMINI.md file explaining your ✨ creative process
And you too can take 🙀 Rusty the Cat on an adventure ⬇️
Full tutorial in the vid ⬇️
Some interesting Gemini CLI use cases and tutorials 🧵⬇️
We’re building Keras Recommenders at Google, and would love to hear from people working in RecSys to understand what they need. What features matter most to you? DMs are open, feel free to reach out! keras.io/keras_rs/
People are getting monster offers to work on LLMs behind closed doors, and meanwhile this team from Stanford is doing everything in the open and making it available to everyone. Open science all the way!
Open development of language models in action!
It was *so good* seeing everyone at the JAX and OpenXLA DevLab this week! Best event in a long time. Let’s do it again!
I am PUMPED to finally share what we’ve been working on:
🖥️ Introducing the Gemini CLI!
It can code, sure, but with access to your system shell, files and MCP servers, it can also:
👩🔬 Do research
💽 Organise your MP3s
Resolve rebases
🔬 Even strace that weird hung process…
Here's more detail on how to load a Hugging Face checkpoint into a KerasHub model. Thanks for the walkthrough, @yufengg, @divyasheess, and @monicadsong! developers.googleblog.com/en/load-model-…
You can find performance & scale optimized JAX models in MaxText and MaxDiffusion: * github.com/AI-Hypercomput… * github.com/AI-Hypercomput… You can also use Keras / JAX to tune many Hugging Face Transformers model checkpoints by loading them into a KerasHub model. It's pretty cool!…
KerasHub supports loading checkpoints of many model architectures from HuggingFace. So if there is a model checkpoint on HF that is not in the Keras format, you can easily load it in Keras and use it as a regular Keras model with any backend (JAX, TF, or PyTorch).
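Roughly what that looks like (a minimal sketch; the repo id is just one example of a supported architecture, and gated models need the usual Hugging Face access setup):

```python
# Minimal sketch: load a Hugging Face checkpoint into a KerasHub model and run
# it on the backend of your choice. Set the backend before importing keras_hub.
import os

os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow" / "torch"

import keras_hub

# The "hf://" prefix tells KerasHub to pull the checkpoint from the
# Hugging Face Hub and convert it into the matching KerasHub architecture.
model = keras_hub.models.GemmaCausalLM.from_preset("hf://google/gemma-2b")
print(model.generate("The JAX ecosystem is", max_length=30))
```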
If you're running JAX and you need to grab a model checkpoint from HuggingFace, KerasHub has you covered. Load, fine-tune, quantize, export for inference.
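A sketch of that whole flow on the JAX backend (the preset, toy data, and hyperparameters are placeholders, not a recipe):

```python
# Sketch of the load -> fine-tune -> quantize -> export flow.
import os

os.environ["KERAS_BACKEND"] = "jax"

import keras_hub

model = keras_hub.models.GemmaCausalLM.from_preset("hf://google/gemma-2b")

# CausalLM tasks can fit directly on raw strings; the bundled preprocessor
# handles tokenization and packing.
data = ["KerasHub makes it easy to ...", "JAX is great for ..."]
model.fit(data, epochs=1, batch_size=2)

# Post-training int8 quantization, then export an inference artifact via
# Keras 3's generic export API (serving details vary by model and stack).
model.quantize("int8")
model.export("gemma_serving_artifact")
```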
Really excited for the upcoming JAX & OpenXLA DevLab this Monday! This is a small-group deep dive on the latest techniques, with breakouts on special interests. We'll record the tutorials for everyone, too. Opportunity: If you're interested in *healthcare research* with JAX /…