Rishabh Anand (in SF)
@rishabh16_
cooking antibodies @prescientdesign • backpropagating @yale • geometric DL + generative modelling for proteins & RNA • prev @nusingapore @cambridge_cl 🧬🛠
🚨 My Graph Deep Learning extended tutorial has a new home!!! It's finally up on my personal website: rish-16.github.io/posts/gnn-math/ Do RT, share, and tag your GDL buddies if you find it useful 🙌🏻🙌🏻
🚨NEW WEEKEND READING🚨 I'm publishing "Graph Neural Networks for Novice Math Fanatics" – a primer on the math behind GNNs using colourful drawings & diagrams rish16.notion.site/Graph-Neural-N… (I hope this becomes the definitive guide to GDL for those hoping to enter the field!!!)
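For readers skimming the primer above: the core of most GNNs is a message-passing layer, where each node aggregates its neighbours' features and applies a shared transform. A minimal numpy sketch (mean aggregation with self-loops; all names and sizes here are illustrative, not from the tutorial):

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One GNN layer: each node averages its neighbours' features
    (plus its own via a self-loop), then applies a shared linear
    map followed by a ReLU nonlinearity."""
    A_hat = A + np.eye(A.shape[0])          # adjacency with self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees
    H_agg = (A_hat @ H) / deg               # mean aggregation over neighbours
    return np.maximum(H_agg @ W, 0.0)       # linear transform + ReLU

# toy graph: 3 nodes in a path 0-1-2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.randn(3, 4)   # 4-dim input node features
W = np.random.randn(4, 8)   # weight matrix mapping 4 -> 8 dims
H_next = message_passing_layer(A, H, W)
print(H_next.shape)  # (3, 8)
```

Stacking such layers lets information propagate over multi-hop neighbourhoods, which is the intuition the primer builds up mathematically.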
5 months from funding to shipping our first frontier model. Latent-X achieves state-of-the-art hit rates for macrocycles and mini-binders, with picomolar binding affinities — a breakthrough in de novo protein binder design. Available now on our no-code platform!
🤗⚛️ Join the @entalpic_ai Team ⚛️🤗 At Entalpic, we combine MLIPs, generative models, quantum simulations and lab experiments to optimise carbon-intensive industrial processes and develop next-generation materials. ⚠️ We are especially looking for ⚠️ * (Senior and Intern)…
Linkedinfluencers are gonna LOVE writing their essays about this “case study” in marketing
Thank you for your interest in Astronomer.
In 1965, Margaret Dayhoff published the Atlas of Protein Sequence and Structure, which collated the 65 proteins whose amino acid sequences were then known. Inspired by that Atlas, today we are releasing the Dayhoff Atlas of protein sequence data and protein language models.
Apply for the AITHYRA-CeMM International PhD Program! 15-20 fully funded PhD fellowships available in Vienna in AI/ML and Life Sciences Deadline for applications: 10 September 2025 apply.cemm.at
🤯Three (!!!) new papers today in @ScienceMagazine on the application of generative AI for the de novo design of peptide/HLA binding molecules! Completely unique 3D structure and binding mode compared with natural TCRs and TCR mimics! Links to papers 👇
AI4Mat is back for NeurIPS! Time to crystallize those ideas and make a solid-state submission by August 22, 2025 💪 New this year: opt in your work for our Research Learning from Speaker Feedback program -- a new structured discussion format where spotlight presenters receive…
Really excited to (finally) share the updated JAMUN preprint and codebase! We perform Langevin molecular dynamics in a smoothed space, which allows us to take larger integrator steps. This requires learning a score function at only a single noise level, unlike diffusion models.
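The sampling idea in the tweet above can be illustrated with unadjusted Langevin dynamics. This sketch is not JAMUN's method: it swaps the learned single-noise-level score network for the exact score of a standard Gaussian, purely to show the update rule (drift along the score plus injected noise); all names and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, mu=0.0, sigma=1.0):
    # Stand-in for a learned score network: the exact score
    # grad_x log p(x) of a Gaussian N(mu, sigma^2).
    return (mu - x) / sigma**2

def langevin_sample(x0, step, n_steps):
    # Unadjusted Langevin dynamics: drift along the score, then add
    # Gaussian noise scaled by sqrt(2 * step). A smoother target
    # density tolerates a larger step, which is the motivation for
    # sampling in a smoothed space.
    x = x0
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

# 5000 chains run in parallel toward the standard Gaussian target
samples = langevin_sample(rng.standard_normal(5000), step=0.1, n_steps=500)
```

After enough steps, the empirical mean and standard deviation of `samples` sit close to the target's 0 and 1 (up to the discretization bias of the fixed step size).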
If you are in the Bay Area, consider attending our workshop, "Graph Learning Meets TCS," at the Simons Institute (simons.berkeley.edu/workshops/grap…).
anyone out there capable and interested in joining the Midjourney team to train some diffusion text models? really feels like we could do something special and i'm increasingly tempted to try
Excited to be partnering with @HenryYin_ and @naomiiixia from @agihouse_org to host a deep dive session on some of the most topical recent research in RL. We’ll have amazing researchers @jiayi_pirate talking about his recent work on Adaptive Parallel Reasoning, and…
A gentle reminder that TMLR is a great journal that allows you to submit your papers when they are ready rather than rushing to meet conference deadlines. The review process is fast, there are no artificial acceptance rates, and you have more space to present your ideas in the…
HLE has recently become the benchmark to beat for frontier agents. We @FutureHouseSF took a closer look at the chem and bio questions and found about 30% of them are likely invalid based on our analysis and third-party PhD evaluations. 1/7
Our 500+ page AI4Science paper is finally published: Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems. Foundations and Trends® in Machine Learning, Vol. 18, No. 4, 385–912, 2025 nowpublishers.com/article/Detail…
in case you are wondering this is academia now
ICML’s Statement about subversive hidden LLM prompts We live in a weird timeline…
Modern-day mech-interp work for diffusion models is finally here :O Very cool work! I wonder whether concepts from circuit theory and hierarchical features are relevant here too and can be used for steering 🤔
*Emergence and Evolution of Interpretable Concepts in Diffusion Models* by @berk_tinaz @zalan_fabian @mahdisoltanol SAEs trained on cross-attention layers of StableDiffusion are (surprisingly) good and can be used to intervene on the generation. arxiv.org/abs/2504.15473
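The steering operation described above can be sketched in a few lines: encode an activation into sparse codes, rescale one concept feature, and decode back. This is a toy numpy illustration with untrained random weights and hypothetical dimensions, not the paper's trained SAE over StableDiffusion cross-attention layers:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_sae = 16, 64  # hypothetical activation / dictionary sizes

# Random stand-in weights; in the paper these would be trained on
# cross-attention activations with a sparsity penalty.
W_enc = rng.standard_normal((d_model, d_sae)) / np.sqrt(d_model)
W_dec = rng.standard_normal((d_sae, d_model)) / np.sqrt(d_sae)
b_enc = np.zeros(d_sae)

def sae_intervene(x, feature_idx, scale):
    """Encode activation x into sparse codes, rescale one concept
    feature, and decode back - the steering intervention."""
    z = np.maximum(x @ W_enc + b_enc, 0.0)  # ReLU encoder -> sparse codes
    z[..., feature_idx] *= scale            # amplify / suppress one concept
    return z @ W_dec                        # decode back to model space

x = rng.standard_normal(d_model)            # a fake model activation
x_steered = sae_intervene(x, feature_idx=3, scale=5.0)
print(x_steered.shape)  # (16,)
```

In practice the edited reconstruction is patched back into the diffusion model's forward pass at the layer the SAE was trained on, steering the generation.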
🌞🌞🌞 The third Structured Probabilistic Inference and Generative Modeling (SPIGM) workshop is **back** this year with @NeurIPSConf in San Diego! In the era of foundation models, we focus on a natural question: is probabilistic inference still relevant? #NeurIPS2025
Introducing Latent-X — our all-atom frontier AI model for protein binder design. State-of-the-art lab performance, widely accessible via the Latent Labs Platform. Free tier: platform.latentlabs.com Blog: latentlabs.com/latent-x/ Technical report: tinyurl.com/latent-X
We are gonna make it 😌
Protein structure prediction contest CASP gets temporary funding from Google DeepMind as NIH grant runs out. trib.al/bGoz7lf