Eric Kernfeld
@ekernf01
Statistician and computational biologist; alum of UW and JHU @cahanlab, @alexisjbattle lab. He/him. http://ekernf01.github.io
h/t @DrAnneCarpenter
Gene deletion (perturbational, or P-strategy) and natural variation (observational, or O-strategy) tend to identify two non-overlapping gene sets (supervisors and workers) that are organized hierarchically, as shown in academic.oup.com/mbe/article/35….
When AI drives your data generation, learning is more efficient and effective. Take a deep dive into VISTA:
The biggest challenge for AI in biology isn't just models, it's the data used to train them. Standard biological data isn't built for AI. To unlock generative AI for drug discovery, we must rethink how we generate and capture data. 1/
Here is a fascinating essay outlining Markov Bio's thesis: foundation models based on observational scRNA data are severely underrated. These models got a bad initial reputation, but that's because the early ones had big problems: too small, weird architectures, and more. markov.bio/research/mech-…
To me, this is a wildly unexpected take. So it's especially interesting that Markov Bio is currently (July 19 12pm) atop the Arc competition leaderboard. virtualcellchallenge.org/leaderboard
Virtual cells are currently bottlenecked by compute, not novel data: drug discovery is an iterative search process (design, test, analyze) through therapeutic design space, guided by a dynamics model. Directly trawling this therapeutic space with large, hypothesis-free…
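As a rough illustration of the design-test-analyze loop described in that post, here is a minimal Python sketch. Everything in it (DynamicsModel, propose_candidates, run_assay) is a hypothetical placeholder, not anything from the linked thread; the point is only the shape of the loop: a model proposes designs, an experiment scores them, and the results feed back into the model.

```python
# Hypothetical sketch of an iterative design-test-analyze loop guided by a
# dynamics model. All names here are illustrative placeholders.
import random

class DynamicsModel:
    """Toy stand-in for a learned model that scores candidate designs."""
    def __init__(self):
        self.observations = []  # (design, measured_effect) pairs seen so far

    def score(self, design):
        # Predict the effect of a design; here, just a noisy heuristic.
        return sum(design) + random.gauss(0, 0.1)

    def update(self, design, measured_effect):
        # "Analyze": fold new experimental results back into the model.
        self.observations.append((design, measured_effect))

def propose_candidates(model, n=8, dim=4):
    # "Design": sample random candidates, keep those the model scores highest.
    pool = [[random.random() for _ in range(dim)] for _ in range(10 * n)]
    return sorted(pool, key=model.score, reverse=True)[:n]

def run_assay(design):
    # "Test": placeholder for a wet-lab measurement of the design's effect.
    return sum(design) + random.gauss(0, 0.5)

model = DynamicsModel()
for round_idx in range(3):                              # a few iterations
    candidates = propose_candidates(model)              # design
    results = [(d, run_assay(d)) for d in candidates]   # test
    for design, effect in results:                      # analyze
        model.update(design, effect)
    best = max(results, key=lambda r: r[1])
    print(f"round {round_idx}: best measured effect = {best[1]:.2f}")
```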
Reprogramming cells with transcription factors is our most expressive tool for engineering cell state. Traditionally, we found TFs by ~guesswork. At @icmlconf we're sharing @newlimit's SOTA AI models that can design reprogramming payloads by building on molecular foundation models.