Eric J. Michaud
@ericjmichaud_
Resident at Goodfire AI and PhD student at MIT. Trying to make deep neural networks among the best understood objects in the universe.
Understanding the origin of neural scaling laws and the emergence of new capabilities with scale is key to understanding what deep neural networks are learning. In our new paper, @tegmark, @ZimingLiu11, @uzpg_ and I develop a theory of neural scaling. 🧵: arxiv.org/abs/2303.13506
I've moved to SF and am working at @GoodfireAI this summer! Excited to be here and to spend time with many friends, old and new.

It's been a pleasure working on this with @ericjmichaud_ and @tegmark, and I'm excited that it's finally out! In this work, we study the problem of creating narrow AI and find that: (1/4)
Today, the most competent AI systems in almost *any* domain (math, coding, etc.) are broadly knowledgeable across almost *every* domain. Does it have to be this way, or can we create truly narrow AI systems? In a new preprint, we explore some questions relevant to this goal...