Silin Gao
@silin_gao
PhD @ICepfl NLP Lab, Advisor @ABosselut | Intern @TsinghuaCoAI @Zhou_Yu_AI | Prev @Tsinghua_Uni | Knowledge Intensive #NLProc | Dialogue Systems | #AI
🗒️Can we meta-learn test-time learning to solve long-context reasoning? Our latest work, PERK, learns to encode long contexts through gradient updates to a memory scratchpad at test time, achieving long-context reasoning robust to complexity and length extrapolation while…
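The mechanism the tweet describes (encoding a long context via gradient updates to a memory scratchpad at test time) could look roughly like the minimal sketch below. Everything here is an assumption for illustration: the module names, the memory size, and the reconstruction-style objective are placeholders, not PERK's actual implementation.

```python
# Minimal sketch: test-time learning of a memory scratchpad for a long context.
# Assumes a Hugging Face-style causal LM; all sizes and the objective are illustrative.
import torch


def encode_context_into_memory(model, long_context_ids, num_steps=8, lr=1e-2):
    """Compress a long context into a small trainable memory via gradient updates."""
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)  # only the scratchpad is updated at test time

    # Memory scratchpad: a handful of trainable soft embeddings (size assumed).
    memory = torch.zeros(16, model.config.hidden_size, requires_grad=True)
    optimizer = torch.optim.SGD([memory], lr=lr)

    for _ in range(num_steps):
        optimizer.zero_grad()
        # Hypothetical objective: predict the long context conditioned on the
        # memory, so gradients flow only into the scratchpad embeddings.
        inputs_embeds = torch.cat(
            [memory.unsqueeze(0), model.get_input_embeddings()(long_context_ids)],
            dim=1,
        )
        labels = torch.cat(
            [torch.full((1, memory.size(0)), -100), long_context_ids], dim=1
        )
        loss = model(inputs_embeds=inputs_embeds, labels=labels).loss
        loss.backward()
        optimizer.step()

    # The detached memory can then be prepended when answering reasoning
    # queries, in place of re-reading the full long context.
    return memory.detach()
```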
🚨 New Preprint!! LLMs trained on next-word prediction (NWP) show high alignment with brain recordings. But what drives this alignment—linguistic structure or world knowledge? And how does this alignment evolve during training? Our new paper explores these questions. 👇🧵
Are LLMs as linguistically productive and systematic as humans in morphologically-rich languages? No 🤨 Our new NAACL 2025 paper (arxiv.org/abs/2410.12656) reveals a significant performance gap between LLMs and humans in linguistic creativity and morphological generalization.
🚀 Introducing PICLe: a framework for in-context named-entity detection (NED) using pseudo-annotated demonstrations. 🎯 No human labeling needed—yet it outperforms few-shot learning with human annotations! #AI #NLProc #LLMs #ICL #NER
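A rough sketch of the idea in the PICLe announcement, in-context named-entity detection driven by pseudo-annotated demonstrations, is shown below. The prompt format, the example demonstrations, and the helper name are illustrative assumptions rather than the paper's exact pipeline.

```python
# Minimal sketch: few-shot NED prompt built from pseudo-annotated demonstrations
# (labels produced by a model or heuristic tagger, not by human annotators).

def build_ned_prompt(pseudo_demos, query_sentence):
    """Assemble a few-shot prompt from (sentence, pseudo-labeled entities) pairs."""
    lines = []
    for sentence, entities in pseudo_demos:
        lines.append(f"Sentence: {sentence}")
        lines.append(f"Entities: {', '.join(entities) if entities else 'none'}")
    lines.append(f"Sentence: {query_sentence}")
    lines.append("Entities:")
    return "\n".join(lines)


# Hypothetical pseudo-annotated demonstrations.
demos = [
    ("Barack Obama visited Paris in 2015.", ["Barack Obama", "Paris"]),
    ("The meeting was postponed until Friday.", []),
]
prompt = build_ned_prompt(demos, "Apple opened a new office in Berlin.")
# `prompt` would then be sent to an LLM, and its completion parsed as the
# detected entities for the query sentence.
```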