Pieter Delobelle
@pieterdelobelle
Fairness in LLMs and Dutch NLP Currently LLM engineer @aleph__alpha Prev: @apple, PhD & postdoc @KU_Leuven
This guy's channel is quite small, with only a couple thousand views here and there. If you're interested in GPU programming and are still a beginner, he's worth a look (Simon Oz on YouTube).

If you are at #ICLR25 and care about tokenizers, drop by our (@Aleph__Alpha) Birds of a Feather session – happening now at Opal 103.

As a proud Dutch speaker 🇧🇪🇳🇱, it’s unfortunate to see our language often overlooked in the development of LLMs. To change this, Anthony Rathé (@anthony_rathe) and I built ChocoLlama 🍫🦙, a family of 6 Dutch LLMs based on @AIatMeta’s Llama.
Unsure where to submit your next research paper now that aideadlin.es is no longer updated? Is the location as important to you as the conference? Then check out 🗺️ deadlines.pieter.ai

Announcing 🐤 Büble-LM 🐤, our new state-of-the-art 2B language model (LM) for German! Trained by @pieterdelobelle with a novel trans-tokenization approach, it outperforms other German LMs like Sauerkraut and LLäMmlein on most benchmarks. Try it out! huggingface.co/flair/bueble-l…