François Charton
@f_charton
AI for mathematics and theoretical physics · École nationale des ponts et chaussées
Excited to share a new project with Jacky Yip @UWMadPhysics, @arnal_charles, and @f_charton @Meta, in which we develop a transformer model to automate the generation of Calabi-Yau manifolds, the extra-dimensional spaces of string theory: arxiv.org/abs/2507.03732
Open-sourcing Int2Int, a Python code base for AI for maths, with a special focus on arithmetic and number theory: github.com/f-charton/Int2… A user manual and instructions on how to extend it can be found here: arxiv.org/abs/2502.17513

I am extremely happy to announce that our paper Can Transformers Do Enumerative Geometry? (arxiv.org/abs/2408.14915) has been accepted at @iclr_conf! Congrats to all my co-authors Alessandro and @roderic_guigo #ICLR2025 #AI4Math #ORIGINS
🚨How can we teach Transformers to model enumerative geometry? How deep can AI go down the rabbit hole of understanding complex mathematical concepts? 🤔 We've developed a new approach using Transformers to compute psi-class intersection numbers in algebraic geometry.
Experiments leading to intuitions. The results of our transformer experiments showed us where to look.
A pure physics paper based on intuitions from AI experiments, expect more of these!
I’m pretty excited about our new paper, a follow-up to our last paper using AI to help solve a problem in theoretical particle physics. (With Lance, @f_charton, Matthias, Tianji, and @merz_garrett)
One epoch is not all you need! Our paper, Emergent properties with repeated examples, with @KempeLab, won the NeurIPS24 Debunking Challenge, organized by the Science for Deep Learning workshop, @scifordl arxiv.org/abs/2410.07041

The code for our paper, Global Lyapunov functions: a long-standing open problem in mathematics, with symbolic transformers, with @albe_alfa and @Amaury_Hayat, is available at github.com/facebookresear… We will be at NeurIPS: come see us at the poster session next Thursday at 5PM
Thank you for having me @KyleCranmer and @gary_shiu Research featured in the talk: discovering Lyapunov functions (9:55), PatternBoost: generative models in combinatorics (31:20), Scattering amplitudes (44:10), Arithmetic, repetition, and a few unpublished results (46:40)
We had a great turnout for our inaugural AI for Science seminar with François Charton last week. If you missed it, check out the recording: mediaspace.wisc.edu/media/Xuhui+Hu… @KyleCranmer
I'm thrilled to announce that François Charton (@f_charton @AIatMeta) will be kicking off our new AI for Science seminar series next Wednesday. He is at the forefront of using AI for mathematics, cryptography, and theoretical physics. @datascience_uw dsi.wisc.edu/2024/11/12/ai-…
How do transformers learn arithmetic tasks, such as GCD and modular sums and products? My talk at the Collège de France on November 4th (in French, but the English subtitles are quite good). Thank you @wtgowers for inviting me to your seminar! youtube.com/watch?v=e0jUi8…
Transformers for discrete optimisation problems:
1- Train a model on candidate solutions
2- Use the model to generate more candidates
3- Improve the solutions with local search
4- Use the best candidates to fine-tune the model
5- Iterate
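The five steps above can be sketched in a minimal, self-contained way. This is an illustrative toy, not the paper's implementation: the objective (score a bitstring, penalising adjacent 1s), the greedy bit-flip local search, and the `FrequencyModel` standing in for the transformer are all assumptions made for the sake of a runnable example.

```python
import random

random.seed(0)
N = 12  # length of candidate bitstrings

def score(c):
    # Toy objective: number of 1s, minus a penalty for adjacent 1s.
    ones = sum(c)
    adjacent = sum(1 for a, b in zip(c, c[1:]) if a == 1 and b == 1)
    return ones - 2 * adjacent

class FrequencyModel:
    """Stand-in for the transformer: samples each bit independently
    with the empirical frequency observed in the training pool."""
    def __init__(self):
        self.p = [0.5] * N

    def fit(self, pool):
        self.p = [sum(c[i] for c in pool) / len(pool) for i in range(N)]

    def sample(self):
        return [1 if random.random() < self.p[i] else 0 for i in range(N)]

def local_search(c):
    # Greedy single-bit flips until no flip improves the score.
    c = list(c)
    improved = True
    while improved:
        improved = False
        for i in range(N):
            flipped = c[:]
            flipped[i] ^= 1
            if score(flipped) > score(c):
                c = flipped
                improved = True
    return c

# Initial pool of candidate solutions, each polished by local search.
pool = [local_search([random.randint(0, 1) for _ in range(N)])
        for _ in range(50)]

model = FrequencyModel()
for _ in range(3):                                       # 5- Iterate
    model.fit(pool)                                      # 1/4- (fine-)tune on best candidates
    generated = [model.sample() for _ in range(50)]      # 2- generate more candidates
    polished = [local_search(c) for c in generated]      # 3- improve with local search
    pool = sorted(pool + polished, key=score, reverse=True)[:50]  # 4- keep the best

best = max(pool, key=score)
```

Because local search runs to a fixed point, every candidate in the pool is a local optimum: no 1 has a 1 as a neighbour, and no 0 can be flipped on for free. The model then biases generation toward the structure shared by the best candidates found so far.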
New preprint up! "PatternBoost: Constructions in Mathematics with a Little Help from AI," with F. Charton, A.Z. Wagner, and G. Williamson: arxiv.org/abs/2411.00566
Our Lyapunov paper in the New Scientist. Thanks @stokel!
An AI system has helped tackle a longstanding tough mathematical problem involving tools called Lyapunov functions. My latest for @newscientist newscientist.com/article/245278…