Jacob Si
@jacobyhsi88
CS PhD @ImperialCollege | previously @UCLA, @UofTCompSci, @VectorInst
[ICML 2024 (Spotlight)] How can we distill predictive signals from tabular data to obtain intelligible insights 🤔? We introduce InterpreTabNet, a variant of the TabNet model that leverages salient feature interpretation. 📄: arxiv.org/abs/2406.00426 💻: github.com/jacobyhsi/Inte…
Excited to share that our paper "Causal Discovery from Conditionally Stationary Time Series" has been accepted to ICML 2025!🥳 Pre-print: arxiv.org/abs/2110.06257 Thank you very much to all my collaborators; persistence pays off! #icml #icml2025
Check out my lab mate’s #ICLR2025 Oral paper on Covariance Matching for Diffusion Models!!
Gonna present 3 papers at #ICLR2025 and #AABI, come and connect at👇 1. Oral session 1c on FIRST DAY Morning: Improving Diffusion Model with Optimal Diagonal Covariance Matching (iclr.cc/virtual/2025/p…)
Check out our ICLR'25 oral paper if you're interested: arxiv.org/abs/2406.10808
The next seminar is this Friday (April 4th) and starts at 12pm (midday) UK time! @JzinOu from @imperialcollege is going to talk about “Diffusion models beyond mean prediction”! ucl.zoom.us/j/99748820264 This seminar is hybrid. More info 👉ucl-ellis.github.io/jt_csml_semina…
A little chapter that we (@ruqi_zhang, some awesome students, and yours truly) wrote a while ago to give a brief intro to this nice field for statisticians 😊
Bayesian Computation in Deep Learning ift.tt/aLDEJfv
RNN memory (HiPPO 🦛 style, the predecessor to S4/Mamba 🐍) for posteriors over functions. When my awesome students told me you can build a memory for random functions that you don’t even observe, I was like 🤯. Preliminary but exciting; feedback welcome 🤗
Recurrent Memory for Online Interdomain Gaussian Processes ift.tt/yAs9kc4
Interested in working in a fast-paced research environment at the intersection of privacy and ML/AI (LLMs, diffusion models, and beyond)? Come join us 👩🎓👨🎓! Our research group at @imperialcollege has openings for fully-funded PhD students (Nov 15 deadline). Plz RT 🔁
In case you missed it: register for our meetup on research in privacy/ML next Thursday Nov 7 at @imperialcollege! Join us for talks from our amazing speakers or sign up to give a lightning talk yourself ⚡️ imperial.ac.uk/events/183182/…
🚀 We propose a new way to estimate the denoising covariance in diffusion models, which can lead to lower estimation error, a better FID/likelihood tradeoff with fewer sampling steps, and improved generation diversity. ArXiv link: arxiv.org/abs/2406.10808. More details below👇
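For context on why the covariance matters: the reverse transition in a diffusion sampler is a Gaussian whose mean and (often diagonal) covariance both have to be specified at every step. A minimal illustrative sketch of where such a diagonal covariance enters ancestral sampling is below; the estimator itself is the paper's contribution (arXiv:2406.10808), and `predict_mean` / `predict_diag_var` are hypothetical placeholders, not the authors' code.

```python
# Illustrative sketch only -- NOT the paper's estimator. It just shows where a
# per-dimension (diagonal) denoising covariance enters a DDPM-style reverse step.
import torch

def reverse_step(x_t, t, predict_mean, predict_diag_var):
    """Sample x_{t-1} ~ N(mu(x_t, t), diag(var(x_t, t)))."""
    mu = predict_mean(x_t, t)        # posterior mean of x_{t-1} given x_t
    var = predict_diag_var(x_t, t)   # diagonal covariance (per-dimension variance)
    return mu + var.sqrt() * torch.randn_like(x_t)

# Toy usage with dummy callables, just to show the shapes involved.
x_t = torch.randn(4, 8)
x_prev = reverse_step(x_t, t=10,
                      predict_mean=lambda x, t: 0.9 * x,
                      predict_diag_var=lambda x, t: torch.full_like(x, 0.01))
```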
I am pleased to share that our work on Energy-Based Model training on discrete, mixed, and tabular state spaces, joint work with @JzinOu, @liyzhen2, and Andrew Duncan, has been accepted to #NeurIPS2024! Our preprint will be published soon, so stay tuned!