Justin Jude
@Justin_Jude
Postdoctoral fellow at Harvard Medical School and Brown University researching speech decoding using Brain-Computer Interfaces
I’m excited to present our new preprint! An intuitive, bimanual, high-throughput QWERTY touch typing neuroprosthesis for people with tetraplegia. medrxiv.org/content/10.110…
Our new preprint describes a multimodal intracortical brain-computer interface that a man with ALS has used at home, independently, almost every day for >19 months. It decodes both speech and cursor control to enable him to communicate and use his computer. Here’s a quick tour👇
We're excited to announce the Brain-to-Text '25 competition, with a new intracortical speech neuroscience dataset and $9,000 in prizes generously provided by @BlackrockNeuro_! Can you do better than us at decoding speech-related neural activity into text? kaggle.com/competitions/b…
20 years and 14 BrainGate participants - our latest preprint delivers the first systematic look at Utah array longevity, stability & decoding accuracy in people! 1/7 medrxiv.org/content/10.110…
Our brain-to-voice synthesis brain-computer interface paper was published in @Nature today! This neuroprosthesis synthesized the voice of a man with ALS instantaneously, enabling him to ‘speak’ flexibly and modulate the prosody of his BCI-voice. 1/7 Paper: rdcu.be/eqH3C
Now published in the Journal of Neural Engineering! iopscience.iop.org/article/10.108…
New brain-computer interface preprint led by PhD student Tyler Singer-Clark! This video of a man with paralysis accurately controlling a cursor looks like something you've seen since ~2017. BUT! This is driven by multielectrode arrays in ventral (speech) motor cortex‼️ 1/
Excited to share our NEW preprint: a "brain-to-voice" neuroprosthesis that directly synthesizes voice from neural activity with closed-loop audio feedback. It allowed a man with ALS to speak expressively by modulating intonation & sing melodies via BCI! doi.org/10.1101/2024.0… 1/
Our new study is out today in the New England Journal of Medicine! We demonstrate a speech neuroprosthesis that decodes the attempted speech of a man with ALS into text with 97.5% accuracy, enabling him to communicate with his family, friends, and colleagues in his own home. 1/9
Are transformers now much better than convnets at describing visual cortex? arxiv.org/pdf/2302.03023…
Super excited to share a preprint! At @Caltech, we built the first closed-loop brain-machine interface (#BMI) that is able to decode internal speech 💭. Check it out here: medrxiv.org/content/10.110…
very excited to share our paper on reconstructing language from non-invasive brain recordings! we introduce a decoder that takes in fMRI recordings and generates continuous language descriptions of perceived speech, imagined speech, and possibly much more biorxiv.org/content/10.110…
Pleased to announce our work has been accepted to @icmlconf 2022! We achieve state-of-the-art movement decoding performance from completely unseen sessions of neural data. Work with @mattperich, @PresNCM and @MatthiasHennig6. Pre-print: arxiv.org/pdf/2202.06159…

hi! my name’s arabella: i’m a size 4, my parents bought me a house in london fields and here’s my 17 minute video on why YOU need to stop buying fast fashion!!!!!!!!!!
Absolutely losing it at the UK spending 12 BILLION pounds on a test and trace system which stores its data in fucking MICROSOFT EXCEL
30 miles to Barnard Castle #MadeInDreams #DomCum #PS4share store.playstation.com/#!/en-gb/tid=C…
In the summer of 1978 two former members of a B-17 crew met to discuss their escape from death in a raid on Kassel in 1943. Sitting on a porch in New York, co-pilot Bohn Fawkes turned to his navigator Elmer "Benny" Bendiner & said: “Remember we were hit with 20mm shells?” 1/9