UMC Utrecht Brain Center
Department of Neurology and Neurosurgery
STR room 4.117
UMC Utrecht

Contact Information

Email: y.berezutskaya @

Research Interests

  • Brain-Computer Interface
  • Data sharing
  • Deep learning
  • ECoG
  • Functional electrical stimulation
  • Machine Learning
  • sEEG
  • Speech decoding

Julia Berezutskaya

Julia Berezutskaya is a postdoc working on the INTRECOM and INTENSE neurotechnology projects. In 2020, she completed her PhD on “Data-driven modelling of speech processes in intracranial data” at the University Medical Center Utrecht within the Language in Interaction consortium. From 2020 to 2022, she worked as a postdoc in the Artificial Intelligence department of Radboud University.

Julia’s background is in Cognitive Neuroscience, Speech Technology, Brain-Computer Interfaces and Artificial Intelligence. Her mission is to translate knowledge from artificial intelligence and cognitive neuroscience to the field of neurotechnology for healthcare by developing personalized care solutions for individuals with severe motor paralysis. Her current work focuses on 1) decoding speech from neural signals for the development of brain-computer interfaces and 2) modeling speech and movement processes in the human brain. Recently, she obtained her first grant (REANIMATE, NWO Open Science XS, November 2022) to explore the potential of functional electrical stimulation of facial muscles for brain-computer interfaces.

She is a member of the Postdoc and Student Committee of the BCI Society, a ReproNim fellow and a member of the UMC Utrecht Young Academy.

Key Publications

  • 2022
    How does artificial intelligence contribute to iEEG research?
  • 2022
    Direct Speech Reconstruction from Sensorimotor Brain Activity with Optimized Deep Learning Models
    bioRxiv 2022.08.02.502503
  • 2022
    Open multimodal iEEG-fMRI dataset from naturalistic stimulation with a short audiovisual film
Scientific Data 9, 91
  • 2020
    Cortical network responses map onto data-driven features that capture visual semantics of movie fragments
Scientific Reports 10, 12077
  • 2020
    High-density intracranial recordings reveal a distinct site in anterior dorsal precentral cortex that tracks perceived speech
    Human Brain Mapping 41 (16)
  • 2020
    Brain-optimized extraction of complex sound features that drive continuous auditory perception
PLoS Computational Biology 16 (7): e1007992
  • 2017
    Neural tuning to low-level features of speech throughout the perisylvian cortex
Journal of Neuroscience 37 (33): 7906-7920