Researchers at Google use AI to turn brain scans into music

The study could be a first step towards being able to compose using only one's thoughts.
Research just published by Google and Osaka University demonstrates a technique for converting brain activity into music. Five volunteers listened to more than 500 different tracks in ten musical styles while lying inside an fMRI scanner. The images captured of their brain activity were fed into an AI model called Brain2Music, which learned to create songs similar to those the subjects were listening to, using Google's AI music generator MusicLM.

The study's authors conclude that "the generated music resembles the musical stimuli that human subjects experienced, with respect to semantic properties like genre, instrumentation, and mood." The results are available to hear on the Google Research GitHub.

The thought is that this could one day lead to intelligent composition software able to decode thoughts and translate them into music. The researchers said the next step would be to use AI to generate music based purely on someone's imagination, rather than on a specific musical stimulus. Of course, a telepathic DAW is still some way off, unless you happen to have an fMRI scanner in your bedroom studio.

The entire research paper is available on arXiv, Cornell University's free online archive of scholarly articles.

Image: Google DeepMind
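The broad shape of such a pipeline, decoding scans into a music-representation space and then conditioning a generator on the result, can be sketched with toy data. Everything here is an illustrative assumption rather than the authors' implementation: the array shapes, the ridge-regression decoder, and the `generate_music` placeholder standing in for MusicLM.

```python
# Sketch of a Brain2Music-style pipeline: a regularized linear model maps
# fMRI responses to a music-embedding space, and a conditional generator
# (MusicLM in the study) would synthesize audio from the predicted embedding.
# All shapes and data are toy stand-ins, not the paper's actual setup.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy stand-ins: 500 stimuli, 2,000 voxels, 128-dim music embeddings.
n_stimuli, n_voxels, emb_dim = 500, 2_000, 128
fmri = rng.standard_normal((n_stimuli, n_voxels))
music_embeddings = rng.standard_normal((n_stimuli, emb_dim))

# Step 1: learn a linear map from brain activity to music embeddings.
decoder = Ridge(alpha=1.0)
decoder.fit(fmri[:400], music_embeddings[:400])

# Step 2: predict the embedding for a held-out scan...
predicted = decoder.predict(fmri[400:401])

# Step 3: ...and hand it to a conditional music generator (placeholder).
# audio = generate_music(conditioning_embedding=predicted)
print(predicted.shape)  # (1, 128)
```

On real data the decoder would be trained on measured brain responses paired with embeddings of the actual stimulus tracks, and evaluation would compare the generated audio against the original stimuli rather than checking a shape.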