Technology has done a lot to help with those tip-of-your-tongue, name-that-tune moments. Thanks to AI, now you don’t even need to sing or hum a song—just think about it.
That’s the future scientists are bringing about after a breakthrough study used artificial intelligence (AI) to recreate music from nothing but recordings of brain activity, captured while people listened to a song.
In a paper published in PLOS Biology, researchers led by the University of California at Berkeley were able to generate recognizable audio of Pink Floyd’s “Another Brick in the Wall, Part 1” using only data from the brain.
The study involved recording electrical signals directly from the brains of epilepsy patients already undergoing monitoring for seizure treatment. As the patients passively listened to the classic rock song, electrodes on the surface of their brains captured the activity of auditory processing regions.
The researchers then fed this neural data into machine learning algorithms. By analyzing how different areas of the auditory cortex responded to elements such as pitch, tempo, vocals, and instruments, the models learned to associate specific patterns of neural activity with particular acoustic features.
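To make the decoding idea concrete, here is a minimal, hypothetical sketch in Python of that kind of pipeline: a regularized linear model that maps windows of neural activity to the frequency bins of an audio spectrogram. The electrode count, lag window, use of ridge regression, and the random stand-in data are all illustrative assumptions, not the authors' actual code or parameters.

```python
# Illustrative sketch only (not the published study's code): decoding an audio
# spectrogram from neural activity with a regularized linear model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed shapes: high-frequency activity from N electrodes per time bin,
# paired with the song's magnitude spectrogram in the same time bins.
n_bins, n_electrodes, n_freqs = 5000, 128, 64
neural = rng.normal(size=(n_bins, n_electrodes))   # stand-in neural features
spectrogram = rng.random(size=(n_bins, n_freqs))   # stand-in audio targets

# Add short temporal context: stack a few lagged copies of the neural features
# so each spectrogram frame is predicted from a small window of brain activity.
lags = 5
stacked = np.concatenate([np.roll(neural, k, axis=0) for k in range(lags)], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    stacked, spectrogram, test_size=0.2, shuffle=False
)

# One regularized linear decoder across all frequency bins (multi-output ridge).
decoder = Ridge(alpha=1.0)
decoder.fit(X_train, y_train)
predicted_spec = decoder.predict(X_test)   # reconstructed spectrogram frames

print("reconstructed spectrogram shape:", predicted_spec.shape)
```

A predicted spectrogram like this can then be converted back into audible sound with a phase-reconstruction step such as the Griffin-Lim algorithm. The published study worked from real intracranial recordings with more carefully built decoders, so this sketch only illustrates the general shape of the approach.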
Author: Jose Antonio Lanz