
Music that puts the “artificial” in artificial intelligence

AI can be used to speed up production, which is important in the music industry. AI models can identify a song's melody, lyrics, rhythm and even vocals, which may then be dissected and studied to improve the overall sound quality. However, it shouldn't be used to copy something that ought to be unique. Photo credit: Ana Sophia Papa

The possibilities of using artificial intelligence to create new things are virtually boundless. As a result, AI-generated media is no longer a foreign concept.

For instance, according to the IBM Global AI Adoption Index of 2022, 35% of enterprises reported adopting artificial intelligence technology in some capacity within their organizations.

It should not be surprising that the artificial intelligence market is expected to grow significantly, given how frequently businesses are adopting it. The music industry is no stranger to this trend.

AI can be utilized to streamline the production process, which is crucial in the music business. AI models can recognize a song's melody, lyrics, rhythm and even vocals, elements that can then be broken down and analyzed to enhance the overall sound quality.

However, what if AI replicates artists’ voices a little too well?

Artificial intelligence voice generators have become a trend recently, used to mimic a popular artist’s voice and apply it to another song. With the rise of AI, it’s easy to believe the artist actually sang the song. Unfortunately, this technology could fall into the wrong hands and undermine artists of all types.

“It’s pretty deceitful, to be honest. I didn’t think artists would actually press charges, but I probably would too if I were them. Imagining my voice being used in ways I didn’t consent to is scary,” third-year choral-vocal music major Sarah Jimerson says.

Jimerson feels it isn’t fair that artists who could release music, and even covers, of their own may have that opportunity taken from them.

Max Miller from Third Eye Records holds a similar view.

“I feel like this was something that started off as harmless fun that turned into people taking advantage of it,” says Miller. “I’m not sure if the people programming the songs to voices are making money off of it, but even if they weren’t, it’s still wrong.”

Miller also mentions that if he were one of the celebrities being taken advantage of, he wouldn’t feel comfortable releasing music for fear of it being taken out of context.

Even if they were received well initially, the fake covers open a Pandora’s box of ethical and legal dilemmas around how AI-generated songs affect a singer’s control over what their voice is used for and what it represents.
