Idol Corp has released a statement prohibiting the non-consensual training of AI models on its talents’ voices, images, and likenesses. This statement covers past content as well, meaning that if you previously used AI to create “fanart” of one of its VTubers, Idol Corp wants you to remove it immediately.
The full statement is available to read on Idol Corp’s social media channels. You will need to obtain written consent from idol management in order to use AI to create works that draw from idols’ images or voices. Even with management’s permission, you cannot create any offensive or misleading content. Misleading content, in this context, likely refers to using an AI voice program and passing the output off as something the actual talent said. At the end of the statement, Idol Corp thanks everyone for their understanding and cooperation.
AI usage in creative spaces has been a controversial issue since the technology became widespread. Earlier this year, actors in the United States and Japan came together to protest the use of generative AI. Generative AI not only draws on real humans’ work without permission, but it can also be detrimental to their future careers. Some companies may stop hiring people if they think they can get away with using AI instead, and malicious parties may use AI to make it appear that someone said something they didn’t.
This is an update to Idol Corp’s previous policy regarding AI usage by fans. The change may be due to Brave Group’s acquisition of Idol Corp back in August 2024.
Published: Nov 20, 2024 08:30 pm