At issue for the panel was the growing certainty that without explicit contractual and statutory protections in place, AI could not only effectively replace the vast majority of work for voice actors but also manipulate their voices to create content without their express consent. “As a human voice actor, I can walk into a room and get a script that says something that I didn’t either agree to say or something that I would never say. I personally have that ability to walk out of that room,” Friedlander said. With AI cloning the voices of actors, however, “We’ve lost control over what our voice could possibly say,” he said. Without referring to the movie by name, Crabtree-Ireland likened the issue to “a story of a small mermaid and sea witch that literally steals that mermaid’s voice.” As the audience laughed in recognition of Disney’s “The Little Mermaid,” Crabtree-Ireland continued, “I remember seeing that for the first time and thinking how horrifying is it that this sea witch steals the voice of this person and then uses it for whatever. That is exactly what we’re talking about.” “As an actor, you need to know what they’re going to do with some digital version of you that they’re creating using AI, not in general,” Crabtree-Ireland continued.
Jones said that, because “there’s no stuffing this genie back in the bottle,” for the past 18 months NAVA has been creating “a framework to work within AI in an ethical manner.” The main points are that voice actors need to give active, informed consent for their voices to be used by AI (in contrast to what she called the far more expansive “passive consent” buried in contractual language), that voice actors need to have control over how their voices are used, and that they need to be fairly compensated for that use. Jones added that she’s in the early stages of building a company that would provide “the first actor-first, ethical use of AI with voice over.”
The sense of urgency to find a resolution was brought up several times during the panel. “With the pace of this technology, by the time we figure out the abuses that we know are going to occur, it’s too late,” Alton said. One of the greatest concerns voiced was how AI would specifically foreclose working opportunities for up-and-coming voice actors trying to get their start in the field with smaller roles in animation and video games - roles that AI could easily replace. The panelists also said it’s common for contracts to include broad language allowing studios to own the actor’s work “in perpetuity” and for use “in any technology currently existing or to be developed.” For actors just starting out, that kind of contract language could mean “your first job could potentially be your last job if you don’t have protection,” said Friedlander. Added Burch, “I don’t want the next generation of voice actors to not have the potential to build a life that we’ve been able to build in this business.”
To that end, Friedlander said that NAVA is working with the European Union to get voice protection written into the A.I. Act, which is currently working its way through the European Parliament. He also said that NAVA has been working with sites that host video game modders - who use coding to modify the appearance of game avatars, sometimes to resemble real people and real voices - to take down voice mods that have been obtained without the actor’s consent. “In the last three months, we’ve gotten possibly 6,000 to 7,000 audio files removed from certain websites,” he said. Several panelists also emphasized the potentially existential threat AI poses to all kinds of workers well beyond the entertainment industry.