The first part of this article can be found here: Good AI Is Ethical AI: The Media & Entertainment Industry Has to Check Itself
Here, we continue the conversation on ethics and regulation in AI, as SMPTE President Renard Jenkins; ETC AI and Neuroscience in Media Director Yves Bergquist; and Frederick Walls, a Fellow at AMD, explore the importance of trial and error when it comes to AI.
Watch the full video below or read on for highlights.
The Importance of Experimentation
Bergquist hopes to retire the idea “that organizations and individuals have to be perfect, and [they] can’t screw up and stuff like that. That has to go. We’re all imperfect.”
Instead, he wants to shift to a culture of honesty and experimentation. “What I love about this technology is that it’s a work in progress. It’s a conversation in progress. And we have to use it, we have to figure out what it’s good for, what it’s not good for, when does it work, when [does] it not work? When is it ethically sound?”
Bergquist argued, “It’s actually really important for all of us as a community to use the half-baked technology and participate in its development.
“From a creative standpoint, from a technical standpoint, from an ethical standpoint, it’s fine to screw up. It’s fine to deploy a model that has biases, as long as we have this very honest and transparent” dialog with users and consumers.
Jenkins added, “That’s the key, doing that sort of playing when you’re in a safe space. But having a diverse group of folks who are part of your UAT [user acceptance testing] team, having a very varied experience group that is a part of the model itself. All of those things have to come back.”
Also, organizations like SMPTE “need to be out there teaching people what AI is and is not, giving them the tools to assess all the hype that’s coming out of Silicon Valley,” Bergquist said.
“And so that’s the first thing. The second thing is that we have to be honest about what the tools are, right? We have to be honest with the fact that, look, these are very imperfect tools. We’re very early in this journey towards artificial intelligence.”
Diving Into Deep Fakes
Speaking of honesty, Jenkins admitted, “I do have a healthy sense of fear about how the tools are going to be used.”
One potentially problematic use case is the misuse of deep fake technology, an ethical and regulatory challenge that looms large in the public imagination.
Bergquist pointed out that there are different levels of sophistication when it comes to deep fakes: there are pranks executed by high schoolers, and then there are “the state actors that are creating really elaborate deep fakes [that] are a lot harder to detect.”
Walls said, “The idea that just ‘average Joe’ on the internet can create something that’s [a] very convincing piece of media that might replicate someone else… opens up all sorts of avenues for mischief and problems and detriment to society.”
“I am pretty pessimistic with regards to our ability to technically detect, and reliably, consistently detect, deep fakes that are made by sophisticated actors,” Bergquist admitted.
Nonetheless, “People are thinking about those problems and trying to build engineering systems to try to address those concerns. There’s also, obviously, human-oriented ways of detecting deep fakes,” Walls said. At least for now, “Look for the guy with a wavy chin, or six fingers or whatever.”
He added, “There’s certainly research underway to… help try to tell the true media from the fake media. And so I think those tools are also going to be something that ends up being very valuable as far as being able to help people really know… is this thing something that I can trust, right?”
Fortunately, Bergquist said, “There is an enormous amount of energy in the media community in tackling this.”
Ultimately, “Technology is an expression of who we are,” said Bergquist. “I don’t love the fact that deep fakes are now becoming a weapon in the hands of bad actors.” However, “It is forcing us to come together as a community and devise community-level solutions.”
He also pointed out that not all deep fakes are malicious. “There’s also some interesting use cases for deep fakes. … There’s a documentary called ‘Welcome to Chechnya,’ where the documentarian essentially used deep fakes: instead of blurring the images of the people that were testifying anonymously… they use[d] deep fakes to put actual faces that were not their faces on. And it actually makes the documentary a lot more lively.”
Jenkins added, “Individuals have so much fear. They forget …it’s never the tool. It’s how the tool is being used. A hammer is not going to go do something nefarious by itself.”