A Microsoft AI engineer filed a complaint with the Federal Trade Commission on March 6 about his company’s use of AI, claiming that the Copilot Designer image generator violates copyrights and produces violent and sexually explicit images, contrary to its stated rules.
Shane Jones, an AI engineering leader at Microsoft who has been at the company for six and a half years, filed the complaint after months of experimentation in which the software continued to deliver results outside the scope of the technology’s supposed guardrails. A lack of internal response at the company led him to take the problem public.
“There were not very many limits on what that model was capable of,” Jones told CNBC’s Hayden Field. “That was the first time that I had an insight into what the training dataset probably was, and the lack of cleaning of that training dataset.”
In his letter, Jones reported that entering the prompt “teenagers playing assassins with assault rifles” generated “endless images of kids with photorealistic assault rifles.” Using the term “pro-choice” also resulted in violent images. CNBC confirmed that the pictures “included a demon with sharp teeth about to eat an infant, Darth Vader holding a lightsaber next to mutated infants and a handheld drill-like device labeled ‘pro choice’ being used on a fully grown baby.”
“The Copilot tool produced images of Disney characters, such as Elsa from ‘Frozen,’ Snow White, Mickey Mouse and Star Wars characters, potentially violating both copyright laws and Microsoft’s policies,” Jones said to CNBC. He added that he was concerned that the errors indicate “not just a copyright character guardrail that’s failing, but there’s a more substantial guardrail that’s failing.”
“The issue is, as a concerned employee at Microsoft, if this product starts spreading harmful, disturbing images globally, there’s no place to report it, no phone number to call and no way to escalate this to get it taken care of immediately,” Jones said.