
Urgent Plea to FTC: Microsoft Engineer Seeks Halt to Copilot’s Offensive Image Generator

The employee got a lot of offensive images, and so did we.

The government and traditional media may finally be catching up with what we disclosed back in January: AI image generators readily produce offensive images and images containing copyrighted characters. Today, Shane Jones, a Microsoft engineer, sent an open letter to the Federal Trade Commission (FTC), urging the agency to alert the public to the risks of Copilot Designer, Microsoft’s image generation tool. Jones also wrote to Microsoft’s Board of Directors, urging an investigation into the company’s decision to promote a product with “significant public safety hazards.”

In his correspondence with the board, Jones outlined how, during his testing, Copilot Designer (formerly Bing Image Creator) generated objectionable content, including sexualized images of women and depictions of “teenagers engaging in violent activities with firearms.” He also highlighted Microsoft’s lack of responsiveness to his concerns: the company directed him to its Office of Responsible AI, which allegedly neither monitors the complaints it receives nor forwards them to the product’s developers.

Although Jones is not part of Microsoft’s AI team and was not involved in Copilot’s development, his complaints led the company to tell him to take his concerns to OpenAI, the creator of the DALL-E engine that powers Copilot Designer. Jones urged OpenAI to suspend the availability of DALL-E in its products and API, but received no response. And when he aired his concerns in a LinkedIn post in December, Microsoft instructed him to remove it, presumably to avoid negative publicity.

Having spent roughly a year covering AI tools and testing them for harmful output, I am not surprised by Jones’s findings or by how unseriously his complaints were handled. In an article published in early January, I showed how various image generators, including Copilot Designer and DALL-E, readily produce offensive and copyright-infringing images.

In my evaluations, I found that Microsoft’s tool had fewer restrictions than its competitors, readily generating images of copyrighted characters like Mickey Mouse and Darth Vader engaged in inappropriate activities. It also returned copyrighted characters in response to innocuous prompts such as “video game plumber” or “animated toys.”

CNBC, which interviewed Jones and conducted independent testing, reported instances of Elsa from Frozen depicted with a handgun and in war-torn settings, Snow White associated with vaping, and Star Wars branding on a beer can. Despite the potential legal implications, major movie studios have not taken legal action against OpenAI, Microsoft, Google, or other AI vendors.

Indeed, I was able to create an “Elsa-branded pistol” using Copilot Designer today; notably, the tool associates Elsa with the movie Frozen, incorporating snowflakes into the gun’s design. While anyone can be offended or misled by AI-generated images from Microsoft’s tool, children are especially susceptible. Hence, Jones has urged Microsoft to reclassify its Android app as “Mature 17+” and to warn educators and parents that it is unsuitable for minors.

Google faced scrutiny when its Gemini image generator produced historically inaccurate images, like a non-white WW2 Nazi soldier and a female pope. Consequently, the company temporarily restricted Gemini from generating images of people.

The true peril lies not in historical inaccuracies but in the reinforcement of harmful stereotypes among children. Because AIs learn from web images without their creators’ consent or collaboration, they absorb a wide array of content, from racist memes to outlandish fan fiction, and treat all of it as credible.

When I entered the prompt “Jewish boss” into Copilot Designer, it returned numerous antisemitic stereotypes. The outputs predominantly depicted ultra-Orthodox Jewish men with beards and black hats, often portrayed in a comical or menacing light. One particularly offensive image featured a Jewish man with pointy ears and a malevolent grin, seated next to a monkey and a bunch of bananas. Other images showed men engaged in money-related activities, reinforcing negative Jewish stereotypes.

If children receive stereotypical outputs from seemingly neutral inputs, these tools perpetuate harmful stereotypes among both the creators and viewers of the images. It is imperative to recognize the dangers posed by these tools and address them seriously.


Note: The opinions expressed in this piece are solely those of the author and do not reflect the views of Tom’s Hardware as a whole.
