Microsoft Reportedly Blocks Keywords from Copilot Designer to Stop Generating Violent, Sexual AI Images
Microsoft has reportedly blocked several keywords in its artificial intelligence (AI)-powered Copilot Designer that could be used to generate explicit violent and sexual images.

The keyword-blocking exercise was carried out after one of the tech giant's engineers wrote to the US Federal Trade Commission (FTC) and Microsoft's board of directors expressing concerns over the AI tool.