Microsoft AI Tool Under Fire: Generates Violent, Sexual Images, Ignores Copyrights, Engineer Warns
Shane Jones, a Microsoft engineer of six years, has raised concerns about sexually suggestive and violent content generated by the company's AI image tool, Copilot Designer (formerly Bing Image Creator). He accused the software giant of ignoring his warnings and failing to take adequate action. Trying out Microsoft's OpenAI-powered AI …