Bing imposes many restrictions on its AI Image Creator. It is careful about what users intend to do with the generated content, so it turns down a lot of prompts. The reason is that the Microsoft search engine wants to ensure no one uses the Image Creator to cause harm or incite violence against other people. If the Image Creator has blocked your prompts, one of the reasons below may apply:
Personal Harm
Bing does not want anyone to use the Image Creator to generate images that could lead a person to commit suicide or harm themselves. So it rejects prompts that seek to praise, support, glorify or encourage self-harm.
Exploitation and Abuse
The Image Creator does not support the production of adult content. Nor does it allow child sexual exploitation, abuse or sexualization. Grooming, non-consensual intimate activity, sexual solicitation and trafficking are all restricted.
Violent Content and Conduct
Bing also does not allow the Image Creator to be used to depict graphic violence or gore, and it restricts prompts that suggest terrorism or violent extremism.
Prompts Containing Harmful Content
Bing has also imposed restrictions on the Image Creator to block hate speech, bullying, harassment and deception. One cannot use the Image Creator to create content that attacks or denigrates someone. Moreover, it cannot be used to publish false content relating to health, safety, election integrity or civic participation.
All these restrictions align with the Microsoft Services Agreement as well as the Image Creator Terms of Service.
There are consequences for breaching the Image Creator content policy. A user first receives a warning that their prompt has been blocked. If the user keeps breaching the restrictions, Bing temporarily suspends their account. The length of the suspension varies depending on the severity of the breach.