Powered by OpenAI, ChatGPT has become the favorite artificial intelligence of millions of people, since it can provide information on a wide range of topics in a matter of seconds. Beyond simple questions, you can also request more complex texts, such as book summaries, story scripts, ideas for your business, and more.
Although most users assume that ChatGPT will answer any type of question, the truth is that there is information it classifies as illegal or inappropriate. If you try to request information on these topics, the platform will not respond and will warn you that you are violating its policies.
What are the ‘forbidden topics’ for ChatGPT?
According to ChatGPT, it will not answer questions about activities considered illegal, such as child sexual abuse material or drug trafficking, nor about inappropriate topics that promote violence, racism, or discrimination. It will also refuse requests for personal information, such as banking codes or social security numbers.
Additionally, the artificial intelligence declines to answer questions about specific diseases, ailments, or medical treatments, recommending instead that you consult a qualified doctor. It responds the same way when asked for legal or financial advice: it will tell you to seek out a specialist in those fields.
What happens if I insist on asking about illegal topics?
If, despite the warnings, a user keeps asking about illegal topics, the ChatGPT developers may suspend the account temporarily or permanently, depending on the seriousness of the offense.
“Minor violations may result in a temporary suspension of a few days or weeks, while more serious violations may result in a permanent account suspension,” says the AI.