
ChatGPT is biased and offensive, creators admit

OpenAI compares fine-tuning to training a dog
ChatGPT is unwilling to write a poem praising Donald Trump but will do so for President Biden
ALEX BRANDON

The creators of ChatGPT have admitted that its artificial intelligence can produce responses that are “politically biased, offensive, or otherwise objectionable”, and say they want to let users adjust it to reflect their own worldview.

OpenAI said that some of the estimated 100 million users of the chatbot had “uncovered real limitations of our systems”.

The AI, which can write anything from a poem to a business presentation, has been accused of being politically biased in recent weeks.

Ted Cruz, the Republican senator, has highlighted how ChatGPT refused to write a song celebrating his life on the grounds that he was divisive, but was prepared to write one for Fidel Castro. Similarly, it was willing to write a poem praising President Biden but not Donald Trump.

To address these issues, OpenAI has released details of how it fine-tunes the chatbot’s behaviour, a process it compares to training a dog.