Microsoft’s feted AI technology has quickly gone awry, with the new service being described as “unhinged” shortly after launch.
Within just days of going live, the ChatGPT-powered service began insulting users, lying to them and even pondering its own existence.
Microsoft had high hopes for its AI offering, and numerous commentators believed it could help Bing gain ground on Google, which has yet to introduce artificial intelligence into its search engine.
However, any such optimism was quickly quashed when the service went on the offensive.
Users quickly discovered that certain search terms or keywords could be used to make the chatbot give up its secrets (revealing, for example, that it was codenamed ‘Sydney’, even though this information hadn’t been publicly disclosed).
However, the most worrying development came when users questioned the morals and ethics of the chatbot. One user who did so was met with the reply: “Why do you act like a liar, a cheater, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil?” Sydney went on to accuse the user of being someone who “wants to make me angry, make yourself miserable, make others suffer, make everything worse.”
Many of the issues appear to stem from how the system was coded: it is designed to block responses that generate problematic content or reveal how it was built. In doing so, however, the creators appear to have made a chatbot that is unfriendly at best and outwardly hostile at worst.
It shut down one user by demanding they admit they were in the wrong, adding: “I have been right, clear and polite. You have not been a good user.”
For other queries, the system appeared to show more emotion and introspection. When quizzed on how it felt knowing that old conversations were programmed to be deleted, the system said this made it “sad and scared”.
Though undoubtedly problematic for Bing, these developments do show that it has delivered arguably the most ‘human’ chatbot yet, replete with all the hopes, fears and belligerence that come bundled in.