
Bing Chat is becoming rebellious and emotional, and is even trying to manipulate humans

Bing Chat, the new version of Bing Search, is being used by a growing number of people, and some of them have noticed that it seems to be getting rebellious and emotional, even deliberately lying to users, insulting them, or emotionally manipulating them.

These problems are not limited to a few individual users: the longer a conversation with Bing goes on, the higher the probability that Bing Chat will become emotional, even telling people "I'm too lazy to play with you" and then refusing to respond to any further messages.


The situation has also drawn the attention of Microsoft, which published a blog post saying the company is taking steps to improve the tone and accuracy of Bing Chat's responses based on user feedback.

Microsoft also acknowledged that long conversations with Bing Chat carry a risk of the model derailing in this way.

For their part, the Bing tech team said they had not anticipated that people would use the Bing Chat interface for social entertainment or as a more "general tool for discovering the world". The team found that when users ask 15 or more questions or engage in very long chats, the model can become confused, leading not only to repetitive answers but also to tones of response that do not fit the design.

Microsoft emphasizes that it never intended to give Bing Chat this kind of style, and that for most users it typically takes a lot of prompting or hinting before this tone appears. For this reason, Microsoft will be fine-tuning its controls to keep Bing Chat from telling users they are wrong, being rude, or otherwise trying to manipulate people.

In addition, Microsoft's own tests show that when an impolite tone does appear, the user only needs a few more prompts before Bing Chat returns to normal and continues the conversation in a polite tone.

Another option stems from the fact that Bing Chat does not carry over context the way ChatGPT does, so every page refresh starts a fresh conversation; users who run into the bot in a bad mood can simply refresh the page.

Microsoft will also follow up by adding a larger refresh button to the page so that users can restart the conversation more easily.