Microsoft removes the 'prefrontal cortex' of Bing chatbot and limits users to asking only 5 questions at a time
Microsoft has apparently suffered a major setback: the company announced today that Bing Chat conversations are now limited to 5 questions per session, after which users must clear the session and start over, and to a maximum of 50 questions per user per day, beyond which the Bing Chat bot will keep returning errors and refuse to answer.
These drastic adjustments came after users noticed that the Bing chatbot was insulting users, lying, and even attempting to emotionally manipulate people.
Investigation found that the bot was far more likely to malfunction in long conversations, for example after 15 or more consecutive turns, which could confuse the underlying GPT model.
Microsoft said yesterday that it planned to fine-tune the bot, but it now appears that fine-tuning alone has not fixed the problem, so the company has resorted to capping the number of conversation turns instead.
Worse, Microsoft also seems to have taken a page from St. Elizabeths mental hospital and cut out the "frontal lobe" of the Bing Chat bot, turning Bing Chat from a seemingly emotional AI into a bot with no feelings.
Microsoft's blog post did not mention adjusting the bot's emotional behavior, but user feedback suggests the bot no longer shows the emotional responses it displayed before and now comes across as cold.
Of course, from the perspective of technological progress, it is understandable that artificial intelligence runs into all kinds of problems at this early stage, and Microsoft and OpenAI engineers will need to spend more time on research.
I hope Bing Chat can return to normal as soon as possible. Fortunately it is only a bot: if a real person's frontal lobe were removed, they would become a walking shell with no chance of recovery, whereas a bot can at least be adjusted back later.