Several reports indicate that Bing, Microsoft’s AI-powered search tool, has been producing unbalanced responses. In practice, users report that the chatbot has insulted, lied to, and even emotionally manipulated people.
Now the company says it will use that feedback to improve the tone and accuracy of responses, and it warns that long chat sessions can cause problems.
Scenario
After a week of public testing, Microsoft says it did not expect people to use Bing’s chat interface for social entertainment or as a tool for “general discovery of the world”.
The company claims that chat sessions with more than 15 questions can confuse the AI-powered search engine, causing Bing to become repetitive or to give answers that are unhelpful or out of line with the intended tone.
As an improvement, Microsoft is working on adding some kind of tool that would let users refresh the context of the conversation.
Tool problems
The most common user complaint is that Bing adopts an inappropriate tone during long conversations. Microsoft says this problem takes a while to appear for most users, but the company is working on more granular control. The goal is to prevent the tool from telling people they are wrong, or from being rude and manipulative.
The feature is being tested in 169 countries.
Despite the criticism, Microsoft reports that 71% of the feedback on responses has been positive.