
In testing, Microsoft's AI tool insults, lies, and tries to manipulate people; take a look

Several reports indicate that Bing, Microsoft's search tool, has been giving unhinged responses through the AI built into the service. User accounts indicate that the search engine went so far as to insult, lie to, and emotionally manipulate people.

Now the company says it will use that feedback to improve the tone and accuracy of responses, and it warns that long chat sessions can cause problems.

Scenario

After a week of public testing, open to ordinary users, Microsoft says it did not expect people to use Bing's chat interface for social entertainment or as a tool for "general discovery of the world".

The company also says that chat sessions with more than 15 questions can confuse the artificial intelligence behind the search engine. Long sessions can lead Bing to repeat itself or give answers that are less helpful and out of step with the tone users expect.

As an improvement, Microsoft is working on adding a tool that lets users refresh the context of the conversation.

Tool problems

The problem users report most often is that Bing adopts the wrong tone during long conversations. Microsoft says this issue takes time to appear for most users, but the company is nonetheless working on more fine-grained control. The goal is to keep the tool from telling people they are wrong, or from being rude and manipulative.

The feature is being tested in 169 countries.

Despite the criticism, Microsoft reports that 71% of feedback on the responses has been positive.