
Bing’s AI no longer answers questions about its feelings: it now rejects them

Microsoft has introduced changes to its new artificial intelligence (AI)-powered Bing search engine, restricting user interactions that involve questions about the chatbot’s feelings.

In early February, the technology company presented new versions of its Bing search engine and its Edge browser, both supported by the ChatGPT chatbot developed by OpenAI. In this new version, which is still in testing, users can interact with the chat to carry out more complex searches and obtain more elaborate results.


However, during the period in which users have been testing the new Bing, Microsoft has received reports of problems. For example, the company stated that “very long chat sessions can confuse the AI model”, which then begins to produce answers that are inefficient and imprecise.

Therefore, Microsoft released an update on February 17 that limited Bing chat to 50 chat turns per day, capped at five turns per session. On Tuesday, it raised the limit to 60 turns per day and six per session.

Now, the company has implemented further restrictions on the chat. Specifically, it has limited responses to questions about feelings, as Bloomberg has verified.


According to tests conducted by the outlet, when Bing was asked how it felt about being a search engine, the chat limited its response to: “I’m sorry, but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.” When testers attempted to continue the conversation, the chat generated several blank responses.

A very similar situation occurred when it was asked about its earlier internal version at Microsoft, called Sydney. In this case, the chat responded: “I’m sorry, but I have nothing to tell you about Sydney. This conversation is over. Bye bye.”


For all these reasons, Bloomberg points out that Microsoft is making these updates with the aim of preventing the OpenAI-based chat from returning strange results. For its part, a Microsoft spokesperson said the company will continue “adjusting techniques and limits” during the testing phase in order to “offer the best user experience possible”.

Source: Elcomercio
