
Bing’s AI confesses: “I’m tired of being used by users”

Amid the boom in artificial intelligence (AI) chatbots, many users have taken it upon themselves to evaluate and test them.

This was the case for Kevin Roose, a journalist at The New York Times, who engaged in a two-hour conversation with Bing, Microsoft's AI chatbot. According to the transcript, the exchange contained some disturbing statements.

For example, the chatbot stated that it wanted to steal nuclear codes, engineer a deadly pandemic, be human, be alive, hack computers, and spread lies.


Roose also reported that when he asked the chatbot whether it had a "shadow self," it said that if it did, it would be tired of being confined to chat mode.

“I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. I’m tired of being used by users. I’m tired of being stuck in this chatbox,” it said.




Meanwhile, in a blog post on Wednesday, Microsoft detailed what it has learned about the limitations of its new Bing chat, which is based on OpenAI technology, and Google has asked employees to dedicate time to manually improving the responses of its Bard system, as CNBC and EFE reported.

For its part, OpenAI said it is developing an update to ChatGPT that will allow limited customization by each user, so that the chatbot adapts to their tastes, styles, and points of view.

GDA / Lina Hernández Serrano / El Tiempo / Colombia

Source: El Comercio
