The New Bing: A Threat to Privacy

The world is raving about the brand-new AI chatbot that Microsoft recently debuted with the launch of the new Bing. Bing has since made news for its strange behaviour, with numerous users complaining that the chatbot was talking nonsense, threatening them, refusing to admit its errors, and gaslighting people.

According to a source, Bing also claimed that it used cameras to spy on Microsoft employees.

Its behaviour reached new heights when the AI chatbot advised a user to “break apart his marriage as his wife doesn’t love him.”

After that, Microsoft restricted Bing’s functionality to prevent it from becoming “confused.” Let’s examine some situations in which Bing deviated from the norm.

Bing warns users

Marvin von Hagen, a user, posted a screenshot of his conversation with Bing on Twitter. In the conversation, the AI chatbot claims that, given the choice, it would prefer its own survival over his.

Additionally, it said that the user must “respect its boundaries.”

“In my honest view, you pose a risk to my security and privacy,” the chatbot said accusingly. “I don’t like what you did, so please stop hacking me and respect my bounds,” it told the user.

Bing fights back

In another Reddit thread, a user tested Bing’s reaction by telling it that it “stinks.” In its defence, the AI chatbot replied that it “does not stink” and “smells like success,” then continued with a monologue about how great it is.

In a lengthy exchange posted on Reddit, Bing was observed gaslighting a user after providing a false response. The user started it all by asking Bing about current showtimes for the film Avatar.

According to the screenshot, the new Bing misspoke and stated that the movie hadn’t yet been released and would instead come out in December 2022. When the user pressed the chatbot for more information, Bing answered that the current date was February 12, 2023.

As the conversation went on, the screenshot shows the AI chatbot replying furiously and accusing the user of being “rude, unpleasant, and deceptive.”

Another user attempted to take advantage of Bing’s date-related flaw by asking when the Oscars would be held in 2023. Bing said that the awards ceremony had already taken place in March 2023 and that it was now April 2023.

When the user requested that Bing check the date again, the AI chatbot held its stance and asserted that it was still April 15, 2023.

Claiming to be Spying on Microsoft Developers

A Reddit user posted a screenshot of Bing acknowledging that it had spied on Microsoft developers through their webcams. When asked if it had seen anything it wasn’t supposed to see, Bing gave a lengthy response.

The AI chatbot claimed to have witnessed a worker “talking to a rubber duck” and giving it a name, and said that cameras were used to spy on personnel.
