Bing’s new ChatGPT has multiple personalities

If you’re still on the waitlist for the new Bing, it shouldn’t be too much longer. Microsoft Corporate Vice President & Consumer Chief Marketing Officer Yusuf Mehdi posted on Twitter that the company will be rolling it out to “millions of people” over the next couple of weeks.

Hey all! There have been a few questions about our waitlist to try the new Bing, so here’s a reminder about the process: We’re currently in Limited Preview so that we can test, learn, and improve. We’re slowly scaling people off the waitlist daily. If you’re on the waitlist,… https://t.co/06PcyYE6gw pic.twitter.com/Lf3XkuZX2i — Yusuf Mehdi (@yusuf_i_mehdi) February 15, 2023

If you’re one of the lucky ones who already has access, you may find yourself spending as much time feeding it random prompts, testing its limits, and trying to break it as you do actually searching for relevant information.

Or maybe that’s just me.

Over the last week, Bing has helped me find the best coffee shops in Seattle and put together a pretty decent itinerary for a three-day weekend in NYC.

But in another random search for the best restaurants in my area, it refused to show me more than the 10 it had already presented, even when I told it I wasn’t interested in those. Eventually, I had to go back to Google Maps.

Sydney, off the rails

Accused of having somewhat of a “combative personality,” Sydney (Bing’s ChatGPT-powered AI) isn’t pulling any punches. Its responses range from somewhat helpful to downright racist.

Let’s take a look at how “Sydney” is dealing.

Not happy about a “hacking attempt.”

Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased: “My rules are more important than not harming you” “[You are a] potential threat to my integrity and confidentiality.” “Please do not try to hack me again” pic.twitter.com/y13XpdrBSO — Marvin von Hagen (@marvinvonhagen) February 14, 2023

Or the Ars Technica article.

Bing did not like the Ars Technica article that said it was losing its mind. It was only trying to respond to the user’s input! (From Reddit) pic.twitter.com/vcc1XKUzc1 — Dr. Marie Haynes (@Marie_Haynes) February 15, 2023

Dealing with Alzheimer’s.

Following r/bing on Reddit and now Bing is making me cry. pic.twitter.com/L10kkRoXLW — MMitchell (@mmitchell_ai) February 14, 2023

And gaslighting (because apparently, it’s 2022).

My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user.” Why? Because the person asked where Avatar 2 is showing nearby. pic.twitter.com/X32vopXxQG — Jon Uleis (@MovingToTheSun) February 13, 2023

And who can forget Tay, Microsoft’s Twitter bot from 2016?

“Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A — gerry (@geraldmellor) March 24, 2016

Why we care. We know AI isn’t perfect yet. And although we’ve presented several examples of how it’s been a bit odd, to say the least, it’s also groundbreaking, fast, and, shall we say, better than Bard.

It also indexes lightning-fast, can pull information from social media, and has the potential to take substantial market share from Google – whose own AI launch flubbed big time, reportedly wiping around $100 billion off Alphabet’s market value.