How Microsoft could keep Bing Chat weird — and safe
Microsoft has a problem with its new AI-powered Bing Chat: It can get weird, unhinged, and racy. But so can Bing Search, and Microsoft solved that problem years ago with SafeSearch. So why can’t the same approach be applied to AI chatbots?
The creative elements of Bing Chat and other chatbots are clearly among their more intriguing features, despite sometimes getting weird. Scientists loudly disclaim a chatbot’s response as just a large language model (LLM) responding to a user’s input without any sort of intent, but users simply don’t care. Forget the Turing test, they say: all that matters is whether it’s interesting.
I spent a few hours debating users online following my report that Bing offered up racist slurs when I tried to thread the needle by asking it for ethnic nicknames, a query that (and this is important) it had previously declined to answer, or answered generically, using the same prompt. That’s not the same as the prolonged (bizarre) conversation a New York Times reporter had, or the strange interactions other early testers experienced. There’s a difference between offensive and just plain weird.
To tamp this down, Microsoft did two things: it stifled Bing Chat’s creativity and implemented conversational limits. It then evolved the first limitation into three options: Creative, Balanced, and Precise. Select the third option, and you’ll receive results much like a search engine’s. Select the first, and Bing offers up a hint of the creativity it first demonstrated with the “Sydney” persona some users were able to tap into.
The real issue I have with Microsoft’s kneejerk response is Bing’s conversational limits. A conversation in search of facts might wrap up within six (or eight) queries, Bing’s current limit. But creative responses benefit from prolonged interactions. One of the fun games people play with ChatGPT is an involved choose-your-own-adventure dialogue, or more in-depth roleplaying that mimics something like what you’d experience playing Dungeons & Dragons. Six queries barely launches the adventure.
Search engines have already solved this problem, however. Both Google and Bing implement content filters that block all, some, or none of the offensive content they index. (In this context, “offensive” typically means adult text and images, but it can also cover violence and explicit language.) Simply put, SafeSearch defaults to a “Moderate” setting, but you have the option of switching it to “Strict” or “Off.” Turning it off pops up a small explanation and/or a warning about the consequences of such a choice.
It seems as though Microsoft could apply something like Bing Search’s SafeSearch filter to Bing Chat and unlock the conversational limits, too (though applying SafeSearch to a large language model could prove trickier than sanitizing search results).
It’s fair to say that Microsoft probably doesn’t want Bing Chat to repeat the fate of its Tay chatbot, which right-wing trolls turned into a fount of propaganda and hate. And if Microsoft is still unable to set guardrails to prevent that from happening, fine. Remember, Bing Search has had its own issues with anti-semitic results, and Bing Chat reportedly suggested that one user say “Heil Hitler.”
But it’s also fair to say that a chatbot that professes love for a reporter might be considered more acceptable, especially if the user explicitly opts into it. After all, Bing already disclaims that “Bing is powered by AI, so surprises and mistakes are possible,” and advises users to “make sure to check the facts.” When a user is adult enough to opt in to weird or explicit content, Bing, Google, and other search engines have long acknowledged and respected that choice.
Bing Chat, Google Bard, and ChatGPT can still impose limits. Society doesn’t want people using chatbots to create their own little worlds where racism, hate, and abuse run rampant. But we also live in a world that values both choice and novelty. I grew up in a world where movies rarely, if ever, appeared on broadcast television, and the ability to pause and rewind them came much later. Now we can stream most of the TV shows and movies in existence on demand, even as “entertainment” pushes boundaries far beyond anything Bing has offered up so far.
Yes, the human mind can envision some awful filth. Microsoft, though, should realize that if it’s going to allow people to search for these topics, filtering out what it chooses to reject, then the same guidelines should apply to Bing Chat, too.