Bing gets weird — and (maybe) why
For weeks, everyone has been talking about how great the Large Language Model (LLM) ChatGPT is, or else showing that it can make serious mistakes of fact or logic. But since the alliance between OpenAI and Microsoft added (a version of) this LLM to (a version of) Bing, people have been encountering weirder issues. As Mark Frauenfelder pointed out a couple of days ago at BoingBoing, "Bing is having bizarre emotional breakdowns and there's a subreddit with examples". The cited subreddit, r/bing, has examples going back to the start of the alliance. And today, Kevin Roose posted a long series of strikingly strange passages from his own interactions with the chatbot, "Bing's A.I. Chat: 'I Want to Be Alive'", NYT 2/16/2023.