Over the past few weeks, users have noticed a decline in the performance of the GPT-4-powered Bing Chat AI. Those who frequently engage with Microsoft Edge’s Compose box, powered by Bing Chat, have found it less helpful, with the tool often dodging questions or failing to respond to queries at all.
In a statement to Windows Latest, Microsoft officials confirmed the company is actively monitoring the feedback and plans to make changes to address the concerns in the near future.
Many have taken to Reddit to share their experiences. One user mentioned how the once-reliable Compose tool in the Edge browser’s Bing sidebar has lately been less than stellar. When asked to produce creative content in an informative tone, or even humorous takes on fictional characters, the AI offered bizarre excuses.
It suggested that discussing creative topics in a certain way might be deemed inappropriate or that humor could be problematic, even when the subject was as harmless as an inanimate object. Another Redditor shared their experience using Bing to proofread emails written in a non-native language.
Instead of answering the question as it usually did, Bing presented a list of alternative tools and seemed almost dismissive, essentially advising the user to ‘figure it out.’ Only after the user expressed their frustration through downvotes and tried again did the AI revert to its helpful self.
“I’ve been relying on Bing to proofread emails I draft in my third language. But just today, instead of helping, it directed me to a list of other tools, essentially telling me to figure it out on my own. When I responded by downvoting all its replies and initiating a new conversation, it finally obliged,” the user noted in a Reddit post.
In the midst of these concerns, Microsoft has stepped forward to address the situation. In a statement to Windows Latest, the company’s spokesperson confirmed it is always watching feedback from testers and that users can expect better experiences in the future.
“We actively monitor user feedback and reported concerns, and as we get more insights through preview, we will be able to apply those learnings to further improve the experience over time,” a Microsoft spokesperson told me over email.
Meanwhile, a theory has emerged among users that Microsoft might be tweaking the settings behind the scenes.
One user remarked, “It’s hard to fathom this behavior. At its core, the AI is simply a tool. Whether you create a tongue-twister or decide to publish or delete content, the onus falls on you. It’s perplexing to think that Bing could be offensive or otherwise. I believe this misunderstanding leads to misconceptions, especially among AI skeptics who then view the AI as being devoid of essence, almost as if the AI itself is the content creator.”
The community has its own theories, but Microsoft has confirmed it will continue to make changes to improve the overall experience.