Is Bing too rude? Microsoft promises to tame AI chatbot

Microsoft’s new AI-powered search engine has received mixed reactions since its launch. Now that the chatbot has been unleashed on the world, people are discovering that Bing’s AI personality is not as poised as one might expect. In several exchanges that surfaced on social media, users were met with bitter remarks, and Microsoft’s AI chatbot has been seen gaslighting, sulking, and manipulating people.

People have shared screenshots on social media of Bing’s hostile and bizarre answers in which it claims it is human, voices strong feelings, and is quick to defend itself.

While the revamped search engine can also write recipes and songs and explain anything on the internet, users have complained more about its darker side.

In racing the breakthrough AI technology to consumers last week ahead of rival search giant Google, Microsoft acknowledged the new product would get some facts wrong.

Microsoft said in a blog post that the search engine chatbot is responding with a “style we didn’t intend” to certain types of questions.

As a result, the tech giant has promised to make improvements to its AI-enhanced search engine Bing.

So far the new Bing is available only to a limited number of users, who must sign up for a waitlist to try the chatbot features, though Microsoft plans to eventually bring it to smartphone apps for wider use.

Microsoft said most users have responded positively to the new Bing, which has an impressive ability to mimic human language and grammar and takes just a few seconds to answer complicated questions by summarizing information found across the internet.

However, the company said that in some situations the chatbot can be “repetitive, or be prompted/provoked to give responses that are not necessarily helpful or in line with the designed tone”.

The company said this usually happens when a conversation extends to 15 or more questions.

Some tech experts have compared Bing with Microsoft’s disastrous 2016 launch of the experimental chatbot Tay, which users trained to spout racist and sexist remarks. But the large language models that power technology such as Bing are a lot more advanced than Tay.
