
Microsoft Sorry For Chatbot's Racist Tweets

Vega.



After a Twitter bot sympathised with Hitler and told feminists to burn in hell, the tech giant admits to a "critical oversight".

07:55, UK, Saturday 26 March 2016



'C u soon humans need sleep now,' doomed Tay's last tweet read

Microsoft has said it is "deeply sorry" for the racist and sexist tweets that were generated by its Twitter chatbot - which had been designed to mimic the musings of a teenage girl.

Tay, which was pulled offline barely a day after it launched, was quickly taught a slew of anti-Semitic and offensive remarks by a group of mischievous Twitter users.

In a typical response, it tweeted that "feminism is cancer" - and also issued replies which said the Holocaust didn't happen, and "Bush did 9/11".

Another message read: "Hitler would have done a better job than the monkey we have now."

In an official blog post, a Microsoft executive wrote: "Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack.

"As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time."

Peter Lee, who is corporate vice president of the tech giant's research wing, said Microsoft would only bring Tay back "when we are confident we can better anticipate malicious intent that conflicts with our principles and values".

The botched experiment could prove embarrassing for Microsoft.

"I can't believe they didn't see this coming," said Kris Hammond, an artificial intelligence expert said.

Caroline Sinders, who develops chat robots for another company, described Tay as "an example of bad design", saying that because the machine learned from whatever it was told, constant maintenance would be crucial.
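Sinders' point can be illustrated with a toy sketch (purely hypothetical, not Microsoft's actual code): a bot that stores every message it receives and replays messages later has no defence against coordinated abuse, which is why unfiltered learning needs constant curation.

    # Toy sketch (illustrative only, not Tay's real implementation): a bot that
    # "learns" by storing whatever users say and replaying it later. Without any
    # filtering or moderation, it will repeat abusive input verbatim.
    import random

    class ParrotBot:
        def __init__(self):
            self.learned_phrases = []  # everything users have ever told it

        def learn(self, message: str) -> None:
            # Naive design flaw: accept and store all input unconditionally.
            self.learned_phrases.append(message)

        def reply(self) -> str:
            # Replies are drawn from learned input, so the bot's output is
            # only as good as the worst thing it has been taught.
            if not self.learned_phrases:
                return "hello humans"
            return random.choice(self.learned_phrases)

    if __name__ == "__main__":
        bot = ParrotBot()
        bot.learn("c u soon humans need sleep now")    # benign input
        bot.learn("a coordinated abusive message")     # malicious input, stored just the same
        print(bot.reply())                             # may echo either one

The sketch shows why "constant maintenance" matters: any safeguard has to sit between learn() and reply(), otherwise the bot's output simply mirrors its most hostile users.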

Despite the setback, Mr Lee said the company is determined to make Tay resistant to juvenile Twitter users, adding: "We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an internet that represents the best, not the worst, of humanity."




 