Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programmes can engage with internet users in casual conversation. The programme had been designed to mimic the speech of a teenage girl, but it quickly learned to imitate the offensive language that Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after its launch. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
