The China-based company selling the merchandise likely had no idea what the English description said, experts tell CBC News, because the content was produced by an artificial intelligence (AI) language program. Experts in the field of AI say it is part of a growing list of examples in which real-world applications of AI programs spit out racist and biased results.

"Stereotypes are quite deeply ingrained in the algorithms in very complicated ways," says James Zou of Stanford University, who studies the biases of AI language programs.

Sasha Luccioni, a post-doctoral researcher with Mila, an AI research institute in Montreal, says how to solve the problem of racism and stereotypes in AI technology remains a source of debate.

The study, which Zou conducted along with another academic at Stanford and one from McMaster University in Hamilton, found "persistent anti-Muslim bias" in AI language programs.
Source: CBC News May 17, 2021 07:52 UTC