BERT is one of a number of AI systems that learn from lots and lots of digitised information, as varied as old books, Wikipedia entries and news articles. Decades and even centuries of biases, along with a few new ones, are probably baked into all that material. And scientists are still learning how technology like BERT, called "universal language models," works.

Through its training tasks, which include guessing words that have been hidden from it, BERT comes to understand in a general way how people put words together. That understanding is already improving search: thanks to BERT, Google now responds correctly to a question it previously misunderstood, with a link describing the physical demands of life in the skin care industry.
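The "guess the hidden word" idea behind that training can be sketched with a toy example. The snippet below is not BERT, which uses a large neural network and weighs context on both sides of a word; it is a minimal illustration, on made-up sentences, of how word-prediction statistics can be learned from plain text:

```python
from collections import Counter, defaultdict

# A tiny made-up corpus, standing in for the books and articles
# that systems like BERT train on.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
    "the cat slept",
]

# Count which word follows each word: a crude stand-in for the
# "predict the hidden word" objective used to pretrain language models.
following = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1

def guess_missing(prev_word):
    """Guess the most likely next word, as learned from the corpus."""
    return following[prev_word].most_common(1)[0][0]

print(guess_missing("the"))  # in this corpus, "cat" most often follows "the"
```

Because the guesses simply mirror the statistics of the training text, any biases in that text, old or new, flow straight into the model's predictions.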
Source: bd News24 November 12, 2019 03:04 UTC