AI That Writes Prose And Poetry Is Getting Stronger (Uh-Oh)

“The more text to which an algorithm can be exposed, and the more complex you can make the algorithm, the better it performs. … The model that underpins [the AI software] GPT-3 boasts 175bn parameters, each of which can be individually tweaked — an order of magnitude larger than any of its predecessors. It was trained on the biggest set of text ever amassed, a mixture of books, Wikipedia and Common Crawl, a set of billions of pages of text scraped from every corner of the internet.”

That means, alas, that GPT-3 has picked up some of the uglier material lurking in those corners. – The Economist
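As a rough sanity check on the numbers in the quote, here is a minimal back-of-envelope sketch in Python. The 175bn figure comes from the quote itself; the comparison counts (Turing-NLG at 17bn, GPT-2 at 1.5bn) and the two-bytes-per-weight storage figure are drawn from outside this text and should be read as assumptions, not as anything The Economist states.

```python
import math

# Parameter counts: 175bn is from the quote; the other figures are
# assumptions taken from published model descriptions, not from the text.
GPT3_PARAMS = 175e9        # GPT-3 (from the quote)
TURING_NLG_PARAMS = 17e9   # Turing-NLG, the largest predecessor (assumption)
GPT2_PARAMS = 1.5e9        # GPT-2 (assumption)

# "An order of magnitude larger than any of its predecessors":
print(f"vs Turing-NLG: {math.log10(GPT3_PARAMS / TURING_NLG_PARAMS):.1f} orders of magnitude")  # ~1.0
print(f"vs GPT-2:      {math.log10(GPT3_PARAMS / GPT2_PARAMS):.1f} orders of magnitude")        # ~2.1

# Memory needed just to hold the weights at 2 bytes per parameter (fp16):
print(f"fp16 weights alone: ~{GPT3_PARAMS * 2 / 1e9:.0f} GB")  # ~350 GB
```

Under these assumptions the arithmetic bears out the quoted claim: measured against the largest prior model, the jump is almost exactly one order of magnitude, and the weights alone run to roughly 350 GB in half precision.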