For more than a decade, companies have bet on a tantalizing rule of thumb: that artificial intelligence systems would keep getting smarter as long as they kept getting bigger. This wasn't merely wishful thinking. In 2017, researchers at the Chinese technology firm Baidu demonstrated that pouring more data and computing power into machine learning algorithms yielded mathematically predictable improvements, regardless of whether the system was designed to recognize images, transcribe speech, or generate language. Noticing the same trend, OpenAI in 2020 coined the term "scaling laws," which has since become a touchstone of the industry.
This thesis prompted AI firms to bet hundreds of millions of dollars on ever-larger computing clusters and datasets. The gamble paid off handsomely, transforming crude text machines into today's articulate chatbots.
But now, that bigger-is-better gospel is being called into question.
Read the complete article at TIME.