It's been a year of supersized AI models.
When OpenAI released GPT-3 in June 2020, the neural network's apparent grasp of language was uncanny. GPT-3 was also monstrous in scale—larger than any other neural network ever built.
But the impact of GPT-3 became even clearer in 2021. This year brought a proliferation of large AI models built by multiple tech firms and top AI labs, many surpassing GPT-3 itself in size and ability. How big can they get, and at what cost?
From MIT Technology Review