2023 was a breakout year for large language models. Simon Willison outlined what we learned over the course of the year:
The most surprising thing we’ve learned about LLMs this year is that they’re actually quite easy to build.
Intuitively, one would expect that systems this powerful would take millions of lines of complex code. Instead, it turns out a few hundred lines of Python is genuinely enough to train a basic version!
What matters most is the training data. You need a lot of data to make these things work, and the quantity and quality of the training data appears to be the most important factor in how good the resulting model is.
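The "few hundred lines of Python" claim is easiest to believe once you see how small a basic statistical language model can be. The sketch below is a hypothetical illustration, not code from Willison's post, and a character-level bigram model is of course far simpler than a real LLM; but it shows the same shape: count patterns in training text, then sample from those counts to generate new text.

```python
import random
from collections import defaultdict


def train_bigram(text):
    """Count character-to-character transitions in the training text."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(text, text[1:]):
        counts[cur][nxt] += 1
    return counts


def generate(model, start, length, seed=0):
    """Sample up to `length` characters, each conditioned on the previous one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        successors = model.get(out[-1])
        if not successors:
            break  # no observed transition from this character
        chars, weights = zip(*successors.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)


model = train_bigram("hello world hello world")
sample = generate(model, "h", 10)
```

Even in this toy, the quote's second point is visible: the code is trivial, and everything interesting about the output is determined by the training text you feed it.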