Many people are struggling to understand what changed in late 2022 to cause so much excitement in AI. In one word, the answer is quantity. We crossed a threshold, and everything has been on fast-forward ever since.

Though there was no single groundbreaking invention in 2022 or in the years immediately preceding it, the AI field experienced a defining moment when results from models like Stable Diffusion and DALL-E 2 confirmed that the approach was working. That validation turned the AI landscape into a hive of activity, and it is best understood through the lens of emergent properties.

Emergent properties are a powerful force in nature and the natural sciences. When many simple constituent parts of a system interact, their sheer number produces complexity, and beyond a certain point, new characteristics appear that are not visible at the level of the individual constituents.

In layperson's terms, an individual atom or molecule does not have a temperature; only when a substantial number of them gather does the concept of temperature emerge. The biological sciences offer even more examples of emergent phenomena than the physical sciences do. A prime example is our brain, where simple neurons and their connections give rise to something vastly complex once their number approaches roughly 86 billion.

The world of Gen AI has mirrored this natural phenomenon. Small research teams toiled away on foundational models using novel techniques introduced in the mid-2010s, and results were middling until a significant shift occurred. Far larger training datasets (measured in tokens) and far larger models (measured in parameters) led to large language models exhibiting surprisingly human-like behavior once they crossed roughly the 100-billion-parameter mark. That validation set off a chain reaction, with models such as Google's PaLM and LaMDA, Meta's Llama, Hugging Face's BLOOM, Adobe's Firefly, and TII's Falcon springing to life after substantial training.

The ramifications of this, a massive innovation achieved without any single new invention, extend across all disciplines. We have discovered that the neural-network building blocks of our foundational models work well as they are: they can be refined and made more efficient, but comprehensive training is the key to their power.

What is most remarkable, and not well understood, is that at the constituent level the programs are relatively simple. And for all the talk of intensive resource requirements, those requirements are relatively modest for any reasonably sized corporation. That is why understanding the emerging competitive landscape is essential. More on that later. For now, let's marvel at the power of quantity and the emergent properties at the heart of the AI revolution.

PS: Some of this author's thoughts on emergent properties appear in this book review: https://bit.ly/3Chi0yw
