So far, AI progress has been driven by the amount of data and compute used during training

[Figure: outputs of Google's Parti model at increasing parameter counts for the prompt "A portrait photo of a kangaroo wearing an orange hoodie and blue sunglasses standing on the grass in front of the Sydney Opera House holding a sign on the chest that says Welcome Friends!"]
Yu, J. et al. (2022) 'Scaling Autoregressive Models for Content-Rich Text-to-Image Generation'. arXiv. Available at: https://doi.org/10.48550/arXiv.2206.10789.
Richard Ngo argues that recent progress in image generation was partly achieved through the development of new architectures and algorithms, e.g. GANs, transformers, and diffusion models. Nevertheless, most of the progress came from scaling relatively simple algorithms with more compute and data, as the figure above illustrates (Parti's output quality improves as its parameter count grows).
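
To make the scaling claim concrete, here is a minimal sketch of the kind of power-law scaling curve Kaplan et al. (2020, 'Scaling Laws for Neural Language Models') fit, where test loss falls smoothly with parameter count. The constants below are their language-model fits, not anything measured for Parti, so treat the numbers purely as an illustration of the shape:

```python
# Minimal sketch of a neural scaling law in the style of Kaplan et al. (2020):
# test loss falls as a smooth power law in parameter count N,
# L(N) = (N_c / N) ** alpha_N.
# NOTE: n_c and alpha_n are Kaplan et al.'s language-model fits; they do not
# transfer directly to text-to-image models like Parti and are used here only
# to illustrate the shape of the curve.

def power_law_loss(n_params: float, n_c: float = 8.8e13, alpha_n: float = 0.076) -> float:
    """Predicted test loss L(N) = (N_c / N) ** alpha_N for N parameters."""
    return (n_c / n_params) ** alpha_n

# The four Parti model sizes from Yu et al. (2022): 350M, 750M, 3B, 20B.
for n in (350e6, 750e6, 3e9, 20e9):
    print(f"N = {n:9.2e} -> predicted loss ~ {power_law_loss(n):.3f}")
```

Each doubling of parameters buys a modest but predictable loss reduction, which is the pattern the Parti figure above shows qualitatively.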
I think this is a general pattern: we mostly rely on more compute and data rather than on better algorithms. But I am confused by this. Didn't improvements in algorithmic efficiency outpace hardware advances by a wide margin?
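
One way to resolve this apparent tension (a back-of-the-envelope sketch, assuming OpenAI's published estimates rather than anything from Ngo's post): algorithmic efficiency did improve quickly, but total compute spending grew even faster, so scale still accounts for most of the gain.

```python
import math

# Back-of-the-envelope comparison, assuming OpenAI's published estimates:
# - "AI and Compute" (2018): training compute for headline results doubled
#   roughly every 3.4 months between 2012 and 2018.
# - "AI and Efficiency" (2020): the compute needed to reach AlexNet-level
#   accuracy fell ~44x between 2012 and 2019 (about 7 years).

def months_per_doubling(total_factor: float, years: float) -> float:
    """Months per doubling implied by a total improvement factor over a period."""
    return 12 * years / math.log2(total_factor)

algo_doubling = months_per_doubling(44, 7)  # ~15.4 months per doubling
compute_doubling = 3.4                      # months per doubling, per OpenAI

print(f"algorithmic efficiency: one doubling every ~{algo_doubling:.1f} months")
print(f"training compute spend: one doubling every ~{compute_doubling} months")

# Compute doublings arrived ~4-5x more often than efficiency doublings in
# this period, so even with rapid algorithmic progress, most of the total
# capability gain came from scaling compute and data.
```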

Source: Ngo, R. (2023) ‘Visualizing the deep learning revolution’, Medium, 18 January. Available at: https://medium.com/@richardcngo/visualizing-the-deep-learning-revolution-722098eb9c5 (Accessed: 9 February 2023).