Even when training on a massive dataset, you are still limited by compute and processing time (good ol' computational complexity), which is why HPC efforts like the Exascale Computing Project were created in 2015, along with additional funding and research into efficient and alternative data structures.
I highly recommend going down the rabbit hole of High Performance/Accelerated Machine Learning.
The eminence of AI is more 'volumetric' than 'out'.
'Out' scales up and out, like a hill or a lift.
- Volumetric is spherical: it presses into the future AND the past (it has already been harvesting history, but it has also created a fire-hosed spigot-interface for the future), and it draws both into its center for evaluation...
They are different.
Scaling has always been about the positive expressions of the X, Y, and Z axes, while ignoring the negative of each, whereas AGI will go in all dimensions...
I don't think this aligns with the Chinchilla scaling law. There is indeed a point at which you can oversaturate a model with data, as well as a point where you don't give it enough. Compute is the constraining factor, and it scales more or less linearly in both directions.
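To make that concrete, here is a minimal sketch of the commonly cited Chinchilla rules of thumb (Hoffmann et al., 2022): training compute C ≈ 6·N·D FLOPs for N parameters and D tokens, with compute-optimal training using roughly 20 tokens per parameter. The function name and the 1e24 FLOP example budget are my own illustrations, and these are approximations rather than the paper's exact fitted exponents.

```python
import math

def chinchilla_optimal(compute_flops: float, tokens_per_param: float = 20.0):
    """Rough compute-optimal split under the Chinchilla rules of thumb.

    Assumptions (heuristics, not exact fits from the paper):
      * training compute C ~= 6 * N * D FLOPs (N params, D tokens)
      * compute-optimal training uses roughly 20 tokens per parameter
    Solving C = 6 * N * (r * N) for N gives N = sqrt(C / (6 * r)).
    """
    n_params = math.sqrt(compute_flops / (6.0 * tokens_per_param))
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

if __name__ == "__main__":
    # Hypothetical budget of 1e24 training FLOPs.
    n, d = chinchilla_optimal(1e24)
    print(f"params ~ {n / 1e9:.1f}B, tokens ~ {d / 1e12:.2f}T")
    # C = 6*N*D is linear in each factor: doubling N at fixed D (or D at
    # fixed N) doubles compute, which is the "linear in both directions"
    # point above.
```

For a 1e24 FLOP budget this works out to roughly a 90B-parameter model trained on about 1.8T tokens, in the same ballpark as Chinchilla itself (70B parameters, 1.4T tokens).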
Yes! That's what I mean by scaling VOLUMETRICALLY: scaling horizontally and vertically are replaced by scaling volumetrically. (Am I coining a term?)
It's going to be an oblate, warping spheroid.
That's what I see down the pipe...
Thoughts, anyone?