There are several studies proposing 2- or 3-layer networks for representing the input-firing curve of neurons (usually hippocampal). Of course, neural networks are universal approximators, so the size of the network determines the fidelity of the reproduction. But it's not clear what the firing actually does, or how much of the complexity in the firing code is redundant versus useful for building AI systems.
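To make the first point concrete, here's a minimal sketch of that kind of fit: a one-hidden-layer network trained by plain gradient descent on a made-up sigmoidal input-firing-rate (f-I) curve. The target curve and all sizes here are illustrative assumptions, not data or architecture from any of those studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed target: a saturating f-I curve (sigmoid of input current).
# This is a stand-in, not real hippocampal data.
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)   # input current (arbitrary units)
y = 1.0 / (1.0 + np.exp(-6.0 * x))               # firing rate (normalized)

H = 16                                           # hidden units (illustrative size)
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(8000):
    h = np.tanh(x @ W1 + b1)                     # hidden layer
    pred = h @ W2 + b2                           # linear readout
    err = pred - y
    # Backprop of mean-squared error through both layers
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"fit MSE: {mse:.5f}")
```

With more hidden units (or a second hidden layer) the fit gets tighter, which is exactly the fidelity-vs-size trade-off mentioned above.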
Even that is not clear. A model like GPT-4 can read an entire book in seconds, and produce an intelligent answer about its content [1]. A human would need at least several hours to perform the same task.
Unlike the capacity of human brains, the capacity of ML models has grown rapidly over the last 10 years. The number of tasks AI cannot do is shrinking fast.
Power consumption is what's relevant to this thread. We're not talking about possible capacity, but about possible efficiency for a given capacity, as implemented in analog vs. digital circuits.
If a "given capacity" is the ability to read books and answer questions about the content, then LLMs are more power efficient (4 kW for 20 seconds beats 20 W for 3 hours). That's on standard power-hungry GPUs (an 8xA100 server consumes about 4 kW). If we switch to analog deep learning accelerators, we gain even better power efficiency. There's simply no chance for brains to compete with electronic devices - as long as we match the brain's "capacity".
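Back-of-envelope check of that comparison, using the thread's own figures (4 kW for 20 s vs. 20 W for 3 h) converted to joules. The figures themselves are rough estimates, not measurements:

```python
# Energy = power (W) * time (s), in joules.
llm_energy_j = 4000 * 20          # 4 kW server for 20 seconds
human_energy_j = 20 * 3 * 3600    # 20 W brain for 3 hours

print(llm_energy_j, human_energy_j)  # 80000 vs 216000 J
```

So on these numbers the LLM uses roughly 80 kJ against the brain's roughly 216 kJ for the same task - about a 2.7x advantage even before any analog hardware.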
OK, so it's clear you are not familiar with the vocabulary of this domain. "Capacity" means something specific (if extremely hard to measure precisely) about the informational throughput of a system, not just "the ability to do high-level task X as judged by a naive human observer".
I urge you to study the things you seem so eager to speculate about more seriously before publishing underinformed opinions in public spaces.
Is that what you mean when you say "capacity"? It doesn't seem very relevant in the context of our discussion. If it's not what you have in mind, I'd appreciate a link to the Wikipedia article on "ML model capacity" or whatever specific term experts use for the concept.