That's pretty crazy to think about, even when you account for overhead and the multimedia assets like images often embedded in PDFs. I remember the old "the Library of Congress fits on this CD-ROM" analogies (which weren't entirely true), but this takes it to a whole new level.
At some point in research, it may become easier to skip the lit review, just do the work, and then, if it turns out someone else already did the same work, compare results for consistency. We may yet get past the reproducibility crisis thanks to the deluge of information. The underlying problem, though: if you couldn't find the related effort (which is what led you to independently replicate it), you may also never find all the duplicate efforts to compare against.
PDF is fairly inefficient compared to formats like DVI (and consider that so many papers are produced with TeX anyway, though figures may be in various image formats).
But 77TB? You could host the whole thing in a shoebox with ten 8TB flash drives, a 10-socket powered USB hub, and a Raspberry Pi. (Nine drives would only give you 72TB; ten gets you 80TB of raw capacity.)
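The back-of-the-envelope math is simple enough to sketch; assuming the 77TB figure and 8TB per drive, the drive count is just a ceiling division:

```python
import math

total_tb = 77   # claimed size of the corpus
drive_tb = 8    # capacity of one flash drive

# ceil(77 / 8) = 10 drives, 80 TB raw -- just enough headroom
drives_needed = math.ceil(total_tb / drive_tb)
print(drives_needed)  # → 10
```

Nine drives (72TB) would fall 5TB short, which is why the shoebox needs that tenth port on the hub.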
Each neuron is unique, though; you can't just replace them. So they are more than just switches and memory: they are part of the brain's "software," as you call it.
>working on a few things and not being particularly well rounded has worked for me in multiple disparate fields
arguably "well-roundedness" is key to major breakthroughs in science & engineering
while most of us work on incremental improvements, the real innovation often happens at the intersection of two or more disparate fields that until then were considered only weakly related, and that does require knowing a bit of everything
the best scientists and engineers I know are "knowledge omnivores" and will digest anything. it does require a well-developed "filtering" capacity though, picking the signal out of a lot of noise, and a sort of information-hoarding mentality