There's quite probably some of that. A quote from J.S. Mill on the distinction between science and technology strikes me as useful:
"One of the strongest reasons for drawing the line of separation clearly and broadly between science and art is the following:—That the principle of classification in science most conveniently follows the classification of causes, while arts must necessarily be classified according to the classification of the effects, the production of which is their appropriate end."
Essays on some unsettled Questions of Political Economy
It's not a dumb question at all. The classification accuracy and "uncertainty" are different, but the explanation depends on what you mean by uncertainty.
Let me try to give one intuitive explanation; if others would like to chime in with something better, by all means.
Let's suppose that you are classifying objects in images - say bananas and oranges, but it could be tumors or anything that you like.
So we train a classifier to predict this, and we find that of 100 classifications on a hold-out set, we get 73 of them correct. You might, quite reasonably, interpret this to mean that if we randomly select a new image of either an orange or a banana, we will have a 0.73 probability of classifying it correctly. (There are actually some subtleties in this interpretation which I'm ignoring, but they aren't so important for the point I want to make.)
Suppose, however, that we draw out an image to feed into our classifier, and we look at it for a moment. Suppose further that this image contains an object that is long, thin, curved, and yellow. We'd expect our classifier to classify it as a banana, and sure enough, it does. Now we draw out another image, except this one has an object that is long, but bent almost completely into a circle, and more orange than yellow. We might still expect our classifier to classify this as a banana, but should the classifier really be as certain about this prediction as it was about the previous one? Intuitively, I would say not. Yet the overall classification accuracy remains unchanged, so accuracy alone tells us nothing about the certainty of this particular prediction.
So uncertainty isn't just the proportion of examples that you classify correctly.
Furthermore, it also isn't exactly equivalent to the class probability produced by your classifier, though I don't think this is the best forum for me to get into the details on that.
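To make the distinction concrete, here's a toy sketch: a hand-weighted logistic model (the features and weights are entirely made up for illustration, not learned from any data) that classifies both a clear banana and an ambiguous one correctly, so they contribute identically to accuracy, yet assigns them quite different probabilities.

```python
import math

# Hypothetical banana-vs-orange model, for illustration only.
# Features: (elongation, curvature, yellowness), each in [0, 1].
# The weights below are invented for this sketch, not learned.
WEIGHTS = (2.0, -1.5, 3.0)
BIAS = -1.0

def p_banana(features):
    """Return the model's probability that the object is a banana."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

# Long, thin, gently curved, and yellow: a confident "banana".
clear = p_banana((0.9, 0.2, 0.9))
# Long but bent almost into a circle, more orange than yellow:
# still classified "banana", but with much less conviction.
ambiguous = p_banana((0.9, 0.9, 0.3))

print(f"clear case:     P(banana) = {clear:.2f}")
print(f"ambiguous case: P(banana) = {ambiguous:.2f}")
# Both probabilities exceed the 0.5 decision threshold, so both
# predictions are "correct" and accuracy can't tell them apart;
# only the per-prediction probabilities reveal the difference.
```

As noted above, the predicted probability is itself only a rough proxy for uncertainty (neural classifiers in particular are often poorly calibrated), but it already shows why a single accuracy number can't capture per-prediction confidence.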
I'll position this up front before I go on: in theory, I think you can learn almost anything on your own. What follows is the reality that I believe holds for most people, based on my own observations and on accounts I have seen, heard, or read about online.
I would argue not programmer, but perhaps computer scientist.
They are fundamentally different things to me. Programming, and its human facilitator, the programmer, can certainly be learned without a degree. I can teach myself to program fairly well in, say, Python, in a few months.
What can't be taught easily, in my opinion, without some university resources (going to college, maybe not strictly necessary, but honestly I think the classroom format has some advantages here) are things like proofs in formal methods, computational geometry, higher-level information theory, quantum computing: whole realms of computer science. Yes, lots and lots of CS departments teach you how to program in particular languages, but the people I find who don't burn out in the long term aren't merely programmers; they have a strong understanding of the discrete mathematics that underpins a lot of our modern systems.
I could go on, but I feel like it's heading into old-grump rant territory.
I do have a bone to pick with this particular article as well:
"I will write separate articles on Data Science Books (I’ve read 127 of those in last six months)"
Unless those books are 20 pages long, you have not read them. Skimmed maybe, but completely read and logically understand the implications of those books? I have to call foul on this.
I personally learned programming on my own, and after about two years of doing it, I went back and started taking some computer science courses in data structures, discrete mathematics, and algorithms, as well as some other topics. I took some coursework through the university I got my undergrad from, but most through local community colleges because they were a tenth of the cost.
In my experience, I do not think you need a degree to be a programmer. You need to have extreme grit and motivation to learn it on your own.
I took the coursework after doing it because trying to learn advanced computer science topics on top of work in my own time simply wasn't working. It's not incredibly fun to learn, dissect and implement algorithms. At least for me it wasn't. Having no one to ask about advanced mathematics also sucked honestly. For those reasons, a quality education or professor is worth their weight in gold.
As someone who came up through universities with the full traditional CS background, and as someone who has hired and been a tech lead over many developers, I can count only one person I know who didn't get a degree who is a great developer. The people with degrees all had to learn a lot after school, as did I, but the one who is self-taught is some kind of savant, I kid you not. And as great a developer as he is, he had some holes in his knowledge that I ran across from time to time.