The cases presented don't give much evidence for the impact of bigger images on conversion rates. Only the first case is a direct comparison of images of different sizes; in the other two, the copy and the overall design have changed significantly as well. In the second case the image is actually the same size: it is the button that has been resized.
In general, though, I do completely agree with the article that the only way to find out what works is to test, test, test.
Examples 2 and 3, as presented in the article, couldn't be more horrendous examples of invalid inference from test results if the author had tried.
It's the cargo cult approach to testing: "let's change everything, and if the revamp increases conversions we'll guess which factor had the most effect".
Testing specific examples alone is not enough. Without testing against a theory, you gain little knowledge from these experiments. I would like to see reports of people testing their beliefs about how web design decisions affect user behaviour, rather than merely finding out which design works better for a given site. Of course this is more time-consuming, but it will also yield more valuable insight.
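Even when only a single element is changed, whether the observed difference in conversion rate is real still needs a statistical check. A minimal sketch (all numbers hypothetical) of a two-proportion z-test, a standard way to compare conversion rates between a control and a variant:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a/n_a: conversions and visitors for the control;
    conv_b/n_b: the same for the variant. Returns (z, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical data: 200/10,000 conversions on control vs 250/10,000 on variant.
z, p = two_proportion_z_test(200, 10_000, 250, 10_000)
```

With these made-up numbers the difference comes out significant at the usual 5% level, but the test only tells you *whether* the variant differs, not *why*, which is exactly where a theory and user feedback come in.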
Also, I think it is important to include feedback from actual users in your test results. How did they feel using the different sites? What do users think about your own concerns (content invisible below the fold, ridiculously big buttons, etc.)?
Without these considerations you are just poking around the mud with a stick.