From the comments, I think readers are generally failing to take an opportunistic attitude toward this article. Dig out useful ideas and ignore what doesn't help.
Yes, there are situations where working too quickly will bite you, so you can't always do it. But the key idea here is that by working quickly you reduce your expectations of how costly a new effort is. Then you'll actually start it. Inertia is a powerful adversary. Any weapon you can put on your belt to fight it is worthwhile. I'm keeping this idea.
I don't think that is accurate without an additional assumption about the reason you don't want to do something.
If I prioritize quick responses and turn arounds to requests that are important, provide value, and allow me to improve important skills, that is not incompetence.
If you de-prioritize items, regardless of their importance and value, simply because you don't want to do them, that could quite conceivably be called incompetence. However, that rests on the assumption that you don't want to be competent.
I generally want to be competent. When there are things I don't want to do, it is usually because they are part of a bad, inefficient process that I haven't managed to change yet.
... and if you consider that bad, inefficient process beyond your influence in the foreseeable future, then intentional incompetence is the good strategy to avoid doing it. Such situations easily arise once you work within larger organisations.
The phrase may put you off but I think the sentiment is sound. Too many people dismiss something/somebody entirely because of one flaw. I think it's better to try to find value in everything; even if it's 99% shit, you can still learn from that 1%.
Or you could find something that's 95% shit and learn from its 5%. You can't learn everything, so why would you spend your time learning the 1% good things and dealing with 99% shit? There are too many easy ways to find value out there to be spending your time panning for scraps in sludge.
What's your search cost? If you have to hit "back" on 5 articles that were 99% shit to find the one that's 95% shit, you were better off just taking your 1% tidbits.
(In practice, I've found that anything that's >50% shit or so isn't worth wasting time on. Stuff that's >50% good, informative stuff is usually qualitatively different from stuff that's 5% good informative stuff; go where the former is.)
I think that's really a matter of what level of mastery you are looking to achieve, and how broadly you intend to achieve it. If you're looking to be the best of the best in one single, narrow subject, you might quickly run out of <50% shit resources and need to dig into some really bad stuff to find little, rare bits of good knowledge.
If you want to be the best of the best in one single, narrow subject, then listening to what other people think is useless. Rely on direct experience instead; perform your own experiments, test out your own ideas. That's what it means to be the "best of the best": you're the one injecting new data into the ecosystem.
Well, I was referring more to the situation where you're already reading something, or finished reading it. Obviously there is an opportunity cost that you could have sought out something better to read, but now that you're there - take what you can from it. In my experience a lot of people will dismiss an entire article based on one wrong fact or crazy opinion.
And when extended to people - just because someone says something stupid or wild doesn't mean everything they say is that way.
I have to agree. My personal experience suggests that if you want to get good at something, keep doing lots of it and aim for speed rather than perfection. You'll end up being speedy and closer to perfection than if you had just aimed for perfection.
But one good counter example does come to mind - designing a database schema.
I'm trying to wrestle with what the difference might be. I think Markov processes, i.e. processes where the future state depends on a prior state, are relevant.
Maybe we could say tasks are either "strongly Markovian", meaning how well we do them now will influence our future work, so we should really think them through (e.g. designing a schema); "weakly Markovian", in which case there may be some future impact but not much, so we exercise caution but "done is better than perfect"; or "non-Markovian" - e.g. throwing out the garbage, cooking dinner, most emails - where getting the thing done is simply pass/fail, so we just have to do it and quality is relatively unimportant.
I think what I'm saying is, most tasks will be weakly Markovian or non-Markovian, so we should "move fast and break things", but every now and then there'll be something we need to do that is strongly Markovian. For such things we should be prepared to take a step back and give ourselves a little extra time, so things don't blow up further down the track.
In general I disagree. I learned far more in 6 months at a "high quality" shop working excruciatingly slow than I did in the 5 years at the previous job where I banged out code as fast as possible.
I feel like when you focus on speed you very quickly learn just enough to get the task done quickly. And then your progress stalls.
I find that when I'm doing any work that involves modifying existing code, there's a huge difference in how quickly I can work when I'm touching code that was originally written by people with different work styles. Usually I can sail through the more methodically-written stuff. When the time comes to make changes to code written by the people most fond of uttering, "The perfect is the enemy of the good," though, progress slows to a crawl. Adding new features without introducing new defects to that code is like trench warfare.
That said, much like the article says, the fast workers did tend to get assigned more new tasks. I think maybe this was a huge win for them. From a wider perspective, though, it's a bit tragic because it results in this inexorable downward spiral in terms of code quality. Of course that worked out well for them too because more defects meant more opportunities for them to cape up, swoop in, and save the day. I can't remember who it was who said, "Beware of your firefighters, they are probably your chief arsonists," but there's a lot of truth in that statement.
In that boat right now, but instead of 5 years in, 1.5 years in. I've been reading books on design patterns (we barely use them, which should give you an idea of how bad it is here) and just in general trying to be better.
Good on you. I left a place a little while back that taught me all the things I now know to look out for when I eventually start my own business or interview at new places.
All of my car/motorcycle track instructors were big believers in "Slow is smooth, and smooth is fast".
Basically go slow and work on the correct form/line/being smooth/etc... Once you have that down, you are ready to go fast and you will be better/faster than someone who hasn't gotten the flow down at a manageable speed first.
I used to teach guitar and always had to tell students to practice it slowly, but perfectly first, and only try to speed things up once they had the basics mastered.
Speed is impressive, but you literally have to learn to walk before you can run!
Measure your typing speed. Aim for 70-80 WPM. If you reach that speed then it's a) enough and b) it doesn't matter at all if you "touch type" properly or have invented your own typing system.
The advantage of touch typing is on a good keyboard/desk/etc you can do it 10 hours a day for 30 years without wrecking your hands. That said, most programmers don't actually type that much so it's not nearly as important.
I bemoan the death (or near-extinction) of text-input games. I learned entirely on my own how to touch type by playing classic Sierra adventure games and spending a ton of time in DOS. I never would have done that if I had gotten into computing a few years later with the transition to mouse-based UIs.
Definitely same here. I feel I was fortunate that my parents were poor graduate students. As a result, I learned my formative computing skills on DOS 6.2/Windows 3.1 at a time when respectable citizens were packing 486DX4s - Pentium 90s.
It depends on your goal. Even with database schemas, iterating quickly optimizes for learning and skill improving.
Aiming for quality optimizes for creating a quality product faster. Still, taking some time to hone your skills might be better than trying to design the perfect system on your first try.
I'd say it depends on your current skill level rather than on a task property like "strongly Markovian". Learning has diminishing returns. At some point more learning is not worth it anymore.
Don't just bang frenetically on walls, either. It's very, very good to be able to reach deep understanding without hyperfast iterations. There will be times when you don't have that luxury. Make haste slowly; balance.
Do things fast when the cost of doing them wrong is low. If you're learning something, or doing something with low risk, then doing it as fast as possible is a really good idea (for all the reasons set out in the article).
But...
Do things slowly if the cost of getting it wrong is so high that you'll have no opportunity to try again. For example, don't pack a parachute quickly.
The key is recognising that there's more than one way to approach something; selecting the right method for the problem at hand is the winning strategy.
I think the concept you're trying to describe is Reversible Decisions (unfortunately I can't recall who coined that phrase).
The idea is that any decision that is straightforward or easy to change should not be sweated over for any appreciable amount of time, and in fact can be deferred indefinitely (deciding not to decide).
Meanwhile, any decision or indecision that will have long-term repercussions should be considered at length, and promptly rather than deferred.
I mention indecision here deliberately, because things like deciding not to put authentication into your application in version 1 count as decisions, ones with far-reaching and usually fairly aggravating (IME at least) long-term effects on the project. Others would include thread safety, the ability to cluster or shard your design, multilingual support, audit trails, etc. If you are the only solution in the space then you often have time to correct these mistakes. But if one of your competitors figures these things out before you, you can find yourself in real trouble (one of the aspects of the Innovator's Dilemma).
Let's say that, over a lifetime, the accident rate due to packing a parachute quickly is 1/10000, and the fatality rate for the slow-packing group is 1/1000000. Even though the fast group faces greater danger than the people in the slow group (or people sitting at home), other than the few who have bad luck, the rest of them will practice far more than the other group, jump more times, go to more places, and have bigger opportunities to become a world champion of parachute packing or whatever parachuting sport.
Sure, a few will be forgotten by the world.
The victors we see in the world are probably the people in the fast group who are still alive, and who have produced lots of results because of their speed and the fact that they're still alive. Someone in that group will pay a huge price, but it won't necessarily be you or any particular person.
You're attacking this analogy with made-up numbers and wild logical leaps. What is the real risk increment to packing a chute hastily, and does the real number help or hurt your position? Now make the stakes really high. Also consider the possibility that your choices have externalities, and others around you may not want to share their jumps with someone they perceive to be that reckless idiot who's going to get himself killed.
You didn't give specific numbers on how often someone can jump, but consider the realistic bounds on how much more often a person who packs hastily can skydive. How often is this person jumping? Are we in a scenario where the amount of time it takes to pack a parachute is really the limiting factor, to the point where the hasty packers can jump "way more?" Seems like what that would mean in concrete terms is that as soon as you hit the ground you're going to hit the john, re-pack your chute, and immediately be back in the plane. Is that a realistic scenario?
Hostile to the idea, not the person. With some reason - there are a lot of pernicious myths that survive purely on the false mimesis you can generate by using made-up numbers. See: Politics.
As a skydiver - slow and careful is better, but with that, one becomes quicker anyway, as one does when practising a lot. The number of jumps one can make in a day is limited by aircraft availability rather than packing speed. Of course there are 'professional' packers (say, in the army) who are not constrained by aircraft and might have a week or two to repack a couple of hundred chutes - so different numbers likely apply there. Further, one's reserve chute needs to be repacked yearly (or so) - and that's the one chute where you do want the packer (it needs a certified packer) to be slow and careful. Thinking about it, aviation (and space) seems to be one area where care comes before speed. So dude, I think you are right in calling this out. [edited]
I'm specifically saying that "You're attacking this analogy with made-up numbers and wild logical leaps" is too harsh a way to begin a comment here - it's not civil. The dude just said some hypothetical thoughts, you could have responded "Unfortunately I feel your numbers are not drawn from the real-world, and I think some of your conclusions take real leaps of logic. Specifically..."
but hey it's just me. this place is usually pretty civil - it's in the rules.
Yeah I thought we were talking hypothetically and in a very general sense (hence the obviously made up numbers to merely indicate some scales) about an inherently very general issue.
To be honest, if packing a parachute takes an hour longer and increases the chance of living by 100x, I'd say that's worthwhile. But then, I also wouldn't call 1/10000 an especially high risk. We face those sorts of odds just driving a car and they don't put many of us off[1].
If the probability of death from packing a parachute quickly were 1/100 then I think your argument would break down somewhat. Like I said, you should spend the appropriate time on something depending on the costs and risks associated with it. "Do everything fast" is wrong, but so is "Do everything slow".
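To make the trade-off concrete, here's a rough sketch in Python using the same made-up rates from earlier in this thread (1/10000 for fast packing, 1/1000000 for careful packing) - purely illustrative numbers, not real skydiving statistics:

    # Hypothetical per-jump fatality rates, taken from the made-up numbers above.
    FAST_PACK_RISK = 1 / 10_000      # packing quickly
    SLOW_PACK_RISK = 1 / 1_000_000   # packing carefully

    def survival_probability(per_jump_risk, jumps):
        """Probability of surviving `jumps` independent jumps."""
        return (1 - per_jump_risk) ** jumps

    for jumps in (100, 1_000, 10_000):
        fast = survival_probability(FAST_PACK_RISK, jumps)
        slow = survival_probability(SLOW_PACK_RISK, jumps)
        print(f"{jumps:>6} jumps: fast packer {fast:.4f}, careful packer {slow:.6f}")

With these toy numbers the difference is negligible over a hundred jumps and dramatic over ten thousand, which is the argument cutting both ways: speed buys more repetitions, but the per-repetition risk compounds.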
Speaking as one who is laid back in a personality sense, I could imagine nothing worse than a life lived non-stop frenetically under the imagined need to speed up all activity in the name of productivity.
Take time to pause, reflect, think, and enjoy your life experiences.
Even when it comes to work-related activity, there are times and places to do things quickly and there are times and places to do them deliberately. If nothing else, just for sanity's sake, it is important to pace yourself through a day, through a week, through a month, through a year, through a career. Even if speed were exactly correlated with maximum productivity and effectiveness, it is vital that you have times when you simply feel you can enjoy being at work, being with people, and doing your activities, without feeling that you have to work like a machine that will be evaluated by engineering standards only.
Even more, we all have different personalities and some people do not work well if they feel they are forced to work at some arbitrarily quick pace as opposed to one that suits their style.
Finally, even speed as a factor can vary with your activities as you develop skills in those activities. When I began years ago to try to write things, I was agonizingly slow about the process. I felt I had a quick mind but the process of getting what was in my mind down on paper made me feel plain stupid. Whatever I did, it would never come out right. Through a very tedious process of writing and re-writing, it would eventually become passable and that was it. It might take me a week in such cases to write something expository of modest length. Yet, realizing this was a weakness, I worked damned hard to fix it and, through a process of many years and countless hours of effort, I reached a breakthrough point where I could do "walls of text" (in the phrasing of some) in 10-15 minutes and produce quality stuff. I now write very quickly and effectively. But had I tried to do so years ago with my limited abilities at that time, all I would have produced was hash.
So, lighten up and do it in your own style. Yes, speed does matter. But it is only one of many factors that will determine how you do at work or, even more important, at life itself. By all means, apply yourself well - be diligent, hard-working, etc. but do it fast or slow as suits your needs and your own style. At least that is how I view it.
> a life lived non-stop frenetically under the imagined need to speed up all activity in the name of productivity.
I don't think that is quite what the article is suggesting. I think it was reminding us to think about the effects of the speed at which you accomplish things. If there is a behavior you are trying to encourage in others (e.g. requesting a code review), focusing on responding quickly can be crucial to helping foster that behavior. Similarly, if there is a communication channel (e.g. Slack vs. email) that is not being adopted, holding yourself back from quick responses to emails while responding quickly to Slack messages will help foster the transition.
This doesn't mean that you need to do everything as quickly as possible. It does mean that if there is something that you do slowly that you want to improve on, it may be helpful to be aware of the additional mental cost you associate with the activity so that you can compensate for it. Similarly, if you are trying to improve the quality of your writing, focusing on improving the speed of your writing (or even just the speed of your typing) while maintaining the same quality might pay off faster than just focusing on improving quality.
One of the tenets of Extreme Programming is, "Quit when you're tired." Why? Because it's faster.
It's not faster today - if you kept working, you'd presumably get more than zero done. But you'd also create more bugs, and you'd come back more tired tomorrow. Coding is not an assembly line; your brain needs to be fresh.
Taking time to pause, reflect,and think is the same. It's slower in the next minute, maybe in the next hour. But stopping to think and realizing what is the right thing to do can save you days of waste.
My first boss said, "You need to learn when the most productive thing you can do is go look out the window for 15 minutes." After 30 years, it's still good advice.
I'm ignoring your point about work-life balance here. All I'm saying is, too much emphasis on speed slows you down, even only considering work.
Perhaps this applies more to shooting than software development, but...
Slow is smooth. Smooth is fast.
If I take 20 minutes more to code a module because I'm thinking about it, but spend 30 minutes less debugging problems with the module, that's fast.
If I take a day to respond to an email, but the person I'm conversing with gets the info they need, avoiding three more days of back and forth, that's fast.
If I take a week longer to iterate through a project idea, but nail the implementation, then I can know that I'm pivoting because the idea was wrong, not the implementation.
How do mere mortals become wizards to their peers? They take the time to read documentation and code and really understand the tools that they are working on. What software projects have stood the test of time? The ones that were painstakingly thought out and progressed slowly. Of course there's a balance, and our current system of financing software projects rewards fast and loose, and there will always be a place for fast prototyping to help understand the problem, but to create something truly amazing takes a lot of time.
"What software projects have stood the test of time? The ones that were painstakingly thought out and progressed slowly."
I kinda think that the opposite is true, or at least as true as your statement (meaning that at least as much half-baked stuff rushed out the door 'stood the test of time' as did stuff that took a lot of time to ship.) Unix, C, Windows, PHP, JavaScript...
To offer an example of a project that has stood the test of time far longer than the ones you mentioned, consider Fortran. The first compiler was released in 1957, several years after it was first proposed. The specification took a couple of years to complete. This was over 60 years ago, and even now they are releasing an update to Fortran (Fortran 2015).
Even in the examples you mentioned, Unix, C, and JavaScript are all run through standards bodies now. It took years for C11 to be finalized. It's been years since ES6 started development. PHP 7 has taken a few years (and skipped a version number before even being released).
Windows, depending on who you're talking to, can be a good example of half-baked, or an example of why half-baked is terrible, with regards to Windows 8 and Windows 10 releases.
This feels like a classic example of survivorship bias. Based on the nature of computing resources in the 1950's and academic culture in general, it's likely that all language/compiler projects of the era were planned deliberately and slowly. One of them survived.
So all projects that have "stood the test of time" for that length of time have that attribute. Also having that attribute are all the projects of that era that failed miserably.
Maybe you and the parent comment are both right? Release the first few versions very fast (even if they are not good), and if the project is successful, then slow down, redesign, etc.? Examples include PHP and JS.
I think there's a big difference between working quickly and rushing a project. Doing things thoroughly and correctly is of course well worth it; you can complete that work as efficiently as possible without cutting corners or building half-baked products.
In addition to that, I like to try to answer every question I can at work, even if I have to just go Google it myself. Telling someone you don't know and they should just Google it is robbing yourself of an opportunity to 1) learn it yourself, and 2) explain it to someone else (which is a GREAT way of making sure you really understand it).
Plus, people seem to like it when they ask something, and you help research through it with them. They come back to you again, which gives you another free opportunity to share someone else's learning, and you quickly turn into that person who either knows everything, or knows where to find out.
I agree with you completely here, but you definitely have to be careful, particularly as you become more knowledgeable. Some developers get into the habit of simply asking every time they can't find an answer, or giving up after only a few minutes trying, without realizing that the searching builds a better foundation than the answer many times.
My general rule has been - always ask what they've tried first. If it seems a sincere effort has been put into it so far, by all means help out. It may be something you know immediately, but there is value in teaching people to learn for themselves.
For sure. If I'm dealing with a junior engineer, I take a much more hands-off approach. It ends up taking longer, but that's ok, because usually, the net long-term benefit of that engineer getting practice at self-directed research is much higher than the output of the task itself.
On the other hand, if I'm dealing with a "senior" engineer who's doing that, I'm actually less worried about whether there's been sincere effort. It's just the other side of the same coin: They've offered their learning opportunity to me, and I'm happy to take advantage of it.
Conversely, there's an easy-to-fall-into bad habit of googling up an answer every time somebody on IRC asks a question, rather than waiting to see if somebody else is actually knowledgeable about the area, or doing the questioner the courtesy of assuming they're capable of googling for themselves...
* Momentum - To stick with something and finish it, you need to use momentum. Lacking momentum, projects languish. Quoting author Steven Pressfield: "Second only to habit, momentum is a writer’s (or artist’s or entrepreneur’s) mightiest ally in the struggle against Resistance."
* Waiting is painful. That's the point behind the examples in the middle section (waiting for an email reply, waiting for Google, waiting for an employee to finish a task). One thing this article makes me more aware of is the concept of getting impatient with yourself. Part of you is the boss or client that wants things accomplished, and it is sizing up the worker part of you, wondering if it assigns a task whether that task will get done quickly or require a lot of waiting, and there's a relationship to manage there. (Perhaps this dynamic underpins the phenomenon of momentum?)
* Quantity Always Trumps Quality - There's a blog post by Jeff Atwood with this title. The point is, if you want to get better at something, do a lot, and don't worry about quality while you're practicing.
Focusing on "speed" is probably a good mental trick for a couple reasons. First, it validates and acknowledges that we hate to wait. Second, it helps overcome perfectionism and the tendency to think and judge instead of doing and creating, by giving us something to measure that is not about quality but is instead correlated with action and progress.
> Quantity Always Trumps Quality - There's a blog post by Jeff Atwood with this title. The point is, if you want to get better at something, do a lot, and don't worry about quality while you're practicing.
I wholeheartedly disagree. My martial arts instructor had a saying: "Practice doesn't make perfect. Perfect practice makes perfect." I've found that to be true. After all, if you practice bad technique, how do you think you're going to perform in real situations?
Not to mention, you usually practice things in a controlled environment where you are evaluating your own technique. That's often not true in real situations, where your focus needs to be split on many different things. If you want to execute well in a real environment, good technique needs to be second nature so you don't have to think about it. You just do it. Being able to do that requires a large amount of good practice.
"Quality" here means the quality of your result. If you are learning to paint/sing/code, it means how good the painting/singing/code is. The advice is to not "worry" about the quality, in the sense that if you are not producing good quality output, that does not mean you are doing anything wrong. When you are just learning something, the output will not be good quality! You need to produce a lot of crap results, and that makes most people uncomfortable.
I can relate what you're saying to my experience taking singing lessons, though even there, I find that making progress is all about turning off the inner judge while you practice. All you need to practice something is a bit of intent and a bit of awareness; it's not important that you "worry" about the quality of what you are doing, per se.
Speed matters and there are more ways to make things faster other than doing them often.
1. If you are coding, ensure that your debugging environment is good (the simplest example: if you write HTML, how long does it take to see the output, or does it auto-refresh on a second monitor when you change the file?).
2. If you are building a project, how long does it take to deploy? Do you have CI with a one-button push to the production environment? If you do, you'll deploy faster and more often because there is no overhead. If not, you won't want to deploy often, due to the overhead of each deploy.
etc.
The most important lesson I've learned about speed is: "design your environment and builds for speed and remove the repetitiveness", and then you can be fast. You can apply this rule to many things. Blogging: ensure that your blog tool makes it easy (how easy is it to link stuff or find images, is it really necessary to add an image to every post, do you upload via FTP or just copy & paste an image into the editor, and what happens if it crashes - do you lose data or does it just recover?)
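As a sketch of what I mean by designing the environment for speed (hypothetical directory and command - adjust to your own project), here's a tiny Python watcher that re-runs a build or test command whenever a source file changes, so feedback arrives without you doing anything:

    import subprocess
    import time
    from pathlib import Path

    WATCH_DIR = Path("src")        # hypothetical source directory
    COMMAND = ["make", "test"]     # whatever gives you fast feedback

    def snapshot():
        """Map every watched file to its last-modified time."""
        return {p: p.stat().st_mtime for p in WATCH_DIR.rglob("*") if p.is_file()}

    last = snapshot()
    while True:
        time.sleep(1)
        current = snapshot()
        if current != last:        # something changed: rebuild immediately
            last = current
            subprocess.run(COMMAND)

Dedicated tools (entr, watchman, editor auto-refresh plugins) do this better, but even a throwaway loop like this removes most of the "switch window, rerun, wait" overhead.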
Reminds me of Moolenaar's (the author of Vim) talk on the seven habits of effective text editing 2.0; the core idea is essentially (my current interpretation): awareness - look for repetition; repetitions are candidates for automation and/or for being done more effectively. Rule of three: if you do something (anything) three times, figure out whether it can be done more efficiently. In Vim (or any other editor) an example might be shifting the indentation of code: maybe you only remember how to do it a line at a time; figure out how to do it on blocks of code; then figure out how to do it on a whole file, or a whole project.
> I’ve noticed that if I respond to people’s emails quickly, they send me more emails.
An alternative explanation would be that if you don't take your time to understand people's mail and just rush to answer them as quickly as possible, things that would take two mails to communicate now end up being a thread of ten mails, two phone calls and an in-person meeting.
Nothing is more infuriating than a person that replies 30 seconds later with a message that suggests they didn't read past the first sentence.
I don't know the author or his ways of responding to emails, but in my experience the above often applies to people that value speed above all else.
I also don't respect people that always respond to email immediately because it makes me think they don't have anything important enough to work on that requires unbroken concentration.
I see a lot of responses that seem to reflect what I consider a misunderstanding of the real advantage of speed.
It's not to get the same amount of work done in a shorter time (the management fallacy referenced in some of the comments).
The point of speed is to increase the number of feedback opportunities. Each feedback datum allows for slight course corrections / confirmation of original hypothesis.
By analogy, think of it like sample size. If you accept time as a primary constraint, then faster iterations (even if you accomplish less!) tend to give you more samples. More samples mean less variance.
Making decisions on a better model (less variance) is very appealing to me.
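Here's a minimal simulation of that sample-size point (made-up quantities, just to show the shape of it): each "sample" is one noisy piece of feedback about how an idea is really doing, and the more of them each iteration collects, the tighter the estimate.

    import random
    import statistics

    random.seed(0)
    TRUE_VALUE = 0.3   # the "real" effect you're trying to learn about (hypothetical)
    NOISE = 1.0        # how noisy each individual piece of feedback is

    def estimate(n_samples):
        """Average of n noisy observations of the true value."""
        return statistics.mean(random.gauss(TRUE_VALUE, NOISE) for _ in range(n_samples))

    for n in (5, 50, 500):
        estimates = [estimate(n) for _ in range(1000)]
        print(f"{n:>3} samples per iteration: spread of estimates = {statistics.stdev(estimates):.3f}")

The spread shrinks roughly with the square root of the sample count, which is why faster iteration isn't just more output: it's better-informed course corrections.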
This article reminds me of one of my all-time favorite articles. Quantity Always Trumps Quality by Jeff Atwood [1]. There is a bunch of commentary on it on Less Wrong [2], some of which is interesting. Also an interesting parallel to a Paul Graham post [3]:
> I was taught in college that one ought to figure out a program completely on paper before even going near a computer. I found that I did not program this way. I found that I liked to program sitting in front of a computer, not a piece of paper. Worse still, instead of patiently writing out a complete program and assuring myself it was correct, I tended to just spew out code that was hopelessly broken, and gradually beat it into shape. Debugging, I was taught, was a kind of final pass where you caught typos and oversights. The way I worked, it seemed like programming consisted of debugging.
> For a long time I felt bad about this, just as I once felt bad that I didn't hold my pencil the way they taught me to in elementary school. If I had only looked over at the other makers, the painters or the architects, I would have realized that there was a name for what I was doing: sketching. As far as I can tell, the way they taught me to program in college was all wrong. You should figure out programs as you're writing them, just as writers and painters and architects do.
Before anything, the obsession with speed seems like a management fantasy that tries to squeeze more out of workers in less time. We shouldn't forget that we're all human, and there are myths and facts related to working fast.
Working "fast" is tricky. The ironic conclusion part of the article is an example to that. People will miss the point and screw up more while trying to be "faster". However if a task becomes more automatic, it will become faster, which means that actually doing something "fast" would consume less energy, since it is more or less automated and unconscious, while "trying" to be fast would consume more energy.
The key here seems to be to just do something "a lot", and it will become faster by itself over time by becoming more and more "automated". But take your time, and stop worrying about speed. Speed is just pressure, which will bring more pressure as you do stuff faster. Don't create unrealistic pressure.
Another key is garbage collection. Just get stuff "done", and get rid of the old tasks, or treat your long to-do items as "later" lists. If something really matters, you will do it anyway. You won't even need a to-do list to keep track of your "important tasks".
Ok, I will hold up my hand: I'm a software manager, and nothing frustrates me more than developers trying to do everything at breakneck speed and just plain getting it wrong. They either fail to read or understand the requirements in the rush to get started, or they rapidly push out a pile of shit and feign surprise when it repeatedly gets rejected by QA. My best guys are the ones who take the time to read the specs and talk to the stakeholders about what they want, and who spend time making sure that what they produce will pass muster. Those guys are golden; they are worth 10 of the immature speed freaks.
The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly.
You need to self-reflect after each iteration, mind you, to make sure that you learn each time you do the thing. But, this is necessary for skill progress whether you're going fast or slow.
(And, having to make the corrections can be as informative or more so, depending on how you're able to reflect on the "why" you're refining or fixing the thing).
"The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly."
Yes, but that's frequently not the case. Developers at "sweat shops" tend to write far more code, and work many more hours, than the average dev at a top tier software house and yet they are usually much worse devs.
That's largely (IMO) because those sweat-shop developers started out as much, much worse devs, whose only viable option was a sweat-shop.
The claim is that repetition and a short feedback/learning cycle will improve a developer more quickly than a long cycle. It doesn't say that a poor developer with a fast cycle will out-learn a different, stronger developer with a slow cycle.
>> You need to self-reflect after each iteration, mind you, to make sure that you learn each time you do the thing. But, this is necessary for skill progress whether you're going fast or slow.
I feel a tendency towards perfectionism is implied by the tendency toward slowness. Working quickly could balance out the most self-defeating aspects of perfectionism.
Entrepreneurs would say that speed is the signal of mature markets: more and more effort for less and less return. Mobile games were very profitable in 2009-10 (low effort, high return), balanced in 2012 (speed was important at that stage), unbeatable and unbearable as a business in 2015 (however fast you are, you're just playing a lottery).
Thinking slowly doesn't imply working slowly. Thinking slowly means making a plan, doing some research, before working. Each one of these can be done fast.
- You can only hold so many details in your head at once, and you can only sustain that collection for a limited duration. Holding those details there can be crucial for doing good work and the faster you work the more you make use of them.
- Doing good work often requires a lot of experimentation and iteration. In a number of circumstances this may only be practical if you can iterate fast enough.
I am reminded of the story of how Leonardo da Vinci took three years to paint The Last Supper and produced only a small number of paintings, generally considered masterpieces, during his life.
Working quickly is important because it's the better method to survive and thrive. Quickly producing 100 deliverables with quality ranging from shit to pretty good will probably, on average, beat carefully crafting one deliverable in the same time frame.
Speed lets you try many alternatives, experiment with many different even opposite options, and draw out creativeness.
Another possible reason is that speed is the strength of the younger generations. In human history, if newcomers want to beat the current authorities (in business or politics), who have already mastered the intricacies of the current game, they have to propose and experiment with a large quantity of new alternatives, new rules of new games, even though most of the experiments might have low-quality results judged by the established rules. But it's the better way to compete and survive.
The author rests his entire argument on allusions to "the mind" that are untested and impossible to test. For example, "If you work quickly, the cost of doing something new will seem lower in your mind." I'm not sure how we get into someone's mind to measure the "cost of doing something."
We could not confirm or disconfirm this as truth, say in the context of an experiment. This phenomenon is better described by the concept of immediacy of reinforcement. As one decreases the delay to reinforcement, the strength of behavior maintained by that reinforcer increases [0].
Strict behaviorism? I'm advocating for the experimental analysis of behavior (sometimes referred to as radical behaviorism). When applied to practical human problems, it is referred to as applied behavior analysis.
The study I cited provides support for the author's conclusion. However, his description is flawed because it doesn't provide means for prediction and control of behavior.
It's a blog post - an opinion piece - not a scientific paper.
Do you go around criticizing all opinions that refer to people's thought processes as 'flawed because it doesn't provide means for prediction and control of behavior'?
I understand that this is just a blog post. This post provided me an opportunity to spread the word about behavior analysis. If the author is advocating for behavior change (i.e., "doing things faster"), then we should speak in terms that have grounding in a science of behavior. That doesn't mean you have to conduct an experiment, but base your argument in science. If I were to make claims about how a particular piece of software worked or some law of physics that were untestable, I think people would have something critical to say (especially in HN comments). Why should we treat human behavior differently?
"What goes on in the mind" is behavior. It can be studied scientifically but one can run into the "private event" problem. That is, what goes on in the mind is a private event only observable to the subject. In many psychological experiments, what goes on in the mind is inferred from observable behavior. This is problematic.
In behavior analysis all behavior is considered the subject matter, including private events. So the answer to your question is: No, I'm not saying that.
It seems you think that people should only use behavior analysis when talking about the mind.
I wish you luck in persuading people to accept this philosophy. You may have an uphill struggle convincing people that it is comprehensive enough to replace all the other ways humans have thought about their experiences to date.
>"It seems you think that people should only use behavior analysis when talking about the mind."
That's not what I think. See the difficulty in inferring someone else's thoughts? ;)
If we are just theorizing or talking about what the mind is and so on, then this belongs in the realm of philosophy.
Behavior analysis is alive and well on the psychological scene. APA's lifetime achievement award this year went to an applied behavior analyst. If behaving is doing and behavior is the subject matter of behavior analysis, that's pretty comprehensive.
It may not be what you think, but you continue to argue (condescendingly) that behavior analysis is the only valid approach to any topic concerning human behavior. Now you do so by an appeal to authority.
Have you considered the possibility that some philosophy might actually be useful in this domain?
[incidentally: I have nothing against behavioral analysis in itself - just the claim that thinking about what goes on in the mind should be excluded in its favor]
My bad for coming off as condescending. That wasn't my intention. Philosophy is useful when talking about or discussing things. The problem with philosophy though is it doesn't allow one to predict and control behavior. Thus, if practical behavior change is your goal then behavior analysis is appropriate.
Prediction and control is what science rests on whether we're talking about behavior or physics. You can't prove that an independent variable (IV) caused a change in a dependent variable (e.g., behavior) unless you can predict and control it by systematically manipulating the IV while observing changes in the DV.
However, this is way beyond the scope of the original blog post or my original comment. In my first comment, I offered an alternative description of a phenomenon that the author described. You suggested that I was advocating for "strict behaviorism." That wasn't the case, so I clarified. At this point, I'm not sure what the purpose of this discussion is.
The purpose is to point out that people sharing information about how they think about things is a valid way of influencing behavior. You are simply wrong to dismiss that.
I agree that humans influencing each other through talking about how they think about things is a hard phenomenon to reduce to the kind of science that you are advocating, but that is a limitation of your preferred methods, and it's inappropriate to dismiss phenomena just because you don't have a good way to understand them.
> "The purpose is to point out that people sharing information about how they think about things is a valid a way of influencing behavior. You are simply wrong to dismiss that."
I agree that people sharing information about how they think about things may INFLUENCE behavior. In your previous comment you used the term "cause." These are very different words, especially when we are talking about science. It's not clear what you think I dismissed.
> "inappropriate to dismiss phenomena just because you don't have a good way to understand them."
I never suggested that we dismiss phenomena. If you read my original comment, I agreed with the author's general premise but I offered an alternative explanation: "This phenomenon is better described by the concept of immediacy of reinforcement. As one decreases the delay to reinforcement, the strength of behavior maintained by that reinforcer increases."
You are incorrect. I did not dismiss the author's reasoning. I said: "We could not confirm or disconfirm this as truth, say in the context of an experiment." The author isn't running an experiment, he's just talking about how he feels when he does things quickly. So what's your point?
My point in the original comment about the concept of immediacy of reinforcement is that it is a broader, evidence-based concept that encompasses what the author described. I thought HN users might get value from such an explanation as they could apply it to more aspects of their lives than just "doing things quickly." It seems as though your purpose is to argue with me over things I didn't say or ideas I don't hold, which isn't productive or meaningful to the HN community.
First, that's not the reply you were talking about when you stated that I was being dismissive above. Yet, your interpretation is still wrong. Second, except for the first sentence I didn't even use the term "behavior analysis" and instead used the term "science." Nowhere in that reply did I suggest that "discussions about behavior that do not solely use behavior analysis" should be dismissed. Throughout this discussion, you have attempted to mischaracterize my statements and your most recent reply is another example. Thus, I'm not sure what your point is other than attempting to build a strawman and blow it down.
Now you are being intellectually dishonest. You didn't use the term behavior analysis in that specific reply. True, you used the word science, but this is irrelevant to the fact that you were dismissive.
Attempting to evade this through being pedantic rather supports my point.
I'm not being intellectually dishonest nor am I being pedantic. That's kind of offensive. Nowhere in my post did I suggest that behavior analysis was THE ONLY way to conceptualize "the mind" or whatever we are talking about at this point. Yet, that's what you're accusing me of saying. It's ridiculous because all of our words are above. I don't want to argue with you, but you're wrong.
Nonetheless, you are arguing with me. You responded to my comment. I would rather you not tell me what I think or what I said, when the facts don't show it.
For every piece of code you write, there should be a unit test covering it that you can run fairly instantly with 1 keypress which catches at least internal-consistency errors. (Integration tests would cover external-consistency errors, but when you're refactoring/adding/deleting code, internal-consistency is violated far more often than external consistency, in my experience.)
Once you experience this for yourself, you will NOT want to go without it. It allows you to IMMEDIATELY get back to coding without waiting for test runs to finish and without creating bugs unknowingly that you only catch much later.
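For what it's worth, the mechanics don't have to be fancy. A minimal sketch with pytest (the function and names are made up for illustration): keep a test like this next to the code, bind your editor's "run tests" command to a single key, and the check costs a fraction of a second.

    # A function and its unit tests in one file, so `pytest thisfile.py`
    # (one keypress in most editors) gives near-instant feedback.
    def apply_discount(price, percent):
        """Return price reduced by percent, never below zero."""
        return max(price * (1 - percent / 100), 0.0)

    def test_discount_is_applied():
        assert apply_discount(price=100.0, percent=10) == 90.0

    def test_discount_never_goes_negative():
        assert apply_discount(price=5.0, percent=200) == 0.0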
This is why I love Haskell---the check, though approximate, is built in to the language as the type checker. Pair it up with the speed Haskell coder's best friend, "undefined", and you can be just 40 characters into writing part of a method and check it right there on the spot, getting into a good flow.
I see the appeal of Haskell, and I think the strict focus on typing does catch many possible bugs... but I don't think it catches all the types of bugs that an actual unit test would, so I'm almost afraid that it's leaned on a bit too much.
There is always a unit test which could catch any type mismatch. The problem is that every type declaration implies a set of unit tests (probably an unbounded set!) and in practice it's not likely you'll actually write enough of them.
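To put that in concrete terms with a toy Python example (a hypothetical function, checked by a static tool like mypy rather than at runtime; the point transfers to any typed language): a single type declaration makes a claim about every possible input and output, while each unit test only checks one point in that space, so you'd need an unbounded number of tests to say as much as the signature does.

    def mean(xs: list[float]) -> float:
        """Average of a non-empty list of floats."""
        return sum(xs) / len(xs)

    # Each test samples one point of what the annotation claims for all inputs.
    def test_mean_of_two_values():
        assert mean([2.0, 4.0]) == 3.0

    def test_mean_returns_a_float():
        assert isinstance(mean([1.0]), float)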
One apparent counter-example is that when practicing the piano, some common advice is to play pieces much slower than you think is necessary, but play them correctly. Then speed up once you have the right habits.
But the overall goal is to learn new pieces as efficiently as possible. Someone who is efficient at practicing will make much more progress (and have more fun doing it) than someone who isn't.
That's the catch, isn't it? You probably want to get high quality fast, but you can't get fast without losing quality, at least in the beginning. But I'm pretty sure speed trumps quality in most cases - having a good enough product out is eons better than having a perfect product years away...
I feel like there's a bunch of data over-fitting going on here. The examples here are merely the ones that fit with the author's model, are they not? I'm sure I could have had a theorem that points to the idea that "fast work is catastrophic", gone back in time, and shown how the slow individuals / components of a system were significantly more effective than the fast ones.
It's funny that they include Google as a fast example. On my Nexus 4 and Nexus 7, Google search is the slowest thing you can do on the internet. YouTube HD videos are way faster than a Google search and fail less often on the train/bus. I always wonder how sending a String and getting back a list of Strings can be so much more expensive than an HD video, but who am I to judge, right?
I think he has in mind the time when you had to type altavista.digital.com to reach a search site.
There was a world of difference between that and google.com (both in entering the URL and in getting the results).
I am not sure if the writer has thought out the drawbacks to working quickly but there is probably some benefit to deciding quickly.
The writer mentions that faster employees have more work assigned to them. Woo hoo, but what happens when the work queue fills up faster than it can be emptied? Then the once-fast employee now seems slow.
The article reminds me of myself when I was younger, at 27. Now I am 28, and I have grown wise and old enough in this year to know that working quickly does exactly that: it produces quick results without much quality. It works well in cases where you need a result, or just something to build momentum, but it's not at all sustainable. Just like coding: if you speed along without thinking through the steps, you end up with a mountain of technical debt. Instead, it's actually faster to think slowly and carefully and do the right things, rather than feeding your ego, which gets a huge kick out of doing more in quantity than quality in a short period of time. Again, it works well in some scenarios which I can't think of right now, but it's not nearly as important as this young buck has written.
so inspiring.
this person is generous, admitting to being the slowest.
-=in defence of taking your sweet time=-
the arguments for speed and the arguments for quality are not either or. they both contribute to a product that works.
to introduce the case for slowness a natural precedent is appropriate. the development phase for human beings was on the order of 2 billion years.
and in that time a whole lot of nothing happened. all those evolutionary competitors at every level of the tree of life were more rapidly produced than humans. and now humans rule the world and in 200000 years have used that comprehensive development period to move rapidly and adapt so effectively to the world that we have changed it to support 7 billion of us and tripled our life spans. the hockey stick curve of our technology speaks to the benefits of long development, and the long tail of non-adaptive more-rapidly developed ideas that come to nought.
those practising rapid development and launch save costs during the development stage and increase costs during the much longer operating stage: code that has more bugs, takes more to maintain, and is more brittle, and these things contribute to being slower to adapt to customers and competitors when it counts - that is, when you have customers, are burning operating costs, and are competing.

it works better to develop comprehensively while it is cheap to do so and build the most efficient product, so it is really useful when you run with it. longer development, then move faster in operations.

otherwise you end up solving your terrible code base by hiring more brains, and those brains could be better put to use creating improvements for your customers rather than fixing the consequences of what you shipped in a sprint.

pre-launch development is cheap, so it works to take your time and not optimize that __process__ prematurely. everything is more expensive after launch, when the stakes are real and where the advantage can be moving fast -- if the code you crafted lets you adapt quickly, then you've minimized costs over the operating period, and your brains can work on growing and retaining rather than building an ozymandias of monkey patches.

the time when speed is important is after launch, not before it. if you rush your pre-launch development, you will be a slow operator, and this will cost you exponentially more than the linear increase in cost from a longer development time.

also, the more robust the system you build is, the slower (as in slow thinking) you can make your decisions in the operating period, to really consider strategy.

let your competitors ship first and watch the things they miss. let them pay for the experiments you now decline to run. if someone seems to be capturing the market through their business model, then you are too slow anyway and the high-order bit for you isn't code anymore. if that is not your market, then it is filled with competitors who are mimicking each other's sub-monopoly strategies. so you can step in, be the last mover, and take the market. invest time during development to make code and tools that work, grasp your business plan, and have the possibility of thinking strategically about the business in the operating phase, once you have launched.