I’ve met a number of self-taught programmers. These are people who make their living programming every day, but never went to school to learn how to do it.
A few of these people have expressed mild regret about not learning more computer science. They know how to program well, but they don’t have a good understanding of some of the deeper math and theory behind the programs they write.
Which brings me to my question: Should these programmers learn more theory? Would they be better programmers if they did?
Learning Bottom-Up or Top-Down
The way a lot of self-taught people learn skills is purely through usage. Programmers start trying to program from an early age, maybe to make games or websites. Everything they learn is motivated by trying to figure out how to do something they want to do. Let’s call this style of learning bottom-up.
This differs a lot from the approach taken in academic environments. There, recognized experts decide what theoretical knowledge will be useful to students and push them to learn it, even if that knowledge isn’t obviously useful to the student’s immediate practical ends. Let’s call this style of learning top-down.
Bottom-up learners only pick up the theory they need to solve the problem in front of them. If you’re learning another language through immersion, you’ll pick up grammatical rules when you need to express yourself, understand another person, or notice you’re not saying something right. You don’t learn the rules in advance and then wait for a situation in which to apply them.
Is it Better to Learn Top-Down or Bottom-Up?
I’ve thought a lot about which approach is better for learning, bottom-up or top-down. In truth, I’ve used both. The MIT Challenge was clearly a top-down learning project, as I aimed to follow a set curriculum rather than pick up topics as I needed them. The Year Without English, on the other hand, was mostly bottom-up, using immersion to drive improvement.
My feeling is that, in the short term, bottom-up tends to do better. It’s very hard for anyone (even an expert) to know exactly what concepts should be learned in what order. If you learn by trying to do things and pick up theory as needed, you rarely learn anything that isn’t useful. In contrast, much, perhaps most, of the time spent in school goes to learning things that aren’t useful.
The long-term picture is less clear, however. In the long term, there are probably some advantages to a top-down approach, because often there are ideas that only appear useful after you’ve learned them. A bottom-up approach misses these opportunities entirely.
This suggests to me that, if your goal is to learn a skill you intend to use, then you should start closer to bottom-up and shift to top-down only later. What would this look like, in practice?
- Programming. Start by learning via a particular goal: making a game, website, or app. Once you’re pretty good, start to introduce more top-down theory to round out your knowledge.
- Languages. Start by learning via immersion. Once you’re conversational, spend time on those tricky grammar points with a textbook.
- Business. Start by running a business or working in one. Once you have some experience, go back and build your theoretical knowledge (say, with an MBA or self-education).
- Art. Paint, draw, and sketch a lot. When you get stuck, look for advice on your specific weakness. Once you’re decent and stop improving as fast, learn more about composition, color theory, art history, etc.
You’ll note that this is the opposite of the approach most learners use. Most learners start top-down, and only move to bottom-up strategies once they feel confident enough.
Why Learn Theory?
Of course, all of this assumes your goal is to learn a practical skill. If your learning goal is more abstract knowledge in the first place (psychology, economics, math, etc.), it’s probably not possible to learn bottom-up.
Bottom-up learning also requires more confidence and motivation. Starting directly from a real-use situation when your ability is quite low can feel daunting. That initial frustration period can overwhelm less experienced or casual learners, so for them, taking a class that is less efficient but less overwhelming may not be a bad idea.
I’m curious to hear your thoughts. Do you think learning extra theory (beyond what you need to solve immediate problems) is important for a skill you know well? Are there any traps that come from learning something bottom-up first? Share your thoughts in the comments.