Classic goal-setting theory says that you first set a goal, then make a plan to achieve it, and finally start taking action. Recently, I’ve begun to question the wisdom of this approach. Now, I’m inclined to believe that, for certain types of efforts, you’re better off setting goals in the middle.
At first you might be wondering how it’s even possible to set goals in the middle. After all, doesn’t the goal define the project itself? But there are a lot of different aspects to making a plan, and some of them may not be strictly necessary before you begin.
Any goal or project will usually have the following qualities:
- A general ambition or motivation. (e.g. get in shape, learn French)
- A specific target. (e.g. lose 15 pounds, speak fluently)
- A time-frame or deadline. (e.g. in 6 months)
- Constraints or methods. (e.g. by eating fewer calories, practicing every day)
- Overall impression of effort/time required. (e.g. a few hours per week of moderate effort)
Those are just the basics. A plan can have a lot more details: milestones, schedules, metrics, accountability systems, coaches, etc.
Looking at this list, it’s clear that a goal really isn’t one thing, but a bunch of different features that tend to get bundled together. Some of these are likely necessary for any kind of voluntary action in a particular direction, but some are optional. In fact, unless you’re a very systematic goal-setter, you likely undertake projects all the time with some of the above elements missing.
The question we need to answer in setting goals properly is when all of these elements should be in place. Some argue that they all need to be fixed before action begins. I’d like to argue that, for a subset of goals, it might be better to postpone finalizing some of them.
Which Goals Should You Start in the Middle?
The first time I stumbled into this process of setting goals in the middle was during my project with Vat to learn languages. Many aspects of our plan were fixed: the time-frame (three months in each country), the method (not speaking English, plus some study) and the effort required (nearly full-time, as much as feasible).
What we didn’t fix was a specific target. We honestly weren’t sure what level we would reach, so we didn’t bother to make a specific goal out of it (say, aiming for conversational fluency or passing a particular language exam).
Initially I thought this might be a disadvantage compared to the more typical method of setting the goal first. That was how I handled the MIT Challenge, setting an ambitious target before starting.
After going through this process, though, I found it much better than the typical approach. Setting a goal in the middle lets you fine-tune the challenge level and your expectations once you actually have information. I applied the same approach in China, deciding to take the HSK 4 exam at the end of my stay. Importantly, I only made this decision after a month and a half in the country, when I could anticipate my progress and judge what level was achievable.
This suggests a possible rule of thumb: when a goal has high uncertainty about what level is achievable within a particular time-frame, it’s better to set the specific target in the middle of the process, not at the beginning.
This doesn’t mean no planning can occur, simply that you plan with the variables you have the most control over: overall direction, time-frame, level of effort, strategies and constraints.
More Reasons to Postpone Goal-Setting
So far I’ve been arguing that highly uncertain goals should be set in the middle of an effort, so that you can pick the right challenge level and get the most out of your work. Recently, however, I’ve come across research that suggests there might be another reason too.
The goal-setting literature is fairly unanimous that goal-setting improves performance, that difficult goals typically do better than easy ones, and that feedback helps, all things one would expect. What I found interesting was that there are counter-examples to this trend, particularly at the level of individual tasks. (Read the full paper here.)
Some research shows that for particularly complex tasks, goal-setting can reduce effectiveness. The reasoning is that complex tasks require your full cognitive resources. Monitoring your performance against a goal also consumes cognitive resources, and the added load can impair your performance. (For the full study, click here.)
This suggests a double whammy for learning projects where the subject or skill is relatively unknown. First, these projects suffer from high uncertainty, so it’s easy to set goals that are far too ambitious or far too modest. Second, these projects are incredibly cognitively demanding in the beginning, and are frequently frustrating because you can’t yet perform adequately. Setting goals in this setting may further impede progress if you’re mindful of those goals while learning.
Does This Apply to All Goals?
I think the standard approach of setting specific targets works well in relatively known domains, where past accomplishments can be used as benchmarks for future success. These goals are more straightforward and likely benefit from the increased precision and motivation that a hard target can create from the first day.
On a task level, setting specific goals is likely to be more useful once your skill has matured. Goal-setting is an important component of deliberate practice, where you’ve already achieved adequate performance and the aim is continued improvement. However, specific goals may be counterproductive in completely new areas, where you can easily get overwhelmed.
If you find yourself asking how you can possibly know what’s achievable for you, that’s a good sign you should wait to set a specific target. Keep the effort and timeline constrained instead. For most people, these carry a lot less uncertainty, so it’s easier to keep up a goal without coasting (because it was too easy) or giving up (because it was too hard).