Persona development is a common and powerful component of marketing strategy and product development, ensuring that your team works with its customers in mind. Less common, but arguably just as powerful, is focusing on what those personas are trying to accomplish. These "jobs to be done," as defined by author and consultant Anthony Ulwick, form the basis of "outcome-driven innovation," as described in Ulwick's book, What Customers Want.
The concept of Jobs to Be Done is based on a fairly simple idea: people buy products and services to get jobs done. The new products, services, and apps that win in the marketplace are the ones that help customers achieve these jobs faster, better, or less expensively.
As such, organizations seeking a data-driven approach to feature prioritization may find themselves drawn to Ulwick’s approach, and for good reason.
Less About Personas, More About Jobs to Be Done
All prioritization frameworks attempt to "satisfy" customers; what sets outcome-driven innovation (ODI) apart is its relentless focus on the tasks (or jobs) customers wish to complete, which is what drives that satisfaction.
Within any interaction, a customer is looking for an exchange of value. In return for money or time, they are seeking to complete a job. These jobs vary depending on the context of the situation, but they reflect the goals of the customer. A key step in ODI is defining what these jobs to be done are.
Defining jobs to be done is similar – but not identical – to the user stories commonly used in agile development.
In agile development, a user story is typically structured as follows:
As a [role], I want [capability] so I [benefit].
This statement within ODI would be slightly rephrased:
When [situation], I want to [task] so I can [expected outcome].
ODI replaces the role of the user with the context of the moment. An expected outcome is more objective than a desired benefit, and thus more useful, particularly when dealing with engineers and developers tasked with implementing features.
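As a rough illustration of that shift from role to situation (the music-app feature and field names below are hypothetical, not drawn from Ulwick), the two templates can be captured as simple records:

```python
from dataclasses import dataclass


@dataclass
class UserStory:
    """Agile-style framing, anchored on who the user is."""
    role: str
    capability: str
    benefit: str

    def render(self) -> str:
        return f"As a {self.role}, I want {self.capability} so I {self.benefit}."


@dataclass
class JobStory:
    """ODI-style framing, anchored on the situation and the expected outcome."""
    situation: str
    task: str
    expected_outcome: str

    def render(self) -> str:
        return (f"When {self.situation}, I want to {self.task} "
                f"so I can {self.expected_outcome}.")


# The same hypothetical feature, framed both ways.
print(UserStory("commuter", "offline playlists", "have music without a signal").render())
print(JobStory("I'm commuting without connectivity", "download a playlist",
               "listen to music uninterrupted").render())
```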
Yet identifying the jobs to be done isn't the only way ODI differs from other prioritization frameworks.
The Axes of Jobs to Be Done
Ulwick next suggests that, for proper prioritization, results should be plotted on a grid, with the X axis representing the relative importance of a given offering and the Y axis representing customer satisfaction with existing solutions.
Here, the approach invites comparison to Kano Analysis, which also plots satisfaction on the vertical axis but tracks functionality on the horizontal. Unlike ODI, Kano accounts for the possibility that a feature is implemented so poorly that it actively dissatisfies users.
Once an offering has been released, it's crucial to gather ample data from users, asking them to rate both the importance of each job and their satisfaction with the current solution, then plotting the results on the graph. To this point, ODI differs only slightly from other models; from here it diverges more significantly.
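A minimal sketch of that plotting step might look like the following; the job names and ratings are hypothetical, and matplotlib is assumed to be available:

```python
import matplotlib.pyplot as plt

# Hypothetical mean survey ratings on a 1-10 scale. In practice these would
# come from the 150-600 respondents Ulwick recommends surveying.
jobs = {
    "Share files with teammates":        {"importance": 8.6, "satisfaction": 4.1},
    "Recover an older file version":     {"importance": 7.9, "satisfaction": 6.8},
    "Preview files without downloading": {"importance": 5.2, "satisfaction": 8.3},
}

fig, ax = plt.subplots()
for name, scores in jobs.items():
    ax.scatter(scores["importance"], scores["satisfaction"])
    ax.annotate(name, (scores["importance"], scores["satisfaction"]),
                textcoords="offset points", xytext=(5, 5))

# X axis: importance of the job; Y axis: satisfaction with existing solutions.
ax.set_xlim(0, 10)
ax.set_ylim(0, 10)
ax.set_xlabel("Importance")
ax.set_ylabel("Satisfaction with existing solutions")
ax.set_title("ODI opportunity landscape (hypothetical data)")
plt.show()
```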
Finding the Sweet Spot with ODI
Ulwick suggests a sample size of 150 to 600 respondents, significantly more than other approaches require. Once appropriate data has been gathered and plotted, practitioners will find their results falling within three distinct triangles on the graph.
The upper left triangle represents jobs where users, on the whole, are more satisfied than the importance they assign would warrant. In ODI-speak, this market has been "overserved," and a new offering targeting it will likely become lost in the sea of competition.
A better opportunity exists just to the right of that segment, among solutions where customers are relatively (but not completely) satisfied. This segment is considered "appropriately served."
The true opportunity lives in the final triangle to the far right: a market that places great importance on a job yet is, on the whole, dissatisfied with the available options.
This represents the opportunity to respond to an underserved market.
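One rough way to operationalize those three regions in code is to compare each job's importance and satisfaction ratings, treating the satisfaction-equals-importance diagonal as the dividing line; the margin below is an arbitrary assumption, not a threshold Ulwick prescribes:

```python
def classify_market(importance: float, satisfaction: float, margin: float = 1.0) -> str:
    """Roughly place a job on the ODI landscape (ratings assumed on a 1-10 scale)."""
    gap = importance - satisfaction
    if gap > margin:
        return "underserved"        # important, yet customers are dissatisfied
    if gap < -margin:
        return "overserved"         # satisfaction outstrips importance
    return "appropriately served"   # satisfaction roughly matches importance


print(classify_market(importance=8.6, satisfaction=4.1))  # underserved
print(classify_market(importance=5.2, satisfaction=8.3))  # overserved
```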
Ranking Jobs to Be Done
While the ODI graph allows for a quick visual reference, product owners rarely (if ever) need to plot a single feature. More often, they compare several. Thus, plotting all of them can render analysis difficult.
To allow for swift feature ranking, ODI relies upon an “opportunity algorithm,” a simple formula expressed as follows:
Importance + (Importance-Satisfaction) = Opportunity
Written this way, the formula gives added weight to the importance of a specific offering relative to customer satisfaction with existing offerings, producing an "opportunity score."
To evaluate the results, follow these guidelines:
- Opportunity scores greater than 15 represent extreme areas of opportunity that should not be ignored.
- Opportunity scores between 12 and 15 are "low-hanging fruit" ripe for improvement.
- Opportunity scores between 10 and 12 are worthy of consideration, especially when discovered in a broad market.
- Opportunity scores below 10 are viewed as unattractive in most markets, offering diminishing returns.
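Putting the formula and these guidelines together, a minimal scoring sketch (with hypothetical jobs and ratings) might look like this; note that some practitioners floor the importance-satisfaction gap at zero, a variant not shown here:

```python
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Importance + (Importance - Satisfaction), per the formula above."""
    return importance + (importance - satisfaction)


def bucket(score: float) -> str:
    """Map a score to the guideline ranges listed above."""
    if score > 15:
        return "extreme opportunity"
    if score >= 12:
        return "low-hanging fruit"
    if score >= 10:
        return "worth considering"
    return "unattractive"


# Hypothetical survey averages on a 1-10 scale: (importance, satisfaction).
jobs = {
    "Share files with teammates":        (8.6, 4.1),
    "Recover an older file version":     (7.9, 6.8),
    "Preview files without downloading": (5.2, 8.3),
}

# Rank all jobs from highest to lowest opportunity score.
ranked = sorted(jobs.items(), key=lambda kv: opportunity_score(*kv[1]), reverse=True)
for name, (imp, sat) in ranked:
    score = opportunity_score(imp, sat)
    print(f"{name}: {score:.1f} ({bucket(score)})")
```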
For those companies seeking to prioritize many features at a time, the “opportunity score,” similar to RICE prioritization, offers an objective approach to ranking all options.
A key is to evaluate customers' current satisfaction levels (or lack thereof) for each job. Areas that score high in importance but low in satisfaction present the biggest opportunities. However, these must also be assessed through a business lens: the jobs that will have the largest impact on achieving business outcomes need to be at the top of your list.
Challenges with the Jobs to Be Done Framework
While the Jobs to Be Done framework offers a useful approach for driving customer-centric innovation, companies often encounter difficulties putting the methodology into practice.
Common challenges include:
Defining Jobs Too Narrowly
Some organizations fail to capture the full picture of the job customers want to get done and instead define the job too narrowly. For example, defining the job simply as “listening to music on the go” versus the broader job of “enjoying an immersive audio experience whenever I want.”
Always push for a more expansive view of the complete job.
Relying on Surface-Level Jobs
There is a temptation to rely purely on functional, surface-level jobs, for example focusing only on jobs like "store files in the cloud."
Make sure to also explore the emotional, social, and aspirational jobs to uncover deeper needs. The job someone hires a product for goes beyond just functional performance.
Difficulty Identifying All Jobs
Companies often struggle to identify an exhaustive set of jobs, missing key jobs customers want to get done. You must allocate sufficient time for broad exploration and discovery research across different customer segments to reveal less obvious but critical jobs.
You should also expect additional jobs to emerge over time and plan development lifecycles accordingly.
Disconnection Between Jobs and Features
A common downfall is a failure to effectively link specific features and product elements back to the jobs-to-be-done analysis. Make explicit connections between proposed solutions and target jobs to ensure offerings deliver desired outcomes.
Inadequate Ongoing Tracking
Businesses often conduct initial job research but don’t continually track importance and satisfaction over time. This causes missed opportunities as existing solutions become outdated or user needs evolve. Regularly measure importance and satisfaction ratings for each key job.
Overcoming JTBD Pitfalls
Here are best practices for avoiding common pitfalls:
- Frame jobs broadly around the end benefit or outcome. Ask “Why is that job important?” to expand.
- Uncover all dimensions of a job: functional, emotional, and social. Move beyond surface needs only.
- Use diverse qualitative research techniques to discover a wide array of jobs. Expect new jobs to emerge over time.
- Maintain a tight link between job insights and resulting products and features. Clearly map jobs to solutions.
- Institute ongoing measurement systems. Continually monitor importance and satisfaction job metrics through a built-in feedback loop.
Comparing Jobs to Be Done to Other Prioritization Frameworks
Every approach to developing feature sets has its pros and cons. Here's how JTBD stacks up against other common approaches.
| Framework | How It Works | Pros | Cons |
|---|---|---|---|
| Jobs to Be Done | Focuses on the jobs customers want to get done and the products/services they hire to complete them. Quantifies the importance of jobs and satisfaction with current solutions. | Customer-centric. Identifies unsatisfied customer needs. Scales across industries/product types. | Significant upfront research is required. Ongoing tracking needs investment. |
| Kano Model | Categorizes product features based on how they impact customer satisfaction (delighters, satisfiers, must-haves, etc.). | A simple way to classify features. Indicates potential customer dissatisfiers. | Limited view of customer jobs. Can miss bigger opportunities. Static over time. |
| ROI Analysis | Estimates the potential return on investment of product/feature opportunities based on projections. | Data-driven. Factors in cost, revenue, and resources needed. | Subjective projections. Lagging and reactive. |
| Buyer Personas | Develops representations of key customer segments to guide decisions. May include their goals and behaviors. | Helpful for targeting messages and offers. | Doesn't provide insights on unsatisfied needs directly. |
| Feature Prioritization | Ranks features based on weighted criteria or other comparative approaches. | Provides a means to sequence feature development. | Typically lacks direct customer input. No guarantee that features map to needs. |
Should Your Company Utilize ODI?
Though Ulwick received the first of his twelve patents for the process more than two decades ago, the ODI framework continues to provide significant value to product owners worldwide. While originally embraced by companies with physical products, it is just as useful for those in software development.
Moreover, the framework scales better than many tactical feature prioritization techniques and can be applied equally well to entirely new product offerings and to simple new features on existing solutions.
At the same time, practitioners would be wise to consider the larger datasets of feedback that Ulwick recommends: obtaining hundreds of survey responses can be a challenge in many organizations, as can structuring those surveys to yield the most reliable data.
That said, the greatest value for software developers might be ODI’s focus on “jobs to be done,” as framing customer concerns in this manner is directly applicable to software development. When these jobs to be done are paired with the granular approach proposed in Ulwick’s opportunity algorithm, product owners will find themselves with an objective process that’s superior to less rigorous approaches.
Strategic Planning to Accelerate Your Vision
Accelerate your vision through strategic planning and improve your speed to market with the jobs to be done that will increase customer satisfaction. Schedule an introduction with Ascendle today to learn better approaches to product ideation, user insights, and a proven Go-to-Market Framework℠.
Editor’s Note: This post was originally published on February 12, 2021 and has been updated for accuracy and comprehensiveness.