Brett van Zuiden

You need to make sure the whole thing makes sense

In taking a product from conception to release, a product manager is bombarded with feedback, ideas, and constraints all pulling in different directions. A unique and essential responsibility of the PM is to make sure that the product that comes out of this gauntlet still makes sense as a coherent whole.

New products and features typically (hopefully!) start from a clear, specific need – you know what you’re trying to build, who it’s for, and what it will allow them to accomplish. But as you start building, all sorts of new information start to surface: it turns out that one of the features is really hard to build, so you settle for a partial version. You share an early version with prospective customers, who have ideas about how it could better fit their needs. The executive team is focusing the company on a new metric that, with a few small tweaks, your team could contribute to in a big way.

Individually, each of these inputs is important and useful, but together they can pull a nascent product towards either of two failure modes: making compromise after compromise until the product is no longer useful, or lumping in goal after goal until the product becomes overly complicated. When you’re in the weeds it can be difficult to tell when any particular decision crosses a tipping point, but it is the product manager’s responsibility to take the necessary step back and make sure that what is being delivered still satisfies a clear, specific need.

We ran into both issues when developing Clever Goals, an analytics product for school districts. The very earliest conversations with customers identified a clear need: administrators had key performance indicators for the different edtech products used in their schools, and they wanted to track these KPIs in a single place to make sure their students were on track. But as we started collecting this data, we ran into issue after issue, and ultimately decided that we would collect data for many – but not all – edtech products. In retrospect, this was the tipping point: there were many features we scrapped to create the MVP, but without support for all of a district’s products, the Goals product no longer satisfied the customer need.

Worse, as we were building Goals we started getting close to other interesting problems, like helping districts show the ROI of edtech purchases and helping parents understand how their child was using edtech in school. Any of these directions might have been compelling on its own, but by trying to weave them into Goals we ended up with a product that partially did a handful of things without really nailing the core use case. We eventually re-focused the product on truly solving the core analytics use case, but it was a painful and messy process.

There are a variety of frameworks for avoiding these mistakes and keeping a product on track; two of my favorites are the “work backwards” method of Amazon-style press releases and Paul Buchheit’s essay “If your product is Great, it doesn’t need to be Good.” But the simplest version is surprisingly effective: when you take a step back and look at the product as a whole, does it still make sense?