Q&A with Napster Product Manager Suzanne Scharlock

April 7, 2017

Ever wonder what it’s like to be a product manager at one of the world’s most iconic streaming music services? In a recent webinar, we caught up with Product Manager Suzanne Scharlock for a behind-the-scenes look at how a PM handles UX and design issues at Napster. Our webinar participants were so eager with questions that we gave Suzanne the 20-questions treatment, literally! We’ve gathered our top 10 questions with Suzanne’s answers below, or you can watch the full webinar here. Enjoy!

1. Are the brainstorm sessions conducted using a specific process?

The brainstorm sessions that we do are often based around hypotheses or assumptions that we have. We have a big spreadsheet called our testing log where every time somebody has an idea or an assumption, we’ll write it down and we’ll timestamp it with the date and whose idea it was. Then we’ll organize it into different sections depending on whether it has to do with onboarding or monetization, et cetera, so we can sort that spreadsheet really easily.
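
The testing log is just a spreadsheet, but to make the shape of an entry concrete, here is a minimal Python sketch of the kind of record it might hold: the idea, whose idea it was, a date stamp, and a topic bucket it can be sorted by. The field names and category labels are illustrative assumptions, not Napster’s actual schema.

```python
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestingLogEntry:
    idea: str                        # the hypothesis or assumption, in plain language
    author: str                      # whose idea it was
    logged_on: date = field(default_factory=date.today)   # the date stamp
    category: str = "uncategorized"  # e.g. "onboarding", "monetization" (hypothetical buckets)

def by_category(entries: list[TestingLogEntry]) -> dict[str, list[TestingLogEntry]]:
    """Group entries by topic so the log is easy to sort and scan."""
    buckets: dict[str, list[TestingLogEntry]] = defaultdict(list)
    for entry in entries:
        buckets[entry.category].append(entry)
    return dict(buckets)

log = [
    TestingLogEntry("Users skip the genre picker", "Suzanne", category="onboarding"),
    TestingLogEntry("A trial banner would lift upgrades", "UX", category="monetization"),
]
print(list(by_category(log)))  # ['onboarding', 'monetization']
```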

2. Do you ask the same questions when doing in-person versus remote testing?

We use a different set of questions. So with in-person user testing, we have a little intro speech that we give at the beginning. We explain who we are, what company we work for, and that this is just a test. And we have this whole little script that we walk through with our in-person testers just to make sure that they feel really comfortable at the beginning.

With UserTesting, those people have already signed up to do a user test. They know what they’re getting into, so we don’t have to do that big intro spiel at the beginning. Also, sometimes with remote testing we’ll use a different type of prototype, or the test will just be different, and when the test is different, you use a different set of questions than you do in person.

3. What’s the timeline for carrying out the UX process, from hypothesis to testing and then to retesting?

So our product team operates at a really quick clip. Because we’re only three people, we can get things done really fast. We don’t have to go through a lot of layers of verification with folks to make sure that we’re not stepping on any toes. We just have a lot of autonomy, so we can get things done pretty quickly. I would say that our process for a really meaty feature, something like onboarding, would take maybe three weeks, but something really simple like an icon swap would only take a week. Most features fall somewhere in between. I would say the average is about two weeks, though, because we do that in-person user testing weekly: we’ll do one test one week and then retest the next week.

4. When reviewing the test results, who’s in the meeting? Is it just the PMs or also the designers? Are UX people there?

When we review the results, it’s the PMs, the designers, and the UX folks. So we want to make sure that everybody on the product team is in the in-depth meeting where we’re reviewing all the results and we’re going bullet point by bullet point through those big review blocks that we have.

And then we also make all of our research available to everyone else. We have a Slack channel where we post all of the research. We’ll do an @channel and say, hey everybody, here’s our latest research. So any of the developers, anybody in the company, can go in and read through our research and kinda see what we’re up to. We think it’s really good for everybody to be on the same page, to know that this process is happening, and to know that the features being built are things that we’ve pre-validated.
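
The webinar doesn’t go into the mechanics of that Slack post, but as a hedged sketch, here is how an @channel research announcement could be automated with Slack’s chat.postMessage Web API (`<!channel>` is Slack’s markup for an @channel mention). The channel name, token variable, and message wording are all assumptions for illustration.

```python
import os

import requests

SLACK_TOKEN = os.environ["SLACK_BOT_TOKEN"]  # a bot token with the chat:write scope
RESEARCH_CHANNEL = "#product-research"       # hypothetical channel name

def announce_research(title: str, link: str) -> None:
    """Post the latest research to the channel with an @channel mention."""
    resp = requests.post(
        "https://slack.com/api/chat.postMessage",
        headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
        json={
            "channel": RESEARCH_CHANNEL,
            "text": f"<!channel> Hey everybody, here's our latest research: {title} {link}",
        },
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    if not payload.get("ok"):  # Slack returns HTTP 200 even on API errors
        raise RuntimeError(payload.get("error"))

announce_research("Onboarding usability round 2", "https://example.com/research/onboarding-2")
```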

5. How many developers are there? What’s the ratio between product team and developers?

We definitely have more developers than we have on the product team. I’d say we have about six developers between the iOS team and the Android team, and then we have a much larger platform and API team that’s probably around 12 people.

6. Who does the testing versus who does the visual design?

As the Product Manager, I do the testing along with our UX designer, and then our visual designer comes up with the visual designs for the product. But sometimes our UX designer will come up with some basic designs, or she’ll have a really great inspirational idea that she can explain to our visual designer, and they’ll bounce ideas off each other.

7. How often do you find you have to go back to the drawing board after the initial user testing?

It depends on a few different factors. Sometimes you’ll have some strong assumptions or some strong hypotheses based on a survey and you’ll say, “I think that we’re gonna knock it outta the park with this first iteration,” and sometimes that happens.

But usually, we have to go back to the drawing board at least one more time. I’d say after the second time we usually have it nailed, but there’s always the occasion where we have to go back three or four times because we just aren’t getting the usability issues worked out.

But you’d be amazed by how much you can learn in your first usability session. You’ll uncover some really obvious patterns of behavior, like people getting hung up on the same steps, and if you fix that, then you’ll usually have it solved by the second pass.

8. Where do you draw the line on when an item is ready for development?

We know that something’s ready for development when the product team feels comfortable that the usability issues have been worked out, the designs are ready, and I’m able to write up the entire feature, going through the entire user story process.

So literally you’ll have the design and it’ll say, if you click this button, you go to this page. If you scroll, the screen is paginated. If you do this, X happens. So we have the feature requirements totally written, totally ready to go. The hi-fi designs are attached, the icons are added to the icon font. We’ve reviewed it with the dev leads. We’ve reviewed it with the client teams. Everybody has had their pass at it. They’ve been able to hammer on it. They’ve raised all these different logic issues. We’ve gone back, and the product team has patched those logic issues three different times from different rounds of review. At that point, it’s ready for development. It’s ready to go.
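
None of this implies custom tooling, but as a rough illustration (not Napster’s actual process or field names), that readiness gate could be modeled as a checklist that only reports a feature ready once the stories, designs, and reviews are all in place:

```python
from dataclasses import dataclass, field

@dataclass
class FeatureSpec:
    name: str
    user_stories: list[str] = field(default_factory=list)  # "if you click X, Y happens"
    hifi_designs_attached: bool = False
    icons_in_icon_font: bool = False
    dev_leads_reviewed: bool = False
    client_teams_reviewed: bool = False
    logic_review_rounds: int = 0  # rounds of logic issues found and patched

    def ready_for_development(self) -> bool:
        """Mirror the gate described above: specs, designs, and reviews all done."""
        return (
            bool(self.user_stories)
            and self.hifi_designs_attached
            and self.icons_in_icon_font
            and self.dev_leads_reviewed
            and self.client_teams_reviewed
            and self.logic_review_rounds >= 3  # the three review passes mentioned above
        )

spec = FeatureSpec("onboarding", user_stories=["Tapping Next advances to the genre picker"])
print(spec.ready_for_development())  # False until every box is checked
```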

And that might sound like a really long, involved process, but it saves so much development time, because all of those logic issues are already solved before the feature goes to development. So when the developers actually get the feature, they can just code; they don’t have to waste their time coming to me, you know, being grumpy that I forgot some logic issue in there and now it’s going to mess everything up. So we really spend the time upfront, before the development actually happens, to make sure that these features are totally ready to go.

9. Since your project isn’t out on the market yet, how do you set up tests? What safety measures do you take to ensure confidentiality?

That’s a great question. We have an NDA drawn up by our lawyer, and when we do in-person user testing, we have everybody sign it before they do the test with us. That’s part of the requirement for participating in a user test with us. On UserTesting, there’s also an NDA baked in, so we’re good to go that way.

When we go, we bring our own test device, so the participant isn’t downloading any apps onto their own phone. They’re interacting with the test device that we provide, so it’s pretty safe. I don’t think anything about our product has leaked so far, and it’s worked out really well for us.

10. What’s your process if executive management tells you to make a specific feature in a short amount of time?

This is where the KPIs and the hypotheses come into play. So if an executive asks us to make a specific feature, it gets added to our testing log just like everything else, and we’ll work on it when we hit that KPI phase. So if the feature has to do with the core listening use cases, which is phase number one, then we’ll get started working on it earlier. But if it doesn’t fall into one of those five KPI buckets that I talked about, which were core listening use cases, premium features, monetization, onboarding, and growth, then we’re not gonna work on it until each of those buckets has been worked through, because there needs to be a method to the madness. There’s always gonna be a huge queue of features, and we have to work through them in a logical order.
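
To make that ordering concrete, here is a minimal sketch (with made-up feature names) of a queue sorted by those five KPI buckets, with anything that falls outside them going to the back. Treating the list order as the priority order is an assumption based on how Suzanne lists the phases.

```python
# The five KPI phases, in the order Suzanne lists them.
KPI_PHASES = [
    "core listening use cases",
    "premium features",
    "monetization",
    "onboarding",
    "growth",
]

def prioritize(queue: list[dict]) -> list[dict]:
    """Order feature requests by KPI phase; requests outside the five
    buckets sort to the back until every bucket has been worked through."""
    def rank(request: dict) -> int:
        phase = request.get("kpi_phase")
        return KPI_PHASES.index(phase) if phase in KPI_PHASES else len(KPI_PHASES)
    return sorted(queue, key=rank)

queue = [
    {"name": "exec pet feature", "kpi_phase": None},
    {"name": "upgrade prompt", "kpi_phase": "monetization"},
    {"name": "gapless playback", "kpi_phase": "core listening use cases"},
]
print([f["name"] for f in prioritize(queue)])
# ['gapless playback', 'upgrade prompt', 'exec pet feature']
```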