Dollars to Donuts

Updated 2 months ago

Rank #103 in Management category

Business
Management
Entrepreneurship

The podcast where we talk with the people who lead user research in their organization.


iTunes Ratings

41 Ratings
Rating distribution: 40 five-star, 1 four-star, 0 three-star, 0 two-star, 0 one-star

Incredible Resource

By shrothermel - May 01 2020
I’ve been listening to Dollars to Donuts since I started my career in UXR, and it’s been hands down the best resource for me. It’s provided inspiration and best practices and made me feel plugged into the broader research community. Highly highly recommend to any researcher at any point in their career. You’ll learn so much.

Great podcast

By Alwe9405 - Mar 27 2019
Excited to find a solely user research focused podcast!


Dollars to Donuts

Latest release on May 11, 2020


Rank #1: 19. Leisa Reichelt of Atlassian (Part 1)


This episode of Dollars to Donuts features part 1 of my two-part conversation with Leisa Reichelt of Atlassian. We talk about educating the organization to do user research better, the limitations of horizontal products, and the tension between “good” and “bad” research.

If you’re working on a product that has got some more foundational issues that need to be addressed, but the vast majority of the work is happening at that very detailed feature level, how are you ever going to stop kind of circling the drain? You get stuck in this kind of local maxima. How are you ever going to take that big substantial step to really move your product forward if it’s nobody’s job, nobody’s priority, to do that? – Leisa Reichelt

Show Links

Follow Dollars to Donuts on Twitter and help other people discover the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Howdy, and here we are with another episode of Dollars to Donuts, the podcast where I talk to the people who are leading user research in their organization.

I just taught a public workshop in New York City about user research, organized by Rosenfeld Media. But don’t despair – this workshop, Fundamentals of Interviewing Users, is also happening September 13th in San Francisco. I’ll put the link in the show notes. Send your team! Recommend this workshop to your friends!

If you aren’t in San Francisco, or you can’t make it September 13th, you can hire me to come into your organization and lead a training workshop. Recently I’ve taught classes for companies in New York City, and coming up will be San Diego, as well as the Midwest, and Texas. I’d love to talk with you about coming into your company and teaching people about research.

As always, a reminder that supporting me in my business is a way to support this podcast and ensure that I can keep making new episodes. If you have feedback about the podcast, I’d love to hear from you at DONUTS AT PORTIGAL DOT COM or on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.
I was pretty excited this week to receive a new book in the mail. It’s called The Art of Noticing, by Rob Walker, whose name you may recognize from his books, or New York Times columns, or his appearance in Gary Hustwit’s documentary “Objectified.” I’ve only just started the book but I am eager to read it, which is not something I say that often about a book of non-fiction. The book is structured around 131 different exercises to practice noticing. Each page has really great pull-quotes and the exercises seem to come from a bunch of interesting sources like people who focus on creativity or art or storytelling. Rob also publishes a great newsletter with lots of additional tips and examples around noticing, and I’ve even sent him a few references that he’s included. I’ll put all this info in the notes for this episode. This topic takes me back to a workshop I ran a few years ago at one of the first user research conferences I ever attended, called About, With and For. The workshop was about noticing and I wonder if it’s time to revisit that workshop, and I can look to Rob’s book as a new resource.

Well, let’s get to the episode. I had a fascinating and in-depth conversation with Leisa Reichelt. She is the Head of Research and Insights at Atlassian in Sydney Australia. Our interview went on for a long time and I’m going to break it up into two parts. So, let’s get to part one here. Thank you very much for being here.

Leisa Reichelt: Thank you for inviting me.

Steve: Let’s start with maybe some background introduction. Who are you? What do you do? Maybe a little bit of how we got here – by we, I mean you.

Leisa: I am the Head of Research and Insights at Atlassian. Probably best known for creating software such as Jira and Confluence. Basically, tools that people use to make software. And then we also have Trello in our stable as well. So, there are a bunch of tools that are used by people who don’t make software as well. A whole bunch of stuff.

Steve: It seems like Jira and Confluence, if you’re any kind of software developer, those are just words you’re using, and terms from within those tools. It’s just part of the vocabulary.

Leisa: Yeah.

Steve: But if you’re outside, you maybe have never heard those words before?

Leisa: Exactly. Atlassian is quite a famous company in Australia because it’s kind of big and successful. But I think if you did a poll across Australia to find out who knew what Atlassian actually does, the brand awareness is high. The knowledge of what the company does is pretty low, unless you’re a software developer or a sort of project manager of software teams in which case you probably have heard of or used or have an opinion about Jira and probably Confluence as well.

Steve: And then Trello is used by people that aren’t necessarily software makers.

Leisa: Correct. A bunch of people do use it for making software as well, but it’s also used for people running – like in businesses, running non-technical projects and then a huge number of people just use it kind of personally – planning holidays, or weddings. I plan my kids’ weekends on Trello sometimes and I know I’m not alone. So, yeah it’s a real – it’s very much what we call a horizontal product.

Steve: A horizontal product can go into a lot of different industries.

Leisa: Exactly, exactly. I’m very skeptical about that term, by the way.

Steve: Horizontal? Yeah.

Leisa: Or the fact that it won’t necessarily be a good thing, but that’s another topic probably.

Steve: So, I can’t follow-up on that?

Leisa: Well, yeah, you can. Well, the problem with horizontal products, I think, is that they only do a certain amount for everybody and then people reach a point where they really want to be able to do more. And if your product is too horizontal then they will graduate to other products. And that gives you interesting business model challenges, I think. So, you have to be continually kind of seeking new people who only want to use your product up to a certain point in order to maintain your marketplace really.

Steve: When I think about my own small business and just any research I’ve done in small and just under medium sized businesses where everything is in Excel, sort of historically, where Excel is the – there may be a product, cloud-based or otherwise, to do a thing, but someone has built kind of a custom Excel tool to do it instead. So, is Excel a horizontal product that way?

Leisa: I think so, yeah. In fact, I was talking to someone about this yesterday. I think that for a lot of people the first protocol for everything is a spreadsheet. They try to do everything that they can possibly do in a spreadsheet. And then there are some people who are like the, “ooh, new shiny tool. Let’s always try to find an application for a new shiny tool.” I think actually the vast majority of people take the first tool that they knew that had some kind of flexibility in it. So, if you can’t do it in Word or Excel – people will compromise a lot to be able to do things in tools that they have great familiarity with.

Steve: Yeah. But from the maker point of view you’re saying that a risk in the horizontalness, the lack of specificity, creates kind of a marketplace for the maker of the tool?

Leisa: Can do. I think for it to be successful you just have to be at such a huge scale to be able to always meet the needs of enough people. But I think things like Excel and Word and Trello, for example, they’ll always do some things for some people. Like just ‘cuz you moved to a more kind of sophisticated tool doesn’t mean that you completely abandon the old tool. You probably still use it for a bunch of things.

Steve: So, your title is Head of Research and Insights?

Leisa: Correct.

Steve: So, what’s the difference between research and insights?

Leisa: Yeah, good question. I didn’t make up my own title. I kind of inherited it. If I remember correctly, the way that it came about was that when I came into my role it was a new combination of people in the team in that we were bringing together the people who had been doing design research in the organization and the voice of the customer team who were effectively running the NPS. And I think because we were putting the two of them together it sounded weird to say research and voice of the customer. So, they went with research and insights instead. And, honestly, I haven’t spent any time really thinking about whether that was a good title or not. I’ve got other problems on my mind that are probably more pressing, so I’ve just kind of left it. If you want to get into it, I think research is the act of going out and gathering the data and pulling it all together to make sense of it, and insights are hopefully the sense that you get from it, and we do try to do both of those things in my team, so I think it’s a reasonably accurate description.

Steve: How long have you been in this role?

Leisa: It’s about 18 months now.

Steve: If you look back on 18 months, what are the notable things? For the experience of people inside the organization, what has changed?

Leisa: Quite a few things. The shape and make up of the team has changed quite a lot. We’re bigger and differently composed to what we were previously. When I first came in it was literally made up of researchers who were working in products. Prior to me coming in they were reporting to design managers and we, for various reasons, pulled them out of products pretty quickly once I started. And then we had the other part of the business who were running NPS that was being gathered in product and we don’t do that anymore either. So, that team is doing quite different things now. So, we’ve changed a lot of things. We’ve introduced a couple of big programs of work as well. One of them being a big piece of work around top tasks. So, taking Gerry McGovern’s approach to top tasks. Trying to build that kind of foundational knowledge in the organization. So, that’s kind of new. There have always been those people doing Jobs To Be Done type stuff, but very very close to the product level. So, we’ve tried to pull back a little bit to really build a bigger understanding of what are the larger problems that we’re trying to solve for people and how might we see the opportunities to address those more effectively?

Steve: So, trying to create shared knowledge around what those top tasks are – I’m guessing for different products, different user scenarios.

Leisa: One of the things that we really tried to do is to get away from product top tasks and get more into really understanding what problems the product, or combinations of products, is trying to address. So, we don’t do top tasks for Jira. We do top tasks for agile software teams. And then through that we can then sort of ladder down to what that means for Jira or Confluence or Bitbucket or Trello, or whichever individual or combination of products we have. But it means that we sort of pull away a little bit from that feature focus. I think it can be very seductive and also very limiting.

Steve: What’s the size of the team now?

Leisa: I have a general rule in life of never count how many researchers you have in the organization because it’s always too many, according to whoever is asking you – not you, but like senior people. I think we’re probably around the mid-20s now.

Steve: And around the world, or around different locations?

Leisa: Mostly we’re in Sydney and California. So, we’re across three different offices there and we have a couple of people working remotely as well. So, I have somebody remote in Europe and another person in California who’s working out of a remote office too.

Steve: Can we sort of talk about the make up of the team evolving – what kinds of – without sort of enumerating, what are sort of the background, skillsets? Any way that you want to segment researchers, what kinds of – what’s the mix that you’re putting together?

Leisa: So, I think at the highest level we’ve got people who do design research, predominantly qualitative and they do a mixture of discovery work and evaluative work. We’ve got a team of what we call quantitative researchers and those are generally people who have got a marketing research type background and so they bring in a lot of those data and statistical skills that the other crew don’t necessarily have quite so much of. And then we have a research ops team as well who are doing some different things. And then we have a handful of research educators.

Steve: And what are the research educators doing?

Leisa: Educating people about how to do research.

Steve: These are hard questions and hard answers!

Leisa: Well, if you dig into it too much it gets complicated pretty quickly. So, I think the reality of research at Atlassian is the people in the research team probably do – best case – 20% of the research that happens in the organization and that’s probably an optimistic estimate as well. A huge amount of research is done by product managers and by designers and by other people in the organization, most of whom haven’t had any kind of training and so take a very intuitive approach to how they might do research. So, the research education team are really trying to help shape the way that this work is done so that it can be more effective.

Steve: I’m thinking about a talk I saw you give at the Mind the Product Conference where you began the talk – I think you kind of bookended the talk with a question that you didn’t answer – I don’t think you answered it definitively which is, is bad research better than no research, or words close to that. It was a great provocation to raise the question and then sort of talk about what the different aspects of that – how we might answer that? What the tradeoffs are? When you talk about this design education effort I can’t help but think that that’s connected to that question. If 80% of the research is done by people acting intuitively then yeah, how do you level up the quality of that?

Leisa: Absolutely.

Steve: Which implies that – well, I don’t know if it answers – does that answer the question? I’m not trying to gotcha here, but if you are trying to level up the quality that suggests that at some point better research is – I don’t know, there’s some equation here about bad vs. better vs. no. I’m not sure what the math of it is.

Leisa: So, this has been the real thing that’s occupied my mind a lot in the last couple of years really. And I’ve seen different organizations have really different kind of appetites for participating in research in different ways. I think at the time that I did that talk, which was probably, what?

Steve: Almost a year ago.

Leisa: Yeah. I think I was still trying to come to terms with exactly where I’ve felt – what I’ve felt about all of this, because as somebody who – well, as somebody in my role, everybody in an organization is – a lot of people in the organization are going to be watching you to see if you’re trying to be overly precious and become a gatekeeper or a bottleneck or all of these kinds of things. So, I’ve always felt like I had to be very careful about what you enabled and what you stopped because everyone has work that they need to get done, right. And the fact that people want to involve their customers and their users in the design process is something that I want to be seen to be supporting. Like I don’t want to be – I don’t want to stop that and I certainly don’t want to message that we shouldn’t have a closeness to our users and customers when we’re doing the research.

But, I’ve also seen a lot of practices that are done with the best of intentions that just get us to some crazy outcomes. And that really worries me. It worries me on two levels. It worries me in terms of the fact that we do that and we don’t move our products forward in the way that we could and it worries me because I think it reflects really poorly on research as a profession. I think most of us have seen situations where people have said well I did the research and nothing got better, so I’m not going to do it anymore. Clearly it’s a waste of time, right. And almost always what’s happened there is that the way that people have executed it has not been in a way that has helped them to see the things that they need to see or make the decisions that they need to make. So, it’s this really hard line to walk to try to understand how to enable, but how to enable in a way that is actually enabling in a positive way and is not actually facilitating really problematic outcomes. So, that’s, yeah – that’s my conundrum is balancing that. On the one hand I feel really uncomfortable saying no research is better than bad research. But on the other hand, I’ve seen plenty of evidence that makes me feel like actually maybe it’s true.

Steve: So, that’s kind of a time horizon there that the bad research may lead to the wrong decisions that impact the product and then sort of harm the prospects of it. I’m just reflecting back. That harms the prospects of research kind of going forward. Right, every researcher has heard that, “well we already knew that” response which to me is part of what you’re talking about. It’s that when research doesn’t – isn’t conducted in a way and sort of isn’t facilitated so that people have those learning moments where they – I think you said something about sort of helping them see the thing that’s going to help them make the right decision. And that’s – right, that’s not a – that’s different than what methodology do you use, or are you asking leading questions? Maybe leading questions are part of it because you confirm what you knew and so you can’t see something new because you’re not kind of surfacing it.

Leisa: Loads of it comes around framing, right. Loads of it comes around where are you in the organization? What are you focused on right now? What’s your remit? What’s your scope? What are you allowed to or interested in asking questions about? In a lot of cases this high volume of research comes from very downstream, very feature focused areas, right. So, if you’re working on a product that has got some more foundational issues that need to be addressed, but the vast majority of the work is happening at that very detailed feature level, how are you ever going to stop kind of circling the drain? You get stuck in this kind of local maxima. How are you ever going to take that big substantial step to really move your product forward if it’s nobody’s job, nobody’s priority, to do that? So, a lot of this is kind of structural. That so many of our people who are conducting this research are so close to the machine of delivery, and shipping and shipping and shipping as quickly as possible, that they don’t have the opportunity to think – to look sideways and see what’s happening on either side of their feature. Even when there are teams that are working on really similar things. They’re so heads down and feature driven. So, doing the least possible that you can to validate and then ship to learn, which is a whole other area of bad practice, I think, in many cases. It’s really limiting and you can see organizations that spend a huge amount of time and effort doing this research, but it’s at such a micro level that they don’t see the problems that their customers are dying to tell them about and they never ask about the problems that the customers are dying to tell them about because the customers just answer the questions that they asked and that’s kind of what bothers me. And so, it’s not about – in a lot of cases, some of it is about practice. Like I think it’s amazing how few people can just resist the temptation to say hey I’ve got this idea for a feature, how much would you like it? They can get 7 out of who said they loved my feature. Like that feels so definitive and so reliable and that’s very desirable and hard to resist. So, yes that happens.

But the bigger thing for me I think is where research is situated and the fact that you don’t have both that kind of big picture view as well as that micro view. We just have all of these fragments of microness and a huge amount of effort expended to do it. And I don’t feel like it’s helping us take those big steps forward.

Steve: But Leisa, all you need is a data repository so that you can surface those micro findings for the rest of the organization, right?

Leisa: You’re a troll Steve.

Steve: I am trolling you.

Leisa: You’re a troll, but again, I mean in some – kind of yes, but again, like all of those micro things don’t necessarily collectively give you the full view. And a lot of those micro things, because of the way they’re being asked, actually give you information that’s useless. If all of your questioning is around validating a particular feature that you’ve already decided that you think you want to do and you go in asking about that, you never really ask for the why? Like why would you use – like who is actually using this? Why would they use this? What actual real problem are they solving with this, right? So, the problem becomes the feature and then all of that research then becomes very disposable because the feature shifts and then nobody uses it as much as everybody thought they would, or they’re not as satisfied with it as what they thought they would be. So, we just keep iterating and iterating and iterating on these tiny things. We move buttons around. We change buttons. We like add more stuff in – surely that will fix it. But it’s because we haven’t taken that step back to actually understand the why? If you’re talking about those big user need level questions, the big whys, then you know what, you can put those things in a repository and you can build more and more detail into those because those tend to be long lasting.

And then at the other end, just the basic interaction design type stuff that we learn through research. A lot of those things don’t change much either. But it’s that bit in the middle, that feature level stuff, is the most disposable and often the least useful and I feel like that’s where we spend a huge amount of our time.

Steve: Do you have any theories as to why in large software enterprise technology companies there is a focus on leaning heavily on that kind of feature validation research? What causes that?

Leisa: I think that there’s probably at least two things that I can think about that contribute to that. One is around what’s driving people? What gets people their promotions? What makes people look good in organizations? Shipping stuff – shipping stuff gets you good feedback when you’re going for your promotion. They want a list of the stuff that you’ve done, the stuff that you’ve shipped. In a lot of organizations just the fact that you’ve shipped it is enough. Nobody is actually following up to see whether or not it actually made a substantive difference in customers’ lives or not. So, I think that drive to ship is incredibly important. And our org structures as well, like the way that we divide teams up now, especially in organizations where you’ve got this kind of microservice platform kind of environment. You can have teams who’ve got huge dependencies on each other and they really have to try to componentize their work quite a lot. So, you have all of these kind of little micro teams who their customer’s experience is all of their work combined, but they all have different bosses with different KPIs or different OKRs or whatever the case may be. And I think that’s a problem. And then the other thing is this like build, measure, build – what is it? I’ve forgotten how you say it.

Steve: I’m the wrong person.

Leisa: Build/measure/learn.

Steve: Yeah, okay.

Leisa: The lean thing, right, which is this kind of idea that you can just build it and ship it and then learn. And that’s – that’s – that means that like if it had been learn/build/measure/learn we would be in a different situation, right, because we would be doing some discovery up front and then we would have a lot of stuff that we already knew before we had to ship it out to the customers. But it’s not. It’s build – you have an idea, build/measure/learn. And then people are often not particularly fussy about thinking about the learn bit. So, we ship it, put some analytics on it and we’ll put a feedback collector on it and then we’ll learn.

What are you going to learn? Whatever. What does success look like? I don’t know. It’s very – it’s kind of lazy and it makes – we treat our customers like lab rats when we do that. And I feel like they can tell. There’s lots of stuff that gets shipped that shouldn’t get shipped, that we know shouldn’t get shipped. I’m not talking about Atlassian in specific. I’m talking about in general. We all see it in our day to day digital lives that people ship stuff that we really should know better, but this build/measure/learn thing means that unless you can see it in the product analytics, unless you can see thousands of customers howling with anger and distress, it doesn’t count.

Steve: It reminds me of a client I worked with a few years ago where we had some pretty deep understandings of certain points of the interactions that were happening and what the value proposition was and where there was meaning. Just a lot of sort of – some pretty rich stuff. And the client was great because they kept introducing us to different product teams to help apply what we had learned to really, really specific decisions that were going to be made. But what we started to see – this pattern was they were interested in setting up experiments, not improving the product. And in fact, this is a product that has an annual use cycle. So, the time horizon for actually making changes was really far and some of the stuff was not rocket science. Like it was clear for this decision – A versus B versus C – like the research said very clearly like this is what people care about, or what’s going to have impact, or what’s going to make them feel secure in these decisions. They were like great, we can conduct an experiment. We have three weeks to set up the experiment and then we’ll get some data. And I was – I just hadn’t really encountered that mindset before. I don’t know if they were really build/measure/learn literally, if that was their philosophy, but I didn’t know how to sort of help – I wasn’t able to move them to the way that I wanted it done. I’d really just encountered it for the first time and it seemed like there was a missed opportunity there. Like we knew how to improve the product and I wasn’t against – like they are conducting experiments and measuring and learning – awesome. That’s research. But acting on what you’ve learned seems like why you’re doing the research in the first place.

Leisa: It feels as though we have this great toolset that we could be using, right? We’ve got going out and doing your kind of ethnography, contextual type stuff. And then right at the other end we’ve got the product analytics and running experiments and a whole bunch of stuff in between. And it really feels to me as though organizations tend to really just get excited about one thing and go hard on that one thing. And growth – doing the experiments is a big thing right now. I see loads of situations where we look at data and we go, oh look the graph is going in this direction. Well, the graph is not going in that direction. Why? And I’ll kind of – like there’s so much guessing behind all of that, right? And if it doesn’t go quite right, well then let’s have another guess, let’s have another guess, let’s have another guess. And this – like you say, there’s so much stuff that we probably already know if we could connect two parts of the organization together to work together more effectively. Or, there are methods that we could use to find out the why pretty quickly without having to just put another experiment, another experiment onto our customers, our users. But the knowledge of this toolset and the ability to choose the right tool and apply it, and to apply these approaches in combination seems to be where the challenge is.

Steve: What’s the role of design in this? In terms of here’s the thing that we know and here’s the thing we want to make. We haven’t talked about designers. Ideally for me, I sort of hope designers are the instruments of making that translation. Without design then you can sort of test implementations, but you can’t synthesize and make anything new necessarily.

Leisa: Well, I mean yeah. Theoretically design has a really important role to play here because I think design hopefully understands the users in a design process better than anybody else. Understands the opportunities for iteration and levels of fidelity for exploring problems in a way that nobody else on the team does. And lots of designers do know that. But the time pressure is enormous, especially in these kind of larger tech companies where a lot of times designers – designers are also concerned about being bottlenecks. They have to feed the engineering machine. And it can be really difficult for them to have the conversations to talk about all of the work that we should be doing before we ship something. So, I feel as though they have a lot of pressure on them, a lot of time pressure on them. They’re being pressured to really contract the amount of effort that they put in before we ship something. And there is – yeah, there’s this huge desire amongst product teams and their bosses to just get something shipped, get something live and then we’ll learn that I think design really struggles with. And I don’t know – it can be really difficult in those kinds of environments to be the person who stands up and says we need to slow down and we need to do less and do it better. So, I have empathy for designers in their inability to shift the system – these problems – necessarily because of this big pressure that’s being put on them just to keep the engineers coding.

Steve: Right. If you say you want more time to work on something that’s – what are the risks to you for doing that?

Leisa: Exactly, exactly. On a personal level everyone wants to look good in front of their boss. Everyone would like a pay raise and a promotion and the easiest way to get those things is to do what you’re told and ship the stuff. Keep everyone busy, produce the shiny stuff, let it get coded up, let it go live, learn, carry on. That’s thinking short term. That’s how to have a happy life. Is that how you’re going to fundamentally improve the product or service that you’re working on? Probably not. But it takes a lot of bravery to try to change that, to try to stop this crazy train that’s out of control. Throw yourself in front of the bus. All that kind of stuff. Like you said, it’s hard, especially when there are loads of people all around you who are very happy to keep doing it the way that it’s being done right now. So, yeah. And I think that’s research’s role, that’s design’s role. I would like that to be PM’s role as well. And most engineers are – a lot of engineers are very, very, very interested in making sure that the work that they’re doing is actually solving real problems and delivering real value as well.

Steve: So, as you point to the individuals there’s a lot of shared objectives. But if you point the system, which is – it’s the system of – it’s the rewards system and the incentive system, but there’s also some – I think there’s just sort of what day to day looks like. Sort of the operations of the system of producing technology products.

Leisa: I think there’s also – there’s something about like what do people like to see? What gets people excited, right? And graphs pointing upwards in the right direction is really exciting. Like certain outcomes are really exciting. Ambiguous outcomes, really not exciting at all. Having things that go fast, very exciting. Things that go slow, not exciting. So, I think there are all of these things that this collection of humans, that form an organization have a strong bias towards, that we get really excited about, that don’t necessarily help us in the long run. You know you see lots of people who get really excited about the graph. Very few people dig in behind the graph to the data. Where did this data come from? How much can I actually believe it? Like people are so willing to just accept experiment findings without actually digging in behind it. And a lot of the time, when you do dig in behind it, you go huh, that doesn’t look very reliable at all. So, there’s something about we love to tell ourselves that we’re data informed or data driven, but a huge number of the people who are excited about that don’t have very much data literacy to be able to check in and make sure that the stuff they’re getting excited about is actually reliable.

Steve: So, how could someone improve their data literacy?

Leisa: I think that there’s a lot of work that we need to do to make sure that we actually understand how experiments should be structured. And understand more about – being more curious about where is this data coming from and what are the different ways that we could get tricked by this, right? And there are tons of like books and papers and all kinds of things on the subject. But actually, when you go looking for it and you’re coming at it with a critical mind, rather than with a mind that gets excited by graphs, you know a lot of it is pretty logical. When you think about like – like surveys, right? Who did this survey actually – who actually answered these questions? Instead of just going hey, there’s an answer that supports my argument, I’m just going to grab that. To dig behind it and go like are these – are the people who are being surveyed actually the same people that we’re talking about when we’re making this decision? Is there something about the nature of that group of people that is going to bias their response? It’s remarkable to me how few people actually kind of check the source to make sure that it’s worthy of relying on. Everyone – a lot of people are really keen to just grab whatever data they can get, but that supports their argument. I think this is another one of those kind of human things, those human inclinations that we have that lead us towards kind of bad behaviors.

Steve: I can imagine that with this research education effort that you’re doing that people that participate in that are going to learn how to make better choices in the research that they then run, some practical skills, some planning, some framing, as you talked about. But it seems like that literacy is a likely side effect of that, or maybe it’s not the side effect. Maybe it’s the effect. How to be better consumers of research. Once you understand how the sausage is made a little bit then you understand, oh yeah, that’s a biased question, or that’s bad sampling, and there’s a choice to make in all of these and we have to question those choices to understand the research that’s presented to us. I hadn’t really thought of training people to do research as a way to help them in their consumption or critical thinking around research.

Leisa: Something else that we’ve really come to realize recently as well is that because of all of the other pressures that the people who are making these decisions are having to deal with as well, we think we’ll do much better if we train the entire team together instead of just training the people who are tasked with doing the research. Because something that we’ve observed is that you can train people and they can go this is great – I really see what I was doing before was creating these kind of not great outcomes and what I need to do to do better. And then you’ll see them, not that long later, doing exactly the opposite to what we thought we had agreed we were going to do going forward. And we’re like well what’s happening? These are like smart, good people who have been given the information and are still doing crazy stuff. Like what’s going on with that? And you realize they go back into their context where everyone is just trying to drive to get stuff done faster, faster, faster, faster, and you have to plan to do good research. The easiest research to do really, really quickly is the crappy research. If you want to do good research you do have to do some planning. So, you need to get into a proactive mindset for it rather than a reactive mindset for it. And you need the entire team to be able to do that. So, one of the things that we’re looking to do moving forward is not just to train the designers and the PMs, but actually to train a ton of people all around them in the teams to help them understand that the way that you ask the questions and who you ask them of and where – all the different things that could impact the reliability of your research – requires planning and thinking about in advance. So, we hope that means that the whole team will understand the importance of taking the time to do it and it won’t just be like one or two people in a large team fighting to do the right thing and being pressured by everybody else. So, I think it is – the education aspect, I suspect, is super important and it goes way beyond just helping people who are doing research to do better research. It goes to helping the whole organization to understand the impact and risk that goes with doing a lot of research activity in the wrong way or the wrong place.

Steve: Just fascinating to me and I think a big challenge for all of us that lots of researchers are involved in education of one form or another – workshops – a big program like you’ve been building. But most of us are not trained educators. We don’t have a pedagogical, theoretical background about – it’s about communicating information. It’s about influence and changing minds. And it just seems like a lot of researchers, from an individual researcher on a product team to someone like you that’s looking at the whole organization, we’re sort of experimenting and trying to build tools and processes that help people learn. And learn is not just imparting the information, but reframe and empower, like these big words where – I wish I had – I wish you had a PhD in education. I wish I had that. Or that I had been a college professor for a number of years, or whatever it would take – however I would get that level of insight. Personally, I have none of that. So, to hear us – you know we all talk about these kinds of things. I think research gives you some skill in prototyping and iterating and measuring in order to make the kinds of changes in the implementation that you’re making. I don’t know about you. I feel like I am amateurish, I guess, in terms of my educational theory.

Leisa: Absolutely. And I think talking to the team who are doing this work, like it’s really, really, really hard – really hard work to come up with the best way to share this knowledge with people in your organizational context, in a way that is always time constrained. At Atlassian we have a long history of doing kind of internal training. We have this thing called bootcamps. We’ve got almost like a voluntary system where people come in and they run these bootcamps and you can go and learn about the finer details of advanced Jira administration or you can come in and learn about how to do a customer interview. But the timeframe around that was like – like a two-hour bootcamp was a long bootcamp. Most bootcamps are like an hour. And so, when we started thinking about this we were like we’re going to need at least a day, maybe two days. And everyone was like nobody will turn up. But yeah – fortunately we have – we do day long sessions and people have been turning up. So, that’s great. But yeah, it’s a huge effort to try to come up with something that works well. And every time we do these courses the trainers and educators go away and think about what worked and how can we do it better. So, we iterate every single time. So, yeah, it’s a huge amount of effort. I think in larger organizations too, there are other people in the organizations who are also tasked with learning type stuff. So, we have a couple of different teams at Atlassian who are kind of involved in helping educate within the organization. So, we’re building a lot of relationships with different parts of the org than we have before in order to try to get some support and even just infrastructure. Like there’s just a whole lot of like logistics behind setting this up if you want to do it in any kind of scale. It’s great to have that support internally and it’s really good to start to build these relationships across the organization in different ways. But yeah, I think that we certainly underestimated the challenge of just designing what education looks like and how to make it run well. It’s a massive, massive effort.

Steve: It’s not the material – you guys probably have at hand a pretty good set of here’s how to go do this. If you brought an intern onto your team you could probably get them up to speed with material that you have, but you’re trying to change a culture. And I think the advantage that I have, the context that I have as an external educator is that people are opting in and that I have to go by the assumption that they want to be there and I don’t have access to what the outcomes are. I might get that through someone that’s my gatekeeper, but it’s kind of on them. I have the responsibility for the training, but not the responsibility for the outcomes, which is what you all are kind of working with. So, I envy you and I don’t envy you.

Leisa: Well, I think I – I like it because, you know, going back to the build/measure/learn thing, right. Again, we did a learn before we did our build and measure because we’re researchers and that’s what we do, but it is – it’s super interesting to see the behaviors of people who have come through the training and see whether they shift or not. That gives us – we get feedback from people after the course who tell us whether they thought it was useful or not useful and what they liked and didn’t like, but then we actually get to observe because through our ops team, they come through us to do their recruiting and get their incentives. So, we can keep a little bit more of an eye on what activity is coming out and see if there is any sort of shifting in that as well. And it’s not just courses as well. It’s thinking about like what’s the overall ecosystem? What are other things that we can be doing where we are sort of reaching out to help support people on this journey as well. Before we did educating, we had advisors who kind of made themselves available to go and sort of support teams who recognized that they might have a need for some help with getting their research right. So, that was kind of our first attempt. But we had to pivot into educating because of the time factor. We would go in and give advice and everybody would go it’s great advice. We’d totally love to do that, but I have to get this done by next Thursday. So, I’m going to ignore your advice for now and carry on with what I was going to do anyway. And that was pretty frustrating. So, we felt like we have to invest in trying to get ahead of the curve a little bit – try to get ahead. Try to not necessarily influence the stuff that has to happen by next Thursday, but try to encourage teams to start being proactive and planning to do research instead of doing this really reactive work instead. Or as well as the reactive work perhaps. I don’t know.

Steve: Have the right mix.

Leisa: Yeah.

Steve: I wonder if – and this is probably not going to turn out to be true, but I wonder about being proactive and planning versus time. The time pressure to me is about oh we only have so many hours to spend on this, or that calendar wise we need to be there next Thursday. But being proactive says, well if we started thinking about it three weeks ago we’d be ready for it Thursday to do it “the right way.” I’m wondering, can we tease apart the pressures. One is like no proactivity, the sort of very short-term kind of thinking, is different than hours required. Is that true?

Leisa: I think so. Because I think that even when we do the short-term stuff we still spend like quite a lot of time and effort on it. And the planning in advance, like the more proactive work doesn’t necessarily, I don’t think, entail more actual work. It just might be that you put in your recruitment request a couple of weeks beforehand so that we can try to make sure that the people that you meet are the right kinds of people, instead of if you have to have it done in 3 or 4 days time then your ability to be careful and selective in terms of who you recruit to participate in the research is very much limited. So, we see – when we have those kinds of time constraints you see everybody just going to unmoderated usability testing and their panel and that introduces a whole lot of problems in terms of what you’re able to learn and how reliable that might be. Yeah. I was thinking for a second about you know when – theoretically when you do your unmoderated usability testing you should still be watching the videos, right. So, that should take as much time as watching a handful of carefully recruited people being – doing these sessions in a facilitated way. But the reality is, I think, that most people don’t watch the videos, which speaks to quality.

Steve: Here we are back again. It seems like, for the profession overall, maybe one way to start sort of framing the way around this time pressure thing is to decouple proactiveness versus sort of hours burned. That it’s going to be a similar number of hours burned, but if you start earlier the quality goes up. I had never really thought about those two things as being separate.

Leisa: Yeah. And I don’t think people do. I think when people think about – and this is – I mean I sort of said the word quality. I’m trying not to say quality anymore. I’m trying to talk about meaningfulness more now, I think, because whenever you talk about quality the pushback that you get is well it doesn’t need to be perfect, so it doesn’t need to be academic. I just need enough data to be able to make a decision and understand that. But then I see the data upon which they’re making the decision and that makes me worry, right? And I think that’s – we want to get out of this discussion of like quality levels, more into sort of reliability levels, like how reliable does it need to be? Because surely there has to be a bar of reliability that you have to meet before you feel like you’ve got that information that you need to make a decision. But I see loads of people making decisions off the back of really dreadful and misleading data and that’s what worries me. And they feel confident with that data – going back to the data literacy problem. Like they really haven’t dug into why they might be given really misleading answers as a result of the who and the how and the why, all those decisions that are made around how to do the research, most of which have been driven by time constraints.

Steve: Okay, that’s the end of part one of my interview with Leisa. What a cliffhanger! There’s more to come from Leisa in the next episode of Dollars to Donuts. Meanwhile, subscribe to the podcast at portigal dot com slash podcast, or go to your favorite podcasting tool like Apple Podcasts, Stitcher or Spotify, among others. The website has transcripts, show notes, and the complete set of episodes. Follow the podcast on Twitter, and buy my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media or Amazon. Thanks to Bruce Todd for the Dollars to Donuts theme music.

May 22 2019

47mins


Rank #2: 26. Jesse Zolna of ADP


In this episode of Dollars to Donuts I talk to Jesse Zolna, who leads the User Experience Research Team at ADP’s Innovation Lab. We talk about driving change as an experiment, exposing the organization to how customers solve problems, and engineering psychology.

One of the challenges we face is getting “credit” for the work that we’ve done. A lot of what we do is help people understand the problem space better and understand these things that their users aren’t able to do, or want to do, or whatever. And oftentimes it’s not going to be brand new. Rarely do you come up with something that nobody’s ever thought of before. A lot of times we help solidify or better articulate those problems, which then you can attack much better. – Jesse Zolna

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

Northern Soul was a musical and cultural movement in the UK in the late sixties and early seventies. It was all about obscure soul music from America. The movement really was a scene, with clothing and dance styles, and clubs hosting dance parties, but let’s just focus on the music. For people in the UK in the 60s it wasn’t easy to get music from the US. In fact, this difficulty figures into the origin story of the Rolling Stones, where Mick Jagger and Keith Richards reconnect during a chance meeting on a train platform, and one notices the other has possession of some rare and desirable albums from the US. Anyway, Northern Soul started in that context initially, the difficulty of getting any of this music and then went on to specifically emphasize the rarities. Remember that there was no consumer music duplication technology – you had to have the 45. Many of the songs that became Northern Soul legends were commercial failures, failed artists, failed labels. And yet, these 45s, these songs found a second life, across a time, across an ocean, across cultures.

Decades later, we have the Internet, and we have globalization, and we embrace consumer enthusiasm. So there’s a cafe in Beijing modeled after Central Perk, from the TV show “Friends.” Mexican-American lowrider culture has been taken up in Brazil and New Zealand.

I find these examples fascinating, and given the frequency that these stories appear, I’m not the only one. Given the work that I do, my interest is specifically because these stories typically involve people going around the brand. The producer makes certain products and provides them to a certain audience in a certain marketplace. Sure, it’s a statement of identity to watch “Russian Doll” on Netflix, but your effort to both discover and consume is minimal. But to buy authentic parts for a product that isn’t made any more, from another country, for example, takes a lot more effort. Even with the Internet. This kind of lead user consumption is interesting.

So that’s the consumer side, going around the default path laid out by the company. And on the producer side, larger companies seem pretty intent on managing both globalization and localization. You can go to a McDonald’s in 101 different countries, but the menu will be different. Outside McDonald’s in Thailand, Ronald is posed giving the “wai”, the traditional Thai greeting of a slight bow with hands pressed together in front of the chest. Even though the McDonald’s brand spans cultures, the Thai experience is specific and self-contained within its own environment. McDonald’s redefines itself within the boundaries of the national border and even though we know Ronald can be found everywhere, this Ronald reminds a Thai customer that he is specifically in Thailand. Ronald is trying hard not to be the tourist who awkwardly adopts the local customs in order to seem “down,” but rather someone who has moved in and become part of the scenery. Of course, brands, their symbols and indeed their products change meaning when they move from one culture to another, but we can consider the bare minimum meaning, just the fact of its existence in any particular culture.

Earlier this year, McDonald’s in the US introduced a limited-time “International Menu” featuring the Stroopwafel McFlurry, the Grand McExtreme Bacon Burger, the Tomato Mozzarella Chicken Sandwich and Cheesy Bacon Fries from, respectively, the Netherlands, Spain, Canada and Australia. Linda VanGosen, McDonald’s vice president of menu innovation, said in a statement that “We know our US customers are curious about McDonald’s international menu items.”

It doesn’t matter if these are any good. And these aren’t intended to be authentic representations of the cuisine of these other countries; they are presented as authentic exemplars of McDonald’s in these other countries. The point here, the fascinating thing to me, is that McDonald’s is acknowledging, at least to its American market, that seams exist, that the way you experience the brand is limited by your geography, and another variation of McDonald’s – a non-American version – is out there. This is a very contained action by McDonald’s but the way it breaks the frame by pointing to something outside what they so carefully control and design is huge.

The homework for all of us is to keep our eyes open for how and when producers acknowledge in any way what lies outside the set of things they are providing to us. I believe this will continue to change. And for those of us who are in the business of producing things to be consumed, McDonald’s is signaling here that we have more choices than we might have previously thought possible.

This is an important aspect of the work that I do, helping companies to unpack these shifts in culture and consider the ways they might respond, in terms of what is appealing to the marketplace and what is authentic to the company itself. My clients have a lot of deep knowledge of what they have been doing, but they often need help getting outside that, and I help teams to build a new shared perspective and a plan to move forward. And so, the best way to support this podcast is to support my business. Hire me to help you bring a nuanced external perspective to how you understand your current and future markets. Get in touch and let’s discuss what we might do together.

I’d also love to hear how this podcast is helping you in your work. Email me at DONUTS AT PORTIGAL DOT COM or find me on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

Let’s get to the interview with Jesse Zolna. He leads the User Experience Research Team at ADP’s Innovation Lab. Jesse, welcome to the podcast.

Jesse Zolna: Thank you, Steve. Thanks for having me.

Steve: Let’s start by having you say a little bit about who you are and what you do and we’ll go from there.

Jesse: Okay. So, I lead the User Experience Research Team at ADP’s Innovation Lab, is one of the ways I put it. ADP has a sort of complex arrangement of business units and I fall within what we call shared products. It’s the stuff that runs behind all the different business units. I end up interacting with a lot of the different front end products that ADP makes. And so, just as a short term I say the innovation lab. I started here about 5 years ago when we opened the Innovation Lab. So, ADP made the decision to invest in UX and agile and design thinking, sort of all at the same time. I think I was hired #18 or 19 in the group. So, I’m leading the UX research team here in the lab.

Steve: What are some of the business units that – your lab works across these different – for those of us outside of the organization, what things could we learn about that ADP makes that you’re kind of working to inform the design of?

Jesse: At the simplest level, ADP has sort of three business units – small, medium and large businesses. And we make essentially payroll products for those businesses. That’s our core product and then on top of that we layer what we call HCM products, so anything that’s related to HR really. And then a lot of that will feed into payroll. We have what we call core HR which is all about people’s data, about who they are, where they live, and that stuff impacts the taxes that we take out of your paycheck. We have time products within each of these business units. You’ve got to record your time and obviously that calculates your pay. We have retirement services products, so like 401(k) stuff. Medical benefits, that kind of stuff. So, if you think about the paycheck as sort of the center of our business and then all the sort of – we have lots of different products that sort of feed into that center of business.

Steve: What does HCM stand for?

Jesse: Human Capital Management. It’s the newer term for HR.

Steve: I’m old enough to remember when it was called personnel and then that got outmoded by HR and now I’m going to be a dinosaur if I say HR.

Jesse: So, back then HR were bean counters and just a cost center – the people were just a cost center. Now, today, people and talent are a company’s most valuable asset.

Steve: Okay. So, the term reflects kind of a different mindset.

Jesse: Yup.

Steve: Having heard it from you I’m sure I’ll now be recognizing the term when I come across it, but I haven’t heard that before. So, I’m wondering, you described some of the different services that are provided – you’re providing services to companies who use these products to do all the human capital management and payroll for employees. How much of these different services do the employees themselves interact with? Do you know what I mean?

Jesse: Yeah. So, the initial project that I worked on when I came here is what we call the employee and manager experience which is essentially both a desktop and web app that the employees use. Until ADP decided to focus on that, that was sort of an afterthought. We were building products for the HR and payroll managers that worked at our clients and a necessary evil was things where the employee needed to be able to go look up their information or enter their time or whatever. You know if employees wanted to get paid they’d figure out how to use our time products, so we didn’t invest a lot in that. But then as part of the decision to focus on that they realized that actually 99% of their users were employees and managers because each company has 10 HR and payroll people and 10,000 employees, or something like that. So – and it was also sort of a push into realizing that technology was changing and people were expecting better from everything that they use on a day to day basis and getting basically complaints from our clients that their employees found our products hard to use which would then reduce productivity, for both the employee who was trying to figure out how to enter their time, or figure out what medical benefits they want rather than working. And then the HR, the payroll person, had to fix errors, or help the employee navigate the system instead of doing like more strategic work that they should be doing within like talent management.

Steve: So, the customers are identifying – your customers are identifying changes they want to see in the products you’re delivering to them, or the kinds of experiences you’re creating.

Jesse: Yup.

Steve: Which is not a signal to ignore.

Jesse: Yup.

Steve: So, this is the innovation lab. So, what does that mean? Those are both sort of very loaded words. For ADP what was sort of the – what was created at this kind of transition point?

Jesse: In the beginning it really was an experiment in choosing sort of the ratio of UX to development and at the same time moving to an agile development process. So, what they did was they hired all new people from different places. I came from Barnes & Noble. Worked on like their e-commerce site and the Nook device. A designer who came in at the same time came from a media background, so had worked for NBC and MTV in the past, on their like web properties. So, they were trying to bring people in from different industries and think differently about what consumers want. And they hired about, I think, 100 people in the early days and 20 of them were UX people, whereas at the rest of the company there were probably 10,000 developers and less than 20 UX people. So, it was an experiment: if we really invested in this, would the outcomes be different? So, we had the employee and manager experience as one of our first projects and we sort of proved that you can move faster and get better results with that new ADP innovative way of thinking. So, the idea was to be an example to the rest of the company. So, now the rest of the company has seen what we’ve done at the lab and over the last 5 years has really adopted a lot of those practices and hired a lot more UX people in each of the different business units, each of the different groups and transitioned to an agile development process. Both of those have been more successful in some places than others, but everybody is trying to move that way.

Steve: It’s interesting that you frame this as an experiment. What did that feel like kind of in the early days? You were, as you said, one of the early people to come into this. What did it feel like to be something that was framed as an experiment in a company this size, and I think with a big history, or long history?

Jesse: It was exciting because there was the opportunity to jump right in and have a large impact because there are so many clients and so many people using our products, yet we could start from scratch at the same time. And that’s how they sold it to us, like a startup within a big company, and we already have scale. And the other thing was we could do whatever we wanted, kind of. If we thought that was the right thing to do, we could do that. You know we didn’t have to go through all the red tape that the rest of the company had to do. So, we could just, you know, spend money on recruiting. We could have a very simplified NDA. Talking about research processes, that would just make it easier for us to get our work done and not have to get approval from legal and GSO and everything, which these days we have to deal with a lot more.

Steve: So, over these 5 years, as you said, the things that only were happening in the lab are now starting to, or have started to, happen in other parts of the organization. That seems like one measure of success for the experiment. It’s kind of creating models for practices for the organization to adopt. I wonder what does that then create, or open up space for the innovation lab itself to then move into different kinds of work, sort of after having birthed some processes and had them be taken up – what’s shifted for you all?

Jesse: I would say at the beginning we were really focused on our one product and proving that the user centered design process could work and you could get rapid feedback and it didn’t slow down development, and then you would see the effects on the market. And now other teams are doing it, and as you said that’s one metric for success. Another one is that we’ve sort of changed the organization’s thinking around UX and what user centered design really is. So, the sort of simpler application of UX research is usability testing, or more evaluative, or maybe even like concept testing, like lean UX testing. When I came in, lean UX testing was like really all they thought research was. So, we’ve established also a program of ethnographic research where we do sort of foundational stuff that doesn’t necessarily ask what changes can we make to this product to make it better, but broader, like what changes can we make to this suite of products and understanding how our clients are using our products within a greater context, to look for the “white spaces” and that has really opened up the organization’s eyes in terms of how research can help strategically too. You know strategy used to be the domain of only product management. They would come up with a strategy – you know based on what our competitor did, now here’s our new strategy. By the way, the competitor probably just copied us, so there’s a lot of like circularness to that. So, we’ve been able to bring in new thoughts and new ideas which I think has helped us rethink the way that we’re approaching providing tools to our clients to manage their human capital.

Steve: And is there a connection between the success you had with these initial efforts, in terms of changing the mindset of the organization? As you say, ideas only came from these kinds of places and now – obviously we know that ethnographic research can provide the kinds of new ideas that you’re talking about, but the change that’s happened here is that – I think you’re describing that people are receptive to that? They’re like oh yeah, this is another source for us to think about what to do. So, is there a connection between that mindset which is new and then the – was that driven by the work that the lab started off with?

Jesse: Yeah, well it’s related to what I said earlier about us being able to do what we think we should do because we think that’s the right thing to do. So, in the early days we were working on payroll and, aside from the employee/manager experience, I was helping with the payroll product which again is our core product. And I realized that like we knew a lot of people – we had people who knew a lot about how the time product works and how they gather that information and then it gets sent into payroll. We knew a lot about how payroll would process that. We knew a lot about how core HR data would influence that. So, we had people who understood those silos really well. But there was no understanding of how those things all kind of worked together. So, I proposed that we do some site visits where we just go watch people during their payroll process and just like say, “teach me how to do payroll from your company.” Because also every company kind of does it differently. They all have ADP payroll, most of our clients, but then they might have our time system. They might have somebody else’s time system. They might have somebody else’s compensation management system, or ours, or whatever. So, everybody kind of does it differently. So, we went to visit a bunch of different clients and like try to find the themes across and what we came back with was information that was very surprising to the people internally and actually our first – we presented it several times and the first few, their reaction basically was give me the name of that client, I’m going to go show them how to do it right. And I was like well, I can give you the name of all 12 clients we visited, but then what about the other 99% that we didn’t visit, are you going to go train them too? And they like didn’t quite get – they wanted to go solve it for that client – that’s like very solution oriented – rather than understanding the problem and thinking about how can we work together. And they literally didn’t believe me that clients would double and triple check the time data before they put it into payroll and then after they put it into payroll, and spent hours doing this. Like literally going through Excel spreadsheets. One of the funny things the researcher on the project came up with, she used to call it the rulers of payroll. We literally had clients that would use a ruler, like on the screen, so they could look across these really long lines of data. And somebody else would be on the other screen and they’d say whatever, 1,400 – I don’t even know what the numbers were, but they would like call and respond to make sure that the like thousands of lines of data – and they were like why would they do that? It just imports automatically. But I mean the answer really was ‘cuz one time in the last year, or whatever, or maybe 20 times, I don’t know, there was an error and that caused like somebody to not get paid. And the payroll person thinks maybe they won’t be able to pay their mortgage because they didn’t get paid. So, they want to be super sure. So, anyway, it was like this idea of understanding things across these silos, coming up with these new insights that we never would have seen if we didn’t look across those silos. And so since then we’ve done probably 8 or 9 of those projects. And really we kind of started off with here’s a big domain that we kind of want to really understand deeply and we don’t necessarily have a specific piece of the product that we’re going to improve from that.
And I think it gives everybody a better understanding and it gets the silos together. We actually – that research program we have has three stated goals. The first is to get people from different silos working together and to understand our users in the same way so that when they talk they have like an equal understanding and like the same vernacular and things like that when they do try to get together and work together. Number two, is just that – we call it the foundational understanding of what are our clients doing? What are the biggest pain points? How can we help them? And number three is to include non-researchers in the research process as well. So, when we go onsite we aim to have one researcher and two or three like developers, product managers, designers as like observers and note takers. So, they get to participate in the research process, understand it, see how it works. Maybe bring some of that – not only bring that empathy back home with them, but bring some of that research process and understanding too so they can think about when might be another time to do research as well.

Steve: Lots of really fascinating things. I want to pick one thing to go back to. So, when you started to present back these kinds of behaviors that you observed, like the triple checking and so on, it was surprising. “I’m going to go fix that.” What do you do after that? What do you do to sort of bring – to help these stakeholders understand what you learned and what it means and a better way to think about it than sort of fixing each of your thousands of clients?

Jesse: Actually that initial reaction was why we decided to include the third goal which is to get them out there because I figured if these people were there and they saw it they wouldn’t not believe me. I was lucky that we had some of the people who helped to establish the innovation lab and really believed in user centered design and research and are of the mindset that they want to learn. I think that makes a difference too with researchers working with people who want to learn versus people who don’t necessarily want to learn, or just want you to prove that they’re right or whatever. That’s a whole other topic of conversation. But luckily we had a few people there who from an executive level, or air cover level, were able to – I remember literally one woman said to me – you know, I told her don’t shoot the messenger, like listen to him. Like try and understand what’s going on and try to work with that and not – you know so that was super helpful to have like some believers that could help them kind of work through it really. And it took time too. Like you know just like the initial reaction was that’s wrong, like no way, let me call them. But over time I think these people were able to realize, oh wait, maybe I should like look into that. And maybe if they did look into it they could see some evidence that helped them believe and understand it better too. Because also having gone and done this ethnographic research, you know I get a super deep understanding and I can only present so much of that data and like try to tell that story and purposely try to tell it at a high level so they can understand it all. And it’s really getting those deep details and that, almost the anecdotal evidence, that really brings color and life to it, and that again is the reason that we came up with that third goal to get those people out there.

Steve: So, if you think about someone who – it might be someone – I mean I think your caveat about they want to learn – someone who wants to learn but maybe is inclined a little bit to say well I better intervene and fix their erroneous behavior – what changes for them when they see the behavior that we’re talking about versus they hear about it from you? Like what is – I think there’s something fundamental that’s different for them. What is that?

Jesse: I think it just becomes more real. It’s no longer a rumor, or somebody else saying this. They like see it. It’s no longer a PowerPoint slide, or even – and this is why you use quotes and videos in your presentations, right, like that makes it real too. It’s no longer words on a page. It’s a person feeling pain in front of them and it’s just like more – it just becomes more real and tactile and you know I think it just fills in all the details that you need.

Steve: It reminds me of – I guess of a minor failure a few years ago where we were – our users we were looking at were people who were “unbanked” is the phrase, or “semi-banked.” And a tech company where the team themselves, that was not their lifestyle and we went out in the field with them and they were so affected by the experience, but then they asked us to – I think we didn’t do a good job sort of understanding the request, but they wanted an edited video of an extremely tight running time to like use in part of a meeting. And I think it emerged later on that the objective was for people to have the same emotional reaction from that that they did being in the field. And I get where they’re coming from. Like this really changed us and we want the rest of the organization to have this. Eventually I realized oh, you can’t get that from a video. Even though, as you say, videos sort of bring it alive more than words on a page might, but this experience of being out there – I think you’re right about sort of the anecdotal stuff. There’s the facts, sort of the details of the narrative, but you just saw things that I think those people were very moved by what happened to them in the field.

Jesse: Yeah. I think part of it too is they might not go to all of the visits, but they see one person and then when you present to them like the theme, the summary, then they can say, “oh yeah, I saw that in Joe.” So, they just understand it sort of one level deeper. And that’s what like as a researcher we often do that, like we have these themes and we understand it at a really deep level, but it’s hard to get that deep understanding. It’s hard to express that without showing all the data, but you’re trying to do – you know, just like these people in this meeting only had a certain amount of time, you know you only have – you can’t have everybody go participate in the entire research project the entire time. So, you have to summarize for them. It’s part of what we do is we go learn things and we summarize it for people so that they don’t have to go learn it themselves. But you just can’t get all of your learnings across. It’s hard. Which is part of the reason, on another subject, why, no offense, but I personally prefer being like an internal research person and like having worked on these projects for 5 years, like I feel like stuff comes out from 3 years ago that I can go back to. And I think there’s that sort of institutional memory which I realize now it’s ironic because 5 years ago when somebody would say, “oh well this is how we’ve always done payroll” I would like want to strangle them and be like that’s the worst thing for a researcher to hear is that we already know it. Like we don’t need any new information. Now, when people come to me with questions I’m like well I already know the answer to that because we did research 3 years ago. So, I have to actually consciously make sure that I’m always open for learning too, having been here for so long. So, maybe that’s the negative of being an internal person. As a consultant, you can always come in with fresh eyes and be willing to learn something new.

Steve: Right. I mean, I don’t know, the thing I like about consulting is – I mean, I’ll say the same thing you’re saying – the thing I like about consulting is being able to come into something that I haven’t seen before and try to figure it out. It’s often usually overwhelming, but I rely so heavily on people that are, that understand the problem space, so that I’m not trying to like reinvent the wheel. And I have admiration and a little bit of jealousy for people that stick with something for a long time because the time horizons for – I mean the kinds of change you’re talking about making, it’s very long. I don’t work that long term with somebody. My rationalization is we need both. You need people that live in the problem space and facilitate new ways – that hold onto the depth and sort of advocate for that depth of insight. And you need ways to keep getting fresh insight that you have with your changes over your tenure. I think we agree and you’ve got to find the role that you can sort of thrive best in, I guess.

Jesse: Yeah, totally. Yeah I mean even as an internal person I like to go into a research project as naïve – maybe not as naïve as possible, but relatively naïve so you could naturally ask the question when – well what do you think it should be – sort of in the research process. Or, how would you want it to go? Instead of like knowing the answer to that and possibly being somewhat biased or explanatory rather than questioning.

Steve: I shy away from getting into like technique too much in these conversations because I think we’re looking at the organization, but that being said, you said something that kind of intrigued me and I just wanted to ask a little more about it. When you described these ethnographies you said going into these organizations and saying, “teach me how to do payroll.” I don’t know what you literally said in the actual interviews, but any thoughts about the framing? Like there’s lots of ways to get to learn what people are doing and I wondered if you had a point of view about “teach me” as sort of the mode of inquiry?

Jesse: Yeah, I mean in that case we were purposely trying to, um, understand – uh, figure out what we don’t know. You know you don’t know what you don’t know. And we were trying to get the user’s point of view on it. So we didn’t – in that case we did not want to kind of structure the interview very much at all in a certain way because we wanted to see sort of what came out from them. We were possibly overguarding against imposing our own sort of biases in understanding and structure in it, but I mean that was essentially why we chose that sort of line of inquiry. And I do think there were literally like 4 or 5 questions we used there. And I will say that – so my – we started to talk about my background, but my first job out of undergrad was for a marketing research company and their technique was literally one question and they would do one-on-one interviews and it would just be like tell me your thoughts and feelings about how “X” impacts your life? Or tell me about your – something like that. And it was literally one question. And then you were only “allowed” to repeat a word they said, or like say – we would either ladder up or ladder down. So, it’s either – so if they’d say like, “it makes me feel tired.” “Well, what happens when you’re tired?” Those would be like laddering down. Or like what does tired mean to you, or something – I think that was laddering up. It was a long time ago now, so I’m probably butchering it a little bit. But like literally one question and then like never say a word that they didn’t say yet. So, that’s kind of the technique. That has driven sort of my discovery type research, exploratory type research ever since, is like never impose your own words or structure or anything on it and just let the person you’re talking to teach you.

Steve: I’m so self-conscious of whatever word I use now, or whatever question I ask you. The reason I’m curious about the “teach me” question is I think you could ask them – since we’re talking about what are these kinds of questions we might ask, is there a difference between how do you do payroll and teach me how to do payroll?

Jesse: Um, yeah I think there is. Um, I mean I don’t know if this is necessarily true, but if you say how do you do payroll they might say this person is from ADP, I’m going to tell them how I use ADP products. Or, I’m going to tell them the right way. Whereas if I say teach me they’re going to show me the real way in a way. You know what I mean?

Steve: Yeah.

Jesse: Like it’s almost like you go to a university course on how to do something and then you go in the real world and you try to do it. It’s not exactly like the book says. Right? So, maybe it’s how do you do payroll – they would give me the book version. And teach me how to do payroll and they’d give me like the real life version.

Steve: That’s a really nice distinction between the two. So, we’ll switch gears a little bit. So, as part of this effort to bring more people out in the field and get them kind of exposure to the lives of customers and really participate in research – I think you talked about note taking and other kinds of roles that you’re giving to them. So, they are – at one level they are performing some of the tasks of research. I guess I’m wondering what’s – is there more? Is research happening that doesn’t involve you? Are they doing some of this on their own? I don’t know who “they” is?

Jesse: So, it’s related to the idea of the innovation lab sort of teaching the organization how to do these things. And so, I mean, because we’re getting different people from different groups together too – it’s not always a group that has like a robust UX function, right. So, they might go back and if they want to continue to be – to learn, they might have to do it themselves. And yeah, I think that’s happening. I do think there’s sort of the constant debate everywhere, and certainly internal within ADP. You know we have different answers in sort of different parts of ADP is like who should do the research, or whatever. And you know my team are all like dedicated researchers and we have our design partners that we work with and the design partners certainly help shape the research and we certainly help shape the design, but we’re sort of two different people and we think that splitting the sort of lead responsibility is more efficient and we can get a lot more done. Some places, even within ADP’s organization, they say that designers should do the research. And you know I’ve observed also, sometimes some people say designers should only do like evaluative research, like prototype testing type research because that’s sort of more straightforward. You know you get a task, you ask people to do a task. It’s a lot harder to like, I don’t know, lead people, I guess. But some people might argue that they might be biased and they might interpret – if they love their design too much. Again, it goes back to are they really willing to learn, or do they just want to like collect data to show that they’re right? It’s just like different ways that different people are. So, some designers probably are great at doing that and some designers might not be. And then some people might say designers actually should be part of the exploratory research so they can understand it more deeply and really – you know like we talked about earlier, having collected the data you understand the data more deeply and really understand the pains that they’re designing for better when they get to the design process. I guess you could add those together and say designers can do any kind of research. And I think that’s probably true. And then, not only designers, but product management, could also do that stuff too. We definitely encourage everybody to get out there and talk to clients and understand what they’re doing. Sometimes I worry that if you get out there and talk to one client you might overcompensate for their unique need, or pain, or whatever, which could ruin it for the other 99,000 clients or whatever. So, we do, in that program I was talking about, try to force people to participate in more than one data collection point.

Steve: I think there’s just lots of interesting issues around this. And you’re right, this is a very common topic, or question or debate, or just issue. I mean I just wonder long term – I mean right now there are people with the title researcher who have sort of expertise in all the things that you’re talking about. You listed a bunch of different aspects of what researchers are doing and I wonder sort of where does it take us, and I don’t know whether it’s 2 years or 50 years, where we’re holding on to some part of it and we have a belief that we bring value by doing that and we’re empowering other parts of it. And we believe that that adds value. Sort of wonder what will that ratio be? Or what should it be? Or will we have a role with this title in the future? Or is it a process that then gets handled a different way? I don’t know. I don’t know if you have a perspective on the misty futures?

Jesse: Yeah, I don’t know. It’s a tough question for sure. I guess while you were asking that question it reminded me of – I heard a great quote recently from, I forget his name. You probably know his name, the guy who leads the design team at InVision, his quote about design thinking was “the worst part about design thinking is the word design” because everybody should be thinking this way. Obviously empathy is the number one step in design thinking. Everybody should be thinking empathetically. And we definitely have groups within ADP too where you – where “UX” has matured enough that everybody is thinking empathetically and they think about UX as sort of everybody’s role – development, product. You know we have the triad formation – UX, Development and Product – and they work together on things. In some places all three of those people are really thinking empathetically and about end user needs. And also in most places the UX people are thinking about is this feasible or not, right? And trying to understand the tradeoffs in terms of like can we still sort of meet the user need in a different way that might be more feasible, or easier to do, or whatever too. I mean I think you’re really being successful at “design thinking” when everybody is doing all that stuff.

Steve: It reminds me of just the evolution from human resources to human capital management. There’s a fundamental shift in the belief of sort of the value of something that we used to optimize and now we want to enable and that you’re talking about sort of how products get developed and this idea of empathy being something that everyone can and should have, not sort of one group is helping another group, but that it’s a shared value or principle. So, I wonder if there’s just opportunities – this is not ADP specific, but just the profession in general to rethink how some of this stuff is structured, to get to that kind of organizational culture, or shared value and belief. Because there’s a skill aspect, right? I mean some of what you’re doing is teaching people how to do some of the mechanics of research which is – I don’t know, I’m talking more than I’m asking. Let me try for a question. Does teaching people sort of the processes and tasks of research, is that a way to help illustrate these other principles like empathy and human centeredness?

Jesse: Yeah, I mean I think the good processes and tactics for research come again from that – I don’t know if selflessness is the right way – the right word, but like the willingness to learn and the ability to admit that I might not be right, or I might not already know what needs to be done. And again, I think everybody needs to think that way. But even going back to the example of the way that I ask questions without saying somebody else’s word, like part of that is because I can’t assume that I know what you mean by happiness, or whatever, right? And so I think teaching that sort of instinct to not make assumptions is a big part of teaching the tactics for doing research. And it’s very much related to the reason Product and Development are asked to take notes during the site visits: because in my experience Product will do just what this woman did when I tried to present the ruler thing to her and say, “oh let me show you how to do that.” Like, in response to a question of how should I – I wish I could do something rather than say why would you do that? Or, how would you like to do that? They would say “here’s how to do it.” And that’s like not going to help you learn anything. It might help them learn for that one moment, but if you can learn what’s – you know ask the 5 whys? I love the 5 whys. Ask the 5 whys and bring it back and understand that and like build towards the 4th or 5th why. I think you’re in a lot better shape.

Steve: Can you explain the 5 whys?

Jesse: So, the 5 whys is a technique of interviewing where you just basically – somebody says something and you say why? They answer that and you say why, why? Basically, it’s getting from the surface level, like tactical, like I can’t do this, to why they can’t do it, or what they would want to do? Because then you can – maybe they don’t actually need to do that. Maybe there’s something an hour ago they should have done. You know what I mean? And you can solve their problem that way. It helps broaden the potential solution space, I think, by understanding the root problem, as opposed to the surface problem. But yeah, I mean I think the 5 whys is really about understanding the root of the problem and that’s what research is about is understanding the root of the problem, especially more like say discovery or exploratory type research, where generative type research is digging deeper into the root of the problem.

Steve: And there’s a lack of presumption that you understand, I think. Right, maybe a sort of question/answer thing is ask, “what are you doing?” “Oh, it’s this.” But I think you keep coming back to this principle of – it’s our willingness to learn, but also the not presuming that we do understand what the people we’re interested in are doing. So, the more you ask why the more you’re – it might even just be an interesting signal to yourself, the more you say it’s okay to keep asking why the more you give yourself permission to not know something and then not know something else and not know something else.

Jesse: And I actually – I set that expectation. Like if I’m doing an interview, part of my standard sort of interview introduction is I’m probably going to ask you some stupid sounding questions, I’m just making sure that I understand what you’re saying. You might feel like you just answered that question, but I’m going to ask it anyway. Please just like forgive me in advance, bear with me on that. And so it gives me permission to ask some really dumb sounding questions, but sometimes I get surprised and it’s like the best question I’ve asked all day.

Steve: You know back to our consultant and internal person thing, one thing that a consultant can do is not – I mean I guess you can set it up this way – you are likely, I’m guessing, to be from ADP a lot of these times?

Jesse: Yeah, totally. I think that’s definitely like going in naïve is a great thing because you can genuinely ask a stupid question.

Steve: Do you have to overcome this sort of ADP relationship in setting up that kind of framing, to say I’m going to be asking dumb questions?

Jesse: Yeah, I mean I think – I mean I think for me it comes kind of naturally and maybe from my training, or just the natural way that I am, but I think – and like I said, I set it up at the beginning so that it doesn’t seem weird.

Steve: So you mentioned a little bit about – you mentioned your training. Do you want to say a little more about – you mentioned this market research you started off working in, but maybe you can go back before that?

Jesse: So, I went to Tufts University because they had engineering and a good psychology program and I didn’t know which one I wanted to do. Little did I know there was such a thing as engineering psychology. And Tufts had some courses in that. I ended up just doing psychology and then – so then – the story I like to tell is I went and interviewed for like psychology jobs, like – and realized I have enough problems of my own and I don’t want to deal with somebody else’s problems. So, I ended up finding this market research firm that was rooted in what they called Adlerian psychotherapy technique, which is this idea of just like not ever saying anything and just letting the people talk and talk and talk and talk. So, I really, like, enjoyed that and around that time like there were new technologies coming out. You know I got a Palm Pilot as part of my work and it was just really hard to use and I learned about this thing called human computer interaction and Tufts had just started like a certificate program in human computer interaction. So, I took some night classes. Thought it was really interesting and like the perfect combination of my psychology and engineering interests. And obviously that was in Boston. I grew up in Syracuse. Both very cold, snowy places. So, I decided to go to Atlanta and get out of the snow for a little while. I went to Georgia Tech and studied what they call engineering psychology, or human factors, and really learned a lot about like social science and statistics and like hypothesis testing and like really like hard core academic research which I don’t necessarily use today, but like I think having that basis helps me think about how to create a research project too. Even the exploratory stuff, I still kind of like have a hypothesis testing orientation to that. Even the qualitative research where I just had people talk and talk and talk. It’s not the same as collecting like hard data like I would have in school. And then, yeah, so from there sort of got into human computer interaction. You know they have a great Master’s program. I went to the engineering/psych PhD program because they would pay me rather than me paying them. That was another sort of thing that brought me along there. And then ultimately knew that like I was going to return to New York at some point so came back up here and got into the industry. So, I mean all that – so, it’s very psychology heavy. It’s very social science and like experimentation heavy, my background.

Steve: Yeah. And the team here – what’s the team look like?

Jesse: My team are 6 researchers and we have a mix of backgrounds. So, it’s not all psychology people. You know over the years I’ve taken people from biology programs, which is also scientific, but not – you know I’ve taken people from like design programs and they’ve all been good, you know great people. I’ve had people with human factors backgrounds. You know. So, we have six researchers and basically we’re like semi-embedded in like the products within shared products. So, like we – each of them sort of has a product that they focus on and they have their design and product management partners that they like do a lot of work with and collaborate with a ton and basically they are on that product team, but we all come together every week for our team meetings, but also in addition to let each other know what we’re doing and share ideas and help – you know bounce ideas off each other. And I think, you know, we kind of have the best of both worlds within our little product of being like embedded – you know part of the product team, but also centralized where we can help each other and get support from other researchers. I’ve been a researcher of one at places too and like that’s great. You get to do everything and like nobody questions you or whatever. Or they question you, but you get to design all the studies. But I’ve found that I wish there was somebody that I could like ask advice on this stuff, like research design or something. Like what do you think you would do? Or like help with analysis, or even just like take notes and like put two heads together at the end. Like what was the biggest theme that we found here? So, we get a little bit of both. We get to be a part of the product, but we also have this sort of small research community. And then at ADP there’s another, I don’t know, probably 30 or 40 researchers actually and some of them are two within a group of 10 UX people. Some of them are larger. So, it runs the gamut. But that’s how my team sort of is structured.

Steve: And what’s your role? What’s your sort of leadership role within that? What does that look like for research?

Jesse: So, I help with prioritization of the projects across – like I said, they’re semi-embedded, right. So, if one person is way overwhelmed I might say, “hey, Joe, go help Jane on this project,” because like they’ve got a lot to do and that’s important to my boss, right? Like I help them make sure that we’re focused on the things that are more visible and more strategic for the company and the group as well. That’s not necessarily the part I like the most. The part I like the most is sort of helping people think through their research plans and projects and understand like – so, this person came to me with this problem, but like what does that mean? What should I do? For me the most fun part about research is sort of translating like the request – “oh will you do a survey on this” – to like what we actually should be doing and like designing that project. I really like – to me it’s a nice challenge to do that. So, I institute some, also like processes or whatever, like define the research goals. But it’s mostly to help me help them, but also to make sure like they’re not going into a project unclear with their stakeholders of what they’re doing. Because I’ve also been at ADP long enough that I know most of the people that they’re going to be working with and I know their strengths and their shortcomings and how to like work with those. So, I think that’s another big part is like I’ve learned the ADP culture and like what works here really well. Like establishing the research goals is huge because many times I’ve been burnt with, “well I – well, did you find anything about this?” Well no because we didn’t go looking for that. You know what I mean? “Well why not?” Now I can be like because we agreed that this is what we want to do. It’s a little bit of a CYA, but it’s also like to make sure that maybe they had that in their mind at the beginning, they just didn’t say it to me. So, like working through that with the stakeholders to make sure that I know everything that I need to and now the people on my team know everything that they need to.

Steve: What’s the process for extracting those goals for a research project?

Jesse: Yeah, I mean, so, like I said they’re semi-embedded. So, for the most part they sort of understand it naturally just like in day to day conversation, or like working together and that kind of, you know. But, I mean the process we have like a little bit of a research plan form and there’s like 5 or 6 elements that we have to fill out. You know the research goal, the guidance is 3 to 4 bullets that explicitly state what we’re trying to find out, but it isn’t so generic – like just 3 or 4 bullets, right? So, not too specific. Also not so generic that it could have been the last project you did. You know what I mean? And then like timing is you know an important one. Kind of what are we going to do with this? So, once we answer these questions like are we going to take an action on that because if there’s no answer to that we probably shouldn’t bother doing this project. Or we can deprioritize it if we have another one that has a big action. What else goes in there? Oh, like the different kinds of materials we need. So, like if we need a designer to build a prototype, like we make that explicit and get everybody’s agreement that like there’s somebody that’s going to help us do that. You know that has the time to do that. Off the top of my head I think those are the main things. The other big sort of process that I try to – try to get everybody on the team to do which I think has been really successful at ADP is in the presentations. So, we do like a classic presentation where we show a picture or like talk about the thing we were trying to understand if it’s more exploratory and there’s not like a prototype and all the things we observed or whatever. But then at the end we summarize what we call our insights table. So, it’s all the insights restated, so it’s our summary. But it’s pretty detailed. And we have our insights framework which is the observation, which is basically the data. Like nobody can argue with that. Then we have a recommendation which usually follows really logically from the observation. And sometimes it’s like so obvious it’s kind of like why’d you even say it. But again, as a researcher, that’s why I ask dumb questions and I say dumb things just to make sure that we all sort of agree on things. And then the action, and that’s what we workshop with the team at the end of the presentation to establish the action. So, the recommendation, the guidance there is it should be descriptive, not prescriptive. So, it’s not like make the button a brighter color so people can see it. It’s make the button more visible. And then like you can make it larger. You can make it a brighter color. You can put it in a different spot or whatever. Like the team figures out like what’s the right way to do it, that fits in with the rest of the product and is feasible and blah, blah, blah. Because we’ve had a lot of times where like we would make a very specific recommendation and there’s some thing in the background we don’t understand that makes that impossible. So, then everybody is “can’t do that” and they’re just like “I guess we can’t fix that problem.” But like, you know that’s not a good – we’ve got to work through the right answer to that problem. So, and then we have that framework and depending on the team we’re working with, you know some teams might want the prescriptive action filled out. Like they might want us to make a specific recommendation and then talk about it.
Some teams, if you make a specific recommendation, they’re going to react negatively and not really want to hear anything else. So, we leave that blank. Sometimes we fill it in, or whatever, and it helps us be – also be very explicit at the end of the presentation. Like, what are we going to do now? And then we take all those and we add them – you know everybody takes their little table and they put it into – we have an Airtable that houses all of our insights from the year and I can then report to my boss, you know we had 750 insights and 350 of them like actually had impact on the products. And that’s the kind of stuff he cares about most.

Steve: I love this separation between sort of the qualities of the solution and the specifics of the solution. These sort of last two columns. And I think I’m probably misaligned in terms of my own language versus yours because I feel like there’s a column missing, which I’m sure it isn’t. But if you’re saying sort of observation, like where is interpretation or synthesis?

Jesse: Yeah. So, the observation is like this thing doesn’t fit people’s mental model.

Steve: It’s in there.

Jesse: Yeah, it’s in the interpretation.

Steve: So, that thing that you’re not – I mean there’s a rawer form of data that’s like outside that window that’s like the left most column.

Jesse: Yeah. That would be on the previous slide that has like a picture and like an arrow and like…

Steve: Here’s what’s happening and then your first column observation. Okay. So, that’s our misalignment. I would give that a different label, but I see what you’re saying.

Jesse: Yeah. Each of these observations is like a thing that we saw a few times, or like a theme maybe or something like that I would say.

Steve: I think we get so messed up on these words. What’s an insight? What’s an observation? What’s a theme?

Jesse: Language is tricky. I mean that’s also like something that you see a lot, sort of everywhere, is miscommunications based on using the same word to mean different things.

Steve: So, yeah I think it obviously makes sense within a consistent practice that you run. I’m just like picking at you to translate it so I understand what it is. Thanks for doing that. I mean I really like breaking those pieces apart and being able to help a team if they need specific action items, or empower them to make that decision themselves so that there’s a way that acts on the research and that you’re being very responsive to the way that those teams work and that your process supports a variety of different teams and kind of energies.

Jesse: Yeah, I mean we also find if the team comes up with the action together they’re much more likely to actually implement that because like they can be like that was partly my idea so I’m going to make it happen. You know what I mean? Rather than somebody told me to do this and who’s this researcher? Like they’re a peer of mine, at best. You know what I mean? Like they’re definitely not telling me what to do, so I will – that’s another thing that I see as a researcher’s role – supporting other people to do their job better. Actually, I’m interested to hear your opinion in terms of like – so now I’m going to ask a question.

Steve: So, we’re at that point. Okay.

Jesse: Like one of the challenges that we face is sort of getting “credit” for the work that we’ve done. I think a lot of what we do is help people understand the problem space better and understand these things that their users aren’t able to do, or want to do, or whatever. And oftentimes it’s not like going to be brand new. Like rarely do you come up with something that nobody’s ever thought of before. A lot of times we help solidify or better articulate those problems, which then you can attack much better. But in the end, the product managers are like, look at this great idea I had, and half the time I know that research definitely inspired that idea or helped to figure out how to do that, but like it’s hard to sort of take credit for it. So, that’s part of what this insights tracker is for so we can say that this action came from this research and get a little bit of credit for it. But I sort of view a researcher’s role as helping everybody else do their job better, which leaves it, again, as their job to do these things. What do you think about that?

Steve: I mean you’re just tapping into a bunch of things that I’ve been thinking and talking about and literally had some of this conversation in the 45 minutes before I’m speaking with you now. Where I was talking with someone about sort of the facilitation activity of things are stickier if people come up with them themselves, so helping them have an idea that they think is their own, and then she said what you’re saying, “well then I don’t get credit for it.” It just seems like another, beyond our language misalignment, we also are not maybe collectively aligned around what are we here to do and then how do we measure success? If we’re here to empower people then we have to measure that. And I took some umbrage recently about – there was some – I guess there was, I think, a talk happening and some people were tweeting about it and the quote that kept coming around was, “my research has no value if someone else doesn’t take action.” And, I mean it made me a little angry, but it mostly made me sad that – I mean I think researchers work very hard and are smart and passionate and are often in situations where they maybe don’t feel as valued or feel frustrated based on their expectations about what’s going to happen. My goodness, we don’t need to take that on ourselves and say what I’m doing is not valued unless somebody else takes an action because that to me is ceding all control of success to somebody else because you can’t control if someone else takes action. I mean I say this having worked on research projects for a very long time, in a variety of contexts, and having to sort of let go, or redefine for myself what success looks like. Again, as a consultant I don’t have sort of manager driven OKRs that affect my compensation. So, I might have the liberty to think about it differently. But yeah, if somebody else’s action is how you measure your own success, that just lets go of so much power and control and there are lots of reasons why other people don’t do things. And if we’re in the business of, you know – it’s back to your example of that person who was like well let me go tell them the right way to do it. I mean if you’re – and obviously you want to have a better outcome than that, but if your sense of worth and the quality of your work is based on that person’s choice to interpret that information that way, that’s a lot to carry and I feel like we need to redefine some of these things about what success looks like for research and like what are we here to do, right? If we’re sort of – I mean being a supporter is not the same as being an educator, is not the same as being a facilitator. Those are all adjacent words, but they are a different sense of what we contribute. I don’t know. That’s kind of a little rant from me, or a long rant for me because it’s definitely struck a nerve. So, I hear you.

Jesse: I think you said a lot of things that resonated with me too in that rant. It’s good to hear other people are sort of thinking about that too. But I really love the idea of maybe redefining the metric. So, like I said, I report up to my boss the number of insights that we’ve had, or observations that led to an action, right? Like, there’s a million reasons they didn’t happen. Or they might have done a different action that I don’t want to record because I don’t like – so, it’s not a great measure. But yeah, I mean part of it too is like can we take shared responsibility for that success of the product, right? Rather than the product manager being like look at this great product I got, or I have, and like taking all the credit for the – some product managers won’t take all the credit. Some of them will. It’s the way some people are too, I think.

Steve: Well you were pointing loosely towards a future, earlier in the conversation, where this is a mindset, and a set of processes that are sort of equally distributed across a lot of functions. So, if that were to be the case then credit also is distributed, and sort of these different skills and processes are pulling together to change a product, or change a product – aspects of it that get shipped. So, then that – you know it might be more of a nirvana state that we’re sort of playing with, but that – that sort of addresses this question about where does credit go? The credit doesn’t need to be sort of parceled out because it’s about what is achieved collectively. Sorry, if that’s like too socialist an idea.

Jesse: Well, honestly, like where – like I said earlier, I think where UX is functioning the best, the triad is taking responsibility and credit collaboratively. And I think that’s great. I think that’s sort of what you’re talking about too.

Steve: Well it’s a good future vision I think that we can kind of hold on to. Is there anything else you want to talk about today?

Jesse: No. I don’t think so. Thanks for having me. It was a fun conversation.

Steve: Thank you so much.

Jesse: Thank you.

Steve: Thanks for listening! Tell a friend about Dollars to Donuts, and give us a review on Apple Podcasts. You can find Dollars to Donuts on Apple Podcasts and Google Play and Spotify and wherever fine podcasts are served. Head on over to portigal.com/podcast to get all the episodes with show notes and transcripts. Special thanks to DJ Anne Frankenstein. Our theme music is by Bruce Todd.

Sep 05 2019

1hr 2mins


Rank #3: 24. Ashley Graham of IBM


This episode of Dollars to Donuts features my conversation with Ashley Graham, a design research leader at IBM. We discuss synthesis as a collaborative, co-located activity, being mission-driven, and building a process that addresses complexity.

When I look at the wonderful research community, I don’t see a ton of people that look like me and so even by talking to you today I have a hope that we’re growing and that we’ll continue to see more diverse faces, diverse ways of thinking and diverse backgrounds represented in the field. – Ashley Graham

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Hi, and here we are with another episode of Dollars to Donuts, the podcast where I talk to the people who are leading user research in their organization. In 2018, Kevin Mims wrote in the New York Times about the Japanese word tsundoku – a stack of unread books. In a New Yorker article entitled My Father’s Stack of Books, Kathryn Schulz reflects on what her family referred to as The Stack, the books that accumulated in her parents’ bedroom, especially on her father’s side of the bed. These weren’t just books to be read, but also books that were recently read that should be kept near at hand. She estimated that The Stack contained 300 to 400 books.

For me, I switched a few years ago to getting rid of most books, passing them on to someone who might like them, or giving them away in the community via Nextdoor or Freecycle. I felt like hanging onto every book was becoming increasingly unmanageable, and in some ways was creating a barrier to acquiring – and thus reading – new books. My tsundoku serves as a last-in-first-out queue, but for me unread books go in the bedroom and books I want to keep go on display on the bookshelves.

I was a voracious reader of books as a kid, and at this point in my life, it’s something I need to make a deliberate effort towards. I read on the Internet all day; I read several magazines regularly. I read a print newspaper every day. Plus I’m trying to watch a ridiculous number of television shows and movies on all the platforms. And oh yeah, podcasts, right?

As a kid, I really got into science fiction and especially sci-fi short stories. At a certain point I set that genre aside but maybe 15 years ago I came across a phone-book sized annual collection of sci-fi short stories. And then the annual Best American Short Stories series. These books have got built-in portion control – read at least one story in bed, before going to sleep. But then I’d find myself in a bookstore staring at the shelves without any clue about which ones in the series I’d read. And even if I was home, if I’d given the books away after finishing them, I couldn’t just go check my shelves to avoid repurchasing something I’d read. That’s when I found Goodreads – a website and an app where I can organize the books I own but haven’t read, the books I have read, and the books I don’t own but want to read.

There’s a whole set of other things you can do on Goodreads. You can connect with other people and get their updates in your feed. You can post progress updates as you go through a book, and you can write and read book reviews. But for me, my primary motivation was to have a single location, available to me anywhere, to see what I had already read.

My Goodreads usage stayed casual and intermittent for a couple of years. But when my local library opened a brand new building, I went in and renewed my library card and that triggered a deliberate and focused shift to reading more books. I think the stack beside the bed has not decreased much in size, but I’m fine with that. I’m making use of the library website to put books on hold, maybe it’s a book I learn about from Twitter or an article in the newspaper. I’ve also been saving books on the library website that I’ll want to put on hold later. I’ve been getting books from other libraries in California. I’ve been reading graphic novels and regular novels, both in print and on my iPad. One of the cool things about the library is that borrowing a book means that I have a deadline. It’s not just that the book is expected back, it’s that the deadline makes reading the book into a tangible accomplishment. It’s due back on a certain date, and I finished it, and I returned it. I mean, I’m also free not to finish the book, but even so, that’s something I can metaphorically cross off a list. And then this is where Goodreads comes back in, because even though I haven’t connected with many people and I don’t care too much who sees my updates, going to the site and marking the book as “read” with today’s date is another marker of closure. I am definitely doing the reading for the reading, but there are these additional rewards, other bits of satisfaction, I guess we call this gamification if that’s even still a thing. I might write a tiny review, occasionally I’ve asked a question when I didn’t understand something, but mostly it’s about making that mark when it’s done and then feeling a sense of pride or accomplishment from the accruing list of books that I’ve read. This public-ish list announces something about me and what I value, and I enjoy building that even if it’s just for me to look at and reflect on. Taking a book off the stack has a tangible satisfaction, but I find it a bit more diffuse to move an actual book from a stack to a shelf than to do the analogous operation on an abstracted book in a digital interface. Go figure.

But this is just how I use Goodreads. I’m sure it’s hardly unique, but it’s also one of many different ways that people could and probably do use the site. I mean, I don’t have any idea what other usage models are. But of course, we have processes and tools for finding out! And this isn’t just about Goodreads but for any product or service that has many features, different sections, different experiences, it’s essential to try to understand how people are using your product. The risk is in only considering the product as a set of separate features. Who is using this particular feature, and how? And for another feature, how is that being used, and by whom? If you are working on a product that has this many facets, you are better served in learning about the bigger picture that may weave its way through different features, as well as other tools or products that support the underlying goal. If you want to improve, or optimize, or extend the capability you are offering, you’ll want to do so based on an understanding of those goals, not just feature by feature. That means you need to think about how your teams are organized, and sometimes the way you’d organize a software team to build different features isn’t the same as how you should organize designers to design those features, and almost certainly not how you should organize researchers to inform the different decisions you’re making across features.

Of course, this is the kind of thing I can help with. Often I work with clients to help them get a handle on what their customers want to accomplish, and look at how the team can focus product and design decisions to best support the people that use their product. The teams I work with come from a variety of industries and have varying levels of experience in learning about their customers and acting on those insights. I see this podcast as an extension of that work, something that I’m able to share with you. And so, the best way to support this podcast is to support my business. Hire me to lead user research projects or to coach your own team as you talk to users. Also I run in-house training workshops to teach people how to get better at fieldwork and analysis skills. Get in touch and let’s discuss what we might do together.

I’d also love to know more about how this podcast is helping you in your work. Email me at DONUTS AT PORTIGAL DOT COM or on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

Let’s get to my conversation with Ashley Graham. She is a design research leader focusing on digital, at IBM in New York City. Thanks for being on the podcast.

Ashley Graham: Thank you for having me.

Steve: I have a very, at this point, traditional way of starting which is to ask you to introduce yourself. Who are you? What do you do?

Ashley: So, I’m Ashley Graham. I lead design research for the digital part of IBM. That means we focus on the customer journeys that our clients, or potential clients, are taking. And we bring a mixed methods approach to both qualitative research, quantitative research and generating insights that can help drive our business and drive innovation for our users.

Steve: Can you give a little context about the digital part of IBM? IBM is this huge company, that for those of us outside – like I personally don’t have a good mental model like what this company does even in 2019 and how it’s divided up and sort of where you are and what your efforts are focused on.

Ashley: Yeah. So, as you can imagine, IBM is huge. We have a large breadth and depth of a portfolio. Digital really sits across all of that, right. So, we have more legacy parts of our business. We have newer, innovative parts of our portfolio and digital really seeks to understand across, right, what are the user needs, our client needs, across the portfolio? How can we bring all the parts of IBM together, all our capabilities and expertise and bring that through a digital experience? And so that’s not something that IBM has traditionally done. It’s kind of something new and so our team, the digital part of IBM, really drives best practices around digital experience, both from a design and also a business perspective.

Steve: Can you unpack digital a little bit? Again, I’m just pulling in very old mental models of IBM as kind of back in the legacy days, you know like a hardware company.

Ashley: Yup.

Steve: So, what does digital mean now?

Ashley: That’s a good question and that’s the question we’re answering. IBM has this over 100 year history, right. We’re going into our 108th year this year. And traditionally we’ve been face to face, right. The client comes to the client representative. They tell them about their problems. They tell them about their architecture and any infrastructure needs. And then the client rep goes back to IBM and says okay, what do we have? How can we serve the client and move them forward in a way that serves the people within that business and then the business needs, right? But, if we want to scale, if we want to grow, we want to transform ourselves to really thrive in the 21st century, we have to shift that model, right. We have to think about digital ways of finding new customers, our customers finding us. We have to think about digital ways to self-serve capabilities. And then we have to think about excellent user experience, right, because that’s a critical part of a digital experience is having intuitive ways to find things, get my tasks done and move on to the rest of my day.

Steve: So, how does the idea of journeys, which you kind of mentioned right off the top, what does that mean in that context?

Ashley: Yeah, so that’s a good question. I think – if I think of a B2C company, right, or a consumer company, right, a journey might be me taking a ride, right, to California. How am I going to get there? We could break that down to me discovering you know the best way. Am I going to take a plane, or a train, or a car? What are the different things I need along my journey? How do I get there? At IBM we kind of have to zoom out because the journeys are much more complex than taking a ride as an individual person. We’re talking about leading and helping organizations achieve their key routes to value. So, we like to actually use this metaphor of the United States and the federal highway system. Right. Before the federal highway system was implemented in the U.S. if you were trying to journey across the United States it was quite a journey, right. Each state has their own way of moving people through roads. They have their own signage, own signals and best practices. So, we are really looking to create those interstates – the fewest, fastest interactions that it takes to solve a large enterprise’s needs. And so that means we have to think about the whole spectrum of technologies, people, processes. There’s our client organizations’ own transformation and we really have to understand a lot of different things in order to design for that. So, a journey really is like an organizational transformation for our clients. It might be helping them transform the way that the people work within the organization. It might be taking them from an on-premise hardware infrastructure to a Cloud. And so it’s a big journey that we’re talking about, at the level of IBM.

Steve: Given as complicated as it is to figure out how you’re going to go from here to California, for example, the complexity of the things that I think you’re talking about at an organizational level, it’s a little mind boggling I think to – there’s just complexity. There’s a lot of complexity.

Ashley: And I think too, when I’m traveling from New York to California I probably have less constraints than an organization that’s invested 40 years in their infrastructure and is looking to like how can they maintain and build on the investments they’ve made over time. So there are a lot of constraints we have to consider, right, within our client organizations. There are a lot of long term needs that we have to consider. And so, from a research perspective, it’s really about putting together the full picture of our users’ needs and being able to consider and design for that, in sort of a flexible way. And also being able to reflect that we understand those deep needs and then being able to show that we can guide, right, through complex decision making. Through long term projects. And ultimately to help our client organizations arrive at a future state.

Steve: Okay. So, I’m a big enterprise that has some technology infrastructure that’s been around for 40 years and I come to you and the rest of the IBM and say I’m interested in making certain kinds of changes and others that I don’t even know what they are. What kinds of activities do you and the rest of your organization – well, let’s talk about you and the team you work with. What kinds of activities do you undertake to start responding to that?

Ashley: Yeah. So, that’s – our team is really focused on organizing what that experience is like. And so, say I’m going through a Cloud transformation, right. Or the client is journeying to Cloud. That requires a lot of different decisions, right. So, what we do is we – my team looks across IBM and says, okay, what are all the capabilities that we have from a services perspective, right, like consulting. A technology perspective. Maybe IBM comes in and helps guide those decisions over time, or even embeds and helps in some of the architecture of development. But it really – in order for us to serve, right, those transformation needs, we have to bring together a really diverse set of capabilities that IBM really excels in.

Steve: So there’s an internal examination. You’re talking about looking inward and saying well what are the pieces we can put together here? But are you – is there – how are you assessing what their needs are to figure out what pieces you would bring to bear?

Ashley: That’s a great question. So, I think there’s that inside out, there’s the inside out view. What do we have? What capabilities does IBM have from a technology and expertise perspective? But also we have to look outside in. Right. We can’t just rely on our historical knowledge or our internal strategy. So, we do a lot of actual ethnography and talking from – talking to people that maybe haven’t been with us for 40 years. Talking to people that have been with us for 40 years and understanding the patterns of needs. And we kind of talk about it in terms of key needs, or key journeys. Like what are the most important things that are happening in the market, from an IT perspective, or Cloud and cognitive perspective. We bring those in and we kind of match ‘em up – them up to our inside strategy and see okay for 2019 what are the key journeys that we need to focus on and how can we line up our capabilities to serve those? So, it’s kind of this dual lens, right. We’re looking inside out. We’re looking outside in and the synthesis work really is understanding where do we put focus? How do we organize ourselves to be able to deliver on those keys things?

Steve: Right. There’s something I’m grappling with here to find the right kinds of questions because I think you’re talking about like doing research to plan. You’re not talking about doing ethnography to figure out what the solution looks like? You’re doing ethnography to find out what sort of this larger context, to figure out how to take on a much more complex process.

Ashley: Exactly. And really it kind of reminds me of like a systems thinking perspective, right. We have to map the full spectrum of needs – understand right – if you think of a service design model, what’s happening off stage, what’s happening front stage, as users interact with us, and then what’s happening backstage. So, we map that full spectrum and then we decide okay, what’s the best way to deliver on the needs that we see across.

Steve: Is there – so again, my model may be kind of broken here, but if it’s kind of planning to plan. Again, I hate to put my labels on top of yours, but I’m working to make sense here. You’re not – you’re setting the stage for sort of exploring and building a solution for IBM to do that. So, even in doing this, building this large map, this initial map, you are looking at needs and looking at processes and journeys.

Ashley: Um-hmm.

Steve: Does the building and deploying of these kinds of solutions that support transformation, are there activities in that process that look like what we might consider user research?

Ashley: Absolutely. So, I should say my team is really focused on journeys, right. The customer journey. There are research teams that focus more on solutions, right. And so we have to have this feedback loop between the teams that are working on products or services and that are more embedded in the day to day user research, right, generating requirements needs, evaluating the quality of the user experience. Right. Working with engineers, working with product managers, right. There are teams that are very dedicated to that and they have incredible expertise in the technologies in that context. But we also have to keep in mind, if we’re talking about these key journeys, how do those products and services add up? Right. How do they add up to really solve the big needs that we see from our clients? And so someone has to kind of be that layer on top. And the mechanism to pull together the teams, pull together the expertise and capabilities that we have within IBM, and so that’s our team. And so in some ways we’re researchers and we do user research, right. We bring that outside/in perspective. But we’re also service designers in that we’re really designing how the full system in an organization works.

Steve: I feel like there’s another facet to what you’re doing though, and maybe that just falls under the umbrella of service design, but creating this deep understanding which informs this plan – plan is kind of the word I keep throwing into this. But also, you know, serving as – being in dialogue with the sort of teams that are doing a little more – the deep requirements, it sounds like that’s a little more day to day. That’s the detailed view versus the overarching view which you’re capturing. I guess this speaks to the complexity and the scope of the kinds of projects you undertake. It’s interesting for me to think about this overarching layer of understanding that’s kind of crafted that is not about what are sort of the finishes going to be on the details of interactions, but it’s really just this big view.

Ashley: Yup.

Steve: That’s really interesting. And so – I’m curious about what I think goes with sort of the scope that you’re talking about is some time horizons that must be interesting. If there’s a certain effort to make a plan and then a certain effort to build and support and roll out things – I don’t know, is there a typical time that you’re thinking about when you work on these programs?

Ashley: That’s a great question. So, this is a fairly new effort. You know we’ve been working with this customer journey and user need centric way of thinking about the future of IBM for about let’s say 2 years, or less. And so there was a phase where we were really framing like what does this look like? How do we talk about it? What are the words? What’s the right language to use? What is the relationship between a user, a team, a journey, a product, an IBMer? We were establishing that frame, building on the practice of service design. And then there’s a phase of piloting that’s kind of the phase that we’re in now. So, you might have a great frame, but how does it get really built into the DNA of the way that the company works. We found that it’s really important to pilot work with particular business units, understand are our methods working, does the frame hold up and do we know about – enough about the way that the business works yet in order to say yes we can deliver this way of working. So, we’ve been on our own journey to do the research, the practicing to finish the frame. And then there’s a future effort to scale. Right. How do we get all of IBM working like this? And so that’s really – that’s kind of the plan. Right. And it requires a lot of influence – stakeholder engagement, a lot of consensus building and alignment. And so that is also the work that our team does.

Steve: As you talk it’s clearer to me that there’s a significant focus on the organization of IBM and how IBM does things. It sounds like that’s where a lot of the effort is right now.

Ashley: Um-hmm.

Steve: Even though ultimately this is outside focused on customers that IBM is supporting, you’re trying to – you are building new practices in terms of how IBM does that.

Ashley: Exactly.

Steve: So your emphasis right now is on, at least a significant part is, on how IBM does the work that it’s been doing in terms of building things for customers.

Ashley: Exactly and I think that’s critical – that’s a critical place to focus, right, because we could generate all the understanding of user needs that we’re able, but how do we build that into the way that everyone thinks and how do we go to market in a way that actually is accurate to our understanding? And so that really requires organizational change in some ways and this work, right, this journey work is building on the design program that Phil Gilbert has been leading over the past 5 to 6 years. So, he came to IBM and said hey, Ginni (our CEO), design is critical to the future of IBM. We need to focus on user needs. And so as his program has scaled we’ve started to see a focus, not only from designers, but from other IBMers, right – engineers, product managers, executives – in adopting design thinking as a way to a) focus on user needs, but also work better cross-functionally across our business. And so this work that we’re doing to really drive customers journeys is building on that. And so we leverage the 2,000+ designers across IBM and we’re really disseminating these methods to them, building on the design practice that we already have.

Steve: So, is that an example? Because you mentioned influence being a major part of what this team is looking at right now. What are some practices for influence that you’ve found successful here, or that you’re exploring here?

Ashley: Yeah, I think being in a company that loves numbers, I think that that’s a reality. I think what we’ve seen is that qualitative research is really important, but it’s actually not always enough. And so I’ve been really thinking about what does a mixed methods practice look like and how do we bring a lens of both qualitative and quantitative that at an executive stakeholder level is like – stands up to snuff. Can we scale the number of data points that we have to help show the size or impact of a problem that we see? And can we then talk about its qualitative nature to help us look for solutions to solve those issues?

Steve: So is that – I feel like I get into versions of this discussion with people all the time around influence and the kinds of things that you’re talking about. And where one thing to do is speak their language. Another thing to do is to – I hate the verb educate because I think it’s so patronizing – I don’t know, to empower people with the mindset of your own language. I mean you see really bad practices where people may do a very small ethnographic study, but they’ll – like very, very small, but they’ll quote percentages. Or they’ll say 3 out of 8 or something like that as a way to make qualitative research a peer to quantitative, which I get the impulse to do that…

Ashley: Yes.

Steve: …because it’s sort of speaking somebody’s language, but it also doesn’t help that person because they’re going to misinterpret – you’re misrepresenting and it invites misinterpretation.

Ashley: Right. I think it’s much more about showing the scale of the need that we’re maybe not fulfilling yet as a company, or the scale of a problem that exists where two parts of our business are not aligned, or you know, there’s a gap in product/market fit, right. So, I think in the mixed methods practice that we’re thinking about it’s much more about like understanding from a data perspective what’s happening across the business. What patterns do we see in behavior? And then laying on top of that, or infusing that with some of the qualitative insights that we get by talking to customers, right. Talking to people that are in the space that we’re working, Cloud and cognitive capabilities, and bringing those perspectives together. So, it’s less about like okay we have a number and it’s more about we have looked across, we’ve understood from what thousands of people are doing, not just 3 to 5.

Steve: So, is that the starting point and the qualitative is the supplementary? Is that kind of the model?

Ashley: They run in parallel. So, on my team I have what we call design researchers that are really practicing user research and service design. We also have data scientists on our team. And so we are kind of running different workstreams, right. Let’s say we’re working on a journey of a client, modernizing their infrastructure to a containerized environment, right. So, we’ve taken that frame as the journey. The data scientists then go to the systems of record and they use that frame to understand all that they can. And they’re really creative. I really want to emphasize that. Right. We have a lot of historical data as a company. How can we best leverage that to inform our future? So, they’re like getting access to support data, to financial data, and really putting together a picture of how our business has run and what behaviors they see. The qualitative researchers are then talking to customers, talking to people on UserTesting.com or using Respondent to conduct moderated interviews and understand what are the needs – what are the business needs that organizations have and then we bring those together. So, that’s kind of an exciting moment to start to see like what patterns are happening across and where can we derive greater insight by bringing those two lenses together.

Steve: How do you create the conditions where these stereotypically different mindsets, different methods – you know how do you create conditions where the gestalt of what both those uncover can be kind of extracted, I guess?

Ashley: Yeah, that’s a great question. I think what we’re finding is that it’s all about making it physical or visual, right. So, we have a collaboration space where we just put everything on the walls – the war room, right. Take what the data scientists have found. Let’s talk through it. Let’s try to understand it even if we don’t have the expertise of the statistical figures. We then put the qualitative insights on the wall, right. And so we’re mapping. We’re doing a lot of mapping – systems mapping. We’re doing a lot of like pain point gathering and making it all visual, making it almost tactile. It helps us to put that picture together, the fuller picture together.

Steve: And this is both groups are doing this together?

Ashley: Right. Well, I should say that is the practice that we’re growing. I think it’s still forming, but yeah, that’s my vision is that we bring all of the insights that we have. We really work through it together and we’re stronger and better in the way that we can tell the business where to go next based on those two influences.

Steve: So there’s a lot of emergence, I guess. You are defining practices that are new to this part of IBM, or new to IBM maybe, or new to sort of the field overall of all the fields that you’re kind of pulling together?

Ashley: Um-hmm.

Steve: What’s your approach to trying to innovate and process? You’re making new things in the way that you’re working. Where’s that coming from?

Ashley: It’s all in pursuit of a customer journey that is excellent, right. We really try to keep that at the center, right. What problem are we solving? We’re solving for our client’s business needs, or our future client’s business needs. So, we keep that at the center, but we are constantly making. We are constantly trying methods, iterating on them, seeing can we get to a picture that is compelling for the people that are making really important decisions for our business’s strategy. And so it’s a constant evolution and I think what’s exciting is when we get momentum or we get resonance – we reach a point of resonance. And whenever we feel that we run with it. And so I think there’s a certain intuition to the work, right, to say okay this is compelling. Like how can we amplify an insight that we’ve had or an artifact that we’ve made and how can we start to bring that into a standard practice over time?

Steve: You’re obviously being reflective about the practice. You’re doing the work, but then you’re also looking at let’s think about the best way to do this work and let’s change that.

Ashley: Absolutely, yeah. I think for us, what we’ve done is we’ve had to make time and make space for that reflection. So, now every week we have like a huge block of time on everyone’s calendar and we all get together and talk through, work through the data that we’ve gathered, synthesize together – that’s critical, right. Synthesizing just the qualitative research in Excel on your laptop is not the best way. It’s about us all thinking together through a synthesis process and that’s how we get to those moments of resonance.

Steve: There’s a couple of things. I think you’re making time work together on the work, but you’re also making time to reflect on how you’re working.

Ashley: Yes, yes. And so – we actually work in agile sprints which I know can be challenging for research, but it helps us to have a cadence of working, doing the work, reflecting on the work and pivoting. And so I think our leader, Sarah Brooks, she leads journeys at IBM. She’s a distinguished designer that’s really leading this mission and she’s been really intentional in making sure that we know the impact of the work that we’re doing, that we’re properly socializing and influencing the rest of the organization and that also we’re really tight on the methods and practices that we employ.

Steve: Can we go back to what you were saying about doing synthesis together?

Ashley: Um-hmm.

Steve: And kind of creating those moments of resonance, I guess. You said that Excel on your laptop by yourself was not really the way to go. I mean I agree with you and I’d just love to hear you like make the case for why – for what happens when people are together? I’d love to hear more about what that feels like and how it’s valuable?

Ashley: Yeah. I think – I’ve worked on a – embedded in a product team as a research lead, but leading by myself and I think, you know, you’re going to – one researcher, one person is going to have a certain interpretation of the data that they find. But when you bring in another person you’re always going to get a slightly different take. You’re going to get a different set of assumptions. You’re just going to have a more diverse, right – even bringing a second researcher in, you’re always going to have a more diverse take on the data that you found and so I think doing synthesis together – there’s two aspects to getting out of Excel. One is that diverse set of perspectives, right. The second is that when you’re in your computer you’re not thinking in the same way that you are on a whiteboard, right. And this is the premise of design thinking, right, externalizing the data that you’ve found. Externalizing your thought process, just helps you move faster and clearer. And so I think getting out of Excel is really about those two things and that’s something that we’ve been really working on in our team.

Steve: It just reminds me of this thing – it’s a thing I remember from being a little kid. I don’t know if this is unique to me, or just general for everybody, but, like being stuck on something in the classroom and like going to stand next to the teacher’s desk to ask a question and by the time I could formulate the question I’d figured it out. Or even if I’d verbalize it I was like oh now I know what it is. It seems like the externalizing process is a sense-making activity, I think.

Ashley: Um-hmm. Even to be able to communicate, right, a data point that you found, or tell a story, right. That’s a synthesis process. That’s a cognitive process. And so I think bringing that out of a person just helps us each grow in the way that we communicate and understand what we found.

Steve: When you talk about just even adding another researcher that’s going to have a different perspective on something – I mean I agree, but I also hear the things that make people anxious about qualitative research. Like you’re sort of celebrating the – it’s an aspect of uncertainty or – right, we struggle with I think being perceived as rigorous.

Ashley: Yeah.

Steve: And when you say well if someone else is going to come in the room they’re going to see something differently than I am, but that’s a strength. So, someone might hear that and say that’s a weakness because how do we know what’s real? Or how do we know what’s true if you can’t even see the same thing as somebody else.

Ashley: Um-hmm. I think too it’s less about facts. It’s more about like what’s important in the data. I think the synthesis process is a lot about prioritization and coding or articulation of what’s important. And so I think what’s important to one researcher might be different than what’s important to another. So, it’s less about like my truth/your truth. It’s more about what’s important to our team user needs and then what’s important to the business, right. Ultimately that’s customer lifetime value and meeting our business goals. So, in getting out of Excel, working together, we can all talk more about like what’s really important and that’s only going to make the work better.

Steve: I feel like this ties a little bit to the influence part of our conversation as well ‘cuz you are running things in a way that I think – there’s definitely struggle in some teams to do this. To have more than one person. To take the time to reflect on their process. To devote the time to synthesis that it requires, as opposed to kind of sitting down at your laptop and kind of pushing out the next set of recommendations. Are there elements or components of the practice that you have been building that we’ve been talking about that you think could be even adopted in small parts by somebody in a different kind of organization?

Ashley: Hmm. That’s interesting. Um, I should also say this is the practice that we aspire to execute on, right. It doesn’t always happen every study, every day. That’s where we aspire to go, but I think one of the special things about the work that we do and the way that we understand customer journeys is that we really are taking a holistic view and when you start to map the whole of what users need, how they interact with us, our relationships with them, and then also what IBMers need, right, it’s kind of hard to unsee once you see it. Once you see how two IBMers are disconnected or their incentives aren’t aligned and you see how that surfaces in the user experience you kind of can’t unsee it. And so I think although I lead the design research practice, I think there’s an element of expansion of design, to this like service design frame, or even like a business model frame, that I think we’ve gotten a lot of value out of, right. So, user research is really important, but it is a foundation to a larger understanding. And so I think other organizations could benefit from that, right, in adopting a more full picture of how user research and how user experience plays within a larger context.

Steve: And you brought it back too to this, in some ways your first customer is the organization of IBM. Even though IBM is focused on its customers as kind of core to its business, you’re serving an internal – I mean everything. You’re serving everything about IBM.

Ashley: Um-hmm.

Steve: And you know that’s different. And then you mentioned sort of the figuring out how to go from New York to California. In those kinds of organizations they talk about being customer obsessed and the researchers and designers are thinking about improvements for the customer, which is also a great thing for a business to be thinking about.

Ashley: Right.

Steve: But it’s different than can we make what we do here as a company better for us as a company versus can we make a better product for our customers. Success is going to be measured in different ways in those organizations.

Ashley: Right and so we are designing – we call it orchestrating and also measuring the whole customer journey and one question we have to ask ourselves is like where do we sit in that customer journey because we’re part of it, right? Backstage. We’re somewhere and I think we’re actually pretty far backstage, right, because our first line of user is the IBMer. It’s then the client organization, but then our client organizations also have users. So, there’s this spectrum of the definition of user and we have to figure out like who’s the priority among that set of users? And where is it best to focus and that might shift over time. Maybe initially we’re designing and establishing practices for IBMers, but once those are set how can we push further towards more of a human centered perspective? And ultimately that’s getting to deliver an experience that helps that end, end user and their needs.

Steve: I’m sure there are some visuals that go with this.

Ashley: There are some big maps. I think it’s also best to give an example of this, right. So, let’s say we’re designing a journey – again, modernizing to a containerized Cloud environment, right. Why – we have to understand like why are people – why are organizations doing that? Why are they modernizing? What we found is a lot of it is about driving better – their own better customer experience, their own better user experience. A faster app, right – iPhone app. A better, right, way to search – for their users to search within their business. And so it’s funny that we’re going through our transformation, our client organizations are going through their transformation and ultimately that comes down to a person using a consumer thing. So, when I pick up my banking app and I check my savings account, right, that’s ultimately coming back to something that IBM is probably involved in. And so how do we keep that experience in mind, that very human, very personal experience of knowing how much I have saved for my future and how do we then comb through the technologies and capabilities and relationships and politics that affect that? There is a great breadth and depth to the work that we do, I guess is what I’m saying.

Steve: So, if this person with the bank account is going to have a better experience the company that – that bank needs to have processes and tools and technology to enable them to be more contemporary in how they make products and services which means someone like IBM can help them understand how to go about changing those processes, changing their tools, changing the infrastructure, so that they can then go make that app. Which means that you and your team have to, you know, build processes and practices that can help IBM help that bank build better kinds of apps to help people have better experiences.

Ashley: Right.

Steve: I just said what you said, but I started going in the other direction.

Ashley: I like that you mapped it back to the other direction and I think as we can better serve our clients, right, we can help them get to a level where they can innovate. So, ultimately, right, the work that we do is about moving towards the future, right, with shepherding new technologies into the world, shepherding new practices, advising, guiding, and ultimately in pursuit of something that’s ideally life changing for someone.

Steve: That’s great. Can I go back to something else that you said before and maybe pivot from that?

Ashley: Yeah.

Steve: You’re talking about that moment of a researcher brings another researcher in to talk about some data points, or some qualitative research. And that this diversity of thought is a – starts to add value because people look at things a different way. You know – the word diversity is being used to represent a lot of different kinds of things now and I guess I’m wondering if you can talk a little about diversity overall in the design research field? What are you seeing or thinking about right now?

Ashley: Yeah. I think that’s definitely something I’m thinking about, my boss is thinking about and IBM is thinking about. From a research perspective, I think there’s a lot to be desired in terms of having a diverse representation in who is practicing it. And I think that’s really important if you’re talking about a new technology, Cloud or cognitive capability, AI, right – we need to have people that have different perspectives in the room, right. If we’re creating a data model and it’s going to affect the way that someone’s bank account, right, or their financial service – it’s really important to have people from different economic backgrounds, or from different contexts, that can understand what are the pressures, what are the needs, what are the pain points of different types of economic situations. Right. So, that’s just one example of where I think we could grow. I think when I look at researchers, right, the wonderful research community, I don’t see a ton of people that look like me and so even by talking to you today I have a hope that we’re growing and that we’ll continue to see more diverse faces, diverse ways of thinking and diverse backgrounds represented in the field.

I think too, a second point, research has traditionally been like somewhat of an academic thing, right. It was something that PhDs did, right, and maybe other folks that didn’t have a master’s degree even were kind of shut out. But I think we’re seeing a shift there. I think we’re seeing people go through things like General Assembly, or self-taught, really thriving in the research community. I myself have a background in architecture and this crazy MFA program called transdisciplinary design. And so although I picked up research practices throughout both of those educational experiences, I think I have somewhat of an untraditional background in the fact that I’m leading, right, a team of researchers and able to really maybe bring more of a generalist perspective to the practice and bring the things that I’ve learned in architecture, from systems thinking and mapping and other practices. It just only helps enrich the field and enrich the work that we do.

Steve: I do want to ask more about your background, but let’s keep talking about what you see when you look at the community of research, or what you don’t see, I guess, more importantly. Right. Let’s just imagine someone is listening to this podcast who’s kind of on the outside of the practice, but is interested in it. Do you have suggestions for them if they also are not from the groups that are currently sort of dominating the population of research?

Ashley: Yeah. Um-hmm.

Steve: What do you tell them? Or what do I tell them? What do we tell them?

Ashley: I think one of the key things to any practice, right, or field is language. So, I think a lot of what we do is about having the right language. It’s having a mental model of people, behavior, psychology, right, cognitive processing, kind of understanding that. So, I think if you can kind of just pick up a foundation of understanding of those things. Like how do we talk about user problems? How do we talk about user needs? How do we talk about motivations? If you can kind of pick up that sphere of things, I think it’s easier to move into the space, right. So, for me methods are important, but methods can be learned. And so I think focusing on the mindset and the perspective that we bring when we sit at a table of, you know, various stakeholders, right, we have to bring that user lens and that’s the most important thing that we do. It’s not about are you the best interviewer? Are you the best person to like write the survey? Because someone can always help you with that. But it’s about do you have the right mindset and do you speak up and bring the voice of the user to whatever context that you’re working in? And if you can communicate that and start to build off of that, I think you can have a viable career.

Steve: Good.

Ashley: Would you agree with that? I’m curious.

Steve: I mean I think your point about mindset vs. methods is really – if I think about just my earliest days in the field and trying to hire people – I mean there weren’t people – it just was sort of underpopulated practice. Whereas now I think there are sort of new people coming in all the time. And I remember struggling with how do we articulate someone who would be good to join our team? I worked at this agency and we used to just talk about people who got it. I mean that might – in 2019 that might be a trigger phrase for sort of exclusionary thinking.

Ashley: Yeah.

Steve: I think what we were trying to say then is the same as mindset. I think we were thinking about that there’s a frame of reference or just a common language around how you think about what we’re here to do. I was at a product design consultancy and, you know, you’re there to sort of serve the client, but also serve the – we were doing research and trying to help the company serve their customers even though our customers were the company making the things. So, there was some advocacy and evangelism and facilitation and sort of understanding hey, here’s where people are at, so what are the opportunities to sort of help them, or empower them, or do something for them, that I don’t think we had a lot of clear language around. It was sort of new to the practice or new for us. It was a long time ago and we were relatively isolated at that point. Yeah, I think – I agree. We’ve had the conversation many, many times about you can learn methods, but there’s this other thing that’s sort of harder to teach. And so I think your advice is kind of hey, get that thing.

Ashley: Yeah.

Steve: That’s the kind of thing that – so, how does someone demonstrate that they have this thing, that they have the mindset?

Ashley: Ooh, I think write about it. Those have been some of the best moments in my career so far where I worked through a project and then I went and wrote about it. Because that’s a beautiful reflection process – what was important, how should I talk about this, right? You might have had one conversation in your head, or with your team when you were doing the work, but when you reflect back on it, like what stood out? What were the major like learnings? What were the moments that really mattered? And I think writing about maybe a project that you’ve done, or how you would like to do the work, could be a really great way to yeah, just get a stronger hold on your own practice. So, portfolio building has been – I think is a great practice. Maybe not to get a new job, right.

Steve: Right.

Ashley: Not just to get a new job, or get a promotion, but just as a reflection practice. And that’s something that we encourage here at IBM. We talk about this thing, the portfolio of experiences. So, it’s not necessarily like oh here is the beautiful app that I designed, but the portfolio of experiences talks about here are the range of problems that I worked on. Here’s what I learned along the way. Here’s how I grew to be a better researcher, designer, visual designer, facilitator. Right. Here’s how I learned to talk to executive stakeholders. Here’s how I influenced them. And that’s so important. That’s so important. And so I know it’s hard. When you’re just starting out you might not have real projects. I hear that a lot. I get that in my LinkedIn – how do I get started if I haven’t worked in the field yet? Write, talk about what maybe you would do. Make up a project. Do a General Assembly course and get some experience and reflect on it.

Steve: I like portfolio, almost as a verb the way you’re talking about it and less about the noun. So, that’s developing the mindset by – so it’s less about this artifact that you’ve created and more a way of kind of thinking and communicating. And then you have stories to share and stories to tell with people that you meet.

So, we started off this thread of the conversation by talking about – you didn’t use the phrase underrepresented, but I think that’s an element of what you’re talking about. But, I guess, as often happens when we talk about increasing accessibility of anything, like this is good advice for everybody, right – regardless of your privilege or your representativeness. I don’t know, is there anything else there for people that are not represented, or are not in the sort of majority of what the field looks like. You said speak up. That was one sort of thing that…

Ashley: Yeah, speak up, have a point of view. But I think ultimately it’s on the leadership of, you know – whoever is leading a team. The responsibility is on us. It’s on me to, when I’m hiring, saying alright I posted this in this platform, I only got these types of people applying, I need to look somewhere else. And I need to consider the full range of people, but I need to make sure that I’m looking for people that aren’t currently represented on the team, whether that’s – you know we call it URM (underrepresented minority) in the HR system. So, it’s like women, people that are black, African American, Latino, Latinx, people that have accessibility needs. That is like – that’s really critical. I mean the work that we do, accessibility is so important in this space. And so having people that have that experience, are like why wouldn’t we? Why wouldn’t we have them? And we’re working on that. And then also people that identify as LGBT or even people that are veterans, right. Those are some of the groups that you want to really look for and prioritize and there are organizations out there that cater to those groups, reach out to them. Go to those communities, right, and that’s how you’re going to find those people. There’s this like concept of a pipeline issue and sure, but right – let’s say you wanted to hire more black designers, that’s something that is really important to me. Go to historically black colleges and universities, right. Partner with the leaders in that organization. Make sure that the students in that school are getting the education that will result in skills that apply to the work that we do, right. There’s a lot of work that we can do that’s not at the hiring stage, but just kind of working a little – just a few years back, right. And we’ll radically, radically change the quality of the skills that the students have coming out of those institutions and radically change the quality of our organization. That’s just one thing you can do, right, that will be really impactful. And so I think – I see that growing at IBM and as a leader I want to be part of that. I want to do that. And it’s not that hard, you know. So, let’s do it.

Steve: So, there’s – I think about sort of these disciplines, maybe nested within each other, that you have – the tech industry and its challenges with – you’ve given very good practical advice here, but, its challenges with hiring underrepresented minorities. And then you have kind of UX or design within the tech industry. And then you have design research within that. I think sometimes, and maybe it’s the same thing across all, but I think sometimes we – like the research practice, when we have these messier things come up – like oh, we should be talking about ethics, that I fear sometimes that design research just says well here – defaults to the UX conversation or defaults to the tech conversation. And I’m thinking about, again, diversity and inclusion in hiring, what our community looks like, do we have specific challenges or specific opportunities, or either way, within design research that we should be thinking about? Is there something unique for us as a field that we should be thinking about?

Ashley: For sure. I think maybe you’re gesturing to the practice, like within doing the actual work.

Steve: I wasn’t, but if that’s where it goes that’s fine.

Ashley: Even in we’re going to do 30 interviews – who are we interviewing and what perspective are they bringing? If they’re talking about the challenges that a user experiences in their day to day is that different for men and women? Is that different for somebody who has full vision vs. like a low vision need? So, I think even just starting there. That’s hard. That’s challenging. It’s hard enough to get to – in IBM as researchers to get to users, our customers, right. That’s an initial challenge, but then to make sure that the people that we talk to are a diverse – represent a diverse set of experiences and needs, you know that’s something we could do better. That’s something we can prioritize and talk about and cite. And so I think that’s one way. But, maybe rearticulate your question so I could answer it more directly.

Steve: There’s a tech challenge, that tech isn’t diverse enough. Tech says we have a pipeline problem and they’re not going to historically black colleges and universities to recruit – sort of the thing that you described. If you look at what tech is doing as an industry. If you look at what design is doing – I mean because those are sort of bigger entities with a lot more voice around here’s the challenges, here’s how we’re failing. I mean design research is a smaller practice. Maybe we have – like it’s a practice that my anecdotal data says it’s more women than men. So, sometimes – I don’t mean to imply that’s sufficient diversity. I mean it just – it inverts the sort of male dominance to a lot of tech. So, maybe that feels good. As a man in the field that’s kind of like oh cool, I’m in a non-traditional kind of thing. It doesn’t mean that I think we’ve achieved the kind of diversity that you’re talking about. But because people in design research come from different places than say software engineers do, because the field offers certain kinds of experiences and pulls on certain kinds of creative mindsets that are different – because we do different work. Like there’s something about research that – like the work we do is about telling the stories of other people who are different than us which maybe you could use to justify not being diverse because our practice is to sort of – or maybe you would use that to say well then we especially better be diverse because of it.

Ashley: Right. I think there is something privileged to design research as a practice. I think especially for myself coming out of grad school, or going through grad school, going to get an MFA was like kind of a privileged thing, right. Become an engineer or a lawyer or a doctor, it’s kind of more certain, right. If you’re investing however many hundreds of thousands of dollars in your education, I think for certain groups, maybe socioeconomically, from certain contexts, right, you kind of want to go with something that’s certain, right. So, I think maybe there’s something to the way that design researchers are traditionally educated that’s a little inaccessible. And I see that changing, but maybe there’s something to that and we can pick up on it and amplify less traditional or expensive ways of becoming ethnographers, design researchers, UX researchers. I think there’s something there that I’ve definitely had conversations about with people. Um, and again, even coming out of grad school, my graduate program, transdisciplinary design at Parsons was 5 years old when I graduated and that was the first year that like people really got jobs. So, you know, working in messy spaces, talking about soft things like feelings, behaviors, is a less certain space to be in. The outcomes are less certain and less rigid and so I think maybe that has something to do with it. And if it is, if we find that, maybe that’s an area that we can like probe in and find a solution.

Steve: Good. These are some good signals I think to toy with. Can you talk a little bit about this program? Like how did you find it or what kind of drew you to it?

Ashley: Yeah, okay. So, I went to Howard University – the illustrious Howard University, as we call it, a historically black university – and studied architecture. I got a professional Bachelor of Architecture. It’s a 5 year degree and by the 4th year I was like absolutely not. Very cool. I loved concepting. I loved research. I loved setting contexts, like okay what is this neighborhood like? How does that influence what this building should be? I love that part of it. I love the concepting. And then the visualization, like the diagramming. And, um, but did I want to go build buildings? And like architects live really difficult lives? They work really hard, right. Um, um, so I said I love some of the things here, how can I pivot or like open up the lens of problems that I want to solve? And so I went and got this graduate degree from Parsons and I only found it because I went to go look at a fashion program and I looked down the list of programs and it was at the bottom. I was like transdisciplinary design, what is that? That sounds interesting. So I went in and met with the director of the program and I was like, oh, this is a thing. Like solving complex problems, talking to people, understanding systems and behaviors in spaces. And I was like oh yeah, I think I’m going to do this. So, that program is really built on the premise that our world is so complex, we have these wicked problems that one discipline can’t solve. And so the practice of transdisciplinary design, if it is a thing, is about bringing diverse sets of mindsets, but also methods, together, and using design thinking kind of as the framing for bringing that cross collaboration and synthesis of those mindsets. So, that’s what I went to do at IBM and it turned out to be a really useful education.

Steve: Did you go to IBM from that program?

Ashley: Yes. So, I’ve been at IBM for about 4 years now.

Steve: The way you described the program – I guess it’s no coincidence the way you describe what you’re doing at IBM and the way you describe what that program’s aims are seem to have quite a bit of coherence.

Ashley: Right, right. And finishing up that program I had no idea that I would work at IBM. I didn’t think about that. But when I went and I talked to the folks that were hiring at the time, I was like oh wow, there is a need for this and it wasn’t exactly the context that I thought, but it’s been a really great place to learn and practice and then begin to lead.

Steve: I guess hindsight always idealizes things in retrospect, but just kind of the way you described it, it seems like the experience at Howard told you some of what you didn’t want to do, but also opened you up to see this transdisciplinary design program. Like you were at a point of sort of creative self-discovery that this filled – this was a thing that you didn’t know existed that then was right for you right at that time.

Ashley: Exactly. And even while I was at Howard, I mean you could imagine it’s a place of social innovation and like you know the student body there is very much engaged in politics and systems and so that was already there, right. That was already something I was thinking about. But then to move into a program where like that was the basis, the foundation, that was the work that I would do, it was like oh, this is obvious. And so I think as you – as I have gone from place to place and learning and growing, I like to follow those threads, right. Okay, systems thinking, mapping, social innovation. Right now it’s about this customer journey, service design, right. You can kind of follow those threads from place to place and build on your interests while building on your expertise and I think that’s something else that I would tell folks that are looking to get into the field, like build on something you already have. If you were a teacher, maybe become a design researcher in the education space and that’s another way to break in and to, yeah, just build on what you already have.

Steve: Do you ever imagine a future for yourself? Think about whether it’s 5 years or 10 years – I hate it to be where do you see yourself in 10 years. I don’t mean it that way.

Ashley: Oh, terrifying.

Steve: You know on these kind of – you’ve made these interesting, I don’t know if they felt like leaps – you don’t present them as leaps, but these zigs and zags I think that sort of came at the right time. If you think about a future for yourself, what kinds of work might you be doing? Or what kinds of roles or organizations do you think are there for you?

Ashley: I think it’s all about mission for me. And that’s why I was so excited to join Sarah Brooks and the journeys team and lead research. But I think it’s all about missions, like I like the big messy, unsolvable problems and maybe that’s like nature, maybe it’s nurture. But I think it’s all about that. So, maybe it’s the criminal justice system. Maybe it’s food justice. Maybe it’s design research diversity. I think anything that’s like hard and interesting and you don’t quite know the answer is something that I want to do. And so I think, if I look 5 or 10 years out, maybe I’m at IBM still doing that. Or maybe I’ve learned enough that I am going out and I want to like focus on a problem. So, I don’t know what it looks like, but I know it’s a mission. It’s always a mission.

Steve: Anything you want to plug?

Ashley: Yeah. So, if you’re interested in IBM and practicing design, or even just learning about how we talk about design and the work that we do, you can go to IBM.com/design and that will kind of launch you into our world. You can also reach out to me. I’m @ashleyograham on Twitter. And thank you.

Steve: Well thanks for a really great conversation. We explored a lot of different nooks and crannies and I think it’s fascinating. So, thanks a lot for taking the time.

Ashley: Thanks, Steve.

Steve: Thanks for being here for this episode. You can find Dollars to Donuts on Apple Podcasts, and also Pocket Casts, Castbox, and Overcast. If it has “cast” in the name, we’re probably there. Go to the website – Portigal dot com slash podcast – to get all the episodes with show notes and transcripts. Our theme music is by Bruce Todd.

Jul 25 2019

1hr 5mins

Rank #4: 25. Juliette Melton of The New York Times

In this episode of Dollars to Donuts I speak with Juliette Melton, Director of User Insight and Strategy at The New York Times. We talk about updating the old “design research” label, user research in a journalism culture, and the role of coaching.

I think that researchers can bring a kind of brightness into a space and a kind of optimism for a team and a sense that we can learn these things together. It’s a bit intangible as a quality, but when we bring on new researchers that’s really something we look for. Like is this person someone who is excited about making connections across an organization? Excited to share what we’re doing? There’s something about bringing energy into research which I think is really important. – Juliette Melton

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

There used to be a restaurant in San Francisco called Heaven’s Dog, and one of the items on their cocktail menu was called “freedom from choice.” Rather than choose a specific cocktail, you’d tell the server your preferred spirit, and a bit of direction on the flavor, and they’d make you something to fit those criteria. While once freedom from choice would only have been seen negatively, here outsourcing your choice to an expert became a positive. The menu still had all the cocktail options for people that knew what they wanted, but for patrons who didn’t know what they wanted, there was an option for them as well.

A number of years ago we started using a meal kit delivery service, first Blue Apron and then later Sun Basket. In case you aren’t familiar with services like these, they deliver everything you need, once a week, to make some number of meals, in our case three. When Blue Apron started, they offered very little choice, we could opt for vegetarian or not, but otherwise, they just sent us three meals. Eventually we could opt out of pork, or fish, for example, and then they began offering a larger number of meals we could select from about a week before.

We immediately loved Blue Apron. It eliminated the planning and deciding, something that we just weren’t good at. Or that we avoided, because we didn’t enjoy it, and so we’d just make the same set of meals based on what we had done before. Using Blue Apron reduced the shopping, but not entirely as we still needed fruit, milk, bread, juice, and so on, for the rest of our meals. More interestingly and most meaningfully, it totally changed our meal prep activity. Where previously one person would have decided and cooked, and the other would have cleaned, now the whole process was something we could share. Rather than anyone “owning” or controlling the preparation process, we had this well-designed, visual recipe on its own piece of paper. With the Blue Apron meals, we cooked together, easily weaving between chopping and heating, each person taking responsibility for each task but no assigning, no delegating. We would discuss ways to improve or alter the recipe to suit our tastes (less lemon juice, more red pepper flakes).

Sometimes we’d take the meal kit over to someone else’s house, friends who were also Blue Apron customers, having planned to save a common meal to prepare together, playing out the group cooking activity with a larger group.

I was mixed about the results: some meals just weren’t that tasty, some meals required a lot of work for little result, and I felt like my own cooking wasn’t improving, as following the instructions didn’t give me much insight. We looked on social media to see what Blue Apron customers were saying about a particular meal, in case that gave us insight into choices we could make. Eventually, we switched to Sun Basket for more choice and better quality meals. But most of my satisfaction came from the preparation activity, rather than the actual meal itself. That wasn’t necessarily a failure; it’s worth taking satisfaction and reward where you can find it.

As with any product or service that I use, I often wonder if the producers understand the ways it’s being experienced, and how that informs the choices they make in configuring the service. I don’t know what Sun Basket or Blue Apron do or don’t know about their customers.

Anyway, after several years, I happened to be visiting someone out of town and over several nights we had delicious turkey burgers, and then really great chicken thighs. And I was overcome with a sense of regret, that I had been settling, that I was missing out on eating like this every night. The freedom from choice had been great, but now I was missing out on freedom of choice.

So when I got back home, we changed our approach again. To address the “what to make for dinner” – the problem that these services had solved for us – we made a spreadsheet. Entries included some basic examples like “tacos” but also meals from Blue Apron that we liked (with a link to the recipe), and most of the recipes torn from magazines or printed out that were sitting in a folder. Now we had our repertoire in one place, and we could refer to that rather than thinking oh wow what to make?

We began planning out a week (or part of a week, at least), discussing what to make in front of a computer (or two), opening up the recipes and updating our online grocery list based on what the recipe called for and what we didn’t have on hand. And then putting the URL for the recipe into our shared calendar for the night we planned to prepare it.

Our shared approach to cooking the meal itself of course was no longer tied to Blue Apron or Sun Basket, and we still do that. And we update the spreadsheet with the date we made a particular recipe, and any changes we would make or if we’d even want to have it again.

I subscribed to a bunch of recipe newsletters from the New York Times and elsewhere and have regularly been adding new things to try.

So we took back a significant part of the work – the planning, and the list making, and the shopping, but the results – the meals themselves – are so much better, and the experience is just as enjoyable. Indeed, the work of planning is not a chore, it’s now tied to feelings of excitement, anticipation, challenge, and discovery.

How we eat and how we feed ourselves is enormously personal and a rich vein to mine; it’s why I often ask workshop participants to practice interviewing on this particular topic. My point here is not to offer a recommendation about how you should be managing your meals, it’s to illustrate that the way we experience products and services can be different from what is intended, or what is assumed. That meaning can be a bigger driver of behavior than say “value” or “time saving.” And that perceived benefits or limitations shift and become reframed on a regular basis.

There can be a lot of nuance to uncovering this with your customers. If this sounds like something you’d like to explore, get in touch. Supporting my business is the best way you can support this podcast. And if you have any feedback in general, you can email me at DONUTS AT PORTIGAL DOT COM or find me on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

Let’s get to the conversation with Juliette Melton. She is the Director of User Insight and Strategy at the New York Times. Thanks for being on the podcast. I’m glad to get the chance to chat with you.

Juliette Melton: Thanks. Yeah, it’s nice to see you again. It’s been a while.

Steve: It has. Why don’t we start with just an introduction?

Juliette: Great. Hi. My name is Julie Melton. I’m Juliette Melton on the Internet and yeah, I’m a design researcher here at the New York Times. My title is officially the Director of User Insight and Strategy. We have in some ways moved away from the design research label, which I think is also something to maybe talk about today. But, yeah, we’re here at the New York Times. We’re sitting in a conference room just off of the newsroom where the news gets made.

Steve: Does news get made?

Juliette: Well, the news gets typed.

Steve: Can you unpack your title a little bit? Maybe that will feed into some of the themes you’re already teasing.

Juliette: Sure. So, we purposely changed the name of our group a while back and to talk about user insight and strategy. It becomes a little more broad and a little bit more impactful than I think just calling it design research because I think – in the IDEO context, design becomes a very large word. In many other places design is a little bit smaller and speaks really towards a specific function within product. And I think it’s important that research can transcend the product team because we can offer much more in an organization and we can do much more in an organization. So, by calling it user insight and strategy it puts the focus on the people who we’re actually understanding and not the end result and I think that that’s really powerful as a reminder to us and to the organization about what we’re here for.

Steve: I’m going to just back up on part of that. What does product mean in a media organization?

Juliette: That’s a good question. So, here, it refers to essentially the websites, the apps, the newsletters, all of these digital expressions of what we’re doing here. And there is some church and state separation here between what happens on the news side and on the product side. So, within the newsroom there are people who are interactive news producers, and they have a very different job than the people who are building the apps and the websites and the newsletters. It’s essentially a separate kind of function. We think about engagement with the news and how do we package, present, share everything that we’re doing. And we’ll talk a bit more about it, but my current role is not really news focused. I worked on other stuff here. So, my group is called New Products and Ventures. I’m the Director of Research for these other businesses that the Times has been building. So, that includes NYT Cooking, Crosswords and we’re very proud to talk about Parenting which is a new digital product which was just released a couple of weeks ago. So, these are separate businesses. These are places where the Times is hoping to expand its audience, expand its reach, do some new things, stretch some new muscles. So, it’s really fun from a research perspective.

Steve: Are there other – without asking you to draw the complete sort of corporate org chart, but those are three separate businesses. Are there others? I’m just wondering how big is the set of businesses that comprise kind of the Times universe?

Juliette: Well, that’s kind of hard to answer. There are a lot of different brands here. You know there’s The Daily, The Weekly, which is the new TV series, and then there’s other brands that sort of become part of what we do like Modern Love, for example. And these are all things for the – the organization can spin off into these different entities.

Steve: It’s interesting from the outside. I guess it’s the same with so many different kinds of multiple touchpoints, as we like to talk about businesses, where if you’re consuming a bank, for example, you – what do they say, don’t ship your org chart? So, I guess as a consumer of The Times media I might be aware of a lot of different brands, but that doesn’t necessarily map to – I think part of what you’re saying is they could be structured as businesses, or as – or not, depending on what makes sense internally. It’s invisible to me as a reader.

Juliette: Exactly. And so the things that I’m working on, I think, would be visible as separate businesses because you could subscribe to them separately. And so literally you could just subscribe to NYT Cooking, or you could just subscribe to Crosswords. Parenting doesn’t have a paywall at the moment, but down the road that could be something that you would pay for separately as well. So, yeah, there is some experience differentiation there. As opposed to you’re a reader of Modern Love – that is part of the nytimes.com world. So, that is just part of what you’d get from a Times subscription.

Steve: I’m sorry, can you say the name of the area or the division…

Juliette: That I’m working on?

Steve: Yeah.

Juliette: It’s called New Products and Ventures (NPV).

Steve: Alright. So, some could be products and some could be ventures.

Juliette: It’s sort of a catchall, yeah. I think the interesting thing there is it’s a spot for us to be innovative in a different way. To be innovative around business models as well as what we’re doing from the content side. And I think that that’s been really good for the organization and pretty interesting as a researcher.

Steve: So within sort of nytimes.com you can provide different kinds of content, but it’s always going to be under that business model because that’s the Times subscription model.

Juliette: Right, exactly.

Steve: But if you have a separate piece – like you said, you can subscribe to the Cooking app separately, so you can have that as a separate part of the business. And so that’s a business model shift.

Juliette: Yes. Exactly.

Steve: Okay.

Juliette: I don’t know how interesting this is for people. You know one shout out here is I’m a big fan of subscription models as a way to exchange value for the web. You know I’ve worked in the past with a couple of other subscription-based services and I think there’s something really honest about it. You know like someone pays you because they get value. Right. It’s just a very direct value exchange which is different from working with an organization that makes money in somewhat – in ways that are somewhat more hidden to the consumer, whether that’s through advertising, or some other kind of data brokerage. Like, yes we do have advertising at the Times, but the subscription is really like the core of the value exchange. And yes, I just like it from like an ethical perspective. I think that it’s a really like nice way to be doing business. But then also it’s fun to be working on these other smaller businesses because you can really measure success. You can measure outcomes because people are paying you. And so as a researcher it’s just really satisfying to know, okay, we did this thing and then this resulted in this much revenue. And just that direct connection I think is really – it’s good feedback.

Steve: So, as a researcher how are you using research to support the organization?

Juliette: Well it comes at a lot of different moments in the lifecycle of one of these products. For example, I did work last year with our strategy department to understand a new vertical and whether or not there was opportunity for us in that space. Looking at that from the market size perspective, from user needs, from how much brand flexibility we had, we decided eventually not to do it, but that’s a case where research can come in on a more very, very early stage, very strategic level. As you move through the phases of the design process we can be involved in very early stages of discovery. I’d mentioned briefly about the work that I’d done in Canada. I spent several years here working with our global expansion team, understanding, around the world, what might we do to expand our reader base? And that includes everything from payment models to partnerships to hiring additional people in different countries and building our new bureaus. And so it had a lot of different kinds of ramifications around what we’re doing as an organization to meet these needs. So, that all kind of starts to feel like the beginning of a process. And that’s where a lot of the work I did at IDEO, for example, comes into play – different thinking, understanding people’s needs from more of an opportunity space perspective. What might we do in this space answering some of these bigger questions? But then, you know, we’re building and we’re shipping and so the questions continue to get a little bit more specific as you go. So, it’s, you know – how exactly do we meet this need? Right. What do we do for this audience? So, for Parenting the team learned that there was this gap in – actually I don’t want to talk about that – scratch that. To speak in broader terms, because I don’t want to get too specific for the purposes of going public here, it’s about locating an opportunity and then designing for that opportunity and then along the way checking to make sure that you are designing that properly. And so that’s basically the cycle of generative to evaluative research and our team really does all of that.

Steve: In some organizations the narrative of research is about – you hear phrases like buy-in or advocacy or evangelism where it’s a newer process that is trying to support or integrate in things that the company is doing. So, now I’ve set up a very leading question. For you, what’s sort of the appetite or understanding or kind of engagement with these different areas of the organization that you have found here?

Juliette: I think that advocacy is always there. I mean I think that that doesn’t go away. I was on a panel a few weeks ago and someone came up to me afterwards – you know because we’d been talking about advocacy on the panel – and she said when does that stop? When can you not have to do that anymore? And I said like I’ve been doing this for 20 years and – and she’s like 18 – it’s like that doesn’t stop. You’re always explaining what you’re doing and why and what the impact is, what the outcome is. But I think overall this is a place that really gets it. We’ve been doing a really good job for a long time now and I can say that as someone who came in just 5 years ago. It’s like predating me I think the research team was really strong and so we just – we have a really good track record here and this is an organization that, you know, we seek to find the truth, right. And I think that that permeates the culture here. People are very curious. They really want to know what’s going on in the world. And you know when I’ve worked with journalists it’s been amazing. You know, our methods are sometimes different and the outcomes have different purposes. But so much of it is so similar to journalism that they get it right away and love it. So, it’s been very, very easy to build advocates in the newsroom because of that. And then on the product side, once you start working with a team and they get it, they just want more and more research. And I think that that ends up being more of the challenge here than people not wanting research. It’s just like bandwidth constraints. Like there’s only so much that we can do as a team. And so for us the challenge is more working with the different teams to figure out where our work can be the most impactful and where we’re going to be the most useful.

Steve: What kind of things might someone look at to assess where they’re going to have the most impact? Or to make priorities?

Juliette: One thing I always ask is – like let’s just play this out. What if we heard “X”? What if we heard “Y”? What do you think we would do about that? What could we do about that? Because sometimes what I’ve found is that people want to do research because they’re curious, but it doesn’t necessarily mean that there’s flexibility in terms of what they will do. And if they say look, we’re going to do this regardless, then I’ll say, that’s great, I don’t think this is a place where research is going to be that helpful. So, making sure that there’s bandwidth for that team to do something about it and the will to change course based on learnings. And if those aren’t in place then it’s probably not the best use of research resources.

Steve: If bandwidth was unlimited would you have a different perspective on people that start off with sort of an intractable request?

Juliette: You know, honestly, probably not because I want for research to be seen as a strategic function that leads to better outcomes, that leads to teams that feel more secure and more enlightened. And if it were something that you could just sort of sprinkle everywhere, I think the power of it might become a little more diffuse and I want for it to be like a powerful, useful tool and not just like an oh I wonder about this or that, you know.

Steve: Yeah. So, there’s a scarcity in value – that’s not really what you’re saying. I think I’m changing what you’re saying and reflecting it back to you.

Juliette: That said, I would love if we hired 10 more researchers, yes. I think that would be great.

Steve: But that still wouldn’t give you the bandwidth though…

Juliette: To answer everything for everyone? No, it would not.

Steve: What’s the size of the team right now?

Juliette: Well, so the overall insights team – I think it’s interesting here because we really encompass what would be considered market research, what would be considered product or UX research. We all sit under the same group which I think is a very positive, good thing. I’ve worked with a bunch of organizations that have separate market research and UX research functions and that just gets really messy, right, because then it’s like who owns the truth? Who owns the insights? And you don’t want to be at odds with each other. I think that when you fly in formation it just like makes so much more sense and there are very powerful research methods that have come from both the market research side and from design research/UX research. So, being able to play in all of those sandboxes, however you need to, to address your clients’ needs is pretty fantastic. So, that said, yes, our research – our overall research team encompasses all those different functions. The group that I sit in we do more on the qualitative side and probably more on the product side and there’s 5 of us. And then I’m going to kind of spitball here, I think there’s probably about 15-20 altogether and, like I said, that encompasses all these different insights functions.

Steve: So, when you say 10 more – it would be great if you got 10 more people, I just wonder sort of the context of what does that mean? It’s about a 50% overall increase.

Juliette: Yeah. I think that would be lovely, but here it’s the booming newspaper industry.

Steve: So I hear.

Juliette: It’s a good time to write things on paper and sell them.

Steve: So, you’re talking a little, maybe directly or indirectly, about where – you said you work a lot with product teams, but I think you sort of said in one way or another that there’s a potential here, or just an opportunity to work across the organization in a broader way. And I don’t know if I’m putting words in your mouth or channeling something that you said, but…

Juliette: You’re channeling. You’re doing a great job, Steve.

Steve: Good feedback, thank you. You’re doing a great job as well. Good.

Juliette: Um, yeah, no I think that’s right and the work – in the same way that we can play in the different sandboxes of market research and product research, you know I work, particularly with new products and ventures, I work with the marketing teams and the designers and the product managers. So, I work with everyone, right, because from a user perspective it’s one experience. It’s like how do you relate to this organization, whether through what they tell you, what they show you, all of that is a singular experience. And so being able to be more holistic about that I think is really powerful.

Steve: I wonder sort of what parts, and I don’t mean your organization, but just organizations in general that have research kind of lurking somewhere, maybe tied to product, but I wonder what sort of – what’s the next big thing that research will be doing beyond sort of where we’re seeing right now? I don’t know. And again, I’m not asking you to speculate about your own organization, but you’ve seen a lot of different things. Do you have a – what should we be doing?

Juliette: Here’s my hunch. And this is also an equivalent about me, I’ve been doing coaching training and have been doing some personal life coaching, a few clients, to build those skills because I think that working inside of organizations, working with people here, is something that is very natural for researchers to do and I think for us to build these coaching capabilities is really helpful. You know, understanding the people who are around us, what are your unmet needs? What is this really about? What are we doing here? Asking and answering some of these deeper questions. It doesn’t have to be just about the consumers of our products. It can also be about what we’re doing inside an organization. So yeah, I’m really inspired by that and that’s why I’ve been building my own coaching muscles and that’s been super fun. And I would love to see down the road how coaching and research start to come together as – you know does that become something that looks more like org design? I don’t know. So, that’s a place that I could really see some benefit down the road and it’s been interesting to see other researchers or product people move more into coaching and therapy roles. You know, things like that, grounding the human experience in a somewhat different way.

Steve: Yeah. And I’ve definitely seen. I mean I’ll just confirm my N of one. Like I’ve seen this as well, just with people that I know. I want to ask a couple of clarifications here. Is this something that you’re doing on your own versus as part of the role you have here?

Juliette: It’s both honestly. So, it’s partly like my own passion on my own time. The non-working time. But it’s partly something that I do bring into work as well, working with different people on my team and working with people on other teams, doing kind of informal internal coaching here. And so that’s been super rewarding, but I would love to make that more official. And I do feel like as researchers that’s something that becomes natural for us and could be more codified and more supported.

Steve: I just in the last couple of weeks have had a couple of informal conversations with people about is that something that I do. But then, just in a self-serving manner I want to ask if you can put a little definition around coaching. It’s like a term that we all think we know what it means and I’m honestly not sure I know what that means?

Juliette: Well, it can mean a lot of different things. And when people ask me, well what kind of coach – I mean it’s kind of a life coach, but I hate that term, right. So, I think that we need better terminology around it for one thing. But how I see it is it’s about people understanding where they are now, where they want to go and how they might get there. And it overlaps quite a bit with more kind of standard therapeutic approaches which tend to be more about where I came from and what does that mean about where I am. I think everyone probably needs both a coach and a therapist. They’re very complementary. But I think particularly in organizations, to understand where I am now, where do we want to go and how we might get there? I think researchers are set up really well to be able to ask some smart questions to help people develop this understanding of their own and help support them through that process.

Steve: So in research there’s an interesting tension and I don’t – we’ll see if it’s true for coaching as well, about – put it this way, are we in the recommendation business, or not? Do you know what I mean?

Juliette: I know exactly what you mean and no, we’re not. I strongly believe that. I think that our role is to unpack relevant truths so that teams can collectively make better decisions and yeah, I think informally, sure. Like I can be part of those discussions and kind of help guide them in some way. But I don’t think that research is about recommendations. And in the same way, coaching is not. I think I have a bad habit of giving advice when it’s not asked for and I do hope that by going through the coaching training that I have I do less of that, right. I want to be able to know when to ask the right questions so that people develop their own insights, their own understanding. Because that’s what sticks. That’s what works. And I don’t know – if I were your coach like I wouldn’t – I don’t know, what could I tell you about what to do with your life? Like you know your life. I don’t know your life. But I could ask you questions that could help you unpack for yourself what’s relevant for yourself, in a way that it’s kind of hard to do without someone else there with you.

Steve: So, I think this is true for researchers and I think for coaches as well, I think that the choice of our language can change the frame. I think right, you’re probably not going to say to me, Steve, you should do this. But you might – and I don’t know how this applies to coaching, but I think in research we might say other people in analogous situations have done this, right. Here’s a benchmark or here’s a different case. Or we might say when I was in a similar situation I made these choices. Or, if I was in this situation myself I would make these choices. I don’t know…

Juliette: As a coach I wouldn’t say any of that. No. If someone flat out asked me, like have you been in this situation before, then I would sort of gauge whether or not that would be appropriate at the moment. But no, I wouldn’t think I would do that. And even with analogous research, I think it would be presented more as like a hypothesis. Like it seems that this other way of solving the thing could be useful for us. Like what do you unpack from this? What do you see here? And I could offer this is what I see here, but I wouldn’t say this is what this means and this is what this means for you.

Steve: Right. I think – and I hope you are – we’re just kind of bouncing between these two things which we’ve set up as analogous here, between the researcher and their client and the coach and their client. I hear from researchers that if they don’t sort of provide more recommendation or advice then they lose out on attribution which is not about ego, it’s about – it’s about advocacy. It goes back to that advocacy thing. If you’re not seen in a tangible way as the work that you’ve done has led to these changes, right – I mean there’s something about helping people come – maybe you come up with the idea, but you help them feel like they came up with it. And then it’s going to go. It’s not going to go if you tell them – I think that’s also the limitation of advice. But then it introduces this attribution and sort of impacts advocacy and so on.

Juliette: I think that’s a perennial question for us, you know. And I want for my coaching clients, and the teams I work with, to feel like – like they own their insights. I want for them to feel like they’ve been part of a process that they have directly contributed to that has resulted in insights coming forth that help them make better choices. And I might have had in my head that that’s what they should do anyway, but like – yeah, if I told them that, like it’s different, right. You want for people to come to their own understanding. And yeah, you lose credit. And it’s something that on the research team we talk about. You know like what is – like you have to be okay with some ego loss there, right. Like research is not the thing to go into if you want like the billboard with your face on it, right? Like ideally the things that we learn become so embedded in the DNA of how an organization works that it’s not really attributable to a person, even if you were the one who made that happen. But yeah, that becomes tricky when you’re wanting to advocate for what you’ve done, right, and ideally you just have really good working relationships with the people around you and they’re kind of aware of your magic powers. Like, when this person is in the room smart, good things happen. And I feel like I make better choices when this person’s around, right. And ideally that’s what happens.

Steve: So, that’s attribution about the relationship or about the collaboration.

Juliette: Right.

Steve: But not attribution for we took action “X” because of input “Q”.

Juliette: Sometimes it is that direct, right. I mean sometimes coming out of research, like we’ll have a very clear understanding of what to do next and then we do it and it succeeds and you can like trace it very cleanly and that’s always very satisfying when that happens. Other times it’s more of a slow burn. You know it’s like you hear something in research and people start to internalize it and then they kind of start to believe it and then they start to act on it and then it becomes like a new thing and I could point to several different mindset shifts that have happened here that originated in research, that I think that – that I know that’s what happened, but I think the folklore in the organization would just sort of describe it as oh now we think this, now we believe this.

Steve: Right. So, there’s something about taking your ego out of it.

Juliette: Yeah.

Steve: I’ve complained about this a couple of times, a series of tweets I think that were from a talk a year or so ago, that I think is not an unusual sentiment, where they were saying my research doesn’t have value unless somebody takes action. And yeah, that got me upset because I think it’s a really high bar and it sets us up to – it’s a statement of our value, or our lack of value.

Juliette: That sets us up to do very small projects, right.

Steve: So that you’re always pointing towards.

Juliette: Yeah – I mean, yes, if you strictly do product based evaluative work then you could say that yeah my research always results in something because that something is like moving this button or moving – I mean like that’s something, right, but when you’re talking about change which is more systemic, which is on a deeper level, that takes time, you know, and that kind of research – those research outcomes, like I said, like it takes a while to be embedded in what you see from an organization. So, yeah, I mean there definitely have been some significant projects here which would I say two months later, like yes, someone you know moved this thing on a roadmap because of it? No. But I can say three years later we’re a different organization because of it. And so I think that having some patience and some – I don’t want to call it maturity, but it’s just kind of like being able to see a bigger picture from both a timeline and a strategy perspective, you can really take on bigger challenges and see the impact of those within an organization.

Steve: I wonder if people that – I mean I’m self-employed so I’m – these things manifest differently for me, but for people that are – where their promotion or their title, or just their sort of general success is tied to some of these things. I mean – your point about patience – so, I think there’s a couple of levels. One is sort of how do I feel about what I’m doing? Right, am I alright? Am I confident, or am I learning or am I growing? And am I successful in kind of a corporate structure which can only measure, reward and acknowledge certain kinds of things? And so yeah, I think I could encourage the former and I have had to learn that very much myself. Like things you think are going to be the rewards for doing this work are often other ones which maybe you have to go hunting for to see that they’ve even happened.

Juliette: Yeah. And I think that’s part of why, as researchers, again, like we don’t want to just stay embedded just in product teams. Right, like we need to know executives. We need to know people across the organization, people leading an organization so that we can understand the big problems facing an organization and they can understand what we’re doing about it, right? And I think that sometimes that just comes from relationships and being able to kind of get a read on what’s going on. But yeah, it’s hard. It’s hard. You know when we can put a dollar value on it that’s amazing, but it’s hard to do that.

Steve: You know you speak in a fairly optimistic way, I think, about not even just the potential for research – potential, so that says it’s kind of in the future. I mean it’s the value that research is bringing to – in the organizational mindset, the belief structure, as well as the products and services. You know, when you look outside your own organization, you know, between optimistic and pessimistic, what’s your assessment? Maybe it’s not that axis, but what’s your assessment of kind of what’s going on?

Juliette: You know I – the design research Google group that a lot of us are on, it’s grown from two people who started it. Now it’s like 1,000 something people. That’s a lot. You know, like our funny little profession, like we keep growing and people keep finding value in it and it’s something which is in many ways new to the world. We’re building it as we go and that creates its own challenges because it’s hard to know what does a 40 year career look like in design research. Like we haven’t really done that yet. Right. So, it’s not like being a doctor or a lawyer where you know like okay, at this point in my career I can expect to be partner. You know, so it’s less secure than that, in that way, but it’s also what’s fun about it, right. I mean like we are co-creating this profession and so like why not be optimistic? Right, I mean like our job is to make things better and I think that we do that even within our communities. And part of why I’ve stuck with research for so long is that it’s such a nice community. Researchers are nice people and they’re people who I like spending time with. So, yeah, I think that bodes well for the future of the community and future of our impacts.

Steve: So, what does an 18 year career look like?

Juliette: Well, I think I’m making the path as I go, you know. And so far I’ve – you know early on research was one slice of what I did. I was a front-end developer. And then later I did more product and then I focused on research and started on the more evaluative side and then moved to the more generative end. And so I’ve really done all of it as it currently exists. So, you know, I’ve been involved in the web since it looked very, very different than it does now. And so far so good, right. I mean we’re still here. But yeah, as I think about how to move forward, I think we make the path by walking it and I have a hunch that there’s something here about researchers looking inwards at organizations – and again, I don’t know a ton about org design. It may be already kind of covered by a lot of these other practices, but at least for my own life I feel like there’s really something about using the skills and techniques of being a researcher, combined with what you can do as a coach and specific training that you get as a coach, to do something different within organizations. I don’t know what that is yet, but I think there’s something there and so that’s the path that I’m kind of figuring out as we speak.

Steve: That’s exciting. I want to go back a little bit. Because you were just giving a nice overview, I think, of where some of us have been and where the community has been, that if you’re at 18 years and it’s all about walking the path to kind of find the path. There are lots of people sort of coming into the profession, let’s just say in the last three years, that – I mean – I feel like when I started, and maybe the same thing for you when you started, there wasn’t a lot of path.

Juliette: Right.

Steve: And I’m like you. I don’t sort of know where I’m going. I mean following the path is a much more confident way of – I might say stumbling – but, it’s just interesting that the field has grown and you just look at – there’s just a number of people and the time. And so somebody that sort of chose research, or was chosen by research in 2017 is – they’re part of a different community than you were 18 years ago. So, what it means for them to sort of learn. There’s, there’s – the terrain is different because it’s populated by people like you or there’s a path that’s woven by people like you. I’m going to take your lovely metaphor and just…

Juliette: Go for it.

Steve: Mow over it. I find that interesting. Just back to sort of my question about advising, the people who are joining the field are joining a different field than the one that I joined or that you joined.

Juliette: I mean it wasn’t a field at that point, right. I mean like there were a few people who were doing some great work, but it wasn’t really a community, at least as far as I was aware. So, I think people joining now – I think it’s very cool that there is a field for them to join and they could hop on this gigantic Google group and ask a question and have 1,000 useful people helping them out. You know for people starting this now, I think they’re in a great position. You know you can actually learn this in college. Like, you couldn’t learn this in college when I was in – at least where I was in college this is not a thing. You know I studied comparative literature in undergrad and I remember telling my comparative mystical literature professor that I was wanting to go into IT, as it was all called back then, and he was like why? Why would you do that? Like just totally confused. Like that doesn’t seem like a realistic or interesting thing to do. It wasn’t really a path. So, yeah, I think it’s great that people are joining now and I think there’s more support. You know in general for all these technical, digital roles there’s been a lot more definition in terms of the different job functions and I think that that makes it a lot easier. I mean back in the day it was about like webmaster and they did everything. There aren’t webmasters anymore, right. I mean like the jobs have become much more specific. And I think that makes it easier to train, to decide like what you want to be doing to become really excellent at what you’ve done. You know, it’s a bit of a tangent, but I think that research as a standalone profession is I think something that’s also becoming more possible to do for – I’m curious your take on this as well, but I feel like the sort of broader field of UX, you know someone is like a UX person who does design and they do research and it’s all kind of rolled up into one thing – I didn’t want to do that. I wanted to be a researcher and so when I made that choice 10/11 years ago to focus on research, it really limited my options. It meant that I could either work for an agency or for a really big company. But for smaller companies they really wanted more of a full service “UX” person. I’m curious if you have seen more growth in smaller organizations hiring specific – like researchers who specifically do that?

Steve: Who are researchers?

Juliette: Just researchers, yeah.

Steve: I mean this is anecdotal, of course, but… I think there are – there seems to be sort of a pattern of maybe the first researcher in an organization is – or maybe person zero is someone who is doing multiple – we have an intern or we’re going to take some of somebody’s time, they’re interested and they’re going to go do this. A product manager maybe who has kind of got the bug, or read something or saw something. But you know when companies start hiring someone with a research title I think sometimes that first person is very junior and then it has to build up to a point where there’s a realization, oh we actually need leadership and mentorship and being able to develop a function and a practice and a team. And I have a lot of concern about that person who is sort of brought in with no mentorship. I think about the person I had coffee with about a year ago who was – I think he was not too long out of graduate school. So, professionally junior, but I think educated a certain amount, and they were the only research person in this organization and they basically were sent to do something where they were told how to go about it and what to produce. And of course, if you don’t know what you’re doing there could be a mismatch. But they took the hit for this and they, I think, ultimately were dismissed because they were unable to produce the information that was needed from the method they were required to use. And we had this long conversation and this person was convinced it was their fault and that they were deficient as a researcher.

Juliette: Oh man.

Steve: Yeah. And to me that was a lack of leadership.

Juliette: Yeah. This company needs coaches and researchers it sounds like.

Steve: Yeah. Some executive coaches or management coaches who say here’s how to manage your staff to success. And that’s I think – because there’s sort of this generic perception of research, or that it’s just a thing and like you’re just talking to people, or if it’s a survey, just go set up a Survey Monkey, then it seems easy to have people that don’t appreciate the craft and the mastery of someone like you that’s been doing this, that chose this, can bring. I mean that’s a really sort of disappointing example, I think, but there’s the researcher who has the title and is not in a situation where they’re going to be able to thrive, but I mean – if your question is like are there researcher title jobs? I think a huge amount.

Juliette: I guess it’s more like is that increasing as a phenomenon because yeah, I don’t know. I’m just curious if as these other roles have become more specific if there’s been a greater appetite for researcher as a standalone function. But yeah, I mean I’ve been that person. I’ve been that junior single researcher and it’s really hard. And that’s why I take heart in some of the larger online communities that have formed because I hope that what we can do for each other is to help advocate for methods and for approaches and to kind of stand in as mentors to people who need that support and don’t have that person there with them. Like I would love for us to keep doing that for each other and I think we do that to an extent and that’s part of why I really advocate for researchers to know other people who do this. And it’s also part of why I like working here. You know I work with a bunch of really, really smart researchers and if I need to pull in someone who is an expert in running a conjoint analysis, I can do that. And it’s nice to have this deep bench of expertise just available here and that’s part of why I really enjoyed working at – not a huge company, but like a big enough company that has a deep bench when it comes to research vs. like me off on my own doing this startup. I feel like I can be a better researcher when there are people around me who I can lean on for their expertise as well.

Steve: So, you’re okay not being able to do a conjoint analysis yourself?

Juliette: Totally. We’re always learning, right, all of us are learning, but we work with people who have PhDs in statistics and like I think that’s great. That’s probably not something that I will do. I think it’s wonderful that we can specialize in different areas and I think that makes us all stronger.

Steve: You know it’s that depth vs. breadth tradeoff. You know you’re sort of asking about research as a specialty, but even that word, to your example, it comes with so many things.

Juliette: Yes.

Steve: I started off trying to position myself as a generalist a long time ago and the more that times go by the more things I realize I know nothing about and personally it’s terrifying. Maybe that’s too dramatic a word, but I sort of have to own it and then also deny it, that there’s this thing that I don’t know how to do and so I’m aware – what does expertise look like? And what does expertise look like for you as you proceed and there are things that you’re going to pull in that are not going to – you may never be a master at conjoint?

Juliette: That’s a great question because I feel like there are things that I’ve definitely decided to not specialize in, you know, and one of those is prototyping, for example. At one point I did do much more design and now I am totally happy not doing that. I think it’s – knowing what you are okay letting go of I think is a very important professional skill. And I think it’s part of growing, is knowing like this is something that I’m going to bring someone else in for. And, yeah when it comes to live very, very complex data analysis, there was one point in my career where I was really focusing on that and was building a pretty big skillset there. I think actually last time I saw you at IDEO I was really focused on building a very deep quantitative understanding for myself and eventually I decided, no. Like I would rather lean on the person who has the PhD in statistics and I want to refocus on the things that I feel like I can more uniquely bring to the table. And for me that’s more about qualitative research, that’s more about design thinking. Like those are things that I can uniquely do that I think someone else couldn’t necessarily uniquely do and so yeah, I think just understanding what you want to do and what you don’t want to do I think is a really nice part about getting older in this world.

Steve: So, how should someone approach that? That is maybe in the first few years of a professional career.

Juliette: You know, I think the first few years like don’t go around saying no to stuff. You know, say yes. And then follow your intuition there. But yeah, I’d say probably don’t get too – I think in general in life, like be cautious about saying no to things that you don’t really understand. Like maybe kind of dig into it a little bit because there might be something in there for you that you hadn’t expected.

Steve: Do you, like let’s just imagine the magic wand thing. You were sort of saying that if you could hire 10 researchers you wouldn’t be opposed to that.

Juliette: Right.

Steve: So, if we were to wave the magic wand and make that happen, as you would start talking to people, what kinds of things would you – I realize that we don’t have a job description – whatever – I mean what do you start to look for when you talk to people about working with them?

Juliette: Um, I like working with people who – the first thing I would say is that it’s important to hire people who bring a good energy into a room and I know this sounds like super – I did live in California for a long time, right – but I think that researchers can bring a kind of brightness into a space and a kind of optimism for a team and a sense that we can learn these things together. It’s a bit intangible as a quality, but when we bring on new researchers that’s really something we look for. Like is this person someone who is excited about making connections across an organization? Excited to share what we’re doing? You know there’s something about bringing energy into research which I think is really important, but beyond that, yeah, I mean being able to really, really listen. Being a thoughtful, humble listener to what people are saying and I can sometimes know in the first conversation with someone whether or not I would encourage them to move ahead as a researcher based on their qualities as a listener. And then being able to think thoughtfully, strategically, big picture and being able to execute. It’s hard to – it’s so hard to hire, right, because these are all kind of a little bit intangible as qualities, but yeah, looking for someone with a curious mind who truly likes people is really important. I think that’s key.

Steve: When you say brings energy into the room and is a humble listener, a part of me feels like, well, is that contradictory? I’m really just thinking about myself. One is sort of an extroverted energy and one is an introverted energy, but I don’t know.

Juliette: I’ve found that most of the researchers I’ve ever worked with are true introverts and I would put myself in that same category, but we are – we are introverts who like people, you know, and I think that almost all of the researchers I’ve known fall into that category.

Steve: Maybe this is a personal issue, but why not. The bringing-the-energy-into-the-room thing is interesting to me because I’m probably more of a listen-first person. I pride myself on being a good listener, but that also means I watch a lot, and depending on how people encounter me they might see me in on-stage mode, where I’m saying a lot and telling you stuff and trying to bring something. But I spend most of my time holding back and just waiting for those moments, and so I feel intimidated by that description – having to make an impression.

Juliette: But I think what you’re talking about is its own kind of energy, right. So, the energy that you’re bringing into a room is I support you, right. I’m here to listen to what you’re saying and it matters to me, right. And so like that’s a very particular kind of good energy.

Steve: Okay.

Juliette: Right. It doesn’t mean that you’re coming in like you’re running for Mayor. But it means that you’re coming into contexts in a positive way. You’re not glowering. You’re engaging, even if you’re not saying something. Does that make sense?

Steve: Yeah. And I think just clarifying words is so important, right, because when you said energy that made me think of extroversion and speaking and being loud and “hey,” and smiling. And you’re right, there’s a lot of different ways to manifest that energy.

Juliette: So, I’m coming in as like a Reiki person and so for me it’s about like what is the thing that you’re emanating? And what is the thing that people are picking up on about you? And what are your intentions when you’re coming into a space?

Steve: Okay. So, you listed a bunch of things that you said were at varying levels of intangibility. But what are some successful ways that people could present those qualities? Demonstrate them? Whether it’s in a sort of interviewing context, or like a job interviewing context or any other?

Juliette: Yeah, I think being a thoughtful communicator is really important. And so when you’re communicating with a potential employer, just paying attention to what they’re asking you for and really showing up for those conversations – I think that’s a big part of it. And being able to share examples of work that you’ve done, that you’re proud of, and being able to speak to experiences that you’ve had in a really cogent way. We haven’t really talked about this, but I think one of the biggest skills that a researcher brings is synthesis – being able to look at a set of disparate facts or impressions and pull something meaningful out of them. And so if you were in an interview to be a researcher somewhere, you’d want to be able to speak to work you’ve done in a way that makes sense, that’s to the point, and that communicates what you want to communicate.

Steve: So, can we maybe switch who’s communicating to whom in this scenario? We’ve been talking about people at a certain stage who are approaching organizations. And earlier on we talked about advocacy – you alluded to being on a panel about advocacy. What are the things that researchers can manifest, display, whatever the verb is, for the people that we want to get excited about research, that we want to demonstrate the value to? What should we be doing or saying or showing them?

Juliette: This gets back to our conversation on impact, which, as we talked about, is tricky with research because sometimes that impact is direct and sometimes it’s more indirect. But I would love for someone to be able to speak to that. They could say, look, we did this project and there wasn’t a direct outcome from it because there wasn’t space on the roadmap and there wasn’t budget and there wasn’t executive buy-in, but I was able to keep bringing this into the organization in ways that eventually it became part of what we did, and here’s an example of it. That’s a story that I would love to hear from someone who wanted to be a researcher, because it shows that you have an awareness of the nature of reality, and that you’re able to effectively operate within complex interpersonal spaces.

Steve: So, let me change the players in this dialogue that we’re playing around with as well. So, that’s kind of a researcher to researcher thing, but now maybe it’s you, or it’s future you, and you’re talking to someone that is interested in, or has some curiosity about, or maybe some skepticism about how what you do can bring value to their part of the organization. How do you advocate for research for them? What do you highlight?

Juliette: You know it’s interesting because I feel like I haven’t had to do that for a while. You know we haven’t had to do that here because people know about success and they want some of that. So, if you can do a great job with one team at an organization other people will know about that. And so, I think being able to effectively do one good thing within an organization will help you work across the organization. Does that answer your question?

Steve: Yeah. I mean the best way to have that conversation is to preempt the necessity for that conversation by doing work that tells the story itself. I’m just saying what you said, but I said it faster.

Juliette: Yes. Thank you for synthesizing that.

Steve: That’s podcast…just repeat it back faster.

Juliette: You’re doing a good job, Steve.

Steve: Thank you. You’re doing a great job as well. So, we’re jumping around in chronology here a little bit. I want to go back to one thing that you said. Maybe two things. What did you say to your professor when they were like huh, IT, why would you want to do that?

Juliette: That’s a good question. I think this was in like 1998. It was a long time ago.

Steve: And you don’t remember exactly what it was?

Juliette: No, I don’t. But I remember – I don’t know, I wasn’t defensive. I was just kind of like. I don’t know. I think honestly I felt kind of proud that I was doing something strange and it was like I’m an iconoclast. I’m doing my own weird thing here. So yeah, I think I was probably like – I was like a nice kid. I was probably like oh yeah, I don’t know, that’s interesting. But I think on the inside I was like yeah, I’m a rebel.

Steve: Yeah. Right. Sometimes it takes that external perspective – if you’re in that sort of bubble, or that identity, you may not have felt like you were breaking out of anything until someone told you that you were rebelling.

Juliette: Yeah. I mean especially at the time. It was like the Internet was just like new and weird and juicy and I was like there is something going on over there, like I want to get into that, whatever that looks like. That is strange and that is good.

Steve: And another moment, I don’t know if you can recall any details, but you talked about sort of starting off in front end development, that eventually you got into research. And I’m wondering was there a moment or series of moments where you saw that it was a thing, or that it kind of sparked for you that – do you remember what you were doing or what you were thinking when you started to feel pulled towards it?

Juliette: Yeah. You know, when I was doing front-end development, this was back in the day when web standards were just really hitting and we basically tore apart the web and rebuilt it in a different way, and it was a really heady time – starting to use CSS in ways that it hadn’t been used before. The underpinnings of the web, just within a couple of years, started looking very, very different. And so that was an exciting time to be doing that particular work. But then eventually I just got good at it, and someone could hand me a mockup of something and I could build it. Like, okay. But I do remember a couple of times where I was building something and I thought, that’s a bad idea, why are we doing that? And I realized that I just wanted a different seat at the table. And so that’s when I moved more into the product role, which at the time I think was a little less exciting than product is now. I was working for an organization that had a very long term perspective on things, and we were doing things like requirements documents that would take several months, and then a development cycle that would take several months – it would be a year all in. And so I would just be writing these tomes, and again eventually it was like, yeah, I want a different kind of seat. And I remember a moment – this was also when I was considering a move to California – deciding whether I wanted to go more into interaction design or more into research, and I could sort of see different pathways. Eventually I did more on the project/product end, and then while I was at one company I really pulled the research threads out of that and created a new role for myself around research. And that’s when I was like, I am a researcher. So, I really built that role for myself out of more of a project/product manager role.

Steve: Maybe it’s different – it’s definitely different now than then, but any thoughts about – like you just declared research as an identity in that part of the story and I think that’s an interesting one. We don’t talk about the identity part of it enough. We talk about sort of skills, or are you a designer, or maybe you’ve heard the phrase people who do research? I’m not even sure what my question is here. Do you have a perspective on – if I throw out the idea of research as an identity does that spark anything for you?

Juliette: If you’re familiar with Richard Scarry books, Busytown, there is like the spoof of Busytown with like the little internet creatures and there was the like innovation strategist and she was building a 2×2 framework covered in Post-its and she was wearing like a printed anthropology dress that looks basically exactly like what I’m wearing now. And I was like oh, like that’s uncanny. Maybe we do have an identity. Or like maybe there is something to this that we’ve kind of like co-created as a thing that we’re doing. So, yeah, that was the first moment where I thought huh, maybe there’s a researcher identity that’s happening. But yeah. I think we’re not as cohesive, partly because I think we’re not – we’re not that stylish. Right? I feel like I can walk into a group of people and they’re like okay those are the designers, those are the exactos. I don’t know, maybe the researchers are the ones who are like very earnestly listening and kind of like a little disheveled. Maybe those are the researchers.

Steve: Right. And you’re framing identity as an external thing, which is interesting. I thought of it more as an internal thing, but you’re kind of saying that there’s an aesthetic, maybe, of certain practices, and that if researcher identity is observed more by behavior, then that puts it into a different space than, like, a designer identity, which might be more about…

Juliette: Steve, you’re really good at restating things in a more coherent way. Thank you for doing that.

Steve: Thanks. You brought up Richard Scarry which I think is a very good thing as well. So, thank you for doing that.

Juliette: I’m the parent of a toddler, so, you know here we are.

Steve: Yes. Write what you know. Okay. I feel like if we’ve hit Richard Scarry we might have hit – in fact Twitter parodies of Richard Scarry might be sort of peak episode. Anything else that you think we should talk about today that I didn’t ask about or you want to get to?

Juliette: Just that I’m excited to see where we go with this.

Steve: This episode?

Juliette: No, no, with research, as researchers.

Steve: Where the community goes? Where the practice goes?

Juliette: Yeah. Like, ‘cuz no one knows. We’re co-creating this thing and I guess I would say to myself, and everyone else, like take heart. Like we will keep building this thing together and it’s going to be good and just because there’s not like a traditional career path doesn’t mean you’re not going to have like awesome things you’ll keep working on and we’ll figure it out as we go.

Steve: Do you have hopes or dreams? We don’t know, you don’t know – that’s fine – but is there a place you can fantasize about it going?

Juliette: I think what I was saying about research having both a more strategic and emotional impact within organizations, one that transcends making choices about how we serve customers. I think that’s a thing that we can do.

Steve: And maybe I’ll just add to that. Maybe there’s an element where that also is explicit instead of tacit and then that wraps up some of the challenges we were talking about.

Juliette: Yes. And in fact, a plug for NYT Parenting – we’ll soon be doing a series on emotional labor. I think that as researchers, having a lot of the work we do become acknowledged and codified will be very helpful, and not just the tacit, emotional labor that we currently do.

Steve: Excellent. Well, this has been a fabulous conversation. I really appreciate the chance to speak with you.

Juliette: Thank you. Thank you for coming to New York.

Steve: Okay that’s a wrap! You can find Dollars to Donuts on Spotify and Google Play and of course Apple Podcasts. On the web at Portigal dot com slash podcast you will find all the episodes with show notes and transcripts. Our theme music is by Bruce Todd.

Aug 22 2019

Rank #5: 23. Michele Marut of CBRE Build


In this episode of Dollars to Donuts I speak with Michele Marut who leads user experience research at CBRE Build. We discuss the curation of research repositories, using research to go beyond fixing things, and building processes and tools that can be used by researchers and people who do research.

The philosophy is that the trained researchers should be taking on the most critical, the most risky projects. That’s where they can add the most value. Are they going to lose a lot of money? Is this a totally new workflow? Is this really new to the world? So, really focus trained researchers in that area. – Michele Marut

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Thanks for joining me on Dollars to Donuts, the podcast where I talk with people who lead user research in their organization.

As a consultant, I find myself collaborating with very different types of organizations in terms of the amount of experience they have in doing user research, or learning from user research, or acting on what we learn from doing user research. There might be strong leadership in user experience design, or product design, or service design…or that might be a completely foreign concept. That’s a significant challenge in my work as a consultant, ensuring that I’m in a position to assess and respond to that diversity. It’s also something I really enjoy, and I see working on this podcast as part of that journey, something that I’m able to share with you. And so, the best way to support this podcast is to support my own consulting business. You can hire me to lead user research projects or to coach your own team as you talk to users. I help organizations put together research roadmaps so they can prioritize their limited resources. And I run training workshops to level up fieldwork and analysis skills. Please get in touch to see what we might do together.

Otherwise, if you have feedback about this podcast, email me at DONUTS AT PORTIGAL DOT COM or write me on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

I was intrigued by a story on NPR last month about a program created at a VA hospital in Madison, Wisconsin, called “My Life, My Story.” In this program, staff and trained volunteers conduct open-ended interviews with veterans about their lives, letting the patient decide what they want to share about themselves. The interview is turned into a thousand-word biography that the patient reviews and revises, and then is included in their medical record. Patients also can share these stories with friends and family. This particular VA hospital has been training other hospitals in the VA system over the past few years. The article describes the benefit to the patient, and the different caregivers, both in the act of telling the story, and in having this kind of softer information available for review in their medical record. The article also explains the evolution of the “My Life, My Story” program, and how they “tried having patients fill out surveys, which were useful but still left the team wanting more. Next, they tried getting patients to write down their life stories themselves, but not many people really wanted to. Finally, an epiphany: Hire a writer to interview the patients, and put what they learned on paper.”

This is a primordial form of user research; there’s no sense-making of patterns across groups of users; the data gathered from a user of the system only has value when applied directly to that user’s specific experience. I mean, maybe there’s a cultural shift that comes from making this kind of information available in the medical record itself, maybe it invites providers to be hungry for this kind of data, maybe it changes the conversations internally around patients as people with rich and messy lives beyond their medical conditions. And I want to dismiss it as unscaleable – this idea of interviewing every single user, rather than a sample, but – look at the implementation here – it’s absolutely about operationalizing it so that this service, asking open-ended questions about a patient’s life, is a component of the experience for each patient.

In some ways, it’s a bit threatening to user researchers, that the simple act of just asking people about themselves can benefit the delivery of services so significantly, but on the other hand it also suggests that the potential value researchers can bring is well beyond what we usually say it is. If you’re looking towards the future of user research, this may be an early signal for possible directions.

Let’s get to the interview, with Michele Marut, who is leading user experience research for CBRE Build in New York city. Thank you for being on the podcast.

Michele Marut: Happy to be here, Steve.

Steve: Let’s start with an introduction. Who are you? What do you do?

Michele: My name is Michele Marut. I am currently a lead UX researcher at CBRE and I’m working to build up a research center of excellence in the sales office.

Steve: What kind of business is CBRE?

Michele: CBRE does commercial real estate. You’ve probably seen their names on buildings, especially if you’re in bigger cities. But if you’re doing retail or industrial and logistics types of work you might see them. And if you’re searching for office space you might have encountered them from a broker and advisory perspective.

Steve: What part of CBRE are you working in?

Michele: I’m working with a group called CBRE Build, and it sprang from an acquisition that happened approximately 2 years ago, when they acquired a company named Floored that specialized in 2D and 3D visualization tools that were really helping brokers sell their buildings and spaces. Those tools are still in existence, but since then my manager has been tasked with creating more of an innovation group, and they’ve created other tools that complement the original 2D and 3D tools, as well as tools that help people calculate and anticipate space that they might need. Tools that help with test fits, and that can be used by people beyond brokers and landlords, more for helping you really figure out the right space for your needs. It’s still focused on commercial real estate.

Steve: So, is this kind of a product and tools part of a larger real estate organization?

Michele: It’s a digital and technology section of a larger company that’s focused on commercial real estate. A fun word to talk about is proptech, or property tech. We are very much in the property and proptech space, if you see that. Some people might think it’s a buzzword, but it’s what’s happening in this part of the industry, as you’ve seen in other sectors – financial went through this, health care is going through this, even consumer. Commercial real estate, as well as home and consumer real estate, is finally getting more interactive and more tech savvy. And so some of these things are finally coming to commercial real estate.

Steve: I think the New York Times keeps writing about how companies like Zillow will now buy your house as well as tell you what it’s worth and there’s some technology that makes that happen.

Michele: Yes, yes.

Steve: So, is that proptech?

Michele: That’s proptech. Proptech can be consumer, if you’re looking for a house or an apartment, and it’s also in commercial real estate. So, it’s all different things – anything that takes data, technology and information and makes things more innovative, probably disrupting the status quo of how real estate has been bought and sold the same way for hundreds of years. We’re kind of at that inflection point for tech now, for real estate.

Steve: Okay. That’s great context. And then you mentioned center of excellence? Is that right?

Michele: Yeah.

Steve: What does that mean?

Michele: So, centers of excellence are a business term that’s been around for a while. Here, what we’re trying to do is, one, build a group so there are more researchers. Before I joined there was only, I think, one researcher, who had come from support, and one or two people in the other parts. We’re adding more trained and skilled researchers, so part 1 is literally building the team.

Part 2 is creating templates and processes so that anybody, whether they’re a trained researcher or a person who does research (PwDR), can actually leverage some of our tools. They can get started very quickly. They can find the right participants. They can use our templates. They can use interview guides. And they can probably execute a basic research plan on their own.

And then the third component is understanding who else in the company is doing research, making sure that we’re connecting them and building this community so that we can continue to grow our research center of practice, but also so we can share – if we’re talking to brokers and the same people across the company, we can share what we’ve learned, we can share journey maps and new insights, and then try to really understand, does it differ by geography? Does it differ by region, or something else? So that we start building on a shared platform of insights and not redoing work that somebody else already did.

Steve: What’s the process? When you come into an organization and you want to find – I don’t know, these sorts of pockets or different areas – how do you go about doing that?

Michele: So, somehow I was able to connect with a person in Seattle who introduced me to somebody in Dallas, and we initially just said we’re going to have, essentially, a meeting. That meeting, which I thought was only going to be 3 people, has now become a monthly meeting that’s grown to probably 5 to 8 people – we’ve now had designers who are basically doing research, and we have a product manager who is actually doing his own research and is a huge advocate for research, who is attending. And so, based on those meetings and others, they’ve invited us – in one case we spoke at a lunch and learn. In another case we were kind of a guest speaker, or guest attendee, at other meetings, and people have started to find us and either say, “oh we have 73 personas on this topic,” or, “we don’t know. We think we have 73 personas. We can’t find them, but you should talk to this person.” It’s been, I would say, very grassroots, but we’ve established this monthly meetup, we at least know a handful of people who are here and who to reach out to, and we’re getting the word out. People sometimes reach out to us.

Steve: So, does that step outside of – originally you were sort of describing where CBRE Build is, sort of what its mission is. Are you now making these connections outside of that?

Michele: Yeah. So, I’m making the connections – outside of CBRE Build my mission is definitely to establish the New York City office as a center of excellence for user research, but also in so doing, because we’re under the same umbrella, the digital and tech team umbrella, that part of that is just elevating everybody else. And also, to do our job, so it makes sense for us to talk to people in other locations and other parts of the business because of the overlaps.

Steve: What was going on at the organization – when you were talking about being part of CBRE Build, there’s obviously some desire, or hunger, or some sense that there’s more that we could be doing. I know that’s sort of before you were here, but I’m wondering, what are the conditions that set up a willingness to drive the kind of work that you’re doing for them now?

Michele: I think that the company has always embraced user centered design and understanding the users. All the product managers that I interviewed with and spoke to were doing the best that they could with user research. Sometimes they hired outside contractors to execute research, so there was already that interest in user research. People were doing their own jobs-to-be-done work. They were ready – again, doing their own interviews. And some people had taken UX classes, or had some background in user research, and similarly there were designers here who were also doing what we would call user research. So there was already a good platform, having product managers and designers who were trying to make their products better for the user, not just a technology play. And I think that was really exciting. What also attracted me is that it wasn’t simply the end side of a project that they wanted. They also wanted help starting to develop MVPs and being more experimental. Like, we have some insights, can we make something? Is this really the right fit? Okay, great – it is, or it’s not quite, we need to go back to the drawing board and tweak that a little bit and then get into building it right. So, the fact that they were talking a lot about that, doing more innovative processes, and were open to experimentation – that made me want to come here.

Steve: That makes me think about – you know, when researchers are looking at organizations to see if they want to join them, some of the things that you mentioned were not research specific, but about how products are being made.

Michele: Right, right.

Steve: So, you’re looking for sort of the context that’s going to make the research that you want to do…

Michele: Yeah, yeah. So, there were designers here, and at the same time they have since been hiring – they are going to be hiring a new design director, but there was already the foundation of the designers in this office. And then I had a sense that there might be other designers in other pockets of the organization. I didn’t know to what extent, but I definitely understood they were trying to improve product development on all levels – research, formal design, as well as product management. So, they had those tools.

Steve: You mentioned one of sort of the three different things you’re doing was around building tools for people who do research (PwDR). Do you have a perspective on – I think this question of who does research and what do we do as researchers to support that – is there a philosophy or a goal kind of beneath that for you that you think about?

Michele: The philosophy, I think, is that the trained researchers should be taking on the most critical, the most risky projects. That’s where they can really add the most value. Are they going to lose a lot of money? Is this a totally new workflow? Is this really new to the world? So, really focus trained researchers in that area. I believe that once people are more familiar with some of the usability basics, and they really embrace it – like, we really want to understand which concepts are resonating better, which aspects of the concepts – and they do it with an open mindset, they can start to get feedback on a concept.

One of my favorite stories involves a designer. Originally I think I’d led several of the sessions, so he understood my script at that point, and then we set him up so he was going to lead the sessions with these other participants who were recruited and already met the criteria. And I remember him coming back and saying he was just showing them the concepts – I think it was probably 3 or 4 people – and they just weren’t getting it. He commuted on the bus, so he went home on the bus, and he redid the concept based on that feedback, and eventually it worked – it tested better. But the fact that he, even on his own, exposed 3 or 4 people to it without me saying can you do this, can you do this, and 3 or 4 people couldn’t get it, and he finally understood there was some issue with the concept – it wasn’t about him and it wasn’t about them, but he needed to take out information and redesign it. That’s my favorite story, because we set it up, he had my original script, he didn’t have to find the people, but he got that insight and redesigned it. Someone who maybe isn’t trained, but being in the right circumstances, and then being able to act on it.

Steve: That’s great. I mean that is a good example of sort of setting people up to be successful. And you didn’t need to be there for that, so you could do something else.

Michele: Right. But he was also very aware and the fact that he was then motivated to be like oh wow, it really, really isn’t working. You know, let me see how I can rework this? And that he was humble enough to say – not have the ego or anything, but to really say okay, something doesn’t make sense. You know.

Steve: Right. Because you can experience those research sessions in different ways. Even that phrase, “they aren’t getting it,” starts to point outward.

Michele: Right. But it was also that he persisted and did all the sessions and then realized, okay, there’s some issue. Again, this was several years ago, but I remember the essence of that. So I think if you can help set that up – I believe in helping find participants. I’ve done this in other cases. Sometimes you might have a dedicated recruiter; sometimes you have a research ops person. There’s still skill involved in getting the right people, so I don’t think other people should necessarily be taking that on, but having some of that infrastructure around means everybody doesn’t have to worry about it. Research or research ops can say, we’re going to help find the people, we’re also going to help you write your script, and essentially, this is what you’re looking for.

And also having enough people in the room. I also believe in watching – if the people who are going to do it have watched professional researchers do this, if they’ve been in enough sessions, they understand, hopefully, what they’re looking for and what they’re not. I wouldn’t throw somebody in without any training. They can also reuse scripts. In some cases I’ve done a lot of automated usability testing, and usually designers and product managers love that because they can watch the people do things on a real screen. They’re not even having to worry about writing their prompts – they’re maybe reusing or rewriting a prompt that I wrote or another researcher wrote – and then they can go back and watch that. Those sessions are usually really insightful for the designers to watch.

This is making me think of another thing, which is that it works when it’s very focused. Other people doing research – some automated research is really good, or even their own, if it’s super focused. It’s very ABC: could they get from A to B to C? No, they got lost. Okay, that is easy to identify. Where it gets really complex, where there are a lot of interactions, that’s where you probably need a team, more dedicated researchers. But I definitely think that people can be trained to do things like that.

Steve: So, what are some of the most impactful tools or processes, or things that you’ve set up to help these folks?

Michele: So, we’re still in the process of setting it up – we’re calling it kind of a research Wiki. Off that research Wiki we have templates. We have past surveys that you can reuse, or surveys with sample questions – if this is an automated study, or if it’s a survey, here’s what those questions look like – so that it’s starting to be a little bit more self-service. We’re starting to link any journey maps and any personas that were done for projects that might relate to your project. We’re not calling things by the project name or a code name, but by the type of person, making sure that anybody else can find that. So we’re basically hacking Google Docs to make a series of mini-Wikis, but it seems to be working.

And then on one project we’ve actually gone so far as to use Airtable to capture some of the feedback on key elements. That’s something that’s still, I guess, under development, but the product manager actually loved Airtable and she had set something up before, and because Airtable is really like a giant spreadsheet, I was able to say, well, here’s where I would put in a few recordings, here’s some clips, here’s some evidence that I would also add as I find it. We were doing something that lasted over several months, and we got into the habit of at least trying to add one or two lines with keywords into Airtable. A really nice outcome of that is we’ve now been able to link those Airtable nuggets and pieces of information to Clubhouse tickets, which is where the design and engineering tickets are. So we’ve been able to re-look at what’s in Airtable, say these things all relate to a certain topic or theme, create a link, and then when an engineer or designer is working on that, we can send them a link that literally has all the evidence, at least at a high level. If they ever want to know more, they can go back to the entire survey, or the entire research report, or something like that, all off that link.

That was exciting for me because of the impact of finally having a process, at least on one project, where – sometimes in the past things have gone into repositories, but they weren’t always linked at the time to a design or an engineering ticket, and there was a lot of rewriting. This is something that’s in progress. We hope to do a blog post and share it out, but it’s working for this one project. If it’s scalable, we’d love to share some best practices and learn from that.
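For readers who want a concrete picture of this kind of linking, here is a minimal sketch, assuming a hypothetical base ID, table name, and field names (these are illustrative only, not CBRE Build’s actual setup). It pulls themed research nuggets from Airtable via its REST API and formats them as an evidence list that could be pasted into a Clubhouse ticket:

```python
# Hypothetical sketch: pull research "nuggets" tagged with a theme from an
# Airtable base and format them as an evidence list for a Clubhouse ticket.
# The base ID, table name, and field names below are made up for illustration.
import os
import requests

AIRTABLE_TOKEN = os.environ["AIRTABLE_TOKEN"]
BASE_ID = "appXXXXXXXXXXXXXX"          # placeholder base ID
TABLE = "Research%20Nuggets"           # placeholder table name (URL-encoded)

def fetch_nuggets_for_theme(theme: str):
    """Return all records whose (hypothetical) Theme field matches `theme`."""
    url = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
    headers = {"Authorization": f"Bearer {AIRTABLE_TOKEN}"}
    params = {"filterByFormula": f'{{Theme}} = "{theme}"'}
    records = []
    while True:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        data = resp.json()
        records.extend(data["records"])
        if "offset" not in data:       # Airtable pages results via an offset
            return records
        params["offset"] = data["offset"]

def format_for_ticket(records):
    """Produce a text block of evidence to paste into a Clubhouse ticket."""
    lines = []
    for rec in records:
        fields = rec["fields"]
        lines.append(f"- {fields.get('Summary', '(no summary)')} "
                     f"[severity: {fields.get('Severity', 'n/a')}] "
                     f"{fields.get('Evidence link', '')}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(format_for_ticket(fetch_nuggets_for_theme("menu navigation")))
```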

Steve: You mentioned repository and that seems to be sort of a trigger word in the user research field right now.

Michele: Yeah. So, I think a research repository is definitely a trigger word. I had the opportunity to work with Tomer Sharon, who started Polaris at WeWork, and I was one of the people who did not create Polaris so much as tried using it as a dedicated researcher and provided tweaks to some of the early information in there. What some of you may or may not know is that Polaris started as an Airtable base before we created a homemade tool at WeWork. I believe some of the lessons I learned from working in Airtable there, and from working on the homemade tool Polaris, I was able to apply here.

What I took from that was making sure there are single line items that can stand on their own as a kind of summary or keyword – like, the participant did not understand this word; they used this word instead of that word. Things that are very self-explanatory to anybody reading. And then if there is evidence: is it a screenshot? Is it a link to a recording, even with a timestamp? Sometimes we can put in a little clip of, hey, the user really had trouble doing that. All of that went into Airtable, and then at least being able to search by keywords. Sometimes I was prioritizing by impact to the user. Is it critical – is it really going to be a showstopper? Is it moderate – they’re going to kind of muddle through, but it’s not a showstopper? And then minor – it’s going to impact them and irritate them. So, putting that in, at the very least. And then what the designers and the product manager also loved is making sure there was an area for notes or ideas, because sometimes people have ideas and I think you should be able to capture that at the time. Maybe it’s, we don’t have this in help – and not that you want to rely on help, but in some cases it could be, we want to make sure they can find this keyword in help because they all use this keyword. Or, they use this other competitor tool, here are the ideas. Or they used a consumer tool and the consumer tool reminded them of this tool, we should look into that. So, making sure we can capture all of that.

For me it’s been a challenge. We had some verbatims from surveys that were great to add into Airtable, but I also had to add high level summaries of surveys or items that were maybe not as conducive to Airtable, so we’re essentially hacking Airtable to make this repository, and for that reason I think it’s not complete. I still have a research Wiki, which is really like, oh, you’re looking for the personas for the project, or here were some key things that we did. I did a buy-a-feature type of workshop once with some participants to really help us prioritize other features. That didn’t lend itself well to Airtable, but we have the people recorded saying, hey, this is valuable or it isn’t. So, I feel like research repositories bog down because, in order for them to be useful, 1) you have to fill them and 2) you’re not always getting the same types of input. Sometimes it really is observing people, and even if you’ve got the transcript you still need the context of what feature they were trying to buy to add really meaningful things into Airtable, and for it to make sense at some later point.
So, I think there’s room for growth in terms of research repositories, but at least I have a link to say, hey, we did this. If anyone is considering these 10 features, go back and at least look at some of these notes – we have people prioritizing that. There are so many different types of research, and to make the repositories perfect, I don’t know if it’s possible.

Steve: Are there processes that you need to put in place, either on – I don’t know, is it fair to call it the production and consumption sides – for the different users of these repositories?

Michele: So, yeah, a lot of it fell to the product manager, who already loved Airtable, and myself. What I wound up doing is making sure that if I had verbatims that would add value, at least those verbatims got in. I know that it’s challenging. I think what will help is making a form for when people have other things to add – we do have some engineering and production teams who are catching bugs and other feedback, or even something that doesn’t seem right, so there’s a way to create a form to add that, and that’s been really good too. I know that other product managers are using Airtable, but I’m not sure how they’re doing that. And I know it’s time consuming – like I said, even if you get the transcript, you still have to translate it into something that’s meaningful. So, making sure that it’s useful and not going to weigh you down as you do your projects. Having every product manager enter it, I don’t know if that’s reasonable. As a researcher I certainly can’t enter everything either. I’m linking, hey, we did this awesome buy-a-feature prioritization exercise, but we have spreadsheets, we have videos, and I certainly didn’t have time to enter all of that. And also, we’d moved on. I think that’s the nature of some of the repositories: oh great, we did that, we prioritized, we learned from it, there are some high-level takeaways. But to go back and put in all these nuggets – it’s frustrating to me, because I’m sure there are nuggets that could be used in, like, 5 months, but it’s also not time pressing for me to do. I need to help the project teams move on. So, I’m pretty sure time is a challenge.

And then also making sure that everyone, like you said, can access it. Like other teams, we have Google Docs, we have Outlook, we have Airtable – we have tons and tons of tools – so staying on top of that. The reason I wanted to put the feedback into Airtable is because the product manager was already using Airtable, and I add the links that have my research evidence in Airtable to Clubhouse because the designers and the engineers are using Clubhouse. What I’ve found in the past is you have to link it to wherever the teams are. In some cases I’ve used Wikis and Confluence and other tools to do that. We definitely had that challenge sometimes with Polaris – people could find information when it was given to them, but it wasn’t directly linked to where they started those tickets, or they weren’t aware of it. So, I think this is a step forward: as a researcher I’m actively helping to put that in. I know other colleagues do that – they put things into Jira, they put things into the Wikis, wherever designers and engineers are going to need to be acting on it.

Steve: Is there anything that – what do you have to do to create conditions where people are going to say, oh I have this question, let me go see if there’s any research that speaks to it?

Michele: I think the first stop would probably be talking to a researcher and saying, are you even aware of that? And a researcher might say, yeah, I know we did some projects like that, or, talk to so-and-so, I think they did that. What I’m trying to do is create the research Wiki so that hopefully, if those topics or personas come up – again, because we’re in commercial real estate there’s a lot that deals with brokers – it helps you find the right one. At least there will be a place to start, and then people you can talk to. I don’t think sending someone to an Airtable database is the right way. We also have something in the works where we’re going to be creating essentially a curated repository, and this is something that the other researchers and other people wanted: here are journey maps, here are personas, here’s information that we’ve all agreed is okay to stand alone, so someone who wasn’t on the project can find it and use it. We’re going to put that on more of a – not a public site, but an intranet site, essentially linked to user research. It will definitely be curated, and I think people might start there. So, that’s one more place.

Steve: So, even people being willing to say oh I have a question, I should approach either researchers or internal researchers, as opposed to, I don’t know, doing nothing.

Michele: Yeah. I think the first thing is, is there a researcher or someone on your team or product team who has probably been doing that? They probably know, or they can probably be a resource to get in touch with other people, because again, we do monthly meetups and we’re in touch – even designers might know if a project has been done. And if you focus on what the user type was, and not the project and the code names, then you might actually say, well, 3 years ago we did this type of work and you might find value in that. That’s what we eventually want to do with creating these curated areas – as I said, in talking to the other designers and research people, having an area that’s curated is really important to us, because there’s a thought that you could find all these random usability studies and draft reports, but 1) they may not have value to other people – something from 5 months ago, or even the usability test I ran last week: the menu that we were testing has already changed in a week. The insights aren’t relevant anymore, and that’s fine because we actually changed the menu, and that’s great, but I don’t know that anybody needs to be able to access that forever. The project team still has it, but putting that into a research repository or something accessible by anybody, I don’t think is going to add value at this time.

Steve: I like the verb curation kind of attached to research repository. It starts to make sense given the complexity of the context you’re talking about.

Michele: Yeah.

Steve: Can you talk a little bit about how the team has evolved in your time here?

Michele: Sure. So, in terms of the broader team: when I started in November of 2018 there was someone who was working part-time as a researcher. She had originally been in support, she’d been here for 2 ½ years, and she’d kind of learned user research. She was awesome, but it was also very self-taught – it was great that she had a user mindset. I asked her to give me a background on what was happening and what tools we had, and we started with, well, what tools is everybody using? Even just identifying, oh, there’s Airtable; oh, one group was using Optimal Sort; we have licenses and information. So, okay, again, this kind of confirmed people are doing things for user research, but maybe it’s not organized. So, we started to create that list so we understood what tools we even have access to, and whether I need to buy tools.

And then at the same time they’d been recruiting for that design director, as I said, who will start later in the summer. So, that was new, that there would actually be a design director – designers had been reporting to a lead or a group product manager, so that was a change. And then also working more closely, I think, with Seattle, instead of me just being focused on building out New York as a center of excellence. There was a lead researcher in Seattle who has since gone on to become a UX manager, and he hired another researcher there, so for a few days we had, I think, three or four people who were actually dedicated researchers, though people’s titles were changing. And just last week we had another researcher start here in New York City. So, depending on how you’re counting, there are probably four or five people, some part-time at different levels, but within this Build group we finally have the start of an actual user research team. And I say team loosely, because we’re not necessarily all reporting to the same people – we’re reporting up to the same SVP – but that’s why I also see us more as a center of excellence. We now have some Wikis, we have shared tools, we have a Slack group. When someone got a demo last week of a new tool we were considering, she said, oh, I’m going to put this right into this tools folder, and I was so excited because we had that and we didn’t have anything like that six months ago.

The other thing is it’s been really exciting to see the product managers take interest. Some of the ones I’ve worked with have been doing jobs-to-be-done and different types of projects. And to see what they’ve been doing with their research – in one case we were able to help create a journey map that one product manager saw and said, I can take this to my clients. So, I think the impact on the team is starting to actually say, here’s what research can do, in ways that you didn’t anticipate. So, we’re starting to change the mindset, potentially.

Steve: Whether it’s here or in other roles that you’ve had, I’m wondering, when you look at people who might potentially join your team, what kinds of things strike you about researchers who might be prospective new colleagues?

Michele: Um, I think you have to be super inquisitive and analytical. So, are you going to just take – hey these are the top three problems and is that a given? I think that researchers and UX people tend to say well no that’s not the – you know we’re going to poke holes in that. So, I think, that’s number one, being inquisitive and analytical. I also want to understand if they have – you know do they have that user mindset? Are they able to understand different perspectives? Have they done different types of projects where it’s not just following somebody home, but it’s like literally spending the time working with them. I’ve had the fortune to work across different types of product industries. In one case, working for a medical company, I was able to – I spent all day driving along with an oxygen delivery person and just knowing that you’ve had that chance to do field studies and that you haven’t just been in a lab is something that’s really important to me that you have – ideally you have wider experience than sitting behind a desk.

Steve: How do you – like, what are some of the cues of that user mindset? How does it exhibit itself?

Michele: Well, I definitely like to understand, if they’re presenting their portfolio or a case study: what were you really asked to do? What was the actual request, and then what did you decide to do? How did you interpret the problem? How did you decide to approach it? And then what were really those key insights – what was the impact to that user, not from a usability perspective, but from a higher level, like here’s really where they were having these issues, they couldn’t figure something out, they were struggling to do that. Did they really then advocate for that key change? Something’s not right, we saw that pain point, and then that became, ideally, one of the key insights that informed a project. And so maybe they had that big impact. So, that’s what I’m trying to say: can you see some of the challenges? Or even if it was something that a user loved, were you able to take it from another place and say, we’re going to impact this project or projects with that? Because that’s so valuable – they get really excited about that, we don’t want to ruin it, we want to make sure they can keep doing that or do that better. Did they have that user mindset, that user insight, that they were then able to be the champion to take it through the project?

Steve: You know that’s such a good point. I think we all, and myself, certainly get caught up in we’re about finding problems and fixing them. But your point, we find things that people love, those are huge opportunities to either protect, or celebrate or upstand.

Michele: Right, right, right.

Steve: Yeah. It’s easy to get lost in the fixing – we’re here to fix stuff.

Michele: I think so, and then also having that freedom is sometimes different. It’s like, oh, we don’t have to only fix stuff. They love it – let’s give them more. That might not be a bad idea. Can we make it so they can spend more time on it? I think in this case, for a lot of the knowledge workers, let them spend more time analyzing. If they’re really good at coming up with these insights and they want to find the best office for their clients, well, they don’t want to be just entering the data. They want to be reviewing the images that we’re providing, doing the analysis – let them do that. That might be something that they love.

Steve: Right. It reminds me of projects where the assumption going in was we were going to kind of remove steps and save time and remove effort and that you find that there’s parts of problem solving, or all sorts of weird parts of our lives, that people find satisfaction or even like joy and delight in.

Michele: Yeah, right.

Steve: And that our job isn’t to just eliminate stuff. Like you said, there’s parts of the work that are…

Michele: Right, right. I remember interviewing someone when I was working on – I think it was a small business product. And she was like, well, I love paying the employees. It took a lot of time, but at the end of the week she felt successful; she was happy to pay her employees. So even understanding that step – when she’s using our product, she’s happy. It’s not a chore. She’s happy to pay her employees and she wants to reward them. So, I think having some of those insights and saying, oh, this is a potential delighter, how do we build on that, is something to keep in mind.

Steve: Right. That seems like a potentially big reframe from small businesses don’t want to spend money. Every loss is a loss. Every outlay of money is a loss. And this person is finding meaning in that.

Michele: Yeah.

Steve: Because of what it symbolizes to them and what accomplishment there is.

Michele: Yeah.

Steve: That’s great. I love those things in research.

Michele: Right.

Steve: Because you can understand what you’re really providing.

Michele: Yeah.

Steve: So, you’ve mentioned a couple of different contexts like delivering oxygen and so on. Can you talk about some of the other roles, or maybe even just how you got into this work? Maybe we’ll start back there.

Michele: I got a Master of Science from Cornell, in human environment relations with a concentration in human factors and ergonomics. When I started, the Internet was just getting started. I did my thesis on some household tasks and products, and it was more about physical design, but I remember I hand-coded my survey because HTML was just starting out and they didn’t have Survey Monkey or anything, so I was doing things like that. My first job after grad school was working for Kohler Company – they do kitchens and bathtubs – and I had the chance to work with a lot of really talented industrial designers. Eventually my career included working for the Consumer Product Safety Commission, and that’s where I learned a lot about safety and hazardous issues that could have been prevented if usability and design had been better. The work there was more focused on standards writing and reviewing, and it was really informative for me, but I wanted to go back to product development.

Steve: What is the Consumer Product Safety Commission?

Michele: This is a federal agency. You might know them because they do all of the recalls. They also are the people that tend to put those warning labels on different things that come out. There are a lot of voluntary standards that you might – they happen to appear a lot on consumer products. Maybe baby products in particular, but they affect a lot of different products. I remember looking at a gas grill that I think you bought on an infomercial and it kept kind of tipping over, and the human factors and usability perspective of that was that it looked like you could light it in one place, but that actually wasn’t where you could light it. Again, this is probably a really cheap grill, but the reality is, from a human factors and design perspective, why wouldn’t you light it in this one place? And when you lit it there it actually caused accidents. And so that had to be recalled. But that was a really different, I think, experience than a lot of my colleagues have had. And so that – I was grateful for that experience, but I wanted to get back into product design. I eventually made my way to a medical products company which was exciting because it was starting to blend the physical, like products like sleep apnea and oxygen machines which also had some digital screens and different – you know just different types of products that I think today we take for granted, but back in the day it seemed like things were either all digital and very, maybe archaic, or they were almost all physical with just some lights or something. And so that was where – I think it preceded the mobile phones, but I had a sense of designing with kind of physical and digital at the same time, what we would call hardware and software today, but it wasn’t always called that. I was working with industrial design and product designers who were physical product designers at the time.

Steve: How did you find your program – your school program? How did you find your way to that?

Michele: That I think was a fun story. I did my undergrad – I wanted originally to study architecture. I spent a little bit of time with the intro to architecture classes. I decided that actually I didn’t want to be an architect. I wanted to study the psychology, and I think I somehow understood more about designing for the people, but I didn’t know anything about what it was called. And so, I finished my undergraduate degree in psychology and eventually I started just talking to people. I think the OXO Good Grips had come out – I think I did an informational interview to say, what is this? This seems really interesting. I think I called somebody and got some leads and eventually I remember just talking to everybody until somebody finally said, we think this is your field. There were a couple of programs – one was at Cornell and one was in Wisconsin. And eventually I found these programs that said this sounds like what you’re doing and I applied and I was lucky enough to get in, and that’s how I did that. It was definitely a researcher doing research to do that, but also really using your network and just saying I think there’s a way to make all these things easier. It seems like people are doing this. I know there’s something related to psychology and design. I don’t know what it’s called, but I think there’s something here. People seem to be doing this type of thing.

Steve: If you – hindsight is, of course, 20/20, but would you – is there a different path – given where you are now, is there a different path you might have taken?

Michele: I don’t know. I mean – yeah, from psychology, I mean in terms of – I think, like I said, I wanted to apply it. Definitely I could have gone to law school. I could have done really, really different things. But I think it would have been great if I’d been more aware of UX programs, if the programs that exist today had existed then. But I think I kind of had to make my way through the career and I’m happy that I got a chance to, again, work on physical and digital and mobile products and things like that – because it was sort of me trying to figure it out, and maybe that’s not as different from other UX people: oh, this seems interesting, maybe I can go work in this area, or there’s ways to apply this. I think it worked out okay.

Steve: I’m sure there’s some analysis we could do of kind of when you were in school or when you were sort of looking for your jobs what you were aware of in terms of what kind of work there was, and what kind of educational options there were. Because, right, maybe if you’re starting now you have access to certain choices. If you started when you started, or when I started, you and I were doing different versions of finding our way with some serendipity and some sort of opportunistic research, but the context was so different.

Michele: Right. I think I would also add, I think the assumption was I had this undergrad degree in psychology. I think I just got a job – like I wasn’t doing anything in the field. I was working in a publishing agency, which was great, but it wasn’t – I was really doing that to pay the bills essentially. I didn’t – had I been able to be an apprentice, had I found something else, maybe I wouldn’t have gone to grad school. I think the assumption at the time was there’s something out there. I think I probably assumed you had to get training and go to grad school. And so that’s the path I took. But potentially, in this day and age, you might do more of an apprentice, or you might join a bigger company and see if you can get training on the job in that sense.

Steve: Just hearing you talk reminds me that there’s a history of this profession. And relatively speaking we’re still talking about recent times, but it seems like the field goes back longer than we might think – maybe we collectively forget the history.

Michele: Yeah. Definitely, I mean, I think that’s unknown, right – that it goes back to people working on the planes and the pilots – you know, the controls were in the wrong places and then they had to standardize, and that was really kind of the origin of the field, of human factors. I think a lot of the work they were also doing, as I understand it, probably with the early computers, even with NASA and things in the 60s, was definitely pioneering the work of human computer interaction, which sounds so goofy to say today – human computer interaction, or human systems interaction. But it definitely goes back decades, way before us, and generations.

Steve: Right. You have a human factors degree. I have an HCI degree, you know human computer interaction. And those seem like rare terms. I don’t find people with that as their background as much. We didn’t have the phrase user experience, or UX, or user research, or any of those things.

Michele: No, no.

Steve: So, we were finding our way.

Michele: I don’t think I had the user research title until about 10 years ago. I think that’s also when some of the industry changed a little bit, but people were doing task analysis. They were trying to understand the system. They were interviewing people. They were testing – whether they used the word usability testing or prototype testing or concept testing – they were essentially going through all of the things that we do today. They just might have called it something different.

Steve: I mean, in some of those sort of classic examples of human factors research, there’s just a lot of detail to the decisions that are being made.

Michele: Right.

Steve: A few minutes ago, we were talking about reframing our understanding of what the problem is. So there are other axes we consider, but just along that detail vs. big picture axis it seems like there’s a big range of what researchers do.

Michele: Yes. I think researchers have a really key role to play at both parts of the double diamond process, if you think of the traditional, you’re trying to design the right thing and can you inform that? What are the key insights? Are we going to build the right thing? Maybe researchers are helping to take those insights into a high level, MVP, is this even the right fit? But once you’ve decided to build the right thing are you building it right? Can you test multiple concepts? Are there three or four ways that you can actually figure that out? If it’s totally new to the world, even if it’s for the future, maybe there’s two or three different ways you can do a new feature and then can you test that and iterate that? And so sometimes it gets into really detailed design, all the way down to the icons, the placement on the page, the labeling and all those other aspects that really go into refining a product and making a great product.

Steve: And you like that part of it?

Michele: I like both. You know, I think – I love the idea of really working at the upfront doing really generative and informative research and going out and saying we’ve interviewed all these people, what’s really working, what’s not. What’s their workflow? What are things that nobody’s found before and how could we maybe take this into something new? I love doing rapid experiments to say maybe there’s something we could build based on that, and taking that into what you might call an early MVP. But of course, once we do it I want to make sure things make sense and that it’s really easy to use, and so that’s all dear to my heart at the end of the day.

Steve: When you think about the user research profession, or the practice, whatever we kind of are collectively, do you have any hopes or goals in the sort of medium term for this field?

Michele: I hope that we work more closely with all the different, I guess you would say, factions of user experience and design. On one hand I love that there’s a zillion different meetups and I can go on Eventbrite and see anything related to UX and research. On the other hand, I get bummed out that I’m like oh actually they should have just talked to that other group. There are a lot of groups – there’s UXPA. There was IxDA, I don’t know if they’re still active? But there are so many groups that have either been active or could help some of these, what I’m seeing as random meetups. And I would love if some things got a little more organized, or if there was just a way to say you don’t need a new Slack channel. You can talk to all these people. There are communities. And letting people know. And maybe that’s, at a very high level, just formalizing it. Yeah, I’ve been in the field for a while. I don’t like that we still go in circles about what the title is, and so I’m okay not having the same titles and things, but I would love people to know there is definitely a very long history of people doing this work, and to make sure that people have access to that and the resources.

Steve: Yeah, it seems like we – as people and certainly the technology profession, we have a short memory. We’re attracted to kind of the shiny. I mean this was already a long time ago, but I remember someone telling me a story about – I think that they were at an agency and they went in to talk to somebody about doing research and the person said to them, alright, what’s new, what do you got? And like even that idea that a thing that you’re going to be doing with research should be new. I mean obviously contexts change.

Michele: Right.

Steve: And we adapt to them, but just the idea that new itself was sort of – was a value, seems to sort of fly in the face of what you’re saying, there’s a whole history here that we’re leveraging. We don’t need to reinvent it.

Michele: Right, yeah. I’d probably be happy just to let – if there was like a blog post or a tweet that said hi, there’s an entire history, here’s fields if you’re new and getting started – here’s three or four fields that you might want to check out, related fields. Here’s – you know there’s tons of templates and resources that people have been putting out over the years, here’s 10 of them, or something like that. And here’s – if you’re anywhere in the world there probably is a meetup that’s related to what you’re looking for. It just might be called UX or HCI or research or design research, or something else.

Steve: Right. All the variations of all the terms.

Michele: Yeah.

Steve: Is there anything else you think we should talk about today? Anything I didn’t ask you about?

Michele: I think this has been really positive. I’m happy to chat with anyone else who’s also trying to establish research as more of a center of excellence and building up the processes and the infrastructure that go beyond one team or a series of projects. I’d love for you to reach out.

Steve: What’s the best platform, or whatever, for people to connect with you on? We’ll put it in the show notes for the episode.

Michele: Twitter is great.

Steve: Okay. Well, thanks very much, Michele, for a great conversation. I really appreciate your time.

Michele: Thank you very much, Steve.

Steve: Well, there we go! Thanks for being here for this episode. You can find Dollars to Donuts wherever you do your podcast listening. Didn’t we used to call that podcatching? I don’t know. You know you want to give us five stars on Apple Podcasts, am I right? Visit Portigal dot com slash podcast for all the episodes including show notes and transcripts. Our theme music is by Bruce Todd.

Jul 14 2019

54mins

Rank #6: 17. Tomer Sharon of Goldman Sachs

In this episode of Dollars to Donuts, I talk with Tomer Sharon, the Head of User Research and Metrics at Goldman Sachs. We talk about how to assess potential hires for user research positions, infrastructure for capturing and searching a body of data, and developing a practice inside a willing, yet large, organization.

Some parts of kind of pure research are creative. Probably the biggest one is translating a set of questions that a team has into okay, what are we going to do to get answers? If it was that easy to come up with an answer to that, then anybody could do that well. That’s not the case. A lot of people are having a lot of trouble with that part. So, I think that’s a creative part. You’re not going to see a beautiful painting coming out of that, but it is creative. – Tomer Sharon

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Greetings, humans! Thanks for listening to Dollars to Donuts, the podcast where I talk to people who lead user research in their organization.
Over the past while I’ve been putting together a household emergency kit. It’s primarily a shopping exercise, and I’ve ordered a hand-crank and solar-powered radio, a replacement for matches, latex gloves, bandages, and air filter masks (which we made use of during a period of dangerously poor air quality recently). The last step was getting some food that will last – cans of soup and stew, crackers, single-serve breakfast cereals. There’s something satisfying about acquiring a bunch of stuff and storing it away, somewhat organized. And that led to a stray thought that I noticed – “Oh, I can’t wait to use all this great stuff!” And then I realized how crazy that sounded. I don’t want to use it! I don’t want there to be some emergency that is bad enough that I’m drinking the emergency water stored in the garage and eating canned stew, also stored in the garage. I mean, yes, we’ll eat or donate the food before it expires and replace it, but it’s a whole set of preparations that I hope I’ll not use, which leaves me with the hope for no shopping gratification, kind of a confusing way to feel.

But it did remind me of the recent workshop I led with researchers in Sydney, Australia. We looked at a lot of the user research war stories I’ve been collecting, like those published in Doorbells, Danger, and Dead Batteries, and we pulled out lessons and best practices. There was – as there always is – a lot of discussion about safety and preparation. It seemed to me that people who worked in organizations with established safety cultures already had a strong baseline of safety procedures for user research and fieldwork, like texting a contact before and after a session, not going out alone, and so on. Their work cultures were strong on processes, especially for safety, so thinking about this for research was obvious. But not everyone works in that kind of environment, and plenty of researchers work for themselves, without that corporate structure to support them in creating best practices for safety. Anyway, it led to a lot of discussion beyond just safety about running through possibilities ahead of time so that when any situation comes up it’s not a surprise and there’s at least a starting point already established about how to respond. I think this is a great idea, but I think we have to acknowledge the limitations – you can’t plan for every possible situation, there are always going to be things that come up that you probably haven’t ever considered. I think that some planning for the unexpected will help you to adapt in the moment to surprises, but that’s different than the false comfort of assuming you have every contingency planned for.

I hope I never have to make use of our large cache of sterile latex gloves, but maybe just having acquired them I’m in a slightly better situation for some other unexpected situation?

You can help me continue to produce this podcast for you. I run my own small business, and you can hire me! I lead in-depth user research that helps drive product and strategy decisions, that helps drive internal change. I provide guidance to teams while they are learning about their customers, and I deliver training that teaches people how to be better at user research and analysis. You can buy either of my two books – the classic Interviewing Users and Doorbells, Danger, and Dead Batteries, a book of stories from other researchers about the kinds of things that happen when they go out into the field. You can also review this podcast on iTunes and review the books on Amazon. With your support, I can keep doing this podcast for you.

All right! Time for my interview with Tomer. Tomer Sharon is the Head of User Research and Metrics at Goldman Sachs. He’s worked at Google and WeWork, and written two books – It’s Our Research, and Validating Product Ideas Through Lean User Research.

Thanks for being on the podcast. It’s great to have you here – virtually here in audio space that we’re all sharing. Why don’t you start off – just do a little introduction. Who are you? Where do you work? What are you doing?

Tomer Sharon: Okay. Thank you for having me, first. My name is Tomer Sharon. I am currently Head of User Research and Metrics at Goldman Sachs. I do have a second day job. I’m also heading a design group for a product called PWM (Private Wealth Management). Yeah, this is where I’ve been for the past – well, almost a year.

Steve: Alright. So, what is Goldman Sachs?

Tomer: Goldman Sachs is, I would say, an investment bank. Probably one of the more important banks in the world. Big corporate. Definitely not one you would associate with design and research, at least not that type of research. But they are changing and they’re celebrating 150 years this year and they’re moving towards what’s called outside digital transformation and that includes learning more from their audiences and investing a lot more in design.

Steve: What’s the relationship between the existence of your role and this larger shift that’s going on?

Tomer: I think there’s a strong relationship. They have been realizing that they can’t just be living in their own box and they have to open up and try and understand audiences that they’re engaged with already and new audiences. I’ll give an example. Goldman has a commercial bank that’s called Marcus. It’s been around for a couple of years, but still these are consumers that Goldman is now trying to attract. So, it’s definitely not the kind of typical audience that they’re used to. So, they understand that they need to open up, learn from them, and design for them and with them, and that is a shift that has been happening in the past few years. And my role – I didn’t replace anyone, I’m the first one – is a part of that shift.

Steve: Is there any sort of one point or one incident, or key moment, I guess, which sort of marks that transition in a company like Goldman to say yeah, we’ve been doing it this way, we need to do it this way.

Tomer: I think it happened – I don’t know if there was one event, but I think it happened in the past maybe two years. The user experience team there was very small and then suddenly they decided that it’s time to invest more in that. And from zero it went to several dozens, many dozens, within a year. And the more people do their work, show their work, share their work, and their work is very successful, the more teams and leaders talk about that and then it’s like a cycle that feeds itself and then it grows and grows. So, that’s been happening a lot in the past few years.

Steve: And do you have a perspective on – this awareness that you’re describing at Goldman seems to align with what I’ve heard and observed in financial services in general, that it’s an industry that maybe wasn’t seen as paying attention to consumers, users, design, and that has been changing over the past number of years.

Tomer: Yeah. I will admit, this is my first financial services job. So, I’m not really familiar with that world other than Goldman. But, so I hear. I don’t really know from first experience.

Steve: But I’m inferring – correct me if I’m wrong – I’m inferring that that also is not a conversation that you’re having inside Goldman?

Tomer: No. It’s very, I would say, kind of practical and tactical. We’re not talking about the concept of having me and people like me there. We’re just focusing on doing the work that we know how to do, we’ve been doing for many years, and just bringing that insight and understanding of that world to an organization that wasn’t aware of that previously.

Steve: That world is the process and tools of learning about people?

Tomer: Um, yeah. I would say process, tools, people. They’re used to hiring different people. I saw that in Google at the time. I saw that at WeWork at the time where even formally you don’t have in the – I don’t know HR systems – you don’t have names for roles for what we do, for who we are. I can’t remember the names, but we were engineers or UI engineers, or things like that. Until you get recognized, and I experienced that at Google, and then you do have a job family for design and for research and so on. It’s a process.

Steve: So, at Goldman you’re trying to then like – that’s part of the…

Tomer: Well, I’m not – we’re too, especially in research, we’re too few people to start having a job family in the HR systems, but we’ll get there. What I’m trying to do is first a lot of evangelism. A lot of conversations with people to plant seeds in their minds that they might need somebody like that. That they might engage in a project, a one-off project, and learn more about it. And with several groups and divisions it works already. So, we have people there and they start doing their work.

Steve: It’s interesting that you’re sort of highlighting evangelism vs. like conducting research.

Tomer: Right. Right. Sometimes it’s integrated. Sometimes I’m asking people to take some kind of a leap of faith. And then we actually do the research and then the work talks for itself. I don’t need to evangelize anymore. They get it. They understand. They want more of it and from there the doors are open. I prefer it this way. Even in the past, I preferred to do the work and show it rather than kind of wave my hands and talk about it. I found it always to be more meaningful to people and more persuasive.

Steve: I’ve heard this. I don’t know if this is what’s playing out for you in these situations, but that you help someone take a leap of faith. You do this work. The results speak for themselves for them. Do those results – or how can you help those results spread elsewhere in the organization?

Tomer: Um, I use that example when I talk with others and I invite people to connect. I mean in the same company you can hear from them. You can meet them. I sometimes facilitate those meetings. And then they hear from others. I know it will take time, that it’s not something that happens over a day or a week or a month. Sometimes a year. But these conversations happen, whether I’m aware of them, or not. Whether I facilitate them or not. I’m confident that the more work we do the more people we’ll have, and we’ll be more impactful.

Steve: How does that work across – I don’t know if they’re called business units. Goldman as you said speaks to different kinds of users, different kinds of customers, with its products and services.

Tomer: So, what I do – as soon as we have researchers joining the group, we assign them to a – let’s call it a business unit, or a product team. And then they’re theirs 100% of the time. I only support them with infrastructure, career growth and things like that. And then they do the work with the team. I do a lot of legwork kind of before that happens to make sure that they have a team that wants them, that needs them, and that knows what to expect. So, it’s not the first time that they hear about that research thing with the appearance of the person. So, that’s how I intend to continue, rather than work in an agency model where I’d have a pool of people that kind of come and go based on projects.

Steve: Okay, so, just to reiterate, you talk about help someone to take a leap of faith, and I’m maybe changing your words a little bit. And a next step from that is to hire someone and put them in that team.

Tomer: Yes.

Steve: And then they do the work and then the work, as you said, the work speaks for itself, but it’s the work that you’ve staffed someone into that team?

Tomer: Yeah. I’ll just add to that. In many cases it’s not really a leap of faith because there are people who are – I call myself an outsider – who didn’t grow up in Goldman, that know about this field, know about research, know about people like us. So, they’re very open to having them. So, that’s much easier.

Steve: So, you said you’re not looking to do an agency model.

Tomer: No.

Steve: From your face, I think there’s a point of view.

Tomer: Yes, definitely.

Steve: What’s the point of view behind that?

Tomer: The point of view is that I want people to feel they belong; the researchers feel they belong to a team. And I feel that if I just send them off to short term projects they can’t grow with the team. They can’t have any history with the product. They’re not familiar. They’re really like consultants that come and go. I want them to feel a part of the team, understand all the history, sit with them. I don’t look for them to sit next to me. I want them to sit with their teams. And then they’re going to be a part of that team and be more impactful. I strongly believe that this is the way to go.

Steve: Yeah.

Tomer: And I’ve seen that happen before with my own eyes. Not WeWork, because we didn’t do that at WeWork, but definitely at Google where we experienced, at some point, a decentralization. So, there was one UX group and they were decentralized into the different business units. Not much has changed because people were assigned to teams already, but it just felt like the right thing to do, to have people sit with their teams, work with the same team, same product, for long periods of time.

Steve: So what implications, if any, does that have for what kind of skillset someone that you’re hiring needs to have?

Tomer: Yeah. That’s a good one. So, I’m looking for more experienced people. More senior people. There will be a time where we will hire more junior people and that’s not the time when we’re just starting. And I had the same thing at WeWork when we were building a group from scratch. You need experienced people. So, I’m looking for people who are eager and passionate about building something from scratch, taking a team that maybe doesn’t know anything about user research and trying to build that relationship and build those results – we talked about that earlier – with that team, and grow with them, grow the practice with them. And maybe if it’s extremely successful grow a team there at some point.

Steve: There’s always a thing with questions and answers, at least for me. I ask a question, I have an idea what answer I’m going to get. And it’s interesting when the answer is different. I know it doesn’t always cleave this way, but there’s sort of the soft skills and the hard skills thing. I imagined I was going to get a hard skills answer because I was thinking about some of the pragmatic constraints of this, but your answer emphasized more, I think, softer skills. Growth, advocacy, leadership from a research point of view.

Tomer: I think it kind of almost goes without saying that once you’re – I don’t know, maybe I’m wrong. But once you’re – I always like to – I tell that to my people when we hire. We look for resumes that scream researcher. So, if your resume screams researcher, then that’s covered. I’m more interested in the soft skills.

Steve: Yeah. Can I push on that just a little bit more?

Tomer: Sure.

Steve: And I think – my bias here, or the perspective I come from is the consultant. So, I come in and out. So, I know what the limitations of that are and so just from where I sit, I see the challenges of what you’re describing because – I mean there’s a life cycle of when a team is developing – there’s probably a smarter way to say that. But something is being built and it goes through stages over weeks/months/years and so the research needs change and so the way the researcher has to respond to that, or lead that, or support that changes. So, that’s kind of the – I’m just revealing what I was probing on here, what I thought the answer was going to be. When you say screams researcher in a resume does that sort of imply that the ability to – are you inferring from that resume the ability to support that team and all these different areas of its evolution?

Tomer: Yeah. A resume that screams researcher is a resume where you see all the methods, you see the flexibility in kind of not sticking to one method or one approach. Doing that in multiple cultures, multiple companies. Having a trajectory of growth, so you’re moving from less meaningful things to more meaningful things, and so on. So, that to me screams researcher. Not, you know, you look at the job titles and you ask yourself why am I even reading this resume. And then you read a cover letter that says something along the lines of “I’ve always wanted to be.” Maybe at some point, but not now.

Steve: Yes. There is an archetype of a person I think that identifies strongly with researchers and sort of what they’re about, but their experience doesn’t match that.

Tomer: And I’m smiling because there’s always an exception. I remember one at WeWork where – and I was talking – speaking the same way – only senior people. I’m not looking for more junior people. And then I hired two junior people because there’s always an exception. Because you see somebody who makes you ignore everything you just said, and you hire them.

Steve: What’s an example of something that overrides that?

Tomer: I don’t even know how to explain it. It’s just – sometimes it’s the spark in the eyes that you see immediately with people and you see that they are going to be like a sponge. That’s a good metaphor. And you just know it’s going to work.

Steve: And you said passion early on.

Tomer: Yeah, yeah, yeah.

Steve: And I think evidence of passion is past performance. But that’s not the only evidence of passion if you’re seeing that with people that you’re meeting.

Tomer: Right, right. Although in most cases I’m very reluctant to hire based on passion if you don’t have experience. Because you can have a lot of passion and motivation, but you know nothing. It doesn’t mean you’re going to be bad, but it means a lot of people are going to need to support you. So, I think kind of my approach is that I’ll try and do zero to very little of that because we’re just starting. We’re just a few people. We don’t really have the time that we need to support our teams. We don’t have the time to support another person. Not now. When we grow, yeah, but that’s definitely – you know, if we were 10/20 senior people on the team I would say we have to have more junior people because we can’t have a team of only senior people. But right now, we’re not even close to that.

Steve: Can you say the size that you are now?

Tomer: Yeah, sure. We are four. In a couple of weeks, we’ll be five. Worldwide, both here in New York and in London.

Steve: And is there like a roadmap for number of people to get to that 10 or 20?

Tomer: Yeah. I don’t want to mention numbers, but I think we’re on a path of growth.

Steve: Okay. So, you talk about sort of the time commitment required to support people at different levels. I want to go back and clarify one thing. It was almost an aside. You said that you don’t want to do the agency model. You want people with these teams, but you’re trying to provide infrastructure. And then you used a phrase like career growth. So, given the size you’re at now, acknowledging that there’s bandwidth challenges here, so what does infrastructure look like? What does career growth look like now?

Tomer: Um, infrastructure. So, that’s easy. So, there’s a lot of – when I look around me there’s a lot of motivation to do research, but there are needs in terms of – and gaps in terms of tools, knowledge, guidance, process. So, research was happening before there were researchers, but it was kind of very kind of based on sporadic motivation from different people who are really trying to do the best they could do. So, we need to support them with services that are out there, industry standard services. So, we need to sign agreements and get those licenses on board, different vendors.

Steve: What’s an example?

Tomer: UserTesting, UserZoom. These would be two, but there are more. And, um – let’s see what else. We are kind of creating process for, okay what happens if you want to do research, but you don’t have a researcher and you probably won’t have a researcher in the next few years? But you still – you acknowledge the fact that you need research now. So, um – so we have some – we have established some kind of a way to ask for that and then sign up for office hours or something like that, and then get some help from us. We will advise you and help you kind of get going without the researcher. What else? We started working with OKRs. So, if that’s – to me that’s a part of kind of infrastructure. Or the cool kids now call it research ops. So, research ops.

Steve: So, explain what OKRs are and explain what research ops is.

Tomer: OKRs is Objectives and Key Results. It’s a goal setting approach. There are many. This is just the one that I’m used to, familiar with. And that’s just a way to set goals and see how you’re doing. So, we do that as a research group as well. Research ops – I would take the easy route and say this is everything that helps research happen without the actual research.

Steve: Are the OKRs an example of research ops? That’s what you were saying?

Tomer: So, the tools that I mentioned – the OKRs, hiring, knowledge management.

Steve: All of the infrastructure?

Tomer: Yes. If somebody wants to do – I don’t know, whatever – a usability test, and they don’t know how. So, you want them to have the tool. We want them to have a knowledge base that they can access and see okay, what do I need to do to run the usability test? And we want them to have guidance and support from a person, from a researcher who knows what they’re doing. And I would say all of that is research ops. For researchers, a big part of research ops is participant recruitment. So, finding people to learn from. That’s a big part. And something that I’m truly passionate about is kind of insight repositories. Building some kind of a repository that we can pull from later on. I would add to that any infrastructure involving measuring the user experience. So, building that. I would also include that under research ops.

Steve: So, you said you’re passionate about insight repository?

Tomer: Yeah.

Steve: Do I have to ask an actual question?

Tomer: What do you want to know?

Steve: What are you trying to get to? At Goldman, what are you working towards with that?

Tomer: It’s the same as everywhere. I, for many years, I realized I was kind of bothered by how wasteful research is. That even I felt that I’m “learning the same things over and over again.” I know that other people did research, in the same company, did research that I’m about to do and even if I get their insights I’m going to do that again. And I know it’s really, really messy and hard to retain all the knowledge that you gather from research. The second I had an opportunity to do something about it, I did. That was at WeWork. And we built a system that we called Polaris for that, to solve these problems. We identified the – it’s going to sound funny, but we identified that the main problem, the main root cause for these problems are – or is, the research report, that was what I called back then the atomic unit of a research insight and we changed that unit into – that’s why the metaphor breaks – a smaller atom, that we called a nugget, a research nugget. And that’s what we stored in this repository. So, a nugget was a combination of an observation, evidence and tags.

Steve: What was the last one?

Tomer: Tags. And then due to these tags you can then search through this database and find answers to questions that you didn’t design studies around. So, it happened many times at WeWork where people came to us and said, “what do we know about…?” or, “can we do a study about blah blah?” And we said let’s try Polaris first and realized that we have all the answers without even needing to do more research. This will only happen – you have to change your ways a little bit, not just work with a system like that – this will only happen if you do kind of continuous research – continuous and open-ended research. Back then at WeWork we did exit interviews. So, every WeWork member, customer that decided to leave, we pinged them and talked with them, interviewed them, and asked them kind of very open-ended questions such as why are you leaving? What worked well at WeWork? What didn’t work so well? If you had 15 minutes with the CEO, what would you tell him? Things like that. And then that allowed us to have answers to many, many questions because these research participants, these exiting members decided what they wanted to talk about. If they wanted to talk about – I don’t know, whatever – the price that was too high for them, or the coffee that was too great for them to leave to another place, or whatever it is that they chose, went into the system and then we could pull it out later on and then see okay we heard – and combining that with a user experience measurement system would lead you to a situation where you can say okay, I saw that satisfaction with coffee in our WeWork buildings in the Netherlands has gone down in the past month. Here is a playlist of three Dutch members bitching about coffee from the past month. And then you have the what happened – the numbers. You have the why it happened from these videos. And if you’re going to “serve that” to the person that buys or decides how to brew coffee in the Netherlands then that’s halfway to the solution. So, we imagine something like that at Goldman as well.
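To make the nugget idea concrete, here is a minimal sketch of a nugget-style repository in Python. The class names, fields, tags, and example URL are illustrative assumptions for this transcript, not Polaris’s actual data model or API.

from dataclasses import dataclass, field

@dataclass
class Nugget:
    observation: str                              # what was seen or heard
    evidence: str                                 # link to a video clip, quote, or photo
    tags: set[str] = field(default_factory=set)   # topics used for later querying

class NuggetRepository:
    def __init__(self) -> None:
        self._nuggets: list[Nugget] = []

    def add(self, nugget: Nugget) -> None:
        self._nuggets.append(nugget)

    def search(self, *tags: str) -> list[Nugget]:
        """Return nuggets matching any of the given tags, e.g. to assemble a playlist."""
        wanted = set(tags)
        return [n for n in self._nuggets if n.tags & wanted]

# Example: answering "what do we know about printers?" without running a new study.
repo = NuggetRepository()
repo.add(Nugget(
    observation="Private-office member brought in their own printer",
    evidence="https://example.com/clips/printer-interview-01",  # placeholder, not a real clip
    tags={"printers", "private-office", "setup-friction"},
))
printer_playlist = repo.search("printers")

The point of the sketch is the separation of concerns Tomer describes: small, tagged units of evidence go in continuously, and answers are assembled later by querying, rather than being locked inside one-off research reports.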

Steve: So, it only works, you said, if you have kind of ongoing open-ended research?

Tomer: Yeah.

Steve: Why is that?

Tomer: Because if you always – so, the other type of research I had to give it a name. So, I would call it dedicated research. Dedicated research is research that you do, and you know what research questions you have beforehand, and you answer those questions. And then you can create nuggets and it’s all good. But then you’ll only have answers to those questions. When you do open-ended you have answers to questions you never imagined that you might have, or may have in the future.

Steve: What if my research question is what are the highs and lows of a WeWork member?

Tomer: So, if you do that once you’ll get a snapshot of that point in time. But if you do that continuously, all the time – and at WeWork, at some point, all the UX team members did those interviews on a regular frequency – then you have thousands and thousands of data points.

Steve: Okay. So, there’s sort of a scale here – scope or scale. I don’t even know which one it is, but there’s an amount of data that covers a breadth of topics and that is also refreshed.

Tomer: Yes.

Steve: Okay. So, what’s – I would hate if anyone asked me this question, but I’m going to ask you. What’s an insight? Because you were kind of saying a nugget is this sort of stuff, but you brought up insight. So, what’s an insight?

Tomer: I actually have a definition for that. I’m thinking about that these days. An insight to me is a deep understanding of the situation. So, you – I’m trying to think of an example. I’ll go to WeWork again. So, imagine a researcher that walks into a WeWork building, looking around, and then sees WeWork has shared open spaces, but also private offices. Okay. So, let’s say they notice in private offices that a lot of them have printers that they’ve brought in and that looks odd because WeWork offers printing. So, we have printers and we offer that as a part of your WeWork membership and if you’re going to go over you pay for more. It’s a nice stream of revenue for WeWork, and when they count it – let’s say they count and see that half of the private offices brought in their own printers. They go to a second building and a third building and they count, and they see the same. Half of the members that have private offices brought in their own printer. So, let’s say they stop here, go back to the office and add an insight, nugget, whatever we call it, to the system saying half of WeWork members in those buildings brought in their own printer. That to me is not a deep understanding of the situation. It’s very interesting. It may be indicative of something that’s going on that we’re not aware of, but that’s not enough. We have to understand why. So, I would encourage that researcher to knock on the doors and ask why. And then we may hear things like, “oh, you know you have a 15-page manual on how to install printers and I’m not going to waste my time on that.” Or, “you have to log in each time you go to a printer and that’s taking more time.” And so on and so forth. “We do steal your paper, so we enjoy that.” That’s what I mean by deeper understanding, so we can better understand the situation, know the answer to why that is happening, have some evidence, and then I would say that’s an insight. That’s a deep understanding of the situation. And to me the system that we built is going to enforce that, so you cannot submit nuggets or insights without that why part. Just facts are not enough. We don’t need facts. We need facts plus why they are what they are.

Steve: Right. There’s a behavior and what’s the reason for that behavior. I struggle when I hear people talk about insights because sometimes they talk about why a single person is doing something, as opposed to sort of why users of a system are experiencing something. Like, in your scenario, you knock on somebody’s door and say what’s up with your own printer? “It’s too hard to install.” And you come back and say – like there’s a difference between we understand why this person was doing it and then sort of the generalized conclusion. I don’t know, I’m putting my own language into your framework. It’s probably not working, but…

Tomer: That’s alright. What we would do with Polaris is gather those individual insights. So, each one would be a nugget. And then if you have 100 of these, the only difference would be the video, the person in front of the camera explaining why they brought in a printer, or whatever it is. And then you can create a playlist and show it to IT, or whoever decided that we’re going to go with this system for printing, and have them decide what they’re going to do about it.

Steve: Okay, so let me push on that a little bit. Because when you talk to say 5 people or something, about this behavior, bringing in your own printers, people are going to give related, but sort of seemingly individualized explanations. And there’s an act of interpretation, analysis and synthesis – words that we often use – to sort of say well let’s look at all of those. Like what’s the overarching reason? Talk about deep understanding. It’s not that we lose track of the fact that the installation manual is messy. There are not enough plugs. There are a lot of sort of reasons, but the larger issue is something like complexity, or not being seen as adding to my business or something. There’s a higher order thing. To me, that’s what the insight is. I think you were talking about doing that.

Tomer: Yeah.

Steve: The words are slippery here when we’re talking about insights and nuggets and sort of explanations.

Tomer: Nugget is more kind of the technical way we referred to it. But, I agree. And the way it happened in Polaris is through those playlists. So, we encourage people, both people who belong to the UX team and ones that are not, to collect these nuggets into playlists and then prove a point, or do this analysis, get to an insight and share it. So, it depends on what you find and what you collect there, but let’s say you search for printers and then you get 73 results. You sift through them and you see that 15 are not really related to what you’re trying to communicate here. The rest is too many, so you’ll pick maybe 7 where the videos are really good, like “good participants” that eloquently explain the point, and then you can add your analysis in writing and describe that higher insight, or deeper insight.

Steve: Right. So, Polaris kind of sets you up to make that interpretation.

Tomer: Yeah, yeah.

Steve: It gives you some structure or some way to quickly…

Tomer: Yes, yes.

Steve: Does Polaris capture that thing it facilitates you to make?

Tomer: What do you mean?

Steve: So, create this playlist. You watch the playlist and you come up with sort of new articulation. The biggest issue around printers for us is that we are doing this, and they are thinking that. Right? That’s a new piece of knowledge that’s created by reviewing what Polaris gives you.

Tomer: Yeah. Polaris was not smart. It’s just a tool. It would do whatever – it allows you to do whatever you want to do with it. So, it allows you to add text to do that. So, if that’s a yes then it allows you to do that. Polaris, in its kind of essence, is very simple. Just a tool that facilitates the kind of storage of those nuggets and creation of those playlists.

Steve: Yeah. So, that really helps me understand sort of what you’re aiming at when you talk about insight repository. It’s to me something you can re-query to come up with new conclusions.

Tomer: Yes. Exactly.

Steve: As opposed to sort of here’s all the things that we’ve concluded.

Tomer: Yes. Exactly. Exactly.

Steve: Okay. Wow. Here’s the GIF of the mind blowing up a little bit.

Tomer: You want more?

Steve: If you have more, yes.

Tomer: Now it’s just ideas and that’s not something I’m kind of working towards in Goldman, but I’m thinking – so imagine every company that should have a Polaris has it. That’s also a waste because then every company has its own repository and then I’m sure there’s a lot of overlap. So, maybe in the future there should be a kind of open…

Steve: Panopticon.

Tomer: And open Polaris, or whatever we call it, that anybody can contribute to and anybody can pull from. I’m not doing anything about that.

Steve: A friend of mine – so, this is like a third hand quote that I’m sure I’m misquoting, but a friend of mine told me this and he was quoting the head of knowledge management at NASA. This guy says, “the best knowledge management tool is lunch.”

Tomer: I can see where – I can see why that was said. Yet, I wholeheartedly reject the idea.

Steve: Yeah

Tomer: I mean, that’s not scalable. What if I went to the bathroom when that happened? What if I started working at NASA the day after that important insight was shared? I can understand the kind of anecdotal part of it, how it’s useful. But to me there has to be something – I don’t know what to call it – more solid.

Steve: Yeah. Analogously, you started off our conversation by describing the lunching that you’re doing at Goldman, connecting people, meeting people. That’s different I think than transmitting nuggets, but you are using sort of lunch, and I mean very vaguely lunch, time with people, talking to them as a way to – I don’t know, as a way to do what? What’s the difference between sort of the NASA lunch thing and what you’re doing with socializing and connecting people?

Tomer: I think what I’m trying to do is kind of socialize a discipline. And if I understand it correctly, the NASA lunch is to socialize an insight maybe. So, yeah, I don’t think we’re there yet in terms of socializing insights.

Steve: And who knows what the context of that quote was – it’s been quoted, and I’m requoting it. It may be exactly coherent with what you’re talking about.

Tomer: Maybe

Steve: I want to loop back to something that you talked about as part of this infrastructure. You said there’s groups that don’t have a researcher and may never have a researcher. So, what are sort of tools, knowledgebase, that can help them do things? I feel like that’s – there’s a thing in our field about who does research? And I’m not even sure what the label for that is?

Tomer: Job security.

Steve: Yeah. Is that what it is?

Tomer: Should we let them do research?

Steve: Right.

Tomer: Yeah. It makes me laugh. Of course, we should let them. As if we’re the authority. But, yeah, of course. I mean why would anyone not be allowed to do research? Because they didn’t go to school? I don’t think so. If somebody wants to do it, to me that’s huge. So, we should give them everything we can to let them do it. Are they going to be doing a bad job? Maybe. But to me bad research is better than no research. It’s a first step, and if we are good about the tools, socializing what we do, socializing best practices, things will get better. Yeah, there will probably be crap added in the first few times. Maybe the first 20 times. But if they really want to and they have this passion, then why kill it by saying that it’s our job, or something like that. So, yes, I’m all for “letting them” do research. Definitely.

Steve: I mean I think you highlighted exactly – so there’s – I think job security is a fear, but I think bad research is also a fear, as you said.

Tomer: Yeah. I’m okay with that.

Steve: And you said bad research is better than no research.

Tomer: To me, yeah. 100%. Yes.

Steve: I like how definitive you are.

Tomer: I’m…

Steve: Because that’s a hot topic, I think. I’ve heard people go back and forth on it.

Tomer: I know. I heard that too. I’m definitely on that side.

Steve: I also will say you’re describing ways to limit or mitigate bad research.

Tomer: Help make it better.

Steve: Yeah.

Tomer: Yeah, yeah. I mean first they need to know about it. What happens – there are so many people who develop, let’s say software, at Goldman. They’re not even aware of our existence. It’s not that they think about it and say, “uh, no, I’m not going to do it.” They don’t even know that this is happening, that we are there. So, I think there’s a long way to go. We need to kind of be more popular and be more known and then provide all the tools, help, guidance, knowledge that we can. Knowing that we can’t support everybody. It’s not going to happen. It happened even at Google, which had hundreds of researchers at the time and today probably a lot more. There are teams that build stuff. Why not allow them to do research to help them?

Steve: You made the comment that research was happening before there were researchers.

Tomer: Yeah.

Steve: In the history of the world that’s also true, right.

Tomer: Yeah. True. We can’t stop that or control that.

Steve: So, maybe just a slight shift. We’re sort of talking about who’s allowed to do what, or what are researchers and what do we do? You mentioned early on that you also take on an additional role. Do you want to give some context to that? And what does that mean for you?

Tomer: Yeah, I also lead a design group for a product, Private Wealth Management. It allows very rich people to manage their money. It’s a part of a service that Goldman offers. And the digital aspect of it is not the primary aspect of it. It’s just kind of a supporting role. It’s mostly based on a relationship between an advisor, a Goldman Sachs advisor, and a client. And there are the usual suspects of websites, apps and so on. I’m now leading a group of people that design that. And I mean design in the expanded way we define it. So, it’s not just designers. It’s also researchers and data people and a writer and prototyping and so on.

Steve: Are there differences between managing researchers – we talked before about people coming in with resumes that scream researchers – vs. all the different kinds of functions you’re working with on that team?

Tomer: Honestly, no. I don’t think so. I would be the first to admit it. I’m personally not the best designer. Definitely not the best in the world, and not that I could be. But, I think there are things that are similar no matter what kind of group you’re leading. It’s good that you know something about what the group is doing, but I think it’s mostly about empowering the right people, giving them what they need, relieving them of things that are just stupid, that they don’t need to do, and they don’t need to be involved in, and focusing them on what they are passionate about. This doesn’t have any direct relationship with design or research or whatever it is.

Steve: Yeah. There’s an email list that I think you and I are both on that’s about design and user research. And there was a thread, or maybe people were having a conference call. I can’t even remember how it manifested, but the topic was researchers managing designers, which seems like it’s a newer thing. If you look historically, like research was sort of the accessory or adjacency to design, so design teams kind of managed researchers. But as research has grown there’s other people in the situation like yours where their label for themselves would lean more towards researcher, but they’re managing designers. So, it’s interesting that you sort of don’t see a difference because I feel like the thrust of this group needing to talk was hey, there’s something different here and so how are we going to deal with it?

Tomer: I know I have kind of my internal bias toward research. So, I’m probably more attentive, mostly, to when that is not happening – maybe more than a person who is a designer managing designers would be. But I’m just guessing. I don’t know. I know that I’m definitely – I care about research and I notice and say something when it doesn’t happen. I don’t know – does it have to be a designer? Designers need to know if that’s their thing.

Steve: I think I might switch gears entirely here.

Tomer: Go for it.

Steve: I’d love to just go way back, like as far back as you want to go and maybe give the story of things you did in your life to kind of get here, whether those are work or school or other things?

Tomer: That got me here?

Steve: Yeah. What’s your sort of background, or your narrative arc, if you will?

Tomer: So, I’ll tell you something from a long time ago that probably really was the tipping point, though I wasn’t even aware at the time that it was the tipping point. I mean it’s not a secret I’m originally from Israel and I relocated, it was 12 years ago. And while in Israel I served in the Army. I signed up for a career in the Army. So, the whole shebang. I was going to be a career officer for the long term. But then when I was 24, long story short, I was a paraglider. And I took a course and I got injured badly and I was out of the Army for a year. I was at home recovering. And that was a bad thing. Bad injury, but that opened my eyes. And during that year I came back to the Army and said I want to cancel the whole thing; I don’t want to stay; I want out. I will do my next job because we planned for one more job, but that’s it. I’m out. And that’s what happened. I think without that accident I would probably – I’d probably be retired by now, but I would be a career officer and not what I am today. And kind of looking back, I’m happy that that’s what happened. I would say that’s the biggest thing that affected what I’m doing today.

Steve: So, opening your eyes was realizing that you didn’t want to go on the path that you were on. Were there any hints for you of what path you did want to pursue?

Tomer: I knew it was creative. I was in a wheelchair for four months, then on crutches, and then learned how to walk again. And once I was able to get up, I made my way to a local artist who gave very open-ended lessons in his basement, a couple of blocks from my house at the time. So, I started painting and tried all kinds of ways to paint. I didn’t know to say that I would be an artist, and honestly, I wasn’t really good at it. But I knew it would be something creative. I didn’t know exactly what.

Steve: If you look at your work today, does it match that?

Tomer: Um, not 100% overlap, but I feel some of it is, yeah.

Steve: I mean, I have wrestled, mostly privately, with just the idea: is research a creative field? Or are we creative?

Tomer: Some of it is.

Steve: I found myself in a collaboration with people who I think more traditionally fit that job description, and I kind of had my hair blown back, just by the speed and breadth of making stuff. It was definitely intimidating.

Tomer: So, I would call myself a researcher, but still I was heavily involved in shaping Polaris. That’s a product; it’s not research work. I’m now involved in creating a system – I also lead a small team of engineers who are building a system to measure the user experience. So, that is definitely more creative than maybe research. But some parts of pure research are creative. To me, probably the biggest one is translating a question, or a set of questions, that a team has into okay, what are we going to do to get answers? If it were that easy to come up with an answer to that, then anybody could do it well. That’s not the case – a lot of people have a lot of trouble with that part. So, I think that’s a creative part. You’re not going to see a beautiful painting coming out of it, but it is creative.

Steve: Right. I think, for me, creating the new story out of a bunch of experiences or nuggets, or whatever you’re pulling from…

Tomer: Realizing, getting to an insight. Yeah.

Steve: So, what did you do after the art class? What do you end up doing?

Tomer: I applied to what we would probably call today a visual communications program. I got accepted to one of the best ones, if not the best one in Israel at the time, and at the last minute decided it wasn’t for me. And then I studied copywriting. So, I’m a certified copywriter – in Hebrew, though. And then I took my first job – or, I worked. I didn’t really know what I was going to do, so I did something that I knew how to do, which was working in a very small consultancy for military-oriented industries. I was a project manager. I did that for I think 3 years. That was the time where I learned all these things and really set my mind that anything army-related was not for me. And then my first real job in that direction was – funny how names were at that time – I was an internet copywriter, which we would probably call today a content strategist, for a website that I would compare to Monster or Indeed, or something like that today. And there I got exposed to – probably at the time it was Jakob Nielsen and people in that area. I started reading more. I did some of what we would call today product management work there. And then I asked them to switch to what we would call today a researcher role. They said no. And I was like, okay, and I looked for a company who would take me. And there was one company that took me as – again, it wasn’t researcher. It was called a usability something. And that’s it. That’s how it started. They were very brave, I should admit, because I didn’t know much.

Steve: But they took you as a researcher?

Tomer: Yeah.

Steve: So, that was sort of your first time with the title.

Tomer: Yeah. You want an even funnier story? The only person who actually knew what it was was the CEO. He was the one who interviewed me. And then he hired me, and a month later he decided to leave. So, the only person who really knew what I was doing left about a month in. But that went well. That’s how it started.

Steve: What was the point at which you came to the U.S.?

Tomer: Um, so, I had a couple more jobs and I realized very quickly that, at the time at least, there weren’t enough opportunities – or any at all – to grow into managing a group or a team of people who do that. I was always the only person in the company who did that. And I came to the conclusion that the only place for me to work in a company that had a lot of people of my tribe would be here in the States. I also realized the army there is not like the army here. I didn’t have any academic degree, and I realized that I needed the right degree because these companies that I was thinking about would not even read my resume. My resume didn’t scream researcher. So, I went to school. I continued working and went to school. I completed my Bachelor’s and then applied to Bentley University here and then moved. And while I was studying there, in Massachusetts, I contacted Google and things rolled from there.

Steve: We talked about WeWork a little bit. Can you – like what was the – what was your role at Google and maybe what was your role at WeWork?

Tomer: Google, I was a user researcher, a senior user researcher. First in advertising. They’ve changed all the names by now, but at the time it was DoubleClick for Publishers. That was the product I was involved in. Ironically, I was the only researcher in that group of hundreds of people. And after, I think, 2 ½ years I transferred to Search. In Search, I was first on Voice Search and then on what we called, at the time, Core Search – the bag of 10 blue links and how they developed from that point to what you see today, with all the visual aspects of the results and so on. So, that started back then, vertical by vertical – TV, movies, music. I was doing a lot of research into search results for sports. I know a lot more than I should about all kinds of sports, cricket and so on. Yeah.

Steve: And then what was the role you took at WeWork?

Tomer: WeWork, I was head of user experience. So, I started a group from scratch. The goal there – and this followed several conversations with the CEO and co-founder, who hired me – at the time, and I’m sure things have changed since, WeWork had three big groups internally that built or created the three aspects of the WeWork product. They were called digital, physical and community. And Adam, the CEO, felt – and he was very right – that while each group was doing a great job, if you look at how things are from the perspective of the customer, the member, there are gaps between those groups that we weren’t even aware of. And the goal was to identify those gaps – to me that translates to research – and then solve the problems there. I’m thinking of an example. Think about conference rooms. Conference rooms are something that WeWork offers; members pay for them. So, somebody physically designed the conference room. An architect decided on the size and location. An interior designer decided on the mood and what would be in the room. Somebody from IT picked an AV system for that room. Somebody in digital developed a system to book the room. Somebody in community designed a policy for how to use the room. And community team members in the buildings enforce this policy. Everything is good, everything is working well, but then situations happen. Such as: members come to a meeting in the room they booked, and another member is squatting in the room and refusing to leave. Or they walk in and realize – I’m a startup, I booked a room for a meeting with a potential investor, and I see a room that is designed as a music room, with bean bags and no projector, clearly inappropriate for my meeting. Or, about that person who’s squatting – I go to the community manager, but they’re dealing with water leaking from the ceiling onto another member’s head. They’re all very nice, but they can’t solve my problem right now. So, this is what I’m talking about when I say the gaps between those groups. We were trying to identify those gaps – because in many cases we didn’t even know about them – and try to solve them. That was the premise back then.

Steve: You’ve written books. You give a lot of talks.

Tomer: Less now.

Steve: You have a good sort of history of creating material. You’ve interviewed a lot of people. People listening, what would you send them to, to buy, read, watch?

Tomer: Well, our publisher would be happy if I say that people should buy my book – yours too. The name of that book is Validating Product Ideas Through Lean User Research. I would only mention that one, but my first book is for researchers, and a lot of researchers do listen, so maybe I’ll mention both. The second book, the one I mentioned, is a step-by-step guide to answering different research questions that people have. Each chapter is a research question and, step by step, how to answer it through research that anybody can do quickly. The first book is called It’s Our Research. It’s meant to solve a problem that, at least at the time – I now have different thoughts about that – a lot of researchers had, maybe today as well, and that is that a lot of people don’t want to do research because they feel they have the answers already, or they have good intuition. And also, once they agree to do research, in some cases they don’t want to act on it.

Steve: Wait, act on?

Tomer: On their research results.

Steve: What they’ve learned? Yes.

Tomer: So, that’s a book that’s supposed to help with that problem. And I’m kind of having different thoughts now, because my answer in that book was – and that’s why I called it It’s Our Research – make them feel that it’s their research as much as you feel it’s yours, and then they will want it and they will do something about it. That was my point. Today, I’m having different thoughts about how to get to a point where research is wanted and acted on. And I will try it at some point; we just need to grow a little bit at Goldman. But my thought is – and I posted something about this recently – when you plug your charger into the wall, do you really care how electricity gets there? What’s happening in the power plant? Why it’s working? Whether it’s efficient or not, and so on? You don’t really care. You want your phone charged. My thought about research is: why wouldn’t it be the same? People have questions. Research provides answers. Yes, a lot goes on to get to those answers, but if you have a question, why do we need to bother you with all the details? Just trust us to do the thing, and we’ll give you an answer to your question. I’m going to try that at some point. I’m thinking – it’s not political, but – building a wall between stakeholders and researchers. That wall could be Slack or something like that, through which stakeholders ask questions and researchers provide answers. If we have an answer immediately – if we have a system like Polaris or something like that – we can provide the answer. If we don’t, we will ask a few follow-up questions, then do the research and get the answer. Just thoughts. I haven’t tried it yet.

Steve: Which makes me think of the research ops piece a little bit where – like building up participant recruitment infrastructure…

Tomer: Yeah.

Steve: …is an interesting one, because back in the old days when we had to do everything ourselves, you were learning about your problem by recruiting – by figuring out how to recruit. You also learn about your problem by dealing with your stakeholder and seeing what – it’s that art piece vs. this kind of process infrastructure piece, and it’s interesting to think about what is lost and what is gained, or what is changed, when you create infrastructure. If you’re a researcher and you’re completely decoupled from participant recruiting, that may change how you deal with the people that you meet, or how you deal with framing the problem. So, for everything where we build up a process – that efficiency, that kind of query system – how does that change what we do? And who is coming to this field?

Tomer: Yeah.

Steve: These are not necessarily my own thoughts, but just things I’m hearing from people as well.

Tomer: One other thing that I would send people to is a series of Medium posts that I’ve published in the past year, maybe less, about measuring user experience. A lot of people like to talk about metrics these days. I took the HEART framework from Google, and there’s a post per letter, about happiness, engagement, adoption, retention and task success. And for each one: what it is, what’s important to measure, why, how, common mistakes, and what actions you can take from it. So, this is something that I’m interested in these days, measurement. And I’m trying to figure out the “H” part, the happiness part, specifically. There are a ton of challenges with that – how to measure satisfaction and happiness. I’m also posting – not tracking my own life, but paying more attention to when I’m exposed to requests to rate satisfaction and happiness, and I share them with people, with my thoughts about them.

Steve: Okay. Great. Anything else that we should talk about in this conversation?

Tomer: I said I am speaking publicly a lot less, but I do still do that from time to time. I’ll be speaking at two conferences: the Face of Finance in April in New York, and User Research London in June, in London.

Steve: Alright. Well, thanks for taking the time to chat and sharing all the information and stories and everything. I really appreciate it.

Tomer: That was fun. Thank you.

Steve: Thanks. And so concludes another episode of Dollars to Donuts. Follow the podcast on Twitter, and subscribe to the podcast at portigal.com/podcast, or iTunes, or Spotify, or Stitcher, or anyplace you get your podcasts. Also online at portigal.com/podcast is the transcript and links for this episode (and of course all the previous episodes). At Amazon and rosenfeldmedia.com you can buy Tomer’s books and my books. Our rocking theme music was written and performed by Bruce Todd.

Mar 24 2019

1hr

Rank #7: 27. Colin MacArthur of the Canadian Digital Service

In this episode of Dollars to Donuts I chat with Colin MacArthur, the Head of Design Research at the Canadian Digital Service. We talk about bureaucracy hacking, spreading the gospel of research throughout government, and embedding researchers in complex domains.

Often the idiosyncrasies in people’s research and the sort of surprises that don’t fit within the template are the most important things that our researchers find. – Colin MacArthur

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

I just read the 2011 book “It Chooses You” by filmmaker and artist Miranda July. It’s one of the best books about ethnographic research that isn’t really actually about ethnographic research. In the book she describes a period of her life where she was creatively stalled in finishing the screenplay for her film “The Future.” As a way to either get unblocked or just avoid what she should be working on, she develops another project, to call people who have placed ads in the free classified newspaper the PennySaver, and arrange to come to their homes and interview them.

She reports on each of the interviews, including excerpts of the transcripts, and some amazing photographs. The interviews are sort of about the thing being sold, but because she’s getting outside of her cultural bubble, she takes a wider view, asking people about a period in their life when they were happy and whether or not they used a computer (since even in 2011 a newspaper with classified ads was a relic of a previous era).

These interviews are confounding, hilarious, disturbing, touching – everything you’d hope. And July is honest about what makes her uncomfortable, about her own failures to properly exhibit empathy when it’s needed, or her challenge in exercising caution in some dodgy situations while still being open to connecting with strangers. She incorporates her feelings about her own life as she hears from people about their hopes and their reflections back on their lives, lived well or not so well. She articulates her own judgements about the people she met and how that informs her current thinking about her own life and her aspirations for her future.

In one chapter she meets Beverly, a woman with Bengal leopard babies and birds and sheep and dogs. Beverly was clearly excited for Miranda’s visit, and prepared an enormous amount of fruit-and-marshmallow salad, which neither July nor her crew want but accept out of politeness, eager to get away from Beverly and her home – then they head straight to a gas station and throw the marshmallow salad in the trash, covering it up with newspaper in case Beverly stops by. Reading it, I felt my own judgement of Miranda for her judgement of Beverly, but I can imagine doing the exact same thing in a similar circumstance, and I appreciate July’s ability to observe her own judgment and infuse it with compassion at the same time. Ultimately, she views her struggles to connect as her own personal failure, saying “the fullness of Beverly’s life was menacing to me – there was no room for invention, no place for the kind of fictional conjuring that makes me feel useful, or feel anything at all. She wanted me to just actually be there and eat fruit with her.” In articulating something so nuanced and personal, we learn an awful lot about Miranda July as well as all the people, like Beverly, that she meets.

I can’t believe it took me this long to finally read this book, and I can’t recommend it highly enough.

Through Dollars to Donuts, I’m gathering stories about maturing research teams, looking for best practices, emergent approaches, and insights about organizational culture. This is of course highly related to the services I offer as a consultant. In addition to leading research studies in collaboration with my clients, I also help organizations plan and strategize about how to improve the impact research is having. Whether that’s working as a coach for individuals or teams, or running workshops, or advising on best practices, or leading training sessions, there’s a number of different ways you can engage with me. Please get in touch and let’s explore what the ideal collaboration would look like.

You can email me about this, or with feedback about the podcast, at donuts@portigal.com.

Coming up soon is the first Advancing Research conference. Rosenfeld Media is putting on this conference March 30th through April 1st 2020, at the Museum of the Moving Image in Queens New York. I’ve been working on the curation team helping to assemble the program and it’s looking like a really fantastic event. I’ll put the URL for the conference in the show notes. You can use my discount code Portigal dash ar10 to save 10 percent on your registration. I hope to see you there!

All right, on to my interview with Colin MacArthur, the Head of Design Research at the Canadian Digital Service.

Well, thanks so much for being on the podcast. I’m really happy to get to speak with you.

Colin MacArthur: My pleasure. Thanks so much for having me.

Steve: All right, let’s just start with who you are? What role do you have? What organization do you work for? I’ll just throw all the launching stuff onto you and we’ll go from there.

Colin: Absolutely. My name is Colin MacArthur. I’m the Head of Design Research at the Canadian Digital Service. We are an office of the Treasury Board of Canada that works with departments across the Canadian federal government to improve the way they serve their constituents. We do that on my team using design research and by helping people inside the government get closer to the folks who they’re trying to service. We do that in partnership with designers and engineers and product managers and policy experts on interdisciplinary teams, but the perspective that the researchers bring is a hard and fast focus on the people who we’re trying to serve and what their experience with our services is like.

Steve: Why is this under Treasury?

Colin: (laughs) Good question. The Canadian federal government, or government of Canada is organized with departments and then with a number of central agencies. And Treasury Board is one of those central agencies. Its name is a little odd in that it’s not actually the department of finance. It’s not the Ministry of Finance. It plays a central role in sort of managing and consulting with other departments about how they run themselves. It’s a management board of government. And so we are in kind of an interesting place because from Treasury Board we have a view of lots of interesting things happening across government. We’re positioned kind of naturally to give advice and also learn from departments that we work with and then to work across lots of different departments in a way that would be a little more unusual if we were nestled inside a dedicated department itself. So, Treasury Board is sort of one of the central agencies of the government and that’s why we’ve ended up where we are.

Steve: Is that where the Canadian Digital Service originated?

Colin: That’s exactly right. So, we’re relatively new. Founded just a couple of years ago and we started right in the same place we still are, inside Treasury Board, reporting to Canada’s first Minister for Digital Government who was also the President of the Treasury Board.

Steve: What’s the relationship between digital service and, I don’t know, what would you call like regular service. Like the things that government does. Because you included policy in kind of the mix of people that you collaborate with. So, everyone else you listed seemed – design, engineering, research, PM – seems very – yeah, this is how sort of software is made. How digital services are made. But policy – and this is from someone outside government, so maybe it’s a naïve question, but policy just sort of begs the question for me, like oh what’s digital versus just services?

Colin: Yeah. What a good question. I think the way we choose to look at it is we’re interested in improving services, period. So that means the elements of those services that are online, but also the elements that are offline and drift into paper processes and drift into things that are more typical policy problems. But in this day and age it’s pretty hard to have a meaningful discussion about service in general without talking about the digital pieces of those services. So, when we put together teams to work with departments we absolutely come with a digital emphasis. That’s one of the strengths that we can bring. That’s one of the pools of expertise that we have. But we’re just as interested in looking at the non-digital sides of that service. And in reality, they all fit together and attempts to kind of separate them out into the website and the paper part are often pretty hard and don’t end very well for the people we’re trying to serve. So, we kind of view ourselves as tied up in both and we try to staff our teams with expertise that allows us to do that. That said, our name is the Canadian Digital Service and I think often our entrée is our digital skills. But we try to be more than that. We don’t think digital problems can really be solved by just looking at the technology piece.

Steve: That seems to me analogous with so many efforts to do transformation or introduce new services or new products – you know private and public sector around the world where – I mean you’re kind of hinting at the power of the like the construct of digital. That it invites maybe a different mindset in approaching a problem that can exceed beyond the boundaries of what is digital. Or, as you said, they’re intertwined.

Colin: Yeah, absolutely. I think we view the computer part as only part of the digital mindset and approach that we try to bring. I think departments appreciate that. I think that we are often able to look at a problem from different angles because we’re not just looking at the IT systems involved. We’re looking at all the pieces related to the problem they have.

Steve: So, what is your role specifically?

Colin: My role as head of design research is to first of all lead and coach the researchers on our team. So, I help them as they’re embedded in product teams chart the direction they want to take with their research, improve the quality and the quantity of their research. Make sure their research is kind of fitting into the product lifecycle in the ways that we hope. And also make sure they’re developing as professionals. We are happy that folks don’t just come here to build services. They come here to build skills and we do that with our researchers, like with all of our other disciplines. So, helping and supporting the researchers is one big piece of my job.

Another is to convene the government-wide community of practice around design research. So, we know that design research happens in places other than CDS. We know that research is a tool that lots of other departments are using. But what I noticed when I first arrived was that there were very few conversations between people doing research in different departments, or even within a particularly large department. So, I use my role and my limited additional time to convene folks across the government of Canada who do this kind of work, to talk about the challenges of doing it and about ways that we’ve worked through some of those challenges. The joy of that is getting to see all of the sometimes hard-to-spot places where people are doing interesting things – taking interesting methodological approaches, having new discussions – that aren’t visible if you don’t get them all in a room and get them to start talking. So, convening that government-wide community of practice is another key part of the job. And related to that is trying to create some tools and some policy change that helps that whole community move forward. We try not just to talk about what the problems are and swap stories on the solutions. We try to learn from that community and then use our position and knowledge to nudge forward policy changes, or changes to government practice, that can help researchers across government be more effective.

Steve: Do you have an example of a policy change or process change?

Colin: Sure. So, one of the central pieces of research regulation in the government of Canada is something called the public opinion research rules and guidance, known as POR. I’ll try not to use that acronym, but public opinion research is this kind of class of research that the government has a process for managing – very thoughtfully and deliberately over a relatively – I don’t want to say slow, but certainly a longer timescale than most user research or design research would happen on. And so one of the key areas of confusion we saw when we started doing design research with partner departments was wait, can you really do this without going through the whole process for public opinion research? Isn’t what you’re doing just public opinion research? Why are these things different? Why should we believe you that they’re different?

And so we went to our colleagues in another part of Treasury Board who actually own the policy on public opinion research, and we said to them, “look, these are the kinds of things we do. These are the kinds of questions we ask. If these are the kinds of questions involved, is this really public opinion research?” And their response was, “if that’s really all it is, then probably not.” And what was great was that we could then work with them to write a clarification to the guidance around the policy – which really just meant updating a webpage to have a section that talks about how POR relates to user or design research – and further explain to departments that they could be doing user or design research that didn’t need to fall within this typical public opinion research cycle. And that kind of guidance is so important to loosening structural challenges to doing research across government, right? That kind of guidance from the center is what helps people trying to do research at the edge of their departments make the argument to their manager that this isn’t quite as risky as they thought it might be.

Steve: The perceived risk is in doing the research? Is that what you mean?

Colin: Exactly. And I would say risk is probably a strong word. People say “well we have these rules for doing public opinion research. I don’t really know about any other kinds of research, so you better follow these rules.” And I think that what we’re trying to do is say “well there are actually some other kinds of research that are useful and not quite the same thing”. And think about those and realize that maybe what you’re trying to do falls within those umbrellas and thus doesn’t need to go through the same process and make management and the executive layer of departments a little more comfortable with that fact.

Steve: I think it was implicit – and maybe you said this explicitly – that the kind of processes and procedures that would be required to do “design research” are less onerous than those required to do public opinion research. Is that correct?

Colin: That’s right. I think that public opinion research is often interested in policy- and government-level questions about public opinion, right? And design research is often focused on questions like: what’s it like to interact with this department or this service? Those are different things that require different methods and different rules of evidence, and so what we try to do is keep people from getting them mixed up with one another. Now, I think if my colleagues from other parts of Treasury Board were here, they would remind us that there are some forms of design research that can verge into typical public opinion research, and then it’s important to engage the public opinion research process. But there are also lots of things, like usability testing of a new service interface, that are clearly not in the realm of public opinion research and that we’re therefore really happy to encourage departments to do.

Steve: So, in this convening that you describe, sort of finding these, you also mentioned that there are maybe hard to find – I may be putting a word in here – surprising…

Colin: Yeah.

Steve: …sort of areas of research. How have you come to find those people and that work?

Colin: Well you know it’s funny Steve. I think as we were setting up the community of practice, I realized that recruiting participants for research was kind of the best practice I could have for recruiting members of a government design research community of practice. So, like when you’re recruiting people for research, you put out a call for the kinds of folks you’re interested in, but you also – you snowball, right? So, we started with people that we knew and we said hey, do you know anyone else that does this kind of work, or is interested in these kinds of questions and do they know anyone else and do they know anyone else. Often the third or fourth degree from there, you get to folks who are like, “oh, I didn’t even realize what I was doing was design research, but it is and I’m excited to find this group of people who’s all trying to do something similar.”

Steve: So, by bringing these people together – some of whom wouldn’t even have identified with the labels that we would put on what they’re doing – and creating a chance to share and improve practices, this seems like it spreads far beyond what design research within the Canadian Digital Service alone could reach, right?

Colin: Right.

Steve: The reach is much broader.

Colin: That’s absolutely right and that’s the reason why we did it, right. So, our mandate isn’t just to build services with departments. It’s to continue to improve how the government delivers service more generally. And one of the ways we do that is through finding the folks involved in doing what we do and trying to enable them, right? Give them more tools, whether they’re policy tools or methodological tools. Whether it’s – give them a space to vent or give them a space to celebrate. One of the hard things, I think, about pushing something like design research within government is it’s – it can be hard to find a place where there’s a group of people like you who are really excited about the kind of work you’re doing. And so I think there’s some practical benefits of the community and there’s also some sort of emotional support that happens in the community of practice that’s really heartwarming.

Steve: Are there ways to find – this is maybe the counter example, or the other part of the set – so that teams or departments or groups that should be doing this, that maybe don’t know that it exists, or don’t know that it’s accessible to them, that it’s reasonable or feasible, but would help them in their efforts to improve the way that they serve the people that they’re serving? How do they fit into – I mean it’s kind of a boil the ocean question since you’re already finding everyone that is doing it, but sort of opportunities to build the practice, whether you all are providing that, or you’re enabling them the way you’re enabling the people that you’ve convened? How do you think about that?

Colin: Yeah. It seems like your question is really: what about the people who aren’t doing research and should be? And they are also an important group. So, I’ll say a couple of things about them. I think that most major departments do have some group of people trying to do research under some name, right? They’re trying to do client experience improvement, or their digital transformation group has a group of user-centric specialists, right? So, there are people trying to do something at least related to design research. I think what we try to do is find those folks and then introduce them to the body of knowledge that’s specific to conducting research. We believe research is a craft in and of itself, and it’s hard. It takes some work, but it’s also accessible and something many folks can learn. So, with that in mind, we try to locate the folks who are working in generally related areas and inspire and equip them to do this kind of work, or to continue to improve how they do it. We talk about the challenges to not only design research, but digital best practice more broadly, at multiple levels. One of them is the policy or structural level – things like our POR guidance clarification, the public opinion research clarifications, certainly help folks, but they’re not enough, right? Folks also need the skill to do the work, and they need opportunities to learn. We try to create lots of informal opportunities to see what other people are doing. And then the bottom layer, beyond just policy and skill, is, for lack of a better word, inspiration. Showing that it’s possible, right? Showing that this work can happen within the government, within all of the unique factors of the government, and that it’s useful. So I think we try to not just give folks on the edge some policy tools; we also try to expose them to the skillset, and we frankly just try to show it’s possible and continue to cheer them on as they push it forward.

Steve: That’s good. I think we’ll probably come back to the ways that you’re working with teams and departments, but I want to go back to one of the other things you said early on. You were kind of describing the two main things that you’re involved in. We’ve talked about sort of convening the community of practice aspect here, but you also talked about leading and coaching researchers. You were really emphasizing building skills was almost a core value. Like this is a thing that you’re really thinking about being an outcome for people that work in research. Where does that come from? That is not a universally held belief I think in groups of researchers.

Colin: Yeah, well I think part of it is born out of the broader mission that we’ve been discussing, right? We are not just here to do research that helps create better services. We’re here to help the gospel of research spread around the government. And in order to do that I think I need to care both about continuously building the skills of our staff and about taking those same tools and making them available across the government. For me, it would be hard to say, “ah, broader government, you should be building your skills in this way” if I wasn’t also practicing that as a leader in our own community. And I’ll say we’re not just interested in helping researchers within our group build their skills; we’re interested in them learning how to teach other people about research. So, I think we’re trying to build an empowering and enabling and teaching ethos into the folks who come into our group, which makes it easier for them to interact with our partner departments and to have useful conversations across the design research community of practice. So, from a mission angle, that’s why this has been a central part of how I view my work. I also just fundamentally believe we can all continuously become better researchers, and that one of the ways to do that is focusing on skills building as a continuous improvement approach, not as a set-it-and-forget-it professional development activity.

Steve: There’s often service design happening, of various kinds, without there being research happening. Even though we should all clutch our pearls at that idea, it still happens. So, at some point research is given a title, given a mandate. Someone like you is involved. Can you talk a little about sort of your role and your own trajectory? Where it came from and how it got to where it is today?

Colin: We’re relatively new and relatively young. I think we are blessed to have a dedicated research group and research leadership, given our size and our age. So, how did we get there? When CDS was a young and scrappy startup of a handful of people, I think the roles were not as clearly defined and people did whatever they could do to help the partnerships move forward. And luckily one of the skills in that mix was research. There were people there who thought that design research was important and who liked going out and talking to people about services. So from the very beginning that was part of how we did our work. And as we grew, and had to start dividing into more discrete communities of practice and being a little clearer about what people’s roles were, I don’t think there was ever a question that research was an important dedicated skill. It’s very related to design, and so we’ve gone through some iterations of figuring out exactly how we relate to design. But right now we’re a parallel community at the same level as design or product management or engineering. And the more we exist that way, the more we like it, because researchers need different kinds of support and coaching than other folks do, right? Research is a really different skillset than designing, at a service or an interaction or even a visual level, right? And so I think our researchers are really happy to have a dedicated group where they can get feedback on their craft and have managers that are rewarded and selected not just for their expertise in general design, but for doing research in this world. So, I think we’ve been happy with how that’s shaken out. And I think we’re just lucky that research was part of CDS’s way from the very beginning, down to the people who founded the team.

Steve: Did you come into CDS in the role that you have now?

Colin: I did. So, I came to CDS with experience at a similar organization in the U.S. federal government. And when I arrived it was to lead the design research team with a recognition that that was a distinct team with sort of distinct support needs and a need for a manager that knew research and knew about doing research in government. So, I know that that’s not always true and so I feel very blessed to have ended up in a situation like that.

Steve: So, there was a team, or a nascent team, and that team needed leadership?

Colin: Yes. It was a small team at that point. I think there were three of us when I arrived, and we’ve grown substantially. I think we’re now 10 people or so, and continue to be a part of the growth plans for the org.

Steve: So those 10 people – you sort of described early on about how they would be working closely with a specific department or team that they were kind of part of.

Colin: Yeah.

Steve: Maybe you can talk about what that cycle looks like or sort of how projects or jobs or roles and researchers are matched up over time. What does that look like?

Colin: Sure. Yeah, so researchers are embedded on interdisciplinary teams. We call them product teams. And those product teams work with partners through some phases. The first phase is a discovery phase, and that is a phase where researchers really lead in open-ended research about the nature of the problem that the department is trying to solve. We usually get set up with departments who know they have an issue and are interested in our different approach to things, or they have a goal and they’re interested in us helping them achieve it. But researchers lead their teams in unpacking what that means, particularly for the humans involved, right? There’s important technical discovery that happens in parallel, but we really try to get the whole team involved in talking with both the members of the public implicated by the service and the staff involved in delivering the service. You know, government services are this wonderful sociotechnical network of people, and we try to understand how those all fit together in discovery. It’s very rare that we would just talk to the public. We do spend a lot of time doing that – we really emphasize it – but we also try hard to see the other side of the service and how all those people work together to create the experience the public sees. So, that’s discovery, right: getting the lay of the land and understanding perhaps some of the possible roots of the problem.

And then a team transitions into alpha and beta phases of building something. What that is varies tremendously based on what the problem is. I think one of our product teams with Immigration, Refugees and Citizenship Canada did a bunch of work on a letter that was involved in a rescheduling process. That was sort of classic paper content design, tested by a researcher. It happened in conjunction with some digital work they did, but was just as important. So, that could be part of an alpha or a beta, or it could be about building a new digital service, like we did with Veterans Affairs Canada, building a new directory of benefits for veterans. Those products take different shapes, but regardless, the researcher is trying to bring the voice of all of the people we talked to in discovery back into the product development cycle. And so that can mean usability testing, or content testing. As our products get more mature, we also use quantitative methods. We run randomized control trials. We try to use analytics on things that are out in the wild. Researchers use all of the methods at their disposal to try to keep bringing the voice of the people we’re serving into the product process. And what that looks like, as I say, just varies tremendously, and we kind of like it that way. I think that’s one of the cool parts of being a researcher at CDS: we say, well, you have this team, they have their needs, you need to serve them and make sure they have the right information about people to make good product decisions, but there are lots of different ways we can get at those answers, and we agilely – probably to overuse that word – assemble the methods and the timelines accordingly.

Steve: So, a researcher working on, for example, the Veterans Affairs, for the duration of that program that’s the thing that they’re working on?

Colin: Exactly. Exactly. So, we’ve been really fortunate to basically have one researcher per product team and to be able to keep them on the same team. Which – and sometimes that’s not possible, but you know research, as hard as we try to not do this, sometimes becomes a practice of implicit expertise, right? I think researchers who have a long history in an area, or with a product, kind of know things about how people will respond to the service that are hard to articulate or sort of systematically articulate in reports or the other ways that we try to codify that knowledge. And so we see huge benefit to people having some longevity in the product teams and thus in the domain that they’re working in. Not always possible, and I understand not possible everywhere, but for us I think researchers really enjoy kind of building a deep expertise that comes with doing 10, 15 or 20 studies for a particular product in a particular area.

Steve: Depending on the type of organization, the carryover from one product team to another – I mean in my mind government is sort of an example of a category where there’s a lot of different things that you’re doing.

Colin: Yeah.

Steve: And obviously there’s cultural things and organizational things. Whereas if you’re working in some commercial enterprise that maybe makes a lot of different products and maybe serves different customers, the breadth might be less.

Colin: Yeah.

Steve: So the value, I guess, if I’m doing the math in my muttering – the value of that sort of hard-won knowledge – maybe it’s necessary to preserve or cherish it in a different way, just given the breadth of what you’re doing.

Colin: I think you’re right. And one of the fun things about getting to support these folks is that I can have conversations on a given day that range from how the Royal Canadian Mounted Police want to handle cyber crime, to how the Canada Revenue Agency processes tax returns for low-income people, to how Employment and Social Development Canada issues benefits, right? Those are incredibly different business processes and missions, and they also involve very different people and very different kinds of concerns on the parts of those people. So, it’s pretty neat to get to hear about all of that, but it also creates a challenge: when we start on a new team, or a new researcher comes to a new team, there’s some basic government knowledge they’ll bring with them, but often they’re trying to come up to speed on a pretty complicated area, pretty quickly.

Steve: When you described earlier your role is to – you know in that kind of coaching relationship you have with the different researchers – but it sounds like you’re the one that has the overview or has the window into these different product teams and what they’re doing. Are there researchers – what kind of interaction do they have that isn’t through you necessarily about what each is working on or what kind of challenges they’re facing?

Colin: Yeah, absolutely. I am – one of the things I learned very early on was that I was not the best conduit for all of that information between them and each other. And so I increasingly view my role as creating opportunity for them to share directly with each other and across the org. So, what that boils down to – there are first of all researcher standups, where we talk about what research we have done, every week. And we try to focus on not just kind of what we did, but what we’re learning. And that’s usually not enough to create understanding, but it’s enough to create a hook, right. It’s enough for one person to say, “oh, that person is really talking about the experience of submitting a form that’s an application for benefits and that’s actually kind of similar to what we’re doing, so maybe we should go have a more deep conversation, or I should ask for some documentation.”

We also have rotating, dedicated research critique groups. Researchers form groups of 3 or 4 and meet weekly to give each other feedback on their work. It’s a little trickier to give critique on research than it is on design artifacts, but it’s just as important. So, they’ll go through research plans with each other. They’ll go through recordings of interviews and talk about the approach the researcher took and different ones other people might have taken. They’ll go through reports and deliverables. And although the stated purpose of those sessions is to help people grow and share skills with one another, I’ve observed that a real common output is better cross-product knowledge between them, right? If you’ve spent the time really thinking about the pros and cons of someone’s research approach, you tend to understand their product a bit better. So we have critique groups.

And then one of the other kind of structures we have at CDS is these research community meetings that are actually open to everyone from the organization. So, every other week we host these 45 minute kind of brown-bag, lunch style meetings where researchers give talks. They give talks on either their product work, when they’ve recently finished it up, or on their – or on new or interesting kind of methodological or government logistical things that they’re working on. So, we try to create lots of channels for that information to flow. I think that we’re still kind of a size where some of those meeting and interaction driven approaches work. As we grow, I suspect we’ll have to get more creative.

Steve: Right, critique circles for ten is different than doing that for 80, or brown-bags. Which is crazy that I would throw that number around and everyone would nod, like yeah, that’s the size of research groups in some organizations now.

Colin: Sure, sure.

Steve: It’s not that long ago that that was an absolutely ridiculous idea.

Colin: Absolutely.

Steve: So, at ten, you can have some kind of communal knowledge just based on – well, you’re putting formal things in place for sort of semi-formal knowledge exchange.

Colin: Yeah. Yeah, that’s right. I think that we’re certainly beyond the totally informal everyone talks over lunch about what’s happening size. I think we are at the edge of what works well for critique groups and sort of community meeting exchanges. But again, it’s a little trickier for us because our domains are so different, right. And so it’s harder to say we’re building a common understanding around the user of a particular product that we all study. Often, we’re studying very different things and there are things to learn across them, but there’s also real differences between them.

Steve: I’m going to switch gears a little bit. I think it builds on some of what we’re talking about, but as you talk about the team having grown, and maybe growing into the future, what do you look for? What makes for a good researcher for CDS?

Colin: So, we focus broadly on a couple of things. I think the first is craft, for lack of a better word. When we hire people, and when we’re going through interviews, we really dig into the details of their previous work. How did you construct the research questions? Why did you pick those questions? How did you pick the methods that you used to answer those questions? Why did you pick those methods? We spend cumulative hours working through that with folks because we really believe that the basic skills are so important – if you don’t have them down, they’re hard to maintain in our challenging context, right? So, you really have to be a great baseline researcher and know the basics really well to succeed here, because – and this gets to the second factor – it’s hard to do research within government and on product teams. I think it’s hard to do research everywhere, but there are a number of challenges that folks need to be ready to meet. One of those that we look for, and also spend a lot of time talking about, is the ability to recruit research participants and build relationships and structures around doing that. Because we’re changing domains and going into new areas, it’s not uncommon for us to start a product, and put a researcher on that product, where there’s a very specific group of people that we’re trying to talk to who would be very hard to pay a commercial recruiter to access. And so then they have to get creative. They have to go make friends in the advocacy worlds related to that department. Or they have to work with that department to use administrative data to find people. All of these things are much harder than hiring a recruiter to do the work for you. That’s not to say we don’t use recruiters – sometimes we can – but a lot of what we do is so narrow that we need to find people who don’t just love the craft of doing research, but love the craft of recruiting. I think over time we continue to look at ways to make recruiting less of the job, but frankly it just still is a reality, given how many contexts we operate in and what people have to do. So, recruiting is one piece of it.

And I think the other piece of it is broadly what we call bureaucracy hacking and that is being able to navigate a bureaucratic process to get your work done within the time that we need it to get done and with enough cognitive flexibility to kind of think through what the really important pieces are, where there might be alternative routes to the really well trodden one, and ultimately get a result despite a complicated, multi-person, multi-process situation. So, we look for people who are excited to do that and who have some demonstrated skill navigating situations like that.

Steve: Is the public opinion research story you talked about earlier, is that an example of bureaucracy hacking?

Colin: I think it’s an example of institutional-level bureaucracy hacking. I think that product teams themselves, and people who are working on those teams, often have to do the same thing, but at a lower level. So, they’re working with a partner who says, “oh, we have a departmental process around talking to this group of people. We need to go through that process in order to do this research. That process usually takes nine months. Your research is supposed to take 3 weeks – how do we make that happen?” And it’s kind of an interesting skillset, right, that’s required to navigate those situations. It’s not just knowing what good research is. It’s also your ability to think analytically about a process, understand the reasons for the pieces of that process, and then look more broadly. I will say, as we grow, we try to do less of that at a departmental level and more of it at a broader institutional level, but that’s still a work in progress.

Steve: It’s interesting, and maybe just coincidental, but you talked about looking for recruiting skills, very specific kinds of skills, in talking to researchers, and then this bureaucracy hacking example that you gave is around the logistics of recruiting participants.

Colin: Yeah, yeah.

Steve: You know and – I mean I’m glad we’re highlighting recruiting. I think it’s sort of a neglected – just get people and talk to them sort of is sometimes the belief in research. You’re talking in some cases about getting to maybe harder to find groups of people, or groups where there’s a specific relationship. And we also talk a lot, in research in general, about operationalizing some of those things. We haven’t talked about that. What’s your view on – whether it’s the recruiting part or just in general, in the context that you’re in, how does that idea play out?

Colin: Yeah. So – I mean I think that it’s not surprising, and certainly important, that the industry at large is increasingly focused on operationalizing processes like recruiting and thinking about ways to divide that work up and to scale it more efficiently. That seems totally reasonable given the size of research teams that are now part of the modern organization. I will say, at CDS I approach it with some caution, and I’m worried I’m going to sound a little old and cantankerous, despite not being much of either…

Steve: You have me though on the call. So, just compare to me. I can be old and cantankerous. I’ll take the heat on that.

Colin: I appreciate that, Steve. No, I – for a lot of our work, the work of doing the recruiting is part of the research itself, right. So, actually going out and making connections with community organizations or professional boards, or senior centers, or all of the places where we do our research, that’s part of how we understand who we’re trying to serve and the social structures that surround them. And so I – when I think about how we’ll scale, I try not to think about ways that we would totally take that out of researchers’ hands because I think they would lose part of the picture of who they’re trying to study if they were to do recruiting in its own right. If they were to sort of separate that into a different role. I think that – the other thing that strikes me is that so little – like when I look at the substance of our research, at the substance of what we do, some of it would I think be possible to kind of mechanistically do faster, right? Like if we had better templates and better knowledge stores and sort of better processes to string those things together, like it could speed things up a little bit. But often the idiosyncrasies in people’s research and the sort of surprises that don’t fit within the template are the most important things that our researchers find. And you know perhaps that’s a sign of our organization’s maturity, but I get worried about kind of whisking all of it into a well-oiled, sleek process for – you know I wonder about the sort of edge of whiteboard, straggly sticky-note on the margins insights that we miss. And those – and for us, in our work, those are so often key, right. Those are – we’re still building an understanding of the space and so often those things that don’t fit well within your predetermined framework are the most interesting parts for us. And I worry about losing that. That said, making the logistics of scheduling people and where they go easier, I am all for it and we continue to look for ways to do that. But I think we have to be thoughtful about what we automate in our industry, just like all knowledge workers should be, I suppose.

Steve: You made a comment a few minutes ago when I asked – when you just said research in general – you said, “it’s hard.”

Colin: Yeah.

Steve: I just want to – I want to go back to that. I don’t know if I agree or disagree. Can we just reflect on that notion of research is something that’s hard? Like what about it is hard? Should it be hard? Is that a bug or a feature?

Colin: Yeah. As it came out of my mouth, I'm like oh that's kind of a complicated statement, I wonder what you meant by that – to myself? I would say this. In some ways research isn't and shouldn't be hard. I think one of the things I love about this work is that I can sit down with someone from any one of our teams and explain some basics and they can go out and start learning things in a more systematic way with some good pointers from a good 30-minute discussion. And that's great. And so I don't want anything I'm about to say to be read as kind of discouraging making research accessible to everyone. We say everyone at CDS can be a researcher and I really believe that. I will say that research is also a – one of those things that's very easy to do mediocrely. I think that there's a lot of subtleties to how you make people comfortable in research sessions. There's subtlety to how you digest lots of qualitative information. There's subtlety to how you arrange your research, so it influences your product in the most responsible, but impactful way. That's all – that's all – I think there is subtlety and trickiness to that and so I think it's important to have an appreciation for people who are really good at doing those things and to – I don't necessarily count myself as one of them for all of them – and to recognize that that is a real skill and that is a skill that we should – that we should celebrate in kind of our broader industry community. That is, I think we can do that without also saying hey, there are some basic things that people can do that are easy and quick and help make more people gather more data to make better decisions. I think those are sometimes placed in a false binary in the Twitter sphere, for example, and I am not as interested in that. At CDS, and I think my comment was somewhat related to the Canadian Digital Service in particular, there are some things that are quite – I won't say uniquely hard about research, but are specific to our context. You know one of those is the sort of incredible domain switching that we expect researchers to do every 3 to 9 to 12 months. And that's, I think, not unusual in consulting, but it is somewhat more unusual in a product driven organization like ours. There is also, I think, the hard work of figuring out how to fit your research into an interdisciplinary team. You know we don't just write reports and make presentations and give them and sort of hand them off to designers. We don't let people off the hook at that phase. They're kind of in the trenches with all those other people as the product is being developed. And we expect the way that teams run their agile development process to reflect research and for researchers to be a voice in that. And I think there are lots of organizations that have that expectation. I think we're trying to do that in combination with domain switching and with the third part of it which is, again, we're trying to build skills. Not just in ourselves, but in our departmental partners. So, you're trying to learn a new domain, fit your research skills into this rapidly evolving, interdisciplinary team, and help a partner learn the basics of research and appreciate research. I think that's hard. I think it's fair to say that's a difficult thing to do.

Steve: I want to switch a little bit. I would love to hear if there was a point in your own personal, professional path when research was a thing that you identified with? Like I want to do that, I am that, that’s for me? I don’t know, is there a moment or a stage at which you connected with the field that you are in now?

Colin: Yeah. There was. I conducted my first usability test when I was in 7th grade. I was – this was pretty early in the days of such things. But I was working on the school's website. That was sort of my hobby. I helped the computer teacher with that. And I was reading Jakob Nielsen's Homepage Usability book. There was this beautiful book with multiple sort of printed out homepages and he talked about his methodology in the back and I read that. I was like oh, this usability test is sort of something we could do. So, I got someone into the computer lab and tried it out and there was this moment of like wow, when you ask other people to try something, and when you ask them questions about what they're thinking and what their goals are, you challenge yourself in ways that you – that I didn't expect. And it's also incredibly rewarding. It's an incredible high. I don't know how else to describe it. And it continues to drive me and the folks on my team. So, from that moment on I knew that I wanted to do this kind of research in some way professionally. There were lots of steps between there and here, but I knew pretty early in life that I really loved learning about how people used computers and services and understanding their approach. And, you know, I'm really blessed to work alongside lots of other people who are similarly jazzed to learn those things about people.

Steve: And that’s an astonishing story both – to me – in the early point in your life at which this happened and the specificity when it happened. I think often these stories are about oh I went to a farm and then I realized I wanted to be a marine biologist. Like the connections are more diffuse. But you, at a very young age, were doing exactly, or pretty close to exactly the thing. I mean it’s not even a metaphorical discovery.

Colin: Yeah.

Steve: You literally discovered the work.

Colin: It’s odd to me too. And I do look at friends’ and family’s career trajectories and I’m like hah, I didn’t think that’s how it would work out for me, but I really – I just loved it. And I looked into other things, but kept coming back to this, this being what I really just enjoy doing.

Steve: Are there any points, whether it’s 7th grade or older or younger – what kinds of things can you look to in your earlier parts of your life that were, I don’t know, maybe weak signals or kind of nascent superpowers that connect to what you’re passionate about and what you’re spending time in now? Does anything exist earlier in your life?

Colin: You know, one thing that has always been somewhat odd about me, and I do think relates to my passion for this work, is that I have always loved seeing the metaphorical or physical backroom of the process. So, like when I'm flying, I'm always like hah, like what does the computer system that the, you know, gate agent uses look like? And like what do they do and when do they do it and why? And I've always – I've always been like that to a point of I think some good-natured teasing from my family about my desire to understand the details of how the parking ticket machine dispenser system works and what the numbers on that thing mean. So, I think, especially to do this work in government, you have to kind of have a love of uncovering process and humans and how they interact with that process. And the trick is you don't have to love like process, but you have to love kind of postmodern process. Right? You have to love the fact that like it is different things to different people and kind of is this great mirror game of clarity and unclarity all at the same time. That's something from an early age that I've always really enjoyed. I also would say that I – so early in my career I worked for the U.S. National Park Service and one of the things that I did there was being a park guide. So, being out on the trail talking to people and listening to them and answering their questions and seeing how they interact with a place. And there's real joy – I feel real joy – in seeing people in an environment and both helping them discover things and seeing how they discover things. And often I feel like my work today isn't that much different, right. I think I'm more open ended than park interpreters are, but it's still here is this new land and this new world that you are encountering, and I get to be with you as you're doing that and help you think through what you're doing. And so I think the kind of odd areas of joy I found in that work certainly kind of echo through to what I'm doing now.

Steve: Is there any example of a notable success? Something that, you know, maybe that your researchers have done and kind of completed that you feel proud of? Or just kind of a notable example to share?

Colin: Yeah. I think that – it’s funny. You know we’ve done a lot of research that I’m proud of and I’m proud of our researchers for doing at CDS. But the things that make me most excited are the moments when I see our researchers teaching and coaching people and departments to start doing research themselves, right. So, for example, with the Veterans Affairs Canada project they – the department identified someone who would be a researcher and sort of continue on the research with that product after we handed it back off to them. And I, you know – those moments when those people were working together and getting to support them through that, I think those probably make me feel prouder than any particular deliverable or product that we’ve gotten out the door.

I will say the other exception there is I’ve looked – I was looking at our sort of internal database of research before this interview and I’ll say the other thing I’m particularly proud of our team for doing is becoming more and more diverse and creative in the types of research it does, right. I think when we started, we were trying to make the basics happen and so we did lots of interviews and we did lots of usability tests. And those are great, but there are other things to do too. I think we’ve become not just qualitative, but mixed methods. I think that we have embraced the complications of trying to kind of test services in more live ways, right. So, not sort of just usability test prototypes, but wire something up so that someone can complete an actual transaction with it and see the back end experience and orchestrate kind of a Wizard of Oz experience for them. Those, those increases in our skill and our ability to mix and match tools responsibly, that’s incredibly rewarding as a researcher at heart.

Steve: Is there anything that I should have asked you about that I didn’t, and you want to make sure we talk about?

Colin: No. I think your questions were well rounded. Nothing comes to mind from my list.

Steve: Do you have any questions for me?

Colin: What’s the most surprising thing you’ve heard during one of these interviews?

Steve: Well, this is going to sound like I’m pandering, but the fact that you did a usability test in 7th grade is – I’ve just never come across that in my career, let alone in one of these interviews. But actually – I will say – I said I would take the weight of being the old curmudgeonly person. So, here we go. I don’t like the what’s the most surprising thing question ‘cuz it presupposes that the person – and I realize this is not exactly research since we’re doing a podcast, but as an interviewer in general, I’m not trying to be surprised. I’m trying to be focused and interesting and sort of the most benign sort of factual thing should also be exciting and triggering of interpretations. So, you know what I mean? Like the surprise is me bringing my judgement into it. So, of course I gave a concrete example. I was surprised by something that you said, but I also…

Colin: Yeah.

Steve: People ask researchers that, right? When you did this study what’s the most surprising thing? And I think sometimes that question can be good if it provokes what are we learning, or where are our assumptions being challenged? But, how do we – I also want there to be space for like hey I didn’t come in here with assumptions. I came in here with just curiosity.

Colin: Yeah.

Steve: So, I’m kind of talking out of both sides of my face here, I guess. I do that.

Colin: Well, I agree. And I do think there’s a real risk to boiling research into a set of surprising and non-intuitive insights, you know. Like the idea is that we should be manufacturing surprising little blobs of knowledge that can be passed off into the ether. And I worry about that. I worry about the work getting boiled into that. And so I hear you. And I guess I agree with your curmudgeonly instincts.

Steve: Aren’t I a nice person? Any questions for me? I mean I’m going to slap you for the first one that you asked me. So, I’m just living up to the curmudgeonly mantle I chose.

Colin: I expect nothing less. Well, that’s good. No, I would say I am curious as to what your views on the burgeoning research ops movement are? As someone who I think has been in this world for many of its evolutions, what’s your reaction to the increased operations emphasis?

Steve: I mean I have – learning about research ops provoked an uninformed critical reaction. And you said it really well when you talked about the value that the recruiting can bring to the researcher. And, you know, we hear researchers talking about the value that they get out of doing their own transcripts. Personally that's not a thing that I could deal with, but immersing yourself in the subject matter and the people in the community and the data and the conversations is super labor intensive and the idea that we're going to fix that by kind of operationalizing things scares me because it takes away the quality and – you talked about subtlety, I think. But if you look a little deeper into what research ops people are championing, they're not championing that. They're – that's my chauvinistic misinterpretation. And I think just by creating that word, I think it invites other people to also do that and maybe even implement that. But the people who are championing it are thinking about process and thinking about how does this exist? And, you know, having spent my entire career as a consultant, so not embedded in the organization, not building these processes – maybe advising on them, or kind of giving feedback, it's sort of an interesting thing to watch from the outside. It comes up in a lot of groups. I talk to people who are kind of early on in their organizational maturity. You know I read something recently – oh, some research ops people were sort of declaring that the first researcher hire should be an ops person which, you know, from the outside seems like well of course you would say that. That's sort of a discipline that you are evangelizing for. But I found myself working with a group where I just mentioned that idea as an idea that exists. I wasn't in a position to say they should do this, they should not do it. But I said, given what you're talking about your barrier here is ops, is infrastructure, is recruiting, is not reinventing the wheel. And I said, you know – they had headcount to hire one person and I said well what proportion, from zero to 100, is that person going to be dedicated towards this? You know, in the same way that you are – that you talked in the beginning, about your organizational change efforts, you know convening people, finding processes, sharing best practices. I think a research ops person can also be doing that, you know at a different level and using kind of different processes than you're talking about, but…

Colin: Yup.

Steve: I mean that is championing the growth of research as a thing that we have skills to do at various levels efficiently. That's what you're – that's one of the things you're doing. That's one of the things a research ops person could be doing. Taking away the hard, labor intensive, inefficient tasks that lead to the subtlety that makes research great, that you described – that's not a thing I want to do, but I – it doesn't have to be what ops is, so.

Yeah. If there’s nothing else, then maybe we’re at the end. Really great conversation. We covered so much interesting stuff and I learned a lot and I’m really so happy that we had the chance to speak today.

Colin: Likewise. Thanks so much for your interest. It’s been a pleasure.

Steve: Thanks for listening! Tell your colleagues about Dollars to Donuts, and give us a review on Apple Podcasts. You can find Dollars to Donuts on Apple Podcasts and Spotify and Google Play and all the places where pods are catched. Visit portigal.com/podcast to get all the episodes with show notes and transcripts. And we’re on Twitter at dollarstodonuts, that’s D O L L R S t o D O N U T S. Our theme music is by Bruce Todd.

Jan 07 2020

1hr 7mins

Rank #8: 16. Marianne Berkovich of Glooko

In this episode of Dollars to Donuts, I speak with Marianne Berkovich, Head of User Research & Consumer Insights at Glooko. We talk about doing research through leadership changes, setting up opportunities for self-critique, and how to build empathy, especially in health technology, by experiencing some aspect of the condition and treatment yourself.

It really bothers me when smart people go out and build things and spend a lot of time and energy to build things that are not for humans. And I’m like, erh, why didn’t they do that? For me it’s more about empowering people who have that energy and who have that entrepreneurial spirit to make the things that are right. Not just make stuff, but actually, make the right thing. It’s a set of skills that I think maybe everybody should have and maybe once everybody has those skills and can do it well, maybe the role of researcher doesn’t need to exist. Until then, I feel like it’s my duty to go out and spread the gospel, as it were, of this is how you talk to users. – Marianne Berkovich

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Well, hi, and welcome to Dollars to Donuts, the podcast where I talk to people who lead user research in their organization.

I was reading the New York Times today and I noticed something unusual, although I'm seeing this sort of thing more and more. After the byline, but before the article starts, is some italicized text in square brackets. It reads “What you need to know to start the day: Get New York Today in your inbox.” This is copy that only belongs online, in the app or the website or an email newsletter. Presumably you would click on something. But I'm looking at the newspaper – I feel like I'm starring in one of those YouTube videos where a toddler is trying to swipe a magazine and can't figure out why it's not a touch screen. I see the New York Times making this kind of error in their print edition every few weeks, and it's kind of appalling, because it suggests a lack of attention to detail that I don't expect from a high-quality product like the Times. When you get an email that has the wrong name in the salutation, even though we've all done it ourselves, it brings your appreciation down a notch or two. They clearly aren't taking the care that they used to, and that we would hope for.

And it’s especially interesting because I remember when the opposite used to be true; when the experience we had online, with news especially, was a not-quite-there translation of a print experience. And now it’s flopped. The print edition readers are not the primary customers. The organization has identified a different key user and it’s not us.

I can't help but wonder, for any of the users we learn about, are they in a less desirable category or perceived that way because of changes in internal processes or organizational structure? Do we care about their experience, or are we making them into what we call “edge cases,” which is a fancy way of dismissing them? It's hard to imagine the print reader of the New York Times as an edge case, but hey, that's where we are.

I want to remind you that I'm looking for ways to be able to keep making this podcast for you. Here's how you can help. You can hire me! I plan and lead user research projects, I coach teams who are working to learn from their customers, and I run training workshops to teach people how to be better at research and analysis. I've got two books you can buy – the classic Interviewing Users and Doorbells, Danger, and Dead Batteries, a book of stories from other researchers about the kinds of things that happen when they go out into the field. You can rate and review this podcast on iTunes, and you can review both books on Amazon. With your support, I can keep doing this podcast for you.

Let’s get to my interview with Marianne Berkovich. We did this a little differently – as part of a live event. We closed out San Francisco’s local edition of “World Information Architecture Day” with our interview, live on stage, in front of an audience. Then when we got off stage, we sat down in a somewhat noisy room and talked a little more to cover some of the things we didn’t have time for. We’ve cleaned it up as best we can but the audio may be a little bit different from how things normally sound. It was really fun for the two of us to speak on stage, and I hope to have more opportunities to record episodes of the podcast in similar settings.

Marianne is the Head of User Research & Consumer Insights at Glooko. She's worked as a consultant and for Google and Adobe.

Thanks for agreeing to talk with me, Marianne. Why don’t we just start by having you introduce yourself. What do you do? Where do you work? Tell us about that.

Marianne Berkovich: My name is Marianne Berkovich and I am Head of User Research & Consumer Insights at Glooko and we’re an online diabetes management platform.

Steve: What is an online diabetes management platform?

Marianne: That's a great question. So, diabetes is a condition that has a lot to do with numbers. You want to keep your blood sugar not too low, not too high. So, it's really conducive and lends itself well to technology and checking your numbers. We have an app for the person with diabetes. They can track all sorts of things there that could affect their blood sugar and then they can also send that information to their clinician who can further see patterns of you're high in the mornings and what are we going to do about that? So, it's a platform that both the clinician can access and look at those things as well as for the patient themselves.

Steve: Where in the history of this company and its product did you come in, to bring in user research?

Marianne: So, I am the first user researcher. The company was founded in 2010. I’ve been there for about a year and a half and I am learning all sorts of things of what it’s like to be, first in a healthcare company, or a health tech company, and also being the first researcher.

Steve: Okay, let’s see how easy this is. What are some of the things you’re learning about being the first researcher and working in a health tech company?

Marianne: First of all, as a researcher, it just feels very weird to be answering the questions and not asking the questions. Maybe I answer questions in a way that makes it easier for the researcher to ask the next follow-up question. So, but you know what a great researcher Steve is, so this is showcasing his talent too, right! So I think some of the things that I’m learning is the role of advocacy. That even though there was a need for hey it’s time for a researcher, we need someone full-time – before they were hiring vendors or kind of doing it ad hoc. And so, it was time to have somebody who could be dedicated and who’s trained in this. But at the same time there’s a lot of – I don’t want to say – in some cases there was resistance, but I think in a lot of cases it’s more of just not knowing what good research looks like, or part of the reason my title is so long is that people had an impression that oh, user research is just usability studies. And so, I talked with my manager and we put in the “& Consumer Insights.” Head of User Research & Consumer Insights – so it’s like it’s everything. It’s going to be going out and doing field visits. It might be surveys. It might be a lot of different things. So, to kind of help with that advocacy. And I feel like I do spend – I mean, I think as researchers we always spend a lot of our time in advocacy mode, but I’m not surprised – or maybe surprised – it’s just taking a lot more of my effort to do that advocacy work of what research is and how can it be used.

Steve: I just want to clarify. You’re advocating for user research?

Marianne: Yeah. So, I think even though the company talks about being patient centered and user centered, what does that really mean? I think one of the things that I'm finding out – so, there's a lot of people who have been in the field for a long time and they're like we understand diabetes. We've been in this for many, many years. And to have somebody come in – I don't have any experience with diabetes; I don't have diabetes myself – to come in with these, you know, different insights or different ways of doing things and people are like, we've been there, we know how to do this. And so, to advocate for, hey, we went out and talked to some people. We learned this thing that's different and new and it's maybe against the conventional wisdom, how might we use this? And maybe some of that resistance of like, well we've always done it this way. Or conventional wisdom says this other thing.

Steve: So, the other part that you were learning was working in healthcare tech. It’s a new industry for you. So, talk about maybe what that’s been like.

Marianne: Yeah, so – and before that I was – I did a fellowship and we were working in the legal space. And the legal space, the health tech space, are slower moving than just straight up consumer products. It’s like hey let’s build something and we’ll launch an app and do these things. You know regulated spaces of like what is the FDA like? I had a little bit of background in human factors testing, but we have a product that actually got FDA approved. But what does human factors testing look like? How is that different from just regular usability testing? What does it mean to understand the whole ecosystem of payers and employers and insurance companies and PDMs and all that type of thing of moving that needle and kind of inching things forward when you’re in this space that can’t be disrupted so easily because there’s good reasons why these constraints exist.

Steve: So, the company was about 8 years in existence when you joined. So, how much of that were you building from scratch? Describe a little bit about what you encountered and what you had to build around sort of the tech and regulatory aspects maybe.

Marianne: So, actually, luckily the regulatory aspect is just one part of what we offer. It's the Mobile Insulin Dosing System. So, when you start on insulin, how do you come up to taking the right dose of a certain type of insulin? So, the product does a lot more than that. So, this was just one small part of it. And it was actually – that was already in the works when I joined. So, the app was already out. The website was already out. MIDS was already sort of in progress. So, I think for me it's been a role of a little bit of back to basics of like let's learn about our users. How are people with Type 1 diabetes different from people with Type 2 diabetes? So, in fact, I was telling Steve before this, I'm doing an ethnography right now and so I was in Fresno this week talking to folks with Type 1 diabetes. I'm in Phoenix on Monday, but kind of building that basic understanding and actually building personas and really having a deeper understanding rather than like, yeah well, we have people now who are Type 1s, they know what they need. To move beyond that, to really have some foundational knowledge, and it's causing us to go back and be like okay, well, is this – are there things in the app that are working the best way? Maybe we can make the clinician experience more efficient. Maybe there's things that we learn when we go to visit a clinician – be like oh, that's not how we thought that worked, can we rethink that. So, I think it's a little bit of back to basics and rethinking certain things rather than to build anything from scratch.

Steve: Before you talked about advocacy which was kind of saying hey, we need to learn these kinds of things, but in some ways doing that reveals the most scary thing of all – oh we need to rethink assumptions and yeah, a 10-year-old company, it’s entrenched. So, what happens when the things that you’re uncovering are inviting, challenging – they’re inviting the challenging of established belief structures around the product and what it looks like and everything. How are you doing that?

Marianne: I think it leads to another kind of wrinkle of working in a startup, which is that we've had a lot of turnover at the leadership level, actually. So, the CEO who was the CEO when I joined was not the founder. He was the second CEO. And he reached a point where he was like, “you know what, I really liked getting the company to where it is now, but now that we're really – it's time to scale and grow things, it's not what I want to do.” So, we got a new CEO. We got a new head of product. We got a new commercialization officer. So, all that advocacy I had done and sort of like when we did the Type 2 ethnography when I first started, like all that – you know taking people on the road and showing them and doing these empathy workshops and all that – whoosh, out the window because we've got a whole new cast of characters now. So, I think that was part of it too. It was a little bit humbling to have to re-sort of create my credibility again with – it's a whole new cast of characters and a whole new set of people to influence. Luckily my manager has been a great champion of that. And so, I think it's finding ways that we can leverage and sort of maybe it's not the right time for things. So, we'll go out and do this research and we have this foundational stuff. We sort of did a big aha, then those people went away. Can we save that and find another time to sort of bring that to the fore when it might be a little bit more accepted? So, I think it's both sort of pushing for it and finding the right time to sort of introduce those things.

Steve: So, what does a manager do? How does that championing work?

Marianne: I think – and I don't know if this is true, but I feel like as a woman, and you know I can say these things since she's also a woman, but for women in general – it's much more effective if instead of me saying it, I say like, “yeah, what Steve said.” Or, you know, what somebody else said. So, I think because of that, I think it becomes more effective. So, she's not tooting her own horn. I'm not tooting my own horn, but she is sort of pointing to the work that I'm doing. Plus I think we have different kinds of communication styles, and sometimes she can sort of – I'll say a bunch of stuff and she can sort of synthesize it in a really nice way and it sort of becomes a little bit more impactful that way, that like I can spend all my time sort of yelling and screaming – metaphorically speaking – and then she can sort of bring it home in a very succinct way.

Steve: So, part of what she’s doing then is also – you talk about finding the right moments. Or, what should we be doing to have impact on the organization as the layers above us are changing. I don’t know. I’m wondering about when new people come in, and this is going to vary on a case by case basis, but you’ve gone through this whole series of trying to open people’s minds up a little bit and say hey things are different. Those people leave. New people come in, but is there an opportunity there? Do those people – what baggage do they have, or do they bring in? I’m not asking you to like slag anybody individually, but as you see these kinds of changes happen in an organization and you’re trying to craft a story that’s about how the world really is vs. maybe what we hope or assume – how does the – what changes when new people come in, with or without baggage, around what the truth is?

Marianne: I think part of it is understanding where people are coming from. Because I think in my last role – I was at Google for a long time and I think like certain assumptions I made that like everybody knew what user research was and everybody knew this and I kind of got used to that, that that was the norm. Like product managers of course know what I do, and everybody knows how this works. And so, I had to figure out like, oh, okay, I can’t assume that people know what these things are, or that when I say user research and what you think in your experience with user research, like maybe that was focus groups. And they’re like yeah, we did those and it was great and I was like okay, great that’s a start. So, I kind of know where I’m starting from with folks. So, I think that’s part of it. And I think also, because I am the only researcher and things move a little bit more slowly, like we did the stuff around Type 2s and now we’re doing the Type 1s. And so now I’m using all that stuff that I learned last time of like what was effective in terms of running a workshop and getting people to come on visits with me. And it’s like it’s a little bit smoother again this time, but we’re doing that again this time around. So, it seems that having that opportunity to redo that foundational research has resurfaced itself. And so, to take advantage of that opportunity, that it’s like well, let’s look at Type 1s now.

Steve: There’s something here about you’ve had lots of experience. This is not your first job. You’ve been at lots of different organizations, you’ve done lots of research and influenced stakeholders and product teams and so on, but it sounds like that there’s a good measure of learning in this job which is about how do I do the thing that I know how to do to be effective in this context. Does that ever go away for researchers, do you think?

Marianne: I hope not. I think it's – and I think that that's part of the reason that like, you know – one of the reasons I left Google is like it took me a while to realize the thing that I was doing there and the things that – that, kind of really I'm passionate about are not things that I could do at Google. But like who leaves Google, right. So, I think finding ways to find my path and things that are not sort of the traditional way of moving up in, you know, you're a junior researcher, then you're a senior researcher and then you're a manager and all these things. To find my own path and sort of be okay with that and find different ways to learn in different contexts. Like the fellowship that I did with Blue Ridge Labs which is a social impact incubator. And I was again, the only researcher, and so I got to do more mentoring. And so that was interesting. So, I think finding ways to both follow my passion and find ways that like what does this organization look like? And I think we've probably heard this all before in terms of doing sort of user research or bringing that lens to our stakeholders or our people that we're working with as well. I think that part never goes away and it's always changing because it's always a different set of people.

Steve: Right, more so – is the landscape changing more in a place like Glooko than in a place like Google?

Marianne: That's a good question. I think it's very different. Google's obviously a very – there's just many people. And so, I think the dynamics of what it's like to have an organization that big and when I joined I knew all the researchers and now the scale – each product team is much larger than that. So, I think it's a different type of – it's a different set of issues. So, I think wherever you go – like I'm from the East Coast and people ask – they're like well what's better, East Coast or West Coast, or all this stuff? And I'm like they're different, right. I don't think there are things that you can really compare. So, for me – and this is my first time working in a startup, so I don't have another startup to compare it to. So, maybe if I went to a different startup I could do more of that comparing and contrasting, but in some ways, it feels like apples and oranges of a large organization that's established and, you know, is well funded and all that vs. a smaller one that is working in a very different space.

Steve: Just hypothetical – I know we’re not supposed to ask – apparently you aren’t supposed to ask hypothetical projection questions in user research, so good thing this is not user research. If you were to look for a job at another startup, I’m just thinking about you joining an 8-year-old company, is there a maturity that you would look for, or a timespan that you would think differently about in the next stages of your career?

Marianne: I think it’s less for me about that. It’s more about really getting jazzed about the problem that I’m solving. So, for me diabetes is a very big issue. Thirty million people in the U.S. and growing. And so that feels like a real meaty issue and that gets me excited. So, I’m like even if some days it feels like I’m pushing a boulder uphill, it seems like a worthwhile thing and that’s what gets me up in the morning. So, for me it’s much more about that and going in with eyes wide open of like well what kind of organization in it – is it, and like okay do I want to take that on. And if the answer is yes then working around those constraints because that’s just the nature of the beast.

Steve: You talked before about figuring out the right job title that would describe, to the rest of the organization, the way that you were going to work. As you – can you say more about sort of what that process was, what those conversations were that identified an opportunity, that helped get you excited that this was something you wanted to do?

Marianne: Well honestly, I think part of it was naivety, that like this was – I think it was actually the way it was posted or sort of written up was Manager of Consumer Insights and I’m like okay, whatever. You know, I’m not on a ladder. I’m the only one, so it doesn’t really matter. So, for me, I was so excited – in fact I was consulting before and so Glooko was one of my clients and I got hired. So, we crafted the role a little bit based on what I was doing. And so for me it was less about negotiating the right title and in fact I think only after I joined – because I was like, sounds great, like it sounds like this is a good fit, sign me up, and then only when I started and I was like hey, I want to make business cards, but I don’t really like this Manager of Consumer Insights – we had that conversation after I joined. And so that’s when my manager started telling me about like hey, this is some of the perceptions because I had gone in very naively also of like everybody knows what user research is. Like, we’re in Silicon Valley, like everybody knows what this is. And so, it didn’t even really occur to me that that advocacy, or the extent of that advocacy that would need to be done.

Steve: So, what – you worked with the organization as a consultant and then came in-house to kind of lead the effort. What was similar and different about – you know, the before and after that transition?

Marianne: So I think one thing is, it was very intimidating. I didn’t know anything about diabetes and so I ran my first study and like everybody showed up and so they were listening in on the call and I’m like how’s this going to go? Like I hope I don’t say anything stupid. And so, you know I think as a consultant, coming up to speed on a different domain very quickly, and I think to me it’s about asking those open-ended questions and asking questions in a way that it doesn’t really matter if you know the domain. I mean it’s certainly much better if you do, but there are ways that you can sort of cover that up, and especially if you’re going in with an apprentice mindset and all those types of things, it kind of helps the situation along a little bit. So, I think for me that was definitely intimidating to not know the domain and have everybody show up.

I think another difference is as a consultant, and I wasn't a consultant for very long. It was about a year or so before I decided to take the role with Glooko, but I was very cognizant of how I was spending every single hour and whether it was going to lead to making money or not, because is this going to generate a lead? Is it going to generate a sale? All those types of questions were very much top of mind whereas when you're in-house I think you can spend the sort of hours – like you're spending your capital in a different way. You're building relationships and that's kind of how you're sort of making money. So, to me it's sort of a different way that you don't have to be so aware of every hour leading to money.

Steve: I’m going to switch gears a little bit and maybe we could just rewind. I’d love to hear you describe maybe your path. How did you get into this stuff? What brought you to where we are sitting today?

Marianne: So, I was an English major for undergrad and after I graduated I didn't know what I wanted to do. So, I spent a couple of years – I actually at that point did work. I was in D.C. and I worked at the National Museum of American Art. We were digitizing the collections. It was a very long time ago and so I was actually cleaning up all the photos and I'm like eh, I'm not really into this technology thing. My Dad also was a computer science professor. I'm like definitely not into that technology thing. Like that's stuff that my Dad does. Boring. So, I spent a couple of years kind of bumming around a little bit, decided to move out to Denver and I started working as a technical writer. And I was working at a financial services company and so my role was to write the help text for this complex financial software. So, I would sit with the developers. They would explain to me how the complex software worked, and I would write it up. And then I had a thought. I'm like if we just made the software easier to use I wouldn't have to write anything. So, that led me to first get a certificate in technical communication. And in that I started kind of looking around a little bit more and found the whole field of human centered design and human computer interaction. I looked around and I found a program at Carnegie Mellon. And at that point I had actually switched over to Lockheed Martin, which was a great role – you can ask me about that in a second. Sorry, I just keep giving you the questions to ask. And, but I – you know I decided that I really needed a degree in this thing. That just kind of reading books about it or whatever wasn't enough. So, I decided to go to grad school and I remember just seeing that description of the master's program and I was like that's exactly what I want to do and like all my life had been leading to that point to be like that's exactly what I want to do and having that feeling of like, yup, this is the direction I want to head in.

Steve: Which master’s program? And then keep going. Yes, please.

Marianne: It was a master's in Human-Computer Interaction at Carnegie Mellon University. So, the role at Lockheed Martin. So, I'm old. I'm definitely over 35. And this was sort of in the olden days and what we were doing is taking – so, you know how we have wildfires and all that type of stuff? So, the resources to manage all those things, like sending the air tankers and the trucks and the crews and all that – that was being done by hand. So, Lockheed Martin got a contract to turn that into – it wasn't even online. It was just kind of a digital program to do that. So, we would interview subject matter experts and I was like a requirements analyst. Like we didn't have roles of like designer, researcher. Like that wasn't a thing yet. And so, for me it was like super exciting to solve a real problem. We got to actually go to Boulder and like see the command center and see some of those things. I mean I really felt like I was making a difference and I knew what this was about. But I felt very much at a disadvantage of like, I don't know what should the icons be? I don't know. Nobody has best practices around icons yet. And so, it felt like early days and that's why I really wanted to go back to school and like learn some stuff because I knew there was a lot that had been done already.

Steve: So, you come out of that program and where do you go?

Marianne: Consulting. Yeah, I actually stuck around Pittsburgh for a while and I again ran my own consulting thing. I do that to sort of – when I’m in exploration phase. Or, when I’m dating somebody who just started a PhD program and I can’t leave Pittsburgh. One or the other. So, yeah, and the guy I was dating, he was very entrepreneurial, and he was like such a great cheerleader. He was like yeah, we can figure this out. He had been running his own startup before he started the PhD program and he was like, yeah you can do this. I’m like, yeah, I can do this. So, he did help me a lot and so I was finding projects and was kind of getting my feet wet with that. But then I found that it was time to leave Pittsburgh. That it wasn’t what I wanted to be doing. I really wanted to learn from other people and so I started looking for other opportunities.

Steve: What were those opportunities that you found?

Marianne: You can take it in a different direction. So, I actually interviewed at Google at that point and they didn't want me, I didn't want them. It was a very different company at that point. But I also interviewed at Adobe. There's actually a pipeline from Carnegie Mellon straight here to Silicon Valley. I mean seriously, just pull up a bus, just put the graduates on it and bus us all out here. So, there were a bunch of people who I had gone to grad school with who were at Adobe and I went to interview there, and it felt like a really nice fit. I remember actually – after the Google interview I came home to my hotel and I just turned on the Simpsons and like all I could do was just sit there and like not move. But after the Adobe interview, I actually – it was one of my first times out here in Silicon Valley and like I went hiking. I was like I was that inspired and that energized that I went hiking. I'm like this is a sign that maybe I should take this role, that this is a good role for me.

Steve: We can cue up some Simpsons for you after this conversation. You know, just put you in a dark room. Um – so, when you talk about the work that you're doing now, you're doing field work, you're doing ethnography, you're taking people out. When did you learn how to do that?

Marianne: That’s a good question. I feel like I should credit CMU with setting me on the path and teaching me about contextual inquiry – thank you Bonnie John. And I think for me one of the things that they told us at Carnegie Mellon is – it’s a one-year program, so if you didn’t come in as a computer scientist you’re not leaving as a computer scientist. If you didn’t come in as a designer, you’re not leaving a designer. But we’re going to give you enough exposure to coding and designing and having your stuff on a wall and getting critiqued and to do some social science stuff. And so, for me, having been an English major and I minored in art history and theology, I mean I was as liberal arts as you could get. I wasn’t going to come out as any of those things and I didn’t gravitate towards it. I was like I’m not a designer. So, I think I just gravitated towards research. And I think maybe the English degree kind of prepped me for that in terms of asking good questions and thinking about things and looking for patterns. So, I think just naturally I’m a pattern seeker and I think honing my ability to sort of ask questions just came with time. So, I think it was having a good solid foundation and then just kind of having a natural affinity for it. And I read your book – much, much later, but I did read your book.

Steve: Right. By the time my book came out you'd been doing this for a while. Can you talk about the environment at Adobe? What was there to develop the practice of user research? For practitioners like yourself coming in with X amount of experience, what did they do well that let you get to the next level of your craft?

Marianne: I think actually I was really lucky at both Adobe and Google, being surrounded by really smart people. So, that made it okay of like I don't know how to do this, and can someone help me? Like I learned how to do surveys through some of the really awesome people at Google. So, I think for me it was just being surrounded by really supportive people and I think that was one of the things that was part of the culture at Adobe. Like we would go to my manager's house. She would have like all the researchers at her house and we would do these offsites – and it really felt like family. I mean it sounds a little cheesy, but – and so it felt like it was a place to grow and learn. So, I think for me that was a great stepping stone as my first sort of real job after graduate school, to be like oh, okay, this is what being a researcher is like. Oh, this is how you interact with product teams. So, I think it was just learning all those basics. But I think one of the things that – kind of coming back to the question of learning to do ethnographic work and all that is once you're out – and especially you're probably one researcher. You're rarely lucky enough to work with another researcher who can sort of observe you and say like hey, why are you asking that question? Or, maybe we could do it differently. One of the things that I found, when I did have some opportunities at Google to work with other researchers, I'm like we don't all have the same skills. Like what happens between you and a participant is different. And so, one of the things that having noticed that is I started a class at Google to critique ourselves. So, give researchers an opportunity to observe each other moderating and to give feedback to each other. And then I read a Medium article about it. It's called Don't Leave Data on the Table, if you want to look it up. I think that's one of the things too is that we don't get that critical eye anymore and we assume that like every researcher is the same and it's like well actually, you know – and those things can also creep up on us. Like I see myself all the time, especially now that I'm sort of teaching other designers at my company more of like, ah, leading question. Yes or no question. You know. So, I find myself still doing that, even if I'm aware of it. But I think having those things pointed out to us is also really helpful too.

Steve: What’s the structure of the critiquing class?

Marianne: So, it was slightly different when I taught it externally at CHI, and internally. So, internally all our videos are available to each other. So, basically people would send in clips from research that they did. We would make small groups, so let's say you work in a group of five, and we would watch a segment of somebody conducting a study. And at any point people could stop the video and be like hey, I noticed this. It was like when we were actually practicing to teach it, one of my co-teachers, she actually had her pen and she would do a lot of like pointing and sort of like do a lot of things that indicated high status. And she was like, I never noticed that – until she saw herself, until we talked about it, she wasn't aware of that body language and what it was conveying. And so basically we'd spend 5 to 10 minutes, or however the time divides up, to watch that video and stop if we see anything. And we asked people to rent that idea that like, again, we don't know the context of what came before. We don't know the context after. But this is what I'm noticing. And that's what we do as researchers is we notice. So, it could be a pattern, it could not, but at least invite people to consider something that they're doing. And also, by seeing other people, just having that conversation, seeing what other people – and be like okay, I didn't show that, I didn't do that in my clip, but like I've definitely done that before too. So, learning from each other as well has been really valuable.

Steve: It sounds like a way to create kind of a safe space where it’s okay to critique is that everybody is kind of up for that critique and we’re all being exposed together. Is that kind of the way to make it safe? Because this is our workplace and now we’re talking about how we’re not living up to sort of the high standards of our profession.

Marianne: Yeah. And I think that’s one of the things that we tried really hard to make it a safe space. And one of the things that we did was actually we had – spoiler alert – we’d give out like balloons, which are also like these cords that you pull to sort of stop the production line. But just having a room full of balloons made it a little bit more fun. When I taught at CHI we had to create the videos and so I had these little stuffed animal guys that people could put their phones on when they were doing the videos. So, then we had little stuffed animals on all the tables. And so, – and also modeling it ourselves that – I don’t remember if we actually showed some of the videos, but we definitely told some of the stories of like when we did this ourselves – it’s like I’ve been a practitioner for how many years? Like I’m still learning. I still screw this stuff up. So, making ourselves vulnerable was part of making that safe space too. And also, I think what we tried to do was, when we did it within Google, was not have everybody who was on the same team be together. So, it wasn’t like anybody who you worked with directly.

Steve: You used the phrase “sort of screw this up” and I think like the pen example and I’m sure there’s others that are clearly things we shouldn’t do as researchers. To me it seems like there’s a certain amount of stuff that’s subjective that you and I would do it differently because we’re different and we have different personalities and different energies. For me, critique about that would be really interesting because I like to see how other people handle things. My question here I guess is, is there – does the critique approach look at alternative approaches? How do you talk about that vs. best and worst practices?

Marianne: I think we also gave an opportunity for people to pull the cord on themselves. To be like, I didn't know what to do here. Like, help me brainstorm. So, there's that opportunity too, for the person themselves to be like, how would we handle this? And to hear different perspectives – you know, I could have done this, or I could have done this, or in situations in the past I've handled it differently. So, I think having just that variety of there's not one right way to do that, that it depends on the context. And maybe sometimes somebody explains, well, this is what happened earlier in the session and so it led me to do it in this way, and we're like, oh, okay, well that makes a lot of sense. And so, I think having a little bit more of that context helps too.

Steve: Are you able to bring any of this into the environment you're in now where, if anything, you're teaching people and leading them?

Marianne: Yeah. And I think I'm learning a lot more about screwing it up also. While I'm out in the field doing this Type 1 ethnography work, I'm trying to help designers to be able to do their own research. And so, one of our more junior researchers is actually doing one on one interviews. So, I helped – we talked about it, I wrote the discussion guide and the study plan, she was all on board, and off to the races we went. And then I actually had a cancellation, so I was able to watch one of the sessions and I was like, oh no – I screwed up. I didn't prepare her well enough. It was a little bit more than she could handle and that was squarely on me, you know. Because I'm so used to doing it and it comes so naturally to me, I didn't recognize that she was in a different space and she would need a different level of support, and I just hadn't prepared her. And so, we role played a little bit and we practiced a little bit. But I still feel like – I feel guilty. I'm like, she's got more sessions this week and I don't know if she's really prepared. So, I think that part is on me as well, to make sure that the things that I'm modeling and the things that I'm teaching are doable for where the designers are, and not just hey, it's easy, anybody can do this.

Steve: In my opinion there are some things in life that we maybe learn more from screwing up than we do by succeeding. I don't know what falls into that category and what doesn't. I wonder, as much as you feel for this person and feel responsible for them – just taking the ultimate perspective – did you set them up for a lot of learning they might not otherwise have gotten, by seeing the edges of what they're able to control or what they're able to execute on?

Marianne: I can’t speak for her.

Steve: Good. Alright. Well we hit a dead end. Let’s see what other questions I have written for you here. If we were to have this conversation again in a year or two years, what kinds of things would you want to talk about that you would have achieved in this organization?

Marianne: I hope that there's less sort of friction. I think we're already starting to see the seeds of it, that, you know, product managers who have been at the company for a long time are saying, oh, I'm doing this new thing, I need user research. They don't know what it is. They don't know why exactly, but they're sort of like, it's a good thing generally. So, I think to have more of that. Honestly, I think some of the things that are a little boring, kind of more infrastructure stuff – like I hate recruiting. I'm a terrible recruiter. I'm terrible with time zones and all those details, that type of stuff. So, I'd love to have a recruiter. I'd love to have another researcher on board. And one of the things that I'd feel is a great success under my belt is that it looks like we're moving away from NPS (Net Promoter Score) – hooray, thank you, thank you – and moving towards satisfaction. You know, asking a satisfaction question and deciding how often we're going to do that. That's a really interesting thing that I've been thinking about: can we pop that up in an app? I mean, this is an app that people use for health, and they might be having a low blood sugar. This is a stressful time that people are using the app. Nobody is excited about managing their diabetes. So, to pop up something in the app, to be like, how are we doing – that's not the right thing. So, I haven't figured out how we might introduce those satisfaction questions and when is the right time to do those things. But at least moving away from net promoter and asking the satisfaction questions, and asking that on a regular cadence so that we can see over time, are we getting better and what the trajectory is – I think that would be a huge thing to have accomplished as well.

Steve: That’s a great note to end it on. So, thank you very much. It was a great conversation. Thank you. Thanks, everybody.

Okay that was the end of the interview on stage, and now here’s the remainder of our conversation. So, welcome back Marianne.

Marianne: Thank you. It’s the bonus track, huh?

Steve: Right, this is the director’s cut version of the podcast. So, one of the questions I wanted to follow-up with you on is you talked about the work you’re doing on the Type 2 and the Type 1 and really trying to drive change in the organization. I’m wondering, sort of tactically or practically, what are ways to bring those experiences that patients have into the organization itself?

Marianne: I think actually one of the most fundamental ways that it started is using our app itself. I don't have diabetes myself, but for the first couple of weeks I was at Glooko, I sat down with our certified diabetes educator and we pretended that I had just been diagnosed. And so, she talked me through what somebody would hear. How they would be introduced to pricking their finger and measuring their blood glucose. And so, I spent 2 weeks and I logged all the food that I ate, all the exercise that I did. I measured my blood glucose twice a day just to get that experience of what it was like, and, yeah, it was pretty crappy, even after a couple of days, to prick your finger. And to imagine that's for the rest of your life. I had the luxury of stopping. And so, for me it was a really small way to understand what it was like for somebody who was newly diagnosed with diabetes. And I also presented that back to the team and I sort of encouraged other people to try. So, we instituted a program. Anybody who starts at Glooko now can get a blood glucose meter, try it out themselves, and see what that experience is like. So, it's one small way to at least start building that empathy.

Steve: Are there others that you want to describe?

Marianne: Yeah, actually, so I wrote a Medium article about different ways to build empathy when working in health tech. So, one of them was experience it yourself. So, actually try the app and try measuring your blood glucose all the time. See for yourself. So, when we did the ethnographic visits, to bring people out so they could actually talk to and see people. And listen to others. So, some people that I work with actually do have diabetes and I wanted to hear their stories. So, we started something called Life with Diabetes Storytelling. Just a couple of folks each time, and they'll either tell their own story of how they were diagnosed and what it was like. Or some people have children who have diabetes, and that whole experience. Or some people have parents who have diabetes. So, it humanizes it even further because these are people that we actually work with. It's one thing to visit somebody's home and never see that stranger again, but these are people that we see all the time, and we think nothing of somebody at the lunch counter, or lunch table, pricking their finger and taking their blood glucose, or pulling up their shirt and dosing insulin right at the table. That's just kind of what happens. And to ask those questions and make it okay to ask, well, when did you tell your wife? And what is it like to do sports and kind of do those things that you used to do? So, it's been a really great way to create empathy and even bond as a team, to hear each other's stories. We put them all up on the website so people who start new can watch the videos themselves and learn about people's stories.

Steve: And what's – the format is videos. Are there other materials? I guess, describe what this looks like?

Marianne: So, I was a little ambitious at first. I was like, we're going to have three people who speak once a month, and I realized the company's not that big and a lot of people actually don't want to talk. And I was even encouraging people – even if you don't have diabetes, you don't know anyone with diabetes, there's lots of this stuff on the Internet. Watch some videos. Read some people's blogs. Bring that in. Haven't had any takers on that yet. So, now what we're doing is we just have two speakers and we do it about every other month. And I just give people the floor. I say you've got 15-20 minutes, tell your story. Some people take less time. Some people take a little bit more time. But all I'm doing is giving them the floor, just creating that space. And it's been wildly popular. Engineers will come. We actually fill the room where we hold it. So, it's just an opportunity for people to tell their story. And I also wanted to make it a little bit special, so the JDRF has these little bears called Rufus the Bear that teach kids who are diagnosed how to do their insulin and all that. So, I'm riffing off that. So, I have these little bears that I give people after they speak, and it says, dear so and so, thanks for sharing a diabetes story. So, it's personalized, and people sit them on their desk so that when you walk around you can see who shared a story. So, I wanted to make it a little special of like, yeah, thanks for sharing, and we know whose stories are out there too.

Steve: That's great. So maybe a different question, following up on some of the things we talked about earlier. We talked about how you learned research and kind of your process through these different environments that you were in. At this point, do you have a superpower?

Marianne: I think my superpower is complexity busting. I think qualitative research gets poo-pooed a little bit, but I've found that being able to suss out those patterns from just a few interviews and turn that into a robust framework – you know, I did a thing when I worked at Google around ads and what the characteristics in ads are, and that framework is still being used. So, some of the things that I'm doing now in terms of persona research – or developing personas – and also the Bingo card framework of the needs of people with diabetes. So, there's a couple of columns, there's a couple of rows, and people have different needs, and you sort of fill that in individually. And those types of things that lead to action, and sort of synthesizing, encapsulating: here's the things you need to know. I've thought about it and I've taken all this chaos and I've put it into something that's usable and actionable that the teams can use. And I think that's one of the things that really makes me happy, when I've figured something out and then turned it into a product and the team is like, aha, we can run with it. This is actually useful for making decisions.

Steve: Right. I think those two pieces are really interesting because I'm with you on finding patterns rapidly and being able to tell a new story about them, but you're also productizing them and building some kind of communication design around them so that other people can understand them – I think that's two superpowers in one, maybe, the way you're describing it. Yeah. Do you have a way of thinking about your own brand, your own identity as someone that – as a researcher, or someone that works in tech, works in these kinds of spaces? How do you think about yourself that way?

Marianne: I think for a long time I thought, I'm sort of like everybody else and we're all sort of doing the same things. And I have an opportunity now – I'm part of this incubator for women leaders and it's having me rethink, what is my brand? And that one needs to have a brand. I think I also maybe started thinking about it when I started having my own consulting practice and I'm like, how do I stand out and what do I offer and how do I package these superpowers, and kind of bring that to the fore? And part of this incubator that I'm in – it just started somewhat recently – but the women who are in it, it's not just people who are in tech. I think a lot of us are in tech, but we have a doctor and we have some lawyers and we have some other people who are doing other things. And so, it's great to be with that different community. And one of the things that we did in our first session was really think about ourselves as a company. We all work for companies. We're all like, what's the vision statement? What's the mission statement? Every company, you've done SWOT analysis. So, we're turning that lens on ourselves. And part of our homework for this month is to figure that out. And I think one of the things that I was thinking about is my mission is really user centered everything. That user centered stuff can be applied to any domain. Having worked in finance and wildland firefighting and consumer stuff and with creative professionals, I've seen it there, but that's all been within tech. Really, though, it can be applied to things outside of tech. It can be applied to parenting. It can be applied to government. It can be applied to just a lot of different things. In fact, I joined a gym recently. It's called 9Round Kickboxing. And I realized how user centered they are. So, I've got to riff on this for a little bit. Some insight somebody had along the way was people don't want to spend a lot of time at the gym. They want to go at any time they want to. They don't want to wait for a particular class and they like the special attention from a trainer. So, what does 9Round offer? You can literally go at any time. There's 9 rounds that you do in order, but you can start as soon as you get there. Each round is 3 minutes. So, you're done in 30 minutes and it's one of these high intensity workouts. And there's a trainer there who can help you and spar with you at a particular moment. I'm like, this is a fabulous example of being user centered and really understanding what your customers need. And I signed up for the year. I'm like, done. This is exactly – I'm your target user. You've figured out my needs and this is fantastic. So, thinking a lot more about human centered everything and how we can apply that to domains and just everyday life, because when things are human centered it's more humane.

Steve: Is there a source or a reference or an inspiration for that human label, kind of in what you’re looking at?

Marianne: I think it came from, you know, having worked in a lot of different domains. I think what really crystalized it for me was I read a book called More Human by Steve Hilton. He's married to a Silicon Valley exec, but he was in the UK and then he moved here, started hanging out at Stanford, and came upon design thinking, and he's like, aha, all these things I've been doing and thinking about policy and applying it to business. So, he really crystalized it and I'm like, yeah, I believe that. What if our businesses were more focused that way, and what if the policy of how we get rid of homelessness was actually prototyped for us, rather than just spending tons of money and doing a particular project for 5 years? So, I think he really captured that human centered everything for me.

Steve: So, as you think about that as your brand, how are you going to operationalize that brand – to use a terrible phrase. But how does this go beyond a concept for you? What do you think you’re going to be doing?

Marianne: So, I think it's something that I've been doing in terms of mentoring, like working with entrepreneurs. When I left Google, I was exploring two paths. One was more mission oriented something or other, and the other was the entrepreneur side of everything. So, right now my full-time job is much more in that mission driven aspect of it, but I've also been doing mentoring for entrepreneurs. I'm mentoring a woman in Venture for America right now. I've done office hours for a bunch of things. I've taught at the Nasdaq Entrepreneurial Center. And so, I think those are ways where I can find different ways to apply it – how do you find intrapreneurs and entrepreneurs and give them the tools to be able to ask questions and go out and talk to users, and apply it in a lot of different ways? That is a way to make that happen. And so, I think that for me, whether it's my full-time job or not, the balance of those things may change in the future. Maybe I'll find a role that lets me do more of that working with entrepreneurs and helping them develop that, and maybe less in the mission-oriented stuff, or maybe I'll be able to find ways to combine the two. But I think for me, in the next phase, it's how do I find the overlap, or find the balance of those two things?

Steve: You know, sort of the thesis of this podcast – I don't know about thesis, but the people we want to have as guests are user research types, and I'm wondering, as you describe this human centered everything, and these mentorship roles, are you being a researcher, as you would define it, in doing that work? We have labels, and so I want to ask you to unpack the labels and see what your framing is on this work you're exploring vs. maybe the work that connects you and I and why we're having this conversation.

Marianne: Yeah, that's a really interesting question. I think for me, once I get sort of hot under the collar about something, everything else goes out the window. You know, I was talking earlier about once I found Glooko, I'm like, wow, they're working on a really big, juicy problem. This is something I wanted to do – who cares what the actual label is, or what the title is. So, for me, it really bothers me when people – smart people – go out and build things and spend a lot of time and energy to build things that are not for humans. And I'm like, ugh, why did they do that? So, I think for me it's more about empowering people who have that energy and who have that entrepreneurial spirit to make the things that are right. So, not just make stuff, but actually make the right thing. So, for me it's a set of skills that I think maybe everybody should have, and maybe once everybody has those skills and can do it well, maybe the role of researcher doesn't need to exist. But until then, I feel like it's my duty to go out and spread the gospel, as it were, of this is how you talk to users. This is the type of information you can get. These are the types of things that you can't get, and the limitations of the things that we can do as well.

Steve: So, it's a really articulate, lovely reminder of why we do research. It's what we're trying to have happen, and you're looking at achieving that goal through research, or through mentorship, or through advocacy, and all the things that you're doing are putting stuff out in the world that helps people – now I'm putting a lot of words in your mouth, but there's kind of an ideal about producing things that you are a champion of. So, as you explain it, I can see research as a part of that, but not the only part.

Marianne: I think you summed that up really nicely. So, thank you for making me sound more articulate.

Steve: That’s the reflecting back technique. Okay, so now maybe wrapping up our epilogue, anything to add in this? Anything I should have asked you about?

Marianne: I was just thinking of another user centered everything example. I'm doing some volunteering and working with a foster child, and I've been able to turn that into a user centered thing too. Because she doesn't live with me – I just see her for a few hours each week – I think very intently about the activities I'm going to do, and being a crafty person also, I have spent a lot of time on YouTube videos and Pinterest and whatnot, making activities where I'm like, well, we've got to work on fine motor skills. We've got to work on counting or matching and stuff like that. And I'm having such a great time designing these little activities for her and also seeing how she responds, and like, oh, she doesn't like those types of things. Or, she needs more of these types of things. So, I feel like I just naturally bring it to all aspects of my life and I see the power that it has, and I just feel like I need to be a champion for those things, because I see the value of it and I just want the world to know.

Steve: That’s a lovely note to wrap up on. So, thank you so much, Marianne.

Marianne: Thank you so much.

Steve: Well! That wraps up this episode of Dollars to Donuts. Go to portigal.com/podcast for the transcript as well as links for this episode. You can follow us on Twitter, and subscribe to the podcast at portigal.com, or iTunes, or Spotify, or Stitcher, or anyplace excellent podcasts are distributed. My books are available at Amazon and rosenfeldmedia.com. The amazing theme music was written and performed by Bruce Todd.


Rank #9: 22. Vicki Tollemache of Grubhub


In this episode of Dollars to Donuts I speak with Vicki Tollemache, the Director of UX Research at Grubhub. We discuss how to manage incoming research requests, running a weekly research session for testing designs, and why candidates should come into job interviews with a point of view about the company’s product.

To me, researchers are educators. They’re there to translate and educate the organization that they work with about who their users are, what they’re experiencing, where their pain points are, what they care about, what their motivations are. There’s a number of ways you could communicate that and you can educate. Experience is probably one of the best, but due to time constraints, not everyone can come into the field with us and experience that. If you just rely on reports and communicating from the researchers there’s something that’s left out in the details. There’s a richness that’s not there that I think even researchers realize. – Vicki Tollemache

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Thanks for joining me on Dollars to Donuts, the podcast where I talk with people who lead user research in their organization.

I’ve mentioned a public workshop happening in San Francisco; it looks like that has fallen through but I will be speaking at the Mind the Product Conference in San Francisco next month. I’m also doing in-house training workshops so let’s talk if that’s something your team might want to pursue.

In my consulting practice I have the opportunity to work with different organizations with varying levels of investment in research, varying levels of maturity in their research and product practices, and so on. I started this podcast as an extension of that, as a way to highlight the emergent practice of user research leadership. So supporting me and my business is the best way for you to support this podcast. My consulting work informs the podcast. It also pays for this podcast. If you have thoughts about Dollars to Donuts, email me at DONUTS AT PORTIGAL DOT COM or write me on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

In 1961 the new chairman of the FCC gave his first speech, addressing the National Association of Broadcasters. He saw the potential of television, but complained that watching the programming currently available revealed a “vast wasteland.” That phrase echoed down the decades. In 1992, cable TV executive John Malone coined the phrase “the 500 channel universe.” While that might have signified opportunity to the industry, to the viewing public it felt like the vastness was simply increasing in scope. Yet somehow we made it to today, where 500 channels seems almost quaint, and critics herald the Golden Age of Television, also known as Peak TV. Well, we purchased a Roku TV set recently. Roku basically provides an operating system for the television where you can add apps, just like you’d add apps to a phone. So we added all the usual suspects like Netflix, YouTube, Hulu, NBC, Vimeo, Amazon, and some others that we discovered like Pluto, which is a free streaming channel with a ton of programming. But what I didn’t realize was that Roku is a somewhat open platform that allows interested parties to add content. Looking into how to do that, it doesn’t seem that much harder than putting out a podcast. And while the choice isn’t as overwhelmingly broad as with podcasts, there are an astonishing number of Roku channels. I found a blog that every week or two updates with the latest channels – not all of them, just the ones that they’ve reviewed! Here are some of their latest reviews.

Funny TV Network – A collection of funny clips from Family Feud hosted by Steve Harvey
Know Your Tools – Tool reviews, safety tips, recommendations and innovations
The Home Depot Channel – Tips and tutorials for home remodeling, home maintenance, and tool use
Louisiana Cajun Recipes – A Cajun cooking show hosted by a self-taught cook
Smoky Ribs BBQ – Essential grilling recipes
Stories of the Century – A 1950s Western TV series about a railroad detective who roams the west, tracking down outlaws and bandits who are preying on the railroad
SOS Coast Guard – A 12-chapter 1937 film serial starring Bela Lugosi and Ralph Byrd
Sci-Fi Zone – 19 vintage Sci-Fi movies from the 1940s and 50s
DramaTV – Vintage public domain dramas from the 1940s and 50s
Scary TV – 20 vintage horror movies from the 1930s, 40s and 50s
Shemaroo Yoga – Yoga tutorials from Anushka Malao
Aircraft Channel – Aircraft accidents and near accidents reconstructed with analysis of what actually happened
Rockwell Off Road – Mud-bogging and proving grounds videos

It’s not particularly different from what’s already online on some platforms, but I was surprised at how low the bar for “television” has gotten. I certainly expect a lot of crap from TV in general, but this is barely curated Internet detritus that mingles with TV channels from established media players. I’m not saying all this content is necessarily garbage, or isn’t of interest to some people, perhaps many people, but my mental model for changing channels on a television set – even if I can’t pick up anything over the air where I live and haven’t had cable for many years – is still an entrenched mental model, so to find that this new television lets me watch NBC and the Opossum Saga at the same navigation level in the menu is just surprising. I think they’ve got some way of streamlining the experience so searching on the platform will more likely reveal big brands that have more traffic or that perhaps have paid slotting fees. You won’t come across The Lawnmower Channel unless you know to look for it, I think. And just like I’ve mentioned in previous episodes, these shifts in mental models – in how producers expect something to be used versus how consumers expect something to be used – are fantastic things to explore in user research, especially as systems grow in complexity and scale, like this 5 zillion channel rokuverse.

Well, I think it is time to get to the interview. I spoke with Vicki Tollemache who is the Director of UX Research at Grubhub. All right, well, thank you for being on the podcast.

Vicki Tollemache: Thank you for having me.

Steve: Why don’t you start by introducing yourself? Who are you? What do you do?

Vicki: I am Victoria Tollemache, but everyone calls me Vicki. I am the Director of UX Research here at Grubhub. Essentially, I manage a team of researchers across our ecosystem and my job is really to make sure that we are working in a way, doing research that’s going to impact our organization. So, it’s a lot of strategy. I’m working with product leads and our design VP to make sure that we are positioned correctly and chasing the right questions.

Steve: Hmm, chasing the right questions. What does that mean?

Vicki: That means I mean – my research team – I think generally research teams are much smaller than the design organizations and product organizations they’re supporting. We have a million questions coming at us of all shapes and sizes and it’s determining which questions make the most sense for where we are as a business and which questions, if we get answers to, will the business be able to respond to and we’ll be able to have impact? So, making sure that we’re positioning in the right space.

Steve: Right. You mentioned impact as well right off. So, what does impact look like?

Vicki: Impact to me is, in the scheme of things maybe it’s identifying needs within our users ecosystem that we could create solutions for, to better solve for pain points they’re experiencing. Or it is – maybe we’re creating a new experience and we’re not quite certain if what we’ve created matches the user’s mental mindset. So, doing research at that point that our product and design teams can respond to and make changes to.

Steve: Does that fall under the label of evaluation? It feels like there’s something else to it.

Vicki: It can be. What do you mean?

Steve: I mean maybe you’re just describing the way I’d like to see evaluation done where it’s not just thumbs up/thumbs down on…

Vicki: Oh no, no, it’s iterating right through the process and we’re working with design to be like what can we change about it to make it a better experience? And then potentially testing again.

Steve: And you’re framing it around mental mindset which is sort of the underpinnings of a concept.

Vicki: Sure, absolutely.

Steve: Not the implementation of a concept. I think that’s why evaluative to me is like – evaluative is maybe about the details of the design?

Vicki: Sure.

Steve: And so I feel like maybe there’s another word that describes what you’re doing which is looking at the value proposition or the construct, or the mental mindset.

Vicki: Sometimes we call it an experience audit here, but I feel like that’s a term we’ve made up ourselves. I don’t know if I’ve ever heard that in the field or in the wild. Because we were trying to describe some of the things that we’re doing and we’re like are we auditing the experience?

Steve: Not that you asked my opinion, but audit seems more passive than what you’re talking about. To get at that gap between the mental mindset of a consumer of something and a producer of something – you have to extract that. You have to synthesize that.

Vicki: Sure. I think I’m speaking to two different types of research, right. One is when we’re going out in to the field and trying to understand that environment that our users live in, and especially where they meet, because we tend to find that that’s where a lot of the friction points are. And then working with design and product to be like how do we solve for some of these problems that we’re finding? Are these problems that we knew existed even? And then once we’ve gotten to that place where we’ve come up with solutions, then pairing with our users to determine have we solved for this in a way that actually makes sense for the users?

Steve: When you say “where they meet” in that first part?

Vicki: So, I mean Grubhub is an ecosystem, right? We have what we consider four distinct types of users. We have drivers. We have our restaurant partners. We have our diners – these are the people who are consumer facing. They’re ordering from us. And then we have internal employees. I mean the easiest, most basic way to explain that is we have an internal care provider that provides help to all our different users and those users don’t exist by themselves. They’re constantly interacting with each other, right.

Steve: Yeah.

Vicki: And usually they’re interacting with each other when there’s problems. And so either we can help them solve those problems, or sometimes we can maybe be a hindrance in how they solve those problems and they have to try to work around us. So, those are sometimes some of the biggest opportunities for us.

Steve: Okay. So, where there’s interactions or intersections between those different…

Vicki: Types of users.

Steve: In the ecosystem.

Vicki: Yeah.

Steve: Is opportunity.

Vicki: Opportunity, yeah, for us to – sometimes it’s about providing them more independence and building trust, right. And then there’s understanding why is there trust breakdown and why do they feel like they don’t have autonomy to kind of control the – the situation that they’re in.

Steve: Maybe we could step back.

Vicki: Sure, we did kind of jump right into it.

Steve: That’s me, not you. So, what is Grubhub? It’s an ecosystem of these things, but what is – how do they combine?

Vicki: I think Grubhub considers themselves a marketplace that is providing restaurants the opportunity to compete for consumers in regards to ordering food for delivery or pickup. So, that’s what we provide.

Steve: In what parts of the world are you running services?

Vicki: The U.S. and then we have a little bit of presence in London, but that’s generally on the corporate aspect of it, right. So, we also have a corporate platform where companies can partner with us and provide their employees a credit and then the employees kind of – it keeps them at work if they can order lunch at work and kind of eat through a meeting kind of thing. So, we have a little bit of presence in the UK with that. But otherwise, I think we’re in 300 markets in the U.S.

Steve: Is that a lot?

Vicki: I mean I think we have a very good presence in the United States. I think if you went into any major city, or even minor city, we would be there, right. So, we’ll be in Little Rock, Arkansas. We might not be in Jasper, Texas. And I’m using references from the South because I am from the South, so it’s easier references.

Steve: Okay. Alright. That’s good context. So, let’s – so, you’re sort of describing some of the things that you’re working on. And you made a point that research teams are often smaller than design teams.

Vicki: Yeah, product and design teams, right.

Steve: Yeah. And that’s the case here.

Vicki: That is the case here. I think – there are 7 of us currently, including myself. We have an open position, so that will be an 8th person. And then we also swell in the summertime because we usually bring in one intern. And last year we were fortunate enough to hire an intern, so it’s a good opportunity for interns as well.

Steve: What does a user research intern do at Grubhub?

Vicki: Well, ever since I’ve been here I’ve always wanted to win best internship. That’s always my goal. I want that intern to get as much experience as possible. Usually they come in and they tell me they would feel more comfortable getting moderating experience, being able to run projects on their own. Last year, it felt like the easiest place for the intern to start was doing smaller research questions, so more evaluative research. So, they took over – I think we called it Research Day – and essentially ran that for the entire summer. But then we also took them out into the field with us for one project. So, they got to go on a trip and experience what it’s like to be within the restaurants, shadowing the drivers, us going into the diner’s home and learning about food. So, they’re owning one thing, which is kind of nice, right, that they get to own something. They’re getting experience moderating. They’re putting projects together. They’re working with product. They’re working with design. And because Research Day goes across the entire ecosystem they’re getting exposed to everything. So, it’s a really good place for them to start, and then they will get to partner with some of the researchers, kind of like a mentor/mentee, and go out into the field on some of our bigger research projects.

Steve: What is Research Day?

Vicki: Well, we called it – at first it started as Diner Day. It’s just a standing day, once a week, where users come into the lab. The week before, the designers can pitch ideas they have. We go through and, based off how thought out and fleshed out the ideas are, and their priority against our organization’s needs, decide what we are going to be testing. Do we have access to it? Can we get the right users in? So, we go through and we kind of have a pitch day. We choose the idea and then 5 days later we test that idea in a lab. So, it’s supposed to just be really, really quick – a 24 hour turnaround for findings. Just a bulleted list of findings that the designers can then go take action off of.

Steve: And that runs?

Vicki: Every week. It’s a really frequent cadence. I found in past organizations sometimes people can push off doing research because they say it takes too long. It takes too long to go through scoping. It takes too long finding people to recruit. So, I felt like if we had this standing day set up for diners – just bringing 5 diners in once a week – it meant that designers could kind of come in last minute sometimes and we can get research done for them. And they can also experience that and watch that because we’re in a lab based situation. Or we can also go on the streets of New York. But in the lab it’s kind of nice because we can also do remote interviews and reach out to people outside of New York, because New York is kind of an outlier compared to the rest of the country in regards to what our platform looks like. Also, people here are very experienced ordering food from Grubhub or Seamless, and sometimes we’re looking for a split of maybe less professional food orderers.

Steve: Seamless is another platform.

Vicki: Seamless and Grubhub are the same thing, just different brand. Eat24 is part of our brand as well. But they’re all the same. There’s just a different brand logo in the top left hand corner.

Steve: When you do Diner Day or Research Day, so a tactical question, you don’t have a lot of lead time.

Vicki: Um-mm. It’s quick and dirty.

Steve: So do you already know kind of what people are coming in before the idea is pitched? What’s the dependency there?

Vicki: Generally it’s a mix of diners, right. It’s starting to get harder and harder to find truly new diners to our platform, but it’s a mix of new or returning diners and we can sometimes switch what that ratio is within the 5 days we have from doing the pitch to actually doing the recruiting. So, we just know we’re going to bring 5 diners in. Some of them are going to be new to us, never ordered from us. A percentage of them are going to be returning to us, or we may shift that where it’s all returning based on what the project is.

Steve: So, you have kind of infrastructure in place such that you can quickly change.

Vicki: Yeah. That’s the nice thing about food, right. I mean I worked – I came from AT&T before this and I worked on the B2B side and we had to recruit network engineers and it was really difficult. It’s expensive. They don’t have much time. It’s hard to get them in. But so many people order food. It’s just an easier recruit, right. So, within 5 days we can get those people. We used a platform called userinterviews.com. Have you heard of it?

Steve: Yeah. I haven’t used it, but I’ve heard of it.

Vicki: It makes it very easy. And they’ll recruit anywhere for us in the U.S., so it’s super nice and they turn it around very, very fast.

Steve: So, once you understand what that flow is going to be…

Vicki: We have that relationship and they know that we’re doing that on certain days, so it’s just like an ongoing recruit that we have going.

Steve: Okay. Just coming back to one of the things you said, you mentioned a couple of times kind of aligning the research activities you do, whether it’s at that level or other things, with the business questions, the goals of the organization. And I think maybe you mentioned this, but maybe you could talk a little bit about how you come to an understanding of what those goals and questions are for the organization so that you can choose the research activities that support that. How is that input for you?

Vicki: I think when I first came – I mean, I’ll be honest about this too. When I first came in it seemed like – and we still have this to a certain extent, and I think a lot of companies deal with this – ongoing reprioritization of everything. We begin to work on something and then it will be like, no, our priorities have changed. So, we had a lot of stop/start projects. And at the time that didn’t make sense to me, so I wanted our research team to prioritize things based off what we knew our users’ needs were and identifying those. And then we had a little bit of leadership change and we had a new CPO come in, Sam Hall – he’s from ClassPass, he’s great. And I work pretty closely with him to understand what the business priorities are. And I work pretty closely with our design VP to understand some of the design challenges that we’re facing as well. And even for something like Research Day we have an ongoing backlog of research questions that’s coming from the designers and/or product, but there are some weeks where we don’t have anyone come to pitch, and then my team has questions that we will go answer. So, we don’t not do a Research Day because product and design aren’t asking us questions – we have questions as well, based off things we’ve seen in the field, and some of those can actually be themes that come up later. So, in the end, my answer is: it varies. Working closely with product helps us from a business perspective. Working closely with our design leadership helps me because they have questions that they are also separately going after, based off some of the design questions that they have. And then I just work to prioritize those. We do a scoring. Does that answer the question?

Steve: A scoring?

Vicki: Yeah. So, certain people get a weighted score, and in the end it’s also kind of what my team thinks is going to be the most valuable for us, and also where we’ll have impact. I mean, I’ve worked at organizations where every year they set a goal for the research team to do more research, like more projects. And that doesn’t make any sense to me. It should be about doing projects that actually impact the organization, right, that people actually make changes from. You can do a million research projects; if no one does anything with that research it’s like yelling into the abyss.

Steve: So, if I come to you and say I have a question about how people go through this flow in the app, how would you score that?

Vicki: Who are you? Are you a designer? Are you within the leadership team? And then what is that flow? Like what part of the experience is that? Is that something that we have as an initiative that we’re heavily focused on that we’re invested in as a business to like make a better experience? So, I’d have to ask some of those questions and then I’d also look at do we already have previous research that might be able to answer some of your questions? So, there’s a number of factors.

Steve: And sorry if I’m getting like obsessively detailed, but like are you – when you say score to me that means there’s a bunch of questions and you kind of apply a number to those and total them.

Vicki: Sure. Yeah. And it does work like that. It depends on who’s asking for it? Where we are in the product lifecycle? Like if we do this research for you will you be able to make changes from it within this release? Or are you telling us you might be able to make changes six months from now. So, we’re asking – do we have impact? Who is asking for this? How does it align to our business priorities? How does it align to our research and design priorities because we’re part of the design organization? And then there are certain scores that go with each of those answers.

Steve: Okay. I think I’ve heard those questions as kind of an intake or prioritization approach, but I haven’t heard it labeled as scoring which I think makes so much sense when you say it.

Vicki: And then I usually – because in the end our design organization is within our product organization and I usually also work fairly closely with our CPO to be like this is what I’m getting from the design organization. This is what I’m getting from your product leads. This is how we’re prioritizing. Just to make sure. Like it’s a gut check, like does this seem right?
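To make the idea of a scored intake a bit more concrete, here is a minimal sketch of what a weighted rubric like the one Vicki describes could look like in code. The criteria, weights, field names, and example requests below are hypothetical stand-ins based on the factors she mentions (who is asking, business priority, ability to act on findings, existing research), not Grubhub’s actual rubric.

```python
from dataclasses import dataclass

# Hypothetical weights; in practice a team would tune these to its own priorities.
WEIGHTS = {
    "business_priority": 3,    # does it map to a current initiative?
    "actionability": 3,        # can the team act on findings in this release?
    "requester_seniority": 1,  # designer, product lead, leadership, etc.
    "existing_research": 2,    # higher = nothing already on file that answers it
}

@dataclass
class ResearchRequest:
    name: str
    business_priority: int    # each criterion rated 0-5 at intake
    actionability: int
    requester_seniority: int
    existing_research: int

    def score(self) -> int:
        """Weighted total used to rank requests in the backlog."""
        return sum(WEIGHTS[k] * getattr(self, k) for k in WEIGHTS)

# Example intake: rank two hypothetical requests by total score.
requests = [
    ResearchRequest("checkout flow friction", 5, 4, 3, 4),
    ResearchRequest("icon color preference", 1, 5, 2, 2),
]
for r in sorted(requests, key=ResearchRequest.score, reverse=True):
    print(f"{r.name}: {r.score()}")
```

The point isn’t the specific numbers; it’s that each intake question contributes a weighted amount to a total, the backlog gets ranked by that total, and then the team applies its own gut check, which is how Vicki describes the process ending.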

Steve: So, I mean, there’s the example where if nothing comes in for Research Day you have your own things.

Vicki: Yup.

Steve: Are there situations where you’re proposing research to these kind of leaders that you’re working with?

Vicki: Absolutely. Last summer we had a new VP of design come in and one of the things he really wanted to push is the quality of our product, right. And then the question came to me: how do we measure quality? So, there’s a number of ways that we could go about measuring quality. I work with the data science team to go after certain metrics, or to understand how we might be able to measure that within some of our data. But then from my background I was like, we could also do benchmarking. That’s something we haven’t done before, and we rely really heavily on A/B testing, which sometimes means building things very piecemeal. So, I proposed that we go through and do a twice a year benchmark, after some major releases, to measure what this experience is for users in our current product. And then if you look at the definition of quality, you can only truly measure quality by comparing it to something of greater or lesser quality. So, the suggestion was, where we can, can we also do a comparative where we’re benchmarking against one of our competitors? And they were completely on board with that. And they allowed us to go to an outside vendor and go through the benchmarking process.

Steve: I thought you were going to say you’re benchmarking against yourself over time.

Vicki: You can do that – we’re doing that as well, but also against our competitors. When we can. We can’t do that obviously on the business side, but we can do that on the diner’s side.

Steve: You can’t do it in the business side because you can’t get access…

Vicki: We don’t have access.

Steve: …access to that part of it.

Vicki: No.

Steve: Okay. I see.

Vicki: And that’s always one of the things – I’ve heard of researchers who will ask – generally our restaurant partners or driver partners, they’re also using our competitors, but from an ethical perspective I don’t like to ask those types of users to share that information. I just don’t think it’s appropriate.

Steve: How do you benchmark – you know when you’re doing competitive benchmarking, how do you compare apples to apples?

Vicki: Our product teams are broken up into initiatives. On the diner side we have two initiatives. One of them is more about the core product, right. And the benchmark is really about whether our core product is usable. Like, where are there friction points? So, I worked with our product lead and our designers to identify what we consider core tasks. And those are based off what the common user flows, user journeys, through the app are. But it also was based off where we also know that we see CPO, so care calls, taking place, because we are always trying to lower those, right. So, from that we mapped out what our tasks were, and then we took one of our competitors, who essentially is going to have the same, similar tasks, and we did a benchmark against those tasks for that competitor. So, you know – task timing, success, ease of use. We also created this quality metric which is kind of based off of reliability. And then also just qualitative feedback, right. But there’s other things as well. There are things sometimes we hear from our users that are taking place that maybe we weren’t aware of, and we definitely will go to product and design and be like, hey, we think that we need to do a deeper dive here. We’re hearing a number of things. There’s friction taking place that we weren’t aware of. Maybe we’ve made changes somewhere – and that’s the thing with an ecosystem, right. You might think you’ve made an improvement in one space and you’ve actually created two problems somewhere else, and we’re definitely given the freedom to then go explore that and suggest solutions for those additional problems. Or maybe roll back the first solution to begin with.
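For readers wondering what the apples-to-apples comparison might look like once the sessions are run, here is a minimal sketch of summarizing benchmark results per task and per product. The task name, the rating scales, and all of the numbers are invented for illustration; Vicki doesn’t describe the actual instrument or data.

```python
import statistics

# Hypothetical per-participant results for one benchmark task, captured for
# "ours" and one competitor: binary task success, time on task in seconds,
# and a 1-7 ease-of-use rating.
results = {
    "reorder a recent meal": {
        "ours":       {"success": [1, 1, 1, 0, 1], "time_s": [42, 38, 55, 90, 47], "ease": [6, 7, 5, 3, 6]},
        "competitor": {"success": [1, 1, 1, 1, 0], "time_s": [51, 44, 60, 39, 95], "ease": [6, 6, 7, 6, 2]},
    },
}

def summarize(task: str, label: str) -> str:
    """One-line summary of success rate, median time, and mean ease for a product."""
    r = results[task][label]
    success_rate = 100 * sum(r["success"]) / len(r["success"])
    median_time = statistics.median(r["time_s"])
    mean_ease = statistics.mean(r["ease"])
    return f"{label:>10}: {success_rate:.0f}% success, {median_time:.0f}s median, ease {mean_ease:.1f}/7"

for task in results:
    print(task)
    for label in ("ours", "competitor"):
        print(summarize(task, label))
```

Running the same tasks and the same summary for your own product and a competitor, release over release, is what makes the benchmark comparable both across products and over time.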

Steve: You brought up solution as part of the research process…

Vicki: Yeah.

Steve: … a couple of times and I wonder if you could just say more about what’s a researcher’s role in – what is…

Vicki: A researcher’s role in solutioning?

Steve: Yes.

Vicki: Um, well, at Grubhub, again because we’re a small team, I think in some ways I wish that we could be more hands on than we are. Maybe we’ll identify a need and our product team has also had a hypothesis that that’s a need as well, so they’ll be like, how do we solve for this thing that we’ve identified? It’s an opportunity for the business. We see that it’s an unfulfilled need for users. Perhaps we’ll define what the problem is and put together a workshop. A researcher will usually be involved in that workshop, but sometimes they can’t be, so then maybe one of us comes in as a speaker, to see the work and solutioning that they do and comment on it or consult on it. But it usually begins there and then we’ll go through rounds of iterating on the design as we test it. And a lot of that will take place in Research Day. Does this make sense?

Steve: It makes sense. Maybe you have thoughts about sort of the philosophy.

Vicki: I think my team would like to be more hands on with what takes place in those workshops, but we just have a limitation of time.

Steve: Yeah. Why do you think that’s something that they want? If you look at researchers around the world, what – I feel like some are very interested in the product, the details, the solutions.

Vicki: Absolutely, right.

Steve: Some are – some maybe don’t see that as their role.

Vicki: I think there’s always, just from a human perspective, a desire to control. So, I mean, from that basic level it makes sense to me that people want to be more involved in solutioning, but I also think – you know, researchers, they’re out in the field. They’re witnessing that. Maybe they write a report. Maybe they communicate their findings, but sometimes those have to be fairly thematic, high-level findings, and there’s a lot in the details that can get left out of reporting, and I think they feel like they provide value by being in those solutioning sessions because they can speak to some of that nuance that maybe wasn’t communicated within a report, right.

Steve: Yeah. Well even in the example you gave where there’s just no bandwidth, that maybe you guys are working in a consulting role.

Vicki: Yeah. I mean when you have a smaller team, unfortunately it becomes that way, right.

Steve: And whether you’re consulting, or in those sessions, you’re still there to – it sounds to me – so, tease apart – I think what you mean by solutioning is a collaboration where solutions are generated, but it’s not that researchers are there to do design.

Vicki: No, yeah.

Steve: But they’re there to bring that extra detail to inform those design decisions.

Vicki: Yeah, I mean – I think to me researchers are educators, right. Like they’re kind of there to translate and educate the organization that they work with about who their users are, what they’re experiencing, where their pain points are, what they care about, what their motivations are. And there’s a number of ways you could communicate that and you can educate. Experience is probably one of the best, but due to time constraints of everybody, not everyone can come into the field with us and experience that and I think there’s something to be said about – outside of not having the experience yourself with users in the field, if you just rely on reports and communicating from the researchers you kind of – there’s something that’s left out in the details, right. There’s like a richness that’s not there that I think even researchers realize. Like if my presence is there I can speak to some of those one-off situations that maybe we didn’t cover in the report, but might get brought up within solutioning. Does that make sense?

Steve: Yeah, yeah. I think we’re sort of unpacking what solutioning looks like. That it’s not – that there’s a dialogue and a bunch of different perspectives that can come forward and that the researcher isn’t necessarily there to say oh I think we should start the task here instead of over here.

Vicki: Yeah. Yeah. It’s more about bringing that richness of like, if we start the task here this might actually happen to some of the users. Whereas if we started it here maybe that won’t happen, if that’s a good or bad thing. Does that make sense?

Steve: Yes.

Vicki: Just that deeper understanding. I really do believe that researchers tend to just be educators.

Steve: That’s a good metaphor, isn’t it?

Vicki: Yeah. It’s what it feels like to me. I remember – well I was reading this book recently, John le Carré – do you know who he is? He wrote Tinker, Tailor, Soldier, Spy.

Steve: Um-hmm.

Vicki: And he was talking about the role of a war journalist and it’s really not about reporting on just what’s taking place in the field, but it’s also about like building empathy and trying to encourage people to care. I kind of feel like in some ways we’re kind of like war reporters, based off his description. We’re out in the trenches.

Steve: Right. You want to bring it back in a way – I mean it’s back to your impact point early on.

Vicki: But there is a lot left out when you communicate through reports and findings, right. Sometimes you’ll have a user who’s extremely insightful, but do you report that one thing, or is that like a one-off? But then you’re in a solutioning session and you’re like, oh, well it’s actually useful here. This is where I should share it.

Steve: So, yeah, if we took away the constraint of limited resources or just the bandwidth that the folks on the team have, what are the ways that they could be educating?

Vicki: Well – I mean, that’s the thing – it’s not just taking constraints off my team. Realistically, I wish that product team members and designers could come into the field more and have that experience more, right. I think by being better versed in what that experience is like within different markets, within different types of restaurants, with different types of users, they have a better base to make decisions from. They’re more informed. They have a deeper understanding. They understand what some of the more nuanced situations might be and therefore they can make informed decisions. But unfortunately, with meeting schedules, everyone is strapped for time, right. Sometimes the reality is they just can’t come out into the field.

We have another program here at Grubhub called Parts Unknown. I’ve branded all of our research programs – it’s not all within programs, but Parts Unknown was inspired by Anthony Bourdain, obviously; this was before, unfortunately, he committed suicide. It’s about bringing people out into different markets and taking a deep look at what our ecosystem looks like outside of New York and Chicago, because we are kind of outliers, and understanding what food culture is in these cities and how people think about food and what types of food they order and how that impacts our ecosystem and what our ecosystem looks like there. We were able to bring – and we still do this – product people and designers into the field, and I feel like that has been some of the more enriching work. Those people come out much, much more informed and I think they’re able to make better decisions as a result of it.

Steve: That sounds like it’s kind of an immersion experience.

Vicki: Yeah, yeah. And sometimes we were partnering with food bloggers too to have a night with them, just to understand their take on the market. It was pretty interesting. So, we still do that, but we’ve also kind of evolved that program as well. It’s not just about looking at different markets. It’s also about looking at different parts of an experience that maybe we haven’t looked at in a while and trying to determine: we think it’s this, but is it actually this for our users?

Steve: Right. Is this a category where you look at the different people in your ecosystem – is there – how rapidly do these things evolve?

Vicki: What do you mean?

Steve: I mean like how static are the things you know about different people in the ecosystem?

Vicki: Well I think as our industry changes and people are competing against each other within our industry that there are things that are happening that are changing the ecosystem on a pretty regular basis. Does that make sense?

Steve: Yeah. So, hence the need to – if there’s areas of the ecosystem that you haven’t looked at in a while, you have to go back…

Vicki: We have to revisit. Absolutely. And that’s kind of – this is again where I get nervous about whether I should speak to this or not. So, we have just certain areas that we haven’t looked at in a long time. I mean our company is 20 years old and we’ll be like oh this is our workflow, and then we’ll have our CPO be like I want us to take a look at this because I have questions about what this workflow really is. So, we’ll go through and do this thing as an audit. The researchers will sign up as users and go through an audit of that experience, what it’s actually like as a user. And then we can pinpoint like this is what we thought it was, this is what it actually is. We think it takes 10 days. It actually takes 30 days and this is why it takes 30 days. That kind of stuff.

Steve: Yeah. So, if there’s so many different facets to the experience in the ecosystem and there’s change happening.

Vicki: Yeah. And you have – within our – I mean it’s not just our product organization, right. Like we’re working with an ops team, a sales team, we have a marketing team and sometimes different teams are building different parts of a flow and they aren’t coming together to check like does that all make sense. Isn’t it always like the idea of like your product is a reflection of your organization and how well they work together?

Steve: Right. What’s the cliché about don’t ship your org chart. I don’t know who to attribute that to.

Vicki: Yeah, I think that is a cliché, yes.

Steve: Okay. Can we talk about your team a little bit?

Vicki: Sure, what would you like to know?

Steve: What kind of people have found their way to your team?

Vicki: Well I inherited the majority of my team and they kind of have a pretty diverse background. We have a couple of – I think my team is comprised of a designer/comedian, an industrial designer, someone with a sociology background, someone with a psychology background and then two researchers who actually have like human factors degrees. And they come from a variety – most – only one of them, our intern from last summer, this is her first job. The rest of them come from a variety of other industries to get here. And one of our researchers, our associate principal, he and I have worked at like 3 to 4 other jobs together. I’ve known him for a very long time at this point.

Steve: I feel like I would be remiss if I don’t ask about /comedian because that’s just there to be asked about.

Vicki: He does like a whole bunch of live storytelling, but I think it makes him a very good presenter. He kind of has like charisma on the stage. I’ve actually had a – when he presents I have people reach out to me to be like you should hire more comedians. Because it makes him very engaging. He’s very good at telling stories, which is why I mention that he’s a designer/comedian background. But he’s worked in research for like, I think, the last 8 years.

Steve: I’ve definitely come across people in research who have either actual, or sort of – they have actual education in theater or some related field.

Vicki: Well you know communication is such a huge aspect of it and if you have that charisma and ease with yourself I think it just gets the message across better sometimes. People are more engaged and feel maybe less threatened by the message. Does that make sense?

Steve: Yeah.

Vicki: There’s something to it. He definitely knows how to charm an audience. I think it also sometimes makes the research a bit stickier because people are more engaged with presenters like that.

Steve: I think – right, people that find their way into research eventually discover that so much of the work is about working with your colleagues and not…

Vicki: Oh, absolutely. It’s about relationships, right.

Steve: Far less than it is about fieldwork or research…

Vicki: I think that’s anything, right. Like in the end it’s all very relationship based, right. But I think there’s something to it – I know the previous director here had us do an improv workshop, and my sister, as I told you, is a comedian. There is something about comedians: in the end they’re generally introverts who kind of force themselves to be extroverts, which I feel has parallels with research. And then there’s this idea, especially in improv, where instead of saying no to things it’s that yes/and, building off of each other, which also tends to work better with relationships as well. So, maybe it kind of facilitates that. I don’t know. But yeah, in the end it is all relationship based.

Steve: When you talk to people as prospective hires – I don’t know who you hire, but whether it’s the interns or other people, what kind of things are you looking at that kind of strike you as this person might be worth talking to?

Vicki: I definitely want someone who is very analytical. I mean we have – I’ve had candidates come into interview who, when you ask like what do you think of our platform, they just say it’s great. I want someone who has actually maybe taken a look at it from a critical lens and has identified things that they think might be opportunities. Even better if they showed friends of theirs, or family members, and kind of did a small usability test on it and came back with a critique. My team is very collaborative. We get along very well. So, huge egos aren’t a big thing. Like I want people who are going to come in – kind of say everyone brings something to the table, everyone offers something, and I feel like for researchers they learn a lot from working together as well. So, I definitely want someone who is going to fit into that culture and is excited to collaborate with other researchers.

Steve: But how do you look for that in a…

Vicki: An interview process?

Steve: Or whatever – the scouting or any part of that process before you actually are working with somebody?

Vicki: I mean for me, we have a recruiting group, right, who does all the recruiting and they bring in resumes. We talk about kind of what I’m looking for in the resumes. I mean a lot of it I’m sussing out when I’m doing phone interviews for the initial screen, right. And I guess a lot of it is me just trying to get to the core of who they are and how honest or comfortable I feel they are with describing who they are. The intern that we have right now, I keep referring to her, she was probably one of the best interviews I’ve ever done. She just told me a very honest story about a research project she did, that they presented it, that the company rejected the idea that they came up with, and I was like what did you do? And she was like I drank a glass of wine, it was 6 weeks of work. And then she told me, she’s like, but then the next day we went back to the drawing board and we decided to do – and it was just very honest – she exposed vulnerability. She was very much herself. She was funny. She could speak very clearly about herself. And I’m kind of looking for that. When I interview someone I want to feel like I’m having a drink with someone, that they’re comfortable describing who they are, which I also think is a sign of maturity – I mean the interview process is kind of nerve-wracking, right, so you’re coming in maybe with a little bit more of a sense of yourself and a little bit more comfortable about who you are and being honest with who you are. I think a lot of people make the mistake of hiding themselves in an interview and I think you want to be very, very open and honest about who you are because if you aren’t it may not be the right job for you to begin with, right. So, I think I’m kind of looking for that. Just someone who can speak to what they’ve done and who they are in a very clear and honest way. Does that make sense?

Steve: Do you think that’s a research specific way of looking at candidates?

Vicki: That’s human specific.

Steve: That’s just hiring someone you want to hire.

Vicki: Yeah, that’s hiring – but then on top of that I’m looking for, like, I want to hear examples of projects that they’ve done in the past and how they’ve approached research. I definitely like to hear about how they’ve handled failure in the past and what they’ve learned from it. Situations they’ve had to work through – environments they’ve had to work in that were not comfortable for them, and why. I think those are generally the most basic questions anyone asks in an interview, though. But it’s definitely how comfortable and honest and open I feel they are in the interview that is the thing that probably sways me. But also they have to be analytical. I do have an issue when people come in and they haven’t done any research about Grubhub, or looked at our product or had any thoughts on it at all. Because if you’re a researcher you should be asking questions, right, and doing some research.

Steve: I want to pull apart some of what you’re saying.

Vicki: Sure.

Steve: Because you’re saying analytical, but the example you’re giving is them having looked at Grubhub. Those seem like different things to me.

Vicki: Well, I want them to have – I mean when you go – when anyone goes in for an interview, haven’t you done a little bit of research on the company that you’re going to work for? You’re curious about the product, like what is this product? Like what kind of research will I be doing? Let me take a look at it. Are there a huge amount of problems that I see? Is this going to be like a battle, or is this like oh, I can see like some big – I want someone who has like thought through like what am I going to be working on? What seemed to be the problems here? And I want them to speak to like problems that they’ve identified. Does that make sense?

Steve: Yeah.

Vicki: It’s kind of like they should have done a little heuristic, or maybe even shown it to a couple of their friends, and have an idea of, like, these are some of the things I think that you might be dealing with. And even speak to how they might solve some of those problems. I mean we usually also have them go through an exercise where we use a site that is not Grubhub to understand how they look at something, what problems they identify, and then what kind of research solution they would put together for that. But that’s after the phone screen.

Steve: Okay. So, I understand a little more what you’re referring to when you say analytical. There’s a bit there of initiative, but also thinking through something before you kind of get into it.

Vicki: Yeah. And to me, if they can speak to that – because one of the concerns I have is the communication aspect, especially when you’re doing any sort of generative, ethnographic work. I used to have a researcher who called it the big messy – you go into the field and there’s a lot of stuff that you can focus on, so how do they hone in on what’s important, right. And I kind of feel like I get that from talking to them about what they’ve seen within our own product. If I have someone speak to the fact that UberEats changed their app logo color and they didn’t like it, I don’t know if I think that that is necessarily a big important finding. Whereas, if I have someone talk about problems they experienced within our search results and how that might not be conducive to someone looking for food, that makes more sense to me. So, what are they honing in on? What do they consider a finding? And how do they communicate that to me? That speaks a lot to how they might do that for the organization.

Steve: I’ve worked as a consultant my whole career, pretty much, and many years at an agency, and I think – maybe the reason you’re hearing me sort of check in on this approach is that I think in the agency lifetime that I had, and certainly my own practice, there’s kind of a consultancy hubris of coming in and, like, let me tell you about your thing. So, when you say you want your prospective hires to tell you about your thing.

Vicki: I want them to have an opinion, right.

Steve: Yeah.

Vicki: And I want them to be able to communicate that opinion. And I want to understand what that opinion is based on. And if they don’t that’s concerning to me.

Steve: Yeah. So, I think there’s something here about the process more than the substance of their opinion, but the process – because they don’t have access – like I never want to be the person, personally, just to come in and say look I don’t know what strategy decisions you have, but here’s what I think you should do.

Vicki: I’m not wanting them to lead that, but I’m wanting them to have looked at our product and been like here’s some things I think might be weaknesses, right. Like I’ve considered what your offering is. I’ve looked at it and I’ve noticed these things. I mean that’s the least you can do when you go in for an interview, right? Whenever I go in for an interview I review whatever I can access of the product, depending on what part of the business I’m interviewing for, and I make sure like, if someone asks me what do I think of it that I have like feedback based off some sort of heuristic I’ve done and I can tie that back to some sort of best practice of some kind, right. And so I kind of expect that as well from researchers coming in. If they haven’t looked at the product at all they kind of haven’t prepped themselves for the interview.

Steve: Yeah.

Vicki: And then that concerns me. And also researchers should always be asking questions, right. Like you should want to know like who would I be dealing with here? I’ve had companies reach out to me where I’m like oh my God that taxonomy would be crazy. It’s so much, like how do they deal with that, right. Like I can look at that and then I’ll have those questions for like how do you guys handle that. You can tell a lot by the questions people ask as well.

Steve: Right, that’s another aspect. What kind of questions are you looking for researchers to bring you in the hiring process?

Vicki: We definitely want them to ask questions. I’m always shocked at how many people have no questions for me. I mean I expect them to ask about typically how we do research. I know right now we’re hiring for kind of an entry level person. So, a lot of times I’m trying to understand have they approached certain methodologies? How are they like wanting to expand or grow as a researcher themselves? I want to understand from them – like where they want to work, what experience they want to gain? And then also like how we handle research at Grubhub, kind of similar to all the same questions that you asked me at the beginning of this interview. I kind of expect them to be asking those same questions, right.

Steve: Now they’re just going to listen to this.

Vicki: I do think sometimes on Fridays – I call it 30 minutes with like a college grad, because I get a lot of college kids reaching out to me through LinkedIn, asking me how to get jobs. And I’ll talk to them for the first 30 minutes of my Friday and just give them feedback about what I think is a good interview and we’ll go through that. It’s very similar to all the questions that you’re asking right now.

Steve: I’m just the college kid, right!

Vicki: So, from the agency side – I came from an agency background as well. I worked for a company at the beginning of my career called Usability Sciences. They’re based in Dallas. They’ve been around like 30 years. The agency side is always interesting to me because I feel like even though agencies want to come in and be like, this is what you should do, there’s always kind of this fear of overstepping that with the client, right. There’s that kind of thin boundary, and I know from talking to researchers who are just from an agency background, they sometimes feel like they have no voice. Or they can’t push back hard enough because they’re afraid they’ll lose the client. And so, especially if they’ve only had agency experience, one of the things they talk about when they describe the experience they want to get is being able to work closely with product, and scope, and push back when they feel like they are more empowered to do so. And that’s something they lack at an agency.

Steve: What about for you? So what are the key things you’ve seen as a difference when you’ve worked in-house or worked as a consultant?

Vicki: Well I started – I left consultancy in 2013, and I’d started in 2007/2008, and I just saw a change in the industry, right. In 2007/2008 sometimes we’d be working with someone from marketing. They would come in. They would want us to do just the usability test. That same client would come back year after year, same usability test, no changes. Same usability test, no changes. And then as time progressed I would see companies where all of a sudden you had a research director. Or you had a researcher and they were our go-to person. And I could see that people were staffing up inside and that the industry was swaying maybe from agency, or definitely what my consultancy offered, to that offering taking place within the business. So, yeah, I wanted to go see what it was like. Like why are people not making changes and coming back to us year after year? What is the political landscape that they’re in that maybe is leading to something like that? How do you maybe have more control or more say when you’re internal vs. some of the limitations you feel when you’re consulting? And just to be more embedded, right. To get to work closely with designers – I mean we would come in and do maybe 3 weeks with a company. We wouldn’t do a long, long embedded process. So, it was just that understanding of what it is like to be embedded and work with product and design, and what are the limitations they’re facing? Because I could see within the projects I was working on that that was happening, but I didn’t understand why it was happening.

Steve: What was the first job that you got post-agency?

Vicki: JCPenney. That was one of our clients and I jumped over to them. I don’t know if you know anything about JCPenney? They went through a big shift where they hired this guy, Ron Johnson, from Apple. And he decided to kill all of their rewards and sales programs. And if you know anything about the JCPenney customer – that customer shops sales first at JCPenney. And their loyal customers are very much versed in that whole points/rewards program, so by getting rid of that they alienated their core customer base, and the company kind of took a dive that I don’t know if they’ve really come out of. I think it’s also that retail has changed, right, and that was taking place in the midst of that change in retail because of online sales. So, I went there. It was interesting, but the company was also doing massive layoffs, constantly. So, then I went to AT&T after that.

Steve: What were you doing at AT&T?

Vicki: AT&T was a closer commute. At the end, JCPenney was a 2 hour commute for me each way, which is bananas. I was in Dallas at this time. At AT&T, on the business side, essentially they had a portal of portals, and we came into a product organization whose first question to me was, we don’t know who our users are or what they do. So, there it was just starting at the very bottom and trying to help inform product, who were just desperate for information. So, that was a lot of talking with network engineers. And I have mixed feelings about personas, but that was actually one of the best scenarios I’ve seen for why you build personas. Like if someone is like, we don’t know who our users are, maybe you build some personas to explain who your users are. They didn’t have much access to data either. They had little knowledge of who was using the platform and how.

Steve: So they didn’t know anything?

Vicki: Yeah. It’s interesting. There’s like a spectrum of like not knowing anything to like then working in a place where people assume they are the user themselves and that they know everything, right. I mean both come with their own challenges and problems. It’s probably better to be somewhere in the middle.

Steve: And from AT&T?

Vicki: Came here.

Steve: What’s the role that you came into?

Vicki: Manager of the New York team – so I manage the New York team, which is all the diner-facing work. And we had a director that oversaw everything. He left, I guess, about a year ago now. And then they gave me 6 months to progress into his role.

Steve: So, is this your first…

Vicki: Director position?

Steve: I was going to say leadership in general around research?

Vicki: I’ve been a manager since JCPenney. But this is my first director position. I don’t have a manager beneath me, though, so I’m still managing as well, right. Everyone reports directly to me.

Steve: Is that the difference between a manager and a director? Can you tell I don’t have a job inside an organization?

Vicki: I think it’s different for every company, but within our org structure most directors have managers and I do not. So, it’s me and six, soon to be seven, direct reports.

Steve: Can we go back then before this agency – what was it that led you into that kind of work?

Vicki: I started my career in marketing research at a small ad agency in Dallas that handled all of McDonald’s local DMA markets. McDonald’s provides an allotment of money for national coverage, and then, depending on the size of the DMAs, the DMAs are given money to do local advertising, and then sometimes the DMAs themselves fund additional advertising. So, my agency handled all the local advertising for markets.

Steve: Is DMA direct mail?

Vicki: Designated marketing area. And through that I started doing – they started investing in like creating microsites for like ad campaigns and we would track those through Google Analytics. And what – I was in charge of helping tag those and I’d have to come up with like what’s the user’s journey? Where should we tag? What do we consider success? What are we measuring? And then I was at a Christmas party and I met someone and explained to them what I did and he worked at Usability Sciences and was like oh, you should come work for us. It sounds like you’re just doing like heuristics. ‘Cuz oftentimes when I went through tagging I would discover like there’s no clear path how to get from here to here and that’s one of like the core paths that users would try to accomplish on this microsite. And so from a Christmas party I got hired at Usability Sciences. Yeah, but it was just – I went to school with a business degree. I came out and worked within marketing research. I did a lot of – I essentially was like a data analyst if I’m honest. And then just through the fact that they had certain jobs that they didn’t know who to give to and they gave it to me. I started working with Google Analytics and Omniture and tagging and then somehow that led to me getting hired as a researcher. I had not even considered that this was a career when I was in college that I could – I mean I was in school in the 90s. No one talked about it. This was not like a career path I’d ever heard of.

Steve: Right.

Vicki: But I did enjoy research. And also I had – part of my degree was business, computer information systems, and so I had a pretty good understanding of databases and database structure from that which helped a lot with doing data.

Steve: So, when you think about your team now, there’s probably a few different ways that people have come up – come into research. Like what you just described. You said some people have come through different kinds of design, some human factors trained people – like what’s the…

Vicki: A successful background? I don’t know if – is that what you were going to ask?

Steve: No, I wasn’t, but you can answer a different question.

Vicki: I don’t think there is one. I think it kind of comes down to the person, right. I have a weird view of education. My mom is English, my dad is South African, and generally there, school is considered a way to make you a more well-rounded person, and then once you leave school maybe you specialize in something and pick up those skills somewhere else. And I kind of feel, just from all the different researchers I’ve worked with and the different backgrounds they have, that I don’t know if having a degree in human factors actually means you’re going to be the most successful researcher. I’ve worked with researchers who have nutrition degrees who are amazing. So, I think it just comes down to the individual.

Steve: I feel like there was a period of time where most – I think most researchers, some came from a very heavy social science background, but most in industry came from just ad hoc backgrounds like what you’re talking about, or my own background.

Vicki: But that’s changed, right? Or it is changing.

Steve: Yes. And, right, so what does that mean – we’re probably at the early stage of that change, but…

Vicki: I don’t know if we are. ‘Cuz I have a question. A couple of years ago, maybe like 2012/13, I was considering a move to the Northwest. And at that time I felt it was very difficult, with an undergrad degree and the experience I had – which was a decent amount of experience, like 6 years, right – to find a company in the Northwest that was willing to hire someone who hadn’t specialized and done a post-grad, or a PhD. They might hire someone with a PhD who had no experience in the field over hiring someone who has 6 years of experience but no PhD. And I feel like since then that attitude has changed, but I’m not quite certain why. It seems like people are more open to hiring people from various backgrounds. Is that just my experience? Have you experienced that?

Steve: I mean my sense with research is that demand exceeds supply.

Vicki: There’s a low supply, right.

Steve: So, doesn’t that mean we have to revisit sort of who gets to play?

Vicki: I didn’t know – I thought maybe – I’ve worked with a number of PhD candidates, or people with PhDs, and they are very strict and maybe single-minded about how research has to be done. Whereas I think sometimes, especially within the world of business, you have to be able to make compromises, or maybe take shortcuts, but feel comfortable with those shortcuts, and sometimes I find that people with the academic background are less comfortable taking those. Have you had that experience? And I didn’t know if that was actually what was having an impact – like maybe the application of research within the business world and academia was a hard blend. Does that make sense?

Steve: Yeah. I mean my perspective is anecdotal at best.

Vicki: Yeah, mine too. I’m just like guessing. Like I’m trying to understand like is that what happened? Or like maybe your point. Maybe it’s just there’s not enough supply so they’re like we’ll take anyone.

Steve: Right. But is – I feel like early on I met academics who were sort of early in the industry and who represented some of that mindset that you’re talking about, but I just constantly meet people with high levels of education who are so excited and hungry to extend what they knew how to do.

Vicki: Sure.

Steve: Kind of in the kind of work that we’re talking about. So, what I don’t know is how much did sort of academia shape and maybe limit their mindset. But then I’m also excited about teams like yours where you have different backgrounds.

Vicki: Yeah. I think you want the blend, right? Like everyone should kind of have a different mindset, right. Because they’re all bringing a different lens. That’s one of the reasons I also like designers and product people in the field. Everyone is seeing something different through their lens because of their background and their experience. And so it’s much more insightful when you have a pretty good mix, a motley crew as I like to call it.

Steve: Yeah. That’s a great phrase. And maybe if someone you know, for example, comes from academia and has kind of a here’s how I was trained to look at problems, that person is going to do well among a motley crew where they’re going to be kind of creatively elbowed once in a while.

Vicki: Actually, at AT&T we had a researcher I loved and her background was anthropology, and the rest of us had maybe come to it by a different path, and she definitely pushed the rest of the team in a very good way. It was a good balance.

Steve: Yeah. If you look at research as a team sport which is kind of where you started off.

Vicki: I think it is, yeah. I think it is a team sport.

Steve: Then you’re casting a team to have this great thing together.

Vicki: Yeah. That’s one of the things – a lot of the people I interview are coming from backgrounds where they are maybe one – a solo researcher in an organization, or they’re one of two researchers and they don’t get to work together because they’re only one of two and their desire is to work with a team of researchers and I actually do believe – in my consulting life we always worked as a pair and I actually think it just strengthened you because you were learning from another researcher and I feel like you picked things up and learned just from seeing what that other approach was doing or talking about – like understanding how people communicated things. Like there’s something to be said about research as a team is much stronger. I always feel for people who are doing it – they’re on the path alone, by themselves.

Steve: What do you think about researchers and research in like 2029? Who is it going to be made of? How do we kind of…

Vicki: I don’t know.

Steve: What’s the desired future? How do we get there?

Vicki: I mean I kind of go back and forth on this. I do feel like – if research is a team, there are always going to be fewer researchers than there are design and product people. And in some worlds design and product need to take on aspects of research, right. And so, then is it that we’ll just have design and product, or maybe jobs will begin to blend together more and there won’t necessarily be a dedicated researcher? But I can go back and forth on that where I’m like no, I kind of think you’ll always probably want at least a research team who can oversee and maybe help teach people how to do their own research, who maybe don’t necessarily do all the research on their own, because I do think the more involved product and design are in research, the more that they take on, the better they become. So, I don’t know. The future is – I’m wondering if design and product and research will all kind of blend together and people will begin to own all three of those skills, which might be a lot to ask from someone.

Steve: But even think about your motley crew metaphor. You’re talking about sort of motley crew – it’s hard to say without sounding like I’m saying the name of the band.

Vicki: Well it is the name of the band.

Steve: I know, but I feel like the emphasis is different. Motley crew – I’m trying to just use the original phrase, but it’s coming out like the band. If research as a team, the way you have it now, is built up of a motley crew, imagine – what you’re talking about makes me think about, if you sort of changed the departmental structure, there’s a different makeup of a motley crew that includes the skills of design and the skills of product and the skills of research in a diverse way.

Vicki: Sure.

Steve: And so what is that? Is that a new thing that’s like a hybrid?

Vicki: I don’t know the answer. I mean my tendency is to not be that extreme. I think that design and product and research will have to blend, because they talk about embedding design and research and product together, but then at some point people are going to be like, well can’t the designer also do part of the research, or maybe own the product? So, it will blend, but to what extent I think depends on the company and where they are in that evolutionary process. And in some ways – I mean design, even research, to take that on – I feel like researchers themselves wear a lot of hats, and then designers doing research, that’s a lot more hats. And then mixing in product – I don’t know if a human is capable of being that multi-faceted, right? So, I don’t know what it will look like, but I do imagine that they are going to have to blend together more. Having a research team, especially if these research teams are working more on a consultancy basis because they’re so much smaller, and knowing that you build a better product and design organization by them being more involved in research, just leads me to believe that there will be more blending. But I don’t quite know what that will look like and to what extent, and I’m sure it will be different on a case-by-case basis, depending on how evolved that company is.

Steve: That’s a very good, specific, non-conclusive…

Vicki: Did I just dance around that?

Steve: Well, we’re talking about what the future is going to be and how to get there. So, it’s not like you have the magic answer.

Vicki: Yeah, but I don’t want to be like it will be this, ‘cuz the future – do you ever read Chuck Klosterman? Are you a Klosterman fan?

Steve: Yes.

Vicki: Did you read his last book, where he’s essentially saying everything we know now might be wrong? It’s about trying to think through what we know now and how that might be wrong, and therefore what multiple versions of the future might look like, based on the idea that being wrong now is the only thing we know.

Steve: Yeah.

Vicki: So, that is leading me to my roundabout way of answering that question. It’s a very good book if you haven’t read it. It’s really, really good.

Steve: That’s a nice pivot from my Motley Crue reference.

Vicki: To Klosterman. It melds well.

Steve: Okay. It seems like maybe that’s where we should look to wrap up. Is there anything else that we should talk about?

Vicki: I don’t know. Have I answered all of your questions?

Steve: It’s not possible to answer all of anyone’s questions.

Vicki: That’s true. I mean my greatest fear for this is that I would be your most boring interview.

Steve: Well, my eyes were just fluttering. It’s for the listeners of this to decide if they’re bored or not. If they made it this far.

Vicki: They’re engaged in some aspect.

Steve: Some aspect. Whether it’s just rage listening, or, I don’t know.

Vicki: Rage listening!

Steve: Is that a thing?

Vicki: I’m assuming it must be for someone.

Steve: Alright, well thanks very much for being a guest. It was great chatting with you.

Vicki: Great chatting with you as well.

Steve: All right. That’s a wrap on another episode! Subscribe to Dollars to Donuts wherever you get podcasts. If you’re using Apple Podcasts, how’s about giving the podcast a rating and even a short review. This helps other people find out about the podcast. Portigal dot com slash podcast has transcripts, show notes, and all of the episodes. Follow the podcast on Twitter, and buy my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media or Amazon. Our theme music is by Bruce Todd.

Jun 27 2019

57mins


Rank #10: 21. Ruth Ellison of Digital Transformation Agency


In this episode of Dollars to Donuts I speak with Ruth Ellison, Head of User Research at DTA, the Digital Transformation Agency in Australia. We discuss the challenges of user research – and digital product development – in government, embedding researchers into product teams but maintaining a guild model to connect them, and how research can impact policy.

My role is really more of an enabling function, looking at how do we bring in the right people into the teams? When they’re here, how do we help mentor them? I’m connecting them to other researchers in our communities. Also trying to look at how we lift the conversation around research. Part of my role is about that strategic aspect of research. How do we do it better? How do we help enable the broad decision making of government? – Ruth Ellison

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Greetings and thanks for checking out this episode of Dollars to Donuts, the podcast where I talk with people who lead user research in their organization.

Coming up in San Francisco on September 13th, I’m teaching a public workshop – Fundamentals of Interviewing Users. I’ll put the link in the show notes. I bet you know someone in the San Francisco Bay Area who would get value out of this workshop and I would appreciate you recommending it. I also work with organizations directly to help them elevate their user research practices.
Of course, supporting me and my business is the best way for you to support this podcast and help me make more episodes. If you have thoughts about the podcast, reach out to me at DONUTS AT PORTIGAL DOT COM or on Twitter at Dollars To Donuts, that’s d o l l a r s T O D o n u t s.

I went to a cafe in my neighborhood. I placed my order and then swiped my card in the payment terminal. They told me “We’ll call you when your order is ready” and I went and sat down. I heard a couple of orders get called, “double cappuccino, soy milk latte” etc. After a few minutes they called me: “Steve!”

I was briefly taken aback. They never asked me for my name, so how did they know that order was for me? I realized that when I paid for my order by credit card, of course they got my name. But this seemed like a new customer service behavior. I was curious, so I paid attention the next time I went to Starbucks. They asked me for my name. They do this before payment. My local Starbucks is inconsistent as to whether or not they ask for my name and whether or not they call out my order by the contents of the order or by my name.

There are many regular routines that we go through that become almost scripted, so when something goes off-script, like being called by name when I was never asked for my name, it really jumps out. Eventually the script gets rewritten and we regard the change as familiar, but those moments of change are sometimes tentative moments in an experience. This cafe could have asked me for my name not because they needed it, but to signal that they were asking for permission to address me by name in a few minutes. I’m sure there are cases where the name on the credit card doesn’t match how someone prefers to be addressed.

Maybe I’m just too sensitive in noticing this change, to find it an abrupt surprise. But you can just imagine the well-meaning coffee shop staff feeling excited about being able to do this, to get the customer name and call out the orders in a more personalized manner than just “double americano.” They could, but did anyone stop to think if they should?

Note that I’m not complaining about my service experience, just reflecting on it to suggest that it’s an interesting moment when things change. Knowing that things are going to change is an opportunity to get ahead of that change and try to understand more deeply, from the people who use that service, what it is that they are expecting, and where there might be mismatches between what you want to do and what that change will mean for these customers.

Well, on to the interview. It was wonderful to get to speak with Ruth Ellison and I think you’re going to really like our conversation. She is Head of User Research at the Digital Transformation Agency in Australia.

Ruth, welcome to the podcast. Thank you for being here.

Ruth Ellison: Thank you, Steve. Glad to be here. Thanks for having me.

Steve: So, why don’t we begin as I often do – as we all often do, I guess – ask you for an introduction. Tell us, what do you do?

Ruth: Hi. My name is Ruth Ellison. I’m the Head of User Research at the Digital Transformation Agency. So, at DTA – we call it DTA, we love our acronyms in government – DTA is a small, and relatively new, government agency set up to actually help the Australian government create simple, clear and fast services. So, this involves improving the skill sets of people who work in the digital space, and also just helping look at projects and how we do digital transformation across government.

Steve: So, what is digital transformation?

Ruth: Oh, that’s the million-dollar question.

Steve: Do I have to pay a million dollars to get the answer?

Ruth: It seems that we’ve got a digital transformation strategy. Um, for me it’s really about how we’re transforming the way government thinks and approaches problems. How do we help our citizens interact with government in ways that are better? So, we use digital, but really for me it’s not just about digital. It’s about the people. It’s about the services and how we move to a much more human-centered lens on problem solving and problem definition, and even just the way we deliver our services.

Steve: And then who are the – I don’t know if clients is the right word, but who do you engage with within governments?

Ruth: So, the DTA has a very interesting function. We’re a centralized government agency, very small. There’s only a few hundred of us. Our clients are actually other government agencies, mostly at the federal level, because we have three levels of government in Australia – federal, state and local. So, we work mostly at the federal level, but part of our function is also looking at services that cross boundaries of government and how that works. So, our clients are mostly at the federal level – the other government agencies, often very, very large or very small; it doesn’t matter the size. What matters is what they are doing that involves interactions with their users, which is normally citizens or businesses and organizations.

Steve: What are different ways that those relationships get initiated in government?

Ruth: This is interesting for us. A lot of work comes through depending on the size of the project – as opposed to the size of the problem – where we get involved in investments that are over a certain number of dollars, but we’re also interested in projects that have a social impact as well. So, if it meets a range of our internal priorities we’re keen to get involved. The intent really is to transform the way government interacts with citizens. We’re interested in reaching out and working with people. So, it’s not just government. We also work with government and not-for-profits and other private organizations to work out what’s the best way to collaborate and find ways of working. It’s really exciting.

Steve: Can you give some examples, over the last few years, what kinds of things that DTA has worked on?

Ruth: Yeah, so part of our transformation agenda is how do we uplift the skillsets required to work in these ways? I was involved in a few projects back in the earlier days of the DTA – when it was called the Digital Transformation Office. One of them was actually around looking at how people – this might sound really boring, but it actually was fascinating – deal with tax obligations. So, we were looking at the space of how people start up small businesses and what kinds of challenges and barriers they face when interacting with government. We went through the full process: discovery, then alpha, beta and live. Discovery was really about what this problem space looks like for starting a business. As a government we have a lot of assumptions around how people interact with us and our role within people’s lives, but through discovery we quickly discovered that there are a lot of other factors and lots of other interactions that happen that can be quite surprising. Based on that, the team narrowed it down and looked at this particularly interesting space, like the maker movement – people who are making a lot of jewelry, or they’re crafting beers, or they’re doing very niche kinds of things where it’s kind of a hobby at the moment and they start shifting over into potentially a small business. That space is fraught with a lot of questions and uncertainty for our citizens because they’re not sure: what happens if I don’t say anything to the tax office? Do I get a tax debt down the track? You don’t want to end up with a $20,000 debt. It’s very scary. How do we actually help solve that particular problem? So, through this kind of discovery and alpha process we narrowed it down and actually helped, I think, three agencies work on this particular problem – how we define what people’s obligations are to the government, and how we can make that a little bit easier.

Steve: What gets output from that process and what gets made or implemented by an agency as a result of that kind of program?

Ruth: I think people hear the word digital and they tend to assume, is it a digital product? In this case it was, but some other projects were not. In this particular case we ended up with a little smart, online questionnaire toolset, but we didn’t know that was what we were going to have. The team was quite deliberate in starting out very open, looking at what the actual problems are and what the context of these people’s lives is that we’re trying to understand, because we can’t just assume that a digital tool is going to solve that particular thing. And it’s a bit of a mind shift for government, because often it’s quicker just to go, we’re going to have an app or a new website to fix this. And part of this way of working is saying, actually, that’s not necessarily true. Are there other ways we can solve this? And how do we make people’s lives easier? So, we did end up with the digital thing in this case, and it actually crosses over multiple agencies, which, for those that don’t work in government, is actually a very challenging space. Agencies tend to work in silos, just from the nature of the way we fund projects and the way we work and the way skillsets are built, so it’s very siloed into one agency at a time. So, to have something that worked across multiple departments and agencies was actually a really amazing achievement, and well done to the teams for being part of this process and actually being willing to change as we went along and learn lots of new stuff about our users.

Steve: You’re talking about the teams in these agencies, that they’re open to that?

Ruth: Yes. So, I think in the earlier days of DTO or DTA people treated this as a bit of an experiment. So, what we did was we formed multidisciplinary teams. We didn’t have user research going out doing research and throwing it over the fence. We’re trying to shift that mindset, because traditionally a lot of research happening in government is very much on the evaluative side. We do a lot of usability testing. We do a report, so you kind of give it back to the teams and they do stuff with it and you hope they implement some of it. This was about trying to change the way we approach research, but also the way we approach the whole problem landscape. So, the team was formed from, I think, 3 or 4 different agencies, including us. I was actually one of the researchers embedded into that particular team at that time. So, this was before my Head of Research job. And then we actually worked with the teams, with developers, business subject matter experts, taxation experts, and my favorite, a lawyer as well, on how we actually tackle this. And the reason we started looking at those kinds of skillsets was because we find if we’re just treating it as a very IT focused problem and we don’t then look at the problem from a human lens, it comes down to a very technical solution. So, by bringing people like the business folks in and the lawyers, we start having this kind of transformation occur that’s very different. So, the lawyer – I was working with her, and before she joined the team she was saying we have to send stuff up to her team and they’ll approve the wording before everything goes out to the users. And it was slowing down the processes of how we do design and research. So, the team asked her to join our team and she became embedded. She went from this process of “research stuff, that’s not really what I do. I’m here to be your taxation expert and your lawyer, or the legal expert,” to coming along on all the research and just changing her mindset on how we actually approach the way we word things. So, going from “we can’t say that because it’s not how government says things,” to “that’s not going to work for the people we saw,” was a massive mindset shift and it was fascinating.

Steve: What do you think – how did you create the conditions for her to have that kind of personal/professional transformation?

Ruth: So, that’s an interesting question. It really comes down to the whole team. We were very lucky to have a team of people from different agencies who were willing to try these new ways of working. A lot of them had done their jobs before, but not necessarily worked where you have all these multiple skillsets in one team, embedded for a 20-week process of trying to go through a full discovery all the way up to releasing something. So, part of that was we had social contracts in place as a team – how do we create a safe environment, psychologically safe, where we can raise things and challenge each other and it’s okay to do that? It was active work every day. You know, the whole team is working on these ways of working to say it’s okay to come along. It’s okay to not agree with things, but come along and let’s just try having an open mind when we’re going out to do this research. And let’s just observe, and we’ll come back and as a team we do our analysis and our synthesis together. And that really helped her and the other team members’ transformation of their mindset, because everyone was involved in the process. Not just me as a researcher bringing somebody else out with me. And I think that was a significant mindset change.

Steve: So, this group of people, 20 weeks – like how big was the team?

Ruth: The team was – I think we had about 8-10. The size of the team changes, and even now in my role as Head of Research we help teams understand that the team does change. The nature of the team changes over the course of these phases. So, we kept it small for that particular team because we had to move fairly quickly. We had to deliver something. But we also had to show the teams how we could uplift capability in both research and the other kinds of specialist skillsets. There were agile skillsets. There were things around development. But my job was to lead the research transformation.

Steve: When you say uplift, does this mean that people that participate in this 20 weeks come back to their agency afterwards with – they’ve gone through some professional development?

Ruth: Yeah. So, the goal is for them to go back to their home agencies or departments and actually continue the momentum. We learned a lot about that process. That’s a whole thing – how do you build momentum and continue momentum when you’ve taken the team out of their context in their daily working lives? So, they were embedded into our agency for those 20 weeks, where we created the space for them to do that. We had the walls. We had the equipment so that we could take them out. It was very different when they went home. So, we learned a lot about that process and we iterated on our services as a result. But for them it’s about how we enable those kinds of processes and systems for them to learn these things and to take home and keep going with it. For example, they had somebody whose job was, I think, to shadow me and actually pick up the research skillset so they could try to take that back home and do a bit more of a deeper dive in research.

Steve: It seems like there’s two sort of – I’m sure there’s multiple objectives, but it seems like one objective is go through this 20 weeks in order to launch something – in this case this sort of small web questionnaire that you created. I think it’s on the web?

Ruth: Yeah, yeah. It’s live.

Steve: So, the project has an objective and an outcome, but the objective for this whole engagement is to uplift the skills so that people can keep going afterwards?

Ruth: Yeah, yeah.

Steve: Are those two objectives in harmony when you do a program like this?

Ruth: That’s a really good question, and that’s part of our learnings from this: what is the measure of success when we’re talking about increasing skillsets within the research space – what we call the specialty skillsets – and how do we be clear about what that measure is? I think part of my learning personally going through that was that it wasn’t quite as clear back then. So, we had some tension at times where the team needed to show something to deliver, and at the same time I’m trying to uplift the research capability, to show the rigour, to show how we work in these ways. Now, our sprint cycles were only a week at a time. It was very, very fast. And we were taking people through what was a very uncomfortable process. And it is uncomfortable. They’re getting challenged every single day and our job is to help them navigate through that process. So, yeah, it creates tension. But, as I said, we have iterated as a result of the learnings from going through those programs, to look at how we now better serve our customers, which are the other government agencies.

Steve: Where – if we were to sort of draw an org chart in the air – maybe it’s not an org chart for the government – where does DTA reside within the structure of the Australian government?

Ruth: Yeah, we have an interesting function. So, we are not a delivery arm. We touch on policy work. But we sit under the Prime Minister and Cabinet kind of function. So, what that means is we are this centralized agency. I think maybe the closest equivalent might be 18F in the U.S., a little bit of GDS in the U.K. I’m not sure what the Canadian version would be. We have a little bit of that kind of function where it’s not tied into any particular agency, because my job is to work across government agencies. But that brings its own challenges, particularly around the research practices – I’m going to come back to this at some point – about how we manage our knowledge across government. A lot of research is happening across many, many agencies. So, we’ve got this uplift function, but in my role as Head of Research I’m also interested in how we manage those other functions within government as they relate to research, as a point of research practice.

Steve: Sorry, other functions?

Ruth: Yeah, so at the moment I help teams do a lot of recruitment, but it’s internally within our organization or other government agencies. So, they reach out and go, “Ruth, we need to hire a researcher. We’re not quite sure what to look for. Can you help?” So, it’s about reaching out – how do we help them craft and find the right fit of people? Are they doing a discovery piece? Or are they doing a lot of very heavy product work that’s just iterating a product day to day? Trying to work out what it is that they actually need, how that works and what kind of researcher they would need. So, helping them find that right fit, because we don’t have a pool of people. We’re a very small agency, so we just can’t pop out and embed into teams. We did that in the past. It doesn’t scale for us, so now we’re working out how we support those functions. The other thing we’re also looking at is how do we manage the knowledge that we build within government about our citizens? We have so much research, and we spend a lot of money as a government looking at our citizens’ experience, their interactions with government, but it’s often done in those individual organizations. So, they might have lots of data internally, but what happens when we start needing to look between those touch points, between the different agencies? How does research work in those spaces in between? Because people have a very agency-focused view of their citizens. So, one of our functions is actually looking at how we provide a more holistic lens across government to show that when a citizen engages with government it’s not just one department at a time, but it’s actually a range of organizations – whether it’s government, private sector, friends, networks, whatever it is. How do we show that relationship in the context of people’s lives, so it’s bigger than just one particular service or element of a service?

Steve: Are there aspects of how government is structured? I don’t know whether that’s regulatory or funding? Because every big organization has sort of siloed research and siloed data and doesn’t treat the user experience as coming across all those things. I think that’s common above a certain size, but I’m wondering are there any interesting compounding factors in the way that government works that maybe changes that – how you even try to address those problems.

Ruth: Yeah, it’s a very good point because different government agencies have different rules around data collection and different legislation around how much and what data you collect. There’s often discussion that as user researchers, the kind of data we collect may not necessarily be the same kind of data that the legislation covers – the legislation demands that we collect a bunch of other things. So, often there’s tension of going, well, we’re here to understand our citizens, how do we collect that – and people might go, the consent only allowed it to be shared within that particular department. So, it’s like hitting up against issues around consent. Do we start looking at broader consent around all of government? Or is it mostly just federal government? Or is it just between portfolios of government? We haven’t nutted out that issue yet. So, it’s a challenge that we’re working on right now because consent is actually a big, big problem space for us. We want to do research that’s ethical and safe and it’s really challenging when you’ve got multiple agencies who are funded separately. Back in the private sector days I would come in and do work for one organization at a time. This feels like we’ve just upped the whole scale of the problem, because now we’re taking it across multiple agencies who may not necessarily want to share, or be as comfortable sharing, what they have. For a number of reasons. Whether they feel it’s a reputational risk – because we deal a lot with risk as government. There’s always a focus on risk. And the thinking is, what happens if people find out people said that about our service? We don’t want people to know that people say that about our service. So, you get those kinds of complicated…

Steve: Oh, so there’s a risk to the agency.

Ruth: Reputation.

Steve: If feedback or any kind of user data that’s critical of that agency becomes public – if you’re a civil servant you don’t want that.

Ruth: There’s that mindset – I believe they call it reputational risk for government. So, one of the challenges of heading up research is how do we deal with that kind of mindset and how do we shift some of that thinking as well. So, a lot of the framing I’ve been using more recently is that this is about risk reduction and increasing the certainty in the decisions we’re making – whether that’s policy or your product or your features, whatever scale we’re talking about, research helps you with that decision making. We’re not something that you just do on the side and just throw it over. Research actually becomes part of your decision-making processes and one of your many inputs. Because at the end of the day we understand that people have a lot of pressures and they’re making decisions about why they release some things, whether it’s a policy or a government service. But how do we help them do those things a bit better? So, that framing around risk reduction, or increasing certainty, speaks the language of some people.

Steve: That also sounds like private sector product development. The consequences are different. It’s not just about finances. I’m gathering that you’re in a situation where there are other consequences besides the profit of the organization, as in the commercial sector. You said reputational risk is a very tangible thing that people are kind of operating around.

Ruth: Another challenge with government – we work in a four-year cycle over here in Australia. So, with the cycle of government, if there’s an incoming government, is there a change of government happening? The agendas may change as well and that would change the focus of your research. When you’re trying to be a little more strategic with research it can change that function, or not. We have that additional level to think of. So, on the one hand we still need to understand our users, our citizens, no matter what. No matter what government is in power, it’s still part of our function in the role as researchers to keep that understanding going and to keep developing that deep understanding. But the things it aligns to in the strategic decision making do change, and that’s always an interesting factor in working with government. It’s actually one of the things that made me leave the private sector to come here – the complexity of all these nitty gritty things that look so simple on the outside, but are actually really complex and really messy. And I love that.

Steve: Okay, what do you love about that? That sounds like something that some people might run away from, but you’re drawn to it.

Ruth: Because it’s hard. It’s really, really hard. I was a consultant for seven or eight – for many years – and I enjoyed that breadth of consulting. But what I found as a consultant, we’d go in and we’d help people, and you get the joy of seeing something happen, but I reached a point in my career where I wanted to see that really effect change on a much greater scale and over a much longer timeframe as well. So, I was at a point as a consultant where you do things and it affects a particular project or a program. And actually, when I started looking at how do we effect whole-of-scale change in government and how do we start shifting that mindset, it was hard to do it as a consultant. So, I thought the only thing I could maybe try doing is to join the public service. So, two and a half years ago I left the private sector and came in to try to do this with a bunch of other really smart people across all levels of government, trying to look at how we do this transformation. And it’s hard and I like that, because you’re in for the long haul. You’re in for this massive change. You know it’s not easy, and we’re doing this kind of work that’s really – you look at behavior change at an organizational and government level and it’s just complex and it’s fun. Exhausting, but fun.

Steve: So, you’re describing in the consulting world there’s moments or points in the process where you draw satisfaction or some sense of reward from.

Ruth: Yeah.

Steve: And now you’re in this much more complex situation. It’s hard. I’m sort of inferring things take time. What rewards do you identify yourself? Whether it’s milestones or successes, like what keeps you going in a very hard endeavor that you’ve chosen?

Ruth: For me it’s about empowering others. One thing you said – it’s about we help people. We like to help people. That’s why we do the role that we do. And for me this is about how do we help at scale. Our government has been doing things for a very long time and there are some very, very smart people in government that often get bogged down in the structures of government, the bureaucracy, that sometimes doesn’t enable the kind of stuff that these smart people are working on. So, what gives me satisfaction is being able to help some of that transformation. To help empower others to do this stuff better. And the satisfaction I get is when somebody has that kind of mindset shift and they’re taking this stuff back home. Or, they’re coming back to me later and going, “Ruth, it’s been a year since I’ve seen you, but here’s what I’ve done.” And they themselves have gone back and been this change agent for not just research, but for good, human centered thinking. And it’s great when you see that kind of stuff happen. So, helping others to do this stuff better.

Steve: You said the long haul and that makes me think as someone who is a consultant, you’re done. You’re not there to get that feedback or to see the consequences. Or you have to kind of find your way back in to say hey what happened a year later? So, if these are longer term working relationships that you have, you’re able to hear from those people and see kind of what’s happening.

Ruth: Yeah.

Steve: Maybe we can talk about the Head of User Research role. When did that come about? How did it come about?

Ruth: So, Head of Research – I’ve been doing that for over a year, maybe two years now. We had a previous head of – it was actually design research – Leisa Reichelt, who you would know. When she left they actually split that role up into two. Her role was an executive level role, but now they’re splitting up the role and taking it down a level. So, it was an interesting organizational change, I think – into a Head of Service Design and a Head of Research. We also have a Head of Content Strategy and a Head of Interaction Design who are my equivalents. So, my role involves helping my agency, and across government, to really better the practice of research. I find I don’t do as much research anymore nowadays. My role is really more of an enabling function. So, part of that is looking at how do we bring the right people into the teams. When they’re here, how do we help mentor them? So, part of my role is I catch up with a lot of the researchers and check in on how they’re going. I’m connecting them to other researchers as well in our communities. And also trying to look at how we lift the conversation around research. So, part of my role is actually about that strategic aspect of research. How do we do it better? How do we help enable the broad decision making of government? We’re also starting to see a shift to much more the policy side rather than just products and services – up into policy land, where a policy can trigger many, many services and products. So, that’s really fascinating as well and I’m really enjoying that aspect. The other part of the role is that we look after a guild. I’ve heard other organizations have this structure too. So, our researchers who come into our organization are embedded into product teams, but we have a guild that then unites all of those researchers and those interested in research to come together once a fortnight and actually share the knowledge. So, it could be – for example, I’ll come back and share what I might have learned at a conference. Somebody might share about something happening in their project. A problem that they’re having – how do we solve it together? So, the guild is almost like a tribe that cuts across our organization no matter what team you’re from, and it unites us as a practice. And that’s how I work through that kind of process to enable the better uplift of our practice.

And externally, my job is to advocate for the research role. So, I often get called in to talk to other executives about what does this research mean. “Don’t we do this stuff?” “Haven’t we been doing this stuff for years?” Answering those kinds of questions. Looking at how it helps them from a strategic perspective, how it helps in their decision making. And also just helping their staff connect better and do the craft better.

Steve: You said before how one of the things you’re doing is helping agencies hire into that agency as opposed to into DTA – hire researchers directly into those agencies.

Ruth: Yeah.

Steve: Are those – and then you’re talking now about sort of maintaining connections with researchers through sort of formal and informal methods. I’m just trying to understand, is that within DTA only? Is there engagement with these other agency researchers, I guess I’d call them?

Ruth: Yeah. So, the supporting through the guild is a DTA-only thing that we do. Or anyone that comes into DTA on a short-term project, we connect them up into the guild, so we can support them that way. But for the broader government we’ve actually got a Google mailing list. It’s called the service design and user research mailing list. It covers design and research and we support people through that. But it’s not just us – we’re a facilitator, so our job is to connect others to each other, because we’ve got amazing people out in government and if they connect, peer to peer, we learn from each other. That’s a really good thing. So, we’re not the be all and end all. We’re that connecting function. I think with some of these we’re brokering – we help broker relationships as well. Otherwise, it can come across as, oh, we have to go through DTA, or do we have to be involved in all this stuff? No, because it’s about enabling others, empowering others to connect and keep growing together.

Steve: So, there might be a researcher in one agency that you know and another researcher in another agency that you also know, but they don’t know each other.

Ruth: Yeah.

Steve: And if there’s something going on you might say oh this is the researcher you should talk to.

Ruth: Yeah. So if somebody – say they’re trying out a particular technique for the first time and I go so and so from here has actually done it before, have a chat to them about how they do it because they would have some good learnings in the government context of what worked and what didn’t work and it will just help speed you up when you try and do this yourself.

Steve: So, in DTA itself, you listed a bunch of different functions that you have kind of peers leading. I’m sort of wondering how the team sizes vary? I don’t know if team is the right word, but sort of the number of people maybe in each of those functions?

Ruth: At the moment I think we’ve got around 13 researchers, but the number varies significantly across DTA depending on the projects that we’re supporting at the time. It also doesn’t include the projects that we support that belong to other agencies. So, that number scales. At one point I was working on a project that had 18 researchers, and that was just within the one project. So, that raised good questions about scaling and processes and ethics and all of that. But the number changes and what’s interesting is that the recruitment is done by the product owner and we provide a supporting function to help them get the right fit. But it’s done by the product teams – the person joins the product team. So, they don’t belong to my team of researchers. It’s more of a virtual kind of team, if that makes sense. That’s where the guild/tribe comes in. And those product teams usually range between 6 and 10 people and there’s usually a dedicated research function in there. Not all teams can have one because as a government we often have budget constraints. So, we have people who do research who are not necessarily researchers. But we try to encourage that specialism, whether it’s a researcher or a service designer or an interaction designer, because they bring a depth and they’re upskilling people on their team as they’re doing it too.

Steve: So, someone could be hired into DTA as a researcher, but the thing they’re spending their time on is a project team?

Ruth: Yes. Yeah, that’s right. Although I try to steal some time to work on our community, our practice matters. One of the things at the moment is we’re iterating through our processes and policies. So, through the guild – we use the guild as a bit of a working spot as well. We come together one hour a fortnight and we use it to work out what we want to tackle next and how to improve a particular process. So, I try to steal time that way from those teams to enable our better practices.

Steve: For Australian government employees, how geographically dispersed are they?

Ruth: So DTA, because we’re so small, we’re based mostly in Canberra, which is the capital of Australia, and very small. I think there are only about 400,000 people that live in Canberra. Our other office is in Sydney, which is even smaller. What makes it interesting, when you have that kind of diversity in spaces, is that other agencies are even bigger than us and they have many more people scattered around Australia. But the challenge, I think, that comes to research is recruiting. Finding strong talent in Canberra can be challenging, and often people – I love Canberra, but there are always jokes about Canberra. Honestly, it’s too cold. It can snow in winter. It’s cold in Canberra and it’s hot in the summer. It’s small. We’re inland. People have the stereotypes of it being a government town. So, there are all these stereotypes around Canberra being a boring place. It’s actually not. It’s actually a beautiful place and I love it. But recruitment becomes really challenging. How do you encourage people to leave their beautiful cities of Melbourne and Sydney and come spend some time? So, part of it is trying to shift the conversation around working on more meaningful work instead. I always joked to Leisa about whether we need a recruitment video to show people, hey, it’s great, come work in Canberra. But it does change the way we approach recruitment and how we uplift capability within existing skillsets as well.

Steve: Can we go back to something you were saying about policy? I want to understand maybe the mechanics of that more. I understand that the policy is set by parts of the government and then, as you said, that impacts the products that are being created. But you were describing maybe a different cycle, I think, or a different stroke in that cycle where the work that you’re doing can – is it correct to say that it’s impacting policy?

Ruth: Yeah.

Steve: What’s the dynamic there, I guess, is really my question?

Ruth: So, there’s been a shift – a more recent shift in government, and I think you’ll see that globally in other governments as well – about how we bring this more human centered thinking and lens over to the policy lens. Traditionally in policy they do a lot of research. When you create a policy, they go out and do big scale studies. There are certain methodologies for policy research. But part of what we’re trying to help with is how do we get the policy folks who are creating this policy actually out in the field, actually talking to the people who are impacted by this policy, and actually going out into the communities that will be impacted down the track, and actually understanding that context. So, some people do do that already, but it’s not necessarily the norm. So, we’re trying to look at how do we shift some of that thinking, and that’s part of my focus at the moment – how we start working towards that lens – and it’s really interesting. Because people have been doing policy creation for years. It’s not a new thing. A research lens is not a new thing, but the way we do our kind of research is actually new for them. And trying to help people see that this stuff we do is not just a digital thing. There’s a bit of a perception that, oh, this kind of research is to create digital products, right? And you go, actually no, we can apply it across a range of things. Because it’s really about understanding people, understanding what a person’s day in the life looks like in this space. Especially when it’s government, we’re serving such a diverse range of people – from our indigenous communities and those that are very disadvantaged, to those that might be on the other end of the scale, all the way to large scale businesses. It’s such an amazing, diverse set of users. So, how do we work together to shift that thinking? It’s a good challenge and one that I think will take time as well.

Steve: It seems very analogous to working with those agencies – your phrase of uplifting their skills – those product people are starting in a different place than your policy people are, but it sounds like it’s the same thing. We want to help you to understand the benefit of going out to the communities and seeing what’s going on.

Ruth: Yeah. As I mentioned earlier, I remember starting in government 17 years ago working on a range of systems, and our work involved doing usability testing. You see people using your thing. So, research in that aspect is not new, but I think the more generative, exploratory style of research is new, or newer, for government, because it’s only been happening for the last few years. So, seeing that shift has been really interesting – how do we help you do more of that style of research as well. And they’re all very valuable. The stuff you were doing before, usability testing, was great and it’s got a purpose, but how do we shift that conversation to talk about whether we’re actually solving the right problems? I think that comes under the guise of a lot of the innovation labs and design thinking. At the end of the day for me it’s about the fundamentals of who are these people we’re designing for, what problems are we trying to solve, and how do we understand the needs deeply. How we get there – there will be a range of methods – and how to help people work out what that combination of methods is.

Steve: When we talk about research methods that’s often about how do we get this data from these users we’re trying to understand but your emphasis seems very strongly on how do we engage the different people that work in government to learn that and do that and use it? It sounds like – the word methods to me would encompass what you’re doing there as well.

Ruth: Methods, approaches. Yeah, it is. I mean it’s one of those things – words have meaning, and words are powerful. Within government, when we talk about research, when you’re talking about methods, people go, what are we talking about here? Are we talking about market research? We do a lot of that. Or is it just a design thinking thing, which is also a big buzzword? Or is it this agile thing? Being clear about what’s the outcome we’re trying to achieve as government. I use the word methods, but really, it’s about that focus of – we know we have an outcome, how do we best meet that and make sure we’re doing something that’s good for our users, that doesn’t disadvantage them, but hopefully improves their lives as well.

Steve: It would be great to talk a little about what your trajectory was through your education and your professional work that kind of led you on the path. Talk a little bit about the consulting work before government – maybe just go back a little further and talk about where did you start to see this path that you’ve been on?

Ruth: I actually fell into it totally by accident. I started off as a business analyst designing mainframe screens. That was over 17 years ago, my first proper design kind of job. I had no idea what mainframes were at that time. What is this thing? And just by the luck of the draw, I happened to be sitting next to a team that was the very first user centered design team in Australian government. So, I would hear them. They’d go out and they’d do these things. They’d come back, and they would share these stories about the people that they were meeting and the kind of feedback they had about the products they were testing. And I thought, what are they doing? This stuff is really interesting. So, being a curious person, I stuck my head up over the fence and asked them, what is it that you’re doing? Tell me about this user centered design stuff. I never covered that in my university degree. We covered all about systems analysis and requirements gathering, all these different methodologies you use to do that – never once heard about this human centered stuff before. So, by accident I heard that, and I just asked, can I join the team? I thought, what’s the worst they can say? No. And then I’ll just keep doing what I’m doing and keep trying to find a way. So, I asked and I joined that team. And so, I had my grounding by learning on the job. Starting in government 17 years ago, just learning about what this user centered design stuff was. I really loved it so much and then went back to uni to try to get a master’s in human factors, back in the day.

Steve: What was your first degree in?

Ruth: My first degree was in information technology and systems. So, very, very IT driven. So, I come from that analytic lens. But doing more of this stuff I just realized I love research. It was a gradual thing, because I didn’t know that such a job existed. And it could be a reflection of Australia at the time, because 17 years ago the whole user centered design movement was happening, or starting to happen. Companies were getting set up to do this kind of work. Government agencies were thinking about this stuff and doing usability testing. That’s what my UCD was at the time. So, that was my first foray into this field. From there, and I think like many other people in Australia, we were generalists. We went from UCD to UX generalists. We did everything, including research and prototyping, everything. But I quickly realized that research was my passion. I just wanted to do more of this stuff. I think I just got really interested in what people do and say and think and why they do certain things. And that was fascinating. So, over the next 17 years I ended up leaving government and then going out into consulting. I was headhunted to come and do this work and it was great. So, I did that and was working in the private sector for quite a while and had this broad range of experiences. And then from there, that’s when I decided, okay, I’ve been doing this stuff for a while, I’m enjoying it. I’m enjoying being a principal and helping others be better at the consulting craft. But that’s when I came back to, how do I do this longer term, impactful thing around government? Because that’s where I live and I’m really passionate to see how we improve government. There’s just so much interesting work happening in that space. So, that’s when I made the leap back. So, I started off at DTA as one of the lead senior researchers and then when Leisa left I took over and became Head of Research.

Steve: When you think about the time that you were consulting, as someone who – I’m sorry. I’m going to go back to something else. You started talking about going back to uni for – was it human factors?

Ruth: Human factors – yeah, back in the day there was human factors. We didn’t have all the amazing design degrees we have now. Back then that was the closest thing I could find, because I just wanted to broaden my skillset in this human centered design space and the only thing we had at the time was human factors. So, I was doing that remotely. I was also one of the first few students who was doing remote uni work. They’d never had that before. It was challenging. And I got to work and learn with a bunch of people who were designing airplanes and aircraft and hospital systems. And when I’d introduce myself – I work for government and I design systems – it was very jarring for me to go, wow, these people are doing massive design things, where if you get it wrong it falls out of the sky or people die. And I was just designing systems. But it was such a good grounding for me because human factors really is about that processing, that cognition. It really drew upon the school of psychology and the sciences as well, from this particular course that I was doing.

Steve: Was there ever a point at which you started to identify yourself as a researcher specifically?

Ruth: That’s a good question. That’s hard, because in Australia people have mostly been generalists. That’s how it just evolved. And you had to be, in the consulting world, to survive. I would get really jealous – I would see the kind of work you’re doing and go, you can be a specialist, but in Australia we just weren’t quite there as a country. We didn’t have the maturity to say, I am a researcher. So, I called myself just a UX person for a long time, but research is my passion and I always sought out projects doing that kind of research work and helped enable that in our clients and in our practice. I think it just evolved. I don’t know actually when I just started calling myself that. It’s just something that I did.

Steve: So, your first job that you came into DTA with – you had research in your title. Not to conflate the way we identify ourselves with what our job title is, but that’s a point where you went from generalist to specialist. Is that right?

Ruth: If you’re talking about titles, probably yes, because previously my title was principal consultant, UX consultant. Although most of the work I was in was research work – that’s just what I like, so I would do that kind of work – that wouldn’t necessarily go down well with the clients. They just weren’t ready for that in Australia. They were more familiar with UX – we need to improve our experience, and however you get there, we want you to help us get that outcome. So, although probably 90% of my work was research – oh, probably a little bit less – the other part of it was actually supporting the practices as well. As principal, my job was to make sure the quality of all the work coming through was also good. I was doing a lot of that kind of stuff in that leadership role. It was the other half of my job before I left consulting. So, I think that’s the detail – I just used the title, to the exclusion of everything else.

Steve: One of the points you were making earlier was about the challenge of recruiting – I guess in Canberra specifically and maybe government is just a part of that. But what was – how did you get recruited to rejoin government?

Ruth: I was actually working at the former DTA, or DTO, as a consultant doing research work for them. Because it was the first time they were going through these ways of working – we were doing discovery, alpha, beta, live. And they needed people who had experience doing research, who could help lead that process and help facilitate it. And they were also recruiting for other specialists. So, I was one of the many specialists that was brought on, because they had to scale really quickly and trying to build that capability within weeks was not going to happen. So, they had to find it externally. So, I was one of those many externals that were brought in. When I came in I saw these ways of working and went, wow, I didn’t know we could do that in government. It’s a bit of a stereotype – people think about government as a very, very slow moving beast, that you can’t do a lot of things. But in reality, it’s about how do we frame it and how do we talk the language of the people who are decision makers to help bring about this change. So, for me, seeing that, I was like, I want to be part of this kind of stuff. It was a bit like a startup, working in all these ways. It’s like I found a home. So, seeing that was how I then made the leap. So, DTA – or DTO at that time – they kept asking, can you come on board? If you like doing this stuff, if you like government, come join. So, I took a while to think about it, quite a while, and then made the leap. Because it is a big leap to leave what is a very interesting job as a consultant, where you’re helping so many different clients in so many different problem spaces. One day you could be working for an airline, the next one for a bank and the next one for a not for profit. That’s really, really interesting. I like that. But there was this need and I wanted to be part of that movement as well, to transform the way we do things as government.

Steve: You’ve made this point a couple of times that even though the D in DTA is digital, the output, the outcomes don’t have to be digital. Do you have any examples where that’s kind of how it’s gone?

Ruth: Yeah. So, there was this great project a few years ago, done within DTA. It’s Medicare – Medicare is our health system where, if you’re a citizen, you get access to major health services. It’s covered by the government. So, one of the services they were looking at was having a baby. In Australia, when you have a baby, you have to register the baby for a Medicare card. There’s a whole range of process around the registration. And as someone who has gone through it myself in the last two years, it’s actually quite a tricky timeframe. You’re exhausted from delivering this amazing little thing, you’ve got a million things going through your mind and then you’ve got all these forms you have to fill in. And sometimes that involves, because of identity things, going into offices with a newborn. It’s just hard. And I’m particularly empathizing now that I have a little one myself. So, this particular project was looking at how do we make that process easier. Now what was interesting was that the team had gone – and I think they were thinking, maybe we just digitize the form. Maybe that’s what we mean when we talk about digital transformation. That might be the easiest way we can deliver something quickly. But what they soon found out – and we had a really awesome service designer called Mira who joined the team. She worked with them and she actually suggested they bring in a lawyer, because they kept hitting up against issues like, oh, well, of course we’ll just digitize the form. What else would we do? But when they went out doing the research they were finding that a lot of the stuff we were asking for as government, we already have and the hospitals have. Because when you give birth you’re giving a bunch of details over. So, they went out and did a bunch of research out in hospitals and what they ended up deciding was, you know what, we don’t actually need something. We don’t actually need a form. Or we don’t actually need a digital thing, like an app or a website to fill in. What happens if we just remove that process and assume that if we’re having a baby and we get consent, we can just issue the card? So, they were exploring this “what might we” kind of situation. So, it was radical, and people were going, “no, you can’t do that.” But when the lawyer joined the team, the lawyers were going, actually, you know what, we can make this work. So, the trial was in one of the hospitals up on the Gold Coast. It looked at what happens if you get consent from the parents at the time of the birth. They’re giving a bunch of information to the hospital about the birth of the baby – what happens if we use that and issue the Medicare card? So, it’s something that had gone from taking weeks and weeks, because you had to go in and see multiple people and get the thing signed, down to only a few days. It was amazing because you’ve now gone from filling in something to not even having a thing – stuff just appeared in your letter box. I think that’s a great example of transformation happening where you don’t have to have a particular, physical digital thing. And it was good to see what they did with it.

Steve: Are there things outside the professional realm, outside your work at DTA, where your skills – the things that you have a passion for and the things that you are just so great at – is there anything that you do outside where those things manifest in other forms?

Ruth: So, Canberra is a very small place and I love community. So, the things I do are not quite research related, but they come under people. I’m involved a lot in organizing community events, things like TEDxCanberra. I’ve been involved in that for about 5 years, bringing together and facilitating the amazing smart people that we have, for others to learn from. I really enjoy that aspect and that comes through in my job as well. I really love connecting people and seeing them grow, because my intent is, if you can grow and be a million times better than what I can ever be, that is an awesome outcome. You know, always hire people that are smarter than you. You want to encourage that growth. So, for me the stuff that manifests outside of my job has to do with things like that. Things like BarCamp back in the day. Something called GovHack, very early in the days – looking at open government data and how we help people flesh that out into interesting things. All the little stuff like that – again, we’re connecting community and people.

And something else I do that’s not even related is I love making. My hubby and I, we’re both makers. We like crafting things. So, I make science-themed jewelry for fun on the side, using our lasers and 3D printers. For some reason, that process of designing something that’s very, very tangible, and you’re prototyping that thing and you’re wearing it and you’re seeing people use it as well – it’s very interesting. So, although it’s not necessarily a research thing, it’s a fun thing because you are trying to use the creative side of your brain in a very different way. In a much more tangible way.

Steve: So, is that the connection to the creative?

Ruth: Yeah. And again, I’m just doing research on a very small scale – to wear my stuff and see how it works. It’s just very, very small-scale stuff. But you learn a lot about people’s behaviors, about when they say things like, oh Ruth, can you make that in these colors because I think people will buy that. And what people say they want and what they actually end up doing. We talk about that all the time, but to see it in action with your own product is very interesting, because you’ll make stuff because people ask for it. So, let’s just try it and see what happens. There’s not a need for it. It just sits there. But little things like that – really tiny stuff that we talk about in our practice – manifest in this way. It’s interesting.

Steve: Can you describe one of your pieces?

Ruth: Yeah, so one of the things that we do is we look at data. My hubby takes the weather data for cities in Australia – the Bureau of Meteorology here has about 100+ years of weather data. So, we take that and actually visualize it into a wearable piece. Unfortunately, I’m not wearing one today. I’m wearing a serotonin molecule today, which is happiness. We take that and model it into a bit of a big jewelry statement piece and then laser cut it. And that’s meant to show data visualization – how do we make science interesting and communicate these things in different ways that are not just a PowerPoint or a presentation or a survey, but actually a very tangible way of understanding data. So, I’m also interested in that crossover between research, the data, and then the data science folks. I just love geeking out on some of the data stuff and going, hey, we could find other weird and wonderful ways to display this. So, it may not have any actual use to researchers, but it’s just geeky and fun.

Steve: What do you think people who purchase your jewelry – because so much of our work is the people that make something have a model of it and other people that use it have a model, but sometimes our job is to just try to articulate that delta. So, you’re describing kind of what you’re putting into these pieces and even just the geeky pleasure. But for someone that buys one and wears something that you’ve made, what’s their interpretation or their narrative of what it is, or what it means?

Ruth: Yeah, it’s interesting. They’ll come up and they’ll go – they identify as a fellow science geek. Especially seeing that, they’re going “is that a serotonin molecule?” Like, it is. And then we have a conversation about science and about what makes them interested in science. I use it as kind of a talking point to work out how did they get into this field? Or if they’re not in the science field, how do they actually trigger – it’s a good way to practice research questions actually. But it’s just fascinating seeing that kind of shared – I think they’re having that shared platform or love of a particular field, in this case science. And that’s what connects people and they feel like oh, they’re part of this group of people who love these kind of geeky things.

Steve: There’s something, I don’t know, secret about it, that someone who gets it is going to get it when they see it and somebody else is just going to not recognize that.

Ruth: It’s funny. Not a secret club, but it is that recognition that somebody else enjoys this thing. Whether people have that in music, or whatever their particular passion is, when you connect on that kind of level it’s just so interesting. And then you start hearing their stories about how they either buy it for their daughter because their daughter had started doing this thing out in Antarctica – wow, just the stories you hear. I think that’s where it comes to my research – I just love hearing stories as well and you hear all these fascinating things that people tell you when you’re buying jewelry because it’s just a conversation starter.

Steve: Yeah, and you’re connecting on something. You’re right. It’s not secret. Secret is about being hidden. This is just about that connection, or something about the recognition – that’s a better way of putting it – to get those stories. That’s really wonderful. Is there anything else that we should cover in this conversation?

Ruth: I think we had a good range of discussion.

Steve: I think so too. Well, thank you so much. It’s been really great to chat with you.

Ruth: Thank you, Steve.

Steve: Thank you for sharing so much great stuff.

Ruth: That was really fun. Thank you very much for your help, for your time.

Steve: All right. That’s the wrap on another episode! Subscribe to Dollars to Donuts wherever you get podcasts. If you’re using Apple Podcasts, how’s about giving the podcast a rating (five stars?) and even a short review. This helps other people find out about the podcast. Portigal dot com slash podcast has transcripts, show notes, and all of the episodes. Follow the podcast on Twitter, and buy my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media or Amazon. Our theme music is by Bruce Todd.


35. Danielle Smith of Express Scripts


This episode of Dollars to Donuts features my interview with Danielle Smith, the Senior Director of Experience Research & Accessibility at Express Scripts.

Something that I’ve really changed the way I thought about since I’ve been at Express Scripts — we are in the healthcare ecosystem. So the experiences we deliver, if they are not of quality, they do have serious repercussions on people’s lives and people’s time. We are ethically bound to measure the user experience from different perspectives. Before something launches. We have prototypes or concepts or ideas, we do our due diligence in terms of user experience research, to make sure that the thing that we’re putting out into the world doesn’t just happen to people. – Danielle Smith

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

Many years after we moved into our house, we finally hung up our art. Sure, we had hung up individual pieces, something that was already in a frame, say. But it was always piecemeal, a nail here, a hook there. And we continued to accumulate meaningful pieces from travels or family events. And we continued to occasionally pull out the box (a moving box, made for a mirror, I believe) and spread out all the various bits and pieces and just generally fantasize about having them up.

My goal, however, was to have a plan, an intentional way of placing the different posters, prints, photos throughout our home. Every time we would try that, I would get overwhelmed and give up. I tried taking small bites: after seeing homes with big frames and medium photographs, we chose a few photographs from our travels, blew them up, ordered specially cut mat boards, and frames. We mapped out where in our living room these would go; we essentially carved off part of the home and planned the photographs that would go there. I hoped that this would simplify the challenge of where to put the remaining posters, but we found ourselves stuck, still.

Eventually, we opened up that box and made some hard decisions about what to hang and what to set aside, and then – before too much time could go by – arranged to get everything framed. We were inching closer, but sitting with our posters and prints, all framed, we still couldn’t figure out what to hang where and so (this shouldn’t be surprising) I got overwhelmed and gave up.

But buried in that frustration and surrender was a recognition of what skills I’m missing – an ability to reorganize visual information spatially in a few different ways, a set of starting principles for grouping, placing, and so on. Surely someone must have this expertise and be offering it as a service?

It turns out that, yes, there are professional picture-hangers. We called one and booked an appointment. A few weeks later, on the scheduled day, at the expected time, the doorbell rang. We answered the door and before we could finish greeting them, two men were in our foyer, one of them having magically unfolded and gently placed a table (please assume it transformed from a flat briefcase shape to a table with a soft whoop noise).

The guy with the table handled production. The other one handled creative (and the clients). We showed him all of our framed pieces and talked a little about what they meant, and we showed him around our house, pointing out areas we were interested in and the few pieces we had already hung. Meanwhile he was riffing constantly, throwing out ideas, getting energized, and delegating to the production guy, who began attaching extra mounting hooks and hanging wire to all of our store-bought frames. After a short time, we backed off, and we watched them work.

The “creative” began moving frames into different rooms, laying them on the floor and trying different combinations. Like a problem-solving algorithm, the solution began to appear, bit by bit. The floors throughout our house began to fill with clusters and arrangements of different prints, both thematic and visual groupings. We were called in for frequent consultations as the plan emerged. Eventually there was a plan for where everything was going to go. This was the piece we could not have done ourselves and in a short time, they had done it.

Then came the rest of the production. They began hanging up everything. That meant figuring out where each item went – exactly, putting in nails in exactly the right spot, using a level, all the details. Especially because so many pictures were clustered, something being slightly off would really show, so perfect execution was key. This was also something we could not have pulled off ourselves.

The final results were astonishing. Rather than hanging things in a grid, with the top edges aligned and a consistent space between each, they put together a number of clusters where the posters emanated from a central point in an almost-spiral flow. And they chose how to place different prints within that cluster in order to create a kinetic sense, such as having a poster with a bird along the left hand side, with the bird facing to the right, so that the content of the images supported the physical placement on the wall. This was not something we could have even imagined, let alone executed.

It is immensely gratifying to be in the presence of a highly-skilled individual. When those skills are being deployed for your benefit, it adds another layer of delight. I believe that delight is further enhanced when we ourselves have tried and failed. This story is a reminder to me to seek out these magically-talented individuals and take advantage of what they have to offer, whenever I can.

And it’s my goal to be a magically talented individual for the people I work with, someone who brings in a level of skill that isn’t achievable without my help. I strive to conduct user research with that kind of finesse. I hope that when I coach and train teams in doing research, I’m helping them see what that magical level of skill looks like and moving them forward on their own path towards achieving that.

I would love to hear from you about what you are working on and how my expertise can support you in moving your effort forward. Please keep me in mind.

Now let’s go to my interview with Danielle Smith. She’s the Senior Director of Experience Research & Accessibility at Express Scripts.

Steve: Well, Danielle, welcome to Dollars to Donuts. Thanks so much for being on the podcast.

Danielle Smith: Thanks for having me.

Steve: Why don’t we start with an introduction from you to say a little bit about what you do where you work.

Danielle: My name is Danielle Smith. I am Senior Director of Experience Research and Accessibility at Express Scripts. And Express Scripts is actually a Cigna company. But we are a pharmacy benefit provider. So we help with getting your prescription drugs. If you have insurance coverage, we are the company that helps with that. And my team measures the quality of the user experience from a few different perspectives. One includes user experience research. The other is our NPS program for our digitally engaged customers, and our data science team and some digital metric reporting also reside with my team.

Steve: So I want to ask about your team. But I want to just back up slightly – who are the kinds of people that are having this digital experience with your company?

Danielle: So our digital experience is primarily geared towards people who have prescription drug coverage through their health insurance provider, or directly if you’re on Medicare Part D, but who are interested in getting home delivery of their maintenance medications. Our website does have kind of general information. If you do have a health plan, you can go in there and look up a medicine, see how much it costs and at which pharmacy. But most of the functionality in our digital tools for consumers really looks at the home delivery of medicine.

Steve: And then you used a phrase to kind of describe what your team does. It’s about measuring quality. Is that right?

Danielle: Yeah, that’s sort of the way I think of it.

Steve: Can you pick apart that phrase a little bit? It’s an interesting one, and I’m curious how you think about it.

Danielle: Something that I’ve really thought about, or changed the way I thought about, since I’ve been at Express Scripts: we are in the healthcare ecosystem. So the experiences we deliver, if they are poor, or if they are not of quality, they do have serious repercussions on people’s lives and people’s time. So when I think about my team, we are almost ethically bound to measure the user experience from different perspectives. Before something launches – we have prototypes or concepts or ideas – we do our due diligence in terms of user experience research, to make sure that the thing that we’re putting out into the world doesn’t just happen to people. We have some ideas – it’s usability, it’s appropriateness. And we’ve really tried to test – “test” in air quotes – do research on everything before it hits the site. In some cases, of course, we can’t do that, and I’ll talk about that in a second. But I would say far above 80% of what hits the site goes through some level of user research. And another aspect of what we do to ensure quality is to make sure it’s accessible. And most of our accessibility work has really been geared towards visual impairments or different visual abilities, because we are the digital team. So my team serves as subject matter experts to help our developers and our marketers deliver experiences that are usable by people that use screen readers or have low vision, right out of the gate. So we don’t have a separate experience for folks who may have any sort of different visual abilities. That was a big point of focus for the first few years of my role here, to really get that going. But it’s part of our user experience practice, and it’s in the research team because we specifically do studies with blind and low vision users throughout the year, to make sure that even though something’s technically compliant, it is actually usable by folks that use a screen reader specifically. But following good design practice and rolling in those things before you launch still does not guarantee good experiences. And so the other parts of my team work to measure how what we build interacts with reality, and to see if it does actually generate a good experience and has, you know, high quality like we want. So the NPS program – I know NPS itself is a problematic metric and has its detractors in the industry. And that’s fine. I’m not here to defend it. But what it does is it lets us speak in a voice that executives understand, and it gives us the leeway to have an open channel with our users. So we send out these monthly surveys for one part of the program. But another part of the program allows users to leave feedback directly in our mobile app and on our website. Tell us what’s going on. So yeah, that gives us a score, we report the score, executives like the score. But the words that they use when they tell us if we’re on the mark or off the mark have been priceless in making sure that things actually work as intended, given reality. And then the other piece is behavioral analytics. So we run A/B tests, we instrument the site and make sure that things are working the way we think they should work in terms of people flowing through different funnels or parts of the site, and monitor the experience that way. So we work really closely with our product teams to help them understand their metrics, make sure they’re gathering metrics, and help them use them and interpret them in the right ways.

Steve: So when you use the word quality, what does that mean for you?

Danielle: For me, I’m using quality to be synonymous with a good user experience, and healthcare is so complicated. I’m not going to be so arrogant as to say we’re going to delight people. I just want them to be able to get their task accomplished, which is getting their prescriptions delivered or checking the status of their delivery. I want them to get their tasks accomplished with ease. And that’s what I mean by quality. Are we not getting in their way? Do we make them feel like they’re part of the process? And do they understand what’s going on? Because it’s complicated enough.

Steve: You also used the phrase – I hope I get this right – working as intended, given reality.

Danielle: Oh, yeah. Reality, when you’re talking about healthcare, is something that we cannot ignore. And we can’t even formulate all of the scenarios a person might be in – at least you can’t do that yet. And people’s realities may be, like right now, okay, well, now my doctor’s office is closed and I need a refill. How do I do that? It’s a pandemic, what’s going on? So we have to be making sure that we are listening to user feedback, making sure we update the website and even communications, and talking to our friends in the call center, to make sure that we are ready to respond to our users’ reality. Smaller, everyday situations happen too. As you can imagine, if you’re sick, there are some conditions that give you kind of a transient low vision situation. So you used the website yesterday; my team wants to make sure you can use the website tomorrow. Just because you’re on this medication or you have this condition should not take away your ability to use certain tools. We also have lots of families – family situations are fluid. Whoever has access to your account today, you might not want them to have access to your account tomorrow. And this is your health information. So we have to be always ready to listen and react when people’s contexts change, or their actual reality is way more complicated than we even thought of when we were designing the user interface.

Steve: Can you say a little more now about the word measure? It’s kind of the key verb that you use to describe, overall, what you all are doing.

Danielle: Sure, and I’m glad you picked up on that, because I don’t want to give people the wrong impression that we don’t do discovery or, you know, qual. But I am of the belief that the work that we do is still measurement. We may not have a typical metric, we might not have a number around it, but it is still the collecting of data to get at some underlying construct as best as we can. And I was recently reminded that I came from a grad program that was more quant leaning. And it’s probably why I grew my team the way that I did, including analytics as well as user research, and having them live side by side. So it’s not just about things we can put a number on; it is about understanding scenarios and understanding people and listening to how we can resonate with our users, and more. So I don’t want to miscategorize what we do, but I think of measure in the broad sense of the word.

Steve: I mean, it’s a really powerful frame for what research does, and there are lots of different frames for — you know, if you have to say one word, what is it that we do — and measure is not one that I have heard all that often, but it’s very compelling, especially when you explain, hey, there are lots of different kinds of measurements. Even when you were talking before about NPS, and part of its value as a way to open up a conversation with other people in the organization — when you say measure overall, I imagine that that is similar, that by framing this around measurement, you are positioning yourself relative to your stakeholders and colleagues, that this is the kind of guidance and information that you can bring.

Danielle: Yeah, and I talk about it that way. I talk about what my team does as X data — it sounds very superhero-like — it’s data about the user experience. And just because I say data does not mean it’s all quantitative, and it’s taken a while for our partners and even some folks on my team to get on board with this idea. But we need to be able to have some sort of convergence on the user experience. Convergent validity is a goal of most research teams that I’ve been exposed to, where you want analytics to show problems and user research to give you the other side of it, maybe with some survey data to give you a sense of the scale of those opinions. And by all being on the same team, it gives us the ability to be fluid in our methods and speak to the business in a way that allows them to hear us. Sometimes that way is to use NPS or a web analytics metric to get our foot in the door, and then expand how they view the users by layering on the qualitative; and sometimes it’s vice versa — we have different stakeholders who are more interested in qualitative data, and then we overlay some quant to help them understand scale and focus, if we’re even able to get to that level of maturity given where we are in the process.

Steve: Accessibility is part of the group, and for me it’s new to hear accessibility specifically called out alongside other roles. How did accessibility end up being part of what you’re focused on?

Danielle: Oh man, it’s hilarious. It was just a line in my job description — accessibility — like that. Okay. And — this is literally my first week on the job — I thought what that meant would be similar to the accessibility role I’d had in other jobs before, which was: as a usability person, I need to be aware accessibility exists, and I roll some tests into my normal course of evaluation that help the accessibility team check their boxes about compliance. However, I did not do my due diligence about looking at the Express Scripts website to understand what that line in my job description actually meant. At previous jobs, at Dell and larger companies, there are specific hardware interface requirements that come from the accessibility team and get handed to the usability team, and we just make sure, as we go through development, that they get rolled in. But what I did not notice about Express Scripts was that, back when I joined the company in 2016, they had two different websites. There was a link at the top of the site that said “accessible view,” and it took you to a text-only view of the site. During my first week on the job my boss is explaining this to me, and how nobody really wanted that to be the strategy going forward. I was like, oh, so we need to learn how to build accessible websites. She’s like, yeah. Okay, well, let me figure out how to do that. So it was a whole different perspective on the same problem, but it gave me the opportunity for us to say that we’re going to build a website that works for folks of different abilities, without having a separate site. It was personally offensive to me, as a Black person, to have something separate for people who are different — it really annoyed me. So I wanted to make that go away and have a more inclusive web experience. That became part of our world, because at the time Express Scripts was really open to having a much more modern digital experience. We came up with a plan to spread that awareness and that skill set across the organization. One person on my team really picked up the mantle and dove into accessibility best practices and compliance — she’s not a lawyer, but she built relationships across the organization to have that compliance network — and then dove into doing research with folks of different abilities, specifically people who use screen readers, because we wanted to make sure that not only did it pass all the checks, but that it, like I said, was usable — and she brought that video and those usability results into the organization so that people could understand what it meant to be inclusive in that way. And that just really fit with the way we do user research, because a lot of the point of user research is to bring the voice of the user from outside our walls inside our walls, to help people inside the company understand the operating environment of people outside of the company. Having a focus on accessibility within user research allows us to apply the same principles and practices that folks were used to, but with a different audience, and it just really worked well. And so — I can’t remember the date, but it was at least two years ago — we sunset that separate site. We have one experience, and we have a group of folks within the organization who are really passionate about this; developers help each other out.
Under normal circumstances we would actually run usability testing on site at the American Council of the Blind, to get more engagement from the community. It turned into a natural extension of the usability team — I’m showing my age — the user experience team, because as designers, as researchers, we are responsible for inclusivity, and accessibility is one part of that. In healthcare especially, we cannot in good conscience exclude that important a population. So we do have a focus on that, and we have it in the name of the group. We want to remind people that this is who we are and what we do, and now we have this culture in my group where I almost don’t have to — but I probably still will.
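(Editor’s aside: Danielle distinguishes between passing automated compliance checks and being genuinely usable with a screen reader. As a hedged illustration of only the first half of that distinction — the automated part, which cannot replace testing with blind and low-vision users — here is a minimal sketch that flags images missing alt text on a page. The URL and approach are assumptions for illustration, not Express Scripts’ actual tooling.)

```python
# Minimal sketch: flag <img> elements that lack a non-empty alt attribute.
# Note: an empty alt can be legitimate for purely decorative images, and this
# catches only one narrow class of issues; as noted in the conversation,
# automated checks do not guarantee screen-reader usability.
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url):
    """Return the src of each image on the page without a non-empty alt."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [img.get("src") for img in soup.find_all("img")
            if not (img.get("alt") or "").strip()]

if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    for src in images_missing_alt("https://example.com"):
        print("missing alt text:", src)
```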

Steve: Are there designers who are specifically focused on designing the accessible experiences?

Danielle: Every designer is responsible for it now. Like I said, my team and I help make sure that new designers have gone through whatever training modules we have available. Designing for accessibility is rolled into our design language, and it is part of the core competency of every designer, of the content team, of the development team — to just build it into everything we do.

Steve: Can you talk a little bit about some of the ways that your team — I know your team covers a lot of different functions — works with other parts of the organization?

Danielle: Express Scripts, being a healthcare org, is a little different than a lot of the technology companies I’ve worked with before: the design team lives in IT. So there is one big group that handles all of the external- and internal-facing technologies, and that’s a little different from what I was used to, but it works out really well here. We work very closely with our technical product owners and partners within the business.

Steve: How do you plan for what the people on your team are going to be working on?

Danielle: We are aligned to what the technology org is doing. So, I’d say that once a year we align on our big buckets of work, so we know we are going to work on the website — on the member-experience side — we’re going to work on the mobile application, and we’re going to work on maybe one or two other properties or experiences. Once we’ve aligned on that across the product team, the design team, and the front-end engineering team, then we each go off to our own teams and look at how much work we can actually do. This is an ongoing area of maturity for us, and as a leader of people it’s something I’ve had to learn to take different approaches to. I know this wasn’t really your question, but I’m going to talk about it anyway.

Steve: Go for it.

Danielle: In user research, design, content — the big tent of design — we’re all in this field because we have this level of empathy for folks; we believe we want to design things that make the world better. And what I’ve had to manage as a leader of people — who has to manage resources and organizational requests and put together budgets and plans and stuff — is that it’s hard to figure out how much work we actually can do when no one on the team wants to say no to anything. As a leader I’ve had to figure out different ways of coaching and managing, and it’s been very illuminating, because I’m sure I was this person when I was fresh out of school too: I can do it, I can do all this stuff — and then later my work quality suffers, it turns out I’m working on the weekends and at night, and I might start to resent the work I’m doing. If I had just raised my hand and said, hey, I know I said I could do all these things, but it turns out I can only do one of them, my life would be good. So as a leader of people I’ve been working really hard, closely with the other leaders on my team and with the people on the team, to help understand how much work can be done in a given situation; to evangelize a message of how much more impact we can have if we just focus on one or two things that we know we can achieve; and to say no politely — to start to build the muscle of no. That’s been my own personal growth area as a leader, and as an individual I’m guilty of it too, like I said. But planning the work and where to focus becomes easier as you learn, as a UX professional, to accept the things that you can do, to be cognizant and aware of what you can’t do, and to communicate that to your leadership and your peers, so that everyone knows what’s possible.

Steve: You reframed that for me — describing “no” as “I can’t do it,” not “no” as “I refuse to do it.” There’s a human limitation that we are always taking into account.

Danielle: Right, there’s a human limitation to everything. And we know that our work takes time. I mean, everyone’s work takes time, but if you’re going to do a good job — you know, as a consultant, it’s hard to tell a client no, but you’re not really telling them no; geez, you’re telling them, well, I could do that, but it would suck.

If you want to do it well, it’s gonna take this long — or we can do fewer things in a shorter amount of time. Having that conversation is a skill that needs to be developed, and it will make you much more successful in the long run.

Steve: Makes me think that it doesn’t always need to be a flat no; it can be, like the example you just modeled, coming back to your peers with trade-offs and consequences. So you’re facilitating something.

Danielle: Yes, and when I say my personal growth, that’s it. Because my growth is not just saying no; it’s saying, here’s why I don’t think it’s a good idea, or I won’t be able to do these things and here’s why, and here’s what I’d need — time, right — and here’s how I think we can engage other options to get this stuff done. It’s not a flat no, and it’s not the “yeah, I’ll do it” while I’m secretly working nights and weekends.

Steve: When I asked you about accessibility, you went all the way back to your first week on the job and what that was like, and we’re talking a little more now about where things are at right now. I wonder — maybe just as a stepping-back question — if you could describe a little more of that arc: what you came into, how your job has evolved over the years that you’ve been there, up to where we’re at now.

Danielle: When I came in, it was at the beginning of what we called our technology transformation, and part of that was building a design team. There were only a handful of folks, and I was brought in as a director to build a user research team. That was a lot of executive conversations about the value of user research and building an understanding of how we work — that it’s a process, where “quick” is in air quotes, because nothing is really quick in healthcare — but also doing some initial studies that demonstrated the utility of what we do, creating lots of new templates, and getting feedback on presentation style so that we could communicate value with consistency. And that didn’t take as long as I would have expected, and it wasn’t as hard as I expected. By that I mean: I come from other organizations that had more or less established UX disciplines, or some sort of experience with it, and I had been a consultant for a couple of years. So what I was expecting coming in was to have to do a lot of evangelizing, honestly, to talk about why this is important and why we needed it. I had to do very little of that, and that was a real surprise. The lesson for me isn’t that it was super easy; it’s just that doing that work did not take up as much of my life as I thought it would, because it was mostly an awareness problem. People just didn’t know that this existed, and once they heard about it and understood the work that we did, they became avid consumers. So we got a lot of fans. I did a couple of presentations to senior leadership — to the Express Scripts president at the time; we’d talk in the hallway — because back in those days there were a lot of conversations about what users were doing. And I also presented to clients a lot. The way that our business works, like I mentioned at the beginning, is that we help with your prescriptions if you have coverage with health insurance. So our clients are really big health insurance plans, or different businesses if they self-fund their health insurance. And because this was a new function within the company, we wanted to make sure our clients understood that we were doing this, so that if I sent out some survey and a member reached out to their health plan administrator, it wasn’t new news. That was a different thing for me, to have all these client conversations, and I still do that to an extent, but now our sales team kind of knows what we do, so they can speak to it. I get pulled into client conversations on a less frequent basis, but back then I was probably going to St. Louis a couple of times a month to show clients our usability lab and talk to them about what it is that we did. That was the first couple of years: usability practice maturity. Then we had a couple of folks on my team start to do analytics, and a data science lead, and they’ve taken over the NPS program to clean it up and systematize it, and now we have what I would say is a healthy analytics practice, where we can start to put things in place like A/B testing, talk about how that gets used in the organization, and become consultants for different analysis questions.
So we’ve gone from having little pieces of UX — there were a couple of UX people scattered across the organization, a couple of folks doing usability testing; they had started building a website before I got hired, honestly, so they knew they wanted to do this and they had executive buy-in — to really rolling it into product and having it not be an optional, check-the-box thing, which started to happen over the last couple of years. Now it’s almost gotten to be management of demand, holding up this umbrella shield to keep my team from getting sideways projects, as we call them. And my job itself: I started off with a small team as a director; now I’m a senior director and pretty well removed from doing research. Back then I think I managed a vendor on one project, I was pretty involved with a couple of others, and I might have done a survey or two myself. Now — I joke all the time — I don’t know what a PowerPoint slide is, I’ll barely do it. I just listen to what my team tells me, and I make a path for them to have impact on the organization. So my job has changed from building the competency, demonstrating that we have this competency, and getting buy-in, to making sure that the work we do is impactful. And I do that by clearing barriers, helping where I can to identify stakeholders, fixing weird problems in my own corner, and helping my team understand how the business works — and vice versa: helping the business, and new parts of the business, understand how my team works and how they may or may not work with us.

Steve: What are sideways projects?

Danielle: Sideways projects are — say I give some big presentation or a lunch-and-learn or something, and someone in a different part of the business — let’s say in our traditional IT department; say they make dashboards for infrastructure monitoring — they’ve heard of my team, they’ve heard of this user research thing, they love the idea, and they contact a person on my team and ask if they can help by running a usability test on their infrastructure monitoring thing. That is a sideways project. I have to make sure that we don’t say yes to those kinds of things. We don’t necessarily say no — there are other resources we can point people to, like I said — but we don’t get involved too deeply, just to preserve our sanity and focus. And it’s one of the tough ones. It’s not that those things are unworthy; it’s just the human limitation, the team’s ability to do the work. If you’re doing that, then you’re not doing something else, because everybody is busy. If we do have spare cycles, spare bandwidth, then we do consider those kinds of projects, but you shouldn’t count on it.

Steve: So you’ve talked about the healthcare ecosystem, but I assume, when you call it an ecosystem, that means you’re sort of outside — you’re not a care provider, for example. But I’m wondering if you’re impacted by regulatory concerns as part of your role in that ecosystem.

Danielle: Yes, we are. And that was another thing that surprised me during my first week on the job. I have an academic background doing user research, or research like this, and I’ve also been in industry for a good 10, 15 years. So I understood, and expected, that we would have NDAs that we’d need to put in place with our research participants, that we would have to get informed consent, and that we’d have to have certain ethical practices about letting people participate and back out — things that I’ve come to expect as second nature. What I did not understand was that healthcare is healthcare. There are quite a few laws that come into play about how you can talk about sensitive health information, and I had to get very friendly with our attorneys — I won’t say all of them, but quite a few. My first week, maybe, I got introduced to one of the attorneys, and that kicked off a series of meetings of me speaking with our attorneys — there are several different legal departments inside of a healthcare organization, I also learned — to educate a few of them about what user research is, how we can use the data, and who we can and cannot talk to. What’s very important as a user researcher in healthcare is that you are not soliciting information about somebody’s health condition; what you care about is the context in which they’re using our service. So we do have to — and I learned this in the first couple of months on the job, and I now have to teach it to new researchers on the team, and we’ve put processes in place — we do have to do very specific things to protect the data that people share with us. As you can imagine, if you are in a usability session about using a prototype to refill a prescription, we have to make sure that none of that data is real — it’s a prototype, so we’re not pulling your health records to build it — but as you give feedback, you might start talking about what happened last time, and start talking about your specific prescription drugs, and my researchers have to redirect you. The reason why is because that is protected health information, and we have to be sensitive to that. It’s just something I never really thought about, coming from academia and coming from industry. A common thing we do — and I’m sure people still do — is guerrilla hallway research, where we just grab somebody, with the sponsor of the project watching, and get some feedback or do an interview about how they might manage a certain situation at home. Well, you do that in healthcare, and you’re talking to an employee, and they might, as part of the usability evaluation or interview, reveal some of their health conditions, and we have observers present. That might be a breach of their privacy. So, again, we have to be careful about who observes sessions, who has access to recordings, and how we anonymize and resear