Dollars to Donuts

Updated 8 days ago

Rank #200 in Management category

Business
Management
Entrepreneurship

The podcast where we talk with the people who lead user research in their organization.

iTunes Ratings

36 Ratings
Rating distribution (5 stars to 1 star): 36, 0, 0, 0, 0

A standout

By EFOhio - Oct 11 2019
Steve Portigal is a favorite of mine when learning and teaching people how to do User research. This takes it a step further; interviewing heads of research departments from Citrix to ADP to Goldman Sachs and on the flip side, Etsy. There’s no “why are we here” here, just great content.

Great podcast

By Alwe9405 - Mar 27 2019
Excited to find a solely user research focused podcast!

Latest release on Jan 07, 2020

7. Judd Antin of Airbnb


We kick off the second season with Judd Antin, the Director of Experience Research at Airbnb. Judd and I speak about their model for embedding talented generalists with product teams, skill-sharing among researchers, and just what exactly makes research “sexy.”

I don’t know of another way to do things better than to give and get feedback. It should flow like a river. And I think that can be hard, to be open, to focus on strengths rather than weaknesses. To be humble is a thing we’re always seeking to be better at, but that’s how I approach the task of building a team. – Judd Antin

Show Links

Follow Dollars to Donuts on Twitter (and Stitcher) and take a moment to leave a review on iTunes.

Transcript

(French version)

Steve Portigal: Welcome Judd.

Judd Antin: Thanks, Steve.

Steve: It’s great to speak with you. Maybe let’s just start as we always do with just kind of the broad strokes. Do you want to introduce yourself? Tell us a bit about you and what you do, what your role is.

Judd: Sure. Sure, happy to. So I am the Director of Experience Research here. So experience research at Airbnb is a team that was formerly called Insights, but what we really are is a UX research team. We are embedded in our design organization so we have designers, content strategists, product managers, engineers, data scientists as our partners and we’re a team of 17 folks at the moment. We have a pretty diverse group of people on the team which is something we do on purpose. And everybody is sort of embedded in their teams, working directly on the day to day of products for guests and hosts all the time. So, yeah.

Steve: That’s good. And of course those intros, as we know from research, like we could just follow up on all the things that you said. That would probably fill our whole interview. Let me go back to some things. Can you explain Airbnb?

Judd: Sure I can. Let’s see if I can do this in a nutshell. So Airbnb is a marketplace that lets hosts who have space come together with guests who need it. So…we provide an opportunity to travel in a much more local way. So I think you know, it’s – you know Airbnb has been one of the poster children for the sharing economy but I think for us the way that we think about it is that Airbnb is an opportunity to connect hosts and guests together, to have much more genuine local experiences. Local in the sense that you’re traveling to a place that is pro-, potentially just in a neighborhood. You may be staying in a spare room or a place where your host is either there or they’re on vacation and you’re sort of experiencing their city from their point of view and you know getting recommendations from your host about where to go in the neighborhood. So I think that is the thing that makes Airbnb really beautiful and unique is that it’s kind of this view of travel which is to say yeah you show up in San Francisco, the Golden Gate Bridge and Alcatraz and Coit Tower, those are important things that we know from research that everyone wants to see, but if you never left Union Square and Fisherman’s Wharf it would be hard to say that you got a real sense of what San Francisco is like. So Airbnb really lets that happen.

Steve: And you started off at that great explanation by contrasting sort of a common or public perception around it – you said “poster child for sharing economy.” Is there something that – and obviously everyone in the company – everyone in any company understands the contrast between how they’re perceived and – I’m just thinking about you in the role that you have and what your team does. Do you deal with that – I don’t know if it’s a gap or delta – the world thinks of us as this. You know we go out and people to sort of understand. I don’t know…

Judd: Oh yeah

Steve: You’re nodding so I’ll let you answer.

Judd: All the time. We spend all of our time talking to guests and hosts on Airbnb. We make a concerted effort through travel and remote research to get outside of the Bay Area bubble for example because, you know, Airbnb is very much present here. But what that allows us to do is see what the everyday experience of traveling or hosting on Airbnb is like for people and I think that means that we get a huge dose of real talk. You know we have a mission – you know sort of our tagline as a company is belong anywhere which is something that we all pretty deeply believe. We want people to be able to have a travel experience which lets them have a really genuine welcoming experience of a place that they can start to feel at home there and belong there. But we learn a lot about how that vision sort of translates to reality in the everyday life of a host, or the everyday life of a guest who is traveling. For hosts that might mean we expose the diversity of hosts. So for a lot of hosts we talk to the – the money is really an important reason that they do it, but they also feel a strong connection to hospitality. They like to experience people who are visiting them from all over the world and they talk about that as a motivation for doing it. For travelers, similarly, we find out that finding a cheap place to stay is often – I mean let’s – however much – that’s not going to be a part of our vision at Airbnb, but it is a reality of the way that people think about Airbnb when they travel. It is a part of their motivation for getting on the platform. And so I think one of the great things about the research team is that we can be that dose of real talk. We can say like okay this is important. This vision is important. The mission is important. But let’s talk about the real lives of guests and hosts and what matters to them.

Steve: How do you characterize – I don’t know, I can only ask it through suggesting an appropriate metaphor that – and I know the synthesis or the ping ponging, if you’ve got – you know here’s public perception, here’s what real people involved – here’s what the real talk says. Here’s what our internal aspirations or you know Kool-Aid or whatever it is for any organization – I mean how – what’s the sort of – tell me about how those things come together. You have to deal with what’s believed and what the aspirations are, with what the world says and with what the real talk from real people is.

Judd: I think we have a pretty happy balance between the Kool-aid, or mission driven aspect of what the company does, and the real thing, the real talk of people. What we do most consciously is probably to remind people in this building that we are nothing like the vast majority of people on Airbnb. And that’s where I think the real talk matters because if you make a bunch of assumptions on the basis of what seems obvious or intuitive to us here in this fantastic building in San Francisco in SOMA, like we’re going to get it wrong. We’re going to get it wrong for California, for the United States and certainly for the world, the vast – the majority of our business doesn’t happen in the United States. So I think that is one of the most important roles of real talk is just to say… It’s not that difficult. I don’t think we feel a real tension between the mission and the values and the everyday lived experience of guests and hosts. We just have to remind people that most people’s daily lives are nothing like ours and so we just have to keep that in mind when we make design and product decisions.

Steve: I love that that’s not a tension. Those things can work well together. Not every organization is the same and maybe elsewhere you’ve observed that as more of a tension.

Judd: Yeah

Steve: But, why? How does that work well here? What’s going on?

Judd: Here’s my theory. My theory is that it’s because we’re a travel company and here’s a list of things that we don’t have to worry about – advertisers, monetizing engagement. What that means is that to the degree we can get people to find the perfect place for them, where they want to travel, get them out of our digital products and out into the world to travel, that’s success. You know what I mean. And so I think we don’t have to make a lot of assumptions about what motivates sort of digital behavior. We have to get people to travel and I think it attracts a certain type of a person who – and you know that combined with our mission for belong anywhere, we think it’s easy to have that kind of empathy for the world because traveling is such a – as an experience is kind of full of empathy. It’s like understand the world from someone else’s point of view.

And as a research team we kind of focus on that too. We’re like hey product team, let us help you understand the world from another point of view, the point of view of guests and hosts that are nothing like you.

Steve: That’s wonderful. So the company that’s about getting out of – giving people the chance to experience something differently than what they’ve experienced, in other words travel, is one way that a culture gets created that there’s a hunger to understand – a willingness and a hunger to understand.

Judd: Yeah.

Steve: The world is different than what we’ve assumed.

Judd: Yeah. And I think that’s a big selling point for researchers who want to work at Airbnb because you know they think – travel is such an evocative experience. You know if you look back at the – I think the average person who thinks back over their life to the 10 experiences that were most educational, most instructive, most inspirational for you, a good chunk of them are probably travel experiences. So people’s eyes are open. Like it’s emotional. It’s intellectual. And so as a topic for research, you know facilitating that kind of travel experience is really sexy for most researchers we talk to. It feels really good. It feels really real. You know I think sometimes working in the domain of digital products you can think what does this all add up to? You know what experience I’m really making. Like we don’t have that problem because travel, we’ve all had experiences of travel that are truly sort of transformational.

Steve: So what are the experiences that researchers have? Can you give some context or situations where researchers are involved in – are researchers traveling to study travel for example?

Judd: Yes. So we have done a fair amount, not as much as I want us to do, but we have started to do travel, primarily as a way of understanding – getting outside that Bay Area bubble. We spend a bunch of time in the homes of hosts. That’s one thing I think we do a lot of. So one thing we haven’t done yet is kind of like I think a travel along project. You know if you research at Uber you could like ride along with an Uber rider or driver. We haven’t done a travel along. I don’t know, that seems a little invasive and weird. But we spend a bunch of time in hosts’ homes and one of the reasons we do that is because that’s one of those things where it’s very difficult to get a full picture of the day to day of a host, the things that challenge them just in the flow of trying to provide hospitality and use our products without really getting into it with them. Find out what’s their routine of sort of messaging with potential guests – you know scheduling, cleaning, doing key exchange, all of that day to day stuff. It’s really hard to find out the mechanics of that without being there. And it can be very difficult to get the context and depth and empathy we need about motivations for hosting, let’s say, without a really in depth face to face conversation and where better to have that conversation than in the home.

Steve: Right. It’s just funny sitting here that you refer to it as their homes. Of course it’s their homes, but I’ve only ever been a guest, never been a host. So the unit that you’re referring to, this environment, it’s a place that I stay. I forgot that it’s somebody else’s home. So to hear you say that, even the language, which I think is probably a key – that’s probably key for you guys to be talking about these as homes.

Judd: Yeah. No, I think it’s true. And you know I’ve been thinking a lot about the language we use. You know I think in research – you know I don’t know many people in user experience who use the word subjects. We mostly talk about participants and we use – you know sometimes we talk about users. You may – you know we can talk about whether that’s a good thing or a bad thing, but I kind of always want to turn to guests and hosts – the things we do for guests and hosts because it’s a reminder of – it’s similar to the way you said oh right it’s someone’s home. It’s like right, this is a guest, this is a traveler and this is a traveler who is showing up in somebody else’s space, in their home where they live and sharing that space whether or not the host is physically present at that time – they’re sharing that space. And on the flip-side like this is a host. Not only is hospitality and hosting something that we care a lot about as a company, something we try to promote with our products and look into with our research, but it’s a home. You know it’s really personal. Again, like that is a sexy thing for researchers because it’s like one of the most important things in your life is where you rest your head. And the idea of letting somebody else into that home can seem – you know that’s where the issues of trust come in that are so interesting to look at because you have so many dimensions of that. You have the sense of trust that’s instilled in Airbnb as a brand, Airbnb as a platform. The way we facilitate relationships between guests and hosts. And then you have this face to face personal thing where I am letting you into my house if I’m a host or I’m going into your house if I’m a guest. And you know I have a lot of people when I talk about Airbnb that ask about trust and safety and things like that. And the only thing I’ll say about that is it gives me a huge amount of faith in humanity how rarely there’s a serious issue. You know. The number – like millions and millions of nights are stayed in Airbnbs and the number of serious issues that happen when people trust each other and guests go into homes is just minuscule. From our point of view it’s too many – one is too many.

Steve: Right.

Judd: But it’s really sort of restoring a faith in humanity and I appreciate that we are constantly on the lookout for doing research that can facilitate that. Like creating wonderful rich, trusting interactions between guests and hosts.

Steve: And another reframe for me that’s helpful to think about this is these two groups of people that are being connected as opposed to a bed and a toilet for the night.

Judd: And to the end of real talk right. There are some guests who do see it as a bed and a toilet. You know, it’s a cheaper bed and a toilet than they can get at a hotel. But not all guests, probably not most guests. You know the aspect of hospitality, the aspect of getting into a neighborhood, living like a local, it’s really powerful and I think it’s a real motivator for Airbnb.

Steve: I want to dive into some other language that you used and maybe get you to unpack that a bit. You said a couple of times that certain kinds of problems or areas are sexy for researchers. So that suggests some insight into what appeals to researchers. Like what we like doing. So what does sexy mean for researchers?

Judd: Yeah. So in my experience, when I say sexy – when I say sexy I mean two things. One, a topic for research which is really meaty, which you can really wrap your arms around and spend years at a time diving into. It has a lot of dimensions. It’s not easy. It’s probably a challenge. I think most researchers find the challenge sexy. For example it is something that requires us to deeply understand a problem and figure out how to translate it into design and product and communication. So that’s one aspect of sexy. And then the other thing that’s sexy I think for a researcher is the fact that they can be set up for a direct line to impact product with their work. I think when I talk – I talk to a lot of candidates and when I think about how research is structured in their organization what I hear is well they had to advocate for budget, for a study, and it took them several months to do that. And then they completed the study and then they had to present it up the chain three or four or five levels to get that VP to advocate another VP to get that PM to put it into the roadmap. And it hurts my heart. You know it’s like the worst way to do research. And so I think researchers find it sexy when they don’t have to do that which is – which is one of the beautiful things of Airbnb which is that we don’t have a perfect situation but we have researchers set up as partners, with designers, with PMs, content people, data scientists, engineers, at every stage of the product cycle. The way that they have impact is through being there every day for direct conversations about what we should build? How should we build it and why? Are we doing it right? Iterative development, launch and learn, repeat, you know. And that is sexy to a researcher.

Steve: Can we talk more about that staging or how people are set up? So there’s a number of individuals with different roles that are partnered together.

Judd: Sure. So, yeah – so we have an embedded model for research where – which I would contrast to the more service-based model where there’s like a central organization that sort of fields requests and sends out a researcher into the field. Our model is an embedded model where a researcher gets situated into a product. Let’s say that product is search. That researcher sits physically next to, hopefully between the PM and the designer, the engineer, the data scientist. And I make a big deal of that physical presence because I think that’s where a lot of the action is. There are a lot of – I’ve had conversations with researchers before where they say I need to get into that meeting. I go, what meeting? And they go that meeting where all the decisions happened. And usually I look into it and find out that meeting doesn’t exist. That meeting happened like in between her desk and his desk at 1:45 and it happened because – like I’m elbowing a shoulder right now.

Steve: Um-hmm.

Judd: Like that’s why it’s so important to physically have that person there, physically present for everything that’s going on in the product cycle. And that person finds partners in all the other functions. So certainly with design – we have very close relationships with design, product managers. We have a strong and wonderful data science team here. Engineers of course. Content strategists. Everybody is kind of at the table, one very collaborative team trying to make great product.

Steve: So I have a dumb question and we can say there are no dumb questions.

Judd: There are dumb questions.

Steve: Thank you. Thank you for affirming my dumbness. You know product cycles, development process focuses on different tasks are hot and heavy at different points. So I’ll just ask this in the dumbest way possible, what’s the researcher doing at different stages of that? What does it look like for them?

Judd: Sure. Yeah. So I guess it would start with an exploratory or formative stage where we’re trying to figure out what we should build. What are the problems? What do they look like? Why are they problems? So very often that’s a place for in-depth qualitative research. We love to have a model where we do in-depth qualitative research and then we follow it up with really rigorous larger scale survey work to check prevalence. So that kind of one-two punch is really powerful for us so we can say here’s a deep, contextual rich understanding of these problems and we know a lot about how prevalent they are and we can segment them a little bit in ways that matter to us. So cool, we did that. Okay now maybe we have a product road-map. Or we’ve decided to build a thing in a certain way and the designers were hopefully participants in that research from the very beginning. Very often they are in the observation room and they’re coming on home visits. They’re participating in feedback on interview guides and surveys, all that stuff. But then I think they really kick into gear at that point where the implications for design kicks in from the study and we are starting to do sketches and ideation and visioning work and I’m a huge advocate of low fidelity prototype testing as a way of sort of pointing the ship. So at that point the researcher might be doing sort of a rolling study where at first mocks maybe Framer prototype, sort of golden path, Wizard of Oz-y things that are a little bit more rich and interactive. Doing repeated user studies with those. And then at a certain point in our cycle we’re sort of iterating and we end up passing off to engineering. Again, like the engineers are often there through the whole process, but then the sort of hot and heavy engineering kicks in. At that point there might be some more design refinement that can go on, or if there’s a concurrent project that the researcher is working on. But we get towards a stage where we have a functional prototype we can test we tweak that before launch. We launch. Now we can talk to real users in the wild and we get all that great ecological validity and then we’re back to square one in a way which is okay. Maybe we worked on a thing, we improved it, but a whole new set of problems reveal themselves. Like no design solution is perfect and so the researcher’s job is never done.

Steve: That’s awesome. I feel like applauding that narrative. So everybody’s busy doing this stuff. Again, I guess that’s the embedded model kind of brought to fruition. In a service model you probably wouldn’t either have the resources or wouldn’t choose to do it that way.

Judd: Yeah, I mean – so here’s – like the way that I think about that trade-off is that – the greatest thing about a service model that I can see is that you can really provide the person with the perfect expertise for a product – for a research project that’s being requested. We have to have a model that’s a little bit more like a generalist. So people on my team are generally expert at something. So maybe they’re just more passionate and experienced with diary studies, or surveys. They do little stats. You know we’re a very multi-method team. But they kind of have to be a generalist because whatever – if you’re embedded, whatever research need is kind of thrown at you, you need to handle it. You need to handle it. You need to collaborate with another researcher who’s an expert. If it’s sort of at the edge of your skill set you need to work with the data scientist. But it demands this generalist model.

And then the other thing that I think is important is that in the service model – like I – we don’t spend a lot – like the amount of time we spend on reporting varies a lot, but it is generally not a huge proportion of their time. I think of a huge amount of time spent on beautiful Keynote or PowerPoint decks as the price you have to pay for a service organization rather than an embedded organization because when a researcher is embedded presumably that PM, that designer, the engineer, they were along for the whole thing. They’ve seen it at every stage. They were in the back room and you debriefed after every session or at the end of every day. They participated in the affinity diagramming and so we may still do a video reel, cut together some clips. We may put together a Keynote, but that’s not the primary deliverable. They’ve been getting it the whole time. It was a constant back and forth. I just think that’s so much more efficient and it lets a researcher develop like real product expertise in that area rather than parachuting in and out all the time.

Steve: Are there ways that – as you said they have to work as generalists. But you have diverse staffing, people with different backgrounds and different experiences. Is there any sense of kind of community practice that you get very specifically in that centralized model? Like the researchers sit together and they share stuff, but what’s the gestalt you can create in this model?

Judd: I think we have to work just a little bit harder for it because the peripheral awareness isn’t necessarily there, the way that it would be if you had one team and sort of everybody knew who was allocated to which project at which time. And honestly like we don’t have a perfect solution to that yet. I’ve never been a part of a research team where we didn’t have a problem with that type of information sharing and it’s really important and we’re trying to do better at it. Actually we’re talking about it right now because number one we want that community of practice. We want feedback to flow. We want people to find beautiful, like synergy of research questions. We don’t want to have people doing the same project in parallel or reinventing the wheel. And all those things require that kind of peripheral awareness of what everybody is working on, or the ability to get that information quickly. On the other hand you have this sort of classic, collective action problem where if it’s too onerous for any one person to provide that information, or consume it for that matter, then the whole model falls apart because it’s sort of only as strong as its weakest link. So we have to figure out how to create a lightweight model. And to be honest the way we’re doing that right now is with Trello. I don’t know if you ever use Trello? It’s a pretty low tech tool in the sense that it’s like the closest analogy – Alex Schleifer, our VP of Design and my boss, he calls it like a digital whiteboard. It’s got columns and cards and columns. So a product team has a column and a card is an active project. And it might have a link to a research brief or an interview guide, but it’s just that simple. And that’s it. And people can check that to find out what’s going on. They can update it. They can move a card into the done pile. I don’t think it’s a perfect solution, but you call that a real problem that we work hard at.
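
To make Judd’s “digital whiteboard” concrete, here is a minimal sketch of that kind of lightweight tracking board, written in Python purely as an illustration: the column-per-product-team and card-per-project structure comes from his description, but all field names, methods, and the archiving step are hypothetical, not Airbnb’s actual Trello setup.

```python
# Illustrative sketch only -- a Trello-like board for research visibility:
# one column per product team, one card per active project, a "done" pile
# for recently completed work. Names and fields are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class Card:
    title: str                       # the active research project
    researcher: str
    brief_url: Optional[str] = None  # link to the research brief
    guide_url: Optional[str] = None  # link to the interview guide or survey


@dataclass
class Board:
    columns: Dict[str, List[Card]] = field(default_factory=dict)  # keyed by product team
    done: List[Card] = field(default_factory=list)                # recently completed projects

    def add(self, team: str, card: Card) -> None:
        self.columns.setdefault(team, []).append(card)

    def complete(self, team: str, title: str) -> None:
        for card in self.columns.get(team, []):
            if card.title == title:
                self.columns[team].remove(card)
                self.done.append(card)  # stays visible until archived to a research repository
                return

    def whats_going_on(self) -> Dict[str, List[str]]:
        """The lightweight peripheral awareness: who is researching what right now."""
        return {team: [c.title for c in cards] for team, cards in self.columns.items()}


board = Board()
board.add("Search", Card("Low-fidelity prototype study", "Researcher A"))
board.add("Host tools", Card("Host onboarding diary study", "Researcher B"))
print(board.whats_going_on())
```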

Steve: So everyone can see what’s being researched? Or see what has been? You can look in the done pile and see what’s been done?

Judd: Yeah. So we’re still working this out, but the model is that things that are recently completed stay in the done pile. Eventually they kind of get archived and they’ll live in kind of a research repository elsewhere. But things that are active or recently completed, it’s sort of a one stop shop to view them all. We’re going to try it out and see how it works. Like I don’t think there’s a perfect solution for this, but it’s something that I hear from the team that we need to work hard at. Partly because we find those collaborations and cross-pollinations and partly because it helps us feel like a team and like we have that community of practice and that’s something that we work on actively promoting.

Steve: That sounds like another sort of – that’s the tension of the distributed – I’m sorry, the embedded vs. the centralized model. Not that the centralized model has it figured out because you know creating deliverables – I don’t know how many times I’ve encountered the like – I’ve had – I’ve been the vendor outside an organization and had people in that organization come into a new role and not know of the project and have to reach out outside the organizational walls to get that collateral or those deliverables from a vendor from a year ago, me. So it’s not like if you – it’s not like the centralized model always works for that. The deliverables just have a shelf life and they disappear.

Judd: For sure.

Steve: So is this knowledge management we’re kind of talking about here?

Judd: Yeah. You have somebody with institutional memory and they’re always there. Although of course moving from team to team to sort of cross-pollinate is a good thing, but we probably do that on the order of like 18 months to 2 years. I think everything is a trade-off. I choose to optimize for perfectly positioning a researcher for impact and to be a voice, a constant voice in the product process. The trade-off of that decision is that we have to work harder at the community of practice bit. Thankfully we have a really passionate and collaborative group of researchers at Airbnb who are devoted to that, actively seeking that and we’re small enough that we can have – you know small enough physically in that we can all sort of see each other in the space of 30 seconds and that we can meet in a reasonable sized room and share ideas on a regular basis.

Steve: Are there – do you have regular meet-ups or certain types of activities that are meant to make that happen?

Judd: Sure. You know we are in the midst of a concerted effort for information sharing. So we meet all the time. We meet every Monday morning for a standup which is sort of go around the table and everybody say the two things they’re working on that week. We meet every Tuesday for a kind of what I could call more like a team meeting. Like we get into it. We talk about recruiting and hiring. We often have special topics. We talk about growth. We have – growth in the sense of team growth and individual growth. We talk about sort of big picture product and design context stuff that’s going on. We also have something that Kathy Lee (one of the researchers on the team) put together which is Shop Talk. So it’s like a Friday event in which people can have a beer and give a presentation they would have given to their product team to the other researchers. And we’re also instituting skillshares. I sort of demand that everybody on the team – so my model for this is sort of, in my memory as somebody who got a PhD – so when we get a PhD you become the world’s foremost expert at some tiny little thing that nobody cares about and then you write a dissertation which if you’re lucky your advisor will read and then you’re kind of done with it. But, so clearly that’s why I’m not in academia, but one of the great things about that model I think is that you carve out this niche for yourself which is your calling card and that’s a thing that I ask every researcher on my team to do. Yes there’s this baseline that we must be rigorous, beautiful generalists, but what is the thing that you’re uniquely good at as a researcher? What is your calling card? What is the thing that you want to at first sort of really consume all the materials you can find about, go to seminars, take workshops and read books on and then before you know it you’re writing the books and you’re teaching the seminars and you’re giving brown bags to the rest of the team. I just think if you have a team where everybody has that unique thing, an aspect of methodology or practice, communication or whatever it is, that they are passionate about developing into a unique skill and they’re all committed to sharing it with each other, that’s my model of a team that grows individually and grows as a team. Like that sounds awesome to me and we’re work- you know I don’t think that we are perfect in that regard, but we have that as an aspiration.

Steve: What are some types of things on your team – topics for different researchers, that’s their thing?

Judd: Sure. There are methodological topics – just as simple as survey best practices. So we have somebody on the team, Janna Bray, who has made it sort of a thing she’s really vocal and generous about to raise everybody’s game at surveys. We have another researcher, Steffenn Kuhr who is really passionate about rigorous evaluative research in which you are trying to bring as much objectivity to the sort of usability and user testing process as you can and he has some really interesting ideas about how to do that that he’s honed over his career. There are also folks who want to bring aspects that are a little further from sort of core research methodology as their special sauce. So for example, Natalie Tulsiani is a researcher on the team who recently gave a workshop on moderating the observer room. So the topic is as a researcher moderating the group of people who are observing a study. That just kills me. I’m like that is so great. You know what I mean. I worked with a researcher at Facebook whose thing was improv comedy. You know she thought improv could make you a better qualitative researcher and I believe she’s right. So that was her thing. She was passionate about that. I think it could be all over the map, but I just love the idea that everybody on the team has their thing and they’re just chasing it and bringing everyone along with them.

Steve: Those are great. Those are good examples. So the idea of the individual passion is a driver here.

Judd: Yeah, I mean look, the research shows just in general that when somebody stops growing in their job and they feel like they’re in a rut they’re going to leave. So just from a purely practical point of view, it’s like in my interest as somebody trying to build a team to keep everybody growing around things they’re passionate about. But I also think that’s the shortest path to everybody building everybody else’s game to be stronger. Everybody raising everybody else’s game. We will – that is the path to becoming a world class research team and I think it’s both demanding of an individual to always be raising their game and demanding of that person that they be generous and collaborative with the researchers around them to do the same.

Steve: Right. So as a member of that team I am getting energy and reward from being able to follow that passion, but also I’m receiving from others.

Judd: Yeah.

Steve: From them, so I’m always growing and developing as part of that team.

Judd: And if you do that I think developing that community of practice is easier. You know it’s easy to see where it comes from and how it grows.

Steve: So you’ve talked about growing the team. Can you go back in time and describe a bit of the history, the evolution? Where you’ve come from? Where you’re looking to go?

Judd: Sure. So my history at Airbnb is admittedly a little bit short. I’ve been here since the middle of May, so what amounts to about 6 months. I was the 10th researcher. As of this week we have 17. So we’re growing quickly, but we’re not growing for growth’s sake, we’re growing because – I think because the embedded model we have means that awareness of research is high, the team is producing great work, and it’s just I’m getting more demands than I can reasonably service right now. So I need more researchers and that’s a beautiful problem to have.

Steve: So there are teams that don’t have an embedded researcher that…

Judd: That’s right.

Steve: That team needs somebody.

Judd: That’s right and we’re in this moment of accelerated hiring which is necessary in my mind because there are enough teams that have no point of contact, that getting – I mean going from zero to one is sort of technically an infinite improvement, right. And in the research that we do it is really dramatic at that moment. And I think this is – so we’re trying to take advantage of that inflection point. A lot of research teams in industry go on a ratio, right. So a ratio of designers to engineers, researchers to designers, for example. So they might say like 3 to 1 is the right ratio. One researcher to three designers, or one researcher to three designers and PMs together. And what we’re trying to do is just get ahead of that and then back off. And the reason that we’re trying to get ahead of it right now is because we’ve – I think we’re at that moment where having a research voice in all of these teams that have sprung up and don’t have any research representation is going to be disproportionately valuable. So we’re trying to concentrate growth right at this moment and then we’re going to slow down.
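
As a rough sketch of the ratio-based sizing Judd is describing, the arithmetic below is shown in Python; the 3-to-1 ratio is the rule of thumb he cites, but the team names and headcounts are invented for illustration and are not Airbnb’s numbers.

```python
# Illustrative arithmetic only: sizing a research team against a target ratio
# of design/PM partners per researcher (e.g. the 3-to-1 rule of thumb).
# All headcounts below are hypothetical.
import math


def researchers_needed(partners: int, ratio: int = 3) -> int:
    """Round up so coverage never falls below the target ratio."""
    return math.ceil(partners / ratio)


partners_by_team = {"Search": 14, "Host tools": 16, "Trips": 12, "Guest growth": 10, "CX tools": 8}
current_researchers = 17  # the team size Judd cites at the time of this conversation

target = researchers_needed(sum(partners_by_team.values()))
print(f"Target at 3:1 = {target}; current = {current_researchers}; hiring gap = {target - current_researchers}")
```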

Steve: So there’s a context here of sort of shift in product development – let me see if I can ask this properly. You’re describing the teams are springing up so that says that what the product is is evolving and growing – do you call the thing a product – what you as a company provide and the different pieces you’re creating to do that…

Judd: Yeah.

Steve: …is evolving and growing so there’s a thing that’s being made that maybe wasn’t being made at some point and that’s a new team?

Judd: Right. Or you know – and I don’t – I’m not an expert in the growth of companies or product teams of course, but in this one, you know I think as the product organization matures there are two things that happen. One, new products spring up. The other, existing products, the pie grows, right, so that the amount of work is – becomes specialized enough to carve out a new team. It’s just too much. So where there was just one team before that for example looked after all the things we do around host dashboards and host tools, for example, well now we’ve reached a scale and a complexity of the product such that there are several important work streams. And then they break them out and PMs and designers and researchers and data scientists specialize in those areas. And so I think – that’s a thing that’s kind of been happening, like spawning teams at Airbnb, and in that moment having a researcher at the table is super crucial. We don’t want to lose that momentum and there’s enough important work there – a lot of those people have come from other teams where they had a researcher and they’re like hey I’m working on this new thing, where’s my researcher? I love that. I love getting those requests. Why isn’t there a researcher in the room for this? It means I have to say no more than I want to, but I’ll take that problem any day.

Steve: Are you trying to hire with specific team roles in mind?

Judd: Mostly no. Mostly what I want to hire for is talented generalists and that’s because of what I think of as the virtue of that embedded model and because I don’t want to create – well I don’t want to create an environment in which there’s a bunch of people who are really only either happy or well suited for one type of work because stuff changes. The product moves. The demands of research shift such that okay I need to move some researchers around to tackle this new product area and I don’t in general want to have researchers who I think I’m going to make dissatisfied or their skills aren’t going to be a good fit if they need to slot into a new area in 6 months or a year.

Steve: So you’re using the demand to figure out how big your team needs to be, but it’s not sort of slot filling, it’s capacity building.

Judd: Yeah. Overall capacity building. I think if we chase anything at all as a team we chase diversity. Like I want a team full of people with different methods, different perspectives, different backgrounds. We just hired this talented researcher named Andrew Sweeney who is going to be our first researcher in Portland. Portland is where a lot of our Customer Experience team is and so Customer Experience, those are the people who answer the phones. If you have a problem before or during your trip that’s what they’re for and they have a whole suite of internal tools that they use to handle these problems, to track information about cases that are open. Those tools are not as good as they could be and we have never done systematic research on those tools. So in comes Andy Sweeney who has – Andrew has a background working on complex, information rich tools, right. So his approach to the problem is different than the one that I would take. He goes to system usability scale for example which is not the first place that I go, but that’s brilliant for this situation because we’re talking about understanding information flows in a complex information environment. Efficiency is important because we want to enable Customer Experience agents to be going as fast as they can and not blocked by the fact that they have the information tool, like a CRM, and three spreadsheets and a notepad and their workflow is just broken. So chasing that diversity as a team is something that I think goes to the earlier point about raising everybody’s game where everybody is helping each other out. I want to learn about system usability from Andrew.
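
For readers unfamiliar with the instrument Judd mentions, the System Usability Scale is a standard ten-item questionnaire with a well-known scoring formula; the short Python sketch below shows that standard formula, with made-up responses for a hypothetical internal-tools study (nothing here is specific to Airbnb’s research).

```python
# Standard System Usability Scale (SUS) scoring: ten items on a 1-5 Likert scale.
# Odd-numbered (positively worded) items contribute (response - 1); even-numbered
# (negatively worded) items contribute (5 - response); the sum is multiplied by 2.5
# to give a 0-100 score. The sample responses are invented.
from typing import Sequence


def sus_score(responses: Sequence[int]) -> float:
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 1 else (5 - r) for i, r in enumerate(responses, start=1))
    return total * 2.5


# One hypothetical participant rating an internal CX tool:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```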

Steve: And so Andrew’s – he’s tackling a new problem space. That’s not hosts or guests? That’s the new organization and being able to deliver a good experience.

Judd: That’s right and I think he – so in a way we’re taking on a new sort of constituency which – as a research team – which is our internal CX agents who we call crewbies. So we have hosts and guests and crewbies.

Steve: Say the word again?

Judd: Yeah, crewbie. People who work in CX, manning the phones and doing Chat and email. They call themselves crewbies. But I mean they’re the most talented, empathetic people on the planet. Like we – many people who work here do these shadowing sessions where we’ll sit with crewbies and sort of just observe their day to day and we do that to gain empathy for the problems that guests and hosts have during and before trips and after trips. But these are some of the most patient, amazing people on the planet and we’re, you know, in the process of working towards better tools for them.

Steve: Okay. So that sounds like a marked evolution in what – in how the work that your team is doing is impacting the company.

Judd: Yes, it’s a more internal focus than we’ve had before. I think it’s fundamentally the same sorts of things which is like do rigorous research that is deeply understanding the problems and needs that people have and understanding potential solutions, but it’s in the context of making our business more efficient and providing better service.

Steve: So will Andy have access to the designers and the engineers that are creating those tools for the crewbies?

Judd: Oh yeah, there’s an entire product team that’s spun up in Portland and in San Francisco just to work on this.

Steve: So once you have a product team then that’s – in your embedded model – that’s when the request for the researcher comes?

Judd: That’s exactly it. And I think I look at a lot of things that way. So I could have hired Andrew 6 months ago when there was no product team, but the reason I didn’t is because he would have had nowhere to land. I think that’s setting a researcher up for failure. You know I want to know – you know a researcher can do the best work in the world, but if there’s no landing zone for that research then they’re not going to be able to have a lot of impact most of the time.

Steve: So at the risk of re-asking a question I brought up earlier, we were talking about slot filling or not, is Andrew someone where you saw his particular – what made him a diverse candidate? Can an individual be diverse? I don’t know. What was unique about him that contributed to diversity? It sounds like the thing that he was brought in to do is perfect. For the company, perfect for him. Six months ago were you saying that the right project for him wasn’t there? We talked before about capacity building versus slot filling, but I’m wondering is there sort of a nuance to that that goes with the diversity aspect?

Judd: Yeah, I mean – I guess I would be lying if I said that there was never an appropriate time to look for a specific skill set and that it was always just general capacity building, looking for great researchers. This is a case where absolutely I thought I’m sure we can find an incredibly talented researcher who has experience working with information rich internal tools and making them work better and understanding complex workflows and efficiency and all this stuff. I’m sure we can find that person and lo and behold we did. At the same time I think, you know Andrew has done a bunch of things in his career, not to emb- I’m sure he’s going to be embarrassed about this and – and he just started on Monday by the way, but – he’s done a lot of things in his career and he would be perfectly capable of filling almost any other slot. He just happens to be a great fit for this role.

Steve: Let’s go back in time. As you said you’ve been here “x” amount of time, but you must have some sense of the history. You were #10 you said.

Judd: Yeah.

Steve: Do you know the story of #1? Like how did research become a thing that was hired for and done?

Judd: Well research is in the lore of Airbnb. One of the stories that gets repeated internally is about one of Airbnb’s early investors who at a very early point in the company turned to Joe and Brian (Brian Chesky and Joe Gebbia, two of our founders) and he famously said go to your users. And so they spent several weeks in New York City with some of the first hosts on the platform, deeply understanding what they were going through and what their experiences were like. And Joe and Brian are designers by training. They were at RISD and I think that having designer founders – having you know two of our founders be designers has sort of infused this kind of design and research sensibility into the whole thing. And that’s been beautiful for us as a team because I don’t – let’s say to the degree anyone cares about buy-in from the top, which I think realistically every team needs to care about, relatively speaking we do not have to worry about that. We have people like – we have a whole bunch of leaders saying where’s the research. Let’s look at the research. Go to your users. So it’s really embedded into the lore of the company. I wish I could be more specific about the history of this team in particular, but honestly I can’t.

Steve: So with that caveat I’m going to ask another specific question which you can then throw back and say you can’t answer. Especially for this podcast, I think an implicit topic – I think it’s implicit – is this idea of research leadership. And the trajectory often in organizations is a researcher is brought in. Sometimes they’re a junior person and they kind of work for a UXer design person, depending on the industry. And then some organizations there’s a point at which it’s acknowledged that this is a specific thing and it needs to be led, not by someone who’s a designer, but by someone who that’s what they do. That’s my sort of general sense of the pattern and I wonder, do you know when did that happen at Airbnb? When was research a leadership – when was there leadership in research?

Judd: Yes I do.

Steve: See! You said you couldn’t answer it.

Judd: I can answer a few questions. So in the beginning of the team it was just one or two researchers and obviously the design team was much smaller and it was just one big team. Everybody reported to Joe Gebbia. And over time what happened is the design team grew. I think the research team, which was at that time called Insights, was not growing at the same rate, and partly because I think of a – the fact that it’s hard for somebody who’s not a researcher to know how really best to leverage and grow that organization. But the partnership was there. They were doing great work and then at a certain point Alex Schleifer showed up. Alex is our VP of Design. He’s been at Airbnb for about a year and a half and he took over leadership of the organization and almost immediately recognized the need for research leadership. And it took him awhile – it took me awhile to meet him and I think to his credit he gives me a lot of freedom and says like Judd I want you to build a world class organization and I want to support you in that, what do you need? And he gives me a huge amount of context and feedback, but he’s set me up with his team such that my peer now is Katie Dill who runs sort of the organization of designers. Adrian Cleave who runs our design operations organization – but we are Alex’s first team and he treats us like peers and so there is no sense that even though I’m in the design organization I don’t have the sense of – that some – of research being – the sole value of research being to feed design for example. I feel genuinely like we are equal partners in a product process and I give Joe a lot of credit and Alex a lot of credit for recognizing that that was a useful and valuable thing to do with research from a pretty early time.

Steve: So the position that you came in to take was a new position?

Judd: It was a new position which was to lead the research organization. Yeah. And I was and have been given like a fair amount of leeway to say how should we do this. And that feels great. I don’t have the answers. You know when he said that I was like great I don’t know how to do that, but I’d love to work with you and figure it out.

Steve: That seems really key and I don’t know if that’s just part of the overall culture or something that is unique to you, but – and I’ve had this conversation with people a number of times about knowing the solution versus knowing how to get to the solution and the comfort with the ambiguity. You’ve mentioned a number of times in this conversation that we don’t have this figured out. We don’t have the best solution but we’re trying this.

Judd: Yeah. I think anybody – you know it’s interesting because we were recently having some conversations about the structure of product teams in general and how hard that is and going why hasn’t this been figured out by now? Shouldn’t there be one canonical way to set up a product team? No. There is no one canonical way because every business is different, every product is different. Values are different and that implies different things for structure. Well I think it’s the same thing for research, for design, for product overall. I feel everything that I do is to make good principled decisions in response to the realities of the situation at Airbnb. To try to build a research team which is uniquely responsive to Airbnb and at the same time embody some qualities that we really care about – world class rigorous research, perfectly positioned researchers every time. Those are kind of our two mantras. Center of excellence, perfectly positioned. And those are sort of – there’s no playbook for that. I think we apply those principles. We apply them systematically and we communicate a lot. And I don’t – it sounds stupid when I say it out loud, but I don’t know of another way. I think I’ve been really, really hard with my team and they are embracing of the idea that feedback is the most important thing we do. Feedback for each other, feedback for me, feedback for me with them – because I say to them I don’t know of another way to do things better than to give and get feedback. It should flow like a river. And I think that can be hard, to be open, to focus on strengths rather than weaknesses. To be humble is a thing we’re always seeking to be better at, but that’s how I approach the task of building a team. I welcome any and all feedback. I don’t pretend to have all the answers. I just think I have a principled way to get there.

Steve: I’m going to ask sort of a clichéd research question but I think it follows from what you’re saying. In a purely speculative way, like if we were to be talking in five years let’s say, what do you think – what could you imagine the state of research at Airbnb being?

Judd: Five years is a long time. Well, hopefully we will have scaled to the point where we have what we would consider full research coverage. To me that means being sort of lean and agile, but having a researcher represented kind of at every level, from the ground level up to the leadership level and we’re definitely not there yet. It takes time to build that capacity. You want to build it from within too so we’re growing that capacity at the same time. I also think in 5 years we’re going to be a company that has already hopefully long since already embraced the idea of like a really global/local product so that we have product teams that are staffed all over the world and I have researchers embedded in those teams and they are connected back to San Francisco and they are providing this very difficult thing which is how do you create a product which is sort of you know 80 or 90% core, but that 5% or 10% on either edge is the real local bit where the rubber meets the road. So how do we create a uniquely Singapore product or a Germany or Brazil product where we represent belonging in a way that resonates culturally where we have tweaked our onboarding process to highlight different value propositions. Where we do special things that facilitate trust between guests and hosts that are unique to an Indian market for example. And I think that’s going to take an organization which is global. Like I don’t think that sending researchers from here out to those places is going to cover that. So that I hope sooner than 5 years will be true. And then the last thing I’d say is that I hope that by that time we will have created not just a product organization, but an entire company which is totally driven by research and insights in which – you know I don’t think that we need to do all the work, but whether we’re talking about marketing or legal or policy or local operations or customer experience there are research questions in all of those places and I want us to be able to have an organization where somebody on my team is the glue to every one of those teams.

Steve: So taking embedded beyond product?

Judd: For sure. Because I mean – for example name a – it’s a little bit stupid to me that most organizations have a product research and a market research organization. Name a product question which doesn’t have an influence on marketing and advertising and vice versa. I’m sure somebody could snark and think of an example, but that would be the exception that proves the rule, you know. I’m not saying – there’s no need to have one big kumbaya team, but there is a huge amount of value to saying like look fundamentally we’re just researchers applying our expertise in different disciplines and in different ways, let’s be more together than not. And in my experience most organizations are not like that. So I don’t know. That is a big aspirational goal for this company.

Steve: I love it. Just so well put. So that’s kind of the looking ahead. It’s a lovely vision. I want to go back. You mentioned a few things about yourself – PhD and some other places that you’ve worked. Maybe you could just talk about – I’m just curious about sort of parts of your background, whether it’s professional, personal.

Judd: Yeah.

Steve: Go back and talk about some of the things – what are some of the things that are in your background, experiences or education or whatever, that are really present for you now in the way you’re looking at the world and the kinds of things you’re trying to make happen?

Judd: Yeah, I mean, I guess if I – that’s a really – you’re forcing me to be introspective. Well I began – as an undergraduate my major was cultural anthropology and so I began my career as an anthropologist. You know very focused on meaning and understanding and writing culture and the idea of culture and what it meant. And I feel like that’s really important to me now, especially at a company that’s focused on travel. It gives me a lot of empathy. You know I feel like I studied a lot of kind of epistemology that gave me like a fundamentally subjectivist approach to research which I think makes it valuable – makes me a better multi-method researcher and leader in the sense that I think objectivity is a myth. That I think everything has a cultural and social lens and all we’re doing is like seeking confidence and that confidence is built through multi-method research, through looking at the same problems from multiple directions and perspectives. So cool, if you do that there’s no territoriality. We just need all the methods. They’re all flawed and they’re all powerful. So I think that was influential for me.

My PhD was in social psychology and information systems so I took a real jag into experimental social psychology and data science. That I think informs me because I, you know, have had now very deep experience with both the most ethnographic qualitative work there is, you know in which I spent six months at an after school arts and media program in the Bayview learning about informal learning outside the classroom. And I’ve done a huge amount of experimental social psychology and data science and I can appreciate – you know I can hold my own with basic statistics and write R and Python and PHP and talk to an engineer in code and stuff like that. So I think having had that experience is deeply influential for me.

And the last thing I would say is that I think – so after I graduated from undergraduate I went to culinary school. So I spent six months getting a degree at the French Culinary Institute in Manhattan and then I worked as a chef for a while. I actually – I was not destined to work as a chef. That was an extremely difficult life and I found that out and was grateful to have kind of something to go back to which was anthropology, and graduate school. But having been a chef, having been somebody who worked with my hands and really embraced that kind of creativity and appreciated flavor and the craft of perfect taillage, like knife skills – I think that helps me have like a little bit of a window into a design world. I don’t think that I have a lot of a window compared to the amazing people I work with on the design team, but I think it gives me an appreciation for craft, for art, for the details, for the menial but beautiful handwork that makes great food for example. I think I can apply that to research and to design.

Steve: That’s awesome. You know we talk about researchers as kind of translators and even this – the diagram you drew of somebody elbowing somebody they’re sitting next to. It’s kind of like a translation thing and it seems like the experiences that you’ve had have given you a lot of vocabularies to translate between and maybe even the translation was sort of an ingredient in those.

Judd: I agree with that. I think – you know recently I gave a talk – I’m not sure I should say this, but I’ll say it anyway. Recently I gave a talk at a learning lunch here at Airbnb, mostly to a data science crowd, and the talk – this is also a talk that I gave at a design conference in Philadelphia. And it was basically a social psychology talk. It was about the ways that we are all biased. It was about confirmation bias and minimal group bias. It was about post hoc bias. And like the point I was making is research is human and research requires human – humans. Every type of research does and humans are flawed. We cannot avoid our biases. The best we can hope to do is counteract them. By hanging out with each other. By trying multiple things. And that talk got me into a little bit of trouble only because I think it seemed like what I was doing was being unduly negative about different methods, and I was. It was kind of one of those talks where you crap on everything and that was my point. It was like okay everybody spends time extolling the virtues of their particular method and that’s great. They’re all strong, but they are all weak and we are weak. And so I think part of the translation for me is to be able to speak to everybody and go okay I get the strengths of your method and I get its weaknesses and this one too and this one too. So let’s just get past that and work together.

Steve: And that’s being human.

Judd: How can you avoid being human?

Steve: Yeah. Is there anything we didn’t talk about in this conversation we should cover?

Judd: I guess one of the things that is on my mind a lot, because I’m growing a research team, is the idea of responsible growth for a research team. And what does that mean? I’ve been thinking about it because we are growing and I want to make sure that we do it in the right way. I’ve experienced both types of growth, responsible and irresponsible. And the way I think growth gets irresponsible is not even really in a headcount way or a budget way. It’s probably more like a communication and culture way. So the things that I want to promote in the context of responsibly growing this team at Airbnb are radical transparency and communication. The worst outcome in the world is if new researchers look at old researchers look at managers look at me and go I have no idea what these people are thinking. Or I don’t know how to plug into this decision making process and I don’t feel like I have a say. How did this get to be this way? I don’t think it’s right but I don’t have a voice. Okay, terrible, terrible. Need to avoid that at all costs. So the way we do that is by being fundamentally transparent and collaborative. Like everybody from me on down, we talk about what we’re thinking and why we make the decisions we make. We open them to feedback and maybe we form a committee in which everybody has a voice. Okay, committees, that sounds kind of bureaucratic, but what I mean is like involve everybody from the intern to the senior manager in doing something like how should we build a skillsharing system? Cool, we can all be involved in that.

And then the other thing I think is responsible growth is making sure that everybody has a path forward in their career because other than feeling like you’re stuck in a rut, the other thing that I think is a recipe for a researcher being dissatisfied is feeling like they have nowhere to grow. And so making sure that everybody feels like they have a path is really important to me as the team grows. That path is building out your unique niche, working on your core skills and expanding them, taking on more stakeholders, taking on more senior stakeholders. And you do not have to become a people manager to be a lead. That is another thing I feel like is really important because people management is a set of skills that is unique and crazy difficult and learned. And you choose to focus on them. Not everybody wants to do that. It is absolutely not the case in this organization that the only path forward toward seniority as a researcher is through people management. It can be through research leadership, through product leadership, not just people leadership. And so as we grow I think the responsible bit is making sure that all of those paths for growth are open to everybody and everybody knows about them.

I spend a lot of time thinking about that because it’s scary to think that we were 10 when I started, we’re 17 now. I don’t know, we’ll be 25 or 30 in the next year or year and a half. That’s a lot of growth and everybody thinks the next person in the door is going to be the one that changes the culture, but I don’t think that. But I think we should be deliberate. It should happen on purpose not by accident. That kind of growth, the planning around it. So that’s what I seek by doing it this way.

Steve: That’s fabulous. Do you have any questions for me?

Judd: Why are you doing this podcast?

Steve: Why am I doing this podcast? You know it’s self-serving I think. I’ve been around long enough to remember when the best work was being done by vendors. Let’s just say that. And I still am one. I hate the word vendor…

Judd: The “v” word.

Steve: …we all know what that means. You know people who work outside organizations. Teams like – companies like this didn’t exist. Teams like this didn’t exist. Leadership roles like yours didn’t exist. It’s a big change that’s happened in the last few years where now – I mean the kinds of vision for what research can be and how it can impact. You know it has to be done inside. It’s not to say that – I’m not saying that my work doesn’t have value, it’s just different. If you work with organizations the context has shifted. There are people inside organizations that have roles and titles and responsibilities that didn’t exist before. So my professional life has changed. So it’s self-serving because this is a chance for me to learn about this shift. It’s fun to be able to do that. It gives me an excuse to have conversations with you and learn things I wouldn’t otherwise learn. And you know I think much of the best work – whatever the percentage is – amazing things are happening inside organizations. So that’s the place to kind of look and learn. I’m not – I like working outside organizations so when you’re a consultant or a vendor you journey from place to place, like in The Incredible Hulk. You know you have these adventures and then you have to leave at the end which is an obscure reference for people that didn’t watch the T.V. show in the 80s or 70s or whenever that was.

Judd: I didn’t get it.

Steve: Okay.

Judd: I liked it though. It’s a good reminder of how young this field really is.

Steve: Especially in the form that it’s in. The conversations we’re having about change in design, insourcing and acquisitions and so on. I feel like research follows design and kind of we’re trailing by a couple of years. But we’ve seen research firms get bought too, not just design firms, you know in our recent history. We’ve seen that happen. So I’m curious.

Judd: Just as there is no one canonical way to build a research organization, there’s also no information out there about how to do that. And so to have a resource where you can learn from people who are trying to build those organizations is really valuable to the UX community, so thanks for doing it.

Steve: Alright. Well it was great to speak with you. Thanks so much for being a guest and for being my host here today. I’m throwing those words out in a really confusing way now.

Judd: Nicely done. Thank you, Steve. It was a pleasure.

Jan 19 2016

1hr 4mins


Rank #2: 14. Monal Chokshi of Lyft


In the final episode of the season I speak with Monal Chokshi, Head of User Experience Research at Lyft. We discuss traditional paths to a user research career, creating routines for meeting different types of users, and the emergence of leadership roles in user research.

As researchers we love having curiosity and it’s fulfilling to do the hands-on work. That’s what makes us passionate about research, but in a business one of the dangers is getting too far into that fascination, but then not taking that next step. Maybe that’s where managers can help with making sure that the data and the analysis and then these insights and actionable recommendations are affecting product. We’re not just doing the research and being fascinated for our own curiosity and fulfillment, we’re making sure it’s put to good use. – Monal Chokshi

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Thanks, Monal, for being a guest on Dollars to Donuts.

Monal Chokshi: Yeah, sure. Well first Steve thanks for having me on your podcast episode. Really excited to talk more about user experience research at Lyft. So my name is Monal Chokshi and I’m the head of user experience research at Lyft.

Steve: And if I didn’t know what Lyft was, what would you explain to me?

Monal: Yeah, so Lyft is an on demand transportation service. So for those not familiar, basically all you have to do is download the Lyft app, open it and tap a button and basically you have a driver ready to drive you wherever you want within minutes at your doorstep.

Steve: Where does Lyft operate?

Monal: So the headquarters are in San Francisco. We also have offices in Seattle, Nashville and New York.

Steve: And where do you have drivers?

Monal: Drivers all throughout the country. I believe we are now in over 190 cities.

Steve: This is an American operation?

Monal: Yes.

Steve: So drivers in the United States right now. Are there international goals for the company?

Monal: Well, in the last 6 to 9 months we announced some partnerships with international ride sharing companies. For example with Didi, which is the largest ride sharing company, similar to Lyft, in China. And Grab, which is in Southeast Asia. I believe they’re based in Singapore. So the idea being that travelers from the U.S. would be able to travel to one of these places, open the Lyft app and call a car. It wouldn’t be a Lyft car. It would be one of these folks from our partner companies coming to pick them up. So we haven’t launched that yet, but that is a partnership that we’ve announced and we’re really looking forward to getting that going.

Steve: It makes me think of like code sharing and airlines where you can purchase through the sort of front end that you’ve always used, but the service is delivered through a partnership with somebody else. Is that a, is that a – does that comparison apply to something like this?

Monal: I think it might. I think it’s a pretty good way to think about it from an analogy.

Steve: I’m sure there’s probably better analogies for how products and services in one market end up in another, but that’s the best I could think of on the fly. That’s really interesting.

Monal: Yeah.

Steve: And I’m sure there’s going to be design challenges – what does that experience look like in that different situation.

Monal: Yeah, we’re really excited about designing this new experience just because of the various challenges that come with different cultures and different transportation schemes for various cities, especially international. As we all know the cultures, the modes of transportation are different from city to city in the US and then overseas there are various challenges with language as well. So there’s a lot of really exciting things to think about.

Steve: I love like the really specific use case you’re looking at now which is I’m from the United States, I use Lyft here. I’m in a different context so how can I bring access to that service with me. It makes me think about what people that use Netflix complain about, or I think I hear this with the iTunes store. For example, you can’t get the same access when you’re in a different place. So this gives you – I’m sort of imagining it gives you – you can sort of bring your Lyft travel experience with you into these other environments.

Monal: Yeah. And people use and are comfortable with Lyft here. We want to be able to make sure that they have the comfort of the same kind of service over there, something that they can trust and feel good about. Something that’s familiar in an area where everything seems unfamiliar to them.

Steve: Can you talk about the work that you do here and what is research like at Lyft?

Monal: Sure. So research at Lyft has always been part of our core DNA in a sense. The founders, Logan and John, are really focused on treating people better and that includes all of our users, our drivers, our passengers and so really understanding who these people are and figuring out how to create the best user experience for them has always been top of mind here. Before I joined we did do various research activities and there were several initiatives happening at Lyft. Once I joined, I joined to sort of make that part of our regular product and design processes and so the kinds of research that our team does really runs the whole gamut of what UX research can do. We do everything from field studies and surveys to user and usability testing of new designs and products. So yeah, it’s been a really fun place to work, just also because user experience is so well regarded and supported here.

Steve: When did you come to Lyft?

Monal: I’ve been at Lyft for about a year.

Steve: So you’re describing a little bit about how – you know what you were brought in to do, to sort of – I don’t think you used the word formalize, but that’s sort of the word that comes to mind. I guess is the question about – you know what that change has been that you’ve helped drive.

Monal: Sure. I think a lot of the research previous to when I joined hadn’t been formalized in a sense that now it’s much more a part of the regular, daily, weekly process. And I’ve also joined to basically build a world-class team of user researchers so that – you know Lyft has grown so much in the last year and we are continually growing at an increasingly rapid rate. So there are a lot more needs that we have from a research perspective and I think as we grow the company there are more needs for insights. So, bringing that to the table and helping build a company that’s very data and insights driven is a priority for the company.

Steve: I’ve seen sort of two different modes of how insights – I’m sure there’s many more, but two kind of primary modes in terms of how insights can impact decisions. One is sort of the reactive and one is sort of the proactive. And you know where reactive is somebody comes to researchers with a question and says ‘Hey, can you help us make a decision about this?’ and the other is researchers often who have done research are evangelizing those insights to try to bring them out to find the people that can use them to make decisions. I don’t know if you even buy into that kind of framework, but I’m wondering a) does that ring true and b) if so like how does that map to what you’ve seen in your work?

Monal: Yeah, sure. I think we do a mix of both. When I first joined I think it was more the former in the sense that people have specific questions and we’re always aiming to try and answer business questions using various methodologies, whatever is going to work best to get those questions answered. But there are definitely times when we actually see something that is interesting and we’ll probe further on it in order to say actually there’s an opportunity here, or actually there’s some pain points or something that we need to look at further here and we’ll bring that to the attention of stakeholders in the company for further investigation.

Steve: So it sounds like part of that evolution, the growing of the company and the growing of the role of what you and your team are doing is – I mean developing the ability and developing the context to have those conversations. Yes we can answer the questions you’re asking and hey everybody else here’s things that we’re already doing or can do to support you.

Monal: Um-hmm, definitely.

Steve: So maybe that’s – maybe what I sort of asked inarticulately a moment ago, maybe that is the vector of a growing research practice is that you shift from reactive to more kind of reaching out.

Steve: Okay. So we’ve talked a little bit about sort of some of what you’ve been working on in the last year, but maybe we can just go back, back, back and maybe talk about how did you end up being a researcher? What are some of the things that you’ve done that have led to today?

Monal: Yeah, definitely. It’s funny, I often joke that my background and my entrance to becoming a UX researcher is very non-traditional just because it is in fact on paper very traditional. I’ve been focused on this area since the beginning of my career, even back in my days at Stanford as an undergraduate. So I think that’s pretty rare for folks in our area. So yeah, when I was growing up I always had a fascination for computers and technology and played a lot of video games as a kid. Started programming on my own for fun when I was in middle school, on my Commodore 64 – dates me a bit. And when I went to school in the mid-90s I was lucky enough to be at Stanford where a lot of stuff with the Internet was just coming, seeing light, and getting out into the world. So very lucky to have been there and found my major which is called symbolic systems. So symbolic systems is an interdisciplinary major made up of core classes from computer science, philosophy, psychology and linguistics. And I graduated with a concentration in human computer interaction. So a lot of what I do today is rooted all the way back from that time and one of my first – I guess the – my start of my career, I’d say I did – I was more of a – what we would call today a UX generalist. So back in the dotcom times, as you remember – sorry to call you out…

Steve: Hey, I had a VIC-20. I had an older Commodore computer than yours. I’ll own it.

Monal: Cool. So as a UX generalist – I’d maybe call someone like this today – I did everything from front end coding and UI design to information architecture and user research. And back then, I mean I was really seeking more objective ways as a designer to know how do I go about my design. And unfortunately I think most user research at that point was limited to usability testing. So while I had done a summer internship while I was in school at Boeing in a usability lab, I grew more interested in various kinds of methods of research. Like what else can we do, how else can we learn about users? And I was very lucky to have had a mentor at this company called Trilogy – it was my first company that I worked for – someone named John Morkes who now has a consulting company in Austin, Texas. He’s had it for many years now and back in the late 90s he actually worked with Jakob Nielsen at Sun. So he was probably the best mentor I could ask for and we together built a usability lab at Trilogy and he mentored me while I learned more about doing things like contextual inquiries, focus groups, all kinds of methodologies and that’s what got me really excited about focusing my career in this area. And I had some really big successes from doing multiple method research in terms of helping the company understand what was actually happening with users, even at that time and this was back in like 1999. So I saw the power and the value of what user research could provide. From there on out it got a little more challenging as the fall of the dotcom era occurred and at that time user experience research or usability they called it – I guess UX was sort of coming to bear around that time, the term. It was much harder to find a job doing this because at that time it was seen as somewhat of a luxury kind of a position. Companies didn’t necessarily see the value in having it beyond something that if they could afford to do it, great. It wasn’t table stakes. So I focused on continuing with design, but I kept coming back to research in my mind, this is what I want to do. So I eventually went back to grad school to focus on understanding more different kinds of methodologies, how can I apply this? How can I become a UX researcher as a career? So I went to UCSD and studied cognitive science there and eventually started working as a UX researcher and I have been ever since. So prior to Lyft I was at SoundCloud in Berlin for 2 years, starting a research team there. And prior to that Intuit and Yahoo for several years. And you know as I mentioned around the dotcom time I worked in various startups.

Steve: You prefaced this whole story with – that your story, it’s a traditional path which makes it sort of – as you said, that’s the exception.

Monal: Exactly.

Steve: So I was listening as you were talking and think well what is the traditional path. So I don’t know. I’ll just throw it out as a question. What about your story and the path you’ve gone through, what makes it traditional?

Monal: Traditional on paper.

Steve: Yes, I’m sorry, yes. That’s an important disclaimer.

Monal: Right, right. Well it’s because I actually, you know back in 1994 I took a class at Stanford called introduction to Human-Computer Interaction with Terry Winograd where we were actually designing and looking at user interfaces for software. So this to me is something that most people, maybe going back that far, didn’t necessarily look at at that time. And even when I talk to folks today, you know 20+ years later, folks are still like wow, what you do is really cool. And I say yeah I know. They’re like how do I do that? How do I get into this industry? I want to do that, but I don’t have the background. I didn’t go to school for it. Should I go to school for it? Should I take classes? How do I break in? And I think there are a lot of folks who – most folks I know who are researchers today, even designers, people in the UX field, they’ve come from all different paths and I think that’s part of what makes it a really interesting set of folks because we are so diverse and we have lots of different paths that we come from, but all of ‘em include doing something with people, trying to understand people and enjoying that.

Steve: So what’s your answer to those people that ask you how do I get into this? What would you tell them?

Monal: Well I think it depends on the circumstance. My advice – I mean, and it depends on what kind of career they’re looking for, but there are more classes today which I encourage people to take. There’s lots of stuff online and I mean obviously the best thing would be to just get the experience, but I think it can be very difficult to do that cold. I know there have been folks who have just gone out and done research on their own and said hey, look, I’ve done this about your company. I’ve used your app. I tested it. Sent it over to the company and sometimes with really great success. I’ve even had some people send me stuff like that about Lyft. So it’s really interesting to see how folks are basically trying to break into the industry. But I mean definitely a traditional on paper route is very helpful, but I don’t think it’s necessary.

Steve: So in your traditional on paper route – it sounds like I’m talking about a paper route – I mean, you know the – it seems like there was this culmination for you with UC San Diego. It seems like that’s where you said oh I’m going to go become one of these things and that’s the way that you went about doing it. And of course you had all this background, going all the way back to Terry Winograd’s class, but like a graduate degree in – what was it, cognitive science?

Monal: Um-hmm.

Steve: See, I don’t know. My perception, that seems not traditional on paper to me. That sort of seems like more in the HCI realm and less in the – although having said that I’m not sure is it a social science degree or a design degree. Like what’s my perception of the traditional on paper thing? So the fact that you and I don’t even share definitions of traditional on paper even though – we have – I’m a little older than you but we have – we’ve seen some of the same eras of this work. I don’t know I think it only goes to bolster your really important point that the diversity of backgrounds and disciplinary orientations and all that that make up the work is just part in parcel of what this practice is about.

Monal: Um-hmm, definitely. I mean I chose to go to UCSD because – well first it was on my radar because of Don Norman and you know the Design of Everyday Things, previously called the Psychology of Everyday Things, really resonated with me and that was sort of my bible for – for you know when it first came out. And then there were multiple labs there that were focused on using ethnographic techniques in order to design technology for everyday use and so that was one of the things that really resonated with me in terms of going there. It wasn’t so much the cognitive science academic courses. It was more about the folks there and the opportunities to do research there where people were thinking very similarly in terms of you know how to do research and it was very applied in that sense. So yeah, so I – even though it does sound somewhat orthogonal to UX it was actually very close to it.

Steve: Especially once you explain it. Yeah, seeing an academic use of ethnography kind of in terms of using technology, that seems very – that’s still cutting edge for the work world and it was extremely cutting edge for academia even a few years back.

Monal: Yeah and back then I think in the early 2000s – I mean when I graduated from Stanford I always wanted to go back to school ‘cuz I’m always like wanting to learn and a student at heart and I wanted to work first. And so I already had on my mind getting a graduate degree and craving the desire to go back to academia at some point. I just wanted a break and some real practical experience, but that said it wasn’t that I hadn’t tried becoming a researcher. Back then you really needed an advanced degree to get into that field because I had tried. And part of it was the fall of the dotcom era. So it was really difficult to have those opportunities, but the companies that were still looking for folks like that, like some of the bigger folks – I mean I don’t even think Google – Google definitely didn’t have folks then looking at that. But maybe Yahoo or eBay. There you definitely had to have more of an academic background to enter.

Steve: Right, and such a contrast to now where, as you said, there’s a lot of classes out there – General Assembly springs to mind, but I think there’s probably dozens and dozens of other examples, but they seem to be maybe the most common brand that I come across – where like a UX Intensive from General Assembly, that’s sort of the symbol of qualification where an advanced degree was something. So that’s – if you just compare and contrast what it takes to get an advanced degree vs. what these kinds of programs are providing, sort of the coin of the realm has really shifted. I don’t know, part of me thinks the population of researchers has just exploded between the time that you’re talking about and now. I’m not sure what the – do we know what the cause is and what’s the effect? Or what has led to what?

Monal: Yeah, it’s a good question and actually before I left for Berlin in 2012 I really feel like there’s been a shift. I mean – when I finished my degree and started working again I thought wow I’m always going to have to work at a big company because I still had this mindset of what I explained back in the dotcom days where this was a luxury. You’re the last to come into the company and you’re the first out. But that’s no longer the case and I’ve thought about this a bit and I sometimes think that maybe, you know back then it was all about the technology – wow, this is really tough to build. There’s a lot that needs to go into this. Just getting that done was a really big accomplishment. And I think now – not to say that building code is easy, or writing code, but building an app it doesn’t have to be that difficult. And so it’s become really important in the marketplace to have a great experience. It’s not just about having the functionality now, it’s the form as well. It’s a lot of times a differentiator as to why people choose your product vs. another. And I think that’s part of the reason why people are jumping on the bandwagon of oh yeah, actually this is really impactful. We can learn how to get our product to be successful in the marketplace.

Steve: And that thinking about the importance of the experience, I agree with you totally and it’s – we keep getting it demonstrated to us over and over again – I mean as consumers I guess we see this. It’s really hard to do that. And you know we have lots of good examples of a good experience and it’s just amazing to see how it’s impossible to sort of replicate. I think if you look at sort of the battles between mobile phones over the last 8 years or something, like it’s hard to – just because someone does one that has a good experience you just see these terrible things come out of other companies. And if we were two people wearing – you know with design in our title, we might talk about that issue in a certain way. I wonder if you have a perspective, wearing a research hat, like why do you – what’s the conversation a researcher should be having about why it’s hard to – we know that a good experience is important, but the creation of that experience is like amazingly elusive.

Monal: Yeah, I don’t know if it’s necessarily something that researchers can always pinpoint too. I think a culmination of factors have to come together. Various things and some of them are just very emotive and users can’t describe them. Researchers might be able to observe it happening, but I think it takes a really good researcher to be able to dig in and understand all those things that are happening in this person’s subconscious so to speak.

Steve: Yeah. That makes me think of a comment you made early on where you used the phrase ‘world-class research team’. Can you define what that is?

Monal: Well I’ll tell you a little bit about what I look for in a researcher. So there’s three main traits that I sort of boil down to what I think are most important for researchers – user researchers to have. First people need to be empathetic, so have user empathy – and part of that is being a really good listener. So being a good listener, being a very observant person, also being personable and approachable because any time you’re with a user or someone that you want to learn from and about they need to feel comfortable in your presence. And along with the user empathy piece and the definition of being empathetic is you need to be able to look at things from that person’s perspective. So if someone is really judgmental that’s going to detract from their ability to be a researcher just because you have to be able to consider their point of view.

The second thing is analytical thinking. So obviously a strong one. One of the things that I think can be the most impactful for the quality of research is being able not just to collect all the data, which you might use the empathy skills to do in a sense, but to then take those insights – well take the data and the findings and turn them into insights and actionable recommendations that will drive good results for the business and for product design. So there’s a lot of other areas also where analytical thinking can come into play, but I think that’s what’s unique to this field and this profession.

And lastly, this one is probably overlooked by many, but I would say having really great project management skills and being organized because oftentimes as researchers we are managing so much when it comes to participants, sessions, multiple projects and you have to be organized and keep all of your data very organized. It’s just part of the territory in terms of having those skills to manage all of that. And oftentimes manage the teams and stakeholders that come along with that so that you can ensure that people are there to observe, to participate. You know there’s some administration work – administrative work going on here, but a lot of it revolves – your success as a researcher revolves around you being able to stay on top of it and be very detail oriented.

Steve: So those are things you look for, right. How do people exhibit those characteristics in a way that you can assess?

Monal: Um-hmm. I mean it’s definitely hard to do in a single interview. During the interview process I definitely like to be able to see them in action. We have folks do – typically do like some kind of homework during the process and I often like – like very many companies like to have them do a challenge in the office where we recreate, you know kind of a real life scenario, some role playing, and so we can kind of see how they tackle problems as well as interact with folks and understand how they may work with stakeholders. I mean teamwork, all these other things are also very important, but the three that I pointed out were specific to, I think, user experience researchers that are not necessarily applicable to other fields for specific reasons.

Steve: I like that your three kind of span a research project, for lack of a better noun. I mean you’re not looking at people’s data gathering abilities. Or not just that, but what you do with the data and then how do you help it – how do you help that result succeed in the environment in which you’re working. So sort of managing the logistics of your stakeholders and their participation and kind of their engagement and so on. I like your list quite a bit.

Monal: Thank you.

Steve: I think those three things make sense, but is there anything controversial there or anything missing or where you would dispute how other people approach this?

Monal: There’s definitely a lot missing from the list. I think these are just three things I keep in mind, but of course there’s plenty of other traits that are important. You know, as I mentioned, teamwork, being a fast learner, being adaptable to change. So many things that we probably look for on a holistic level. You know if you want somebody on your team you want to like them. There are lots of things that I think can be applied to many different roles when you join a company. I mean at Lyft we definitely get excited about folks who are also excited about our vision and the vision at Lyft you know is something that attracted me at first. Having lived in Berlin for 2 years, coming back to San Francisco I didn’t know much about Lyft, just because we didn’t have it over there and I’d heard a lot of friends and other folks talking about this company. So I finally decided to give it a try when I was on a visit here and thought it was so interesting, like the whole sharing economy and talking with my driver and I started thinking about Lyft as a company and realized I didn’t know much about them, let me check ‘em out and searched them online and started reading about their story and that’s part of what got me really excited because our founders were really interested – they weren’t just interested in building a better taxi. They were really interested in kind of the long term, big picture vision of creating better communities and reconnecting people through transportation. You know the long term vision is redesigning cities. It’s really cool. I mean this is such a hot space right now and we’re working on so many exciting projects as a company. One of the things I love is that we’re constantly executing. We’re getting things out the door. There’s always something big happening, but the vision is always what keeps us focused on what it is we’re trying to do. And that vision is something that I really can get behind because it makes my job feel a lot more meaningful. I mean I think as UX researchers, you know having empathy is something that we just innately have within us and we desire helping others and so when you look at the meta, for this company that’s what we’re aiming to do as well. So, you know we’re looking at reducing the carbon footprint, getting fewer cars on the road, putting more people in vehicles, making rides cheaper through ride sharing, Lyft Line, and building communities through that experience. I’ve had so many amazing conversations with other passengers or drivers in the car and I always just leave with a smile. So it’s really fun to work on something in the office, take a Lyft home, and actually talk to my driver about oh, wow, you have this issue, we’re actually fixing that right now. I can’t actually tell them that, but it’s really neat to know that we’re making a difference on a daily basis.

Steve: It’s fascinating to hear you talk about the relationship between, especially between you as a researcher, but with your researcher sort of identity on, between being a UX researcher and the vision of the company. So you talked a little bit earlier on about the different users. You kind of said, like you talked about passengers. You talked about people using the app which I guess might not be the same as the passenger. You talked about drivers. Tell me about, who do you study? What do you look at?

Monal: Well as you said, definitely passengers. People who are looking to get a ride, who are using Lyft to take rides. And then the drivers who give rides. So currently the Lyft app that we have today, the single app, serves both of those types of folks. And for people – most people probably don’t ever see the driver’s side, but once you apply to be a driver, are approved as a driver, there’s a whole nother world on the other side of the app. So we look at both those users and more recently we announced at the beginning of this year a partnership with a company called National MedTrans Network. So their users are a whole new set of users for us to investigate and learn about. I can tell you a little bit of a story about them. So they – National MedTrans Network is a company in New York. They work with insurance companies like Medicare and Medicaid whose patients are these elderly folks who basically need routine medical rides, rides to doctor’s appointments. A lot of times – all the time non-emergency rides. And they often don’t need to be transported in an ambulance or any special kind of vehicle. So National MedTrans has all these relationships with various car service companies, including taxis in New York City. One of the interesting things that came to bear was that whenever they have a taxi or car that doesn’t show up for a patient or cancels for one reason or another, which is pretty common, it takes – in New York City it takes them about 45 minutes to get that passenger, that patient, another ride through the taxi company. And so they started using Lyft and found that wow you can get a ride in just a matter of moments using Lyft and get this patient home so they can take their medication, so they can do whatever it is they need to do and not have to wait there on the street or whatever. And these folks have a lot of times more special needs then some of our normal types of passengers. So the great thing about this is in terms of building this new product research was highly involved. And so I went over to see National MedTrans, brought a designer with me and as anyone who’s done contextual inquiries know there’s so much more to learn when you’re in that environment and learning who the users are, what their daily tasks and processes are and the entire environment. And this company where we visited them was like people in cubes functioning as a call center. They’re scheduling rides. They’re working with taxi companies. They’re talking to patients, caretakers, people who are managing transportation needs for these patients and it was really, really insightful for us to be there and use the insights there to figure out how to build the best possible product for these users. It’s something totally different than what we have in terms of our native app offerings today.

Steve: And who will – within that use case are there multiple users there that have different experiences with Lyft?

Monal: Well the primary user in this case is the person at Nat MedTrans who is requesting a ride and managing the ride. So right now a lot of these elderly folks, they don’t even have smartphones. So it is a little bit of a juggle in the sense that Lyft wasn’t necessarily designed for this use case from the first and so we’re getting to the point where how can we build the best possible product. There’s the driver, there’s the passenger and now there’s this middle person who functions as if – kind of like the app does for you or me who might be taking a ride today.

Steve: It just speaks to a really, really (in capital letters for me) BIG idea. A lot of the technologies that we see are apps and there’s interesting – I don’t know let’s just call them democracy or democratic access I guess is probably a better word for this. If you don’t have the income or the technology, or sort of the know-how to use those things, you’re shut out of a whole bunch of ways to participate. And what excites me about what you’re talking about is that it’s a fascinating research question: how do we make non-mobile device users Lyft users? Those principles can apply to anybody but you guys are digging into that. It’s really fascinating.

Monal: It is actually really fascinating and just being there and thinking through these whole – you know this set of issues and this problem, it really brought me back to my roots where this feels like a true user experience problem to solve, you know. Lots of different players, various tasks and processes, how do we make this work?

Steve: I hear the sort of analytical aspect of it as you talk about it, but I also hear your sort of – your point one about empathy. Like there’s something in your voice which won’t come through in our transcript, but there’s something in your voice as you talk about it. It seems like this touched you or this struck you in a certain way.

Monal: Yeah, definitely. I mean the thing that I think struck both me and the designer when we were there was like wow we’re there in this call center, someone called in, a patient who had been waiting and they couldn’t get a ride and how quickly they were able to resolve the issue with just using Lyft. And it just blew our minds. We were like wow. It was just really cool. Like we made this person’s day so much better by just getting them home.

Steve: So that project sounds like a sort of a special one, maybe sort of outside. Or it’s a new area for you to be looking at beyond the traditional set of users that you’ve been thinking about. I don’t know, in the life cycle of doing the research, whether it’s getting your data or helping the teams make use of it, I don’t know are there any spots that you find particularly challenging that you’ve had to dig into?

Monal: Well another thing that actually drew me to Lyft was the fact that I think there are a lot of really interesting research challenges with our product. We’re really on the cutting edge with regard to mobile devices, being completely mobile, having to try and do research in a very constrained environment which is a car. And I think that there are a lot of methodological challenges that come with that. So for example if you think about it the best research comes from real life situations. You want to be able to observe – this is of course for live qualitative user research – what’s happening in the real world, or simulate it as closely as possible. And if you want to find out what’s happening in the ride experience, if you’re in the car you’re already biasing what’s happening there because you need to announce yourself as the researcher. Everyone knows you’re from Lyft. People are going to be a little bit more cautious about how they act. Let’s say take the researcher out of the car. Let’s just put a camera in there. Well then there’s legal constraints as well. You need their permission and if you have their permission then of course they’re again going to be a little bit on guard and this will really bias the results if this is what you’re interested in learning about. So we’ve had to come up with some creative and innovative ways to study some of the things that we’re interested in learning more about and I think along with that you know recording, conducting research, and huge amounts of data are always challenges for researchers across all industries. So specific to Lyft though, and specific to this area in particular, I think we have some really interesting challenges and it’s fun because it’s – you come to work every day thinking about – you’re really stretching your brain and you feel really good when you come up with something new and get to be creative in those senses.

Steve: Are there any process innovations or anything that you can share?

Monal: Not exactly (laughs). I can tell you about one of the first programs that I started when I joined. So we wanted to ensure that we had user testing for most of our projects coming through on a regular basis. So I started a program called Drivers in the Office, so D-I-T-O, which we call “deeto”. And along with that PACSITO, which is passengers, which we often abbreviate as PACS, in the office. So DITO and PACSITO, it’s like a weekly user testing cycle and it’s been a lot of fun. We have people come into the office here in San Francisco. Occasionally travel to other cities and occasionally do some remote studies wherever appropriate. But what we do is we have, and especially for drivers, for DITO, what we’ll do is we’ll actually have drivers come and bring their car, park in our lot and then we’ll have them use a prototype of a new design or product that we are testing and this allows us to capture everything that’s going on while they drive. So it’s one thing to be a passenger, in your house, on the street, looking at the app. You can stop, you can look at it for as long as you want in a maybe normal situation let’s say. But as a driver you have so many other distractions and concerns. I mean it’s a very cognitively complex activity. You are doing so many different things at once and now we’ve just added one more thing to pay attention to which is the app. So it’s a huge design challenge for our team and also from a research perspective it’s really important for us to be able to get the data in the right environment, in the contextual way that they will be using it because they’re using the app at an arm’s length. They can only look at it at a glance, most of the time when they’re driving for safety reasons. And we have to be really cautious about ensuring that it’s not overly distracting. So one of the things that’s been really great about this program is that times when we’ve tested like more low fidelity prototypes where we can’t go in the car, versus going in the car, we’ve learned quite a bit.

Steve: There seems like there’s an interesting, maybe sort of a hybrid – there’s like a semi-contextual aspect here where they’re not in a lab, there’s a high fidelity simulation of what the thing might be. You’re dealing with their actual car, you have realistic use cases, but you’re not also sending them off into the wild to live with this thing for a day. It’s a constrained experiment, but you’re sort of creating I guess as much context within an experiment as you can.

Monal: Um-hmm, right. And I think that’s important for us to – it’s a big part of how it’s going to be used in the real world so we need to capture that – how they do so in that environment.

Steve: What about other examples – within what you can talk about, what are some other kinds of research that your team has done?

Monal: We basically try to do as much mixed methodology as needed. We do surveys, as I mentioned contextual inquiries. We’re basically employing everything that we need to design the right kinds of studies to answer the business questions. You know we have some user interviews coming up. We have – what we also try to do is work very closely with our analytics team.

Steve: Can you talk about anything that has – that people that use Lyft have experienced that’s changed as a result of some of the research you’ve done?

Monal: Definitely. One of the most exciting things that I feel user research has impacted was our Lyft app redesign that launched at the very end of last year. So we did a lot of iterative user studies, user testing especially around various versions of designs and the product moving forward and most of the product and design decisions relied heavily on the research that we did. So by the time we launched the app, the new redesign, we felt and our team felt very confident that it would be successful once it launched and it definitely was. We’ve seen really great results and have heard a lot of great positive reviews and comments about what we did and I think a huge part of that, I feel really proud to say, was due to the research that we did. So whenever you can affect the flagship app on such a big scale with this design you can feel really good about it. Of course we don’t take all the credit. There’s such incredibly talented people here that you know we were just lucky to be part of that process.

Steve: I think even just this phrase about sort of being proud of what you accomplished, it makes me think about how do you lead a research team. I mean giving out kudos, I think that’s one part of it. I’m just projecting on you here a little bit, but what are other things? Talk to me about being a leader in research. What does that mean?

Monal: Yeah, I mean it’s interesting because I think until recently there weren’t a lot of leaders in research. Primarily I feel like research has had a lot of individual contributors and that’s how we’ve been seen in a lot of ways, people who are providing data analysis, insights, and hey, great, thanks you did your job. And now with the growth of research and especially user research we’re seeing more roles where there are leaders. And I think it’s great for all individual contributors because a lot of times – and this still happens today – the individual contributors who have worked as researchers in the past haven’t always reported up through someone who understood what they were doing. And so it’s great to see that that’s changing to a degree now and I think just in that sense of having the empathy (back to empathy again) for understanding like what these researchers are trying to accomplish, what their personal goals are – I think a lot of what researchers in general – I’m just thinking back through my career as well – have felt is, I mean we really want to make a difference. We really want to have influence on the product or the design. We did this great research, we want to make sure that it gets in there. And this hasn’t always been the case for me in past companies. And one of the great things here at Lyft is that what we’ve produced has been so highly utilized, probably more than any other company I’ve worked at. So coming back to your question in terms of leading, I think ensuring that the work that we do does get implemented or is seen as respectable, credible, and that people understand what we do and how we do it. And bringing them along for the ride too, it brings us to a point where within the organization you’re seen as oh wow, like they do really great work and they’re going to affect really positive change for us. So sort of helping the team be known and understood as experts and ensuring that the work that they do has a positive effect on the company and for themselves.

Steve: And you’re right. That fell on the individual contributor to sort of – you had to do all that you had to do and you had to be able to do that.

Monal: Um-hmm.

Steve: And so you’re talking about like what a good manager does. They sort of, they advocate for their team, they…

Monal: Um-hmm.

Steve: …protect them and champion them.

Monal: Definitely.

Steve: You talked early on about this really important mentorship relationship that you had in your career. Are there commonalities between what you experienced kind of as the recipient of mentorship and what you try to provide as a leader now at this point?

Monal: Definitely. I mean John who was my mentor back then, he gave me so much and I feel like I’m at the point in my career where I definitely can do that for other folks as well. So right now it’s primarily here at Lyft, but I’d love to do that outside of Lyft as well.

Steve: How do we define mentorship? We talked about leadership, but what’s the difference between mentorship and leadership?

Monal: Yeah, I think mentorship is a little bit more focused on – well in my mind helping folks develop new skills as well. Leadership doesn’t necessarily have to do that. I think leadership can do that. I mean if there was a Venn diagram it would probably be overlapping in the circles, but I definitely believe that a good mentor will help someone on a personal level develop into whatever career goals they have. Helping them along that path to get to where they want to be. And for me I learned a lot of methods from John. I think I’m trying to do that as well on a mentorship level, but also just be someone always who’s there to talk through, you know, career related growth.

Steve: You know in some ways – we’ve talked a little about sort of the evolution of the field and it makes sense that at that period of time developing methods, not that that was all you were learning, but that was kind of a key. And I think about your three areas – I mean the first one you listed, you kind of titled it as empathy and then you listed like 18 other, like really important sort of soft skills that are crucial to being a researcher. So when you’re there for people as you say I think that’s starting to model, yes the methods, but also now these other pieces that are so important to being a good researcher.

Monal: Um-hmm.

Steve: That are just very, to me very human skills that we all can – we all can do to work on.

Monal: Um-hmm, definitely

Steve: Are there other things about you and your background that we should understand to know what kind of researcher you are? Any weird dots you want to connect for us?

Monal: Yeah, actually I think one of the things that’s really unique to me is that I previously had another life as a professional athlete. So I was very active in team sports and that’s a big part of like my philosophy too. Very team oriented. I started playing soccer when I was six years old and I got a scholarship for running when I was at Stanford. So that’s been a really big part of my development and who I am and the really cool thing about running at Stanford is that when I started there and started running on the cross country team there we weren’t nationally ranked. And by the time I was a senior my teammates elected me as the captain of the team and we won the NCAA national title that year which was less than four years later and so it’s been really interesting to draw a lot of comparisons and parallels to my athletics career and how I’ve envisioned like sports and teamwork and accomplishments and my working career. And it’s funny, I don’t even realize I’m making these analogies. I started talking about it to a coworker the other day and I’m like wow I do this actually quite a lot. But there are so many parallels to draw with regard to pushing yourself hard, getting to the next level, goal setting, you know the competition and teams, relating to people, supporting one another. And one of the reasons why I think we won the national title back in 1996 was because we had such a tight, supportive team and I really feel that we have that now at Lyft as well. User research is part of the product design team and we are such a close team. Actually we just came back from a weekend retreat in Tahoe. So we’re kind of like a tight-knit family and I really feel that that support and that kind of environment fuels – helps fuel really great success.

Steve: So some of these parallels you’re describing, they seem about what athletics teaches you about working and collaborating. Are there any threads that you can pull from athletics that go into things that are specific to UX research?

Monal: That’s a really good question. I think I need a minute to think about it. It’s something I haven’t thought about before. I’m sure there are. I think I need to noodle on this, but I like the question a lot. I think because a lot of the parallels I’ve drawn have been more meta, like high level. Yeah, nothing springs to mind, but at some point I’ll get back to you. I’ll be like ‘Steve, guess what, I figured it out and now my life’s questions are complete. Like, I know. I see the light!’

Steve: Do you have questions for me?

Monal: Yeah, actually, so I was curious because you’re one of the few UX research veterans in the field and I was curious what kind of trends you’re currently observing in our little field that’s growing?

Steve: Yeah, I think – I liked what you said about mixed methods. Traditionally, on paper, there was a lot of us and them. I feel like I’m hearing more harmonious stories, and I think it behooves us as a practice to tell harmonious stories, because that speaks to a successful, integrated, mature practice – not a diva, you’re wrong, this focus group sucked, blah, blah, blah. That’s what we’ve been saying, and true or not it’s sort of disharmonious, and I feel like the stories of how groups have come together – to me that’s like, oh, we’re a little more mature as a field now because we can coexist well with everybody. And the qual/quant thing used to be cats and dogs, but as you said you guys integrate well. I’ve heard that from lots of groups that I’ve talked to and I’ve seen it. Clients I work with start to say, oh yeah, we have this group, this group, this group and we can kind of pull from lots of them – so that’s mixed methods. That’s also, I don’t know, sort of mixed disciplines I guess. To me there’s just a maturity level there. Yeah, that’s the big one I’ll leave as my answer for that.

Monal: I agree and I really think that in order to have the most robust insights we do need a good triangulation of different types of data to be able to tell that story. I was also curious – I mean as we talked about, UX research has come a long way from, you know, the mid-90s and where the worldwide web was at that time. Technology has come a long way and UX research has evolved alongside it. So that said, where do you think UX research needs to develop most given our current landscape?

Steve: These are the questions I should be asking you but I didn’t.

Monal: The tables have turned.

Steve: Yes. Yeah, what’s the sort of opportunity? And this came up as a topic in an earlier episode of this podcast series, where I don’t know that we answered it, but we talked about this idea of research evolving to the point where it disappears. Research right now is something that leaders are trying to develop in a business – and you’ve talked about the successful collaborations and so on – but is this an activity or is this a discipline? I hear so much about teams trying to – I mean one way to deal with the overwhelming demand for research is that research leaders are empowering teams to go do their own research, and so then they have to deal with this question of, well, what do we do as the researchers vs. what do we kind of let everyone do. That’s the consequence of the evolution of research, and we’ve been saying, as part of that evolution, everyone should be doing it, everyone can do it, right. There are books that teach people how to do research – that’s a change – and I think in the early days we struggled with do we let go of this or do we hold on to it, and we chose to do a bit of both. So how does that play out, I think, is an interesting question. How should it play out? And research doesn’t exist in a vacuum, so there’s how technology has developed. In a lot of environments where research is being used it’s a close colleague of design, right. I mean you talked about being part of the product design organization, not part of the market research organization. That’s who our teammates are. And design is changing, and you talked about this already too, right – companies used to just do technology and experience was secondary. So I think where research goes, we’re subject to these larger changes in how experiences are created. So I don’t have an answer to that and I’m not sure it’s a need. I don’t feel like, oh, we better get this sorted out or we’re dead. I just feel like there are some interesting, sort of existential questions to reflect on. Over the course of our careers you and I have seen what kinds of organizations and what kinds of roles are there for us. It’s shifted a lot, and as you said there weren’t leaders like yourself a few years ago, and not in the numbers that there are now. So even just for our own careers, it’s going to be different in a few years. I don’t know. It’s tough to predict the future, but I think those are interesting, if abstract, areas for me to keep looking at.

Monal: Um-hmm, definitely. I mean I think it’s – I agree with you and I also think it’s really great news that the discipline is growing because there is a lot of value that we can provide. And you know the Lyft research team is also growing so I think it’s a really good sign of, you know how much we’re able to impact the organization in a positive way and I love seeing now, after having come back from Berlin – previously it was like oh we can only work at big companies. And now I feel like every startup has a researcher. Maybe not every startup, but it’s becoming more and more common which is amazing.

Steve: You know the designer as founder sort of thing, it’s like a – it’s a thing that gets recognized and talked about and I feel like maybe we identified with – I feel like I was on some Slack channel or some list and people were saying well when do we get our first researcher as founder. I’m probably misquoting someone that said that to me and I apologize for that and that company may exist, but certainly that idea is nascent whereas designer as founder in the startup world at least is well established. Maybe that’s the, you know sort of next wave for us.

Monal: Um-hmm. Cool. I have one more question for you.

Steve: Alright.

Monal: So I’m really curious, and I know you’ve had a lot of questions about this podcast, but I was curious what you’ve found to be the most striking similarities among everyone you’ve talked to so far.

Steve: Don’t you love when you go out in the field and the person at the end says like “hey what else have you heard?” Or “am I normal?” They want you to kind of normalize them as part of it. That’s not exactly your question here, but yeah, what do researchers have in common? I think the emphasis on product is really – it’s a thing that we should be proud of as a field, and it goes back again to your list of three things that you talked about. Researchers needing to do good data collection – I’m paraphrasing you here – make sense of it and then help it live in the organization. I am definitely twisting what you said, but that’s sort of the thrust I took away from it. And I carry some scars from my early days of coexisting with people for whom research was the end in itself – and I’m also definitely guilty of this myself. Research is fascinating. If you do it well the experience is great, and you learn things that have no relevance. And I mean I’m a big fan of not knowing if it has relevance – you learn things that you don’t know what you’re going to do with, and you also learn things that are just great and not applicable, but are part of what you have to do on that journey. And then you make sense of things and you find some really interesting takeaways, or things that are potential takeaways that aren’t necessarily usable by the company. Of course then you find here’s the things that we learned and here’s what we need to do about it – but all the way along there’s a lot of fascination, right, like…

Monal: Um-hmm.

Steve: …it’s super engaging and to me very creative, and I don’t know, I can be seduced at many points along the way. I think I’m good at making sure I get to that end point, but there’s a lot of seduction along the way with stuff that is super inspiring, super thought provoking, that you kind of dig into and start synthesizing and organizing. So the scars are around, you know, maybe the early days in the field, or my early days, where that’s where people kind of dwelled, and someone had to come along and pull them and say, okay, you can’t deliver that. You have to show actionability. That sort of discussion was needed. And it makes sense that the people that I speak to, like you, who are leaders – in order to be a leader you have to be beyond that. There may still be practitioners and contributors who kind of need to learn more of that, but I feel like if I take the temperature of the practice in terms of how it’s being led, it’s about what we do with research.

Monal: Um-hmm.

Steve: And that is a massive change again over when I started in my younger days. And so to see that in common – again, maybe you go well of course Steve look who you’re talking to and what kind of jobs they have and look at what kind of companies they work for. You know again, yeah, you might find people at that skill level, like yourself, and experience level – if we were to like shift back 10 years you would be maybe working in an R&D lab or an advanced products group or something and not shipping stuff. So yeah, I think one of the commonalities is like these are people making products, or helping companies make services, whatever the thing that their company does they’re doing that and obviously that’s – I think that’s really, really good.

Monal: Um-hmm, yeah. One of the things that struck me about what you’re saying is that fascination, and I think as researchers a lot of what we love is having that curiosity, and it’s so fulfilling just to do the hands-on work. My role is managerial, but I still do some hands-on, individual contributor work as well. I just love doing it, right. That’s what makes us passionate about research, we love doing it, but I think in a business one of the dangers is getting too far into that fascination and then not taking that next step. And maybe that’s where managers can help with making sure that the data and the analysis and then the insights and actionable recommendations happen and are affecting product. That we’re not just doing the research and being fascinated for our own curiosity and fulfillment, but that we’re actually making sure it’s being put to good use.

Steve: I think that’s really important and it reminds me of a period in my own career where I was managing my own team in my consultancy. We always had many different projects for different clients, and I want to say maybe for a period of 18 months, which is a long time in consulting years, right, I didn’t go in the field at all. I was doing what you’re talking about – sort of managing the problem, helping with the synthesis, trying to tie everything back. Sort of a creative lead. And it was really cool – you remove one part of the puzzle and the other parts kind of rise in prominence. So it was really interesting for me to step out of certain tasks and step into other tasks in a larger way. And I actually remember the first time that I went back – it just worked out that, oh yeah, I was going to do some of these interviews – and I almost had a panic attack. It had just been so long. And I was not on my own. I just remember kind of coming up to somebody’s door and just about freaking out. And the way I work now I do more of everything, but having had that experience of playing just part of the role was really, really interesting. It’s good to hear that you – it sounds like you’re in the right spot. You still have your hands in things, but you can lead and kind of manage in that spot.

Monal: Um-hmm, yeah.

Steve: Alright well that’s maybe everything we have time for to talk about today. But this was really interesting. It’s one of these conversations where now I’m curious about a million other things. I’ll have my imaginary next 90 minute conversation with you. But yeah, thank you very much. You’ve really shared a lot today.

Monal: Thank you Steve. This was a lot of fun and yeah, I really enjoyed it.

Rank #3: 15. Leanne Waldal of New Relic

Podcast cover
Read more

Welcome back to Dollars to Donuts. This episode features Leanne Waldal, Senior Director of Product Research at New Relic. We talk about establishing research in an organization for the first time, building up a diverse set of research collaborators, and the pleasure of taking on certain types of challenges.

I’ve seen hopeful examples in startups recently where even though there are only 3-5 people they bring in someone for research and that’s highly unusual. Usually, if you’re starting up a company you have seed money, or you have your first round, the first people you’re hiring are engineers. You’re not making one of your first 10 hires a researcher. I know some examples now around the San Francisco Bay Area where they’ve actually brought someone in whose role is to do research really early in the company. And that either points to a certain amount of humility around, oh, just because I am the user it doesn’t mean I know the users. Or, because they’re going into a space they don’t know. – Leanne Waldal

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Well, hi, and if you’re a new listener, welcome to Dollars to Donuts, the podcast where I talk to people who lead user research in their organization. Otherwise, welcome back to Dollars to Donuts, the podcast where – okay, you got it.

It’s been a little while since the last episode, and I’m really happy to be back making new episodes for you. In a little bit I’ll talk about how you can support me, and the podcast, but first I wanted to mention an interesting article I’m reading. It’s from The New Yorker magazine, from the January 28, 2019 issue. It’s an article by Robert Caro, who is the author of, among other things, a multi-volume biography of Lyndon Johnson. This article is adapted from his memoir called Working: Researching, Interviewing, Writing, and his recollections about those activities are what caught my eye. He describes going to do research at the Johnson Library and Museum and going through boxes and boxes of papers. It’s the hard and tedious work of investigative journalism, and what struck me was the amount of inferring, and cross-referencing and delving he was doing – he was coming up with facts and perspective that were not there for the taking but came from analyzing and synthesizing the information that he did have, as well as what he didn’t have. Later in the article he talks about the importance of “place” in his research, that to understand Johnson he had to leave New York and really spend time in the Texas Hill Country. He writes “As soon as we moved there, as soon as the people of the Hill Country realized we were there to stay, their attitude toward us softened; they started to talk to me in a different way. I began to hear the details they had not included in the anecdotes they had previously told me.” And finally, he talks about silence in interviewing, citing two characters from fiction – Inspector Maigret and George Smiley, and their tactics to quote keep themselves from talking and let silence do its work unquote, where Maigret fiddles with his pipe and Smiley uses his tie to polish his glasses. Caro has his own brilliant technique: “When I’m waiting for the person I’m interviewing to break a silence by giving me a piece of information I want, I write SU (for Shut Up!) in my notebook. If anyone were ever to look through my notebooks, he would find a lot of SUs.”

Speaking of the connections between journalistic interviewing and user research, I want to recommend the podcast from 2017 “The Turnaround” – it’s interviews with different kinds of journalists about doing interviews, about getting to the story. I learned as much from the contrasts between our different objectives as I did from their best practices that apply directly. Check it out!

Now, obviously, I’m back with more episodes of Dollars to Donuts, I’m taking more of an open-ended approach to new episodes, and they’ll appear as they’re ready, without any specific frequency. It takes a lot to do this podcast and rather than taking advertising, or doing crowdfunding, I want to ask you to support the podcast – and me. You can hire me! I plan and lead user research projects, I coach teams who are working to learn from their customers, and I run training workshops to teach people how to be better at research and analysis. You can buy my books for yourself and your friends and your colleagues – I’ve got two books – the classic Interviewing Users and more recently, a book of stories from other researchers about the kinds of things that happen when they go out into the field – it’s called Doorbells, Danger, and Dead Batteries. You can also give this podcast a star rating and a review on iTunes, and you can review either book on Amazon. With your support, I can keep doing this podcast for you.

Let’s get to my interview with Leanne Waldal. She is the Senior Director of Product Research at New Relic. She has led research at Autodesk and Dropbox as well as running her own research agency for almost 17 years!

Thanks for being on the podcast.

Leanne Waldal: Absolutely

Steve: Let’s start with a basic introduction. Do you want to tell us who you are? What kind of work you do? What your company is?

Leanne: I’m Leanne Waldal. I am Senior Director of Product Research at New Relic. New Relic does application performance monitoring. It usually makes people’s eyes glaze over. Basically, we have a tool that developers and engineers, primarily that group, use to monitor the traffic on websites and mobile apps. So, if you think about streaming media, online retail, financial services – tons and tons of data and traffic – their site reliability team, or engineering team, might use us to monitor that to make sure nothing goes down. I’m based in San Francisco and I’m a part of the design team which is a part of the product team. The product team is mostly in Portland, Oregon. We have a smaller portion of it in San Francisco and we also have a team in Phoenix and in Barcelona.

Steve: Can we talk a little more about New Relic and who the customers are? You said to make sure that it’s not down, but I’m assuming it’s not just that binary state, like is Amazon up or is Amazon down?

Leanne: It’s to monitor it. So, be watching to see if something happens that’s slower. Usually it doesn’t matter if something is faster. It means you’ve optimized performance and things are running really well. So, if things are going slower or you notice that people are having longer wait times for page loads or checkout loads or media streaming, or whatever sort of thing they’re trying to do in your app. You would have spent a considerable amount of time setting our products up, basically, to then do all that monitoring and raise alerts when something goes wrong.

Steve: Okay. So, it’s not up and down. It’s slow and it could be any part.

Leanne: Exactly. Lag time. Checkout’s not happening in the amount of time that you expect it to.

Steve: Did you use the phrase reliability engineers?

Leanne: Yes. So, the new role that basically Google created about 10ish years ago is the site reliability engineer. They wrote a book about site reliability engineering. So, our target user is usually called a dev ops engineer or a site reliability engineer. If you are a more modern company with the way that you’re keeping track of your website and your apps, then you probably actually have a development operations team, or a site reliability team. If you’re not quite there yet, then these people might be a part of an engineering team, or IT operations or something.

Steve: That’s good context. Then you said, you’re part of the design team which sits in the product team?

Leanne: Yeah. So, at New Relic, engineering and product and design all report up to the chief product officer. Different companies do it different ways. Engineering might be alongside product or design. Design might be alongside. But at our company, if you’re in the product org, you belong to the design team, the product team or the engineering team, or a few different teams that are in the org.

Steve: How long have you been in the role that you’re in?

Leanne: I’ve been at New Relic for six months. I did a similar job at Autodesk, before New Relic, for about a year and a half. And before that I had a similar role at Dropbox for a couple of years.

Steve: How do you compare and contrast those three organizations and what you saw and what the trajectory was in those roles?

Leanne: We could start with Dropbox. Dropbox was private, pre-IPO. It was very much a unicorn. When I started there it had 400-600 people. I don’t remember exactly how many. There was no research. They had a small design team. They, like a lot of tech companies at that time, being a unicorn, had tons of engineers. And also, because Dropbox is a consumer app as well as a business app, everybody used it, so they felt like they knew who the user was.

Steve: Everybody that worked there?

Leanne: Everybody that worked there, yeah. So, I started setting up a research team there and grew it as the design team was growing. That’s different from Autodesk, where I went next. A very old company, very fascinating products with fascinating customers and use cases. And people who had been researchers there for a long time. What I was doing there was combining research and analytics together. At Dropbox, analytics was separate. And at Autodesk I had a small team that was more globally dispersed. At Dropbox, my team was all in San Francisco, to start. Here at New Relic, when I joined, there was one researcher here in San Francisco and one in Portland, and a similar sort of tone around a lot of the people who work here at the company – they are, or have been, site reliability engineers, so they know a lot about the market and a lot about the users. So, we’re basically here to help the company understand all of the new users it’s acquired since it started, because it’s a different company now than it was 10 years ago. Ten years ago, New Relic was a company that was mostly reaching out to developers and engineers to use its product, and now we’re mostly focused on enterprise and we have lots of enterprise customers.

Steve: There’s that classic – you referenced this, right. People think they know their customer because they are users.

Leanne: Yeah, we all do.

Steve: But you’re kind of saying yes, to a certain extent. I don’t hear you shutting that down?

Leanne: It’s like yes and the marketplace out there has changed and maybe there’s something new that we’re not looking at right now. And also I’ve noticed that for companies where they maybe used to do contextual inquiry and go out and visit their customers and users and understand them deeply, in the last 10-15 years companies have tended to move to surveys and remote interviews and stepped away from that sort of like deep understanding you get from being in a room with someone, or on a construction site, or on the road with them when they’re using your mobile app, or whatever it is. So, you can see everything around them while they’re using your product or your app. And when you move away from that you lose sight of that holistic story of the customer experience. And then research can come in and do that for you.

Steve: Why do you think there is that movement towards the remote and movement away from the contextual work?

Leanne: Contextual work takes time and effort and can be exhausting. Exhausting to do it and exhausting to come back and know what to do with all that data, how to tell a story out of it and how to decide how it can have impact or value. Or sort of like, now what do we do? I think also as humans we get really familiar with where we are, and we neglect to notice that the world around us has changed.

Steve: So, that’s back to your point then. We started this company based on work that we’d done and skills that we had and users that we knew, that we were, and the world moves on and shifts and new customers…

Leanne: And it’s not just New Relic or Autodesk or Dropbox. That happens at any company. You can be in banking or in legal. Ask anybody who has been there a long time, compared with a person who just started. They’re having much different experiences of it.

Steve: You described, especially with Dropbox and New Relic, this point at which the company had little or no research, and that is not unique. We know lots of companies like that.

Leanne: Oh yes.

Steve: Do you have a hypothesis about what’s going on? There’s a point at which someone like you starts talking to these companies and starts saying, “hey, here’s what I would do if you had me come in and work with you.” What’s going on beforehand? What’s the point of need or pain that’s identified inside these companies when they realize…

Leanne: Sure. So, what I’ve noticed at the companies I’ve worked at, as well as companies I was interviewing at for this sort of position, and companies I know the story around when they hired someone in sort of like a director or higher role in research, usually something changes in revenue and that change in revenue sets off the question of how do we figure out what’s going on. And if they have someone who is currently working there in a role who has had experience in the past with doing market research, or doing churn research, then those people will start to raise the flag of like oh, we have to do research. And then it’s how do we do it. So, that’s one reason that someone brings someone in to do it. Another reason is that there’s sort of like a groundswell of engineers, product managers and designers who are telling the people who make hiring decisions, “we need someone to be in charge of design.” Or, “we need someone to be in charge of research. We’re doing a lot of this ourselves and we don’t feel like we’re doing it well enough.” Or, “we, for example, do lots and lots of interviews. We don’t have time to synthesize the data. So, could you please hire a researcher to work with me.” So, sometimes that happens. Another way it happens, which is how it happened here at New Relic, is that the company decides to hire like a VP or SVP of design, and then that person knows, oh, this is what a design team looks like – you have someone who is focusing on content design and language and someone who is focusing on interaction design and visual design. And I also need someone who is focused on research. Sometimes it’s from the top down. Sometimes it’s from the ground up. Sometimes it’s that there used to be this role at the company and someone left or was laid off and then they just didn’t backfill it and then after a while realized, oh, that’s a role that’s really important.

Steve: We need that.

Leanne: Yeah, and it’s not – I used to think that maybe it was sort of like a trend, like everybody is starting to be customer centric, or user centric, or whatever that means. So, then they were like, “oh, well Joe down the block is doing that, so we need to do this too.” And what do we do, we hire someone to run research. But, as I have been interviewing for jobs over the years and talking to different companies and doing these jobs, I’ve realized, no, there’s actually something that happens inside the company that causes that change. It’s not so much like copying a competitor or another company that you care about. You might do that with engineering or something, but research is sort of like the first thing to go if you start running out of money. And, I’ve seen hopeful examples in startups recently where even though there are only 3-5 people they bring in someone for research and that’s highly unusual. Usually, if you’re starting up a company you have seed money, or you have your first round, the first people you’re hiring are engineers. You’re not making one of your first 10 hires a researcher. So, I know some examples now around the San Francisco Bay Area where they’ve actually brought someone in whose role is to do research really early in the company. And that either points to a certain amount of humility around, oh, just because I am the user it doesn’t mean I know the users. Or, because they’re going into a space they don’t know. They just had a really good idea about it and they really need to understand the space.

Steve: That’s an encouraging sign.

Leanne: Yeah.

Steve: Is there a distinction between – you gave a number of different scenarios where research comes in – do you see a distinction in sort of the context, or the action that’s being taken when bringing in a person to do research vs. bringing in a person to lead/manage/run/build research?

Leanne: Yeah. So, if you’re brought in as a person to do research you’re usually reacting to the things you’re being asked to do. You usually don’t get to pull your head up and do strategic work. You’re doing lots of like compare this design with this, do this survey to answer this question, help me with like interviewing these six users in remote interviews. If you’re being brought in as someone to say like where should research fit into the company, it’s more challenging, but more interesting and exciting because then you’re being told like tell us what we should know? Like where should we go with this? Are you going to focus more on the market research side, or the sales research side? Are you going to focus more on the product design side? What are all the things that we need? What kind of programs do we need? How do we get access to users? How do we interact with users and engage users? There’s just all these little pieces of how a research process and practice runs inside of a company that is great fun to put together because you have to do it based on who you’re working with and the culture of that organization. It’s not just a cookie cutter that you put into a company.

Steve: Is there a trajectory from one to the other? Like someone that’s hired because the need is to do research. Is there a trajectory for that person to start answering the kinds of questions that you’re talking about?

Leanne: Yeah, yeah. I think if that’s sort of like the way you work, or your personality, or the way you’re motivated, you can get there. For example, I have someone on my team right now – she’s a researcher – and I said to her the other day, “you could become, if you wanted to, a really amazing research operations manager because you’re really good at all these pieces around managing the research and communicating it.” Because half of research is PR and sales. You have to find the people to engage with it, to sell it to at the end. You know you want your research to have impact on product or a marketing campaign or whatever you’re doing. So, you have to gather the people around you. And you have to keep track of, particularly in a B2B company, who are all the people who have to be involved if I want to visit customers. In a consumer company you just go out and find people. You talk to them and you do whatever you want with them. At a B2B company you have to be engaging with customer success, with sales roles – whatever they’re called – account execs, account managers – whatever that company calls them. You have to often also be engaging with product managers, designers, design managers. Engineers didn’t use to be, just as a stereotype, interested in research, and now at some companies they are very interested. And so managing all those types of people that you either have to get approval from, or you need to make sure that they’re bought into what you’re doing, or you need them to come along with you and do it, takes a lot of work. So, there’s this whole new role of research operations that usually helps keep track of that. And I’ve noticed some researchers have a real knack for managing all those details around their project and then shining a light on it and going out and selling it, sort of like a marketing campaign. They just know who to schmooze with and talk with. And some researchers are really good at doing the research and writing the report and sharing it. I pointed out to her, “this is something you are doing for our team without being asked. Like, that’s a real superpower.”

Steve: Does that start to change the – I think you said this, at least indirectly – does someone who is shining around operations, like your colleague here, does that start to change how research is perceived or experienced by others in the organization?

Leanne: Yeah. Especially if you’re an organization that’s not used to research. Or they’re used to doing it themselves. So, they’re not used to partnering with someone else and they’re not used to having someone say like, “oh that’s not really a survey. That’s more of an interview.” Or, “oh, that’s not really an interview. We need to actually go watch people to do this.” Or, “this can’t be answered in an interview survey, or watching people. We need them to be in a diary study.” They aren’t used to having someone bring in all these different methods. And they aren’t used to having someone sort of like find some insights and then tell them. So, that’s why it’s so important to have a collaboration. I want you with me while I’m doing this research so that when I tell you the pens need to be blue, not red, you were there to hear everyone say they like blue better than red, instead of throwing the report over the wall.

Steve: I hadn’t really thought before of operations as a – I’m not sure you’re saying this – but, almost a Trojan Horse. That there are these sort of tactical objective problems to be solved under operations. How are we going to find these people? What is it going to take logistically? What is it going to take legally and so on? That starts to change some of the conversations maybe that can happen internally. Like you said, “this is your problem. We recommend you go about it this way.”

Leanne: Yeah.

Steve: I hadn’t thought of that under an operations lens myself.

Leanne: It’s really more in a B2B company than a consumer company because you have to manage so many relationships and oftentimes a researcher wants to focus on sort of like the details of the research. Like ask any researcher, they don’t want to do all the details around recruiting and scheduling people – or, most don’t.

Steve: Right.

Leanne: And so, you want someone who can do that. And then sometimes researchers also aren’t the people who know how to best sell their work or edit it down into the three bullet points for an executive presentation. And it’s helpful to have somebody who knows how to do that. You know, just sort of like editing and presenting and PR and marketing. And also knowing who to market it to. Like, oh, I know that there’s this sales VP in this office in Toronto who is going to be interested in this. That mentality of keeping all these dots rolling together. And if you have someone who knows how to do that and also understands how to do research and what goes into research, that’s what I see makes a really good manager from an operations perspective. There’s the people management side too. Being able to mentor and coach and take care of people. But there’s more and more of a need in B2B research for someone specifically in this operations role to just sort of help everybody and raise the brand of the team within the company, or within the product team or the design team.

Steve: It’s so cool that we’re at a point in this field and this practice where we acknowledge, as we always did, that there’s a number of very different kinds of skills and expertises that are required. And you just outlined a whole bunch of them. But that we’re able to sort of say, and not be laughed out of the room, like oh, that might be different people. You talked about collaboration and collaboration with people with different strengths or superpowers.

Leanne: Yeah. And another expertise and skill that’s being introduced more into companies is the academic anthropologist who is now working inside companies on a research team. Like EPIC used to be primarily all academics and now there’s a bunch of people from business involved in EPIC, and also people who used to only be academic anthropologists and ethnographers working on research teams, inside tech companies or consulting agencies, or whatever. So, I recently hired a few anthropologists on my team. One who is a former professor, one who had applied anthropological research practices for market research. Another who had done research with an NGO, but also worked inside of a company recently. And I think that when you have that academic background and you bring it into a company, it’s like a special sauce. You also bring in a special sauce if you’ve been working in tech for 15 years and you know the ins and outs of how a product team works, but I like mixing those two things together because people learn off of each other.

Steve: You like to mix the special sauce.

Leanne: Yeah.

Steve: So, it’s an extra special sauce.

Leanne: Yeah. So, for example, we decided as a team that we wanted to look at all of the internal blog posts and other things that had been written about customers and users, from the point of view of people inside the company, because my team is new to this company. We were like okay, let’s take a look at everything that everybody else wrote. And I was trying to figure out how we would do that. Like, who knows how many internal blog posts there are and how many things have been written about customer visits and things that different people know about users that they report out internally. And then I realized, I have some former academics on my team, academics who are now working as researchers in industry. They have skills in literature review. So, brilliant. I took them and had them put together how we were going to review this and code it – special sauce, they absolutely know how to do that. It was really easy for them to put it together. And then I took other people who had more industry experience, or application-focused experience, and said, “okay, now you can do what was set up. You can do the review, following the system.” So I had the people who know how to set up systems set up the system.

Steve: So, that’s an example of the special sauce. They know how to deal with this.

Leanne: Yeah. And if you have a team who has mixed experiences like that, then they lift each other up and they teach each other things. And that’s something that you can’t learn if you’re the only researcher, or you’re just one of a few researchers who all have exactly the same background. So, you all come out of the – whatever the programs are now, at Carnegie Mellon, Stanford and Berkeley and University of Washington and on and on. And you go and get a job and you do remote interviewing and some focus groups and some other things. But everybody around you is doing exactly the same thing. Then you don’t get a chance to sort of up-level yourself. And also, if you’re the person managing that team, you don’t get a chance to learn new things from people on your team that you hired who don’t have the same background as you. So, it just improves humanity basically.

Steve: So how do you think about like what the – if the team is kind of eclectic now and you have all these different mixes of skills and backgrounds and aptitudes, how do you think about what the collective should be? How do you think about the mix?

Leanne: Of a whole team?

Steve: Yeah

Leanne: So, I do miss the team I used to have which had both research and analytics. I think being able to have people on your team who do data science, or who are more like business analysts who know how to count things up and make charts, or someone who knows how to mix data together and come up with reports out of it, you get a much faster feedback loop between the analytics people and the research people. Analytics is like, oh I did this, I want to know why? Research is like, I can go out and find out why. Or research says, “I need some access to people who have certain characteristics, can you find me a list of them through analytics? Thank you so much, I’ll go out and talk to them and then I’ll come back to you.” In most companies those roles are separate. So it is here. Analytics is a separate team. We work super well together, but sometimes when you have them both on the team it’s really nice because you’re both at the same team meeting and you’re both seeing what they’re doing. From a research perspective, having people who are all quant or all qual, or halfway in between or know both, is useful because anybody who’s in the job market right now, when they look for their next job the requirements are going up and up and up. We don’t want you to just be somebody who does interviewing. We need you to know how to design a survey. We need you to know how to do SQL queries. We need you to know how to do – like there are more and more requirements showing up in job descriptions now. And so, if you can be a part of a team where somebody else knows that thing you don’t know, it just – we’re all going to leave and get another job someday. Nobody stays somewhere forever. So, it will just help you when you’re going out to look for your next job.

Steve: So, skill development, kind of through proximity to your colleagues.

Leanne: Yeah. And it’s also diversity. Any time you have a diverse team, whatever you’re defining diversity as, it just makes for a better team environment, because all these different perspectives around the table, from all the different places they came from, make for different things to feed into feedback about a project you’re working on, or something that’s going on in the company. You know, we didn’t all come from the same school or the same program, so we don’t all approach a problem the same way.

Steve: So, for you, thinking about the teams that you’ve put together – because you’re talking about different kinds of – I don’t know if diversity is the right word, the way you’re using it, but qual and quant are different orientations, or…

Leanne: And also, academic and industry. Or in tech, because I’ve worked in tech for 20+ years, I look for people to come work at the tech company I’m working at who haven’t necessarily worked in tech before. So, look for adjacencies. Like you did research, but you did it for a car company. Or, you did research, but you did it for an agency. So, that they can bring different things into the team. That’s what I mean by diversity. You know research, but you used to be a professor. You know research, but you were working on an astrophysics PhD and decided that’s not what you wanted to do anymore, and you realize things that you learned as a part of your graduate program – so you totally know how to do surveys and how to talk to people and how to do research.

Steve: I had this Twitter conversation with someone the other day and they asked me where is a writing course that I can learn to be better at writing the kinds of things I have to create as a researcher. They can’t find things like that. They can find persuasive writing which is for sales. I kind of scratched my head and said, “I don’t know that that exists.” I went into this little sort of pontification, as Twitter invites you to do, about learning from adjacencies like take a creative writing class. Take a journalism writing class. And in doing that, you would start to see, oh here’s how the lessons of doing this adjacent thing well could apply. That was the best advice that I had. I feel sort of affirmed, from what you were saying, that adjacencies for certain kinds of things bring a certain kind of value.

Leanne: So, the Luma Institute, who put this framework together around design thinking methods so that it’s easy to teach and easy to learn, calls that alternative worlds. And I think that’s a useful way to think about it, that for your person who is trying to figure out how to write a research report for whatever audience they’re trying to serve, well that’s storytelling. It’s sales, it’s marketing. It’s slide development. It’s all these other things. And when you start thinking of it as how would I think of this from the point of view of a salesperson? Or how would I approach this from the point of view of a marketing person who has to put customer stories on the web and sell that to somebody? Or how would I do this as the point of view of the person I am presenting this research to? Like what do I want to hear? It gets you out of the sort of like oh I have to have 10 slides and one has to have my methodology and the end one has to have like recommendations. You can start thinking about it more as like I want to tell this story so that they land on that naturally. Or, they ask me questions about it when I’m done that will answer all those things, so I don’t have to have it all in my report.

Steve: I feel like this is a transgressive notion of expertise though, especially in tech.

Leanne: Yeah. It would be nice for all researchers if they could just come in and do their research and research was valued and anything that researchers said they were learning from customers/users was taken and put into the roadmap. I don’t know anywhere that happens. And because there isn’t anywhere that happens and the way the culture in these teams work, you have to have this sort of like marketing and collaboration and partnership mentality around your work. If you’re a consultant, you get a project, you do it, you collaborate, you sell it a little bit – you’re done. You don’t have to still hang out and make sure it’s being used and follow-up.

Steve: It’s just all sunshine and roses being a consultant, right!

Leanne: Exactly. That’s why I don’t do it anymore.

Steve: I had a coach advising me, in my consulting business, and what they said was – they said many things, but one of them was anytime you start talking with a prospective client, show them – show me how you’ve helped someone like me. That adjacency kind of framing – the advice was that wasn’t as persuasive. So, that could be what have you done in my industry, in my vertical?

Leanne: Yeah.

Steve: You know, and I think researchers, or anyone that just likes to sort of – that gets deep into different kinds of problems, you know we see connections between things that are – you know we know why this problem is like this other problem, even though it’s different.

Leanne: And you want the other people to see that and sometimes it’s hard to show them a lens that will help them see that the way you saw it. Yeah.

Steve: I think the advice was that’s not where you start the relationship. You start the relationship a little more close in. “Oh, this thing that you’re doing. Yes, I have done that.” Even though every problem is new in ways that – it’s very hard to do that.

Leanne: Yes. Yes.

Steve: Sort of starting the – showing relevancy right at the outset of the conversation.

Leanne: When I ran a consulting agency, that’s how I started it. Somebody asked me if I knew how to do something and I’d never done it before and I said I knew how to do it. And all of a sudden, I had a consulting agency.

Steve: So, your career is just built on lies.

Leanne: Yes.

Steve: Is what we’re saying here, okay.

Leanne: Yes.

Steve: And we’re leaving this in. Yeah, well I think that’s true too. There’s the confidence to explore. I mean can you do this? Yes. Have you done it? Well, no, but I can do it.

Leanne: Yeah. And sometimes the question never comes up, have you done this? It’s just the can you?

Steve: So, that’s – so, we’re talking sort of requests in the consultant side of things. Inside organizations, in any of the ones you’ve worked in, like how are projects formed? Who decides?

Leanne: So many different ways. So, I would say that if you’re beginning research in a company, you’re taking requests and you’re probably using your boss or some other key stakeholders you recognize to decide which to work on. And you’re probably prioritizing those in some sort of backlog, if that’s in a spreadsheet or it’s a Jira or whatever. So, you’re showing people sort of like here’s what’s in my backlog. Here’s what’s coming up next. If you’re a team, the way I prefer to run a research team, which is the way I’m gradually getting it set up here – how it was set up at my last couple of jobs – is to have researchers who are paired or embedded, or whatever you want to call them, but they’re primarily working with a product area. A product team, a feature team, a product area, however your organization is set up. And they are servicing that team and partnering and collaborating with that team. And they own the research for that. But because they’re a part of the research team they also have a holistic view across the organization, because they’re also meeting with other researchers who are doing the same thing and seeing where their work, or where their product area or features sort of overlap, or interlock with each other. And then having some people who are more senior, principal researchers, just working on special strategic projects. So, that could be, depending on a company, that you have someone who is more focused on market research or competitive research. It might be someone who is focused more on research to understand where we should go next? So, sort of like future casting and sort of like looking at what’s going on, finding edge cases, seeing spots for innovations and other sorts of insights. And then also the operations side. People to manage panels of users and recruit people. And then there’s always, depending on how many people there are, some sort of management tier for people management and coaching and that sort of thing.

Steve: And when you talk about embedded you – it’s certainly a frame that you hear described with some regularity, but your words are a little different. Some people talk about embedded, that that researcher is on that team, sort of fully on that team. You’re almost describing like a workstream or a customer or something.

Leanne: Yeah, because the way people work on teams now is not the way that HR software recognizes. So, HR software, for example, recognizes I have a boss. He’s a VP of design. Recognizes I have people who report to me, all the people on my team. Um, maybe recognizes that I have people that might be good for 360 reviews. So, my direct colleagues who also report to my boss. But, that’s such a limited view of what I call my team because – I’ll just talk for me. Like I work with lots of different people around the company who, if I was going to be reviewed, I would want them to weigh in. People on my team – actually, we had a research team meeting this morning. I had them all bring a stakeholder map because I wanted to see – now that I have people that are sort of like embedded or paired within teams, like who are they working with and what does it look like, compared with people on my team whose work is sort of nebulous around all the different people that they’re managing and all the different work that they’re doing. And it turned out that way. So, if you’re a researcher on a team – a lot of people refer to the three-legged stool, but if you have a researcher it’s a four-legged stool. So, you have someone from research, from engineering, from product and from design, all working together with a common focus on how are we going to get this out the door and make sure it meets customer needs, solving a problem, all that sort of stuff. If you are somebody in operations or a principal researcher working on a strategic project, you just have a mess of people around you. And so, all these people sort of vacillate in between teams. Designers do the same thing. Designers are a part of the scrum team and they’re a part of the design team. Engineers are a part of a team for some sort of product or feature. They’re also a part of an engineering team. And it gets even more complicated when your manager is in one city and you’re in another city. Because if you’re enough time zones away you probably have somebody who is sort of like your dotted-line manager in that city to help you with all the things around your job. For example, I have a researcher in Barcelona. We can’t look over each other’s shoulders. There’s only two hours a day when we’re both at work. So, there’s a director of design in that office who is her sort of – who is her manager there. So, if she has questions about how does this work? Or how do I do this? Or who is in charge of this? Or help me review this? He can do that. But no HR software recognizes that. It doesn’t recognize she basically has two bosses. It doesn’t recognize that I have one boss, but actually these three other people are kind of my bosses too. It doesn’t recognize that that designer has a boss in whoever is running that scrum team, but that designer also reports to a design manager or director. So, most people work that way now. There are very few people inside companies anymore who will say they belong to one team and that’s all they belong to. Everybody belongs to multiple teams, and we have this archaic structure where one person decides your performance review and hopefully gets input from all these people. But it just lines people up in rows and doesn’t realize that it’s a little messier than that.

Steve: So, does that gap between the reality and the structure of the software, what’s – are there consequences to that for how we work?

Leanne: I don’t think there are consequences. I think it’s just the way it is. I think there are more 360 reviews now, just in my experience, than there were 20 years ago, because you work with so many other people now. I can remember jobs from the early to mid-90’s. Nobody was doing 360 reviews at any of the jobs I had then. Like I worked with a set of people in a geographical space. I didn’t work with anybody else. I think it’s just more like it is what it is, and things have come in to help manage and understand that, but we, the people who do the organizational management, haven’t figured out how to accurately represent that in the structure of the software that manages the company. Some companies try to go – what was it, like Zappos, that decided nobody had a manager. That seems like part of a reaction to that – Holacracy, that’s what it was called.

Steve: So, we got into this because we were talking about what does it mean to be – there sometimes seems a tension between researchers are embedded, or researchers are centralized. And I think what you’re saying is they are neither, or both. And that that’s how work happens.

Leanne: Yeah, and you have to look at the culture of your org. How does your org work? Who wants to participate and who wants – it’s that old RACI model. Like who wants to be involved? Who wants to be informed? Who wants to partner with you?

Steve: So, wait.

Leanne: Do you know the RACI model?

Steve: I know it, but I need to clarify it.

Leanne: It’s where you define – I might not get this right. It’s like where you define who is responsible/accountable, who is interested in consulting on it and who just wants to be informed. So, if you’re thinking about a project, you always have people who sort of fall into those categories. So, I just think it depends – some cultures are very top down and as you go up the ladder they only want to be informed and not as involved. Some cultures are more participatory and as you go up the career ladder, everybody wants to participate and know and be involved. So, it just depends where you land and get a sense of who just needs an inform, who actually wants to come along and sit with you while you do a remote interview, and travel 3 blocks away to visit the customer. You know how far up the ladder does that go?

Steve: So, coming into these organizations that have done very little with research to date – like understanding the culture, it seems like that’s got to be an initial step to get the lay of the land?

Leanne: Yeah. Who’s making decisions and what data are they using to make those decisions? Do they want any more data? Usually the answer is yes. Anybody would want to know more to make better decisions and, in a corporation, make more money. So, figuring out who those people are, where the gaps are, what they need to know. And then another thing you can do when you start at a company is be the researcher yourself of that org, because what you’ll start to notice, just like in any research project, is that you’ll start hearing the same thing from multiple people. So, you’ll start hearing like, “the one thing we’ve never figured out is whether people really like blue or red better.” There’s always something like that – the one thing where people aren’t aware that everybody has the same question. But because you got them to open up and talk, had casual conversations, and used all your interview tips and tricks to interview multiple, different people who don’t work with each other, you’ll start to see a theme around something that everybody wants to know, but nobody knows, and nobody is trying to figure it out because they don’t know everybody else wants to know. I’ve seen that at every company. We’re working on a project like that here right now, where there are basically two things that I heard across the entire company when I first came here. So, I was like okay, that’s where I’m going to start some key research projects, and then I’m going to bring in some people who are interested in it. And then as I hire people in, I’m just going to start attaching researchers to different product areas and work with the people who run design to think about which designers need the most help right now, or what meets the business directives, or the key priorities, or the goals, or whatever the product org of the company is focused on.

Steve: Are there stages of evolution? I don’t know if it’s milestones or some framework, but at these points you’re coming in where there’s very little and you start building in a way that is specific to what you’re learning about the culture. Are there stages that you could identify that you’re passing through or moving towards?

Leanne: Oh, yes. So, once you get programs and tools set up you don’t really have to do as much anymore. Programs and tools like: is there decent survey software? Do we have access to users? How do we access them? Is there a panel? When we do research, do we have standard agreements that we sign with the humans that we’re doing research with? Repositories – where are we storing everything? How are we storing it? How are we accessing it? Do we need transcripts? Do we have a tool for that? Do we need to do card sorting? Do we have a tool for that? Just sorting out all the tools and processes for the practice. What was the question? I got lost down a rabbit hole of process and tools.

Steve: You were answering the question. So, what are the stages or milestones?

Leanne: Oh yeah, okay. So, looking at what’s needed that is more like capital improvements. So, like we need a new roof. Okay, we’ll get a roof, it will last for 20 years. Done. Things like how you pay incentives and research agreements, that’s sort of like the roof. Then you need everything that pays the electric bill and the gas bill all the time. Those are projects. So, what projects need attention right now? And you learn that from interviewing all the key stakeholders you’re going to work with – which, in my example earlier, was once you interview a whole bunch of them you start to find out that there are some key things that everybody wants to know. And then figuring out, either with your boss, if she or he knows, or with key stakeholders: if I was to start bringing in researchers one at a time, where would we start? Who would benefit the most? What would be most impactful? Or said another way, what are the things launching soon, or later? What are the business goals, or what’s in the strategy? Or what are the business directives? All those sorts of things that make a company run. And that helps you because then you’re making sure you’re focusing on something that is actually going to eventually affect the company’s bottom line. So, when the research comes out and it helps that, you can say okay, this is how we do it at this company. We’re not academic. We don’t sit back and think about things and just, you know, go research things out of curiosity. We look at where the company is going to make sure we’re helping it and helping it do better with the research that we’re doing.

Steve: And then what’s a five year – that’s a presumptive question. They’re all presumptive questions. What’s sort of the horizon for which you would have vision? The time horizon?

Leanne: For the times that I’ve come into a company and there’s been no research, or, like here, one or two individual researchers, what I’ve told people is it’s going to take 12-18 months, with budget and with head count, to get this into a modern research practice. And here’s how we’ll do it incrementally and here’s how we’ll check in and benchmark and measure it along the way. When I was at Autodesk and I inherited a team and then hired in and added on to it, it was much different. These were really old products that had a really amazing install base. And some of them were market leaders. It wasn’t like companies that were two years old, five years old or ten years old, or fifteen years old. Once you’re at 30 years you’re settled in a certain way. So, I did different things. I tried to better understand the products and better understand the company to see where we could add value. And then also, because that was a much larger company, see where research was happening across the company to see if we could combine efforts.

For example, say I went to some company that’s 25 years old and has, say, 1,000 researchers – because there are some companies out there now that have that many – and I was plopped in as a director of research with 20 to 50 direct reports. It’s a much, much different role than coming into a company and setting something up. There you’re coming in and you’re saying how is this done? Is there any way it could be improved? Is there any way I could sort of do – no, there isn’t? Okay. So, how do we prioritize things? How do we have impact? How do we collaborate and share out and be good corporate citizens and be a part of the team?

I’ve realized, with this third job that I’m on now, after doing consulting for many years, that I do really like growing things. There’s a challenge to it that like gets me into work every day. Like, oh, we have to figure this out. Oh, there’s lots of problems to solve. Oh, how are we going to prioritize this? Or how are we going to solve for this? You know, how am I going to find someone to fill this role?

Steve: You just listed a bunch of things as positives that someone – if you’re in a different emotional state or a different sort of mindset – could frame those all as negatives.

Leanne: Yeah. Yeah. You have to understand the challenge you like to work on. Is the challenge I want to be the person who works with the team to help them figure out what they need to solve? Or do I want to solve an organizational challenge while also working on business priorities and solving problems?

Steve: So, this is something that kind of sparks you?

Leanne: Yeah. And I didn’t realize that until – it was like 3 months ago or so. We hosted a breakfast here and somebody was asking me what I was doing, and they were like, “Well it sounds like you’re really good at that because you’ve done that three times now, and you started your own company.” And I was like, oh, that’s the first time I realized like this is my thing that I like to do.

Steve: Yeah. That’s great.

Leanne: Sometimes you have to be almost 50 to have finally found yourself.

Steve: Right. It’s the journey, not a destination, right.

Leanne: Yes.

Steve: I’m sure you’re going to keep finding yourself.

Leanne: Yes.

Steve: Anything else that I should have asked you about that you want to share with us? The collective us. I know it’s just you and I in a room.

Leanne: Well, there is one other thing I’ve noticed in the past few years. I have used a community platform – one that’s sold by a company and meant for marketing and brand teams – for product and design research. There’s something about – so, support always has its help forums. Marketing and brand usually do focus groups, but now sometimes they run online communities where you can get badges and win things for participating in their research, or telling stories, or whatever. There’s a space in between that people who are on research teams at companies sometimes use and sometimes don’t. So, a typical team might have a panel, particularly if you’re B2B and it’s hard to get access to people. Or even if you’re a consumer-focused company and you just want to make sure you have your 10,000 people who you can access at any time to easily invite to research. But I think there’s something about the way that people on social media interact with each other in a group, in one-to-many fashion and many-to-many fashion, that is happening in some ways in research teams, and sometimes not. So, if you’re a researcher you can, at any time, poke into a community that belongs to you. With a panel, the people in it can’t all see each other. In a community they can see each other. And so, it acts a bit like a support forum, and a bit like that marketing and branding what-do-you-think-of-our-new-colors sort of thing, or this is our new video for our marketing campaign – but instead it’s for product and design. I’ve noticed there’s a certain fear of it – oh my God, if we get our users and customers to talk to each other, what will they say about us? But I think if you get over that fear and you open it up and put together a private space where they’re all talking together, and you and your product, design, engineering, marketing, sales, whatever partners are also in there, you create a sort of brand loyalty and product loyalty that you don’t necessarily get just from marketing, brand and sales, and just from support. So, I’ve done that in my last two companies – created this sort of community where people could see each other, and we could interact with them in little pop-up sorts of projects. I haven’t done it here yet. But when I talked to the vendor that I’ve used for doing this, the last time they demoed it for me, I saw that the examples they were giving me were for a product and design team. So, I was like huh, you’re now selling to product and design teams. So, there’s something going on with people picking that up and doing that.

Steve: How do you set people’s expectations, the people that are participants in this community, for what – as you said, it has elements of other kinds of communities that are maybe more familiar…

Leanne: Well, you have to make sure, particularly in business settings, that – you know don’t share anything that you wouldn’t want your competitors to see, because you don’t know if your competitor is in this community, or not. We’re focusing on sort of like your work and your use cases of these sorts of technology. Don’t disclose anything that you shouldn’t be disclosing. But, if you’re in a space where people are using a technology where they get a lot out of finding out how other people are using it too, then you create relationships with each other and then you create your own relationship with them. Another downside is you can get bullies, so you need to have guidelines. We can kick you out if you behave in one of these ways. And with panels, because it’s the company to the user or customer, and that’s it, it’s harder to find sort of like an intrinsic incentive for someone to participate in research. In a community you usually don’t have to have any incentives that are hoodies or cash or gift cards or whatever because the intrinsic incentive is that they’re getting value from other people being in there. And then every once in a while being able to talk to a product manager or talk to a designer or talk to somebody else. So, I think that – I don’t know if that’s something that will keep going, but I see it as a way, particularly in B2B research, to provide value for your customers that actually helps out your sales people and gives your product people and design people a way to interact with users and then also give the user support that they’re probably also getting over in the support forum, but a different kind of support because you feel like you’re part of a group.

Steve: It’s compelling to me to think about just blurring the different kinds of interactions that companies have with people.

Leanne: Yeah

Steve: Like you said, there’s marketing, there’s research, there’s support and we structure the companies around those kinds of functions and the tools and so on.

Leanne: Yeah. And it can make it hard to see the end-to-end customer experience. Yeah.

Steve: What – maybe we can rewind. Like how did you end up in research as a profession?

Leanne: Back when I graduated from college, in the early ’90s, I went to work for a startup cellphone company as a statistician. My background is in statistical computing and economics from my university education. So, I was analyzing what were then considered very, very large sets of data – terabytes of data. I was running a neural net program on a Sun SPARCstation. I don’t even think it had a gigabyte of RAM. Yeah. It had two processors. It was amazing. It would take like 4 to 6 days to run something, to spit out a model. And if I got it wrong I had to do it all over again. What I was doing was working in a marketing organization, and I was there to help them figure out things like: who should we sell call waiting to? Should we just try to sell it to everyone? Or should we target people? So, I was figuring out targeted lists based on behavior that we saw in cellphone use data, to help the marketing team understand who to sell things to. When they did that, what I realized was that I didn’t know why the people were buying it. Just because I could predict that people with these usage patterns are more likely to buy – why is that? And a couple of things happened then. The company I was working for got bought by AT&T, became what is now AT&T Wireless Services. When you get bought by a much larger company, a bunch of chaos ensues. Your job often changes. Your boss often changes. I was very young. I was in my early 20s. And at that exact time Mosaic came out and the web became graphical.

So, there were all these things happening at this point in my life where I was like, huh, this is my first job out of college, but now I’m interested in knowing why these things happen. I just taught myself HTML. I figured out how to put together a webpage. Oh my goodness, there’s this thing called Match.com and I met this girl and she lives in San Francisco and I want to date her. Oh, I’ll just go take this job at this company in San Francisco, this web company that has like 20 people and they’re making websites and they’re from all different sorts of places. And I became a QA manager there. Once again, sort of being asked, “can you do this?” Not, have you done this. And I said, “yes.” And then from QA realizing that we didn’t really know what the users were doing. And I, at this point, had never heard of user research. But I was friends with people who were doing that at different companies. So, I asked them about it. And then I left that company and went to a startup. Was also a QA manager. Got laid off when the company didn’t get its next round of funding and went and did some consulting, doing web development for the summer. Set up a company that I had intended to be a company that did QA for web apps, and somebody came to me and said, “can you do some research interviews with users while you’re doing browser testing?” And I said, “yes, we can.” And I was like here’s my chance. I’ve always wanted to know why people are doing things. And so, I happened into it that way. And in the mid to late ’90s anybody could do that. You could teach yourself PHP. You could declare yourself to be a web marketer. You could say I’m an information architect. You just said what you were, and you could just do it. So, I quickly went and found some friends and said, “I have this consulting project. I don’t know what to do, help me out.” And they were like well you do A, B… like the things I just mentioned. You have to do A and B and C and D. I was like okay. So, I did A, B, C and D. I did the project. I got paid. I thought it was amazing. I was like I want to do this again. I was like a kid who gets ice cream for the first time. Like, I will have more of that. And that’s how it got started. In consulting, what you need, from my experience, is one or two projects as reference projects, and then that helps you get the next project and the next project and the next project. And then it was 17 years later.
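The targeting work Leanne described at the cellphone company – scoring subscribers from usage data to build a list for an add-on like call waiting – is essentially propensity modeling. Below is a minimal, hypothetical sketch of that idea, with made-up feature names and a toy logistic regression standing in for the neural net she actually ran; it is an illustration of the general technique, not her pipeline.

```python
# Toy propensity-scoring sketch (hypothetical data and feature names),
# standing in for the kind of targeted-list work described above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Fake usage features: minutes per month, calls per day, missed-call rate.
X = rng.normal(size=(1000, 3))
# Fake labels: 1 = bought the add-on in a past campaign, 0 = did not.
y = (0.5 * X[:, 0] + X[:, 2] + rng.normal(scale=0.5, size=1000) > 0.8).astype(int)

model = LogisticRegression().fit(X, y)

# Score every subscriber and keep the top decile as the targeted mailing list.
scores = model.predict_proba(X)[:, 1]
target_list = np.argsort(scores)[::-1][: len(scores) // 10]
print(f"{len(target_list)} subscribers selected for the campaign")
```

A model like this can say who is likely to buy, which is exactly the gap Leanne points to: it never says why they buy, which is the question that pulled her toward research.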

Steve: Wow. You’re saying that there was a time when you could sort of declare yourself to be this thing.

Leanne: Yeah. When the web was new. When this new technology was coming out and consumers were grabbing onto it and companies were seeing that money could be made there and – people think now that there are more jobs than people, but then there really were more jobs than the labor market could supply, because the labor market didn’t have the skills and the education system wasn’t educating people with those skills. For example, in looking for research interns for this coming summer, I interviewed people who were getting degrees in user research. Like a Bachelor of Science in user research. And I was like wow, schools are now preparing people, and there’s a whole education system out there in these D Schools and I Schools and everything now that didn’t exist in the ’90s.

Steve: Right. I sometimes feel like oh I’m – ‘cuz I came up through the same system, or lack of systems, that you did.

Leanne: Yeah. If I wanted to do this job now – say, if I had gotten my Masters at the Berkeley I School, or whatever, and I went and got a job as like a senior researcher at a tech company, and I worked myself up, way up, it would be much, much different than my experience of teaching myself and figuring it out.

Steve: What’s lost or gained in that evolution, if anything?

Leanne: Well, I can clearly see that my sort of like – YouTube didn’t exist then, but it would have been like the university of YouTube, the way that I went about teaching myself how to do research and starting up a whole consulting agency around it – it would have been much different if I had been schooled in a particular way in how to do research. I think some things would have been lost because it wouldn’t have been sort of like me and a couple of other people just sort of figuring out how to do something. You’d come up with new stuff. But I think some things might have been gained. I might have more confidence – like I have a Masters in this, so I know how to do it. Or, I did my thesis on this and I am – yeah. There’s a certain amount of imposter syndrome that goes around, even if you have that education. But sometimes when you don’t, and you make it up yourself, you can doubt whether or not you really know what you’re doing. Fake it ‘til you make it.

Steve: I remember just being in a consultancy kind of around that period of time and that every prospective client that came in we would have to write a proposal for what we were going to do. And it just felt like writing a Master’s thesis every time. Or some just impossible thing to give birth to. Like explaining what it was, deciding the steps, trying to articulate those steps. We just didn’t have any point of reference and so we were sort of figuring it out every time until I think at some point we started to settle in and like oh, here’s kind of what – what it looks like. And now like writing a proposal is not hard. There’s many other hard things.

Leanne: But now you have templates and you can copy it.

Steve: And you know – I mean I have a narrative that runs through my head that oh yeah it kind of goes like this. We have more tribal knowledge as well as…

Leanne: Yes. More people to reach out to, to get examples from too, than used to exist.

Steve: Right. In this era, this consultancy, not only were we figuring it out, we were doing so in pretty much isolation. Because you knew a few people, but most of us at that era – there wasn’t a community of researchers to kind of connect to.

Leanne: Yahoo Groups did not exist yet. Google Groups was not there. Google wasn’t even here.

Steve: Yeah. This is great. It’s a very interesting conversation. I learned a lot.

Leanne: It was great chatting with you Steve.

Steve: I appreciate you taking the time.

Leanne: Absolutely.

Steve: Anything else? Are we done? Any last thing?

Leanne: It’s fantastic. Yes. Perfect ending.

Steve: Mic drop. Alright, thank you.

That’s it for this episode of Dollars to Donuts. Go to portigal.com/podcast for the transcript as well as links for this episode. At the website you can also check out all the other episodes, and subscribe to the podcast on iTunes, Spotify, and all the usual places. My books are available at Amazon and rosenfeldmedia.com. Our theme music was written and performed by Bruce Todd.

Feb 14 2019

Rank #4: 11. Gabe Trionfi of Pinterest

This episode features Gabe Trionfi, the Manager of Research at Pinterest. We discuss the evolution of user research, collaboration between disciplines and the journey versus the destination.

Great researchers really know the on-the-ground, in-the-trenches skills, tools, things they have to do to move forward with the group, but they also have to play this unique role where they pop out and get the bird’s eye view or the broader context and think about that, but not in a way that distracts the rest of the team. And then they port back down to the blood and the guts of what we’re doing every day in the trenches to figure out how can I take those observations that are abstract and make them useful for the team? So all research conversations for me waver between the two. They’re up in the air, really philosophical, abstract, but then also down on the ground, really practical, really technical. – Gabe Trionfi

Show Links

Follow Dollars to Donuts on Twitter (and Stitcher) and take a moment to leave a review on iTunes.

Transcript

Steve: Let’s start with the introduction. Tell us who you are, what you do, where we are today.

Gabe: Yeah, so my name is Gabe Trionfi. I’m the Manager of Research at Pinterest. I’m a researcher. It’s what I do. It’s what I want to do. It’s not the journey, it’s the destination for me. I’ve been doing research for about 10 years. I’ve been at Pinterest for about 3 ½ years now and we’re in the office, third floor, new headquarters for Pinterest. And we’re in Alberta, a room named after the Canadian province – that’s the naming theme here.

Steve: Very appropriate.

Gabe: Very appropriate.

Steve: We have two Canucks talking research.

Gabe: Absolutely.

Steve: So you made an interesting comment about it’s the destination, not the journey.

Gabe: Yeah.

Steve: So could you explain that?

Gabe: Yeah, I think for me when I think about research it’s my career. It’s what I want to do and I always want to be doing research in some way. I have the opportunity now at Pinterest to manage a team. It’s my first time managing a research team. I’m always going to want to work as a researcher in some way. I’m not looking to become a PM or a head of product. I want to be involved with research. I also think it’s interesting to think about a career path. Like I’m looking for a positive trajectory for my career, but whether it’s being a manager or an individual contributor, that’s not the most important thing. The most important thing for me is: are we working on something that’s interesting? Is there an interesting research question to be had? Is there a product that people are going to be using? That’s really important. Are we shipping stuff that’s making people’s experience better? I want to be a part of that.

Steve: But some of that isn’t research.

Gabe: Absolutely.

Steve: Some of that is the thing that research informs. In the context in which you’re doing research you want to have a good question, but you want someone to make stuff out of it.

Gabe: Yeah, so I think sometimes when we talk about – in design or creative disciplines like writing and research, people are like well you’re a manager now. Are you always going to be a manager? Or does it mean less if you are not the manager? I would much rather be – go forward and have a role where I was a researcher on something interesting that people were using vs. always choosing to be a manager on something that people weren’t using or was potentially not that interesting from a research perspective.

Steve: You mentioned to me before, outside this interview, that you have a history in theater. It’s almost like you want to direct, but you don’t want to give up your acting jobs?

Gabe: Yeah. Yeah. Or it’s all part of acting. It depends on how you think of it. I think it has – there are some really interesting parallels. So I had never managed a team or built a team. At Pinterest I was the first researcher. There was an opportunity where they said do you want to build a team? I said I want to try that. I didn’t know what it would be like and there’s been a ton of learning. Obviously when you’re first out of the gate you don’t know enough to be great, or even good, but you learn over time. The one thing that was interesting, taking it back to theatre, was I found that building a team reminded me a lot of casting a play, because when you’re really putting together a great research team it’s not just skill sets. It’s not just experience, but it’s also understanding how will these people interact with each other when we’re building knowledge together? How will they be perceived as a group, as individuals? It’s like casting a play: how will this actor relate to this other actor that we’ve already cast? Do they have chemistry? Do they represent the relationship of the characters well by the way they look and what they sound like? And that was really surprising to me. It’s been a really fun part about building the team at Pinterest.

Steve: I want to follow up on that, but I also want to go back. Let’s just go back to a few different things here before we lose them. You opened with this big idea about the journey and the destination so now I’m going to pick it apart until you wished you’d never said it to me.

Gabe: Okay.

Steve: Because I – I mean I feel like you’re describing problems that are sort of journey problems not destination problems if that’s a way to slice it. Like you cast a play, you can be expert in it and you can know all these actors and know what their strengths are and you can do the right kind of – what’s that audition where you see people play off each other. Like a screen test or whatever that is. You can do all those things, but it’s still an emergent thing. You don’t know.

Gabe: Yeah, I think for me, research as a destination is quite large. There are many things to be learned. I still think of research as early days, particularly in the context of technology and start-ups. And so I think it’s fast and deep and interesting, and sometimes I may talk with people who seem like – well this is how I’m getting into tech. So I have a social science degree and a little bit of background in psychology. I’m going to be a researcher for a while, but really the destination is to be a product manager or to be a VP or head of – and for me the actual research part is so interesting and there’s so much to learn and do. I definitely feel like it’s the beginning and that’s ironic of course ‘cuz you’ve been doing this for much longer than I have. When I worked at previous places like IDEO, they have people who have been working in research since, you know, the early ’60s in the context of…

Steve: I have not been doing this since the early 60s. Just for clarification..

Gabe: But in the context…

Steve: But the field…

Gabe: Yeah, the field goes back. You can even go back to World War I or World War II cockpits in airplanes. That sort of human factors, early ergonomics – that’s sort of the seed of a lot of this stuff. ’63 is when the Royal Design Conference, right, in England – the Royal Academy of Design – they start talking about research for the first time. Engineers start talking about design and research and really what it means. So it’s a discipline where the idea has been germinating for a while, but I think in practice every day it’s still fairly new, especially when you get outside the context of physical products and service design and you get into like pure software. The majority of people I work with here have never worked with a researcher before. And so that’s a challenge because you’re sort of still at the beginning and sometimes you can do really incredible work and people may not know that it’s that good because they actually haven’t worked with anyone before. And that’s sort of humbling and interesting and it’s a challenge at times, but it’s also a huge opportunity. So that’s interesting. I often talk about research being early, like – you know Joe Namath in, I think, Super Bowl 3 – who’s like a big time football player. He is the quarterback. He’s sort of glitzy and everyone knows him. He’s the underdog. He says they’re going to win. They win. He’s like celebrated on a bunch of magazine covers. But it’s like the ’60s, the NFL. So like he goes home and has a Schlitz or something, right. Like he kind of has a regular life. Fast forward to the ’80s. It’s Joe Montana, right, for the San Francisco 49ers, who wins a bunch of Super Bowls. He then – he’s a big deal, like Joe Namath, but like he goes to Disneyland afterwards and he gets like a lot of money and he’s super wealthy and commercials and whatever it is. The difference between Joe Namath and Joe Montana, we can argue about it as athletes, but the main difference is timing. When were they the quarterbacks of the Super Bowl winning teams? And sometimes timing, when you’re at the beginning, that’s just a constraint. It’s just part of being at the beginning and I think that research has a long way to go as a discipline and there’s a ton of opportunity, but like if you’re a researcher 10 years from now you’re going to be standing on the shoulders of schlubs like ourselves who had to do it in the early days. It will be way better then because people will understand it and also the market will shift in a way that will require research, or something like research, to inform product development. Because it will be more competitive. There will be more services and apps within the same space that need to differentiate from each other. So timing, where we are.

Steve: Where are we between Namath and Montana?

Gabe: That’s a good question.

Steve: Is there a visual continuum here and we could get the whiteboard out?

Gabe: Yeah.

Steve: Let’s just challenge that. A recent issue of the New York Times Magazine was all about work – so this article, this article, this article. There was one – it was really about people eating at their desks, but the sidebar piece – unfortunately I don’t remember the name of the person, but the sidebar that kind of goes with the article, that supports this photo essay, says so-and-so, an ethnographer, as like the first thing. And it was not an article about oh these people are like anthropologists in safari hats going into corporate America. They didn’t do that thing that journalists were doing 3 years ago and 15 years ago, which is ooh, look how weird this is. They were really just – it was sort of incidental, and it characterized the fact that like hey there’s an expert who’s watching people eat lunch at their desks. To me the cultural moment there is like this is a thing that we don’t have to just point to and go oh-ah, we can use it then to tell another story. So that seems like we’re – I don’t know if we’re taking this person to Disneyland, but we’re not having Schlitz.

Gabe: Yeah, so I think this is an important point which is – like in research we often pay attention to context, so my context today is within tech, within start-up land. I think when we look at research trajectories in terms of its development as a discipline it actually tracks differently across different contexts. So I think when you look at physical products, service design, I think that industry what’s happened there, that collection of industries with research is far ahead of where we are within tech.

Steve: You think tech is behind?

Gabe: Well I think tech is tracing the trajectory. To me it looks the same and I think it’s headed in the same direction, but I do think it’s at a different point much like just the industry, the current modern industry, like post-Internet, apps, software design – it’s fairly new itself, right. And so with a lot of new industries, especially industries that have access to a lot of data, right, even the history of psychology itself has had moments like this where people think it’s all about this type of data, this quantitative data, right. That’s how we can – we’ll find the future opportunities within that and then we realize like well actually one type of data may not be enough. We need different types of data. And then we might switch and say like well I’m going to do this other thing, or I’m going to do this other type of data acquisition or method, right, and then really mature fields, they come to usually a realization which is like yeah there’s no one thing that’s going to do it for us. We need multiple things to try and understand complicated phenomena and then within that, especially within the world of business and tech, then you’re still going to have make a risky call because again none of this data can tell you the future. It can paint out trajectories and possibilities, but you still have to take a risk. I want to work at a place where someone’s going to be able to understand the data, triangulate and then make a strategic decision that’s going to lead to a better outcome for the people using it and the people working on it. So I think that when we get over – again, some disciplines, some industries in the beginning there’s things that seem easy that are a lot more complicated than they seem.

The example I always give is like the common thing is qualitative versus quantitative data which of course they both have subjectivity embedded in them. They both have trade-offs. They probably should be used appropriately which is what’s your question, then let’s choose a method that fits the question. But the metaphor I always say is like I would never hire a general contractor to build my house who is like I only hire carpenters who use hammers. There’s no screwdrivers on my site. It makes no sense because what should determine the tool is the rest of the context. So let’s not use a hammer on a screw. Let’s use it for nails and let’s use a screwdriver for screws. And that’s the same thing for me about methods which is it doesn’t matter what the tool is. What matters is the context and what is the appropriate design to get at understanding the question and also the outcome goal. What is our goal? What are we going to do with the answer? Those are the things that are important to me.

Steve: So you could start to sort of map this trajectory of research and tech and look at, well, what were the predominant tools – if the tools are a proxy for what questions are we asking, we might just have people showing up with hammers. I remember the point at which Microsoft got usability. Like that was going to change – and this was, I want to say, the early to mid-’90s, and there was a lot of press about, like, the physical rooms they were building. I think I went on an interview there and went on a tour there and you realized oh, they’re just going to like run, like huge – I mean they’re going to do usability testing at scale. You know the word that we like to say in research, or everyone likes to say in tech I guess, about at scale. They were really going to scale the hell out of this and sort of usability their way into something. And there wasn’t a lot of – I mean that was leading edge thinking back then. And I think as you’re describing this trajectory or this changing context it’s about the tools, and the tools are the thing that we can see more easily. I don’t know what design questions were being asked of Microsoft, but I can infer based on the fact that this is where they put all their money. So I think you’re sort of describing – a long time ago already, you’re saying, now we’re like starting – we’re at a point where we need these tools and we need to be asking these questions. And so – I don’t know. Is that fair, the recap?

Gabe: Yeah. And also sometimes the context is like within software and apps the marketplace actually comes to bear on the types of disciplines you need to invest in to differentiate. So if we’re in a marketplace where everybody has a really high quality engineering team, everybody has a really high quality design team, and we start stacking the disciplines up and you’re going to enter in sort of the messaging as UI which is really popular amongst a suite of apps today, right, where you’re just – it’s a concierge service or an on demand service that is using the interface of like a traditional SMS app. There’s a bunch of them out there, right. They all have great designers. They all have great engineers. They have great marketers. They need to understand how to differentiate from the other 5 apps within their space and they’re going to have to invest in something. They could invest in partnerships with other companies. They could invest in celebrities, right – to sort of like build their awareness. They could invest in researchers to try and understand what is most meaningful about this experience that they’re building and what are the opportunities to separate themselves from the other within the space. So some of it is disciplines and methods over time. Some of it’s the market itself becoming more competitive and sort of more compressed in terms of offerings out there. Traditionally design has been used in recent years to separate or differentiate, right. And I think research follows design and if you don’t have a designer it’s really rare that you have a researcher because design is working on the experience on the front end of software that people see and think of when they think of an app or service. If you’re not investing in that you’re probably not investing in like a deeper understanding of your audience’s needs as a way to differentiate. But like the classic examples – like Target hires a famous designer to make like toilet bowl brushes that are really pretty. That’s a way to differentiate from others within their space who are smell – who are selling commodities like that. I think research can – will and is having the same sort of role within tech as more and more apps and start ups start to fill into the same spaces. It’s not about building the infrastructure, say plumbing the Internet with social connectivity. It’s about like okay there are 5 financial service apps now that are redefining the loan space, how are they different from each other and how can we ensure that ours is the most relevant to our audience? Let’s get a researcher in here to help with that.

Steve: I think what I heard you say though is at a strategy level there’s a lot of different things that we could do if we’re trying to differentiate our financial app. So we could try to get Leo and his posse to use that. Or whatever sort of way to generate attention. We could have a controversial CEO that makes outrageous comments about people. Or we could change our pricing model. Or we could – there’s a lot of things that don’t necessarily come from a nuanced insight about the world. It’s just sort of differentiation for its own sake. Or there’s a strategy there and one of those out of – let’s say there’s 20 obvious ones – and I can only come up with 2 right now – I’m just parroting you back. So one of those might be let’s invest in experience which means investing in design, investing in research.

Gabe: Also writing. Writing is, to me, a similar discipline to research within tech now. It’s newer. Usually it’s working across many different teams within a tech company. It’s critical – like what is the voice of your brand, but also of your product itself. And I see a lot of – I work with a lot of writers here. Tiffany Jones Brown started as the lead of the writing team here, and has since become a Creative Director. She has a great team of writers that I’ve worked with before. Building a relationship with that team, I feel just a commonality or similarity in terms of where writing is as a discipline. And the – if you invest in writing it says something about your strategy when you have a tech oriented app or software service. It says like hey we want to make sure that people feel differently about our brand or about our product. And you see a big differentiator in lots of tech companies now in terms of how do they talk to their users through their product. And so again it’s just another strategy that you could invest in. And research, and a suite of different research disciplines, is part of that. Also data science can be part of that. Business analytics can be part of that. Ways to, you know, bank capital, not just monetary but like understanding. Like you can bank a lot of knowledge as capital to inform your strategies or to help move your product forward.

Steve: If I take a naïve look at what you’re describing I feel like there are some disciplines that are output disciplines and there are some that are input and by that I mean, you know, writing is crafting the words that go on your website. Engineering is the code. Design is the pixels and every designer – everyone is rolling their eyes at that. That’s a really, it’s a very broad brush, but research is an input to decisions. Business analytics is an input to decisions. I don’t know if that’s a fair – if it’s that binary or not.

Gabe: I think about it similarly. So I think about it as a suite of disciplines that are about indirect influence. So I’m a psychologist. I’m not going to build something with regards to the software that’s going to get into the world, but what I’m building is understanding or knowledge, but also sometimes creating experiences that are inspirational. I talk a lot about information and inspiration as part of the research role that is going to indirectly influence that thing that enters into the world. Sometimes there’s a one to one. You can just say like hey that decision there was based on this work over here. Sometimes it’s much more diffuse in that the whole host of teams that are actually contributing to the output – the most important thing is I want to work in an environment where research is a part of that, whether indirectly or directly, and it’s not something that’s on the side or separate. I often talk about like Sherlock Holmes and Dr. Watson. Like I’m very happy to be Watson to someone’s Holmes because I think that’s a real role and those Holmes stories wouldn’t be the same without the indirect influence of Watson, right, sort of guiding, informing, being a foil for Sherlock as he goes ahead and builds the solution to the mystery. And I think indirect influence is really interesting. Really hard to do. It – sometimes it’s easier just to like come in and say we should do this and then people do it, right. But to sort of like change people’s perspective on something they think they know very well, but actually might be very different outside of their own social circle or their own sphere of experience – let’s say on the other side of the planet there’s a different culture, different country, a different type of person who’s using your product and perhaps they just have a different mental model that you didn’t even know existed. Bringing that back, but also not just a shock, but to potentially influence, like oh hey perhaps we should think about this a little bit differently. Or perhaps it’s another way to think about it, that there’s not just one way to think about our product, there’s actual multiples that we should keep in mind. And that’s a good thing because it means the product is relevant to lots of people and complex and so it’s not just a simple like P2P commerce, like I’ve got this thing, do you want to buy it? Yes you bought it. Cool, transaction, safety, all that sort of stuff. It’s more complicated than that in some cases.

Steve: So I think research has suffered from sort of the naïve view of Dr. Watson – well, not that Watson himself is naïve, but if you take a shallow look at those stories Watson is merely the chronicler. He’s the sidekick. He doesn’t do anything. And – you know, you do the close reading that you do and you see what he enables and how he kind of supports, and I think that’s the beauty of those stories. You kind of go at them a little bit deeper and you realize there’s a complicated thing going on and that it – the more you invest in learning what those stories are about, the more they kind of pay off. So just to bring it back then to research – research has maybe historically been viewed the way we naively view Watson, that research is just asking people what they want. Sort of chronicling things and reporting them back. And research as you’re articulating it, and of course I agree, has the potential and does more often now play more of that influential – maybe Holmes is the one that gets brought to Disneyland. I’m going to crush all metaphors together. Like Holmes is the one that gets carried on the shoulders but the researcher is – ugh, I can’t untangle this here. So what I’m trying to ask, in a really twisted way here, is, you know, what are the things that you’ve seen or that you’ve practiced to give you the chance to play Watson, to enable things, as opposed to being relegated to a mere chronicler?

Gabe: So, I think this is an important question. So for me I had one experience – my sort of first design research job was at IDEO, and IDEO is sort of at one end of the scale: design thinking to an extreme. Huge opportunities to do different sorts of work and to really be part of a deep, embedded process. Then I worked at Facebook which is an engineering company and it has a very strong engineering first perspective. So when I was thinking about my next destination I was thinking about what lies in between? What are the sorts of things that are out there? And there were a few companies, and I think at the time the commonality is they all had founders who were designers. And I think when you have a founder who’s a designer there’s an awareness of what research might be. There’s an awareness of the Watson role and its power in terms of creating new ideas or – and I think you want – I wanted to try and see if I could work at those places. Pinterest was one of those places. So Evan Sharp is a designer. He’s one of the co-founders along with Ben Silbermann. And so Evan knows about research. He’s not an expert in it. That’s important, but he’s open to it. He’s interested in it. And so that’s why they brought me in. There were probably about 70 employees when I joined, which at the time some folks thought well that’s early for a researcher, but if you understand what research can do it might not be that early, and in fact in the broader field we see now – I see startups all the time with a handful of people who are bringing researchers in when it’s relevant to what they’re doing. So to tie it back to the metaphor, to me it’s important to be at a place where the Sherlock wants the Watson to be there, because a lot of times that may not be the case. And so Watson’s ability to contribute is limited by the partner’s desire for input. I think at Pinterest there’s a big desire for that. It’s a really interesting place to do research. Very open, very focused on the audience that we’re trying to reach. In part it’s probably due to – you know, companies have lots of values. Evan will talk about this. I think the one, the value that I think is really unique at Pinterest, and differentiating, is something called “knit” which is how we talk about interdisciplinary collaboration and respect. It’s understanding that there are many people who can contribute to successful outcomes at the table and we should understand each of their disciplines, strengths and weaknesses and we should leverage their strengths. So “knit” is very Pinterest – knitting, right, like weaving together disciplines. There are a lot of startups where that sort of multidisciplinary approach is not part of what they do, culturally or strategically, and it’s a huge part of what’s happening at Pinterest, whether it’s Community Ops, Design, Writing, Brand Design, Marketing, Research. There’s a whole bunch of disciplines – Data Science, Business Analytics – that are all working together very closely – Project Management, Product Management – to sort of put together what it is Pinterest does. So that context, that environment that’s willing to accept what you have to offer, but also try to embrace and really use it – that’s important for research, and I think that’s a choice you have as a researcher. If you seek those places out you are in part prototyping or designing your own experience.
Versus going to a place where you might know from the outset they want a researcher, but they don’t know what researchers do, and they’re not certain that they will use the research. And that’s usually pretty clear in the hiring process or early on. And I think that’s not to say it’s a lesser experience, but it certainly is one you’re choosing as a research practitioner in terms of your daily experience.

Steve: Or you should be aware of that?

Gabe: Yeah.

Steve: And you’d do well for yourself to know that that’s what you’re choosing.

Gabe: Yeah.

Steve: I think I saw an artifact of the knitting culture. In the men’s room there’s some Python script coding advice hung over the urinal and the title for it – I don’t remember what it said, something about like python knitting or something.

Gabe: I think it’s called – officially those posters are called knitting in the bathroom. And there’s someone – an engineer here – there are, I think, a couple of them working together to pass on tips and tricks for certain software languages by putting those up in the bathroom stalls.

Steve: It’s not the only company where I’ve seen like technical advice in the bathrooms.

Gabe: Yeah.

Steve: But the knitting – I didn’t understand the knitting reference until you kind of explained what that means culturally.

Gabe: Yeah, absolutely and I think it’s – you know again there’s a lot of great – we have other values at Pinterest that are great. But I think that one is really unique. Again this idea of – for me it hearkens back to some of the experiences I had at IDEO. This emphasis on multi-disciplinary approach to design challenges. I think it does lead to different outcomes. My background, I have an undergraduate degree in theater and I care a lot about process and how we’re doing what we’re doing together and so again that part of Pinterest and being a researcher at Pinterest is real exciting.

Steve: So we never resolved this. Again, I was kind of picking at you at the beginning when you talked about journeys and destinations as things that you – and so I think I understand as you explain more that being a researcher is a successful state for you. It’s not a stage to get to something else. But when we kind of break the world up into journeys and destinations I mean you – theatre is about the journey, right. The payoffs are – and you said someone who cares about process is someone that cares about the journey. So I think I got hung up a little bit on how you were using that phrase and I jumped all over you, but I appreciate finally we have – finally I have clarity. So thank you.

Since we haven’t talked about this at all can you explain what Pinterest does?

Gabe: Yeah. So if you listen to Dollars to Donuts you’ve heard that Pinterest is the world’s largest catalog of ideas. The question is like what does that mean? And I think at that – at the heart of that is like an interesting conversation for researchers, but also an interesting characterization of the opportunity and challenge Pinterest has. So Pinterest is very different than a lot of the apps or services or companies it’s often compared to. For a long time the press sort of positioned Pinterest as another type of social network. Pinterest is not a social network. It’s a very personal experience for the people who are using it. It involves things like saving things that you want to then go on and do or plan or make or cook or buy. But a lot of – when you do research with Pinterest you are really deep into it. They see it as this very personal moment. Sometimes people will even – again it has to do with discovery of content, the things you care about. Sometimes it has to do with learning. There’s lots of stories here from Ben our CEO talking to people early on where they would say to him anecdotally like I didn’t know I had a style. Like I didn’t think that was part of who I was, but then I started pinning clothes or things for my home and one day I was looking at my board about my living room – you know best living room, or like my dream living room – and I realized like wait a second, seeing all those things together it all sort of snapped for me. Like I’m – this is actually my style. This is what I’m into aesthetically. There are a lot of moments like that I think for Pinners because it’s very personal.

Steve: A Pinner is someone that’s using Pinterest?

Gabe: Yeah, someone who is using Pinterest is a Pinner. Someone who works at Pinterest is a Pinployee. We’ve got a lot of pin puns going on here, at the Pin Factory, which is what I call it.

Steve: We’re going to edit all those out.

Gabe: So back to the point, I think Pinterest is personal. It’s done in a way where it’s in a public space, but it’s not social in the same way many social networks are about broadcast. Like this is my identity. This is what I’ve done. Pinterest is really about – it’s about aspirations. It’s about anticipating. Like I want to do this trip so I’m planning it. And then it’s about participating. Participation on Pinterest is not within the digital space. It’s within the analogue, the real world. The part where you have this pin for this recipe and then you go cook it. You make it, you serve it. Whereas other social networks are often about broadcasting in the moment, like real time, or they’re about broadcasting this happened. So they’re sort of reminiscing or looking back on the event. We see Pinterest used to plan, you know, Thanksgiving dinner in the US. You start seeing people pinning Thanksgiving recipes months in advance. Halloween costumes is a big – DIY Halloween costumes is a big thing in the US on Pinterest, and people start looking for how to put together their dream costume, the best costume, the cool costume, like 2 months in advance of Halloween. So Pinterest is a personal experience. It’s about planning to take action in the future. It hopefully supports you taking that action because you have the information you need. It’s very different than a lot of other services. So what does this have to do with the catalog of ideas? I think the important bit is we see in all of our research – market research, qual, quant – that this idea of a place to find ideas is differentiating. It’s useful, it’s fun and people enjoy that. The challenge is when you sort of – you have to contextualize that. From a marketing perspective you have to say, you know, okay, what about a place to find ideas? A place to find ideas is not enough context. So there’s been different ways we’ve approached that. For some period of time we talked about bookmarks. For some period of time we talked about – more recently we were talking about a physical thing like a catalog where you go and browse and discover lots of ideas. The key thing is the ideas bit. We know that’s important. The question is how do you contextualize that, not just for a US audience, but actually for a global audience, and that’s really challenging. Pinterest itself is based on, you know, a metaphor of a pin and board – pinboard. The challenge is that that doesn’t actually internationalize. Not every country has a tradition of a pinboard with a pin. And so how to provide context for what this thing is – and it’s not – you know, it would be easy if it was just like oh, it’s a social network for, I don’t know, saving ideas, right. It’s not that though. That’s not really the core. And so how do you contextualize that? That’s the challenge. So as we think about different ways of messaging to audiences, especially audiences who are not using Pinterest today – there’s a ton of research we’ve done. There’s a ton of research to be done as we look for that really killer way to describe a complicated and robustly powerful product to people who don’t know what it’s about. And that’s an opportunity for research, whether you’re a market researcher or quantitative researcher. And it’s part of what the team has worked on and will continue to work on with our partners in writing, marketing and then just even overall strategically, like the decision makers of Pinterest. How do they want to – they have to go and speak to the press.
This is something they’re saying all the time. They need to feel that it’s accurate, that it’s representative and that it really gets at the – the core or the central sort of essence of what they think the product is and will be in the future.

Steve: So I asked you what the product is and you describe sort of the research and strategy journey of trying to describe it – to answer that question in the way that sort of has the most benefit for Pinterest and current and future Pinners.

Gabe: Yup.

Steve: But I’m going to ask you again, like – I mean as you say people are out there now talking about it. So you kind of said – you took the phrase catalog of ideas and then you unpacked it into all this wonderful stuff which I think tells us what kinds of things you’re thinking about and what your work is meant to address. But just for that person that’s scratching their head right now and going like okay, but what is Pinterest?

Gabe: I gotcha. It is a place to save ideas digitally, to discover ideas related to the things you’re interested in and then to go and do the things that you have saved and discovered. So it’s about saving, discovering and doing.

Steve: So what kinds of digital information are people saving?

Gabe: Well I think we’re limited to digital today, but it could be many, many things in the future. It could be saving many types of objects. Right now we’re within the space of like the Internet. We see the – the most common use cases are we see a lot of folks use Pinterest for recipes. Planning daily meals is a huge part of what Pinners do. Also like discovering or trying new recipes, especially if you’re into that. It’s often about doing, right. Another popular use case – trips and travel. Like right now you want to go on a trip so you go to various websites and maybe you have a folder on your computer or a piece of paper – you’re writing down like where do I want to go when I’m in Hawaii? Where do I want to stay? You can save all those things to a board on Pinterest which is similar to a folder, but it’s within the bounds of Pinterest itself. You can save those links. You can find them on Pinterest. You can find them off of Pinterest and save them to your Pinterest. So that you can go to one board and have all the ideas of what you want to do on your trip when you go there. And you can access that when you’re actually on your trip, right. We also see a lot of people who use Pinterest for fashion, whether it’s discovering new trends or thinking about inspiration for what they want to wear. In fact there was a story this week in the – oh gosh, I can’t remember the newspaper – and this writer had talked about – she let Pinterest choose her outfit every day. So what she would do is she would take out of her closet an article of clothing she had – let’s say a pencil skirt. She would go to Pinterest and type in pencil skirt and she would look at the most popular street style or fashion images of people wearing pencil skirts. Then she would find the one she would like and then she would mimic it with her available wardrobe. So every day she let Pinterest dress her. Again, that really critical thing there is 1) she was looking for ideas, 2) she was then doing it and what was also interesting about it is that she was using Search. We see Search as being a huge part of a lot of Pinners’ experience in order to find those ideas. But Search I would say is a little different on Pinterest. The Search in this case, and what we see across many people using Pinterest is it’s not to find one idea that’s like the right idea. We don’t have a ranking algorithm that puts the right idea at the top. When you’re in a discovery system or a discovery mode you can search for outfits with pencil skirts, but there are many possibilities in front of you. You are looking across this set of data or images that’s about discovery and you’re trying to look at not just the best, but there are some that are good and there are some that are even better to make or to learn or to discover something that you want to then go do.

Steve: So you described before this – you know your own professional path a little bit and that you sort of found Pinterest as a company that was in between maybe other places you’d worked at in terms of what was research and how it was understood. But as you’re talking about Pinterest and what’s going on and what you’re interested in exploring, there’s also a unique class of problem that I hear you kind of digging into with joy and I don’t know if that’s the culture or you or what. And that is we don’t know quite exactly what this thing is and the best way to talk about it and it’s changing and people are inventing new things for it and the world is culturally diverse. So the fundamental, like what is Pinterest and how do we describe it and how do we build for it is – like that’s a hard problem. It’s a dynamically shifting problem. I mean did you know that’s what you were getting into?

Gabe: Well I have two answers. So the first one is like – the first one is sort of addressing the like what is it, it’s dynamic – like I think there’s something really interesting about Pinterest which makes it unique, which is again opportunity, but also challenge, which is there’s a utility element to this, right. It helps you to get things done. It helps you to find the things you want to do. But it’s not just a utility. It’s not a task based software. It’s not about lists per se. Although you could visualize pins and boards as lists, right. But it has this other element that I think is related and unexpected which is it has this way of conveying or inspiring people to think about possibilities. To discover things that they didn’t know about and to learn something about themselves. And it does this in a way that – we know from research that people really enjoy the aesthetic presentation of this information. They think it’s beautiful. And so, you know people can get – we interview people and they talk about my Pinterest time, right. They’re on their couch on a Saturday morning. They’ve got TV on in the background, maybe some Netflix. They’ve got a big cup of coffee. They’ve got like their phone or perhaps their iPad, and they’re browsing through Pinterest. And some of it’s about what they want to do. Some of it’s like what might I want to do? This sort of anticipation, aspiration. This sort of beginning of the experience. And Pinterest puts together these two things which are traditionally separate, right – the sort of exploration/discovery, aspiration/inspiration with like utility. And so that’s a tension within the product that’s really interesting. I think it draws people in, but it can make it really challenging because it would be much easier to do one or the other and that’s not where Pinterest is. It wants to do saving and doing, but it also wants to do discovery. And of course the more saving you do the better Pinterest hopefully gets at making recommendations to you about things that you should check out based on what you’ve done. That’s definitely an interesting part of the product and the tensions within the product, or opportunities and challenges.

Secondarily you were talking about did I know that when I was getting in here? I was super interested in discovery systems. I think discovery systems are a really interesting part of the technology sort of landscape at this point in time. As part of discovery systems it’s like moving away from basic search which again is not basic in any sense, but many companies have been able to perfect the “I need this thing – help me find this one thing.” I call that like the needle use case. Right, I need the – help me find the needle in the haystack quickly. That’s impressive and it’s great, but it’s also now become expected. And so when you move into – it used to be called Search and Discovery. When you move into discovery systems that play up the discovery experience first suddenly, yeah you have to be able to help 5 people find the needle which is like perhaps one small moment within a user’s session, but actually the use case of looking through the haystacks is really interesting to people, right. So they’re looking through like haystacks in a field right and they’re like well what’s in this one? What’s this one about? How is this one different than that one? They’re exploring and discovering things that are interesting to them. They spend a lot of time doing that. And then they get to the needle. That’s something that I was already thinking about before I joined Pinterest and I could see Pinterest as having a unique position within that space. The one thing that’s related to that which we don’t talk to users about and it’s a super – I’m not an expert in this, but it’s an intriguing idea – is the notion of the interest graph. Are you familiar with the interest graph as a concept?

Steve: No.

Gabe: Okay, so social graph is sort of like, you can think of that as a subset of the interest graph.

Steve: What’s a social graph?

Gabe: The social graph is basically let’s map out with technology all of the people and their relationships to other people. And what’s interesting is like that’s been a very disruptive technology moment in the world today. But what’s interesting about it is that’s a big task – don’t get me wrong – but it’s finite. There’s a certain number of people on the planet at any point in time. Computers can count them, right, very quickly and probably count them real time if they have the information. The relationships between those people, they’re finite. You can measure them. You can ask people like we think these are people you might know, do you know them? Yes I do. No I don’t. That’s interesting, but again – huge in scale, but finite. The interest graph resides over top of the social graph and it’s this concept or theory that we could map out the relation between ideas, objects and people. And that’s really interesting to me and that’s a place I think Pinterest has a unique place to play in within the context of the other technology companies. What’s interesting about it is that the hallmark of humanity is every day we make new ideas, objects and people. So the interest graph is infinite. And if you had a technology that could map it out, so it could tell you which ideas are related to each other, which objects are related to which ideas, which people are interested in which ideas, right. You can do really cool things that are technically cool, but actually experientially cool. Something that actually would make people’s lives better which is most of the interests I have today are limited by what I’m exposed to. Like I only can know about the things that I know about, but if I had a technology that could tell me about something happening on the other side of the planet – in Germany there’s a new type of music scene happening that I don’t know exists today. I don’t know anyone who knows it, but if the technology knew that I would love it and could tell me about it perhaps I would discover it years before I would naturally. Or perhaps I would discover it when I wouldn’t ever have discovered it on my own. And I think that’s really cool.

What Pinterest does nicely in that is that everybody is building this sort of interest graph rocket ship, this technology, but the fuel actually makes a difference. So on Pinterest today when people put things together in collections or boards they save them – it’s them. It comes from them. They’re saying something semantically about the objects that belong together to them and have meaning to them. They name that board or that collection, and that fuel is 100% pure interest. It’s really what I care about. Whereas other companies may have fuel that’s somewhat representative of that. It may be representative of a lot of other things. It may have more impurities in it. So I think Pinterest is uniquely positioned to sort of deliver a consumer experience on the interest graph that I’m really excited about. I think it’s a really interesting concept and so one of the reasons I came here was discovery system, interest graph, recommendations. Like this is a pretty interesting moment in time for that sort of technology and one that actually has a real human experience on the other side.
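
To make the interest graph idea a little more concrete, here is a minimal sketch in Python – purely illustrative, not anything Pinterest has described – of the reading Gabe gives above: ideas that a person curates onto the same board are treated as related, and those relations can then suggest ideas the person hasn’t seen yet. All of the names, boards, and data below are hypothetical.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical data: each board is one person's named collection of saved ideas.
boards = {
    ("alice", "weeknight dinners"): {"sheet-pan salmon", "veggie stir fry", "ramen"},
    ("bob", "tokyo trip"): {"ramen", "tsukiji market", "capsule hotel"},
    ("carol", "noodle recipes"): {"ramen", "veggie stir fry", "pho"},
}

# Build idea-to-idea relations: two ideas are related if someone curated them together.
related = defaultdict(set)
for ideas in boards.values():
    for a, b in combinations(ideas, 2):
        related[a].add(b)
        related[b].add(a)

def recommend(person):
    """Suggest ideas related to what this person has saved, minus what they already have."""
    saved = set()
    for (owner, _board), ideas in boards.items():
        if owner == person:
            saved |= ideas
    candidates = set()
    for idea in saved:
        candidates |= related[idea]
    return candidates - saved

print(recommend("alice"))  # ideas that co-occur with alice's saves on other people's boards
```

The point of the sketch is only the shape of the data: boards supply the “fuel” (human-curated groupings of ideas), and discovery falls out of walking the relations those groupings create.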

Steve: I love that you ended with the human part. Like that was the last thing that you said in this really amazing kind of romp through some big ideas. And I was sort of reflecting as you were talking that, you know – I mean we’re having a conversation today about research. That’s sort of the context and you’re talking about research, I think, I don’t know, in a unique way. I don’t know if you feel that. I’m kind of laying that on top of you. I mean how is this a research conversation? I’m not forcing you to defend it, but let’s just go meta, like why are we talking about this on Dollars to Donuts?

Gabe: Well I will say that for me when I think of research I think of it as a discipline. Like as a blue-collar type discipline. I don’t think about – it’s a set of practices that you can learn and I think that set of practices can be applied to certain businesses or contexts. And so I don’t think of myself as a – I don’t believe in like super empathy powers. I sort of react negatively to that. I do think research is one way, or one discipline that can contribute to product development, design – service design. There are many that can do it. It’s not necessary to do it, but it can be useful. And so it’s a conversation about the context that research can play in, but also just about what are the duties, the set of practices – what are the things that researchers might think about in terms of their contribution and what they bring to the table. Great researchers I think they really know the sort of on the ground, in the trenches, skills, tools, things they have to do to move forward with the group, but they also have to play this unique role where they pop out and get the bird’s eye view or the broader context and think about that, but not in a way that distracts the rest of the team. And then they sort of port back down to like the blood and the guts of what we’re doing every day in the trenches to figure out how can I take those observations that are abstract and make them useful for the team and actionable for the team. So all research conversations for me, like sort of waver between the two, right. They’re up in the air, really philosophical, abstract, but then also like down on the ground, really practical, really technical. And so that’s probably why we’re having a conversation about like how did you choose where to go next and also some of it was like well I wanted to be in a place where they were going to care about the work that I did, but also there was this huge like theoretical thing that I could learn about in the context of their products.

Steve: And moving back and forth between the tactical, the blood and the guts, and the big picture, the philosophical. I don’t know, I’m guessing that – and just sort of watching your face as you talk, that not only do you like those end points, but I feel like, and maybe this is my own thing coming in here, there’s something about – and this I guess is the journey thing coming back in as well, but there’s something about the motion between those two kinds of thinking that I think for me sparks some stuff. That we can talk about the big picture – I mean you can really talk about the big picture, but sort of the energy we get from that vibration, from the well what does this mean, how do we deal with the blood and guts kind of – with the team every day. That – I don’t know. I’m throwing that out there. It’s a very leading question for you Gabe, but…

Gabe: I think there’s certainly like personality differences that attract certain type of people to certain roles, right. And so there’s probably something about that that appeals to folks who are doing research for sure. I also want to make sure to like be clear which is it’s really important to me that as part of that process we’re shipping something. That I can actually see something in the world that I’ve contributed to. Otherwise I would have then stayed in academics. I would have been working at the theoretical level and for me there was something really exciting about work I did at IDEO where I worked on a furniture project and I’ve seen that furniture in the world many, many times. That makes me feel like there was – it’s not just about ideas, it’s actually about building something and doing something. And that’s I think important for me when I hire researchers or when I think about the type of environment I want to be in. It has to be – it can’t just be interesting. It has to be impactful. It has to be action oriented. There has to be something that happens at the end more than an idea. And so that to me is like I like that fluctuation between the tactical and the abstract, but to an end, and that end is to put something into the world that makes it better for people in some way, whatever the “it” is.

Steve: I really like – I agree with you that stuff being interesting is sort of not an end to – it’s really great to be sort of turned on by stuff and like getting new ideas. You know for me trying to put it in a language – I mean what’s the good and bad word now – the A word – actionable. That you know – I don’t know, people go back and forth on whether that’s a good word or a bad word, but I mean I guess in my own career I had – the ship had sailed a long time ago on like measuring my own success by what shipped. And I hear from researchers as well that sometimes it’s frustrating. That – and so I’ve just sort of shifted my own role. Like I’m in the business of helping people see as clearly as possible what decisions they have to make and what actions they have to take in order to ship something, but I can’t – and maybe this is because, you know the difference between being inside and outside of the organization, I can’t measure my own success by what gets shipped or I would just be an alcoholic. But I do measure my own success by – I mean someone said it to me once like oh, what you do is kind of like take someone to the doorway and kind of like open the door and like point them kind of on the way through.

Gabe: I’m probably using like very tacky language when I talk about ship, but I – so to be clear, shipping something can be pretty broad. What I mean is there was a decision made. That there was some sort of result or outcome of the research. And so definitely some of the hardest things I’ve worked on were the shipping moment was convincing someone not to do something. Like that is just as important as convincing someone to do something, in fact it may be harder in some cases because they’ve already decided, no. No, we’re going to do this thing, we’re going to make this product and helping them to understand how that might be a destructive decision for their business. That’s a type of shipping, right. Shipping a decision. So yeah, maybe I’m overly emphasizing the analogy.

Steve: No, I think I’m just one sort of rung before that which is like I think for me I probably ship influence. I empower people to make a decision, but I can’t actually control if they’re going to make that decision. So sometimes you sort of – you learn something, you package it, you convey it and they need to do something. They need to do something. Like the business is just – the world has changed, the business has changed. They need to do something and the other factors are just so large that sometimes doing nothing is the outcome, but you know that you changed somebody’s mind. You gave them – and I’m not trying to undersell what I do. I mean I’m trying to be realistic like I just don’t get to ship decisions as much as I would like to even. I’m going to go drink after this. That’s what you’ve taught me today.

Gabe: Well as long as I’m going with you it’s fine. I think – so one thing you brought up is control. Like I personally have had moments of development where I do not – I try to focus on what’s in my control. I can’t make a team do something, right. They actually have to make the decision. That’s a healthy part of the process and sometimes they will do things that are risky or potentially counter to the recommendations. And there’s a whole host of reasons why that might be. I try to recruit people and hire researchers who are resilient because I do think a huge part about being a great researcher is going out on a limb to say something important, potentially getting knocked down, and getting back up again and continuing to say that. I think really great researchers, in tech especially, they act like it’s a sprint, but they know it’s a marathon and often times at Pinterest we talk a lot about like the research part and delivering that, that’s just the first step. Now the real work is continuing to bring it up with the team. Continuing to make it part of conversations. That takes a long time sometimes. Particularly like if it’s something very meaningful or hard to do. People just don’t – when you deliver like a hey, this thing you thought was the case for many years is actually potentially not quite like that, it’s a bit different and you should really think about it differently – human beings are terrible at that. Like we don’t do that in our personal lives. We don’t just shift our total point of view without some sort of major event, like some sort of like peyote, or some sort of like near death experience. That just doesn’t happen. And so it’s not going to happen at a company or a business and you have to think about that sort of marathon bit, like bringing it up over time. It’s going to take a while. You have to work at it and you can’t just think you’re going to come in and say no I have the answer, it’s this, and everyone’s going to be like hear, hear, good job, let’s do it. It’s harder than that. I just think that’s part of the research gig. That’s part of shipping information and inspiration, right. That inspiration bit. It can be very transformative for a designer in a moment, but getting to the long term outcome of that, the result of that usually isn’t a couple of days. It’s probably a longer period of time.

Steve: So tactically when you are meeting prospective research hires for Pinterest, how do you – that’s – and obviously you look at many things, but how do you look for that?

Gabe: That’s a good question. I think first of all – how do you look for resilience? I think you talk to them a little bit about their career and I ask people to tell stories about the research they’ve done. You ask them to tell stories about, you know, what’s an example of a project that you did that you think showcases your skill set and really shows the best of you. And then you ask them questions about – they talk a lot about doing the research, then you actually have the real conversation of like so what happened next? What did you do? How did you approach communicating that to other people? What were the challenges you faced? And really – because as researchers we want to talk about the research, but again I just feel like that’s the smaller part of what we do in the context of an organization. ‘Cuz that’s what we’re good at and we like that and we want to talk about it and we want to make that the big part, but like great – this is like a you can build it and they don’t have to come sort of moment. Or even – I heard this great quote a while ago which was “you don’t have to make a mistake to lose the game” and that’s about context and timing and sometimes like you have a great insight for an organization but the timing is not correct and it’s out of your control and so then what do you do? Do you pack it up and go away? Do you like start making posters and put them up every month? Do you continue to, like any time anyone brings up something tangentially related, say oh that’s like what we found in this research from before, are you aware of it? That’s like part of the story of research and I try to focus a little bit on that in conversations.

Steve: Is that something you are teaching researchers that you hire? Or does someone have to come in?

Gabe: I don’t think of myself as a teacher. I did not enjoy that part of academic life. I think of myself as a collaborator and I think it’s something we discuss a lot. How are we doing this? What are we – you know deliverables – what’s the nature of deliverables? Trying things – like the one thing I do try to ask the team to do, even when they’re tired of me and of the challenge is like well what’s something you could just prototype? Let’s just try to do something very different. And I think the team itself has built a good sort of muscle around like let’s try it differently to see what happens in terms of communicating and continuing to push those conversations forward. I do think it’s hard at a startup like Pinterest. When I joined I was around – I think there were 3 people in my class who are all still here – or my whatever, my cohort. We were like 70, 71, 72 – pick your number. Now there are 700 people and I think what’s really challenging with research is that you can learn something and communicate it to the entire organization and everyone can hold that knowledge. In the case of a startup like Pinterest where you’re growing – like next week 20 new people are going to start. And those 20 people they don’t know anything about that. And so you have to continue to make sure that people understand – they’re going to come in and say like hey how come we don’t do this and it’s easy just to say like well we tried it and it didn’t work. Or well it’s not going to work. Or, that’s easy. The harder bit is like actually here’s what we’ve learned and here’s why we think that may not be the best way to proceed or move forward and so – that’s a real – like it’s an ongoing challenge of any like company that’s growing quickly, but particularly it hurts researchers because the artifact of what you’ve done is knowledge and knowledge you either have to read it, or communicate it. And so figuring out how to like maybe put it in a space or you create a wiki, but then you’ve still got to get people to go there. It’s a real challenge I think for any researcher at a fast growing company.

Steve: Right. There’s a lot of talk about – you said at scale earlier in this conversation. That you know dealing with insights at scale is a challenge as companies grow and so on. But you’re talking about sort of insights, like insights in the tornado, or insights when the sands are shifting.

Gabe: Yeah.

Steve: You can have a great meeting – you can – you know all the good things we do in research. We talk to our stakeholders. We bring them along in the process. We bring them out to meet with people. We involve them in synthesis. But over that – say that’s an 8 week process, that team has changed and then two weeks later, after you’ve had some really great meeting to make some decisions and have a workshop and do whatever things that we do. Two weeks after that there’s a bunch of new people and two weeks after that there’s a bunch of new people and two weeks after that.

Gabe: Yeah and also the actual, you know the audience. Like perhaps your insights are stale. The audience has changed significantly, right. Or perhaps the technology, right. The technology itself, there’s like a new innovation that you know we have a new team coming in that’s been acquired and they have like a whole new technology they’re introducing that can really change the context. And so again this notion of in startup land a researcher has to again believe – act like – know that it’s a marathon because there’s so much changing and you’re going to have to really stay abreast of all of that and check your own assumptions, check your previous recommendations constantly so that you are really providing the best information you can at that time.

Steve: That’s a really good point and I think that’s sort of the researcher arc, right. You come into a situation. You can tell that everybody’s hypotheses are full of crap. And you’re like okay I don’t know what the truth is, but I know that you’re wrong and we’re going to go learn and you come back with a truth, or a set of truths or a set of things that approximate truths and that you advocate for, but that also – then you could risk becoming part of the problem because now you can kind of clench onto those things that you discovered through your own sweat and your own effort in the field, but what you’re saying is those things may have a short shelf life given the changing space that a company like Pinterest is involved in and that you have to let go of those things which is sort of counter to the you’ve got to go – the marathon is kind of about persuasion and advocacy and influence and getting out there and giving that same presentation 20 times to different audiences at the same time as also being willing to let go of it.

Gabe: Yeah, and I think – so there are – I think that is a challenge – a real challenge. I think there are some ways to guard against it, or some strategies. So one strategy at Pinterest, I – my vision for the research team was an interdisciplinary team. Trying to put together both qualitative user experience researchers, of which there are different kinds of qualitative user experience researchers. They’re not all the same. A quantitative user experience research group, market research. All together. There’s an economy here in terms of like actually getting things done, but also, what it does nicely is it brings together different data points to build the knowledge that we’re learning together.

Steve: Is that knitting?

Gabe: That’s knitting within our own research team, right. I think a lot of companies you can have, like those groups can get siloed early on and then you have that moment years later where you’re like well we’re working on the same thing, what’s going on? Or we’re competitive with each other. Trying to put those disciplines together and then what allows – it’s like so work that I’ve done in the past that I still think is relevant, well if it’s still relevant we’re probably still working on it and now we work on it with a perspective of multiple methods. So mixed methods. This is a way to sort of take an idea and really, truly understand it deeply because if it’s – we think it’s – we hypothesize something from the qualitative research then we can hand it off to a quantitative researcher who can then quantify that at scale of which you might learn something that is – you think there’s a correlation between a macro question and a whole bunch of smaller questions. Well if you validate that relationship then you can take that macro question and you can put it in all of your say brand equity trackers for market research and all your markets and then you can measure it there and now what you’re doing is you’re building this interlaced qual and quant sort of lineage of knowledge that over time can become very robust and then you can say now that you’ve validated it at scale and you’ve used multiple methods and you’re monitoring it in each market, well now you can say we know this is a real thing, we’re trying to change it. We can continue to measure it over time with a brand equity tracker and market research to see like are we doing better, are we the same or are we doing worse? That’s super exciting and it’s a way to guard against your own bias, like this is true. Maybe when you get to the step of handing off to a quant person – like no, this is not true at scale. Or it’s different – it’s often probably true, but it might be different than what you thought it was. So there’s actually three things going on here, not just one. Or you find out when you pass it over to a market researcher who does brand equity or some sort of large survey, they’re like oh this is true for, you know, Western European countries. It’s not true of Pacific Rim countries. And so – again, bringing in more methods, building out – if it’s an important question, building out different levels of understanding can be really robust and guard against your own bias towards – but I said this before and it was true. It’s still true.
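
As a rough illustration of the qual-to-quant handoff Gabe describes, here is a small sketch – with hypothetical survey data and question names, not Pinterest’s actual instruments – of checking whether a “macro” question moves together with a set of smaller questions before folding the macro question into something like a brand tracker.

```python
import random

random.seed(0)

# Hypothetical survey: each respondent answers one macro question and two smaller
# questions on a 1-5 scale. (Real data would come from a fielded survey, not random.)
def fake_respondent():
    planning = random.randint(1, 5)
    discovery = random.randint(1, 5)
    noise = random.choice([-1, 0, 0, 1])
    macro = min(5, max(1, round((planning + discovery) / 2) + noise))
    return {"macro": macro, "planning": planning, "discovery": discovery}

responses = [fake_respondent() for _ in range(500)]

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

macro_scores = [r["macro"] for r in responses]
for question in ("planning", "discovery"):
    corr = pearson(macro_scores, [r[question] for r in responses])
    # A consistently strong correlation is the kind of evidence that might support
    # carrying the macro question into ongoing trackers across markets.
    print(f"macro vs {question}: r = {corr:.2f}")
```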

Steve: So in terms of this multidisciplinarity and you’ve kind of said this how you sought to build the team, maybe you can describe the team more specifically and where are you at? Where’s the team headed?

Gabe: Yeah. So we have three qualitative user experience researchers. We have Altay Sendil who comes from IDEO. He has an industrial design background. He was at IDEO as a researcher forever. He brings like a really strong perspective on ethnography and really generative work. We have Larkin Brown who comes – she was previously at Google and she’s like – to me like a real digital UX person and she has a lot of areas of focus. She’s also really into – she serves as a stylist within Pinterest and helps with brand styling and also communication to press around fashion. And then we have Cassandra Rowe who’s – she worked at, previously at Intuit, but really did a lot of international research at Netflix. She was part of their early growth on the qual side. So we have an expert in sort of formative, an expert in digital, an expert in international. Then we have one quantitative UX researcher named Jolie Martin and she is really looking at the intersection of attitudes and behavior for Pinterest. So that’s experiments and – AB experiments and surveys, to truly really understand the quantities and quality of experience. And then we have Paul Pattishall who is on the consumer side doing market research for us – things like brand equity trackers, product surveys, NPS, category deep dives. Like what are the most common interests in France right now for people who aren’t using Pinterest so we might approach them with relevant content. And then recently Julia Kirkpatrick joined, who’s doing market research for the business side of the company – Partners Products is what we call it. That’s the people, that’s the disciplines. Again, the important bit is that we’re all working together. We’ve hired people who, they work with a team in an embedded type way, but they also know when – when the team has a question that’s outside their skill set to go to a practitioner within the team to get help and support. We also, you know we’re still – so that’s six people now. We’re adding two more people this year. We’re looking for another quantitative user experience researcher because I think one person is not enough. You have to have two to really be a team so you can play off of each other. And a UX or qualitative researcher to focus on our business or Partners Products. So these are things like the web analytics if you own a domain, or advertising on Pinterest which is something that’s a new area for Pinterest where we can do some really innovative things because of the visual display elements of Pinterest. So we’re adding folks. We’re looking for folks, but the important bit is all those disciplines working together – we’re one team, but you could think of it as three separate teams. But because we’re connected there’s a lot of collaboration and interaction amongst the team members which I think is really – it’s an exciting environment for me as a researcher because people know a lot and you learn a lot and you can pick up a lot of tips and tricks along the way.

Steve: So just thinking to wrap up, as you describe the team and you talked before about – I used the word teacher and you kind of shied away from that term. You said you think of yourself as more of a collaborator and given the nature of how that team is structured and kind of what you have created as a collaborative culture and how you’ve inherited kind of – you’re participating in the larger Pinterest culture – I agree with collaborator. And this is just all a preamble to a leading question – the trouble with interviewing somebody that you know a little bit is that there are certain answers you’re trying to draw out of them. So I’m going to try a few different ways. What – you know in terms of your role as a manager, if it’s not teacher, what would the people on the team, what nouns or adjectives would they use to kind of characterize your managerial jim-jam?

Gabe: That’s a good question. I mean this is my first time managing. So I think my jim-jam is under development. What I strive for is one where they feel supported. That they – I will – if people ask me for feedback I will give them deep feedback on whatever it is, their deliverable they’re creating because I know in this environment you might not get deep feedback from other people and I think if you’re doing work you deserve to get that feedback. So supportive, probably into the – like I care definitely a lot about research and what we’re doing. So I talk a lot about research. At lunch I’m like the person who’s like, answers the question about what they did this weekend and then follows up with the like hey let’s talk about this project that’s happening. And so I’m – they would probably say I’m pretty focused on research. So supportive, into the research. I definitely want to empower them to make decisions. So we have a model where they’re working with teams and they are the ones that need to develop the roadmap with their teams. They need to make the calls. I’m there as support. I can provide context. I do a lot of like I – like showcase their work and constantly talk about their work because that’s part of my job is to like hey did you know that Cassandra just did this in Germany with that team, the activation team. You should be aware of that work. That’s a huge part of what I do, connecting the dots at an organizational level. So supporting, you know advocating. Nerding out on research.

Steve: What’s your superpower as a researcher or as a manager?

Gabe: I don’t – that’s a good question. I tend to like shy away from superpowers because I think it’s about like training and practice and whatnot. As a manager, well I don’t think I’m there yet. Like I haven’t done this long enough to really know that about myself. And I have a great team. Like I think I have a lot to learn and a huge part about managing probably is like managing through hard times and I probably need some more of those hard times before I really can assess.

Steve: Careful what you wish for!

Gabe: I know! As a researcher, I mean I care a lot about the work that I do. I think that’s like a double edged sword sometimes because sometimes you have to care a little less. But I think I – it’s a good question. I really – the times I’ve thought – I’ve thought to myself well like I’m pretty good at this thing, but then I’m like that’s just an illusion, right. That’s like I don’t have a special power. I just have a set of practices and things that I try to learn about and do. And so I’m hesitant to answer that because maybe there are no superpowers. Maybe it’s just about hard work and just caring about the details.

I can tell you what my weakness is which is I don’t tend to go to conferences enough or get out and socialize enough in the research team. So when someone comes to visit me, to talk about research, or who’s a researcher, that’s huge for me ‘cuz that’s how I try to like understand what’s happening in the world. That’s a moment for me. That’s like my conference. That’s my getting out and interacting with other folks around research and so – ‘cuz I’m pretty heads down. I’m pretty in the trenches, right. And so that’s why I definitely celebrate those moments and I really appreciate when researchers come visit me. I have sometimes students come by who want to talk about their future career or get more information. I tend to like accept more one on one conversations than really approach like group activities, as a way to just get a sense and reflect on what’s going on in the field.

Steve: Is there anything that we didn’t cover in our conversation today? I know we could go and go.

Gabe: We could. I have a lot of things to cover, but I’ll leave it for another time. I do have some questions for you though.

Steve: I don’t even have to ask that!

Gabe: No, I’m like ready to go. So my first question is…

Steve: Oh your first question!

Gabe: My first question is like what’s the biggest surprise – what’s been most surprising to you in terms of the Dollars to Donuts interviews you’ve done? What stands out as surprising that you didn’t expect before you started this?

Steve: There’s probably surprises in terms of just having these conversations and there’s probably some surprises about just having this as a thing that I put out in the world. But I also – it seems I push back on the notion of surprise. I think I wrote about this in my book even where I used the phrase bland curiosity. ‘Cuz as researchers you get asked the what’s the biggest surprise question all the time and I feel like well that supposes that you entered into it with something and then like a bucket of water appeared and splashed you and you went oh…

Gabe: Okay, this is good. Let me reframe then. What has been the most unexpected theme across these interviews?

Steve: I haven’t synthesized them yet. I mean part of – and you know this is sort of the discipline of research like – and you can do it different ways – you can kind of just go and throw yourself into it and then occasionally you’ll be like oh we haven’t seen any themes except blah, blah, blah, blah. Oh that’s a theme. And so I’m just going to keep thwarting you, Gabe. I mean sometimes I resist reflecting and just sort of let it be what it’s going to be. So I mean there are things that come up, but also it’s because I ask about them and then – you know how teams form? What people’s career arcs are? These are sort of the topics that I’ve chosen to talk about, so I don’t know that they’re surprising themes. There’s something that we didn’t get into a lot – there’s sort of a structural issue that’s emerged from the beginning which is – which I remember this being like a design topic from the 90s and probably goes back before then – where do these people sit? Do they sit with their – are they centralized or are they embedded? So it comes up a lot and I think it’s a really interesting sort of managing – what I like is that no one says this is the right way to do it. People have said like this is what we’re experimenting with now and here’s the tradeoff. So I guess that’s a thing that I would never have thought to ask about and that’s also because I don’t have a job inside a company that has a bunch of research people, so it’s outside my own experience. So I’ve learned about that as a topic and it seems to be one that when I talk to people they say like oh yeah, that discussion and this episode was interesting and we’re thinking about how we’re going to do that.

Gabe: Yeah I think that – I mean I have that as a listener all the time. How are they doing it? Why are they doing it that way? Absolutely. Okay, speaking of being a listener, my second question is will there be a Season 3 of Dollars to Donuts? (laughs)

Steve: We’re going to do it in Oculus. We’re under negotiations right now. Podcasting is over. Maybe it should be – maybe they should be something like – something that’s only – they’re going to be 4 seconds long and then they’re going to disappear.

Gabe: I won’t put you on the spot. This is a more serious question. As a young lad growing up in the cold tundra of Canada during the 70s and 80s, who appreciated donuts, there were 2 major chains. Which was your preferred chain – Country Style Donuts or Tim Horton’s?

Steve: Well I’m sorry. We grew up in the same town.

Gabe: We did.

Steve: And I’m older enough than you that I remember Millionaire Donuts. Do you remember Millionaire Donuts?

Gabe: No, I don’t. But it was the donut of choice?

Steve: Oh I don’t know. I mean Tim Horton’s was kind of the place, ‘cuz Tim Horton’s used to make their donuts in the store. Maybe you’re the one that told me this. They don’t do that anymore. I even went for a job interview at Tim Horton’s as the shift – I didn’t get the job – it was like shift work. You had to make the donuts, like 3 shifts a day. They had these big racks in the back where they would – it was an interesting sort of – I mean I didn’t get to do a lot of research, but I saw how those were made, which maybe everyone knows this, but they like have all the sort of naked donuts there and they like kind of pour over like a big vat of chocolate goo and that’s how they get that sort of drippy thing on there – they just pour it over on these racks.

Gabe: So I don’t think we can really reach a higher point than naked donuts as part of our conversation today, so it was like…

Steve: Are you – not only are you taking over the interview, now you’re wrapping up.

Gabe: How can you beat naked donuts?

Steve: Well there’s a visual that no one needs… Alright, I think that is our – I’m going to take your cue…

Gabe: Thank you.

Steve: … and wrap us up. Thanks Gabe, this was great.

Gabe: Awesome. Thank you.

Mar 15 2016

1hr 17mins

Rank #5: 17. Tomer Sharon of Goldman Sachs

In this episode of Dollars to Donuts, I talk with Tomer Sharon, the Head of User Research and Metrics at Goldman Sachs. We talk about how to assess potential hires for user research positions, infrastructure for capturing and searching a body of data, and developing a practice inside a willing, yet large, organization.

Some parts of kind of pure research are creative. Probably the biggest one is translating a set of questions that a team has into okay, what are we going to do to get answers? If it was that easy to come up with an answer to that, then anybody could do that well. That’s not the case. A lot of people are having a lot of trouble with that part. So, I think that’s a creative part. You’re not going to see a beautiful painting coming out of that, but it is creative. – Tomer Sharon

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Greetings, humans! Thanks for listening to Dollars to Donuts, the podcast where I talk to people who lead user research in their organization.
Over the past while I’ve been putting together a household emergency kit. It’s primarily a shopping exercise, and I’ve ordered a hand-crank and solar-powered radio, a replacement for matches, latex gloves, bandages, and air filter masks (which we made use of during a period of dangerously poor air quality recently). The last step was getting some food that will last – cans of soup and stew, crackers, single-serve breakfast cereals. There’s something satisfying about acquiring a bunch of stuff and storing it away, somewhat organized. And that led to a stray thought that I noticed – “Oh, I can’t wait to use all this great stuff!” And then I realized how crazy that sounded. I don’t want to use it! I don’t want there to be some emergency that is bad enough that I’m drinking the emergency water stored in the garage and eating canned stew, also stored in the garage. I mean, yes, we’ll eat or donate the food before it expires and replace it, but it’s a whole set of preparations that I hope I’ll not use, which leaves me with the hope for no shopping gratification, kind of a confusing way to feel.

But it did remind me of the recent workshop I led with researchers in Sydney, Australia. We looked at a lot of the user research war stories I’ve been collecting, like those published in Doorbells, Danger, and Dead Batteries, and we pulled out lessons and best practices. There was – as there always is – a lot of discussion about safety and preparation. It seemed to me that people who worked in organizations with established safety cultures already had a strong baseline of safety procedures for user research and fieldwork, like texting a contact before and after a session, not going out alone, and so on. Their work cultures were strong on processes, especially for safety, so thinking about this for research was obvious. But not everyone works in that kind of environment, and plenty of researchers work for themselves, without that corporate structure to support them in creating best practices for safety. Anyway it led to a lot of discussion beyond just safety about running through possibilities ahead of time so that when any situation comes up it’s not a surprise and there’s at least a starting point already established about how to respond. I think this is a great idea, but I think we have to acknowledge the limitations – you can’t plan for every possible situation, there are always going to be things that come up that you probably haven’t ever considered. I think that some planning for the unexpected will help you to adapt in the moment to surprises, but that’s different than the false comfort of assuming you have every contingency planned for.

I hope I never have to make use of our large cache of sterile latex gloves, but maybe just having acquired them I’m in a slightly better situation for some other unexpected situation?

You can help me continue to produce this podcast for you. I run my own small business, and you can hire me! I lead in-depth user research that helps drive product and strategy decisions, that helps drive internal change. I provide guidance to teams while they are learning about their customers, and I deliver training that teaches people how to be better at user research and analysis. You can buy either of my two books – the classic Interviewing Users and Doorbells, Danger, and Dead Batteries, a book of stories from other researchers about the kinds of things that happen when they go out into the field. You can also review this podcast on iTunes and review the books on Amazon. With your support, I can keep doing this podcast for you.

All right! Time for my interview with Tomer. Tomer Sharon is the Head of User Research and Metrics at Goldman Sachs. He’s worked at Google and WeWork, and written two books – It’s Our Research, and Validating Product Ideas Through Lean User Research.

Thanks for being on the podcast. It’s great to have you here – virtually here in audio space that we’re all sharing. Why don’t you start off – just do a little introduction. Who are you? Where do you work? What are you doing?

Tomer Sharon: Okay. Thank you for having me, first. My name is Tomer Sharon. I am currently Head of User Research and Metrics at Goldman Sachs. I do have a second day job. I’m also heading a design group for a product called PWM (Private Wealth Management). Yeah, this is where I’m at in the past – well, almost a year.

Steve: Alright. So, what is Goldman Sachs?

Tomer: Goldman Sachs is, I would say, an investment bank. Probably one of the more important banks in the world. Big corporate. Definitely not one you would associate with design and research, at least not that type of research. But they are changing and they’re celebrating 150 years this year and they’re moving towards what’s called outside digital transformation and that includes learning more from their audiences and investing a lot more in design.

Steve: What’s the relationship between the existence of your role and this larger shift that’s going on?

Tomer: I think there’s a strong relationship. They have been realizing that they can’t just be living in their own box and they have to open up and try and understand audiences that they’re engaged with already and new audiences. I’ll give an example. Goldman has a commercial bank that’s called Marcus. It’s been around for a couple of years, but still these are consumers that Goldman is now trying to attract. So, it’s definitely not the kind of typical audience that they’re used to. So, they understand that they need to open up, learn from them, and design for them and with them, and that is a shift that has been happening in the past few years. And my role – I didn’t replace anyone, I’m the first one – is a part of that shift.

Steve: Is there any sort of one point or one incident, or key moment, I guess, which sort of marks that transition in a company like Goldman to say yeah, we’ve been doing it this way, we need to do it this way?

Tomer: I think it happened – I don’t know if there was one event, but I think it happened in the past maybe two years. The user experience team there was very small and then suddenly they decided that it’s time to invest more in that. And from zero it went to several dozens, many dozens, within a year. And the more people do their work, show their work, share their work, and their work is very successful, the more teams and leaders talk about that and then it’s like a cycle that feeds itself and then it grows and grows. So, that’s been happening a lot in the past few years.

Steve: And do you have a perspective on – this awareness that you’re describing at Goldman seems to align with what I’ve heard and observed in financial services in general. That it’s an industry that maybe wasn’t seen as paying attention to consumers, users, design, and that that’s been changing over the past number of years.

Tomer: Yeah. I will admit, this is my first financial services job. So, I’m not really familiar with that world other than Goldman. But, so I hear. I don’t really know from first experience.

Steve: But I’m inferring – correct me if I’m wrong – I’m inferring that that also is not a conversation that you’re having inside Goldman?

Tomer: No. It’s very, I would say, kind of practical and tactical. We’re not talking about the concept of having me and people like me there. We’re just focusing on doing the work that we know how to do, we’ve been doing for many years, and just bringing that insight and understanding of that world to an organization that wasn’t aware of that previously.

Steve: That world is the process and tools of learning about people?

Tomer: Um, yeah. I would say process, tools, people. They’re used to hiring different people. I saw that in Google at the time. I saw that at WeWork at the time where even formally you don’t have in the – I don’t know HR systems – you don’t have names for roles for what we do, for who we are. I can’t remember the names, but we were engineers or UI engineers, or things like that. Until you get recognized, and I experienced that at Google, and then you do have a job family for design and for research and so on. It’s a process.

Steve: So, at Goldman you’re trying to then like – that’s part of the…

Tomer: Well, I’m not – we’re too, especially in research, we’re too few people to start having a job family in the HR systems, but we’ll get there. What I’m trying to do is first a lot of evangelism. A lot of conversations with people to plant seeds in their minds that they might need somebody like that. That they might engage in a project, a one-off project, and learn more about it. And with several groups and divisions it works already. So, we have people there and they start doing their work.

Steve: It’s interesting that you’re sort of highlighting evangelism vs. like conducting research.

Tomer: Right. Right. Sometimes it’s integrated. Sometimes I’m asking people to take – some kind of a leap of faith. And then we actually, we do the research and then the work talks for itself. I don’t need to evangelize anymore. They get it. They understand. They want more of it and from there the doors are open. I prefer it this way. Even in the past, I preferred to kind of do the work and show it rather than kind of wave my hands and talk about it. I found it always to be more meaningful to people and more persuasive.

Steve: I’ve heard this. I don’t know if this is what’s playing out for you in these situations, but that you help someone take a leap of faith. You do this work. The results speak for themselves for them. Do those results – or how can you help those results spread elsewhere in the organization?

Tomer: Um, I use that example when I talk with others and I invite people to connect. I mean in the same company you can hear from them. You can meet them. I sometimes facilitate those meetings. And then they hear from others. I know it will take time, that it’s not something that happens over a day or a week or a month. Sometimes a year. But these conversations happen, whether I’m aware of them, or not. Whether I facilitate them or not. I’m confident that the more work we do the more people we’ll have, and we’ll be more impactful.

Steve: How does that work across – I don’t know if they’re called business units. Goldman as you said speaks to different kinds of users, different kinds of customers, with its products and services.

Tomer: So, what I do – so, as soon as we have researchers joining the group, we assign them to a – let’s call it a business unit, or a product team. And then they’re theirs 100% of the time. I only support them with infrastructure, career growth and things like that. And then they do the work with the team. I do a lot of legwork kind of before that happens to make sure that they have a team that wants them, that needs them, and that knows what to expect. So, it’s not the first time that they hear about that research thing with the appearance of the person. So, that’s how I intend to kind of continue doing things, rather than work in an agency model where I have a pool of people that kind of come and go based on projects.

Steve: Okay, so, just to reiterate, you talk about helping someone take a leap of faith – and I’m maybe changing your words a little bit. And a next step from that is to hire someone and put them in that team.

Tomer: Yes.

Steve: And then they do the work and then the work, as you said, the work speaks for itself, but it’s the work that you’ve staffed someone into that team?

Tomer: Yeah. I’ll just add to that. In many cases it’s not really a leap of faith because there are people who are – I call myself an outsider – who didn’t grow up in Goldman, that know about this field, know about research, know about people like us. So, they’re very open to having them. So, that’s much easier.

Steve: So, you said you’re not looking to do an agency model.

Tomer: No.

Steve: From your face, I think there’s a point of view.

Tomer: Yes, definitely.

Steve: What’s the point of view behind that?

Tomer: The point of view is that I want people to feel they belong – the researchers feel they belong to a team. And I feel that if I just send them off to short-term projects they can’t grow with the team. They can’t have any history with the product. They’re not familiar with it. They’re really like consultants that come and go. I want them to feel a part of the team, understand all the history, sit with them. I don’t look for them to sit next to me. I want them to sit with their teams, become part of that team, and be more impactful. I strongly believe that this is the way to go.

Steve: Yeah.

Tomer: And I’ve seen that happen before with my own eyes. Not WeWork, because we didn’t do that at WeWork, but definitely at Google where we experienced, at some point, a decentralization. So, there was one UX group and they were decentralized into the different business units. Not much has changed because people were assigned to teams already, but it just felt like the right thing to do, to have people sit with their teams, work with the same team, same product, for long periods of time.

Steve: So what implications, if any, does that have for what kind of skillset someone that you’re hiring needs to have?

Tomer: Yeah. That’s a good one. So, I’m looking for more experienced people. More senior people. There will be a time when we hire more junior people, but that’s not now, when we’re just starting. I had the same thing at WeWork when we were building a group from scratch. You need experienced people. So, I’m looking for people who are eager and passionate about building something from scratch – taking a team that maybe doesn’t know anything about user research, building that relationship and building those results, as we talked about earlier, with that team, growing with them, growing the practice with them. And maybe, if it’s extremely successful, growing a team there at some point.

Steve: There’s always a thing with questions and answers, at least for me. I ask a question, I have an idea what answer I’m going to get. And it’s interesting when the answer is different. I know it doesn’t always cleave this way, but there’s sort of the soft skills and the hard skills thing. I imagined I was going to get a hard skills answer because I was thinking about some of the pragmatic constraints of this, but your answer emphasized more, I think, softer skills. Growth, advocacy, leadership from a research point of view.

Tomer: I think it almost goes without saying – I don’t know, maybe I’m wrong – but I tell this to my people when we hire: we look for resumes that scream researcher. So, if your resume screams researcher, then that’s covered. I’m more interested in the soft skills.

Steve: Yeah. Can I push on that just a little bit more?

Tomer: Sure.

Steve: And I think – my bias here, or the perspective I come from is the consultant. So, I come in and out. So, I know what the limitations of that are and so just from where I sit, I see the challenges of what you’re describing because – I mean there’s a life cycle of when a team is developing – there’s probably a smarter way to say that. But something is being built and it goes through stages over weeks/months/years and so the research needs change and so the way the researcher has to respond to that, or lead that, or support that changes. So, that’s kind of the – I’m just revealing what I was probing on here, what I thought the answer was going to be. When you say screams researcher in a resume does that sort of imply that the ability to – are you inferring from that resume the ability to support that team and all these different areas of its evolution?

Tomer: Yeah. A resume that screams researcher is a resume where you see all the methods, you see the flexibility of not sticking to one method or one approach. Doing that in multiple cultures, multiple companies. Having a trajectory of growth – moving from less meaningful things to more meaningful things, and so on. That, to me, screams researcher. Not, you know, where you look at the job titles and you ask yourself why am I even reading this resume, and then you read a cover letter that says something along the lines of “I’ve always wanted to be…” Maybe at some point, but not now.

Steve: Yes. There is an archetype of a person I think that identifies strongly with researchers and sort of what they’re about, but their experience doesn’t match that.

Tomer: And I’m smiling because there’s always an exception. I remember one at WeWork where I was talking the same way – only senior people, I’m not looking for more junior people. And then I hired two junior people, because there’s always an exception. Because sometimes you see somebody who makes you ignore everything you just said and hire them.

Steve: What’s an example of something that overrides that?

Tomer: I don’t even know how to explain it. It’s just – sometimes it’s the spark in the eyes that you see immediately with people and you see that they are going to be like a sponge. That’s a good metaphor. And you just know it’s going to work.

Steve: And you said passion early on.

Tomer: Yeah, yeah, yeah.

Steve: And I think evidence of passion is past performance. But that’s not the only evidence of passion if you’re seeing that with people that you’re meeting.

Tomer: Right, right. Although in most cases I’m very reluctant to hire based on passion if you don’t have experience. Because you can have a lot of passion and motivation, but you know nothing. It doesn’t mean you’re going to be bad, but it means a lot of people are going to need to support you. So, my approach is that I’ll try to do zero to very little of that because we’re just starting. We’re just a few people. We don’t really have the time that we need to support our teams. We don’t have the time to support another person. Not now. When we grow, yeah – if we were 10 or 20 senior people on the team I would say we have to have more junior people, because we can’t have a team of only senior people. But right now, we’re not even close to that.

Steve: Can you say the size that you are now?

Tomer: Yeah, sure. We are four. In a couple of weeks, we’ll be five. Worldwide, both here in New York and in London.

Steve: And is there like a roadmap for number of people to get to that 10 or 20?

Tomer: Yeah. I don’t want to mention numbers, but I think we’re on a path of growth.

Steve: Okay. So, you talk about sort of the time commitment required to support people at different levels. I want to go back and clarify one thing. It was almost an aside. You said that you don’t want to do the agency model. You want people with these teams, but you’re trying to provide infrastructure. And then you used a phrase like career growth. So, given the size you’re at now, acknowledging that there’s bandwidth challenges here, so what does infrastructure look like? What does career growth look like now?

Tomer: Um, infrastructure. So, that’s easy. When I look around me there’s a lot of motivation to do research, but there are needs and gaps in terms of tools, knowledge, guidance, process. Research was happening before there were researchers, but it was very much based on sporadic motivation from different people who were really trying to do the best they could. So, we need to support them with services that are out there, industry-standard services. We need to sign agreements and get those licenses on board, different vendors.

Steve: What’s an example?

Tomer: UserTesting, UserZoom. These would be two, but there are more. And – let’s see what else. We are creating a process for: okay, what happens if you want to do research, but you don’t have a researcher and you probably won’t have one in the next few years, yet you acknowledge that you need research now? We have established a way to ask for that, sign up for office hours or something like that, and get some help from us. We will advise you and help you get going without the researcher. What else? We started working with OKRs – to me that’s part of infrastructure. Or, as the cool kids now call it, research ops.

Steve: So, explain what OKRs are and explain what research ops is.

Tomer: OKRs is Objectives and Key Results. It’s a goal setting approach. There are many. This is just the one that I’m used to, familiar with. And that’s just a way to set goals and see how you’re doing. So, we do that as a research group as well. Research ops – I would take the easy route and say this is everything that helps research happen without the actual research.
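
For listeners less familiar with the format, here is a minimal sketch of what an OKR might look like as data, with one common way of scoring progress. The objective and key results below are purely hypothetical examples, not anything from Goldman Sachs.

```python
# A minimal, illustrative OKR (Objectives and Key Results) as data.
# The objective text and key results are hypothetical examples only.

okr = {
    "objective": "Make user research a routine input to product decisions",
    "key_results": [
        {"name": "Product teams with an embedded researcher", "target": 5, "actual": 3},
        {"name": "Studies whose findings are cited in a product decision", "target": 12, "actual": 9},
        {"name": "Teams using the self-serve research knowledge base", "target": 8, "actual": 4},
    ],
}

def okr_score(okr):
    """Average progress across key results, capping each result at 100%."""
    scores = [min(kr["actual"] / kr["target"], 1.0) for kr in okr["key_results"]]
    return sum(scores) / len(scores)

print(f"{okr['objective']}: {okr_score(okr):.0%} complete")
```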

Steve: Are the OKRs an example of research ops? That’s what you were saying?

Tomer: So, the tools that I mentioned – the OKRs, hiring, knowledge management.

Steve: All of the infrastructure?

Tomer: Yes. Say somebody wants to do – I don’t know, whatever – a usability test, and they don’t know how. We want them to have the tool. We want them to have a knowledge base that they can access and see, okay, what do I need to do to run the usability test? And we want them to have guidance and support from a person, from a researcher who knows what they’re doing. I would say all of that is research ops. For researchers, a big part of research ops is participant recruitment – finding people to learn from. That’s a big part. And something that I’m truly passionate about is insight repositories – building some kind of a repository that we can pull from later on. I would add to that any infrastructure involved in measuring the user experience. So, building that – I would also include that under research ops.

Steve: So, you said you’re passionate about insight repository?

Tomer: Yeah.

Steve: Do I have to ask an actual question?

Tomer: What do you want to know?

Steve: What are you trying to get to? At Goldman, what are you working towards with that?

Tomer: It’s the same as everywhere. For many years I was bothered by how wasteful research is. Even I felt that I was “learning the same things over and over again.” I know that other people in the same company did research that I’m about to do, and even if I get their insights I’m going to do it again. And I know it’s really, really messy and hard to retain all the knowledge that you gather from research. The second I had an opportunity to do something about it, I did. That was at WeWork. We built a system that we called Polaris to solve these problems. It’s going to sound funny, but we identified that the main root cause of these problems is the research report – what I called back then the atomic unit of a research insight – and we changed that unit into, and that’s why the metaphor breaks, a smaller atom that we called a nugget, a research nugget. And that’s what we stored in this repository. So, a nugget was a combination of an observation, evidence and tags.

Steve: What was the last one?

Tomer: Tags. And thanks to these tags you can then search through this database and find answers to questions that you didn’t design studies around. It happened many times at WeWork that people came to us and said, “what do we know about…?” or, “can we do a study about blah blah?” And we said let’s try Polaris first, and realized that we had all the answers without even needing to do more research. This will only happen – you have to change your ways a little bit, not just work with a system like that – this will only happen if you do continuous research, continuous and open-ended research. Back then at WeWork we did exit interviews. So, every WeWork member, every customer who decided to leave, we pinged them, talked with them, interviewed them, and asked them very open-ended questions such as: why are you leaving? What worked well at WeWork? What didn’t work so well? If you had 15 minutes with the CEO, what would you tell him? Things like that. And that allowed us to have answers to many, many questions because these research participants, these exiting members, decided what they wanted to talk about. Whether they wanted to talk about – I don’t know, whatever – the price that was too high for them, or the coffee that wasn’t great enough to keep them from leaving for another place, whatever it is that they chose, it went into the system and we could pull it out later on. And combining that with a user experience measurement system would lead you to a situation where you can say: okay, I saw that satisfaction with coffee in our WeWork buildings in the Netherlands has gone down in the past month. Here is a playlist of three Dutch members bitching about coffee from the past month. And then you have the what happened – the numbers – and you have the why it happened from the videos. And if you “serve that” to the person who buys or decides how to brew coffee in the Netherlands, then that’s halfway to the solution. So, we imagine something like that at Goldman as well.
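
To make the structure Tomer describes concrete, here is a minimal sketch of a nugget – an observation, its evidence, and tags – stored in a repository that can be searched by tag. This is only an illustration of the idea; it is not the actual Polaris implementation, and the example data and function names are made up.

```python
# A minimal sketch of the "nugget" idea: an observation, its evidence
# (e.g. a video clip), and tags that make it searchable later.
# Illustrative only -- not the actual Polaris system.
from dataclasses import dataclass, field

@dataclass
class Nugget:
    observation: str           # what was seen or heard
    evidence: str              # e.g. a link to a video clip or interview excerpt
    tags: set = field(default_factory=set)

repository = [
    Nugget("Member left because coffee quality declined",
           "exit-interview-0142.mp4", {"exit-interview", "coffee", "netherlands"}),
    Nugget("Member finds the conference room booking flow confusing",
           "exit-interview-0178.mp4", {"exit-interview", "conference-rooms"}),
]

def search(repository, *tags):
    """Return nuggets that carry all of the given tags."""
    wanted = set(tags)
    return [n for n in repository if wanted <= n.tags]

for nugget in search(repository, "coffee"):
    print(nugget.observation, "->", nugget.evidence)
```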

Steve: So, it only works, you said, if you have kind of ongoing open-ended research?

Tomer: Yeah.

Steve: Why is that?

Tomer: Because if you always – so, I had to give the other type of research a name. I would call it dedicated research. Dedicated research is research that you do where you know what research questions you have beforehand, and you answer those questions. And then you can create nuggets and it’s all good. But then you’ll only have answers to those questions. When you do open-ended research, you have answers to questions you never imagined you might have, or may have in the future.

Steve: What if my research question is what are the highs and lows of a WeWork member?

Tomer: So, if you do that once you’ll get a snapshot of that point in time. But if you do that continuously, all the time – and at WeWork, at some point, all the UX team members were doing those interviews at a regular frequency – then you have thousands and thousands of data points.

Steve: Okay. So, there’s sort of a scale here – scope or scale. I don’t even know which one it is, but there’s an amount of data that covers a breadth of topics and that is also refreshed.

Tomer: Yes.

Steve: Okay. So, what’s – I would hate it if anyone asked me this question, but I’m going to ask you. What’s an insight? Because you were kind of saying a nugget is this sort of stuff, but you brought up insight. So, what’s an insight?

Tomer: I actually have a definition for that. I’m thinking about that these days. An insight to me is a deep understanding of the situation. I’m trying to think of an example. I’ll go to WeWork again. So, imagine a researcher walks into a WeWork building and looks around. WeWork has shared open spaces, but also private offices. Let’s say they notice that a lot of the private offices have printers that members brought in, and that looks odd because WeWork offers printing. We have printers and we offer that as part of your WeWork membership, and if you go over, you pay more. It’s a nice stream of revenue for WeWork. Let’s say they count and see that half of the private offices brought in their own printers. They go to a second building and a third building and they count, and they see the same: half of the members that have private offices brought in their own printer. Let’s say they stop there, go back to the office and add an insight, nugget, whatever we call it, to the system saying half of WeWork members in those buildings brought in their own printer. That, to me, is not a deep understanding of the situation. It’s very interesting. It may be indicative of something that’s going on that we’re not aware of, but that’s not enough. We have to understand why. So, I would encourage that researcher to knock on the doors and ask why. And then we may hear things like, “oh, you know you have a 15-page manual on how to install printers and I’m not going to waste my time on that.” Or, “you have to log in each time you go to a printer and that’s taking more time.” And so on and so forth. “We do steal your paper, so we enjoy that.” That’s what I mean by deeper understanding – we can better understand the situation, know how to answer why it is happening, have some evidence, and then I would say that’s an insight. That’s a deep understanding of the situation. And to me the system that we built is going to enforce that, so you cannot submit nuggets or insights without that why part. Just facts are not enough. We don’t need facts. We need facts plus why they are what they are.

Steve: Right. There’s a behavior and what’s the reason for that behavior. I struggle when I hear people talk about insights because sometimes they talk about why a single person is doing something, as opposed to sort of why users of a system are experiencing something. Like, in your scenario, you knock on somebody’s door and say what’s up with your own printer? “It’s too hard to install.” And you come back and say – like there’s a difference between we understand why this person was doing it and then sort of the generalized conclusion. I don’t know, I’m putting my own language into your framework. It’s probably not working, but…

Tomer: That’s alright. What we would do with Polaris is gather those individual insights. So, each one would be a nugget. And then if you have 100 of these, the only difference would be the video, the person in front of the camera explaining why they brought in a printer, or whatever it is. And then you can create a playlist and show it to IT, or whoever decided that we’re going to go with this system for printing, and have them decide what they’re going to do about it.

Steve: Okay, so let me push on that a little bit. Because when you talk to, say, 5 people or something about this behavior, bringing in your own printers, people are going to give related but seemingly individualized explanations. And there’s an act of interpretation, analysis and synthesis – words that we often use – to sort of say, well, let’s look at all of those. What’s the overarching reason? Talk about deep understanding – it’s not that we lose track of the fact that the installation manual is messy, or that there are not enough plugs. There are a lot of reasons, but the larger issue is something like complexity, or not being seen as adding to my business, or something. There’s a higher-order thing. To me, that’s what the insight is. I think you were talking about doing that.

Tomer: Yeah.

Steve: The words are slippery here when we’re talking about insights and nuggets and sort of explanations.

Tomer: Nugget is more the technical way we referred to it. But I agree. And the way it happened in Polaris is through those playlists. We encouraged people – both people who belong to the UX team and ones who don’t – to collect these nuggets into playlists and then prove a point, or do this analysis, get to an insight and share it. It depends on what you find and what you collect there, but let’s say you search for printers and you get 73 results. You sift through them and you see that 15 are not really related to what you’re trying to communicate. The rest is still too many, so you pick maybe 7 where the videos are really good – like “good participants” who eloquently explain the point – and then you can add your analysis in writing and describe that higher, or deeper, insight.
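
Here is a rough sketch of the playlist step described above: filter a tag search down to a handful of strong clips and attach the researcher's written analysis. The data, field names, and curation rule are illustrative assumptions, not the real Polaris schema.

```python
# A rough sketch of playlist curation: keep only the clips worth showing and
# pair them with the higher-level insight. Data and field names are made up.

nuggets = [
    {"observation": "Brought own printer: setup manual is 15 pages", "strong_clip": True},
    {"observation": "Brought own printer: has to log in at the shared printer every time", "strong_clip": True},
    {"observation": "Mentions printers in passing, unrelated to the policy", "strong_clip": False},
]

def make_playlist(title, nuggets, analysis):
    """Filter to the strongest clips and attach the written analysis."""
    clips = [n for n in nuggets if n["strong_clip"]]
    return {"title": title, "clips": clips, "analysis": analysis}

playlist = make_playlist(
    "Why private offices bring their own printers",
    nuggets,
    "Members route around the shared printing service because it adds friction "
    "(setup, repeated logins) without adding value to their business.",
)
print(playlist["title"], "-", len(playlist["clips"]), "clips")
```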

Steve: Right. So, Polaris kind of sets you up to make that interpretation.

Tomer: Yeah, yeah.

Steve: It gives you some structure or some way to quickly…

Tomer: Yes, yes.

Steve: Does Polaris capture that thing it facilitates you to make?

Tomer: What do you mean?

Steve: So, create this playlist. You watch the playlist and you come up with sort of new articulation. The biggest issue around printers for us is that we are doing this, and they are thinking that. Right? That’s a new piece of knowledge that’s created by reviewing what Polaris gives you.

Tomer: Yeah. Polaris was not smart. It’s just a tool. It would do whatever – it allows you to do whatever you want to do with it. So, it allows you to add text to do that. So, if that’s a yes then it allows you to do that. Polaris, in its kind of essence, is very simple. Just a tool that facilitates the kind of storage of those nuggets and creation of those playlists.

Steve: Yeah. So, that really helps me understand sort of what you’re aiming at when you talk about insight repository. It’s to me something you can re-query to come up with new conclusions.

Tomer: Yes. Exactly.

Steve: As opposed to sort of here’s all the things that we’ve concluded.

Tomer: Yes. Exactly. Exactly.

Steve: Okay. Wow. Here’s the GIF of the mind blowing up a little bit.

Tomer: You want more?

Steve: If you have more, yes.

Tomer: Now it’s just ideas and that’s not something I’m kind of working towards in Goldman, but I’m thinking – so imagine every company that should have a Polaris has it. That’s also a waste because then every company has its own repository and then I’m sure there’s a lot of overlap. So, maybe in the future there should be a kind of open…

Steve: Panopticon.

Tomer: And open Polaris, or whatever we call it, that anybody can contribute to and anybody can pull from. I’m not doing anything about that.

Steve: A friend of mine – so, this is like a third hand quote that I’m sure I’m misquoting, but a friend of mine told me this and he was quoting the head of knowledge management at NASA. This guy says, “the best knowledge management tool is lunch.”

Tomer: I can see where – I can see why that was said. Yet, I wholeheartedly reject the idea.

Steve: Yeah

Tomer: I mean, that’s not scalable. What if I went to the bathroom when that happened? What if I started working at NASA the day after that important insight was shared? I can understand the anecdotal part of it, how it’s useful. But to me there has to be something – I don’t know what to call it – more solid.

Steve: Yeah. Analogously, you started off our conversation by describing the lunching that you’re doing at Goldman, connecting people, meeting people. That’s different I think than transmitting nuggets, but you are using sort of lunch, and I mean very vaguely lunch, time with people, talking to them as a way to – I don’t know, as a way to do what? What’s the difference between sort of the NASA lunch thing and what you’re doing with socializing and connecting people?

Tomer: I think what I’m trying to do is kind of socialize a discipline. And if I understand it correctly, the NASA lunch is to socialize an insight maybe. So, yeah, I don’t think we’re there yet in terms of socializing insights.

Steve: And who knows what the context of that quote, which has been quoted, which I’m requoting. It may be exactly coherent with what you’re talking about.

Tomer: Maybe

Steve: I want to loop back to something that you talked about as part of this infrastructure. You said there are groups that don’t have a researcher and may never have a researcher. So, what are the tools, the knowledge base, that can help them do things? I feel like that’s – there’s a thing in our field about who does research. And I’m not even sure what the label for that is.

Tomer: Job security.

Steve: Yeah. Is that what it is?

Tomer: Should we let them do research?

Steve: Right.

Tomer: Yeah. It makes me laugh. Of course, we should let them. As if we’re the authority. But, yeah, of course. I mean, why would anyone not be allowed to do research? Because they didn’t go to school? I don’t think so. If somebody wants to do it, to me that’s huge. So, we should give them everything we can to let them do it. Are they going to do a bad job? Maybe. But to me bad research is better than no research. It’s a first step, and if we are good about the tools, socializing what we do, socializing best practices, things will get better. Yeah, there will probably be crap added in the first few times. Maybe the first 20 times. But if they really want to and they have this passion, then why kill it by saying that it’s our job, or something like that? So, yes, I’m all for “letting them” do research. Definitely.

Steve: I mean I think you highlighted exactly – so there’s – I think job security is a fear, but I think bad research is also a fear, as you said.

Tomer: Yeah. I’m okay with that.

Steve: And you said bad research is better than no research.

Tomer: To, me, yeah. 100%. Yes

Steve: I like how definitive you are.

Tomer: I’m…

Steve: Because that’s a hot topic, I think. I’ve heard people go back and forth on it.

Tomer: I know. I heard that too. I’m definitely on that side.

Steve: I also will say you’re describing ways to limit or mitigate bad research.

Tomer: Help make it better.

Steve: Yeah.

Tomer: Yeah, yeah. I mean, first they need to know about it. There are so many people who develop, let’s say, software at Goldman who are not even aware of our existence. It’s not that they think about it and say, “uh, no, I’m not going to do it.” They don’t even know that this is happening, that we are there. So, I think there’s a long way to go. We need to be more popular and more known, and then provide all the tools, help, guidance, and knowledge that we can – knowing that we can’t support everybody. It’s not going to happen. That was true even at Google, which had hundreds of researchers at the time, and today probably a lot more. There are teams that build stuff. Why not allow them to do research to help them?

Steve: You made the comment that research was happening before there were researchers.

Tomer: Yeah.

Steve: In the history of the world that’s also true, right.

Tomer: Yeah. True. We can’t stop that or control that.

Steve: So, maybe just a slight shift. We’re sort of talking about who’s allowed to do what, or what are researchers and what do we do? You mentioned early on that you also take on an additional role. Do you want to give some context to that? And what does that mean for you?

Tomer: Yeah, I also lead a design group for a product, Private Wealth Management. It allows very rich people to manage their money. It’s part of a service that Goldman offers, and the digital aspect of it is not the primary aspect – it’s just kind of a supporting role. It’s mostly based on a relationship between an advisor, a Goldman Sachs advisor, and a client. And there are the usual suspects of websites, apps and so on. I’m now leading a group of people that design that. And I mean design in the expanded way we define it. So, it’s not just designers. It’s also researchers and data people and a writer and prototyping and so on.

Steve: Are there differences between managing researchers – we talked before about people coming in with resumes that scream researchers – vs. all the different kinds of functions you’re working with on that team?

Tomer: Honestly, no. I don’t think so. I would be the first to admit that I’m personally not the best designer – definitely not in the world, and not that I could be. But I think there are things that are similar no matter what kind of group you’re leading. It’s good that you know something about what the group is doing, but I think it’s mostly about empowering the right people, giving them what they need, relieving them of things that are just stupid – that they don’t need to do and don’t need to be involved in – and focusing them on what they are passionate about. This doesn’t have any direct relationship with design or research or whatever it is.

Steve: Yeah. There’s an email list that I think you and I are both on that’s about design and user research. And there was a thread, or maybe people were having a conference call. I can’t even remember how it manifested, but the topic was researchers managing designers, which seems like it’s a newer thing. If you look historically, like research was sort of the accessory or adjacency to design, so design teams kind of managed researchers. But as research has grown there’s other people in the situation like yours where their label for themselves would lean more towards researcher, but they’re managing designers. So, it’s interesting that you sort of don’t see a difference because I feel like the thrust of this group needing to talk was hey, there’s something different here and so how are we going to deal with it?

Tomer: I know I have my internal bias toward research. So, I’m probably more attentive to when that is not happening – maybe more than a designer who manages designers would be. But I’m just guessing. I don’t know. I know that I definitely care about research, and I notice and say something when it doesn’t happen. I don’t know – does it have to be a designer? Designers need to know if that’s their thing.

Steve: I think I might switch gears entirely here.

Tomer: Go for it.

Steve: I’d love to just go way back, like as far back as you want to go and maybe give the story of things you did in your life to kind of get here, whether those are work or school or other things?

Tomer: That got me here?

Steve: Yeah. What’s your sort of background, or your narrative arc, if you will?

Tomer: So, I’ll tell you something from a long time ago that probably really was the tipping point, that I wasn’t even aware of that that was the tipping point at the time. I mean it’s not a secret I’m originally from Israel and I relocated, it was 12 years ago. And while in Israel I served in the Army. I signed for a career in the Army. So, the whole shebang. I was going to be a career officer for the long term. But then when I was 24, long story short, I was a paraglider. And I took a course and I got injured badly and I was out of the Army for a year. I was at home recovering. And that was a bad thing. Bad injury, but that opened my eyes. And during that year I came back to the Army and said I want to cancel the whole thing; I don’t want to stay; I want out. I will do my next job because we planned for one more job, but that’s it. I’m out. And that’s what happened. I think without that accident I would probably – I’d probably be retired by now, but I would be a career officer and not what I am today. And kind of looking back, I’m happy that that’s what happened. I would say that’s the biggest thing that affected what I’m doing today.

Steve: So, opening your eyes was realizing that you didn’t want to go on the path that you were on. Were there any hints for you of what path you did want to pursue?

Tomer: I knew it was creative. I was in a wheelchair for four months, and then on crutches, and then learned how to walk again. And I made my way, once I was able to get up, to a local artist who gave very open-ended lessons in his basement, a couple of blocks from my house at the time. So, I started painting and tried all kinds of ways to paint. I didn’t know to say that I would be an artist, and honestly, I wasn’t really good at it. But I knew it would be something creative. I didn’t know exactly what.

Steve: If you look at your work today, does it match that?

Tomer: Um, not 100% overlap, but I feel some of it is, yeah.

Steve: I mean I have wrestled, mostly privately, with just the idea is research a creative field? Or are we creative?

Tomer: Some of it is.

Steve: I found myself in a collaboration with people that I think more traditionally fit that job description and I kind of had my hair blown back, just on sort of the speed and breadth of making stuff. It was definitely intimidating.

Tomer: So, I would call myself a researcher, but still I was heavily involved in shaping Polaris. That’s a product. It’s not research work. I’m now involved in creating a system that – I also lead a small team of engineers that build a system to measure the user experience. So, that is definitely more creative than maybe research. But some parts of kind of pure research are creative. To me probably the biggest one is translating one or a set of questions that a team has into okay, what are we going to do to get answers. That’s not always – if it was that easy to come up with an answer to that, then anybody could do that well. That’s not the case. A lot of people are having a lot of trouble with that part. So, I think that’s a creative part. You’re not going to see a beautiful painting coming out of that, but it is creative.

Steve: Right. I think, for me, creating the new story out of a bunch of experiences or nuggets, or whatever you’re pulling from…

Tomer: Realizing, getting to an insight. Yeah.

Steve: So, what did you do after the art class? What do you end up doing?

Tomer: I applied to what we would probably call today a visual communications program. I got accepted to one of the best ones, if not the best one, in Israel at the time, and at the last minute decided that it wasn’t for me. And then I studied copywriting. So, I’m a certified copywriter – in Hebrew, though. And then I took my first job – or, I worked. I didn’t really know what I was going to do, so I did something that I knew how to do, which was working in a very small consultancy for military-oriented industries. I was a project manager. I did that for, I think, 3 years. That was the time when I learned all these things and really set my mind that anything army-related is not for me. And then my first real job in that direction – funny how names were at that time – was as an internet copywriter, which we would probably call today a content strategist, for a website that I would compare to Monster or Indeed, or something like that today. And there I got exposed to – probably at the time it was Jakob Nielsen and people in that area. I started reading more. I did some, again what we would call today, product management work there. And then I asked them to switch to what we would call today a researcher. They said, “no.” And I was like, okay, and I looked for a company who would take me. And there was one company that took me as – again, it wasn’t researcher. It was called a usability something. And that’s it. That’s how it started. They were very brave, I should admit, because I didn’t know much.

Steve: But they took you as a researcher?

Tomer: Yeah.

Steve: So, that was sort of your first time with the title.

Tomer: Yeah. You want an even funnier story. The only person who actually knew what it was was the CEO. He was the one who interviewed me. And then he hired me and then a month later he decided to leave. So, the only person who really knew what I was doing left about a month in. But that went well. That’s how it started.

Steve: What was the point at which you came to the U.S.?

Tomer: Um, so, I had a couple more jobs and I realized very quickly that, at the time at least, there weren’t enough opportunities – or any at all – to grow into managing a group or a team of people who do that. I was always the only person in the company who did that. And I came to the conclusion that the only place for me to work in a company that had a lot of people of my tribe would be here in the States. I also realized the army there is not like the army here – I didn’t have any academic degree. And I realized that I needed the right degree, because these companies that I was thinking about would not even read my resume. My resume didn’t scream researcher. So, I went to school. I continued working and went to school. I completed my Bachelor’s, then applied to Bentley University here, and then moved. And during school there, in Massachusetts, I contacted Google and things rolled from there.

Steve: We talked about WeWork a little bit. Can you – like what was the – what was your role at Google and maybe what was your role at WeWork?

Tomer: At Google, I was a user researcher, a senior user researcher. First in advertising. They’ve changed all the names by now, but at the time it was DoubleClick for Publishers. That was the product I was involved in. Ironically, I was the only researcher in that group of hundreds of people. And after, I think, 2½ years I transferred to Search. In Search, I was first on Voice Search and then on what we called, at the time, Core Search – the bag of 10 blue links and how they developed from that point to what you see today, with all the visual aspects of the results and so on. So, that started back then, vertical by vertical – TV, movies, music. I was doing a lot of research into search results for sports. I know a lot more than I should about all kinds of sports, cricket and so on. Yeah.

Steve: And then what was the role you took at WeWork?

Tomer: At WeWork, I was head of user experience. I started a group from scratch. The goal there – and this is following several conversations with the CEO and co-founder who hired me – at the time, and I’m sure things have changed since, WeWork had three big groups internally that built or created the three aspects of the WeWork product. They were called digital, physical and community. And Adam, the CEO, felt, and he was very right, that while each group is doing a great job, if you look at how things are from the perspective of the customer, the member, there are sometimes gaps between those groups that we’re not even aware of. And the goal was to identify those gaps – to me that translates to research – and then solve the problems there. I’m thinking of an example. Think about conference rooms. Conference rooms are something that WeWork offers. Members pay for them. So, somebody physically designed the conference room. An architect decided on the size and location. An interior designer decided on the mood and what would be in the room. Somebody from IT picked an AV system for that room. Somebody in digital developed a system to book this room. Somebody in community designed a policy for how to use this room. And community team members in the buildings enforce this policy. Everything is good. Everything is working well, but then situations happen. Such as: members come to a meeting, they booked the room, and then another member is squatting in the room and refusing to leave. Or they walk in and realize – I’m a startup, I booked a room for a meeting with a potential investor, and I see a room that is designed as a music room with bean bags and no projector, clearly inappropriate for my meeting. Or, that person who’s squatting – I go to the community manager, but they are dealing with water leaking from the ceiling onto another member’s head. They’re all very nice, but they can’t solve my problem right now. So, this is what I’m talking about when I say the gaps between those groups. We were trying to identify those gaps – because in many cases we didn’t even know about them – and try to solve them. That was the premise back then.

Steve: You’ve written books. You give a lot of talks.

Tomer: Less now.

Steve: You have a good sort of history of creating material. You’ve interviewed a lot of people. People listening, what would you send them to, to buy, read, watch?

Tomer: Well, our publisher wouldn’t be much of a publisher if they weren’t happy for me to say that people should buy my book – yours too. The name of the book is Validating Product Ideas Through Lean User Research. My first book is for researchers, and a lot of researchers do listen, so maybe I’ll mention both. The second book, the one I mentioned, is a step-by-step guide to answering different research questions that people have. Each chapter is a research question and, step by step, how to answer it through research that anybody can do quickly. The first book is called It’s Our Research. It’s meant to solve a problem that, at least at the time – I now have different thoughts about that – a lot of researchers had, maybe today as well, and that is that a lot of people don’t want to do research because they feel they have the answers already, or they have good intuition. And also, once they agree to do research, in some cases they don’t want to act on it.

Steve: Wait, act on?

Tomer: On their research results.

Steve: What they’ve learned? Yes

Tomer: So, that’s a book that’s supposed to help with that problem. And I’m having different thoughts now, because my answer in that book was – and that’s why I called it It’s Our Research – make them feel that it’s their research as much as you feel it’s yours, and then they will want it and they will do something about it. That was my point. Today, I’m having different thoughts about how to get to a point where research is wanted and acted on. And I will try it at some point – we just need to grow a little bit at Goldman. My thought is – and I posted something about that recently – when you plug your own charger into the wall, do you really care how electricity gets there? What’s happening in the power plant? Why it works? Whether it’s efficient or not, and so on? You don’t really care. You want your phone charged. My thought about research is: why wouldn’t it be the same? People have questions. Research provides answers. Yes, a lot is going on to get to those answers, but if you have a question, why do we need to bother you with all the details? Just do the thing – trust us to do the thing – and we’ll give you an answer to your question. I’m going to try that at some point. I’m thinking – it’s not political, but – of building a wall between stakeholders and researchers. That wall could be Slack or something like that, through which stakeholders ask questions and researchers provide answers. If we have an answer immediately, if we have a system like Polaris or something like that, we can provide an answer. If we don’t, we will just ask a few follow-up questions, do the research, and get the answer. Just thoughts. I haven’t tried it yet.
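
As a rough sketch of the “wall” idea – answer from existing insights when you can, otherwise open a research request – something like the following routing rule could sit behind a Slack channel. Everything here is hypothetical; as Tomer says above, he hasn’t tried it yet.

```python
# A hypothetical routing rule for stakeholder questions: answer from the
# repository if possible, otherwise queue a research request. Illustrative only.

repository = {
    "coffee": ["Satisfaction with coffee in NL buildings dropped last month"],
    "printers": ["Half of private offices bring their own printer"],
}

open_requests = []

def answer(question, keywords):
    """Answer from existing insights when possible; otherwise queue new research."""
    hits = [insight for kw in keywords for insight in repository.get(kw, [])]
    if hits:
        return {"status": "answered", "insights": hits}
    open_requests.append({"question": question, "next_step": "researcher asks follow-up questions"})
    return {"status": "research needed"}

print(answer("What do we know about printers?", ["printers"]))
print(answer("Why do members churn after year one?", ["churn"]))
```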

Steve: Which makes me think of the research ops piece a little bit where – like building up participant recruitment infrastructure…

Tomer: Yeah.

Steve: …is an interesting one because back in the old days where we had to do everything ourselves, you’re learning about your problem by figuring how to – by recruiting. You also learn about your problem by dealing with your stakeholder and seeing what – it’s that art piece vs. this kind of process infrastructure piece and it’s interesting to think about like what is lost and what is gained? Or how is changed when you create infrastructure that – like if you’re a researcher and you’re completely decoupled from participant recruiting, that may change how you deal with people that you meet, or how you deal with framing the problem. So, for everything that we build up a process, that’s efficiency, that kind of is a query system, how does that change what we do? And who is coming to this field?

Tomer: Yeah.

Steve: These are not necessarily my own thoughts, but just things I’m hearing from people as well.

Tomer: One other thing that I would send people to is a series of Medium posts that I’ve published in the past year, maybe less, about measuring user experience. A lot of people like to talk about metrics these days. I took the HEART framework from Google, and there’s a post per letter about happiness, engagement, adoption, retention and task success. For each one: what it is, what’s important to measure, why, how, mistakes, and what actions you can take from it. So, this is something that I’m interested in these days, measurement. And I’m trying to figure out the “H” part, the happiness part, specifically. There are a ton of challenges with that – how to measure satisfaction and happiness. I’m also not tracking my own life exactly, but paying more attention to when I’m exposed to requests to rate satisfaction and happiness, and I share them with people, with my thoughts about them.
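
For readers new to HEART, here is a toy illustration of two of its categories – adoption and task success – computed from made-up event data. Which signals map to each letter is a product-specific choice; the event names and numbers below are assumptions for illustration only.

```python
# A toy illustration of two HEART-style metrics (Adoption and Task success)
# computed from made-up event data. Event names and values are hypothetical.

events = [
    {"user": "u1", "new_user": True,  "used_feature": True,  "task_completed": True},
    {"user": "u2", "new_user": True,  "used_feature": False, "task_completed": False},
    {"user": "u3", "new_user": False, "used_feature": True,  "task_completed": True},
]

new_users = [e for e in events if e["new_user"]]
adoption = sum(e["used_feature"] for e in new_users) / len(new_users)
task_success = sum(e["task_completed"] for e in events) / len(events)

print(f"Adoption among new users: {adoption:.0%}")
print(f"Task success rate: {task_success:.0%}")
```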

Steve: Okay. Great. Anything else that we should talk about in this conversation?

Tomer: I said I am speaking publicly a lot less, but I do do that from time to time. I’ll be speaking at two conferences: the Face of Finance in April in New York, and User Research London in June, in London.

Steve: Alright. Well, thanks for taking the time to chat and sharing all the information and stories and everything. I really appreciate it.

Tomer: That was fun. Thank you.

Steve: Thanks. And so concludes another episode of Dollars to Donuts. Follow the podcast on Twitter, and subscribe to the podcast at portigal.com/podcast, or iTunes, or Spotify, or Stitcher, or anyplace you get your podcasts. Also online at portigal.com/podcast is the transcript and links for this episode (and of course all the previous episodes). At Amazon and rosenfeldmedia.com you can buy Tomer’s books and my books. Our rocking theme music was written and performed by Bruce Todd.

Mar 24 2019


Rank #6: 19. Leisa Reichelt of Atlassian (Part 1)


This episode of Dollars to Donuts features part 1 of my two-part conversation with Leisa Reichelt of Atlassian. We talk about educating the organization to do user research better, the limitations of horizontal products, and the tension between “good” and “bad” research.

If you’re working on a product that has some more foundational issues that need to be addressed, but the vast majority of the work is happening at that very detailed feature level, how are you ever going to stop kind of circling the drain? You get stuck in this kind of local maximum. How are you ever going to take that big substantial step to really move your product forward if it’s nobody’s job, nobody’s priority, to do that? – Leisa Reichelt

Show Links

Follow Dollars to Donuts on Twitter and help other people discover the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Howdy, and here we are with another episode of Dollars to Donuts, the podcast where I talk to the people who are leading user research in their organization.

I just taught a public workshop in New York City about user research, organized by Rosenfeld Media. But don’t despair – this workshop, Fundamentals of Interviewing Users is also happening September 13th in San Francisco. I’ll put the link in the show notes. Send your team! Recommend this workshop to your friends!

If you aren’t in San Francisco, or you can’t make it September 13th, you can hire me to come into your organization and lead a training workshop. Recently I’ve taught classes for companies in New York City, and coming up will be San Diego, as well as the Midwest, and Texas. I’d love to talk with you about coming into your company and teaching people about research.

As always, a reminder that supporting me in my business is a way to support this podcast and ensure that I can keep making new episodes. If you have feedback about the podcast, I’d love to hear from you at DONUTS AT PORTIGAL DOT COM or on Twitter at Dollars To Donuts, that’s d o l l a r s t o d o n u t s.

I was pretty excited this week to receive a new book in the mail. It’s called The Art of Noticing, by Rob Walker, whose name you may recognize from his books, or New York Times columns, or his appearance in Gary Hustwit’s documentary “Objectified.” I’ve only just started the book but I am eager to read it, which is not something I say that often about a book of non-fiction. The book is structured around 131 different exercises to practice noticing. Each page has really great pull-quotes and the exercises seem to come from a bunch of interesting sources, like people who focus on creativity or art or storytelling. Rob also publishes a great newsletter with lots of additional tips and examples around noticing, and I’ve even sent him a few references that he’s included. I’ll put all this info in the notes for this episode. This topic takes me back to a workshop I ran a few years ago at one of the first user research conferences I ever attended, called About, With and For. That workshop was about noticing, and I wonder if it’s time to revisit it, with Rob’s book as a new resource.

Well, let’s get to the episode. I had a fascinating and in-depth conversation with Leisa Reichelt. She is the Head of Research and Insights at Atlassian in Sydney, Australia. Our interview went on for a long time and I’m going to break it up into two parts. So, let’s get to part one here. Thank you very much for being here.

Leisa Reichelt: Thank you for inviting me.

Steve: Let’s start with maybe some background introduction. Who are you? What do you do? Maybe a little bit of how we got here – by we, I mean you.

Leisa: I am the Head of Research and Insights at Atlassian, which is probably best known for creating software such as Jira and Confluence – basically, tools that people use to make software. And then we also have Trello in our stable. So, there are a bunch of tools that are used by people who don’t make software as well. A whole bunch of stuff.

Steve: It seems like Jira and Confluence, if you’re any kind of software developer, those are just words you’re using, and terms from within those tools. It’s just part of the vocabulary.

Leisa: Yeah.

Steve: But if you’re outside, you maybe have never heard those words before?

Leisa: Exactly. Atlassian is quite a famous company in Australia because it’s kind of big and successful. But I think if you did a poll across Australia to find out who knew what Atlassian actually does, the brand awareness is high. The knowledge of what the company does is pretty low, unless you’re a software developer or a sort of project manager of software teams in which case you probably have heard of or used or have an opinion about Jira and probably Confluence as well.

Steve: And then Trello is used by people that aren’t necessarily software makers.

Leisa: Correct. A bunch of people do use it for making software as well, but it’s also used by people in businesses running non-technical projects, and then a huge number of people just use it personally – planning holidays, or weddings. I plan my kids’ weekends on Trello sometimes and I know I’m not alone. So, yeah, it’s very much what we call a horizontal product.

Steve: A horizontal product can go into a lot of different industries.

Leisa: Exactly, exactly. I’m very skeptical about that term, by the way.

Steve: Horizontal? Yeah.

Leisa: Or the fact that it won’t necessarily be a good thing, but that’s another topic probably.

Steve: So, I can’t follow-up on that?

Leisa: Well, yeah, you can. Well, the problem with horizontal products, I think, is that they only do a certain amount for everybody and then people reach a point where they really want to be able to do more. And if your product is too horizontal then they will graduate to other products. And that gives you interesting business model challenges, I think. So, you have to be continually kind of seeking new people who only want to use your product up to a certain point in order to maintain your marketplace really.

Steve: When I think about my own small business, and any research I’ve done in small and just-under-medium-sized businesses, everything is in Excel, sort of historically. There may be a product, cloud-based or otherwise, to do a thing, but someone has built kind of a custom Excel tool to do it. So, is Excel a horizontal product that way?

Leisa: I think so, yeah. In fact, I was talking to someone about this yesterday. I think that for a lot of people the first protocol for everything is a spreadsheet. They try to do everything that they can possibly do in a spreadsheet. And then there are some people who are like, “ooh, new shiny tool. Let’s always try to find an application for a new shiny tool.” But I think the vast majority of people take the first tool that they knew that had some kind of flexibility in it. So, if you can’t do it in Word or Excel – people will compromise a lot to be able to do things in tools that they have great familiarity with.

Steve: Yeah. But from the maker point of view you’re saying that a risk in the horizontalness, the lack of specificity, creates kind of a marketplace for the maker of the tool?

Leisa: Can do. I think for it to be successful you just have to be at such a huge scale to be able to always meet the needs of enough people. But I think things like Excel and Word and Trello, for example, they’ll always do some things for some people. Like just ‘cuz you moved to a more kind of sophisticated tool doesn’t mean that you completely abandon the old tool. You probably still use it for a bunch of things.

Steve: So, your title is Head of Research and Insights?

Leisa: Correct.

Steve: So, what’s the difference between research and insights?

Leisa: Yeah, good question. I didn’t make up my own title. I kind of inherited it. If I remember correctly, the way that it came about was that when I came into my role it was a new combination of people in the team, in that we were bringing together the people who had been doing design research in the organization and the voice of the customer team, who were effectively running the NPS. And I think because we were putting the two of them together it sounded weird to say research and voice of the customer, so they went with research and insights instead. Honestly, I haven’t spent any time really thinking about whether that was a good title or not. I’ve got other problems on my mind that are probably more pressing, so I’ve just kind of left it. If you want to get into it, I think research is the act of going out, gathering the data and pulling it all together to make sense of it, and insights are, hopefully, the sense that you get from it. We do try to do both of those things in my team, so I think it’s a reasonably accurate description.

Steve: How long have you been in this role?

Leisa: It’s about 18 months now.

Steve: If you look back on those 18 months, what are the notable things? For the experience of people inside the organization, what has changed?

Leisa: Quite a few things. The shape and makeup of the team has changed quite a lot. We’re bigger and differently composed than we were previously. When I first came in it was literally made up of researchers who were working in products. Prior to me coming in they were reporting to design managers, and we, for various reasons, pulled them out of products pretty quickly once I started. And then we had the other part of the business who were running NPS that was being gathered in product, and we don’t do that anymore either. So, that team is doing quite different things now. We’ve changed a lot of things. We’ve introduced a couple of big programs of work as well, one of them being a big piece of work around top tasks – taking Gerry McGovern’s approach to top tasks, trying to build that kind of foundational knowledge in the organization. So, that’s kind of new. There have always been people doing Jobs To Be Done type stuff, but very, very close to the product level. So, we’ve tried to pull back a little bit to really build a bigger understanding of what the larger problems are that we’re trying to solve for people, and how we might see the opportunities to address those more effectively.

Steve: So, trying to create shared knowledge around what those top tasks are – I’m guessing for different products, different user scenarios.

Leisa: One of the things that we really tried to do is to get away from product top tasks and get more into really understanding what problems the product, or combinations of products, is trying to address. So, we don’t do top tasks for Jira. We do top tasks for agile software teams. And then through that we can sort of ladder down to what that means for Jira or Confluence or Bitbucket or Trello, or whichever individual or combination of products we have. But it means that we pull away a little bit from that feature focus. I think it can be very seductive and also very limiting.

Steve: What’s the size of the team now?

Leisa: I have a general rule in life of never counting how many researchers you have in the organization, because it’s always too many according to whoever is asking you – not you, but senior people. I think we’re probably around the mid-20s now.

Steve: And around the world, or around different locations?

Leisa: Mostly we’re in Sydney and California. So, we’re across three different offices there and we have a couple of people working remotely as well. So, I have somebody remote in Europe and another person in California who’s working out of a remote office too.

Steve: Can we talk about how the makeup of the team has evolved? Without enumerating, what are the backgrounds and skillsets – however you want to segment researchers, what’s the mix that you’re putting together?

Leisa: So, I think at the highest level we’ve got people who do design research, predominantly qualitative and they do a mixture of discovery work and evaluative work. We’ve got a team of what we call quantitative researchers and those are generally people who have got a marketing research type background and so they bring in a lot of those data and statistical skills that the other crew don’t necessarily have quite so much of. And then we have a research ops team as well who are doing some different things. And then we have a handful of research educators.

Steve: And what are the research educators doing?

Leisa: Educating people about how to do research.

Steve: These are hard questions and hard answers!

Leisa: Well, if you dig into it too much it gets complicated pretty quickly. So, I think the reality of research at Atlassian is the people in the research team probably do – best case – 20% of the research that happens in the organization and that’s probably an optimistic estimate as well. A huge amount of research is done by product managers and by designers and by other people in the organization, most of whom haven’t had any kind of training and so take a very intuitive approach to how they might do research. So, the research education team are really trying to help shape the way that this work is done so that it can be more effective.

Steve: I’m thinking about a talk I saw you give at the Mind the Product Conference where you began the talk – I think you kind of bookended the talk with a question that you didn’t answer definitively, which is: is bad research better than no research, or words close to that. It was a great provocation to raise the question and then talk about the different aspects of it – how we might answer it, what the tradeoffs are. When you talk about this research education effort I can’t help but think that that’s connected to that question. If 80% of the research is done by people acting intuitively then yeah, how do you level up the quality of that?

Leisa: Absolutely.

Steve: Which implies that – well, I don’t know if it answers – does that answer the question? I’m not trying to gotcha here, but if you are trying to level up the quality that suggests that at some point better research is – I don’t know, there’s some equation here about bad vs. better vs. no. I’m not sure what the math of it is.

Leisa: So, this has been the real thing that’s occupied my mind a lot in the last couple of years really. And I’ve seen different organizations have really different kind of appetites for participating in research in different ways. I think at the time that I did that talk, which was probably, what?

Steve: Almost a year ago.

Leisa: Yeah. I think I was still trying to come to terms with exactly what I felt about all of this, because as somebody in my role, a lot of people in the organization are going to be watching to see if you’re being overly precious or becoming a gatekeeper or a bottleneck or all of these kinds of things. So, I’ve always felt like I had to be very careful about what I enabled and what I stopped, because everyone has work that they need to get done, right. And the fact that people want to involve their customers and their users in the design process is something that I want to be seen to be supporting. I don’t want to stop that and I certainly don’t want to message that we shouldn’t have a closeness to our users and customers when we’re doing the research.

But, I’ve also seen a lot of practices that are done with the best of intentions that just get us to some crazy outcomes. And that really worries me. It worries me on two levels. It worries me in terms of the fact that we do that and we don’t move our products forward in the way that we could, and it worries me because I think it reflects really poorly on research as a profession. I think most of us have seen situations where people have said, well, I did the research and nothing got better, so I’m not going to do it anymore. Clearly it’s a waste of time, right. And almost always what’s happened there is that the way that people have executed it has not helped them to see the things that they need to see or make the decisions that they need to make. So, it’s this really hard line to walk – trying to understand how to enable, but how to enable in a way that is actually enabling in a positive way and is not facilitating really problematic outcomes. So, that’s, yeah – that’s my conundrum: balancing that. On the one hand I feel really uncomfortable saying no research is better than bad research. But on the other hand, I’ve seen plenty of evidence that makes me feel like actually maybe it’s true.

Steve: So, there’s kind of a time horizon there: the bad research may lead to the wrong decisions that impact the product and harm its prospects. And – I’m just reflecting back – it also harms the prospects of research itself going forward. Right, every researcher has heard that “well, we already knew that” response, which to me is part of what you’re talking about. When research isn’t conducted and facilitated in a way that gives people those learning moments – I think you said something about helping them see the thing that’s going to help them make the right decision. And that’s different than what methodology you use, or whether you’re asking leading questions. Maybe leading questions are part of it, because you confirm what you knew and so you can’t see something new, because you’re not surfacing it.

Leisa: Loads of it comes around framing, right. Loads of it comes around where are you in the organization? What are you focused on right now? What’s your remit? What’s your scope? What are you allowed to, or interested in, asking questions about? In a lot of cases this high volume of research comes from very downstream, very feature focused areas, right. So, if you’re working on a product that has got some more foundational issues that need to be addressed, but the vast majority of the work is happening at that very detailed feature level, how are you ever going to stop circling the drain? You get stuck in this kind of local maxima. How are you ever going to take that big substantial step to really move your product forward if it’s nobody’s job, nobody’s priority, to do that? So, a lot of this is kind of structural. So many of the people who are conducting this research are so close to the machine of delivery, and shipping and shipping and shipping as quickly as possible, that they don’t have the opportunity to look sideways and see what’s happening on either side of their feature, even when there are teams that are working on really similar things. They’re so heads down and feature driven. So, doing the least possible that you can to validate and then “ship to learn,” which is a whole other area of bad practice, I think, in many cases – it’s really limiting. And you can see organizations that spend a huge amount of time and effort doing this research, but it’s at such a micro level that they never ask about the problems that their customers are dying to tell them about, and the customers just answer the questions that they were asked, and that’s kind of what bothers me. In a lot of cases, some of it is about practice. I think it’s amazing how few people can resist the temptation to say, hey, I’ve got this idea for a feature, how much would you like it? They can get 7 out of the people they asked saying they loved my feature. That feels so definitive and so reliable, and that’s very desirable and hard to resist. So, yes that happens.

But the bigger thing for me I think is where research is situated and the fact that you don’t have both that kind of big picture view as well as that micro view. We just have all of these fragments of microness and a huge amount of effort expended to do it. And I don’t feel like it’s helping us take those big steps forward.

Steve: But Leisa, all you need is a data repository so that you can surface those micro findings for the rest of the organization, right?

Leisa: You’re a troll Steve.

Steve: I am trolling you.

Leisa: You’re a troll, but again, I mean in some – kind of yes, but again, like all of those micro things don’t necessarily collectively give you the full view. And a lot of those micro things, because of the way they’re being asked, actually give you information that’s useless. If all of your questioning is around validating a particular feature that you’ve already decided that you think you want to do and you go in asking about that, you never really ask for the why? Like why would you use – like who is actually using this? Why would they use this? What actual real problem are they solving with this, right? So, the problem becomes the feature and then all of that research then becomes very disposable because the feature shifts and then nobody uses it as much as everybody thought they would, or they’re not as satisfied with it as what they thought they would be. So, we just keep iterating and iterating and iterating on these tiny things. We move buttons around. We change buttons. We like add more stuff in – surely that will fix it. But it’s because we haven’t taken that step back to actually understand the why? If you’re talking about those big user need level questions, the big whys, then you know what, you can put those things in a repository and you can build more and more detail into those because those tend to be long lasting.

And then at the other end, just the basic interaction design type stuff that we learn through research. A lot of those things don’t change much either. But it’s that bit in the middle, that feature level stuff, is the most disposable and often the least useful and I feel like that’s where we spend a huge amount of our time.

Steve: Do you have any theories as to why, in large enterprise software companies, there’s such a heavy reliance on that kind of feature validation research? What causes that?

Leisa: I think there are probably at least two things that I can think of that contribute to that. One is around what’s driving people? What gets people their promotions? What makes people look good in organizations? Shipping stuff – shipping stuff gets you good feedback when you’re going for your promotion. They want a list of the stuff that you’ve done, the stuff that you’ve shipped. In a lot of organizations just the fact that you’ve shipped it is enough. Nobody is actually following up to see whether or not it made a substantive difference in customers’ lives. So, I think that drive to ship is incredibly important. And our org structures as well, like the way that we divide teams up now, especially in organizations where you’ve got this kind of microservice platform environment. You can have teams who’ve got huge dependencies on each other and they really have to try to componentize their work quite a lot. So, you have all of these little micro teams whose customer’s experience is all of their work combined, but they all have different bosses with different KPIs or different OKRs or whatever the case may be. And I think that’s a problem. And then the other thing is this like build, measure, build – what is it? I’ve forgotten how you say it.

Steve: I’m the wrong person.

Leisa: Build/measure/learn.

Steve: Yeah, okay.

Leisa: The lean thing, right, which is this kind of idea that you can just build it and ship it and then learn. And that’s – that’s – that means that like if it had been learn/build/measure/learn we would be in a different situation, right, because we would be doing some discovery up front and then we would have a lot of stuff that we already knew before we had to ship it out to the customers. But it’s not. It’s build – you have an idea, build/measure/learn. And then people are often not particularly fussy about thinking about the learn bit. So, we ship it, put some analytics on it and we’ll put a feedback collector on it and then we’ll learn.

What are you going to learn? Whatever. What does success look like? I don’t know. It’s very – it’s kind of lazy, and we treat our customers like lab rats when we do that. And I feel like they can tell. There’s lots of stuff that gets shipped that shouldn’t get shipped, that we know shouldn’t get shipped. I’m not talking about Atlassian specifically. I’m talking in general. We all see it in our day to day digital lives – people ship stuff when they really should know better, but this build/measure/learn thing means that unless you can see it in the product analytics, unless you can see thousands of customers howling with anger and distress, it doesn’t count.

Steve: It reminds me of a client I worked with a few years ago where we had some pretty deep understandings of certain points of the interactions that were happening and what the value proposition was and where there was meaning. Just a lot of sort of – some pretty rich stuff. And the client was great because they kept introducing us to different product teams to help apply what we had learned to really, really specific decisions that were going to be made. But the pattern we started to see was that they were interested in setting up experiments, not improving the product. And in fact, this is a product that has an annual use cycle. So, the time horizon for actually making changes was really far out, and some of the stuff was not rocket science. Like it was clear for this decision – A versus B versus C – the research said very clearly, this is what people care about, or what’s going to have impact, or what’s going to make them feel secure in these decisions. They were like, great, we can conduct an experiment. We have three weeks to set up the experiment and then we’ll get some data. And I just hadn’t really encountered that mindset before. I don’t know if they were literally build/measure/learn, if that was their philosophy, but I didn’t know how to sort of help – I wasn’t able to move them to the way that I wanted it done. I’d really just encountered it for the first time and it seemed like there was a missed opportunity there. Like we knew how to improve the product and I wasn’t against – they were conducting experiments and measuring and learning – awesome. That’s research. But acting on what you’ve learned seems like why you’re doing the research in the first place.

Leisa: It feels as though we have this great toolset that we could be using, right? We’ve got going out and doing your kind of ethnography, contextual type stuff. And then right at the other end we’ve got the product analytics and running experiments and a whole bunch of stuff in between. And it really feels to me as though organizations tend to really just get excited about one thing and go hard on that one thing. And growth – doing the experiments is a big thing right now. I see loads of situations where we look at data and we go, oh look the graph is going in this direction. Well, the graph is not going in that direction. Why? And there’s so much guessing behind all of that, right? And if it doesn’t go quite right, well then let’s have another guess, let’s have another guess, let’s have another guess. And this – like you say, there’s so much stuff that we probably already know if we could connect two parts of the organization together more effectively. Or, there are methods that we could use to find out the why pretty quickly without having to just put another experiment, another experiment onto our customers, our users. But the knowledge of this toolset and the ability to choose the right tool and apply it, and to apply these approaches in combination, seems to be where the challenge is.

Steve: What’s the role of design in this? In terms of here’s the thing that we know and here’s the thing we want to make. We haven’t talked about designers. Ideally for me, I sort of hope designers are the instruments of making that translation. Without design then you can sort of test implementations, but you can’t synthesize and make anything new necessarily.

Leisa: Well, I mean yeah. Theoretically design has a really important role to play here because I think design hopefully understands the users in a design process better than anybody else. Understands the opportunities for iteration and levels of fidelity for exploring problems in a way that nobody else on the team does. And lots of designers do know that. But the time pressure is enormous, especially in these kinds of larger tech companies where a lot of the time designers are also concerned about being bottlenecks. They have to feed the engineering machine. And it can be really difficult for them to have the conversations to talk about all of the work that we should be doing before we ship something. So, I feel as though they have a lot of pressure on them, a lot of time pressure on them. They’re being pressured to really contract the amount of effort that they put in before we ship something. And there’s this huge desire amongst product teams and their bosses to just get something shipped, get something live and then we’ll learn, that I think design really struggles with. And I don’t know – it can be really difficult in those kinds of environments to be the person who stands up and says we need to slow down and we need to do less and do it better. So, I have empathy for designers and their inability to necessarily shift the system – these problems – because of this big pressure that’s being put on them just to keep the engineers coding.

Steve: Right. If you say you want more time to work on something that’s – what are the risks to you for doing that?

Leisa: Exactly, exactly. On a personal level everyone wants to look good in front of their boss. Everyone would like a pay raise and a promotion and the easiest way to get those things is to do what you’re told and ship the stuff. Keep everyone busy, produce the shiny stuff, let it get coded up, let it go live, learn, carry on. That’s thinking short term. That’s how to have a happy life. Is that how you’re going to fundamentally improve the product or service that you’re working on? Probably not. But it takes a lot of bravery to try to change that, to try to stop this crazy train that’s out of control. Throw yourself in front of the bus. All that kind of stuff. Like you said, it’s hard, especially when there are loads of people all around you who are very happy to keep doing it the way that it’s being done right now. So, yeah. And I think that’s research’s role, that’s design’s role. I would like that to be PM’s role as well. And most engineers are – a lot of engineers are very, very, very interested in making sure that the work that they’re doing is actually solving real problems and delivering real value as well.

Steve: So, as you point to the individuals there are a lot of shared objectives. But if you point to the system – the rewards system and the incentive system, but also just what day to day looks like, the operations of the system of producing technology products.

Leisa: I think there’s also – there’s something about like what do people like to see? What gets people excited, right? And graphs pointing upwards in the right direction is really exciting. Like certain outcomes are really exciting. Ambiguous outcomes, really not exciting at all. Having things that go fast, very exciting. Things that go slow, not exciting. So, I think there are all of these things that this collection of humans that forms an organization has a strong bias towards, that we get really excited about, that don’t necessarily help us in the long run. You know you see lots of people who get really excited about the graph. Very few people dig in behind the graph to the data. Where did this data come from? How much can I actually believe it? Like people are so willing to just accept experiment findings without actually digging in behind it. And a lot of the time, when you do dig in behind it, you go huh, that doesn’t look very reliable at all. So, there’s something about we love to tell ourselves that we’re data informed or data driven, but a huge number of the people who are excited about that don’t have very much data literacy to be able to check in and make sure that the stuff they’re getting excited about is actually reliable.

Steve: So, how could someone improve their data literacy?

Leisa: I think that there’s a lot of work that we need to do to make sure that we actually understand how experiments should be structured. And understand more about – being more curious about where is this data coming from and what are the different ways that we could get tricked by this, right? And there are tons of books and papers and all kinds of things on the subject. But actually, when you go looking for it and you’re coming at it with a critical mind, rather than with a mind that gets excited by graphs, a lot of it is pretty logical. When you think about, like, surveys, right? Who did this survey actually – who actually answered these questions? Instead of just going hey, there’s an answer that supports my argument, I’m just going to grab that. To dig behind it and go, are the people who are being surveyed actually the same people that we’re talking about when we’re making this decision? Is there something about the nature of that group of people that is going to bias their response? It’s remarkable to me how few people actually check the source to make sure that it’s worthy of relying on. A lot of people are really keen to just grab whatever data they can get that supports their argument. I think this is another one of those human inclinations that we have that lead us towards kind of bad behaviors.

Steve: I can imagine that with this research education effort that you’re doing, people who participate are going to learn how to make better choices in the research that they then run – some practical skills, some planning, some framing, as you talked about. But it seems like that literacy is a likely side effect of that, or maybe it’s not the side effect. Maybe it’s the effect. How to be better consumers of research. Once you understand how the sausage is made a little bit then you understand, oh yeah, that’s a biased question, or that’s bad sampling, and there’s a choice to make in all of these and we have to question those choices to understand the research that’s presented to us. I hadn’t really thought of training people to do research as a way to help them in their consumption or critical thinking around research.

Leisa: Something else that we’ve really come to realize recently as well is that because of all of the other pressures that the people who are making these decisions are having to deal with as well, we think we’ll do much better if we train the entire team together instead of just training the people who are tasked with doing the research. Because something that we’ve observed is that you can train people and they can go this is great – I really see what I was doing before was creating these kind of not great outcomes and what I need to do to do better. And then you’ll see them, not that long later, doing exactly the opposite to what we thought we had agreed we were going to do going forward. And we’re like well what’s happening? These are like smart, good people who have been given the information and are still doing crazy stuff. Like what’s going on with that? And you realize they go back into their context where everyone is just trying to drive to get stuff done faster, faster, faster, faster, and you have to plan to do good research. The easiest research to do really, really quickly is the crappy research. If you want to do good research you do have to do some planning. So, you need to get into a proactive mindset for it rather than a reactive mindset for it. And you need the entire team to be able to do that. So, one of the things that we’re looking to do moving forward is not just to train the designers and the PMs, but actually to train a ton of people all around them in the teams to help them understand that the way that you ask the questions and who you ask them of and where – all the different things that could impact the reliability of your research – requires planning and thinking about in advance. So, we hope that means that the whole team will understand the importance of taking the time to do it and it won’t just be like one or two people in a large team fighting to do the right thing and being pressured by everybody else. So, I think it is – the education aspect, I suspect, is super important and it goes way beyond just helping people who are doing research to do better research. It goes to helping the whole organization to understand the impact and risk that goes with doing a lot of research activity in the wrong way or the wrong place.

Steve: Just fascinating to me, and I think a big challenge for all of us, that lots of researchers are involved in education of one form or another – workshops – a big program like you’ve been building. But most of us are not trained educators. We don’t have a pedagogical, theoretical background about – it’s about communicating information. It’s about influence and changing minds. And it just seems like a lot of researchers, from an individual researcher on a product team to someone like you that’s looking at the whole organization, we’re sort of experimenting and trying to build tools and processes that help people learn. And learn is not just imparting the information, but reframe and empower, like these big words where – I wish I had – I wish you had a PhD in education. I wish I had that. Or that I had been a college professor for a number of years, or whatever it would take – however I would get that level of insight. Personally, I have none of that. So, to hear us – you know we all talk about these kinds of things. I think research gives you some skill in prototyping and iterating and measuring in order to make the kinds of changes in the implementation that you’re making. I don’t know about you. I feel like I am amateurish, I guess, in terms of my educational theory.

Leisa: Absolutely. And I think talking to the team who are doing this work, like it’s really, really, really hard – really hard work to come up with the best way to share this knowledge with people in your organizational context, in a way that is always time constrained. At Atlassian we have a long history of doing kind of internal training. We have this thing called bootcamps. We’ve got almost like a voluntary system where people come in and they run these bootcamps and you can go and learn about the finer details of advanced Jira administration or you can come in and learn about how to do a customer interview. But the timeframe around that was like – like a two-hour bootcamp was a long bootcamp. Most bootcamps are like an hour. And so, when we started thinking about this we were like we’re going to need at least a day, maybe two days. And everyone was like nobody will turn up. But yeah – fortunately we have – we do day long sessions and people have been turning up. So, that’s great. But yeah, it’s a huge effort to try to come up with something that works well. And every time we do these courses the trainers and educators go away and think about what worked and how can we do it better. So, we iterate every single time. So, yeah, it’s a huge amount of effort. I think in larger organizations too, there are other people in the organizations who are also tasked with learning type stuff. So, we have a couple of different teams at Atlassian who are kind of involved in helping educate within the organization. So, we’re building a lot of relationships with different parts of the org than we have before in order to try to get some support and even just infrastructure. Like there’s just a whole lot of like logistics behind setting this up if you want to do it in any kind of scale. It’s great to have that support internally and it’s really good to start to build these relationships across the organization in different ways. But yeah, I think that we certainly underestimated the challenge of just designing what education looks like and how to make it run well. It’s a massive, massive effort.

Steve: It’s not the material – you guys probably have at hand a pretty good set of here’s how to go do this. If you brought an intern onto your team you could probably get them up to speed with material that you have, but you’re trying to change a culture. And I think the advantage that I have, the context that I have as an external educator, is that people are opting in, and I can go by the assumption that they want to be there, and I don’t have access to what the outcomes are. I might through someone that’s my gatekeeper, but it’s kind of on them. I have the responsibility for the training, but not the responsibility for the outcomes, which is what you all are working with. So, I envy you and I don’t envy you.

Leisa: Well, I think I – I like it because, you know, going back to the build/measure/learn thing, right. Again, we did a learn before we did our build and measure because we’re researchers and that’s what we do, but it is – it’s super interesting to see the behaviors of people who have come through the training and see whether they shift or not. That gives us – we get feedback from people after the course who tell us whether they thought it was useful or not useful and what they liked and didn’t like, but then we actually get to observe because through our ops team, they come through us to do their recruiting and get their incentives. So, we can keep a little bit more of an eye on what activity is coming out and see if there is any sort of shifting in that as well. And it’s not just courses as well. It’s thinking about like what’s the overall ecosystem? What are other things that we can be doing where we are sort of reaching out to help support people on this journey as well. Before we did educating, we had advisors who kind of made themselves available to go and sort of support teams who recognized that they might have a need for some help with getting their research right. So, that was kind of our first attempt. But we had to pivot into educating because of the time factor. We would go in and give advice and everybody would go it’s great advice. We’d totally love to do that, but I have to get this done by next Thursday. So, I’m going to ignore your advice for now and carry on with what I was going to do anyway. And that was pretty frustrating. So, we felt like we have to invest in trying to get ahead of the curve a little bit – try to get ahead. Try to not necessarily influence the stuff that has to happen by next Thursday, but try to encourage teams to start being proactive and planning to do research instead of doing this really reactive work instead. Or as well as the reactive work perhaps. I don’t know.

Steve: Have the right mix.

Leisa: Yeah.

Steve: I wonder if – and this is probably not going to turn out to be true, but I wonder about being proactive and planning versus time. The time pressure to me is about oh we only have so many hours to spend on this, or that calendar wise we need to be there next Thursday. But being proactive says, well if we started thinking about it three weeks ago we’d be ready for it Thursday to do it “the right way.” I’m wondering, can we tease apart the pressures. One is like no proactivity, the sort of very short-term kind of thinking, is different than hours required. Is that true?

Leisa: I think so. Because I think even when we do the short-term stuff we still spend quite a lot of time and effort on it. And the planning in advance, the more proactive work, doesn’t necessarily, I don’t think, entail more actual work. It just might be that you put in your recruitment request a couple of weeks beforehand so that we can try to make sure that the people that you meet are the right kinds of people, instead of, if you have to have it done in 3 or 4 days’ time, then your ability to be careful and selective in terms of who you recruit to participate in the research is very much limited. So, when we have those kinds of time constraints you see everybody just going to unmoderated usability testing and their panel, and that introduces a whole lot of problems in terms of what you’re able to learn and how reliable that might be. Yeah. I was thinking for a second about, you know, theoretically when you do your unmoderated usability testing you should still be watching the videos, right. So, that should take as much time as watching a handful of carefully recruited people doing these sessions in a facilitated way. But the reality is, I think, that most people don’t watch the videos, which speaks to quality.

Steve: Here we are back again. It seems like there’s a, for the profession overall, maybe one way to start sort of framing the way around this time pressure thing is to decouple proactiveness versus sort of hours burned. That it’s going to be the similar number of hours burned, but if you start earlier the quality goes up. I had never really thought about those two things as being separate.

Leisa: Yeah. And I don’t think people do. I think when people think about – and this is – I mean I sort of said the word quality. I’m trying not to say quality anymore. I’m trying to talk about meaningfulness more now, I think, because whenever you talk about quality the pushback that you get is, well, it doesn’t need to be perfect, it doesn’t need to be academic, I just need enough data to be able to make a decision – and I understand that. But then I see the data upon which they’re making the decision and that makes me worry, right? And I think that’s – we want to get out of this discussion of quality levels, more into reliability levels, like how reliable does it need to be? Because surely there has to be a bar of reliability that you have to meet before you feel like you’ve got the information that you need to make a decision. But I see loads of people making decisions off the back of really dreadful and misleading data and that’s what worries me. And they feel confident with that data – going back to the data literacy problem. Like they really haven’t dug into why they might be given really misleading answers as a result of the who and the how and the why, all those decisions that are made around how to do the research, most of which have been driven by time constraints.

Steve: Okay, that’s the end of part one of my interview with Leisa. What a cliffhanger! There’s more to come from Leisa in the next episode of Dollars to Donuts. Meanwhile, subscribe to the podcast at portigal dot com slash podcast, or go to your favorite podcasting tool like Apple Podcasts, Stitcher or Spotify, among others. The website has transcripts, show notes, and the complete set of episodes. Follow the podcast on Twitter, and buy my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media or Amazon. Thanks to Bruce Todd for the Dollars to Donuts theme music.

May 22 2019

47mins


Rank #7: 16. Marianne Berkovich of Glooko


In this episode of Dollars to Donuts, I speak with Marianne Berkovich, Head of User Research & Consumer Insights at Glooko. We talk about doing research through leadership changes, setting up opportunities for self-critique, and how to build empathy, especially in health technology, by experiencing some aspect of the condition and treatment yourself.

It really bothers me when smart people go out and build things and spend a lot of time and energy to build things that are not for humans. And I’m like, erh, why didn’t they do that? For me it’s more about empowering people who have that energy and who have that entrepreneurial spirit to make the things that are right. Not just make stuff, but actually, make the right thing. It’s a set of skills that I think maybe everybody should have and maybe once everybody has those skills and can do it well, maybe the role of researcher doesn’t need to exist. Until then, I feel like it’s my duty to go out and spread the gospel, as it were, of this is how you talk to users. – Marianne Berkovich

Show Links

Follow Dollars to Donuts on Twitter and help other listeners find the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Well, hi, and welcome to Dollars to Donuts, the podcast where I talk to people who lead user research in their organization.

I was reading the New York Times today and I noticed something unusual, although I’m seeing this sort of thing more and more. After the byline, but before the article starts, is some italicized text in square brackets. It reads “What you need to know to start the day: Get New York Today in your inbox.” This is copy that only belongs online, in the app or the website or an email newsletter. Presumably you would click on something. But I’m looking at the newspaper – I feel like I’m starring in one of those YouTube videos where a toddler is trying to swipe a magazine and can’t figure out why it’s not a touch screen. I see the New York Times making this kind of error in their print edition every few weeks, and it’s kind of appalling, because it suggests a lack of attention to detail that I don’t expect from a high quality product like the Times. When you get an email that has the wrong name in the salutation, even though we’ve all done it ourselves, it brings your appreciation down a notch or two. They clearly aren’t taking the care that they used to, and that we would hope for.

And it’s especially interesting because I remember when the opposite used to be true; when the experience we had online, with news especially, was a not-quite-there translation of a print experience. And now it’s flopped. The print edition readers are not the primary customers. The organization has identified a different key user and it’s not us.

I can’t help but wonder about any of the users we learn about, are they in a less desirable category or perceived that way because of changes in internal processes or organizational structure? Do we care about their experience, or are we making them into what we call “edge cases” which is a fancy way of dismissing them. It’s hard to imagine the print reader of the New York Times as an edge case, but hey, that’s where we are.

I want to remind you that I’m looking for ways to be able to keep making this podcast for you. Here’s how you can help. You can hire me! I plan and lead user research projects, I coach teams who are working to learn from their customers, and I run training workshops to teach people how to be better at research and analysis. I’ve got two books you can buy – the classic Interviewing Users and Doorbells, Danger, and Dead Batteries, a book of stories from other researchers about the kinds of things that happen when they go out into the field. You can rate and review this podcast on iTunes, and you can review both books on Amazon. With your support, I can keep doing this podcast for you.

Let’s get to my interview with Marianne Berkovich. We did this a little differently – as part of a live event. We closed out San Francisco’s local edition of “World Information Architecture Day” with our interview, live on stage, in front of an audience. Then when we got off stage, we sat down in a somewhat noisy room and talked a little more to cover some of the things we didn’t have time for. We’ve cleaned it up as best we can but the audio may be a little bit different from how things normally sound. It was really fun for the two of us to speak on stage, and I hope to have more opportunities to record episodes of the podcast in similar settings.

Marianne is the Head of User Research & Consumer Insights at Glooko. She’s worked as a consultant and for Google and Adobe.

Thanks for agreeing to talk with me, Marianne. Why don’t we just start by having you introduce yourself. What do you do? Where do you work? Tell us about that.

Marianne Berkovich: My name is Marianne Berkovich and I am Head of User Research & Consumer Insights at Glooko and we’re an online diabetes management platform.

Steve: What is an online diabetes management platform?

Marianne: That’s a great question. So, diabetes is a condition that’s a lot to do with numbers. You want to keep your blood sugar not too low, not too high. So, it’s really conducive and lends itself well to technology and checking your numbers. We have an app for the person with diabetes. They can track all sorts of things there that could affect their blood sugar and then they can also send that information to their clinician who can further see patterns of you’re high in the mornings and what are we going to do about that? So, it’s a platform that both the clinician can access and look at those things as well as for the patient themselves.

Steve: Where in the history of this company and its product did you come in, to bring in user research?

Marianne: So, I am the first user researcher. The company was founded in 2010. I’ve been there for about a year and a half and I am learning all sorts of things of what it’s like to be, first in a healthcare company, or a health tech company, and also being the first researcher.

Steve: Okay, let’s see how easy this is. What are some of the things you’re learning about being the first researcher and working in a health tech company?

Marianne: First of all, as a researcher, it just feels very weird to be answering the questions and not asking the questions. Maybe I answer questions in a way that makes it easier for the researcher to ask the next follow-up question. So, but you know what a great researcher Steve is, so this is showcasing his talent too, right! So I think some of the things that I’m learning is the role of advocacy. That even though there was a need for hey it’s time for a researcher, we need someone full-time – before they were hiring vendors or kind of doing it ad hoc. And so, it was time to have somebody who could be dedicated and who’s trained in this. But at the same time there’s a lot of – I don’t want to say – in some cases there was resistance, but I think in a lot of cases it’s more of just not knowing what good research looks like, or part of the reason my title is so long is that people had an impression that oh, user research is just usability studies. And so, I talked with my manager and we put in the “& Consumer Insights.” Head of User Research & Consumer Insights – so it’s like it’s everything. It’s going to be going out and doing field visits. It might be surveys. It might be a lot of different things. So, to kind of help with that advocacy. And I feel like I do spend – I mean, I think as researchers we always spend a lot of our time in advocacy mode, but I’m not surprised – or maybe surprised – it’s just taking a lot more of my effort to do that advocacy work of what research is and how can it be used.

Steve: I just want to clarify. You’re advocating for user research?

Marianne: Yeah. So, I think even though the company talks about being patient centered and user centered, that what does that really mean? I think one of the things that I’m finding out – so, there’s a lot of people who have been in the field for a long time and they’re like we understand diabetes. We’ve been in this for many, many years. And to have somebody come in – I don’t have any experience with diabetes; I don’t have diabetes myself – to come in with these, you know, different insights or different ways of doing things and people are like, we’ve been there, we know how to do this. And so, to advocate for, hey, we went out and talked to some people. We learned this thing that’s different and new and it’s maybe against the conventional wisdom, how might we use this? And maybe some of that resistance of like, well we’ve always done it this way. Or conventional wisdom says this other thing.

Steve: So, the other part that you were learning was working in healthcare tech. It’s a new industry for you. So, talk about maybe what that’s been like.

Marianne: Yeah, so – and before that I was – I did a fellowship and we were working in the legal space. And the legal space, the health tech space, are slower moving than just straight up consumer products. It’s like hey let’s build something and we’ll launch an app and do these things. You know regulated spaces of like what is the FDA like? I had a little bit of background in human factors testing, but we have a product that actually got FDA approved. But what does human factors testing look like? How is that different from just regular usability testing? What does it mean to understand the whole ecosystem of payers and employers and insurance companies and PDMs and all that type of thing of moving that needle and kind of inching things forward when you’re in this space that can’t be disrupted so easily because there’s good reasons why these constraints exist.

Steve: So, the company was about 8 years in existence when you joined. So, how much of that were you building from scratch? Describe a little bit about what you encountered and what you had to build around sort of the tech and regulatory aspects maybe.

Marianne: So, actually, luckily the regulatory aspect is just one part of what we offer. It’s the Mobile Insulin Dosing System. So, when you start on insulin, how do you come up to taking the right dose of a certain type of insulin? So, the product does a lot more than that. So, this was just one small part of it. And it was actually – that was already in the works when I joined. So, the app was already out. The website was already out. MIDS was already sort of in progress. So, I think for me it’s been a role of a little bit of back to basics of like let’s learn about our users. How are people with Type 1 diabetes different from people with Type 2 diabetes? So, in fact, I was telling Steve before this, I’m doing an ethnography right now and so I was in Fresno this week talking to folks with Type 1 diabetes. I’m in Phoenix on Monday, but kind of building that basic understanding and actually building personas and really having a deeper understanding rather than like, yeah well, we have people now who are Type 1s, we know what they need. To move beyond that, to really have some foundational knowledge, and it’s causing us to go back and be like okay, well, are the things in the app working the best way they could? Maybe we can make the clinician experience more efficient. Maybe there’s things that we learn when we go to visit a clinician – be like oh, that’s not how we thought that worked, can we rethink that. So, I think it’s a little bit of back to basics and rethinking certain things rather than building anything from scratch.

Steve: Before you talked about advocacy which was kind of saying hey, we need to learn these kinds of things, but in some ways doing that reveals the most scary thing of all – oh we need to rethink assumptions and yeah, a 10-year-old company, it’s entrenched. So, what happens when the things that you’re uncovering are inviting, challenging – they’re inviting the challenging of established belief structures around the product and what it looks like and everything. How are you doing that?

Marianne: I think it leads to another kind of wrinkle of working in a startup, which is that we’ve had a lot of turnover at the leadership level, actually. So, the CEO who was the CEO when I joined was not the founder. He was the second CEO. And he reached a point where he was like, “you know what, I really liked getting the company to where it is now, but now that it’s time to scale and grow things, it’s not what I want to do.” So, we got a new CEO. We got a new head of product. We got a new commercialization officer. So, all that advocacy I had done, and like when we did the Type 2 ethnography when I first started – taking people on the road and showing them and doing these empathy workshops and all that – whoosh, out the window because we’ve got a whole new cast of characters now. So, I think that was part of it too. It was a little bit humbling to have to re-create my credibility again with a whole new cast of characters and a whole new set of people to influence. Luckily my manager has been a great champion of that. And so, I think it’s finding ways that we can leverage, and sort of maybe it’s not the right time for things. So, we’ll go out and do this research and we have this foundational stuff. We sort of did a big aha, then those people went away. Can we save that and find another time to bring that to the fore when it might be a little bit more accepted? So, I think it’s both pushing for it and finding the right time to introduce those things.

Steve: So, what does a manager do? How does that championing work?

Marianne: I think – and I don’t know if this is true, but I feel like as a woman, and she’s also a woman – for women in general, it’s much more effective if instead of me saying it, I say, “yeah, what Steve said,” or, you know, what somebody else said. So, I think because of that, it becomes more effective. She’s not tooting her own horn. I’m not tooting my own horn, but she is sort of pointing to the work that I’m doing. Plus I think we have different kinds of communication styles, and sometimes I’ll say a bunch of stuff and she can synthesize it in a really nice way and it becomes a little bit more impactful that way, that like I can spend all my time sort of yelling and screaming – metaphorically speaking – and then she can bring it home in a very succinct way.

Steve: So, part of what she’s doing then is also – you talk about finding the right moments. Or, what should we be doing to have impact on the organization as the layers above us are changing. I don’t know. I’m wondering about when new people come in, and this is going to vary on a case by case basis, but you’ve gone through this whole series of trying to open people’s minds up a little bit and say hey things are different. Those people leave. New people come in, but is there an opportunity there? Do those people – what baggage do they have, or do they bring in? I’m not asking you to like slag anybody individually, but as you see these kinds of changes happen in an organization and you’re trying to craft a story that’s about how the world really is vs. maybe what we hope or assume – how does the – what changes when new people come in, with or without baggage, around what the truth is?

Marianne: I think part of it is understanding where people are coming from. Because I think in my last role – I was at Google for a long time and I think like certain assumptions I made that like everybody knew what user research was and everybody knew this and I kind of got used to that, that that was the norm. Like product managers of course know what I do, and everybody knows how this works. And so, I had to figure out like, oh, okay, I can’t assume that people know what these things are, or that when I say user research and what you think in your experience with user research, like maybe that was focus groups. And they’re like yeah, we did those and it was great and I was like okay, great that’s a start. So, I kind of know where I’m starting from with folks. So, I think that’s part of it. And I think also, because I am the only researcher and things move a little bit more slowly, like we did the stuff around Type 2s and now we’re doing the Type 1s. And so now I’m using all that stuff that I learned last time of like what was effective in terms of running a workshop and getting people to come on visits with me. And it’s like it’s a little bit smoother again this time, but we’re doing that again this time around. So, it seems that having that opportunity to redo that foundational research has resurfaced itself. And so, to take advantage of that opportunity, that it’s like well, let’s look at Type 1s now.

Steve: There’s something here about you’ve had lots of experience. This is not your first job. You’ve been at lots of different organizations, you’ve done lots of research and influenced stakeholders and product teams and so on, but it sounds like that there’s a good measure of learning in this job which is about how do I do the thing that I know how to do to be effective in this context. Does that ever go away for researchers, do you think?

Marianne: I hope not. I think it’s – and I think that that’s part of the reason that, you know – one of the reasons I left Google is it took me a while to realize that the things I’m really passionate about are not things that I could do at Google. But like who leaves Google, right. So, I think finding ways to find my path and things that are not sort of the traditional way of moving up in, you know, you’re a junior researcher, then you’re a senior researcher and then you’re a manager and all these things. To find my own path and sort of be okay with that and find different ways to learn in different contexts. Like the fellowship that I did with Blue Ridge Labs which is a social impact incubator. And I was again, the only researcher, and so I got to do more mentoring. And so that was interesting. So, I think finding ways to both follow my passion and find ways that like what does this organization look like? And I think we’ve probably heard this all before in terms of doing sort of user research or bringing that lens to our stakeholders or the people that we’re working with as well. I think that part never goes away and it’s always changing because it’s always a different set of people.

Steve: Right, more so – is the landscape changing more in a place like Glooko than in a place like Google?

Marianne: That’s a good question. I think it’s very different. Google’s obviously a very – there’s just many people. And so, I think the dynamics of what it’s like to have an organization that big – when I joined I knew all the researchers and now, at that scale, each product’s team is much larger than that. So, I think it’s a different type of – it’s a different set of issues. So, I think wherever you go – like I’m from the East Coast and people ask, well what’s better, East Coast or West Coast, or all this stuff? And I’m like they’re different, right. I don’t think there are things that you can really compare. So, for me – and this is my first time working in a startup, so I don’t have another startup to compare it to. So, maybe if I went to a different startup I could do more of that comparing and contrasting, but in some ways, it feels like apples and oranges of a large organization that’s established and, you know, is well funded and all that vs. a smaller one that is working in a very different space.

Steve: Just hypothetical – I know we’re not supposed to ask – apparently you aren’t supposed to ask hypothetical projection questions in user research, so good thing this is not user research. If you were to look for a job at another startup, I’m just thinking about you joining an 8-year-old company, is there a maturity that you would look for, or a timespan that you would think differently about in the next stages of your career?

Marianne: I think it’s less for me about that. It’s more about really getting jazzed about the problem that I’m solving. So, for me diabetes is a very big issue. Thirty million people in the U.S. and growing. And so that feels like a real meaty issue and that gets me excited. So, I’m like even if some days it feels like I’m pushing a boulder uphill, it seems like a worthwhile thing and that’s what gets me up in the morning. So, for me it’s much more about that and going in with eyes wide open of like well what kind of organization is it, and like okay do I want to take that on. And if the answer is yes then working around those constraints because that’s just the nature of the beast.

Steve: You talked before about figuring out the right job title that would describe, to the rest of the organization, the way that you were going to work. As you – can you say more about sort of what that process was, what those conversations were that identified an opportunity, that helped get you excited that this was something you wanted to do?

Marianne: Well honestly, I think part of it was naivety, that like this was – I think it was actually the way it was posted or sort of written up was Manager of Consumer Insights and I’m like okay, whatever. You know, I’m not on a ladder. I’m the only one, so it doesn’t really matter. So, for me, I was so excited – in fact I was consulting before and so Glooko was one of my clients and I got hired. So, we crafted the role a little bit based on what I was doing. And so for me it was less about negotiating the right title and in fact I think only after I joined – because I was like, sounds great, like it sounds like this is a good fit, sign me up, and then only when I started and I was like hey, I want to make business cards, but I don’t really like this Manager of Consumer Insights – we had that conversation after I joined. And so that’s when my manager started telling me about like hey, this is some of the perceptions because I had gone in very naively also of like everybody knows what user research is. Like, we’re in Silicon Valley, like everybody knows what this is. And so, it didn’t even really occur to me that that advocacy, or the extent of that advocacy, would need to be done.

Steve: So, what – you worked with the organization as a consultant and then came in-house to kind of lead the effort. What was similar and different about – you know, the before and after that transition?

Marianne: So I think one thing is, it was very intimidating. I didn’t know anything about diabetes and so I ran my first study and like everybody showed up and so they were listening in on the call and I’m like how’s this going to go? Like I hope I don’t say anything stupid. And so, you know I think as a consultant, coming up to speed on a different domain very quickly, and I think to me it’s about asking those open-ended questions and asking questions in a way that it doesn’t really matter if you know the domain. I mean it’s certainly much better if you do, but there are ways that you can sort of cover that up, and especially if you’re going in with an apprentice mindset and all those types of things, it kind of helps the situation along a little bit. So, I think for me that was definitely intimidating to not know the domain and have everybody show up.

I think another difference is, as a consultant – and I wasn’t a consultant for very long, it was about a year or so before I decided to take the role with Glooko – I was very cognizant of how I was spending every single hour and whether it was going to lead to making money or not, because is this going to generate a lead? Is it going to generate a sale? All those types of questions were very much top of mind whereas when you’re in-house I think you can spend the sort of hours – like you’re spending your capital in a different way. You’re building relationships and that’s kind of how you’re sort of making money. So, to me it’s sort of a different way that you don’t have to be so aware of every hour leading to money.

Steve: I’m going to switch gears a little bit and maybe we could just rewind. I’d love to hear you describe maybe your path. How did you get into this stuff? What brought you to where we are sitting today?

Marianne: So, I was an English major for undergrad and after I graduated I didn’t know what I wanted to do. So, I spent a couple of years – I actually at that point did work. I was in D.C. and I worked at the National Museum of American Art. We were digitizing the collections. It was a very long time ago and so I was actually cleaning up all the photos and I’m like eh, I’m not really into this technology thing. My Dad also was a computer science professor. I’m like definitely not into that technology thing. Like that’s stuff that my Dad does. Boring. So, I spent a couple of years kind of bumming around a little bit, decided to move out to Denver and I started working as a technical writer. And I was working at a financial services company and so my role was to write the help text for this complex financial software. So, I would sit with the developers. They would explain to me how the complex software worked, and I would write it up. And then I had a thought. I’m like if we just made the software easier to use I wouldn’t have to write anything. So, that led me to first get a certificate in technical communication. And in that I started kind of looking around a little bit more and found the whole field of human centered design and human computer interaction. I looked around and I found a program at Carnegie Mellon. And at that point I had actually switched over to Lockheed Martin, which is a great role – you can ask me about that in a second. Sorry, I just keep getting the questions to ask. And, but I – you know I decided that I really needed a degree in this thing. That just kind of reading books about it or whatever wasn’t enough. So, I decided to go to grad school and I remember just seeing that description of the master’s program and I was like that’s exactly what I want to do and like all my life had been leading to that point to be like that’s exactly what I want to do and having that feeling of like, yup, this is the direction I want to head in.

Steve: Which master’s program? And then keep going. Yes, please.

Marianne: It was a master’s in Human-Computer Interaction at Carnegie Mellon University. So, the role at Lockheed Martin. So, I’m old. I’m definitely over 35. And this was sort of in the olden days and what we were doing is taking – so, you know how we have wildfires and all that type of stuff? So, the resources to manage all those things, like sending the air tankers and the trucks and the crews and all that – that was being done by hand. So, Lockheed Martin got a contract to turn that into – it wasn’t even online. It was just kind of a digital program to do that. So, we would interview subject matter experts and I was like a requirements analyst. Like we didn’t have roles of like designer, researcher. Like that wasn’t a thing yet. And so, for me it was like super exciting to solve a real problem. We got to actually go to Boulder and like see the command center and see some of those things. I mean I really felt like I was making a difference and I knew what this was about. But I felt very much at a disadvantage of like, I don’t know what should the icons be? I don’t know. Nobody had best practices around icons yet. And so, it felt like early days and that’s why I really wanted to go back to school and like learn some stuff because I knew there was a lot that had been done already.

Steve: So, you come out of that program and where do you go?

Marianne: Consulting. Yeah, I actually stuck around Pittsburgh for a while and I again ran my own consulting thing. I do that to sort of – when I’m in exploration phase. Or, when I’m dating somebody who just started a PhD program and I can’t leave Pittsburgh. One or the other. So, yeah, and the guy I was dating, he was very entrepreneurial, and he was like such a great cheerleader. He was like yeah, we can figure this out. He had been running his own startup before he started the PhD program and he was like, yeah you can do this. I’m like, yeah, I can do this. So, he did help me a lot and so I was finding projects and was kind of getting my feet wet with that. But then I found that it was time to leave Pittsburgh. That it wasn’t what I wanted to be doing. I really wanted to learn from other people and so I started looking for other opportunities.

Steve: What were those opportunities that you found?

Marianne: You can take it in a different direction. So, I actually interviewed at Google at that point and they didn’t want me, I didn’t want them. It was a very different company at that point. But I also interviewed at Adobe. There’s actually a pipeline from Carnegie Mellon straight here to Silicon Valley. I mean seriously, just pull up a bus, just put the graduates on it and bus us all out here. So, there were a bunch of people who I had gone to grad school with who were at Adobe and I went to interview there, and it felt like a really nice fit. I remember actually – after the Google interview I came home to my hotel and I just turned on the Simpsons and like all I could do was just sit there and like not move. But after the Adobe interview, I actually – it was one of my first times out here in Silicon Valley and like I went hiking. I was like I was that inspired and that energized that I went hiking. I’m like this is a sign that maybe I should take this role, that this is a good role for me.

Steve: We can cue up some Simpsons for you after this conversation. You know, just put you in a dark room. Um- so, when you talk about the work that you’re doing now, you’re doing field work, you’re doing ethnography, you’re taking people out. When did you learn how to do that?

Marianne: That’s a good question. I feel like I should credit CMU with setting me on the path and teaching me about contextual inquiry – thank you Bonnie John. And I think for me one of the things that they told us at Carnegie Mellon is – it’s a one-year program, so if you didn’t come in as a computer scientist you’re not leaving as a computer scientist. If you didn’t come in as a designer, you’re not leaving as a designer. But we’re going to give you enough exposure to coding and designing and having your stuff on a wall and getting critiqued and to do some social science stuff. And so, for me, having been an English major and I minored in art history and theology, I mean I was as liberal arts as you could get. I wasn’t going to come out as any of those things and I didn’t gravitate towards it. I was like I’m not a designer. So, I think I just gravitated towards research. And I think maybe the English degree kind of prepped me for that in terms of asking good questions and thinking about things and looking for patterns. So, I think just naturally I’m a pattern seeker and I think honing my ability to sort of ask questions just came with time. So, I think it was having a good solid foundation and then just kind of having a natural affinity for it. And I read your book – much, much later, but I did read your book.

Steve: Right. By the time my book came out you’d been doing this for a while. Can you talk about the environment at Adobe? What was there to develop the practice of user research? For practitioners like yourself that are coming in with X amount of experience, what did they do well that let you get to the next level of your craft?

Marianne: I think actually I was really lucky at both Adobe and Google, being surrounded by really smart people. So, that made it okay of like I don’t know how to do this, and can someone help me? Like I learned how to do surveys through some of the really awesome people at Google. So, I think for me it was just being surrounded by really supportive people and I think that was one of the things that was part of the culture at Adobe. Like we would go to my manager’s house. She would have like all the researchers at her house and we would do these offsites – and it really felt like family. I mean it sounds a little cheesy, but – and so it felt like it was a place to grow and learn. So, I think for me that was a great stepping stone as my first sort of real job after graduate school, to be like oh, okay, this is what being a researcher is like. Oh, this is how you interact with product teams. So, I think it was just learning all those basics. But I think one of the things that – kind of coming back to the question of learning to do ethnographic work and all that is once you’re out – and especially you’re probably one researcher. You’re rarely lucky enough to work with another researcher who can sort of observe you and say like hey, why are you asking that question? Or, maybe we could do it differently. One of the things that I found, when I did have some opportunities at Google to work with other researchers, I’m like we don’t all have the same skills. Like what happens between you and a participant is different. And so, having noticed that, I started a class at Google to critique ourselves. So, give researchers an opportunity to observe each other moderating and to give feedback to each other. And then I wrote a Medium article about it. It’s called Don’t Leave Data on the Table, if you want to look it up. I think that’s one of the things too is that we don’t get that critical eye anymore and we assume that like every researcher is the same and it’s like well actually, you know – and those things can also creep up on us. Like I see myself all the time, especially now that I’m sort of teaching other designers at my company more of like, ah, leading question. Yes or no question. You know. So, I find myself still doing that, even if I’m aware of it. But I think having those things pointed out to us is also really helpful too.

Steve: What’s the structure of the critiquing class?

Marianne: So, it was slightly different when I taught it externally at CHI, and internally. So, internally all our videos are available to each other. So, basically people would send in clips from research that they did. We would make small groups, so let’s say you work in a group of five, and we would watch a segment of somebody conducting a study. And at any point people could stop the video and be like hey, I noticed this. It was like when we were actually practicing to teach it, one of my co-teachers, she actually had her pen and she would do a lot of like pointing and sort of like do a lot of things that indicated high status. And she was like, I never noticed that – until she saw herself, until we talked about it, she wasn’t aware of that body language and what it was conveying. And so basically we’d spend 5-10 minutes, or however the time divides up, to watch that video and stop if we see anything. And we asked people to keep in mind that like, again, we don’t know the context of what came before. We don’t know the context after. But this is what I’m noticing. And that’s what we do as researchers is we notice. So, it could be a pattern, it could not, but at least invite people to consider something that they’re doing. And also, by seeing other people, just having that conversation, seeing what other people – and be like okay, I didn’t show that, I didn’t do that in my clip, but like I’ve definitely done that before too. So, learning from each other as well has been really valuable.

Steve: It sounds like a way to create kind of a safe space where it’s okay to critique is that everybody is kind of up for that critique and we’re all being exposed together. Is that kind of the way to make it safe? Because this is our workplace and now we’re talking about how we’re not living up to sort of the high standards of our profession.

Marianne: Yeah. And I think that’s one of the things – we tried really hard to make it a safe space. And one of the things that we did was actually we had – spoiler alert – we’d give out balloons, which worked like those cords that you pull to sort of stop the production line. But just having a room full of balloons made it a little bit more fun. When I taught at CHI we had to create the videos and so I had these little stuffed animal guys that people could put their phones on when they were doing the videos. So, then we had little stuffed animals on all the tables. And so, – and also modeling it ourselves that – I don’t remember if we actually showed some of the videos, but we definitely told some of the stories of like when we did this ourselves – it’s like I’ve been a practitioner for how many years? Like I’m still learning. I still screw this stuff up. So, making ourselves vulnerable was part of making that safe space too. And also, I think what we tried to do was, when we did it within Google, was not have everybody who was on the same team be together. So, it wasn’t like anybody who you worked with directly.

Steve: You used the phrase “sort of screw this up” and I think like the pen example and I’m sure there’s others that are clearly things we shouldn’t do as researchers. To me it seems like there’s a certain amount of stuff that’s subjective that you and I would do it differently because we’re different and we have different personalities and different energies. For me, critique about that would be really interesting because I like to see how other people handle things. My question here I guess is, is there – does the critique approach look at alternative approaches? How do you talk about that vs. best and worst practices?

Marianne: I think we also gave an opportunity for people to pull the cord on themselves. To be like I didn’t know what to do here. Like help me brainstorm. So, there’s that opportunity too of like for the person themselves to be like how would we handle this? And to hear different perspectives on – there’s, you know, I could have done this, or I could have done this, or in situations in the past I’ve handled it differently. So, I think having just that variety of there’s not one right way to do that, that it depends on the context. And maybe sometimes somebody explains like well this is what happened earlier in the session and so it led me to do it in this way and we’re like, oh, okay, well that makes a lot of sense. And so, I think having a little bit more of that context helps too.

Steve: Are you able to bring any of this into the environment you’re in now where, if anything, you’re teaching people and leading them?

Marianne: Yeah. And I think I’m learning a lot more about screwing it up also. While I’m out in the field doing this Type 1 ethnography, like I’m trying to help designers to be able to do their own research. And so, one of our more junior designers is actually doing one-on-one interviews. So, I helped, and we talked about it and I wrote the discussion guide and the study plan. We talked about it and she was all on board and off to the races we went. And then I actually had a cancellation, so I was able to watch one of the sessions and I was like oh no, I didn’t – I screwed up. I didn’t prepare her well enough. It was a little bit more than she could handle and that was squarely on me, you know. That because I’m so used to doing it and it comes so naturally to me that I didn’t recognize that she was in a different space and she would need a different level of support and that I just hadn’t prepared her. And so, we role played a little bit and we practiced a little bit. But I still feel like – I feel guilty. I’m like she’s got more sessions this week and like I don’t know if she’s really prepared. So, I think that part is on me as well to make sure that the things that I’m modeling and the things that I’m sort of teaching are doable for where the designers are and not just hey it’s easy, anybody can do this.

Steve: In my opinion there are some things in life that we do that we maybe learn more from screwing up than we do by succeeding. I don’t know what falls into that category and what doesn’t. I wonder if, as much as you feel for this person and feel responsible for them, I wonder, just from an ultimate perspective, did you set them up for a lot of learning they might not have otherwise got to by – of seeing the edges of what they’re able to control or what they’re able to execute on?

Marianne: I can’t speak for her.

Steve: Good. Alright. Well we hit a dead end. Let’s see what other questions I have written for you here. If we were to have this conversation again in a year or two years, what kinds of things would you want to talk about that you would have achieved in this organization?

Marianne: I hope that there’s less sort of friction. I think we’re already starting to see the seeds of it, that you know product managers who have been at the company for a long time are saying like oh, I’m doing this new thing, you know I need user research. They don’t know what it is. They don’t know why exactly, but they’re sort of like it’s a good thing generally. So, I think to have more of that. Honestly, I think like some of the things that are more – a little boring, kind of more infrastructure stuff – like I hate recruiting. I’m a terrible recruiter. I’m terrible with like time zones and all those details, that type of stuff. So, like I’d love to have a recruiter. I’d love to have another researcher on board. And one of the things that I feel like I have as a great success under my belt is that it looks like we’re moving away from NPS (Net Promoter Score), hooray, thank you, thank you, and moving towards satisfaction. You know, asking a satisfaction question and how often we’re going to do that. You know, that’s a really interesting thing that I’ve been thinking about is can we pop that up in an app. I mean this is an app that people use for health and like they might be having a low blood sugar. You know this is a stressful time that people are using the app. Like nobody is excited about managing their diabetes. So, to pop up something in the app, to be like how are we doing, that’s not the right thing. So, I haven’t figured out like how we might introduce those satisfaction questions and when is the right time to do those things. But at least moving away from net promoter and to asking the satisfaction questions and asking that on a regular cadence so that we can see over time like are we getting better and kind of what the trajectory is, I think that would be a huge thing to have accomplished as well.

Steve: That’s a great note to end it on. So, thank you very much. It was a great conversation. Thank you. Thanks, everybody.

Okay that was the end of the interview on stage, and now here’s the remainder of our conversation. So, welcome back Marianne.

Marianne: Thank you. It’s the bonus track, huh?

Steve: Right, this is the director’s cut version of the podcast. So, one of the questions I wanted to follow-up with you on is you talked about the work you’re doing on the Type 2 and the Type 1 and really trying to drive change in the organization. I’m wondering, sort of tactically or practically, what are ways to bring those experiences that patients have into the organization itself?

Marianne: I think actually one of the most fundamental ways that it started is using our app itself and so – I don’t have diabetes myself, but for the first 2 weeks that I was – or first couple of weeks I was at Glooko, I sat down with our certified diabetes educator and we pretended that I had just been diagnosed. And so, she talked me through what somebody would hear. How they would be introduced to pricking their finger and measuring their blood glucose. And so, I spent 2 weeks and I logged all the food that I ate, all the exercise that I did. I measured my blood glucose twice a day just to get that experience of what it was like and like yeah, it was pretty crappy, even after a couple of days, to prick your finger. And to imagine that’s for the rest of your life. Like, I had the luxury of stopping. And so, for me it was a really small way to understand what it was like for somebody who was newly diagnosed with diabetes to do that. And I also presented that back to the team and I sort of, encouraged other people to try. So, we instituted a program. So, anybody who starts at Glooko now can get a blood glucose meter and to try it out themselves and to see what that experience is like. So, it’s one small way to at least start building that empathy.

Steve: Are there others that you want to describe?

Marianne: Yeah, actually, so I wrote a Medium article about different ways to build empathy when working in health tech. So, one of them was experience it yourself. So, actually to try the app and try measuring your blood glucose all the time. See for yourself. So, when we did the ethnographic visits, to bring people out so they could actually talk to and see people. And listen to others. So, some people that I work with actually do have diabetes and so I wanted to hear their stories. So, we started something called Life with Diabetes Storytelling. So, just a couple of folks each time and they’ll either tell their own story of how they were diagnosed and what it was like. Or some people have children who have diabetes and that whole experience. Or some people have parents who have diabetes. So, it humanizes it even further because these are people that we actually work with. So, it’s one thing to visit somebody’s home and you never see that stranger again, but these people that we see all the time and we think nothing of somebody at the lunch counter, or lunch table, pricking their finger and taking their blood glucose, or pulling up their shirt and dosing insulin right at the table. Like that’s just kind of what happens. And to ask those questions and making it okay to ask like well when did you tell your wife? And what is it like to do sports and kind of do those things that you used to do? So, it’s been a really great way to create empathy and even bond as a team, to hear each other’s stories. We put them all up on the website so people who start new can watch the videos themselves and learn about people’s stories.

Steve: And what’s – the format is videos. Are there other materials that are – I guess describe what this looks like?

Marianne: So, I was a little ambitious at first. I was like we’re going to have three people who speak once a month and I realized the company’s not that big and a lot of people actually don’t want to talk. And even I was encouraging people to – even if you don’t have diabetes, you don’t know anyone with diabetes, like there’s lots of this stuff on the Internet. Like watch some videos. Read some people’s blogs. Like bring that in. Haven’t had any takers on that yet. So, now kind of what we’re doing is we just have two speakers and we do it about every other month. And I just give people the floor. I say you’ve got 15-20 minutes, tell your story. Some people take less time. Some people are a little bit more hands – they take a little bit more time. But all I’m doing is giving them the floor, just creating that space. And it’s been wildly popular. Like engineers will come. Like people – we actually fill the room where it is. So, it’s just an opportunity for people to tell their story. And I also wanted to make it a little bit special, so the JDRF has these little bears called Rufus the Bear that teach kids who are diagnosed like how to do their insulin and all that. So, I’m riffing off that. So, I have these little bears that I give people after they speak, and it says, dear so and so, thanks for sharing a diabetes story. So, it’s personalized and people sit them on their desk so that when you walk around you can see who shared a story. So, I wanted to make it a little special of like, yeah, thanks for sharing and we know whose stories are out there too.

Steve: That’s great. So maybe a different question, following up on some of the things we talked about earlier. You know thinking – we talked about how you learn research and kind of your process through these different environments that you were in. You know at this point, do you have a super power?

Marianne: I think my super power is complexity busting. So, I think that based on what I’m hearing – and I think qualitative research gets poo-pooed a little bit, but I found that being able to suss out those patterns from just a few interviews and turn that into a robust framework of – you know I did a thing when I worked at Google around ads and what are the characteristics in ads and that framework is still being used. So, some of the things that I’m doing now in terms of persona research – or developing personas and also the Bingo card framework of the needs of people with diabetes. So, there’s a couple of columns, there’s a couple of rows and people have different needs and sort of fill that down individually. And those types of things that lead to action and sort of like synthesizing, encapsulating like here’s the things you need to know. Like, I’ve thought about it and I’ve taken all this chaos and I’ve put it into something that’s usable and actionable that the teams can use. And I think that’s one of the things that really makes me happy when I’ve figured something out and then turned it into a product that the team is like aha, we can run with it. This is actually useful for making decisions.

Steve: Right. I think those two pieces are really interesting because I’m with you on the finding patterns rapidly and being able to tell a new story about them, but I think you’re productizing them and putting some kind of communication design around them so that other people can understand them – I think that’s two superpowers in one, maybe the way you’re describing it. Yeah. Do you have a way of thinking about your own brand, your own identity as someone that – as a researcher, or someone that works in tech, works in these kinds of spaces? How do you think about yourself that way?

Marianne: I think for a long time I thought I’m sort of like everybody else and like we’re all sort of doing the same things. And I have an opportunity now – I’m part of this incubator for women leaders and it’s having me sort of rethink what is my brand? And that one needs to have a brand. I think I also maybe started thinking about it when I started having my own consulting practice and I’m like how do I stand out and what do I offer and how do I package these super powers, and kind of bring that to the fore? So, – and part of this incubator that I’m in, it just started somewhat recently – but the women who are in it, like it’s not just people who are in tech. I think a lot of us are in tech, but we have like a doctor and we have some lawyers and we have some other people who are doing other things. And so, it’s great to be with that different community. And so, one of the things that we did in our first session was really think about ourselves as a company. It’s like we all work for companies. We’re all like what’s the vision statement? What’s the mission statement? Every company’s done a SWOT analysis. So, we’re turning that lens on ourselves. And so, part of our homework for this month is to figure that out. And I think one of the things that I was thinking about is my mission is really user centered everything. That user centered stuff can be applied to any domain and having worked in finance and wildland firefighting and consumer stuff and with creative professionals, I think I’ve seen it there. But that’s all been within tech. But really, applied to things outside of tech. So, it can be applied to parenting. It can be applied to government. It can be applied to just a lot of different things. Like in fact I joined a gym recently. It’s called 9Round Kickboxing. And I realized how user centered they are. So, I’ve got to riff on this for a little bit. So, some insight somebody had along the way was people don’t want to spend a lot of time at the gym. They want to go at any time they want to. They don’t want to wait for a particular class and they like the special attention from a trainer. So, what does 9Round offer? You can literally go at any time. There’s 9 rounds that you do in order, but you can start as soon as you get there. Each round is 3 minutes. So, you’re done in 30 minutes and it’s one of these high intensity workouts. And there’s a trainer there who can like help you and spar with you at a particular moment. I’m like this is a fabulous example of being user centered and really understanding what your customers need. And I was like I signed up for the year. I’m like done. This is exactly – I’m your target user. You’ve figured out my needs and this is fantastic. So, thinking a lot more about human centered everything and how can we apply that to domains and sort of just everyday life because when things are human centered it’s more humane.

Steve: Is there a source or a reference or an inspiration for that human label, kind of in what you’re looking at?

Marianne: I think it came from you know having worked in a lot of different domains. I think what really crystallized it for me was I read a book called More Human by Steve Hilton and he’s married to a Silicon Valley exec, but he was in the UK and then he moved here and he was like – went to – started hanging out at Stanford and came upon design thinking and he’s like aha, all these things I’ve been doing and thinking about policy and applying it to business and all these things. And so, he really crystallized it and I’m like, yeah, I believe that. Like what if our businesses were more focused that way and what if like the policy of how do we get rid of homelessness was actually like prototyped for us, rather than just spending tons of money and doing a particular project for 5 years? So, I think he really captured that human centered everything for me.

Steve: So, as you think about that as your brand, how are you going to operationalize that brand – to use a terrible phrase. But how does this go beyond a concept for you? What do you think you’re going to be doing?

Marianne: So, I think it’s something that I’ve been doing in terms of mentoring, like working with entrepreneurs – so when I left Google, I was sort of exploring two paths. One was more mission oriented something or other and the other was the entrepreneur side of everything. So, right now my full-time job is much more in that mission driven kind of aspect of it, but I’ve also been doing mentoring for entrepreneurs. I’m mentoring a woman in Venture for America right now. I’ve done kind of office hours for a bunch of things. I’ve taught at the Nasdaq Entrepreneurial Center. And so, I think those are ways where I can find different ways to apply it and find ways that – how do you find intrapreneurs and entrepreneurs to give them the tools to be able to ask questions and go out and talk to users to apply it in a lot of different ways? That is a way to make that happen. And so, I think that for me, whether it’s my full-time job, or kind of like the balance of those things may change in the future. So, maybe I’ll be spending more time and I’ll find a role that lets me do more of that working with entrepreneurs and helping them develop that and maybe less in this mission-oriented stuff, or maybe I’ll be able to find ways to combine the two. But I think for me, in sort of the next phase, it’s kind of how do I find the overlap, or sort of finding the balance of those two things?

Steve: You know sort of the thesis of this podcast is around – I don’t know thesis, but just the people we want to have as guests are people who are user research types and I’m wondering, as you describe this sort of human centered everything, and these mentorship roles, are you being a researcher, as a way you would define it in doing that work? We have labels and so I want to ask you to kind of unpack the labels and see what you’re sort of framing on this work you’re exploring vs. maybe the work that connects you and I and why we’re having this conversation?

Marianne: Yeah, that’s a really interesting question. I think for me, it’s – I think once I get sort of hot under the collar about something, like everything else goes out the window. You know I was talking earlier about once I found Glooko and I’m like wow, they’re working on a really big, juicy problem. This is something I wanted to do, who cares what the actual label is, or what the title is. So, I think for me, like it really bothers me when people – smart people – go out and build things and spend a lot of time and energy to build things that are not for humans. And I’m like, erh, why didn’t they do that? So, I think for me it’s more about empowering people who have that energy and who have that entrepreneurial spirit to make the things that are right. So, not just make stuff, but actually make the right thing. So, for me it’s a set of skills that I think maybe everybody should have and maybe once everybody has those skills and can do it well, maybe the role of researcher doesn’t need to exist. But until then, I feel like it’s my duty to go out and spread the gospel, as it were, of this is how you talk to users. This is the type of information you can get. These are types of things that you can’t get and the limitations of the things that we can do as well.

Steve: So, it’s a really articulate, lovely reminder of why we do research – it’s what we’re trying to have happen and you’re looking at achieving that goal through research, or through mentorship, or through advocacy and all the things that you’re doing are putting stuff out in the world that helps people – I don’t want to put – now I’m putting a lot of words in your mouth, but there’s kind of a producing of things ideal that you are a champion of. So, I see research – as you explain it, I can see research as a part of that, but not the only one.

Marianne: I think you summed that up really nicely. So, thank you for making me sound more articulate.

Steve: That’s the reflecting back technique. Okay, so now maybe wrapping up our epilogue, anything to add in this? Anything I should have asked you about?

Marianne: I was just thinking another user centered everything. I’m doing some volunteering and working with a foster child and I’ve been able to turn that into a user centered thing too. Because she doesn’t live with me, I just see her for a few hours each week, so I think very intently about the activities I’m going to do and being a crafty person also, I have spent a lot of time on YouTube videos and Pinterest and whatnot and making activities that I’m like well we’ve got to work on like fine motor skills. We’ve got to work on like counting or matching and stuff like that. And I’m having such a great time designing these little activities for her and also seeing how she responds and like, oh, she doesn’t like those types of things. Or, she needs more of these types of things. So, I feel like bringing it to – like I think I just naturally bring it to all aspects of my life and I see the power that it has, and I just feel I’m like – I just need to be a champion for those things because I see the value of it and I just want the world to know.

Steve: That’s a lovely note to wrap up on. So, thank you so much, Marianne.

Marianne: Thank you so much.

Steve: Well! That wraps up this episode of Dollars to Donuts. Go to portigal.com/podcast for the transcript as well as links for this episode. You can follow us on Twitter, and subscribe to the podcast at portigal.com, or iTunes, or Spotify, or Stitcher, or anyplace excellent podcasts are distributed. My books are available at Amazon and rosenfeldmedia.com. The amazing theme music was written and performed by Bruce Todd.

Mar 06 2019

50mins

Rank #8: 20. Leisa Reichelt of Atlassian (Part 2)

This episode of Dollars to Donuts is part 2 of my conversation with Leisa Reichelt of Atlassian. If you haven’t listened to part 1 yet, you can find it here. We talk about corporate versus government work, scaling research, and changing organizational DNA.

I love research, I love the way that we learn things and what that means, but the thing that really drives me is seeing an organization almost like a design problem and thinking about like what do we – what levers can we pull? What do we choose to do? How do we position ourselves so that we cannot just do fun research, but we can actually really have this knowledge and this insight and this practice fundamentally change how this organization operates? – Leisa Reichelt

Show Links

Follow Dollars to Donuts on Twitter and help other people discover the podcast by leaving a review on iTunes.

Transcript

Steve Portigal: Hey, and here we are with another episode of Dollars to Donuts, the podcast where I talk to the people who are leading user research in their organization. This is part two of my interview with Leisa Reichelt. If you’re just joining the podcast with this episode, I’d encourage you to go back to the previous one for the first part of our conversation.

As a reminder, my public workshop Fundamentals of Interviewing Users is happening September 13th in San Francisco. There’ll be a link for more information in the show notes. It’d be great if you recommended this to a friend or colleague. I also teach classes directly to in-house teams so reach out to learn more. Beyond teaching, in my consulting practice I also lead user research studies, so let’s talk if that’s a way that I can help your team.

Of course, supporting my own consulting work is the best way to support Dollars to Donuts. Share your feedback about the podcast by email at DONUTS AT PORTIGAL DOT COM or on Twitter at Dollars To Donuts, that’s d o l l R s T O D o n u t s.

You know, as user researchers, we love our sticky notes. A few years ago, in an article about how IBM was being transformed by design, a key achievement in this transformation was the ability for staff to order post-it notes. In some ways, that was the saddest thing that I ever heard, that IBM was so broken that ordering a quotidian office supply item was verboten, and that enabling this was seen as a victory worthy of mention. But it also was very real and acknowledged how much of an uphill battle these kinds of corporate transformation efforts really are.

So this carrier of innovative meaning both in its legendary origins and in its rapaciously consuming audience, this product of the 3M company clearly struggles itself with innovation. We’ve had the weird dispensers, the odd sizes and shapes, the reverse fan fold which may be tied to the dispenser but I’ve mostly just found them showing up in the most aggravating moments in a session. I think many years ago they came out with Super Sticky notes which seems like a complete contradiction of the value proposition but I think it’s just an acknowledgement that the original formulation fell off more than we’d like. I don’t think of them as Super Sticky, just the proper amount of sticky for most uses. Anyway, the latest thing I came across really made me scratch my head. It’s a pack of bright orange Post-Its, probably part of a series of exciting new notes with the aspirational branding A WORLD OF COLOR. This particular pack had an even more aspirational and even less relevant tagline: the “Rio De Janeiro” Collection. I can imagine interior paint, fabric, even car finishes being marketed this way but it’s so strange to see on a package of sticky notes. The post-it, for the researcher, the designer, is a backdrop, a carrier for something else. It already is aspirational, because of what we’re putting on it. Making it a “collection,” associating it with a far-off fabulous city, is just ridiculous.

Okay, here’s part two of my interview with Leisa Reichelt, the Head of Research and Insights at Atlassian in Sydney, Australia. It was quite an in-depth interview that’s been broken up into two parts. This is part two; you can check out the previous episode for part 1. Let’s get to it!

I’d love to hear about some of the other kinds of organizations you’ve worked in and what those have been like.

Leisa Reichelt: Sure. Well, for probably 5 or 6 years before I joined Atlassian I worked in government. First of all, in the UK and then I moved back to Australia and worked in the Australian government for about months. Something like that, I think. And that was super different, super different to Atlassian. So, it was a much more kind of familiar ground for me in that it was organizations that you’d go into going we should really go and like involve people in our design process and they would go why would you possibly want to do that? So, that’s a whole different problem set than what we’re dealing with, I think, in some of the tech companies. But hugely rewarding as well. So, yeah, really very different.

Steve: Both those governments seem like their commitment – I guess they’re just different cases, but their respective commitments to sort of design – I don’t know, digital services seems to be the term that gets thrown around for that. But it seems like there’s been significant commitments in both those cases. I mean you’re coming into environments where someone has said we want to do this, we want to change that default. And – like in the UK what was sort of the – how did that get initiated?

Leisa: Okay. So, in the UK we had an MP, Francis Maude, late in his career. I don’t exactly know how it came to him that we could probably be doing better with our digital services than we are. I don’t fully have that back story, but it did come to his attention that we could do better. And he recruited a lovely woman by the name of Martha Lane Fox – Dame Martha Lane Fox now – to basically help him come up with an approach to how we should solve the problem of the UK digital government services not being what they need to be. Martha worked with a bunch of very smart people, in particular a guy called Tom Loosemore, to come up with some recommendations. And it was off the back of those recommendations that the Government Digital Service was put together. Tom and a bunch of other people that he’d worked with in various places around London and the UK came together and started working on trying to transform how government thought about approaching digital services. And they had – their design principles, I think, were really the best way of setting out what their beliefs were. And fortunately for me, number one of those principles was around putting user needs and not government needs first. And so yeah, culturally there was a cohort in there who were well supported within the political system and were able to really kind of make great shifts and changes and progress on that front as a result.

Steve: It seems to me, just from like watching Twitter or LinkedIn, or just who I keep coming across, that there are research people, titled researchers in every kind of nook and cranny of government services – digital and otherwise, I think in the UK. It seems like from the time that you got involved it’s built into something – it seems like it’s sort of changed the way that government is delivering services. I don’t know if that’s an accurate assessment.

Leisa: Yeah. So, it’s when I started at GDS I kind of came off the back of like – there was a lot of talk about being user centered, but I couldn’t really see exactly where the users were in the process. So, I was publicly a little bit critical of them at one point kind of saying well you know I see that you’re thinking about users a lot and you’re looking at the data that they leave behind a lot, but are you actually – actually involving getting a good understanding of them in the process of designing and transforming these services? And so, I was given the opportunity to come in and put my money where my mouth was, such as it was. And at that point there was the odd kind of researcher here and there. There were like 3 or 4, I think, at GDS at the time, and they were stretched across about 25 different projects. I remember sitting down at their team meetings and the team meetings were basically sitting in front of a spreadsheet, looking at all of the projects that they were supposed to be covering, dividing their days into quarter days and working out how on earth they were going to try to help to support these teams, not all of which were in London. So, a lot of them kind of theoretically required either quite a bit of traveling or just dealing with on the telephone. And that was it. So, they were there, but they were really not well set up to be effective. And I was not really – you know I’d worked with government as a consultant in the past and was pretty skeptical about whether or not that would be a brilliant place for me to work long term. And so, when it came time for me to say what I thought we should do I wasn’t really that worried if I lost my job. So, I was able to say what I thought we should do which is I thought we should have one researcher per team, which at that point in time was like just an outrageous thing to be saying. It was absurd. And we didn’t quite get one researcher per team, but we did get quite a big chunk of researchers and so – and the other thing that I did that I kind of look back on and think that was a really important thing to do was I put one researcher on as many teams as I had researchers and I just left the rest, pretty much unsupported. And the thinking behind that was if I could just give a bunch of people an opportunity to show what good looks like then we could create demand for that good. And if we stretched everybody across multiple things we’d never be able to create that showcase, that exemplar that let people see what they could have. And I think that kind of made all the difference. In the early stages again, speaking about getting proactive rather than reactive. One of the big challenges that we had was like how do we get people even to plan for this? Like how do we get researchers, and budget for what researchers need to do, in the projects ahead of time because, you know, people would come to us saying, “we’re starting discovery next week, I’ve heard you can help us with that.” And by then, like you know, the budgets had been approved like 3 or 4 months before. You had to like kick off an engineer to get a researcher which is always a super popular thing to do – not. And then there were things that we do where we just sort of made rules about you have to have one researcher per team, at least 3 days a week. And it was 3 days because then you couldn’t split them across two projects equally. You had to do research at least every sprint, which was 2 weeks for us. And you had to involve at least this many people. 
That basically gave us the formula for working out how much research would cost in a project which let us get like that number into the project budget really early on. And these are like really totally boring things that have got really nothing to do with research at all, but were hugely enabling. By the time I was leaving GDS, instead of having the problem of going you should have a researcher in there, you should have a researcher in there, I would have people coming to me, initiating projects, going I’m starting a project, I’m going to need at least 3 or 4 researchers and you go no, no, no, that’s too many, just start with one and kind of go from there. Yeah, it was something that really – I think because people were able to see the difference that it made – and researchers were really facilitators for the rest of the team. So, their job, yes, was to run the research and set it up and facilitate it and do the analysis and that kind of thing, but they created these opportunities for the entire team to be able to see how what they were working on was solving problems. Or, how what they were working on worked really well, or didn’t work so well. I think that was like the most powerful thing that they did was really to be very open and inclusive in the way that the whole team was invited and expected to come along and see for themselves, first hand, what was happening. Our research labs that we had onsite were always full. Like our observation room, we just had to keep making it bigger and bigger all the time because so many people would come in and watch these sessions. And I think it was that – a) it’s super helpful, but b) I think it’s really – it’s great feedback that the team gets. It provides – it shows the meaningfulness of the work that they’re doing and I think that’s a really desirable thing for teams to be able to see that what I’m doing makes a difference. It has an impact. It matters. And I like to think that that contributes to team health as well.

Steve: What’s the difference in the culture of government vs. software technology companies where that kind of response, that kind of commitment and engagement can happen?

Leisa: Well, I was going to say governments don’t have growth teams. They do have behavioral scientists though – behavioral economists – which are kind of a similarish type thing, but they tend to still experiment on letters more than they do on digital things. But I think that it’s a few things. I think that people in government feel – it’s such a generalization. I don’t even know if I believe it 100%. Certainly, in places that I’ve worked people in government have felt their responsibility to the people who are the service receivers and the idea of experimenting on them is something that you would approach much more carefully than what we see, I think, in a lot of other sort of software/technical organizations. I think that’s one part of it. I think the other thing is very few people that I met in government thought that they could do my job. They didn’t know that my job was a thing and they didn’t think that they could do it probably as well as I could. Whereas I find in tech companies there’s a lot more of that. Like everyone’s got some kind of research background. Everyone feels a little more confident and capable at doing this stuff themselves. And I know that that’s not universally true. I know there are a lot of companies where everyone is terrified of going out and talking to customers. But I do feel as though in tech companies – some tech companies – there is a lot more kind of over-confidence about your ability to go out and do this, especially this – and I don’t want to sound like I’m a complete downer on the lean sort of thing because done well it’s great, but I think that has really encouraged people to think that there’s not too much to this, that anyone can go and have a chat with a customer and that’s what we do. I think there are probably a few other things that are really different.

Steve: That’s a good comparison. Can we go back in time a little bit further maybe. I’d love to hear about how you found research as a thing? How is it that you ended up in this field?

Leisa: Well, I was one of those people who growing up didn’t know what I wanted to be. I briefly wanted to be a vet, but that was mostly because of a popular television show that was on in Australia at the time.

Steve: Which one?

Leisa: It was called A Country Practice and it had a character called Vicky who was the vet and I thought she was the bees’ knees. But then kind of beyond that – that was like when I was in primary school and then beyond that I honestly really don’t know what I want to do when I grow up. After high school I went and did a university degree in communications and I could have gone and done a bunch of other different degrees. I kind of applied for like this variety of all different kinds of things in different cities. Like it was a toss-up between doing music in Melbourne or communications in Sydney. In the end I chose communications in Sydney because they had 9 hours of face to face time and it was pass/fail and I thought that sounded pretty cruisey enough, after being a bit of a study nerd in high school. I thought maybe I’d earned a break. It was not my best thought out decision and I didn’t really know what I was doing there. And then I was working at the same time. I really didn’t know what I wanted to do at all. I met the Internet at uni for the first time and then in one of my kind of weird jobs that I was doing, I was working for a legal services company and the guy who had started that company, one of his kind of big market advantages was this software that he had had designed basically, which ran all of the kind of logistics of his company. And the guy – you know as software does sometimes, it just stops working or it needs changing. And the guy who was in charge of the software was a supplier and when he couldn’t be bothered coming on site to fix stuff he would ask me to do it. So, I got to understand a little bit about kind of how software happened. And then we started thinking about doing internet stuff and it morphed into a really early e-commerce offering and I was asked to be the project manager of that. And I thought to myself, this internet thing – this is it, right? This is why I didn’t know what I wanted to do when I grew up because the Internet didn’t exist when I was growing up. So, I thought that was really exciting and then I went and worked in some digital agencies. I had a job called a producer and a producer basically did everything except for writing most of the code and the visual design. So, you did the project management, the account management, the interaction design, the information architecture, all of that sort of stuff. A lot of the sort of project strategy stuff as well. This was in the early days when agencies were still kind of really small. And then over time I just kind of got rid of the stuff that was less interesting to me to focus in on the stuff that was very interesting to me. At the time it was information architecture and the research that goes behind that. And I kept doing both of those things for quite a while. At one point I was working on this huge project in Australia and trying to get budget to do some research. And it was a tiny, tiny budget that I wanted and I had to fight so hard for it. And it was a project that was being run by an ad agency and I’m pretty sure they spent on lunch every week what I was asking for, for research for the entire project. I got very frustrated. And that was basically what kind of drove me to go and move to London. I could see that they had agencies there where they had lots of people like me working in the same company, which just like sounded extraordinary and I wanted to go and see what that was like.
So, I moved to London and went and worked for a place called Flow Interactive, which at the time was filled with what I thought were the smartest and most interesting people I’d ever met in my life, and I learned loads from that experience. Then I started getting to the point where I realized that in a lot of large organizations, if you stayed in one place long enough, the same brief would come around over and over again. And that you could have a great time doing research and nobody would do a blind thing with it. And that stopped being fun – it was fun to do the research, but then nobody would do anything with it. It was good the first few times. The third time I’m like, hmm.

And then my boss at the time apologetically said to me, “I’m really sorry Leisa, but I’ve got this friend and he’s got this start-up, would you mind – can you do me a favor? I know it’s not very glamorous to work on start-ups, but would you mind working on this start-up?” And so, I did and I remember sitting in the usability lab doing usability testing, as you did, and in the observation room was the entire team, which was amazing – that the team actually turned up. Not only that, but they were changing the prototype as I was doing the testing. So, compared to the other companies that did nothing with anything ever, they were doing it as I was doing the research, which was surprising but, you know, good. And that was it. From then, I just sort of decided I need to go work with people who are actually going to use the research, and that led me into working with tech start-ups for a long time. Yeah.

Steve: You worked for yourself as a consultant?

Leisa: Yeah, I had my own business for a long time and worked a lot with tech start-ups in London and then also with some larger companies as well. It was like some big publishing companies and universities and places like that too. It was actually a combination of all of those things over time that led me to believe that actually probably the biggest, most important challenge is how you change the DNA of the organization that you’re working in. That you can do the best research project in the world, but if the company that you’re working for, the organization that you’re working for, doesn’t have the motivation to actually do something with it, then it’s kind of pointless. I remember looking back on that body of work and just going, what have I got to show for this? It felt like very little. It was not long after that actually that the opportunity to go join GDS emerged and I thought to myself, right, well – if I can’t have an impact at this organization I can’t have an impact anywhere. Because, like you said before, it was so full of people who were really smart and really orientated to do the right thing. And super meaningful work as well. And I think that’s where I still find myself today where I love research, I love the way that we learn things and what that means, but the thing that really drives me is seeing an organization almost like a design problem and thinking about what levers can we pull? What do we choose to do? How do we position ourselves so that we can not just do fun research, but actually have this knowledge and this insight and this practice fundamentally change how this organization operates?

Steve: That’s a great arc that you just articulated for yourself. It’s really fabulous. So, let’s go back – you were describing sort of the make-up of the team at Atlassian and you talked about two pieces which are unusual but may be worth digging into a little bit. One was research ops, which is sort of an emergent practice. I’d love to hear you explain a little more about what that looks like for you all. The other was that market research is part of the team as well. Let’s talk about both of those.

Leisa: Sure. So, our ops component is relatively new. I was fortunate enough to convince Kate Towsey to come and meet me in Sydney and start really building that capability for us. At the moment it really comprises three main bits, I think. One is the recruitment thing – how do we create a really good infrastructure for doing what’s, for us, mostly B2B research recruitment, which is pretty tricky, in a way that serves the needs of our organization. And that turns out to be interesting in a lot of ways in terms of setting up an infrastructure for it. But also seeing that as potentially an opportunity to try to help shape that kind of who’s-doing-what-kind-of-research-when thing. It gives us visibility into what the current activity is and what kind of stuff is happening, which is good. We don’t really have a good infrastructure for taking every bit of demand and shaping it because that would be a huge bit of work. And also, going back to the time stuff. So, recruitment is one bit of it.

We have another part which is really looking at the technical infrastructure. These days there’s a big technology component to setting up research well in organizations, whether it’s how the recruiting happens or where the data goes. We run kind of the large-scale surveys out of our organization as well. There’s a surprising amount of tech involved in doing that reasonably well also. So, we have somebody whose job it is really just to completely look at the research tech side of things. And then we also have what we call an engagement and impact team which is a new experiment for us. And this is really looking at how we can build a really strong muscle around making sure that the work that we’re doing in the team that’s not embedded is consumed and understood and acted on by the rest of the organization. So, this might be some of the customer happiness surveys that we’re doing. Or it could be the top tasks work or some other sort of strategic work that we’re doing.

How do we make sure that when everybody’s got their nose to the grindstone, focused on the thing that they’re interested in right now, they have this kind of curiosity and interest and understanding for how this other research that’s happening that they haven’t commissioned could be useful to them? So, yeah. So, that’s interesting. And so, the ops thing for me is really thinking about like how can we help make sure that the researchers are really focused on research and not all of the kind of logistical stuff around it. And then secondly around thinking about scale. How can we try and make sure that the work that we are doing in our team scales out to the rest of the organization and that the things that we do at scale are of good quality?

Steve: So, you described in your own arc how you discovered that changing the DNA, changing minds, was maybe the most compelling part of research for you. But in building up that third piece, the engagement – and, I’m sorry, there was…

Leisa: Impact.

Steve: …engagement and impact part of ops – you’re sort of asking researchers to not be involved in that part of it as much.

Leisa: So, in the way that we are setting up our team, and it’s a work in progress right now, we have chosen – I have chosen to centralize a bunch of the work that we are doing. Mostly in order to create the opportunity to do the work that I think we need to do that wouldn’t come naturally out of demand within the product teams. So, to do some of this work that steps back from the features and gets to this really kind of grounding understanding of what the needs are. We chose to do that because we would never be asked to do that by the teams, but I felt that we really needed to do it. Increasingly now we also have researchers who are working embedded in the teams, which is much more like what we did at GDS and DTA and other places. And so, the engagement and impact team is really looking at supporting the work of the centralized teams, because the people that are embedded tend not to need it so much. They have really close collaboration with their product teams and they’re either meeting the demand or really closely shaping the demand in the teams for the research work that’s happening. But I think what we want to do over time is really have this kind of good, creative tension between the crosscutting strategic research, all that sort of foundational work, and the stuff that’s happening in the teams as well. Atlassian is really not that big. We’re about three and a half thousand people. So, tiny compared to some organizations, but quite fragmented. Like lots of products, lots of product teams. When you have that kind of complexity in your organization you have to work really hard to make sure that all the people who should be seeing the thing see the thing and that they understand why it’s important. Because a lot of people don’t make the connection. Like for us when we do the research we go, ah, it’s completely obvious why that person over there should find this really interesting. But they are so focused on something else, they often see the work and then go, why should I care about that? So, you often have to really do that sort of bridging exercise as well. So, that’s what engagement and impact are really focused on, I think – looking at really supporting the crosscutting work, the work that’s not done for any one particular team, but actually is super relevant to them if we can get them to attend to it.

Steve: I think ops in general brings up lots of interesting questions about where do you delineate what a researcher does? I appreciate your point that it depends on the context of the organization and what the optimal remit is for them to lead to these larger kinds of change. So, if I’m a researcher I want my research to be consumed, to be relevant. I don’t think you’re doing this, but were I to be completely isolated from the people consuming it then I don’t get that feedback loop. You’ve talked about how that feedback loop for the researcher is an important thing. That being said, if there is a team that can support me, or that I can support, to help drive impact, to help drive that interaction, then that enhances the researcher’s ability to kind of reach people. You’re hearing me thinking out loud about where an ops person is an enhancement to what the researcher can do versus – ‘cuz a lot of ops, like the way I hear people talk about dev ops, is like it’s stuff we don’t want the developers to be doing. I don’t know. I think there’s things that research ops does that, in my judgment, I want researchers to be doing. And maybe it’s about building up infrastructure to help them do that better, not take it away from them.

Leisa: Exactly. So, the idea is definitely not that there’s a handover point where the researcher goes, here’s the insights, and then engagement and impact take those and spread them out to the world. That’s definitely not what it is. I see engagement and impact people as being people who know how to get things seen by the right people and know how to bring people with interest to come and pay attention. And so, a lot of it really is about making those connections. But the other thing is, I find in my organization I’m one of very few people who has got visibility across everything, across all of the research work that we’re doing and lots of things that are happening in the organization. And so, I can draw connections that other people can’t draw. But I’m just one person and I spend most of my time in meetings. So, I’m kind of hoping – and it’s very early days for us with this whole engagement/impact thing. So, we’re shaping it on a weekly basis right now. But my hope is that because these people are tasked with having really good visibility across what’s happening in the organization, and they’re helping to facilitate these connections between the researchers and the organization, they’re going to get a similar view to what I have, where they can start also to make these connections. Because I feel as though that’s one of the opportunities that we have in a research team, looking out across the entire organization – to be able to see opportunities for connection, to reduce duplication, to build sense-making from seeing things from all these different perspectives. To connect people to things that they didn’t even know existed that would be really useful to them. And so, I’m kind of hoping that that’s something that over time this team will be able to build as well – that ability to see where the opportunities and connections are across the entire organization that really nobody else, except for me right now, can see.

Steve: I see a parallel with the design education effort, where there are all these things that anyone who has worked in research for any amount of time says, “oh yeah, we have a lot of responsibilities. In addition to doing the research we have to be teaching people and we have to be trying to advocate for our research and planning.” I don’t know – I’m inspired the more you explain this, because you’re basically saying, yeah, if that’s a responsibility, maybe we need a team. Maybe it has to be somebody’s job. As opposed to sort of bearing it. And when we make it all part of one person’s job we don’t necessarily even acknowledge that, hey, education is part of my job. Hey, evangelism is part of my job. Hey, driving impact. Creating a team makes me more mindful. Like oh yeah, that’s a thing that you could spend fulltime on. And it doesn’t mean, to your point, that you’re taking it away. You’re creating specialties for all the different facets of – we talk about research, but it’s so many different things.

Leisa: Absolutely.

Steve: And you’re naming and staffing specialties that I haven’t seen named and staffed before.

Leisa: Yeah. It’s really interesting to try to get people to take these jobs because they’re like, what’s the career path for this? And I’m like, I don’t know. I have no idea what the career path is for this. Let’s work it out together. So, it requires really special people to be willing to take these roles and to believe in what the opportunity is and what the purpose is. We are taking a lean approach to shaping the team, but like I said, I always think about this as a design problem. I’m always thinking, what’s the need? And like we talked about before, there’s a huge amount of specialism involved in being a great educator. So, the idea that you can just do that in your spare time, in between projects, is just not realistic, I don’t think, if you want to do a good job at it. And it’s the same with the engagement piece, right. You’re always under pressure to get a project wrapped up and there’s always something that you should be starting next. And so, it can be really difficult for the researchers to go, right, now I’m going to do the engagement and impact part of my job and I’m going to spend 3 weeks just making sure that everyone knows. I’m going to go and do all of these town halls and I’m going to go and run these workshops and I’m going to write these blogs – I’m going to do all of these things. That takes a lot of work, especially if you’re exhausted from all of the analysis work you’ve just done. You want to crawl under a rock somewhere and rest for ages and now you have to go out and do all of this roadshowing of what you’ve learned. The researchers still have to do most of that work, but having engagement and impact there I think can make it much more streamlined. We don’t have to sit down and go, alright, who do we have to go and talk to? When are they having their meetings? How can we get into their team meeting and just present there? Part of what this team is doing is just really having all of that knowledge and going, right, okay, so what you want to do after you’ve done a project is this, this, this, this, this, this. And it just takes a lot of that repetitive effort out and it means that the whole process of engaging and being impactful becomes more streamlined.

Steve: I talked to someone a little while ago and they described hiring a librarian, like a reference librarian. We had the conversation about the data repository and they’d come to the realization that it’s not about the infrastructure, it’s about having someone own it, advocate for it. It’s the same thing. Here’s a problem that we have, that we always talk about – what would happen if we created a role for that and took the pressure off the other people who have other things to do?

Leisa: I have a research librarian on my list of roles yet to hire. I was just waiting to get the headcount for it. But yeah, I completely, completely agree with that point of view.

Steve: So, now market research. That also is part of your group.

Leisa: Yeah, yeah. Market research came about because I inherited NPS and that’s a mixed blessing, obviously. But I wanted to kind of embrace it as an opportunity because it was something that particularly executive people in the organization cared about a lot. It was a good way – it was a good sort of vehicle for getting access at the highest levels and getting people’s attention at the highest levels. But obviously we were pretty – I was pretty unhappy with our methodology which was really surveying way too many people in product when they were trying to get their jobs done. And also, the methodology behind how we did the analysis of that. So, we spent a bunch of time thinking about what we wanted to do that was different to that and when we sat down and thought about the kinds of people that we needed to run a better program of work it became really clear to me that, certainly in Sydney, we don’t have quantitative user researchers. I don’t know, in the valley you can hire them fairly – well they exist. People acknowledge them as a thing. It’s not a thing in Sydney. And so, the closest thing that I could get to that was to look at market research and particular kinds of market researchers who would have that ability to actually really design good surveys and to have really strong skills in thinking about how to analyze them and share that information back as well. So, yeah, so that’s – we’ve gone kind of large on that now and that team is now pretty much staffed with people who have got market research background. And it is really that ability to bring in the quantitative analysis that has made a big difference. And it opens up all kinds of opportunities for us. So, we now are able to work super closely, for example, with our brand and marketing part of the organization because we have skillsets that they value in our team now. We can start to pick up research that they would either have built a separate team for or would outsource. And I think having those opportunities to have, you know, what might have been a voice of the customer team, what might have been a marketing research team, all in the one place is super exciting for us and we would not have been able to do that without bringing some marketing researchers into our team. So, that’s been something that I probably – like if I had not inherited the NPS probably wouldn’t have naturally thought to do. But it’s one of those kind of happy accidents that I’m really, really pleased has happened and I would definitely do again in the future.

Steve: The phrase mixed methods research seems to be sort of a – like a hot term right now which to me is about the collaboration between these different types of research tools or research professionals or research teams. I mean the term seems newish. I’m not sure that the practice is that new. I don’t know, is that – so, with these different capabilities that you have is that part of how you’re able to work?

Leisa: Yeah, absolutely. And I think also it gives us a really strong conduit into some of the other more quantitative parts of the organization as well. So, working much more closely with product analytics is part of what we’ve naturally had to do, and been able to do, as a result of the work that our quant researchers are doing. So, yeah, it really – I mean this is something that we’ve talked about a lot as well, trying to look at these mixed methodologies and looking at all the different data sources that we have available to us. And that includes people like our support organization and our kind of field and sales operations as well. But yeah, I think that having that particular capability in the team has been really critical to helping bring the other sources closer to ours. And we’re starting to get into a much better place at being able to cross-reference each other’s work and demonstrate that being able to do that is a good thing. And it goes back to the reliability question again, right – that’s something that you should look for when you’re looking at the reliability of the work that you’re doing. Like, can I actually see this in other data sources in this organization? Well, yeah, I can. Okay, great. Then I feel like I can rely on them more.

Steve: If we were to have an episode of Dollars to Donuts in 2025 – let’s imagine that there are still podcasts and there’s still Atlassian, and you’re still in the role – what would we be talking about? What would you be highlighting?

Leisa: I don’t know. Hopefully a whole new, different set of problems than what I have right now. I have a picture that I drew. I call it my FY20++ next-gen research plan, vision type thing, and it has a whole bunch of different kinds of sections, components to what a research and insights team might look like, that we would build over the next couple of years. So I have kind of a sense of where I want to get to with it, but I’m not yet at the point of being able to anticipate what the problems would be then. Here I am assuming that we’d be talking about the problems and not all of the great accomplishments that we’ve had since then.

Steve: But that’s – I mean, that’s the way I think a research leader thinks – your response isn’t unique. I think finding big problems and tackling them is why – of course that’s our response, because that’s the kind of person that ends up in the role that you’re in.

Leisa: The thing that I would be most proud of, the thing that I’m really looking for as we go forward, is across the organization – whether it’s in product teams, whether it’s at executive level, whether it’s in sales or support, wherever in the organization it is – I would really love for us to have developed this natural tendency to talk about problems and opportunities from the users’ and the customers’ point of view, and not from our own product or feature point of view. That for me feels like the big transformation that could make a huge difference. And so, a lot of what we’re working on now is how do we build that understanding? How do we build the connections between that understanding and the things that we’re working on – the things we’re thinking about working on? What are the big things and the tiny things that we can do to start to change the way that we think about things? I remember when I was at GDS, one of the tiny things that we did was we just said it’s not user testing, it’s usability testing or user research. And tons of people will listen to this and go, “it really doesn’t matter.” But what it did do, by getting people to make that little correction in their head every time they accidentally said the wrong thing, or almost said the wrong thing, was get them to go: we don’t test users, we test ourselves. We test our work, we test our ideas, we test our designs. That, in a government context, was a really super important thing. And so, it’s that kind of linguistic correction – making sure that when we talk about things we don’t – like I often say to people that I’m working with at the moment, if you can’t describe the problem without saying an Atlassian word then you don’t understand the problem. And so that’s what I’m looking for. I’m looking for a future where when we talk about things we don’t talk about them by the product name or the feature name or some other kind of internal, buzzwordy thing that we have, but we use language that clearly comes from an understanding of our users and our customers and the needs that they have and the problems that they are trying to solve. That’s my kind of mission, I think, more than anything else right now, because I feel like that will have the biggest cultural impact.

Steve: That’s fabulous. Anything else we want to talk about?

Leisa: No, I think we’ve done it to death, Steve.

Steve: Alright, well thank you very much Leisa. It was really interesting to talk to you. I really appreciate all your time.

Leisa: Thank you very much. It’s been fun.

Steve: Thanks for listening to Dollars to Donuts. Follow the podcast on Twitter, and subscribe to the podcast at portigal dot com slash podcast, or Apple Podcasts, Stitcher or Spotify, and the rest. The website has transcripts, show notes, and all the episodes. Get another copy of my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media dot com or Amazon. Thank you Bruce Todd for the great Dollars to Donuts theme.

May 29 2019

44mins


Rank #9: 22. Vicki Tollemache of Grubhub


In this episode of Dollars to Donuts I speak with Vicki Tollemache, the Director of UX Research at Grubhub. We discuss how to manage incoming research requests, running a weekly research session for testing designs, and why candidates should come into job interviews with a point of view about the company’s product.

To me, researchers are educators. They’re there to translate and educate the organization that they work with about who their users are, what they’re experiencing, where their pain points are, what they care about, what their motivations are. There’s a number of ways you could communicate that and you can educate. Experience is probably one of the best, but due to time constraints, not everyone can come into the field with us and experience that. If you just rely on reports and communicating from the researchers there’s something that’s left out in the details. There’s a richness that’s not there that I think even researchers realize. – Vicki Tollemache

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Thanks for joining me on Dollars to Donuts, the podcast where I talk with people who lead user research in their organization.

I’ve mentioned a public workshop happening in San Francisco; it looks like that has fallen through but I will be speaking at the Mind the Product Conference in San Francisco next month. I’m also doing in-house training workshops so let’s talk if that’s something your team might want to pursue.

In my consulting practice I have the opportunity to work with different organizations with varying levels of investment in research, varying levels of maturity in their research and product practices, and so on. I started this podcast as an extension of that, as a way to highlight the emergent practice of user research leadership. So supporting me and my business is the best way for you to support this podcast. My consulting work informs the podcast. It also pays for this podcast. If you have thoughts about Dollars to Donuts, email me at DONUTS AT PORTIGAL DOT COM or write me on Twitter at Dollars To Donuts, that’s D O L L R S T O D O N U T S.

In 1961 the new chairman of the FCC gave his first speech, addressing the National Association of Broadcasters. He saw the potential of television, but complained that watching the programming currently available revealed a “vast wasteland.” That phrase echoed down the decades. In 1992, cable TV executive John Malone coined the phrase “the 500 channel universe.” While that might have signified opportunity to the industry, to the viewing public it felt like the vastness was simply increasing in scope. Yet somehow we made it to today, where 500 channels seems almost quaint, and critics herald the Golden Age of Television, also known as Peak TV. Well, we purchased a Roku TV set recently. Roku basically provides an operating system for the television where you can add apps, just like you’d add apps to a phone. So we added all the usual suspects like Netflix, YouTube, Hulu, NBC, Vimeo, Amazon, and some others that we discovered like Pluto, which is a free streaming channel with a ton of programming. But what I didn’t realize was that Roku is a somewhat open platform that allows interested parties to add content. Looking into how to do that, it doesn’t seem that much harder than putting out a podcast. And while the choice isn’t as overwhelmingly broad as with podcasts, there are an astonishing number of Roku channels. I found a blog that every week or two updates with the latest channels – not all of them, just the ones that they’ve reviewed! Here are some of their latest reviews.

Funny TV Network – A collection of funny clips from Family Feud hosted by Steve Harvey
Know Your Tools – Tool reviews, safety tips, recommendations and innovations
The Home Depot Channel – Tips and tutorials for home remodeling, home maintenance, and tool use
Louisiana Cajun Recipes – A Cajun cooking show hosted by a self-taught cook
Smoky Ribs BBQ – Essential grilling recipes
Stories of the Century – A 1950s Western TV series about a railroad detective who roams the west, tracking down outlaws and bandits who are preying on the railroad
SOS Coast Guard – A 12-chapter 1937 film serial starring Bela Lugosi and Ralph Byrd
Sci-Fi Zone – 19 vintage Sci-Fi movies from the 1940s and 50s
DramaTV – Vintage public domain dramas from the 1940s and 50s
Scary TV – 20 vintage horror movies from the 1930s, 40s and 50s
Shemaroo Yoga – Yoga tutorials from Anushka Malao
Aircraft Channel – Aircraft accidents and near accidents reconstructed with analysis of what actually happened
Rockwell Off Road – Mud-bogging and proving grounds videos

It’s not particularly different from what’s already online on some platforms, but I was surprised at how low the bar for “television” had gotten. I certainly expect a lot of crap from TV in general, but this is barely curated Internet detritus that mingles with TV channels from established media players. I’m not saying all this content is necessarily garbage, or isn’t of interest to some people, perhaps many people. But my mental model for changing channels on a television set – even if I can’t pick anything up over the air where I live and haven’t had cable for many years – is still an entrenched mental model, so to find that this new television lets me watch NBC and the Opossum Saga at the same navigation level in the menu is just surprising. I think they’ve got some way of streamlining the experience so searching on the platform will more likely reveal big brands that have more traffic or that perhaps have paid slotting fees. You won’t come across The Lawnmower Channel unless you know to look for it, I think. And just like I’ve mentioned in previous episodes, these shifts in mental models – in how producers expect something to be used versus how consumers expect something to be used – are fantastic things to explore in user research, especially as systems grow in complexity and scale, like this 5 zillion channel rokuverse.

Well, I think it is time to get to the interview. I spoke with Vicki Tollemache who is the Director of UX Research at Grubhub. All right, well, thank you for being on the podcast.

Vicki Tollemache: Thank you for having me.

Steve: Why don’t you start by introducing yourself? Who are you? What do you do?

Vicki: I am Victoria Tollemache, but everyone calls me Vicki. I am the Director of UX Research here at Grubhub. Essentially, I manage a team of researchers across our ecosystem and my job is really to make sure that we are working in a way, doing research that’s going to impact our organization. So, it’s a lot of strategy. I’m working with product leads and our design VP to make sure that we are positioned correctly and chasing the right questions.

Steve: Hmm, chasing the right questions. What does that mean?

Vicki: That means – I mean, my research team – I think generally research teams are much smaller than the design organizations and product organizations they’re supporting. We have a million questions coming at us of all shapes and sizes, and it’s determining which questions make the most sense for where we are as a business, and which questions, if we get answers to them, the business will be able to respond to so we’ll be able to have impact. So, making sure that we’re positioned in the right space.

Steve: Right. You mentioned impact as well right off. So, what does impact look like?

Vicki: Impact to me is – in the scheme of things, maybe it’s identifying needs within our users’ ecosystem that we could create solutions for, to better solve for pain points they’re experiencing. Or maybe we’re creating a new experience and we’re not quite certain if what we’ve created matches the user’s mental mindset. So, doing research at that point that our product and design teams can respond to and make changes from.

Steve: Does that fall under the label of evaluation? It feels like there’s something else to it.

Vicki: It can be. What do you mean?

Steve: I mean maybe you’re just describing the way I’d like to see evaluation done where it’s not just thumbs up/thumbs down on…

Vicki: Oh no, no, it’s iterating right through the process and we’re working with design to be like what can we change about it to make it a better experience? And then potentially testing again.

Steve: And you’re framing it around mental mindset which is sort of the underpinnings of a concept.

Vicki: Sure, absolutely.

Steve: Not the implementation of a concept. I think that’s why evaluative to me is like – evaluative is maybe about the details of the design?

Vicki: Sure.

Steve: And so I feel like maybe there’s another word that describes what you’re doing which is looking at the value proposition or the construct, or the mental mindset.

Vicki: Sometimes we call it an experience audit here, but I feel like that’s a term we’ve made up ourselves. I don’t know if I’ve ever heard that in the field or in the wild. Because we were trying to describe some of the things that we’re doing and we’re like are we auditing the experience?

Steve: Not that you asked my opinion, but audit seems more passive than what you’re talking about. To get at that gap between the mental mindset of a consumer of something and the producer of something – that’s more – you have to extract that. You have to synthesize that.

Vicki: Sure. I think I’m speaking to two different types of research, right. One is when we’re going out into the field and trying to understand the environment that our users live in, and especially where they meet, because we tend to find that that’s where a lot of the friction points are. And then working with design and product to be like, how do we solve for some of these problems that we’re finding? Are these problems that we knew existed even? And then once we’ve gotten to that place where we’ve come up with solutions, then pairing with our users to determine, have we solved for this in a way that actually makes sense for the users?

Steve: When you say “where they meet” in that first part?

Vicki: So, I mean Grubhub is an ecosystem, right? We have what we consider four distinct types of users. We have drivers. We have our restaurant partners. We have our diners – these are the people who are consumer facing. They’re ordering from us. And then we have internal employees. I mean the easiest, most basic way to explain that is we have an internal care provider that provides help to all our different users and those users don’t exist by themselves. They’re constantly interacting with each other, right.

Steve: Yeah.

Vicki: And usually they’re interacting with each other when there’s problems. And so either we can help them solve those problems, or sometimes we can maybe be a hindrance in how they solve those problems and they have to try to work around us. So, those are sometimes some of the biggest opportunities for us.

Steve: Okay. So, where there’s interactions or intersections between those different…

Vicki: Types of users.

Steve: In the ecosystem.

Vicki: Yeah.

Steve: Is opportunity.

Vicki: Opportunity, yeah, for us to – sometimes it’s about providing them more independence and building trust, right. And then there’s understanding why is there trust breakdown and why do they feel like they don’t have autonomy to kind of control the – the situation that they’re in.

Steve: Maybe we could step back.

Vicki: Sure, we did kind of jump right into it.

Steve: That’s me, not you. So, what is Grubhub? It’s an ecosystem of these things, but what is – how do they combine?

Vicki: I think Grubhub considers itself a marketplace that is providing restaurants the opportunity to compete for consumers in regards to ordering food for delivery or pickup. So, that’s what we provide.

Steve: In what parts of the world are you running services?

Vicki: The U.S. and then we have a little bit of presence in London, but that’s generally on the corporate aspect of it, right. So, we also have a corporate platform where companies can partner with us and provide their employees a credit and then the employees kind of – it keeps them at work if they can order lunch at work and kind of eat through a meeting kind of thing. So, we have a little bit of presence in the UK with that. But otherwise, I think we’re in 300 markets in the U.S.

Steve: Is that a lot?

Vicki: I mean I think we have a very good presence in the United States. I think if you went into any major city, or even minor city, we would be there, right. So, we’ll be in Little Rock, Arkansas. We might not be in Jasper, Texas. And I’m using references from the South because I am from the South, so it’s easier references.

Steve: Okay. Alright. That’s good context. So, let’s – so, you’re sort of describing some of the things that you’re working on. And you made a point that research teams are often smaller than design teams.

Vicki: Yeah, product and design teams, right.

Steve: Yeah. And that’s the case here.

Vicki: That is the case here. I think – there are 7 of us currently, including myself. We have an open position, so that will be an 8th person. And then we also swell in the summertime because we usually bring in one intern. And last year we were fortunate enough to hire our intern, so it’s a good opportunity for interns as well.

Steve: What does a user research intern do at Grubhub?

Vicki: Well, since I’ve been here I always want to win best internship. That’s always my goal. I want that intern to get as much experience as possible. Usually they come in and they tell me they would feel more comfortable getting moderating experience, being able to run projects on their own. Last year we had the intern own – it felt like the easiest place for them to start was doing smaller research questions, so more evaluative research. So, they took over – I think we called it Research Day – and essentially ran that for the entire summer. But then we also took them out into the field with us for one project. So, they got to go on a trip and experience what it’s like to be within the restaurants, shadowing the drivers, us going into diners’ homes and learning about food. So, they’re owning one thing, which is kind of nice, right, that they get to own something. They’re getting experience moderating. They’re putting projects together. They’re working with product. They’re working with design. And because Research Day goes across the entire ecosystem they’re getting exposed to everything. So, it’s like a really good place for them to start and then they will get to partner with some of the researchers, kind of like a mentor/mentee, and go out into the field on some of our bigger research projects.

Steve: What is Research Day?

Vicki: Well, we called it – at first it started as Diner Day. It’s just a standing day; once a week users come into the lab. The week before, the designers can pitch ideas they have. We go through, based on how thought out and fleshed out the ideas are, and prioritize against our organization’s needs: What are we going to be testing? Do we have access to it? Can we get the right users in? So, we go through and we kind of have a pitch day. We choose the idea and then 5 days later we test that idea in the lab. So, it’s supposed to just be really, really quick – a 24 hour turnaround for findings. Just a bulleted list of findings that the designers can then go take action off of.

Steve: And that runs?

Vicki: Every week. It’s a really frequent cadence. I found in past organizations sometimes people can push off doing research because they say it takes too long. It takes too long to go through scoping. It takes too long finding people to recruit. So, I felt like if we had this standing day set up for diners – just bringing 5 diners in once a week – it meant that designers could kind of come in last minute sometimes and we can get research done for them. And they can also experience that and watch that because we’re in a lab-based situation. Or we can also go on the streets of New York. But in the lab it’s kind of nice because we can also do remote interviews and reach out to people outside of New York, because New York is kind of an outlier compared to the rest of the country in regards to what our platform looks like. Also, people are very experienced ordering food from Grubhub or Seamless here and sometimes we’re looking for a split of maybe less professional food orderers.

Steve: Seamless is another platform.

Vicki: Seamless and Grubhub are the same thing, just different brand. Eat24 is part of our brand as well. But they’re all the same. There’s just a different brand logo in the top left hand corner.

Steve: When you do Diner Day or Research Day, so a tactical question, you don’t have a lot of lead time.

Vicki: Um-mm. It’s quick and dirty.

Steve: So do you already know kind of what people are coming in before the idea is pitched? What’s the dependency there?

Vicki: Generally it’s a mix of diners, right. It’s starting to get harder and harder to find truly new diners to our platform, but it’s a mix of new or returning diners and we can sometimes switch what that ratio is within the 5 days we have from doing the pitch to actually doing the recruiting. So, we just know we’re going to bring 5 diners in. Some of them are going to be new to us, never ordered from us. A percentage of them are going to be returning to us, or we may shift that where it’s all returning based on what the project is.

Steve: So, you have kind of infrastructure in place such that you can quickly change.

Vicki: Yeah. That’s the nice thing about food, right. I mean I worked – I came from AT&T before this and I worked on the B2B side and we had to recruit network engineers and it was really difficult. It’s expensive. They don’t have much time. It’s hard to get them in. But so many people order food. It’s just an easier recruit, right. So, within 5 days we can get those people. We used a platform called userinterviews.com. Have you heard of it?

Steve: Yeah. I haven’t used it, but I’ve heard of it.

Vicki: It makes it very easy. And they’ll recruit anywhere for us in the U.S., so it’s super nice and they turn it around very, very fast.

Steve: So, once you understand what that flow is going to be…

Vicki: We have that relationship and they know that we’re doing that on certain days, so it’s just like an ongoing recruit that we have going.

Steve: Okay. Just coming back to one of the things you said, you mentioned a couple of times kind of aligning the research activities you do, whether it’s at that level or other things, with the business questions, the goals of the organization. And I think maybe you mentioned this, but maybe you could talk a little bit about how you come to an understanding of what those goals and questions are for the organization so that you can choose the research activities that support that. How is that input for you?

Vicki: I think when I first came – I mean I’ll be honest about this too. When I first came in it seemed like – and we still have this to a certain extent and I think a lot of companies deal with this – ongoing reprioritization of everything. We would begin to work on something and then it would be like, no, our priorities have changed. So, we had a lot of stop/start projects. And at the time that didn’t make sense to me, so I wanted our research team to prioritize things based on what we knew our users’ needs were and identifying those. And then we had a little bit of leadership change and we had a new CPO come in, Sam Hall – he’s from ClassPass, he’s great. And I work pretty closely with him to understand what the business priorities are. And I work pretty closely with our design VP to understand some of the design challenges that we’re facing as well. And even for something like Research Day, we have an ongoing backlog of research questions that’s coming from the designers and/or product, but there are some weeks where we don’t have anyone come to pitch and then my team has questions that we will go answer. So, we don’t skip a Research Day just because product and design aren’t asking us questions – we have questions as well based off things we’ve seen in the field, and some of those can actually be themes that come up later. So, in the end, my answer is it varies. Working closely with product helps us from a business perspective. Working closely with our design leadership helps me because they have questions that they are also separately going after, based off some of the design questions that they have. And then I just work to prioritize those. We do a scoring. Does that answer the question?

Steve: A scoring?

Vicki: Yeah. So, certain people get a weighted score and in the end it’s also kind of what my team thinks is going to be the most valuable for us, and also where we’ll have impact. I mean, I’ve worked at organizations where every year they set a goal for the research team to do more research, like more projects. And that doesn’t make any sense to me. It should be about doing projects that actually impact the organization, right, that people actually make changes from. You can do a million research projects. If no one does anything with that research it’s like yelling into the abyss.

Steve: So, if I come to you and say I have a question about how people go through this flow in the app, how would you score that?

Vicki: Who are you? Are you a designer? Are you within the leadership team? And then what is that flow? Like what part of the experience is that? Is that something that we have as an initiative that we’re heavily focused on that we’re invested in as a business to like make a better experience? So, I’d have to ask some of those questions and then I’d also look at do we already have previous research that might be able to answer some of your questions? So, there’s a number of factors.

Steve: And sorry if I’m getting like obsessively detailed, but like are you – when you say score to me that means there’s a bunch of questions and you kind of apply a number to those and total them.

Vicki: Sure. Yeah. And it does work like that. It depends on who’s asking for it and where we are in the product lifecycle. Like, if we do this research for you, will you be able to make changes from it within this release? Or are you telling us you might be able to make changes six months from now? So, we’re asking: do we have impact? Who is asking for this? How does it align to our business priorities? How does it align to our research and design priorities, because we’re part of the design organization? And then there are certain scores that go with each of those answers.

Steve: Okay. I think I’ve heard those questions as kind of an intake or prioritization approach, but I haven’t heard it labeled as scoring which I think makes so much sense when you say it.

Vicki: And then I usually – because in the end our design organization is within our product organization and I usually also work fairly closely with our CPO to be like this is what I’m getting from the design organization. This is what I’m getting from your product leads. This is how we’re prioritizing. Just to make sure. Like it’s a gut check, like does this seem right?

Steve: So, I mean, there’s the example where if nothing comes in for Research Day you have your own things.

Vicki: Yup.

Steve: Are there situations where you’re proposing research to these kind of leaders that you’re working with?

Vicki: Absolutely. Last summer we had a new VP of design come in and one of the things he really wanted to push was the quality of our product, right. And then the question came to me, like how do we measure quality? So, there’s a number of ways that we could go about measuring quality. I work with the data science team to go after certain metrics, or to understand how we might be able to measure that within some of our data. But then from my background I was like, we could also do benchmarking. That’s something we haven’t done before and we rely really heavily on A/B testing, which sometimes is like building things very piecemeal. So, I proposed that we go through and do a twice-a-year benchmark after some major releases to measure what the experience is for users in our current product. And then if you look at the definition of quality, you can only truly measure quality by comparing it to something of greater or lesser quality. So, the suggestion was, where we can, can we also do a comparative benchmark against one of our competitors. And they were completely on board with that. And they allowed us to go to an outside vendor and go through the benchmarking process.

Steve: I thought you were going to say you’re benchmarking against yourself over time.

Vicki: You can do that – we’re doing that as well, but also against our competitors. When we can. We can’t do that obviously on the business side, but we can do that on the diner’s side.

Steve: You can’t do it in the business side because you can’t get access…

Vicki: We don’t have access.

Steve: …access to that part of it.

Vicki: No.

Steve: Okay. I see.

Vicki: And that’s always one of the things – I’ve heard of researchers who will ask – generally our restaurant partners or driver partners, they’re also using our competitors, but from an ethical perspective I don’t like to ask those types of users to share that information. I just don’t think it’s appropriate.

Steve: How do you benchmark – you know when you’re doing competitive benchmarking, how do you compare apples to apples?

Vicki: Our product teams are broken up into initiatives. On the diner side we have two initiatives. One of them is more about the core product, right. And the benchmark is really about: is our core product usable? Like, where are there friction points? So, I worked with our product lead and our designers to identify what we consider core tasks. And those are based off what the common user flows, user journeys, through the app are. But it also was based off where do we also know that we see CPO – so, care calls – taking place, because we are also always trying to lower those, right. So, from that we mapped out what our tasks were and then we took another one of our competitors, who essentially is going to have the same, similar tasks, and we just did a benchmark against those tasks for those competitors. So, things like task timing, success, ease of use. We also created this quality metric which is kind of based off of reliability. And then also just qualitative feedback, right. But there’s other things as well. There are things sometimes we hear from our users that are taking place that maybe we weren’t aware of, and we definitely will go to product and design and be like, hey, we think that we need to do a deeper dive here. We’re hearing a number of things. There’s friction taking place that we weren’t aware of. Maybe we’ve made changes somewhere – and that’s the thing with an ecosystem, right. You might think you’ve made an improvement in one space and you’ve actually created two problems somewhere else, and we’re definitely given the freedom to then go explore that and suggest solutions for those additional problems. Or maybe roll back the first solution to begin with.

Steve: You brought up solution as part of the research process…

Vicki: Yeah.

Steve: … a couple of times and I wonder if you could just say more about what’s a researcher’s role in – what is…

Vicki: A researcher’s role in solutioning?

Steve: Yes.

Vicki: Um, well, at Grubhub, again because we’re a small team, I think in some ways I wish that we could be more hands-on than we are. Maybe we’ll identify a need and our product team has also had a hypothesis that that’s a need as well, so they’ll be like, how do we solve for this thing that we’ve identified? It’s an opportunity for the business. We see that it’s an unfulfilled need for users. Perhaps we’ll define what the problem is and put together a workshop. A researcher will usually be involved in that workshop, but sometimes they can’t be, so then maybe one of us comes in as like a speaker – to see the work and solutioning that they do and comment on it or consult on it. But it usually begins there and then we’ll go through rounds of iterating on the design as we test it. And a lot of that will take place in Research Day. Does this make sense?

Steve: It makes sense. Maybe you have thoughts about sort of the philosophy.

Vicki: I think my team would like to be more hands on with what takes place in those workshops, but we just have a limitation of time.

Steve: Yeah. Why do you think that’s something that they want? If you look at researchers around the world, what – I feel like some are very interested in the product, the details, the solutions.

Vicki: Absolutely, right.

Steve: Some are – some maybe don’t see that as their role.

Vicki: I think there’s always, just from a human perspective, a desire to control. So, I mean, from that basic level it makes sense to me that people want to be more involved in solutioning. But I also think – you know researchers, they’re out in the field. They’re witnessing that. Maybe they write a report. Maybe they communicate their findings, but sometimes those have to be fairly high-level, thematic findings and there’s a lot in the details that can get left out of reporting. I think they feel like they provide value by being in those solutioning sessions because they can speak to some of that nuance that maybe wasn’t communicated within a report, right.

Steve: Yeah. Well even in the example you gave where there’s just no bandwidth, that maybe you guys are working in a consulting role.

Vicki: Yeah. I mean when you have a smaller team, unfortunately it becomes that way, right.

Steve: And whether you’re consulting, or in those sessions, you’re still there to – it sounds to me – so, tease apart – I think what you mean by solutioning is a collaboration where solutions are generated, but it’s not that researchers are there to do design.

Vicki: No, yeah.

Steve: But they’re there to bring that extra detail to inform those design decisions.

Vicki: Yeah, I mean – I think to me researchers are educators, right. Like they’re kind of there to translate and educate the organization that they work with about who their users are, what they’re experiencing, where their pain points are, what they care about, what their motivations are. And there’s a number of ways you could communicate that and you can educate. Experience is probably one of the best, but due to time constraints of everybody, not everyone can come into the field with us and experience that and I think there’s something to be said about – outside of not having the experience yourself with users in the field, if you just rely on reports and communicating from the researchers you kind of – there’s something that’s left out in the details, right. There’s like a richness that’s not there that I think even researchers realize. Like if my presence is there I can speak to some of those one-off situations that maybe we didn’t cover in the report, but might get brought up within solutioning. Does that make sense?

Steve: Yeah, yeah. I think we’re sort of unpacking what solutioning looks like. That it’s not – that there’s a dialogue and a bunch of different perspectives that can come forward and that the researcher isn’t necessarily there to say oh I think we should start the task here instead of over here.

Vicki: Yeah. Yeah. It’s more about bringing that richness of like, if we start the task here this might actually happen to some of the users, whereas if we started it here maybe that won’t happen, and whether that’s a good or bad thing. Does that make sense?

Steve: Yes.

Vicki: Just that deeper understanding. I really do believe that researchers tend to just be educators.

Steve: That’s a good metaphor, isn’t it?

Vicki: Yeah. It’s what it feels like to me. I remember – well I was reading this book recently, John le Carré – do you know who he is? He wrote Tinker, Tailor, Soldier, Spy.

Steve: Um-hmm.

Vicki: And he was talking about the role of a war journalist and it’s really not about reporting on just what’s taking place in the field, but it’s also about like building empathy and trying to encourage people to care. I kind of feel like in some ways we’re kind of like war reporters, based off his description. We’re out in the trenches.

Steve: Right. You want to bring it back in a way – I mean it’s back to your impact point early on.

Vicki: But there is a lot left out when you communicate through reports and findings, right. Sometimes you’ll have a user who’s extremely insightful, but do you report just that one thing, or is that a one-off? But then you’re in a solutioning session and you’re like oh, well it’s actually useful here. This is where I should share it.

Steve: So, yeah, if we took away the constraint of limited resources or just the bandwidth that the folks on the team have, what are the ways that they could be educating?

Vicki: Well – I mean that’s the thing – not just taking away constraints from my team, but I think one of the things realistically, I wish we could take – or I wish the product team members and designers could come into the field more and have that experience more, right. I think by being better versed in what that experience is like within different markets, within different types of restaurants, with different types of users, they have a better base to make decisions from. They’re more informed. They have a deeper understanding. They understand what some of the more nuanced situations might be and therefore they can make informed decisions. But unfortunately with meeting schedules everyone is strapped for time, right. Like sometimes the reality is they just can’t come out into the field.

We have another program here at Grubhub called Parts Unknown. I’ve branded all of our research programs. We also do – it’s not all within programs, but Parts Unknown was inspired by Anthony Bourdain, obviously – this was before, unfortunately, he committed suicide. It’s about bringing people out into different markets and taking a deep look at what our ecosystem looks like outside of New York and Chicago, because we are kind of outliers, and understanding what food culture is in these cities, how people think about food, what types of food they order, how that impacts our ecosystem and what our ecosystem looks like there. We were able to bring – and we still do this – product and designers into the field, and I feel like that has been some of the more enriching work. Those people come out much, much more informed and I think they’re able to make better decisions as a result of it.

Steve: That sounds like it’s kind of an immersion experience.

Vicki: Yeah, yeah. And sometimes we were partnering with food bloggers too to have a night with them, just to understand like their take on the market. It was pretty interesting. So, we still do that, but we’ve also kind of evolved that program as well. It’s not just about looking at different markets. It’s also about looking at different parts of an experience that maybe we haven’t looked at in a while and trying to determine we think it’s this, is it actually this for our users?

Steve: Right. Is this a category where you look at the different people in your ecosystem – is there – how rapidly do these things evolve?

Vicki: What do you mean?

Steve: I mean like how static are the things you know about different people in the ecosystem?

Vicki: Well I think as our industry changes and people are competing against each other within our industry that there are things that are happening that are changing the ecosystem on a pretty regular basis. Does that make sense?

Steve: Yeah. So, hence the need to – if there’s areas of the ecosystem that you haven’t looked at in a while, you have to go back…

Vicki: We have to revisit. Absolutely. And that’s kind of – this is again where I get nervous about whether I should speak to this or not. So, some of the things – we have certain areas that we haven’t looked at in a long time. I mean our company is 20 years old and we’ll be like oh this is our workflow, and then we’ll have our CPO be like I want us to take a look at this because I have questions about what this workflow really is. So, we’ll go through and do this thing as an audit. The researchers will sign up as users and go through an audit of that experience, of what it’s actually like as a user. And then we can pinpoint like this is what we thought it was, this is what it actually is. We think it takes 10 days. It actually takes 30 days and this is why it takes 30 days. That kind of stuff.

Steve: Yeah. So, if there’s so many different facets to the experience in the ecosystem and there’s change happening.

Vicki: Yeah. And you have – within our – I mean it’s not just our product organization, right. Like we’re working with an ops team, a sales team, we have a marketing team and sometimes different teams are building different parts of a flow and they aren’t coming together to check like does that all make sense. Isn’t it always like the idea of like your product is a reflection of your organization and how well they work together?

Steve: Right. What’s the cliché about don’t ship your org chart. I don’t know who to attribute that to.

Vicki: Yeah, I think that is a cliché, yes.

Steve: Okay. Can we talk about your team a little bit?

Vicki: Sure, what would you like to know?

Steve: What kind of people have found their way to your team?

Vicki: Well I inherited the majority of my team and they have a pretty diverse background. I think my team is made up of a designer/comedian, an industrial designer, someone with a sociology background, someone with a psychology background and then two researchers who actually have human factors degrees. And they come from a variety – only one of them, our intern from last summer, is in her first job. The rest of them come from a variety of other industries to get here. And one of our researchers, our associate principal – he and I have worked at like 3 to 4 other jobs together. I’ve known him for a very long time at this point.

Steve: I feel like I would be remiss if I don’t ask about /comedian because that’s just there to be asked about.

Vicki: He does a whole bunch of live storytelling, but I think it makes him a very good presenter. He kind of has charisma on the stage. When he presents I have people reach out to me to say you should hire more comedians, because it makes him very engaging. He’s very good at telling stories, which is why I mention that he has a designer/comedian background. But he’s worked in research for, I think, the last 8 years.

Steve: I’ve definitely come across people in research who have actual education in theater or some related field.

Vicki: Well you know communication is such a huge aspect of it and if you have that charisma and ease with yourself I think it just gets the message across better sometimes. People are more engaged and feel maybe less threatened by the message. Does that make sense?

Steve: Yeah.

Vicki: There’s something to it. He definitely knows how to charm an audience. I think it also sometimes makes the research a bit stickier because people are more engaged with presenters like that.

Steve: I think – right, people that find their way into research eventually discover that so much of the work is about working with your colleagues and not…

Vicki: Oh, absolutely. It’s about relationships, right.

Steve: Far less than it is about fieldwork or research…

Vicki: I think that’s anything, right. Like in the end it’s all very relationship based, right. But I think there’s something to it – I know the previous director here had us do an improv workshop, and there is something about – my sister, I told you, is a comedian. Comedians in the end generally are introverts who kind of force themselves to be extroverts, and I feel like there are parallels with that in research. And then there’s this idea, especially in improv, where instead of saying no to things it’s that yes/and, building off of each other, which also tends to work better with relationships as well. So, maybe it kind of facilitates that. I don’t know. But yeah, in the end it is all relationship based.

Steve: When you talk to people as prospective hires – I don’t know who you hire, but whether it’s the interns or other people, what kind of things are you looking at that kind of strike you as this person might be worth talking to?

Vicki: I definitely want someone who is very analytical. I mean I’ve had candidates come in to interview who, when you ask what they think of our platform, just say it’s great. I want someone who has actually taken a look at it through a critical lens and has identified things that they think might be opportunities. Even better if they showed friends of theirs, or family members, and did a small usability test on it and came back with a critique. My team is very collaborative. We get along very well. So, huge egos aren’t a big thing. I want people who are going to come in with the attitude that everyone brings something to the table, everyone offers something, and I feel like researchers learn a lot from working together as well. So, I definitely want someone who is going to fit into that culture and is excited to collaborate with other researchers.

Steve: But how do you look for that in a…

Vicki: An interview process?

Steve: Or whatever – the scouting or any part of that process before you actually are working with somebody?

Vicki: I mean for me, we have a recruiting group, right, who does all the recruiting and they bring in resumes. We talk about what I’m looking for in the resumes. A lot of it I’m sussing out when I’m doing phone interviews for the initial screen, right. And I guess a lot of it is me just trying to get to the core of who they are and how honest or comfortable I feel they are with describing who they are. The intern that we have right now, I keep referring to her – she was probably one of the best interviews I’ve ever done. She told me a very honest story about a research project she did, that they presented it, that the company rejected the idea that they came up with, and I was like what did you do? And she was like, I drank a glass of wine – it was 6 weeks of work. And then she told me, but then the next day we went back to the drawing board and we decided to do – and it was just very honest. She exposed vulnerability. She was very much herself. She was funny. She could speak very clearly about herself. And I’m kind of looking for that. When I interview someone I want to feel like I’m having a drink with someone, that they’re comfortable describing who they are, which I also think is a sign of maturity – I mean the interview process makes people nervous, right, so you’re coming in maybe with a little bit more of a sense of yourself and a little bit more comfortable about who you are and being honest about who you are. I think a lot of people make the mistake of hiding themselves in an interview and I think you want to be very, very open and honest about who you are, because if you aren’t, it may not be the right job for you to begin with, right. So, I think I’m kind of looking for that. Just someone who can speak to what they’ve done and who they are in a very clear and honest way. Does that make sense?

Steve: Do you think that’s a research specific way of looking at candidates?

Vicki: That’s human specific.

Steve: That’s just hiring someone you want to hire.

Vicki: Yeah, that’s hiring – but then on top of that I’m looking for – I want to hear examples of projects that they’ve done in the past and how they’ve approached research. I definitely like to hear about how they’ve handled failure in the past and what they’ve learned from it. Situations they’ve had to work through – environments they’ve had to work in that were not comfortable for them, and why. These are generally the most basic questions anyone asks in an interview, though. But it’s definitely how comfortable and honest and open I feel they are in the interview that is the thing that probably sways me. But they also have to be analytical. I do have an issue when people come in and they haven’t done any research about Grubhub, or looked at our product or had any thoughts on it at all. Because if you’re a researcher you should be asking questions, right, and doing some research.

Steve: I want to pull apart some of what you’re saying.

Vicki: Sure.

Steve: Because you’re saying analytical, but the example you’re giving is them having looked at Grubhub. Those seem like different things to me.

Vicki: Well, I want them to have – I mean when you go – when anyone goes in for an interview, haven’t you done a little bit of research on the company that you’re going to work for? You’re curious about the product, like what is this product? Like what kind of research will I be doing? Let me take a look at it. Are there a huge amount of problems that I see? Is this going to be like a battle, or is this like oh, I can see like some big – I want someone who has like thought through like what am I going to be working on? What seemed to be the problems here? And I want them to speak to like problems that they’ve identified. Does that make sense?

Steve: Yeah.

Vicki: It’s kind of like they should have done a bit of a heuristic review, or maybe even shown it to a couple of their friends, and have an idea of some of the things they think we might be dealing with. And even speak to how they might solve some of those problems. We usually also have them go through an exercise where we use a site that is not Grubhub to understand how they look at something, what problems they identify, and what kind of research solution they would put together for it. But that’s after the phone screen.

Steve: Okay. So, I understand a little more what you’re referring to when you say analytical. There’s a bit there of initiative, but also thinking through something before you kind of get into it.

Vicki: Yeah. And to me, if they can speak to that – because one of the concerns I have is the communication aspect, especially when you’re doing any sort of generative, ethnographic work. I used to have a researcher who called it the big messy – you go into the field and there’s a lot of stuff that you can focus on, so how do they hone in on what’s important, right? And I kind of feel like I get that from talking to them about what they’ve seen within our own product. If I have someone speak to the fact that UberEats changed their app logo color and they didn’t like it, I don’t know that I think that is necessarily a big, important finding. Whereas, if I have someone talk about problems they experienced within our search results and how that might not be conducive to someone looking for food, that makes more sense to me. So, what are they honing in on? What do they consider a finding? And how do they communicate that to me? That speaks a lot to how they might do that for the organization.

Steve: I’ve worked as a consultant my whole career, pretty much, and many years at an agency, and I think – maybe the reason you’re hearing me check in on this approach is that in the agency lifetime that I had, and certainly my own practice, there’s kind of a consultancy hubris of coming in and saying let me tell you about your thing. So, when you say you want your prospective hires to tell you about your thing.

Vicki: I want them to have an opinion, right.

Steve: Yeah.

Vicki: And I want them to be able to communicate that opinion. And I want to understand what that opinion is based on. And if they don’t that’s concerning to me.

Steve: Yeah. So, I think there’s something here about the process more than the substance of their opinion, but the process – because they don’t have access – like I never want to be the person, personally, just to come in and say look I don’t know what strategy decisions you have, but here’s what I think you should do.

Vicki: I’m not wanting them to lead that, but I’m wanting them to have looked at our product and been like here’s some things I think might be weaknesses, right. Like I’ve considered what your offering is. I’ve looked at it and I’ve noticed these things. I mean that’s the least you can do when you go in for an interview, right? Whenever I go in for an interview I review whatever I can access of the product, depending on what part of the business I’m interviewing for, and I make sure like, if someone asks me what do I think of it that I have like feedback based off some sort of heuristic I’ve done and I can tie that back to some sort of best practice of some kind, right. And so I kind of expect that as well from researchers coming in. If they haven’t looked at the product at all they kind of haven’t prepped themselves for the interview.

Steve: Yeah.

Vicki: And then that concerns me. And also researchers should always be asking questions, right. Like you should want to know like who would I be dealing with here? I’ve had companies reach out to me where I’m like oh my God that taxonomy would be crazy. It’s so much, like how do they deal with that, right. Like I can look at that and then I’ll have those questions for like how do you guys handle that. You can tell a lot by the questions people ask as well.

Steve: Right, that’s another aspect. What kind of questions are you looking for researchers to bring you in the hiring process?

Vicki: We definitely want them to ask questions. I’m always shocked at how many people have no questions for me. I expect them to ask about how we typically do research. I know right now we’re hiring for kind of an entry level person, so a lot of times I’m trying to understand whether they’ve approached certain methodologies, and how they want to expand or grow as a researcher themselves. I want to understand from them where they want to work, what experience they want to gain, and then also how we handle research at Grubhub – kind of similar to all the same questions that you asked me at the beginning of this interview. I kind of expect them to be asking those same questions, right.

Steve: Now they’re just going to listen to this.

Vicki: I do think sometimes on Fridays – I call it 30 minutes with like a college grad, because I get a lot of college kids reaching out to me through LinkedIn, asking me how to get jobs. And I’ll talk to them for the first 30 minutes of my Friday and just give them feedback about what I think is a good interview and we’ll go through that. It’s very similar to all the questions that you’re asking right now.

Steve: I’m just the college kid, right!

Vicki: So, from the agency side – I came from an agency background as well. I worked for a company at the beginning of my career called Usability Sciences. They’re based in Dallas. They’ve been around like 30 years. It’s always interesting to me, the agency side, because I feel like even though agencies want to come in and be like this is what you should do, there’s always kind of this fear of overstepping with the client, right. There’s that kind of thin boundary, and I know from talking to researchers who are just from an agency background, they sometimes feel like they have no voice. Or they can’t push back hard enough because they’re afraid they’ll lose the client. And so sometimes they’re not quite – especially if they’ve only had agency experience – I think one of the things they talk about, experience they want to get, is being able to work closely with product and scope and push back when they feel like they are more empowered to do so. And that’s something that they lack at an agency.

Steve: What about for you? So what are the key things you’ve seen as a difference when you’ve worked in-house or worked as a consultant?

Vicki: Well I started – I left consultancy in 2013, and I’d started in 2007/2008, and I just saw a change in the industry, right. In 2007/2008 sometimes we’d be working with someone from marketing. They would come in. They would want us to do just the usability test. That same client would come back year after year, same usability test, no changes. Same usability test, no changes. And then as time progressed I would see companies where all of a sudden you had a research director. Or you had a researcher and they were our go-to person. And I could see that people were staffing up inside and that the industry was swaying from agency, or definitely from what my consultancy offered, to that offering taking place within the business. So, yeah, I wanted to go see what it was like. Like why are people not making changes and coming back to us year after year? What is the political landscape that they’re in that maybe is leading to something like that? How do you maybe have more control or more say when you’re internal vs. some of the limitations you feel when you’re consulting? And just to be more embedded, right. To get to work closely with designers – I mean we would come in and do maybe 3 weeks with a company. We wouldn’t do a long, long embedded process. So, it was just that understanding of what is it like to be embedded and work with product and design, and what are the limitations they’re facing? Because I could see within the projects that I was working on that that was happening, but I didn’t understand why it was happening.

Steve: What was the first job that you got post-agency?

Vicki: JCPenney. That was one of our clients and I jumped over to them. I don’t know if you know anything about JCPenney? They went through a big shift where they hired this guy, Ron Johnson, from Apple. And he decided to kill all of their rewards and sales programs. And if you know anything about the JCPenney customer – the customer shops sales first at JCPenney. And their loyal customers are very much versed in that whole points/rewards program, so by getting rid of that they alienated their core customer base, and the company took a dive that I don’t know if they’ve really come out of. I think it’s also that retail has changed, right, and that was taking place in the midst of that change in retail because of online sales. So, I went there. It was interesting, but also the company was doing massive layoffs, constantly. So, then I went to AT&T after that.

Steve: What were you doing at AT&T?

Vicki: AT&T was a closer commute. At the end JCPenney was a 2 hour commute for me each way, which is bananas. I was in Dallas at this time. AT&T, on the business side – essentially they had a portal of portals and we came into a product organization whose first question to me was: we don’t know who our users are or what they do. So, there it was just starting at the very bottom and trying to help inform a product organization that was just desperate for information. So, that was a lot of talking with network engineers. And I have mixed feelings about personas, but that was actually one of the best scenarios I’ve seen for why you build personas. Like if someone says we don’t know who our users are, maybe you build some personas to explain who your users are. They didn’t have much access to data either. They had little knowledge of who was using the platform and how.

Steve: So they didn’t know anything?

Vicki: Yeah. It’s interesting. There’s like a spectrum of like not knowing anything to like then working in a place where people assume they are the user themselves and that they know everything, right. I mean both come with their own challenges and problems. It’s probably better to be somewhere in the middle.

Steve: And from AT&T?

Vicki: Came here.

Steve: What’s the role that you came into?

Vicki: Manager of the New York – so I manage the New York team, which is all the diner-facing work. And we had a director that oversaw everything. He left, I guess, about a year ago now. And then they gave me 6 months to progress into his role.

Steve: So, is this your first…

Vicki: Director position?

Steve: I was going to say leadership in general around research?

Vicki: I’ve been a manager since JCPenney. But this is my first director position. But I don’t have a manager beneath me, so I’m still managing as well, right. Everyone reports directly to me.

Steve: Is that the difference between a manager and a director? Can you tell I don’t have a job inside an organization?

Vicki: I think it’s different for every company, but within our org structure most directors have managers and I do not. So, it’s me and six, soon to be seven, direct reports.

Steve: Can we go back then before this agency – what was it that led you into that kind of work?

Vicki: I started my career in marketing research at a small ad agency in Dallas that handled all of McDonald’s local DMA markets. McDonald’s provides an allotment of money for national coverage and then, depending on the size of the DMAs, the DMAs are given money to do local advertising, and then sometimes the DMAs themselves fund additional advertising. So, my agency handled all the local advertising for those markets.

Steve: Is DMA direct mail?

Vicki: Designated marketing area. And through that I started doing – they started investing in creating microsites for ad campaigns and we would track those through Google Analytics. I was in charge of helping tag those and I’d have to come up with: what’s the user’s journey? Where should we tag? What do we consider success? What are we measuring? And then I was at a Christmas party and I met someone and explained to them what I did, and he worked at Usability Sciences and was like oh, you should come work for us – it sounds like you’re just doing heuristics. ’Cuz oftentimes when I went through tagging I would discover there’s no clear path to get from here to here, and that’s one of the core paths that users would try to accomplish on this microsite. And so from a Christmas party I got hired at Usability Sciences. Yeah, but it was just – I went to school with a business degree. I came out and worked within marketing research. I essentially was a data analyst if I’m honest. And then just through the fact that they had certain jobs that they didn’t know who to give to, they gave them to me. I started working with Google Analytics and Omniture and tagging, and then somehow that led to me getting hired as a researcher. I had not even considered that this was a career when I was in college – I mean I was in school in the 90s. No one talked about it. This was not a career path I’d ever heard of.

Steve: Right.

Vicki: But I did enjoy research. And also I had – part of my degree was business, computer information systems, and so I had a pretty good understanding of databases and database structure from that which helped a lot with doing data.

Steve: So, when you think about your team now, like there’s probably a few different ways that people have come up – come into research. Like what you just described. You said some people have come through different kinds of designs, some human factors trained people – like what’s the…

Vicki: A successful background? I don’t know if – is that what you were going to ask?

Steve: No, I wasn’t, but you can answer a different question.

Vicki: I don’t think there is one. I think it kind of comes down to the person, right. I have a weird view of education. My mom is English, my dad is South African, and generally there, school is considered a way to make you a more well-rounded person, and then once you leave school maybe you specialize in something and pick up those skills somewhere else. And I kind of feel like, just from all the different researchers I’ve worked with and the different backgrounds that they have, that I don’t know if having a degree in human factors actually means you’re going to be the most successful researcher. I’ve worked with researchers who have nutrition degrees who are amazing. So, I think it just comes down to the individual.

Steve: I feel like there was a period of time where most – I think most researchers, some came from a very heavy social science background, but most in industry came from just ad hoc backgrounds like what you’re talking about, or my own background.

Vicki: But that’s changed, right? Or it is changing.

Steve: Yes. And, right, so what does that mean – we’re probably at the early stage of that change, but…

Vicki: I don’t know if we are. ’Cuz I have a question. A couple of years ago, maybe 2012/13, I was considering a move to the Northwest. And at that time I felt that it was very difficult, with an undergrad degree and the experience I had – which was a decent amount of experience, like 6 years, right – to find a company in the Northwest that was willing to hire someone who hadn’t specialized and done a post-grad, or a PhD. They might hire someone with a PhD who had no experience in the field over hiring someone who has 6 years of experience but no PhD. And then I feel like since then that attitude has changed, but I’m not quite certain why that attitude has changed. It seems like people are more open to hiring people from various backgrounds. Is that just my experience? Have you experienced that?

Steve: I mean my sense with research is that demand exceeds supply.

Vicki: There’s a low supply, right.

Steve: So, doesn’t that mean we have to revisit sort of who gets to play?

Vicki: I didn’t know – I thought maybe – I’ve worked with a number of PhD candidates, or people with PhDs, and they are very strict and maybe one-minded about how research has to be done. Whereas I think sometimes, especially within the world of business, you have to be able to make compromises, or maybe take shortcuts, but feel comfortable with those shortcuts, and sometimes I find that people with an academic background are less comfortable taking those. Have you had that experience? I didn’t know if that was actually what was having an impact – like maybe the application of research within the business world and academia was a hard blend. Does that make sense?

Steve: Yeah. I mean my perspective is anecdotal at best.

Vicki: Yeah, mine too. I’m just like guessing. Like I’m trying to understand like is that what happened? Or like maybe your point. Maybe it’s just there’s not enough supply so they’re like we’ll take anyone.

Steve: Right. But is – and I feel like early on I met academics who were sort of early in the industry and who represented some of that mindset that you’re talking about and that just constantly I meet people with high levels of education who are so excited and hungry to extend what they knew how to do.

Vicki: Sure.

Steve: Kind of in the kind of work that we’re talking about. So, what I don’t know is how much did sort of academia shape and maybe limit their mindset. But then I’m also excited about teams like yours where you have different backgrounds.

Vicki: Yeah. I think you want the blend, right? Like everyone should kind of have a different mindset, right. Because they’re all bringing a different lens. That’s one of the reasons I also like designers and product people in the field. Everyone is seeing something different through their lens because of their background and their experience. And so it’s much more insightful when you have a pretty good mix, a motley crew as I like to call it.

Steve: Yeah. That’s a great phrase. And maybe if someone you know, for example, comes from academia and has kind of a here’s how I was trained to look at problems, that person is going to do well among a motley crew where they’re going to be kind of creatively elbowed once in a while.

Vicki: Actually, at AT&T we had a researcher I loved and her background was anthropology and the rest of us were maybe a little more – had come to it a different path, and she definitely pushed the rest of the team in a very good way. It was a good balance.

Steve: Yeah. If you look at research as a team sport which is kind of where you started off.

Vicki: I think it is, yeah. I think it is a team sport.

Steve: Then you’re casting a team to have this great thing together.

Vicki: Yeah. That’s one of the things – a lot of the people I interview are coming from backgrounds where they are maybe a solo researcher in an organization, or they’re one of two researchers and they don’t get to work together because they’re only one of two, and their desire is to work with a team of researchers. And I actually do believe – in my consulting life we always worked as a pair and I think it just strengthened you, because you were learning from another researcher. I feel like you picked things up just from seeing what that other person was doing or talking about – like understanding how people communicated things. There’s something to be said about research as a team being much stronger. I always feel for people who are doing it alone, by themselves.

Steve: What do you think about researchers and research in like 2029? Who is it going to be made of? How do we kind of…

Vicki: I don’t know.

Steve: What’s the desired future? How do we get there?

Vicki: I mean I kind of go back and forth on this. I do feel like – if research is a team, they’re always going to be fewer in number than design and product. And in some worlds design and product need to take on aspects of research, right. And so, then is it that we’ll just have design and product, or maybe jobs will begin to blend together more and there won’t necessarily be a dedicated researcher? But I can go back and forth on that, where I’m like no, I kind of think you’ll always probably want at least a research team who can oversee and maybe help teach people how to do their own research – a team that maybe doesn’t necessarily do all the research on its own – because I do think the more involved product and design are in research, the more that they take on, the better they become. So, I don’t know. The future is – I’m wondering if design and product and research will all kind of blend together and people will begin to own all three of those skills, which might be a lot to ask from someone.

Steve: But even think about your motley crew metaphor. You’re talking about sort of motley crew – it’s hard to say without sounding like I’m saying the name of the band.

Vicki: Well it is the name of the band.

Steve: I know, but I feel like the emphasis is different. Motley crew – I’m trying to just use the original phrase, but it’s coming out like the band. If research as a team, the way you have it now, is built up of a motley crew, imagine – what you’re talking about makes me think that if you sort of changed the departmental structure, there’s a different makeup of a motley crew that includes the skills of design and the skills of product and the skills of research in a diverse way.

Vicki: Sure.

Steve: And so what is that? Is that a new thing that’s like a hybrid?

Vicki: I don’t know the answer. I mean my tendency is to not be that extreme. I think that design and product and research will have to blend, because they talk about embedding design and research and product together, but then at some point people are going to be like, well can’t the designer also do part of the research, or maybe own the product? So, it will blend, but to what extent I think depends on the company and where they are in that evolutionary process. And in some ways – I feel like researchers themselves wear a lot of hats, and then designers doing research, that’s a lot more hats. And then mixing in product – I don’t know if it’s possible for a human to be that multi-faceted, right? So, I don’t know what it will look like, but I do imagine that they are going to have to blend together more. I don’t know exactly what that will be. Having a research team, especially if these research teams are working more on a consultancy basis because they’re so much smaller, and knowing that you build a better product and design organization by them being more involved in research, just leads me to believe that there will be more blending. But I don’t quite know what that will look like, or to what extent, and I’m sure it will be different on a case-by-case basis, depending on how evolved that company is.

Steve: That’s a very good, specific, non-conclusive…

Vicki: Did I just dance around that?

Steve: Well, we’re talking about what the future is going to be and how to get there. So, it’s not like you have the magic answer.

Vicki: Yeah, but I don’t want to be like it will be this, ‘cuz the future – do you ever read Chuck Klosterman? Are you a Klosterman fan?

Steve: Yes.

Vicki: Did you read his last book, where he’s essentially saying everything we know now might be wrong? It’s about trying to think through what we know now and how that might be wrong, and therefore what multiple versions of the future might be, based on the idea that being wrong now is the only thing that we know.

Steve: Yeah.

Vicki: So, that is leading me to my roundabout way of answering that question. It’s a very good book if you haven’t read it. It’s really, really good.

Steve: That’s a nice pivot from my Motley Crue reference.

Vicki: To Klosterman. It melds well.

Steve: Okay. It seems like maybe that’s where we should look to wrap up. Is there anything else that we should talk about?

Vicki: I don’t know. Have I answered all of your questions?

Steve: It’s not possible to answer all of anyone’s questions.

Vicki: That’s true. I mean my greatest fear for this is that I would be your most boring interview.

Steve: Well, my eyes were just fluttering. It’s for the listeners of this to decide if they’re bored or not. If they made it this far.

Vicki: They’re engaged in some aspect.

Steve: Some aspect. Whether it’s just rage listening, or, I don’t know.

Vicki: Rage listening!

Steve: Is that a thing?

Vicki: I’m assuming it must be for someone.

Steve: Alright, well thanks very much for being a guest. It was great chatting with you.

Vicki: Great chatting with you as well.

Steve: All right. That’s the wrap on another episode! Subscribe to Dollars to Donuts wherever you get podcasts. If you’re using Apple Podcasts, how’s about giving the podcast a rating and even a short review. This helps other people find out about the podcast. Portigal dot com slash podcast has transcripts, show notes, and all of the episodes. Follow the podcast on Twitter, and buy my books Interviewing Users and Doorbells, Danger, and Dead Batteries from Rosenfeld Media or Amazon. Our theme music is by Bruce Todd.

Jun 27 2019


Rank #10: 3. Frances Karandy of Citrix


Today’s guest is Frances Karandy, a senior manager within the Customer Experience Group at Citrix. We discuss doing product-focused research in a company with a large number of products, what to look for when hiring researchers, and how to select projects that not only support the business but also help team members to develop.

Design goes hand in hand with research… it’s about solving complex problems. How do we improve not just the UI or the screen, but also the product itself? – Frances Karandy

Show Links

Follow Dollars to Donuts on Twitter (and Stitcher) and show us some love by leaving a review on iTunes.

Transcript

Steve Portigal: Thanks so much for being with us, Frances. I’m looking forward to speaking with you today.

Frances Karandy: Thank you, Steve.

Steve: Let’s just start at the high level. Maybe tell us about Citrix at a broad level, and what your role is, and what that involves.

Frances: Yeah, great. Citrix, if you don’t know the company, is an enterprise software company. We create virtualization, device management and collaboration software. What that actually means is that employees can be mobile and working from anywhere, on any device, securely.

We’re most known for GoToMeeting, you might have heard of that application, and for providing Windows applications at any endpoint. That could be a laptop, a thin client or even a Mac or iOS tablet.

I’m a senior manager within the Customer Experience Group. I lead a team of user researchers who primarily support our application and desktop virtualization products.

Steve: Do you have counterparts that support other products?

Frances: Yes. I would say, in all, we have maybe two dozen or so user researchers and we’re supporting different business lines in different areas of the product. In addition to that, there’s also the Customer Insights Group that we work really closely with. There’s probably about another dozen or so, maybe a dozen and a half, on that team that we work closely with. Yes, we’re aligned to different product lines.

Steve: What’s the size of your team?

Frances: Four people. And then, myself would make a fifth.

Steve: Is there a specific type of customer for those product lines that you’re trying to learn about?

Frances: Yeah, actually, I would say not just the product line that we work on, but most of the enterprise product lines within the full Citrix suite. It really comes down to three segments that we’re often looking at or working with within an organization. When we go in and we partner with a customer, pretty much it comes down to three types of users that we’re working with.

First one being the buyer. Who’s the buyer within the organization? Typically, that tends to be a C-level or IT executive. They would be the one signing the check for the technology solutions. There’s also the IT administrative team or staff. These are the folks that are responsible for setting up, deploying, rolling out, and maintaining the Citrix products within their organization.

Then, we also have what we call the end users. That would typically be the employees who are within the organization and accessing their applications, their data, really through Citrix. It’s these three segments that we could be focusing on at any one time.

For the most part, a lot of the work that we tend to do falls towards building out the administrative consoles, as well as that end user experience. We also have a small team that looks at the end user experience of our products, as well.

Steve: Let me just follow up with that a little bit. You’re looking at these different users, but you have a focus, your team has a focus on a specific set of products.

I’m wondering how the work that a different team is doing, that looks at a different set of products… users and organizations might be dealing with a whole range of products from Citrix, but organizationally, it’s divided up. I’m wondering if that comes up – that some of what would be helpful for what you’re doing falls under the purview of another team.

Is that even a scenario that you deal with?

Frances: I think one of the interesting things Citrix has been able to work through is finding the customers and sharing customer access – one of the things we’ve had to look at is who the customer base is and where they’re coming from.

Citrix has grown via a lot of acquisitions. When we bring on a new company, there’s that set of customers. As we expand and sell more of our other product lines into that customer, or vice versa, then it’s reaching across those product teams and lines, too, to say, “Hey, are there folks within your teams or contacts that you have in your organization that we could leverage?”

I think it comes down for us, finding the right sales or account person that has the connection with the customer that we might want to go do research with. More often than not, a Citrix customer will have more than one product deployed. It doesn’t tend to be too difficult to find out who has what and to be able to share that contact list. If that answers your question?

Steve: I like how you framed it, because I think there’s two aspects here. One is, in an organization that has multiple products and multiple individuals having relationships with customers for those products, just from a Citrix point of view, there’s a finding participants issue, which I think you’ve spoken to.

I’m wondering about the parallel piece, which is that you are focused on product A, and you are visiting a customer, and there’s obviously different ways that you can look at what their experience is. One of those is to restrict it to their use of product A (I can’t remember what I had as product A now). But given the kinds of work that you’re enabling – and maybe I’m just digging into something that doesn’t exist – I’m wondering if, as you’re doing that kind of work, the things that you’re going to hear, the opportunities you’re going to uncover, or the breakdowns that they’re struggling with could possibly span multiple products, given the breadth of offerings that you have and this really large focus on how work is being done in these different kinds of work contexts. I’m wondering how focused your lens is, given how broad the company overall is? Are you trying to strike a balance between what you’re learning, and how you can apply it within Citrix?

Frances: Yeah. I think it depends on the type of research study that we’re doing too. Certainly, we have opportunities to go out from time-to-time and actually sit down and visit with our customers and watch their users use our product, whether it’s the employee, or the administrator, or whatnot.

I think in that case we’re trying to be as broad as possible to really understand, what is their workflow? How do they do their work? A lot of times Citrix is that invisible back-end that’s enabling them to get access to their apps or data. They may or may not know that it’s actually Citrix running it.

When a company deploys a Citrix product, users might click on an icon that might be a Citrix icon, or that might be hidden. I think we have to be very broad in how we probe users and ask how they go about their workflow. On the data capturing end, we’re trying to look for: what applications are they using? What are they experiencing? How is their work set up? And really come back and share that broadly within the organization.

Because, you’re right, we might find that a user’s workflow could apply not just to our immediate product line, but to other products that Citrix has. I think it’s a matter of trying to be as broad as possible and comprehensive when we report that story or that set of findings back.

However, when we’re doing more narrowly focused release research, or we’re trying to do usability testing on a particular workflow, that tends not to extend so much, unless we specifically probe for it or the customer or administrator brings it up, for example.

Steve: That helps. That’s great. What’s in place inside your organization, culturally or process-wise, for that first example, where you are going to share things with the broad audience that can make use of them? How does that happen?

Frances: We have a couple of opportunities that we created to share research within the broader customer experience organization, and one of them is a weekly sharing session. Anybody in the customer experience organization can come together. We’ll highlight a project, or a talk, or a topic. There’s that opportunity.

We also have a meeting specifically for researching customer insights that we hold once a month as well. That’s just yet another opportunity to take a look at things from more of a research angle, or a techniques or approaches angle. We might bring a research question to share with the others and get feedback on. There’s several different avenues.

Then there’s All Hands Meetings that we hold on a monthly basis as well for the broader organization. There’s quite a few opportunities if you want to make the results of your work available within the customer experience organization, and of course beyond that to product teams and engineering teams that we work with on a regular basis.

So it’s usually those opportunities where, if someone hasn’t reached out to us that wants more details on our research or would like to be part of that, we’ll find out then if we haven’t identified them as part of that process to begin with.

Steve: Maybe one last question on this. I don’t want to bore you or everybody else. Those folks that express interest…you’re supporting specific product groups or specific teams with what your main assignment is. But through these kinds of sharing sessions, you may end up engaging with people outside that.

Frances: Right. I think maybe one clarification that might help with that is, going back to the portfolio of products that Citrix has, they have been acquired and have been intended to complement each other. So it’s really looking at the portfolio from a solutions sense.

We support the products and apps on desktop, but a lot of times our customers are running NetScalers, which is on the networking side of our business. Having a customer deploy multiple products inherently means that there’s in many cases a shared customer segment there or base, and that we’re going to hear feedback about other products, just because that’s so critical to setting up and deploying an environment, and making sure that these products talk and work together correctly is part of the setup process to begin with.

Inevitably we’re going to get feedback on other products and it’s just a matter of either sharing that broadly and/or reaching out specifically to teams to say, “Look, we did this research. They mentioned some things about the product,” or “Here, you might want to take a look at this a little bit further,” so we’ll be proactive about that, in addition to just sharing it broadly and see who else might be interested or how they might want to take that research forward and adapt it to whatever objective they have as well.

Steve: I could see that creating something quite charming about how this process, or how talking to customers, can help the company. If I’m someone on a product team and I have a group assigned to me who are my go-to research people, but I’m interacting with all kinds of other research groups based on other things that they’re learning – to me it starts to suggest (I’m waving my hands to show some kind of network diagram) that stories about how people are using things are flowing in a lot of different directions.

Frances: Yes.

Steve: Because you started off describing something deliberate and structured, but as you reveal more about how it works, it seems like this multidirectional sharing is – I guess I’ll call it a cultural element of how you guys are using insights at Citrix.

Frances: Yes, and I think we have to, for a number of reasons. Maps – journey maps – are one area that we’ve invested in as well: journey mapping the customer experience and what that means from the onset, for example, from even before they’ve purchased a Citrix product, all the way through setup and deployment, and then how customers stay on and advocate for products beyond that.

So we’ve taken a very, very broad perspective – not just looking at it from one product line, but at what that customer experience really is from start to finish, if there is a finish. That’s become one way to communicate the intersection of how our products come together, and not just Citrix products but also other technology products that they have to deploy to set that ecosystem up.

Maps and visuals are a great way to do that. The other purpose that that serves, in illustrating and mapping out visually how the products intersect, is that Citrix and a lot of enterprise software is really sophisticated or complex, so to speak. There’s a lot of different moving parts that have to come together to make that work.

For us, coming into user experience or the customer experience group, a lot of us don’t have that technical background that our engineers do or someone who’s a subject matter expert in networking does, where they speak and they code and they write this and they sell this all the time.

One of the things that we’ve had to really focus on and think about is, when a user experience person comes into Citrix, how do we onboard them so that they have some level of understanding about the product? So that, for example, as researchers we can have a useful conversation with our user base or customers or partners and be able to facilitate the kind of research that we do.

But then, also be able to go back and work with product and engineering and suggest, really help effectively shape, recommendations and product enhancements. We have to have some level of at least conceptual product knowledge, and that’s where visualizing and mapping and journey mapping and diagramming and working with our designers has helped tremendously.

You can read a hundred-page technical write-up or whitepaper on how the technology works, but if you can get a designer to map that out – you know how “a picture is worth a thousand words”? It is so much easier.

Steve: You’re talking now about on-boarding other researchers?

Frances: Yes, I am. But I was also talking about how diagrams and maps become a really important output for our research, but then they also become a communication tool in teaching others about how the products work and how that comes together. Visualizations and visual diagrams take on definitely multi-layered meanings here.

Steve: That’s really interesting. You’ve alluded to a couple different kinds of projects, something more evaluative like usability testing, which I think was tied to a release cycle. Then, you described this looking at workflow, maybe more generative.

Are there other kinds of projects? What’s the overall mix of work that you’re doing look like?

Frances: Right now we do a mix of generative, evaluative, and summative work. At any one point, we might have projects in any one of those categories, or, depending on where we are in the release cycle, we might emphasize one over another.

The generative work – let’s say ideation, a lot of the customer visit work, a lot of the stepping in to just understand the problem space or how our customers use our product in a very broad sense. Maybe, for end users, how are they adopting tablets within their organization? What’s their bring-your-own-device policy, for example?

Some of those questions that we explore – maybe 20 to 30 percent of our work goes in that bucket. Then, as we move through and work towards release deadlines, there’s a fair amount of evaluative work that has to happen, too. Because, again, when you launch and release an enterprise software product – at least in our case, while we do have some SaaS and cloud based services, for the installed software product that tends to be a longer product cycle.

Also, a lot of different screens and components that could make up the product itself, so a lot of sub-components. A lot of testing and design enhancements go into that work. There’s a fair amount of work that goes into that piece. That tends to be, at any given point, maybe 60 percent of the work that we do. 50, 60. Then, we’re also looking at, more recently, how do we measure the impact of our releases? How do we measure the user experience and the experience enhancements that we made to the product?

We’re also looking at ways to survey and benchmark based on some of the customer experience dimensions that we would like to be able to measure ourselves on, and ensure that, release over release, we’re actually building and testing a better product. That’s reflected in our customer sentiments, of course.

Steve: Do you have a road map or some plan that says, “Here’s what we’re going to be working on for the next period X in time”?

Frances: We’ll typically look at the year out ahead and coordinate with the release managers, product and engineering to understand what the timelines and the drops are for, let’s say, design lock-down or code lock-down.

If we’re doing release-related research work, then we need to work backwards and identify the windows of opportunity in which we can do testing, for example, or do earlier research. There’s a stream of research that we do that maps to the product release, and we have input into that checkpoint process, which is great.

But then there’s also a stream of research, and this tends to be the broader themes or bigger pillars that we want to go and address that might span multiple releases. If we want to look at a bigger topic a little bit more in depth, then that can be done in parallel to release research.

We take the opportunity to do both types and make sure that we’re resourcing in both areas so that we’re not just looking at the release ahead, but also at how we can address, let’s say, a broader opportunity for our customers, or do the necessary generative research and fit that into the timeline.

We have a fair idea a year out of what’s coming, but then we also have a closer view of the next couple of quarters, and then we tend to tweak from there.

Steve: How do you think about, within your team, applying resources to the different things that you’re planning for?

Frances: That’s a great question. It comes down to a couple of different factors. Having grown our research team in this area over, I guess, the last year and a half, I’ve done a lot of thinking about this. It comes down to mapping the researcher’s experience level, and where they want to focus in terms of growing their skills, to the opportunity.

Where do I see a need to maybe dive into a specific skill area, for example? Then there’s also an aspect of not just research methods and techniques, mapping those to the right level and giving folks the opportunity to grow from a research perspective, but product knowledge as well.

When I think about aligning folks to the different types of projects, I’m trying to think about, will this expand their level of knowledge about working with customers, getting to know the audience, getting familiar with the technique? But, also, how much product knowledge do they have now, and could this be a way to introduce them to different aspects of the product so that they can take on a bigger and more strategic project the next time around?

As I mentioned before, one of the areas that we want to look at carefully is, how do we get non-technical folks up to speed on a technical product? That comes in a variety of ways. You can certainly do online training and go through courses that we provide. But some of it is also exposing them to the right project at the right time, in a way that they will learn by speaking with customers and partners themselves and picking up on how our products are used in our customers’ organizations.

It’s a mix of probably those two factors. Then, there would be a third, of course, and very important, is aligning it with the business strategy. What’s our strategy and our goal and our mission for the year going to be? How do we make sure that we’re focusing on the right business questions and the projects that line up to that so we can start to move the needle in addressing those questions and accomplishing, as a company, what we’re trying to accomplish?

Steve: You said you’ve been thinking about this for the last year and a half with this team. I’m wondering if you can describe the team’s evolution, or how you think about the kinds of folks and the kinds of skills that you need to have on a team like this.

Frances: When I look at the types of skills that we need, I look for a couple of things. Is there an aptitude, of course, to want to solve complex problems, and really dive deep into a space and work on something that might seem intimidating at first?

Certainly, when I first came here, I remember one of the first conversations I was listening to that a researcher was doing with one of our customers. This person was a Windows IT administrator, and they rattled off so many terms that I had never heard of or not paid attention to.

I know if you’re a Microsoft system administrator, you would know what some of these terms mean, but I didn’t. It took that time to want to be able to go do the extra work to look up what that meant and go ask questions and, again, leverage the folks internally to say, “Please explain this to me. I don’t understand it.”

Why is it important that we’re providing this capability, for example, and what do I need to know and understand about it so I can ask the right questions? But when I look at bringing folks on the team, it is, A: Do they want to solve complex problems, and is that something that interests them?

I also look for critical thinking. Obviously, critical thinking is core to our work. We’re always knee deep in analyzing the data and figuring out what’s going to be our approach to answering the questions, and how we want to highlight the most relevant insights. Working with the constraints that we might have and trying to figure out, how do you do your best research work within that? Then, too, I’m also looking for folks that have an intuitive sense of data, and of how data and methods and tools are used.

Sampling, for example. All of these things become really important when you’re trying to identify the right method to use. The right question to answer, number one. Number two, the right method to use. Then, how do you go about doing that? Even if I come across a researcher interviewing for a role that may not require a lot of experience quite yet, or who may be just out of college, I’m still listening for that intuitive sense of knowing, here are the conclusions that you can draw from this information, and here’s what you can’t conclude.

How they think about the question and how they think about the data becomes really important. Then, along with wanting to solve complex problem spaces, I’m always looking for that passion. What you might call the fire in the belly is really, really important.

We all want to be doing work that we love to do, and that’s not any different when it comes to research. Having that fire in the belly to go after and go beyond, let’s say, what the scope of the work is, and wanting to push themselves to learn more and be able to share that back with the organization.

That fire in the belly is a little bit more elusive sometimes. It’s sometimes the hardest to find, because it’s really personal and it’s intrinsic to individuals. But then, when you see that motivation and you see someone come in and light up and talk about the work that they do and why they’re energized by it and how excited they were to solve that problem, that becomes really, really valuable.

Because that also is that sustainer for working through more complex problem sets, being able to put in the extra work to make sure that you’re going to get the results and impact from the work that you do. That makes a huge difference.

Steve: I want to reflect back, you said many great things, but one thing that struck me was characterizing a researcher as someone who solves complex problems. I love that definition. It makes me think about people that I meet, and I’m sure you’ve had this experience as well, who are interested in research, but don’t have any experience. But they find it compelling, and they are very interested in growing their skills there.

Usually, people in that stage of their life tell me that they’re curious, they’re interested in people, it’s a hunger for learning and gaining information and observing. I’ve never heard it characterized as solving problems, let alone as solving complex problems. That’s much more of an action word than learning or discovering or understanding. It’s about doing something with it. I love hearing you frame it that way.

It gets me excited about thinking, “What is it that we do?” And checking myself, is that what I do? Am I solving problems? It’s a great frame on it.

Frances: I guess I would say to that: design, which goes hand in hand with research, is about solving complex problems. Depending on the maturity level of UX in an organization, typically, most companies today have some design function, at least around here. You don’t come across too many companies that do not have that, even in start-ups. The first level is looking at, how can we improve the interface? But then you expand that to think about, how do we improve not just the UI or the screen, but also the product itself?

In our case, documentation and the information experience is as much a critical part of how a user learns and absorbs and manages our product. That becomes really important. Then, beyond that, you can develop and ship a product, but you have to understand how that connection happens to the marketing and education of that product in the first place.

Then, how do you tie the research and insights back into what happens when a customer experiences something unexpected in your product? How do they go about resolving that? It’s really thinking about that end-to-end experience and journey. When you think about launching a product, there are so many different factors that contribute to its success, and you have to know where those touch points are within the organization and make sure that your research identifies and is relevant to those groups that might take advantage of it.

What we’re learning about our customers, let’s say, in a usability session or an onsite visit, could be very relevant to our product marketing teams, or to how we want to position that on the website. It does become a complex problem, because shipping, especially shipping an enterprise software product, requires a lot of teams to put that together.

You have to understand how the business operates and how you go to market to see how the research and insights that you’re getting from customers could benefit so many different areas of the company, and how that entire product experience comes together. It’s complex on different levels, not to mention that it’s very sophisticated technology as well.

Steve: Can you step back a little bit and describe some of the history within Citrix of customer experience, and user research specifically? Because you’re describing a fairly rigorous and big picture view of what everyone is doing together, and I wonder what the history of that is or the genesis of it has been.

Frances: Great question. Research and the history at Citrix is a very fascinating story. Our Senior Vice President, our SVP of Customer Experience, her name is Catherine Courage. She joined the company, I believe it was 2009. She came on as a VP of Design.

At the time, there was a fairly small group of designers and maybe one or two researchers that were working out of our Florida offices. For a couple of years, they had been focused on UI, icon design, a usability test here or there. But she was brought on to expand the role of design within the company, to provide that road map for, how can Citrix bring design to the forefront so that we can differentiate ourselves from competitors and shift the way products are developed so that we’re building better products and we’re building better product experiences around that?

What’s interesting is that Catherine has a research background. Her career early on was doing research, so she intuitively and from experience knew how important that was to developing great products that people love, even if they are more technical. It starts with that deep understanding and empathy of your customers and how they do their work.

She got backing to put a lot of the executives and leadership through Stanford’s d.school design thinking boot camp. That was very instrumental, because it exposed them to design thinking and the power that it can bring to creating products and services that your customers are going to want to buy and really love.

From there, it cascaded and blew open the doors for everybody. Almost everybody in the organization has either heard about or gone through some level of design thinking workshop and training themselves. I should also make the point that it’s not just the product design related folks, or product management or engineering or folks that touch product development, that have gone through the training.

But it’s actually been pockets of the organization like finance, like facilities, operations, groups that you would never necessarily expect, and other teams that you would never necessarily expect in a company to want to understand and embrace this as a mindset. But that’s been really amazing, because what that did was also bring a design thinking approach to internal projects.

Or, let’s say if you are on the education team and you want to redesign how we deliver video training to folks. Now, that team that might have gone through that training now knows to say, “Hey, let’s start with the users,” and identify what are the pluses and minuses of how it works today, and really understand their needs and go from there.

There’s been a lot of not just product design focus in terms of how we can bring a research and user approach to that, but also other non-product areas as well. That’s led to a cultural mindset shift and changing the DNA of the culture to embrace design, and have folks and teams across the organization have a shared understanding of what that means and even how to practice that.

Steve: If I’m in an organization that doesn’t champion research, as people sometimes say, what would you tell me to take away from the Citrix story that could help me?

Frances: Think about how you start with the customer, or in this case the user, who may not be your customer, and empathize with them first. It’s a mindset shift, and that philosophy can be shared and taught to others. If getting research for a particular product or project is not making the strides that you want, then find ways to introduce the concepts or the approach and that mindset to the folks that may not have bought into it.

For that, there are always projects going on around the organization or maybe some that you’re a part of that maybe you could apply that approach to and say, “Hey, let me help you solve this problem. Let me show you some techniques that I’ve used to think through that problem space.”

Create a workshop or figure out how you can teach the process, as well as maybe help them achieve results through it. With that approach and mindset, it was really eye-opening to see that folks across the organization really embraced it. They said, “Wow. Design thinking, I get it. I can interview my counterpart next to me and ask them what challenges they have going about their day or using their phone. That’s not something that only researchers have to do.” Proving that out in practice and showing by example can be a really powerful way to change somebody’s mind.

Steve: It makes me think of a bigger, more abstract question. What does research do? We’re talking about research, but of course we’re talking about much, much more. You’ve hit on design and culture change and teaching and making products and shipping products. But we’re starting, in this conversation anyway, our ground zero is research. I guess to throw to you, in the big picture, what does research do for a company? What is it? What’s its benefit?

Frances: The first thing that we have to do is really try and understand… put a lot of time and thought into extracting what those key objectives are and helping people understand, when we go to research the problem and find out the problem, we might want to go a little bit broader in scope, for example, and see what we find. There’s that aspect of research that helps shape the direction of how you go about solving the problem.

Then, as researchers, too, we have to be thoughtful about what methods and techniques are we going to select to best get to that answer, and what type of data could best answer that question, or combination of methods and techniques? That’s where, we may collaborate with other non-researchers in that process, of course, and keep them moving along with us, but that’s where the researcher skill comes in and says, “I’m going to try X, Y, Z techniques and maybe make sure that I align with certain parts of the organization for that to come together.”

In a broader sense, that’s part of what we do. Again, the role of research, as we’re going out and speaking with our user base or customers, is to be sensitive not just to the immediate questions at hand that we’re trying to ultimately address and answer for the business, but also to pay attention and listen very closely to what your customer or user is saying.

Because oftentimes, they’ll reveal something that you had never thought to ask. It’s being patient and open and being flexible in some way to get more than what you came in asking for.

I think our role has a lot to do with curation. We often come back with scores of data or really interesting stories and information about our customers on any given topic. But that’s a lot of information for our stakeholders and our audience to hear, so we have to be thoughtful about going through it, synthesizing the information, and really curating it for your audience.

There is definitely a heavy emphasis on communication there that, I think, is a large part of what we do. I don’t remember that being a course in my graduate studies, for example.

I think that’s one of the things that, when you come into an organization and you get your first few projects, or it’s your first job, you immediately find out: “Hey, I learned all these techniques. I can do a usability test and I can launch a survey because I learned it in school or I did it before.” But what you do with that information, with those results, makes all the difference. That research can go somewhere, everywhere, or nowhere.

I think one of the things that you have to learn, sometimes the hard way, is how do I make sure that I first identify what the insights are. I think insights are different than findings on some level. Then it’s being able to communicate them in a very compelling way that gets your audience’s attention and is really relevant for them. You have to be thoughtful: I do the groundwork, but what I choose to share with the marketing team might be different than what I share with the engineering team, or might take a different spin or angle.

I think as you get into the details of the work, what customers did or what they were experiencing, your conversation with a designer might be very different than with your VP of product development, for example. Being able to be flexible and adaptable in how you think about what’s relevant for your audience is super important.

I think my view of what research does in an organization is…our role is to bring the market and the customer perspective into the business. I know that sounds very broad.

We do that via a number of techniques and approaches, but I think we also have to have alignment and a voice with the other parts of the organization that also have access to customer feedback and data and really think through, “What does this mean from a business intelligence perspective?” A lot of folks within the organization talk to customers, and they have their perspective and approach.

I think we also add something to the mix, especially when it comes to interpreting that for, let’s say, product design. Our role is to really inform the business strategy, product development, service development, whatever your company is providing its customers.

I think it becomes really important because, as you and I know, we do research and we talk to customers, and we find out what’s really important to them. If you’re not doing that as part of your business charter and strategy, then you certainly run the risk of developing products that don’t ultimately or fully meet customer needs. From there, that could lead to eventual erosion in your market leadership position, for example, or in healthy growth over time. I think it’s really important that this becomes a fundamental part of how you market and launch and build products. I don’t know if that makes any sense whatsoever.

Steve: Yeah, it’s wonderful. I like how you ping-ponged back and forth between looking at research as an organizational function and looking at it as an individual’s, almost, a way of being, I think, professionally. You’ve talked about the researcher and the business action of research. You’ve blurred them.

You’ve articulated a lot about both, but I like that your answer encompasses both the individual who brings this thing and then the thing and what it does for the organization. I guess it makes me want to ask you, you as someone who lives the life of a researcher, who brings that to their world and their work.

I wonder if there’s something about you that is your way of being or in your background that just…What do you think makes you great about what you do?

Frances: [laughs] I think it’s the variety of positions and roles that I’ve had in my career and the fact that I’ve worked across a number of industries and domains. Early on, it was not in a UX capacity, at all. Just my own trajectory, I went to art school. I always wanted to go to art school and ended up going and majoring in illustration, getting a BFA, because that’s what I had had my heart set on for years and years, ever since I was in grade school. I did that.

At the end of that four years, I realized, “Well, I don’t exactly want to do illustration. I definitely don’t want to be living paycheck to paycheck as a freelancer.” I think, in some ways, I didn’t know how to pitch and market myself. They definitely did not have any marketing courses as part of the program at the time, except for “Here’s how you put a portfolio together.”

That’s not enough. I think there’s a lot more to the sales and branding aspects that I think I certainly didn’t know back then. Coming out of that, I had actually stepped into the hospitality industry, just by accident. I spent a number of years in hospitality. From there, I went to magazine publications. These were a variety of customer-service-oriented functions.

Also, sales and catering and event planning, and then moving on to publications where I was in marketing roles. Design had always been a siren to me, but I didn’t know where my niche was. I eventually had this calling to come back to say, “I’m going to get back to design in some capacity. I don’t think that I want to be a sales executive for a hotel chain.”

I thought to myself, “How do I get back in design? Let me just figure that out. I think I want to go to business school, as well, because I’ve done so many different types of roles and worked in different types of departments and functions and companies.” That was very interesting to me. I’ve always taken, in some ways, a horizontal approach to the work that I do and really learning and understanding and empathizing with folks across the other teams that you work with. I think that’s probably just how I approach my work, naturally.

Then, I discovered this field. Someone told me about human factors, and I thought, “I’m not sure what that is, but I’ll go check it out.” It seemed to make sense. Now, here was a way that I could apply my “customer service” background and marry that with design. That really led me to quit my job. I went back to graduate school and got a human factors degree and a business degree, as well. That always seemed like the right choice to do.

I thought to myself, “Even if you’re influencing the design of a product, you ultimately can’t do that without understanding how the business functions. How does the business take a product to market in the first place? How do they develop it? How do they sustain that? How do they grow the business? Why is that important?”

Even though you might be working on one product line or an aspect of a product, it doesn’t matter. All of these functions need to happen for the business to be successful and for your product to be successful.

The more you understand and have insight into how those processes and organizations work, I think that becomes really important in your role as a researcher, because now you understand, “Who do I need to go talk to within the organization? Who needs to see the research? What can we do about it, collectively?”

I think that’s really just a product of my approach and growing up in my career and taking on jobs that I thought were really fun and interesting, but weren’t necessarily always on a UX path. I guess, in retrospect, it was.

Steve: That’s a great story. It’s obviously going to continue to be written, as you go on to do other amazing things. Just thinking, looking at our time here, I think we should probably try to wrap up. I wonder if there’s something that I should have asked you about that I didn’t.

Frances: You asked the question, “How do you get leadership support?” I feel like most of the time we already have that leadership support. It’s just a matter of directing it in the right way by checking back with stakeholders, making sure everybody’s aligned with the objectives and opportunities, and we’re doing the right research aligned to the projects.

For the most part there is not a lot of resistance in that way. I was thinking about that question because I thought, “Where have I encountered resistance before in the past? What was my approach and take on it?” I think one of the things that people don’t want to hear sometimes is, “It may not be the right time for your project.”

When I look back on some of those projects where I really wanted to get that research done right at that time, I thought, “This was the right thing to do. We’ve got a product launch coming up. This is the approach that I’ve laid out. We really should do this study first, then this one, and this one, and we are going to put it all together. It’s going to tell a great story. It’s going to answer all your questions.”

I think as researchers we tend to get excited about, “Yes, exactly, this is the way to go about it.” Sometimes when you don’t get that funding and support right away, it’s like, “I knew it was the right thing to do.”

I think what I’ve learned over the years is that sometimes that could actually be a blessing. There really is a right time. Timing is everything. I think one of the responses to that question is that sometimes you just have to be patient. Getting support for your research is not just about your conviction and your timeline for doing that. You really have to understand what the competing priorities are and be sensitive to that, and understand that.

I also think that you want to plant that seed regardless. There will come a time where that research is still relevant and there is still an opportunity to do it. Even if you have to wait a couple of months or a year, if the research need is still there it’s going to be obvious. It’s going to come to a point where others can’t ignore it. That’s usually the right time that you can go in and pitch yet once again.

I’ve also learned over the years that organizations shift. There is a lot of organic growth, projects change, people change in their roles. Sometimes that timing, that confluence of events coming up, could really set you up for even more success. Whereas maybe six months before, it just wasn’t the time and you couldn’t get the budget for it. I don’t know if anybody has ever reflected on that before, but that’s a question for you, Steve, as you hear these questions come up, like, “Yeah, we’ll create a better pitch deck,” or something…

I think sometimes you really have to assess what’s going on in the organization. Wait for the right time and also socialize it with as many people as you can. Even if you don’t sell it the first time around, at some point, someone’s going to turn around and say, “Hey, it would be a great idea if we could do this.” And you are thinking, “Yeah, exactly.” [laughs]

Steve: It’s even better when it comes back and it’s not attributed to you as your idea; someone else recommends it.

Frances: That’s the best. Because you know you personally pitched it a long time ago, and maybe it certainly could have been your idea and you could take the credit for it, but actually that’s not the end game. The end game is success, the success of the research and the results and the impact that it has on the product or business. If you’ve got a champion and sponsor who is willing to put that forth, and they think it’s their idea, by all means it’s going to go farther.

Steve: I think it’s time to wrap up. Frances, thanks so much for your time. Your great stories and examples, I think it’s really just super helpful, really interesting stuff for everybody. Thank you again.

Frances: Thank you Steve for the opportunity. This has been really a pleasure to speak with you as always. I look forward to more opportunities for that.

Steve: Great. Thanks again.

27. Colin MacArthur of the Canadian Digital Service

In this episode of Dollars to Donuts I chat with Colin MacArthur, the Head of Design Research at the Canadian Digital Service. We talk about bureaucracy hacking, spreading the gospel of research throughout government, and embedding researchers in complex domains.

Often the idiosyncrasies in people’s research and the sort of surprises that don’t fit within the template are the most important things that our researchers find. – Colin MacArthur

Show Links

Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

I just read the 2011 book “It Chooses You” by filmmaker and artist Miranda July. It’s one of the best books about ethnographic research that isn’t really actually about ethnographic research. In the book she describes a period of her life where she was creatively stalled in finishing the screenplay for her film “The Future.” As a way to either get unblocked or just avoid what she should be working on, she develops another project, to call people who have placed ads in the free classified newspaper the PennySaver, and arrange to come to their homes and interview them.

She reports on each of the interviews, including excerpts of the transcripts, and some amazing photographs. The interviews are sort of about the thing being sold, but because she’s getting outside of her cultural bubble, she takes a wider view, asking people about a period in their life when they were happy and whether or not they used a computer (since even in 2011 a newspaper with classified ads was a relic of a previous era).

These interviews are confounding, hilarious, disturbing, touching – everything you’d hope. And July is honest about what makes her uncomfortable, about her own failures to properly exhibit empathy when it’s needed, or her challenge in exercising caution in some dodgy situations while still being open to connecting with strangers. She incorporates her feelings about her own life as she hears from people about their hopes and their reflections back on their lives, lived well or not so well. She articulates her own judgements about the people she met and how that informs her current thinking about her own life and her aspirations for her future.

In one chapter she meets Beverly, a woman with Bengal leopard babies and birds and sheep and dogs. Beverly was clearly excited for Miranda’s visit, and prepared an enormous amount of fruit-and-marshmallow salad which neither July nor her crew want but accept out of politeness. Eager to get away from Beverly and her home, they then head straight to a gas station and throw the marshmallow salad in the trash, covering it up with newspaper in case Beverly stops by. Reading it, I felt my own judgement of Miranda for her judgement of Beverly, but I can imagine doing the exact same thing in a similar circumstance, and I appreciate July’s ability to observe her own judgment and infuse it with compassion at the same time. Ultimately, she views her struggles to connect as her own personal failure, saying “the fullness of Beverly’s life was menacing to me – there was no room for invention, no place for the kind of fictional conjuring that makes me feel useful, or feel anything at all. She wanted me to just actually be there and eat fruit with her.” In articulating something so nuanced and personal, we learn an awful lot about Miranda July as well as all the people, like Beverly, that she meets.

I can’t believe it took me this long to finally read this book, and I can’t recommend it highly enough.

Through Dollars to Donuts, I’m gathering stories about maturing research teams, looking for best practices, emergent approaches, and insights about organizational culture. This is of course highly related to the services I offer as a consultant. In addition to leading research studies in collaboration with my clients, I also help organizations plan and strategize about how to improve the impact research is having. Whether that’s working as a coach for individuals or teams, or running workshops, or advising on best practices, or leading training sessions, there’s a number of different ways you can engage with me. Please get in touch and let’s explore what the ideal collaboration would look like.

You can email me about this, or with feedback about the podcast, at donuts@portigal.com.

Coming up soon is the first Advancing Research conference. Rosenfeld Media is putting on this conference March 30th through April 1st 2020, at the Museum of the Moving Image in Queens New York. I’ve been working on the curation team helping to assemble the program and it’s looking like a really fantastic event. I’ll put the URL for the conference in the show notes. You can use my discount code Portigal dash ar10 to save 10 percent on your registration. I hope to see you there!

All right, on to my interview with Colin MacArthur, the Head of Design Research at the Canadian Digital Service.

Well, thanks so much for being on the podcast. I’m really happy to get to speak with you.

Colin MacArthur: My pleasure. Thanks so much for having me.

Steve: All right, let’s just start with who you are? What role do you have? What organization do you work for? I’ll just throw all the launching stuff onto you and we’ll go from there.

Colin: Absolutely. My name is Colin MacArthur. I’m the Head of Design Research at the Canadian Digital Service. We are an office of the Treasury Board of Canada that works with departments across the Canadian federal government to improve the way they serve their constituents. We do that on my team using design research and by helping people inside the government get closer to the folks who they’re trying to service. We do that in partnership with designers and engineers and product managers and policy experts on interdisciplinary teams, but the perspective that the researchers bring is a hard and fast focus on the people who we’re trying to serve and what their experience with our services is like.

Steve: Why is this under Treasury?

Colin: (laughs) Good question. The Canadian federal government, or government of Canada is organized with departments and then with a number of central agencies. And Treasury Board is one of those central agencies. Its name is a little odd in that it’s not actually the department of finance. It’s not the Ministry of Finance. It plays a central role in sort of managing and consulting with other departments about how they run themselves. It’s a management board of government. And so we are in kind of an interesting place because from Treasury Board we have a view of lots of interesting things happening across government. We’re positioned kind of naturally to give advice and also learn from departments that we work with and then to work across lots of different departments in a way that would be a little more unusual if we were nestled inside a dedicated department itself. So, Treasury Board is sort of one of the central agencies of the government and that’s why we’ve ended up where we are.

Steve: Is that where the Canadian Digital Service originated?

Colin: That’s exactly right. So, we’re relatively new. Founded just a couple of years ago and we started right in the same place we still are, inside Treasury Board, reporting to Canada’s first Minister for Digital Government who was also the President of the Treasury Board.

Steve: What’s the relationship between digital service and, I don’t know, what would you call like regular service. Like the things that government does. Because you included policy in kind of the mix of people that you collaborate with. So, everyone else you listed seemed – design, engineering, research, PM – seems very – yeah, this is how sort of software is made. How digital services are made. But policy – and this is from someone outside government, so maybe it’s a naïve question, but policy just sort of begs the question for me, like oh what’s digital versus just services?

Colin: Yeah. What a good question. I think the way we choose to look at it is we’re interested in improving services, period. So that means the elements of those services that are online, but also the elements that are offline and drift into paper processes and drift into things that are more typical policy problems. But in this day and age it’s pretty hard to have a meaningful discussion about service in general without talking about the digital pieces of those services. So, when we put together teams to work with departments we absolutely come with a digital emphasis. That’s one of the strengths that we can bring. That’s one of the pools of expertise that we have. But we’re just as interested in looking at the non-digital sides of that service. And in reality, they all fit together and attempts to kind of separate them out into the website and the paper part are often pretty hard and don’t end very well for the people we’re trying to serve. So, we kind of view ourselves as tied up in both and we try to staff our teams with expertise that allows us to do that. That said, our name is the Canadian Digital Service and I think often our entrée is our digital skills. But we try to be more than that. We don’t think digital problems can really be solved by just looking at the technology piece.

Steve: That seems to me analogous with so many efforts to do transformation or introduce new services or new products, in the private and public sector around the world. I mean, you’re kind of hinting at the power of the construct of digital. That it invites maybe a different mindset in approaching a problem, one that can extend beyond the boundaries of what is digital. Or, as you said, they’re intertwined.

Colin: Yeah, absolutely. I think we view the computer part as only part of the digital mindset and approach that we try to bring. I think departments appreciate that. I think that we are often able to look at a problem from different angles because we’re not just looking at the IT systems involved. We’re looking at all the pieces related to the problem they have.

Steve: So, what is your role specifically?

Colin: My role as head of design research is to first of all lead and coach the researchers on our team. So, I help them as they’re embedded in product teams chart the direction they want to take with their research, improve the quality and the quantity of their research. Make sure their research is kind of fitting into the product lifecycle in the ways that we hope. And also make sure they’re developing as professionals. We are happy that folks don’t just come here to build services. They come here to build skills and we do that with our researchers, like with all of our other disciplines. So, helping and supporting the researchers is one big piece of my job.

Another is to convene the government-wide community of practice around design research. So, we know that design research happens in places other than CDS. We know that research is a tool that lots of other departments are using. But what I noticed when I first arrived was that there were very few conversations between people doing research in different departments, or even within a particularly large department. So, I use my role and my limited additional time to convene folks across the government of Canada who do this kind of work, to talk about the challenges of doing it and ways that we’ve worked through some of those challenges. The joy of that is getting to see all of the sometimes hard to spot places where people are doing interesting things. You know, taking interesting methodological approaches, having new discussions that aren’t visible if you don’t get them all in a room and get them to start talking. So, convening that government-wide community of practice is another key part of the job. And related to that is trying to create some tools and some policy change that helps that whole community move forward. We try not just to talk about what the problems are and swap stories on the solutions. We try to learn from that community and then use our position and knowledge to nudge forward policy changes or government practice changes that can help researchers across government be more effective.

Steve: Do you have an example of a policy change or process change?

Colin: Sure. So, one of the central pieces of research regulation in the government of Canada is something called the public opinion research rules and guidance, known as POR. I’ll try not to use that acronym, but public opinion research is this kind of class of research that the government has a process for managing – very thoughtfully and deliberately over a relatively – I don’t want to say slow, but certainly a longer timescale than most user research or design research would happen on. And so one of the key areas of confusion we saw when we started doing design research with partner departments was wait, can you really do this without going through the whole process for public opinion research? Isn’t what you’re doing just public opinion research? Why are these things different? Why should we believe you that they’re different?

And so we went to our colleagues in another part of Treasury Board who actually own the policy on public opinion research, and we said to them, “Look, these are the kinds of things we do. These are the kinds of questions we ask. If these are the kinds of questions involved, is this really public opinion research?” And their response was, “If that’s really all it is, then probably not.” And what was great was that we could then work with them to write a clarification to the guidance around the policy, which really just meant updating a webpage to have a section that talks about how POR relates to user or design research and further explains to departments that they could be doing user or design research that didn’t need to fall within this typical public opinion research cycle. And that kind of guidance is so important to loosening structural challenges to doing research across government, right? That kind of guidance from the center is what helps people trying to do research at the edge of their departments make the argument to their manager that this isn’t quite as risky as they thought it might be.

Steve: The perceived risk is in doing the research? Is that what you mean?

Colin: Exactly. And I would say risk is probably a strong word. People say “well we have these rules for doing public opinion research. I don’t really know about any other kinds of research, so you better follow these rules.” And I think that what we’re trying to do is say “well there are actually some other kinds of research that are useful and not quite the same thing”. And think about those and realize that maybe what you’re trying to do falls within those umbrellas and thus doesn’t need to go through the same process and make management and the executive layer of departments a little more comfortable with that fact.

Steve: I think implicit, and maybe you said this explicitly, the kind of processes and procedures and things that would be required to do “design research” are less onerous than to do public opinion research. Is that correct?

Colin: That’s right. I think that public opinion research is often interested in policy and government-level questions about public opinion, right? And design research is often focused on questions like what’s it like to interact with this department or this service? Those are different things that require different methods and different rules of evidence, and so what we try to do is keep people from getting them mixed up with one another. Now I think if my colleagues from other parts of Treasury Board were here, they would remind us that there are some forms of design research that can verge into typical public opinion research, and then it’s important to engage the public opinion research process. But there are also lots of things, like usability testing of a new service interface, that are clearly not in the realm of public opinion research and that we’re therefore really happy to encourage departments to do.

Steve: So, in this convening that you describe, sort of finding these, you also mentioned that there are maybe hard to find – I may be putting a word in here – surprising…

Colin: Yeah.

Steve: …sort of areas of research. How have you come to find those people and that work?

Colin: Well you know it’s funny Steve. I think as we were setting up the community of practice, I realized that recruiting participants for research was kind of the best practice I could have for recruiting members of a government design research community of practice. So, like when you’re recruiting people for research, you put out a call for the kinds of folks you’re interested in, but you also – you snowball, right? So, we started with people that we knew and we said hey, do you know anyone else that does this kind of work, or is interested in these kinds of questions and do they know anyone else and do they know anyone else. Often the third or fourth degree from there, you get to folks who are like, “oh, I didn’t even realize what I was doing was design research, but it is and I’m excited to find this group of people who’s all trying to do something similar.”

Steve: So, by bringing these people together, some of whom wouldn’t even have identified with the labels that we would put on what they’re doing, and creating a chance to share and improve practices, this seems like it spreads far beyond what design research within the Canadian Digital Service alone would reach, right?

Colin: Right.

Steve: The reach is much broader.

Colin: That’s absolutely right and that’s the reason why we did it, right. So, our mandate isn’t just to build services with departments. It’s to continue to improve how the government delivers service more generally. And one of the ways we do that is through finding the folks involved in doing what we do and trying to enable them, right? Give them more tools, whether they’re policy tools or methodological tools. Whether it’s – give them a space to vent or give them a space to celebrate. One of the hard things, I think, about pushing something like design research within government is it’s – it can be hard to find a place where there’s a group of people like you who are really excited about the kind of work you’re doing. And so I think there’s some practical benefits of the community and there’s also some sort of emotional support that happens in the community of practice that’s really heartwarming.

Steve: Are there ways to find – this is maybe the counter example, or the other part of the set – so that teams or departments or groups that should be doing this, that maybe don’t know that it exists, or don’t know that it’s accessible to them, that it’s reasonable or feasible, but would help them in their efforts to improve the way that they serve the people that they’re serving? How do they fit into – I mean it’s kind of a boil the ocean question since you’re already finding everyone that is doing it, but sort of opportunities to build the practice, whether you all are providing that, or you’re enabling them the way you’re enabling the people that you’ve convened? How do you think about that?

Colin: Yeah. It seems like your question is really like what about the people that aren’t doing research and should? And they are also an important group. So, I’ll say a couple of things about them. I think that most major departments do have some groups of people trying to do research by some name, right? So, they’re trying to do client experience improvement, or your digital transformation group has a group of user centric specialists, right? So, there are people trying to do something sort of related to design research, at the least. I think what we try to do is find those folks and then we try to introduce them to the body of knowledge that’s specific to conducting research. Like we believe research is a craft in and of itself and it’s hard. And it takes some work, but it’s also accessible and something many folks can learn. So, with that in mind we try to locate the folks who are working in the generally related areas and inspire and equip them to do this kind of work, or continue to improve how they do this kind of work. You know we talk about the challenges to not only design research, but sort of digital best practice, at multiple levels. One of them is the policy or sort of structural level and we – things like our POR guidance clarification, the public opinion research clarifications, those certainly help folks, but they’re not enough, right? They – folks also need the skill to do the work and they need opportunities to learn. We try to create lots of informal opportunities to see what other people are doing. And then, you know, the bottom layer, beyond just policy and skill, is, for lack of better word, inspiration. Showing that it’s possible, right? Like showing that this work can happen within the government, within all of the unique factors of the government, and that it’s useful. And so I think we try to not just give folks kind of on the edge some policy tools, but we also try to expose them to the skillset and we frankly just try to show it’s possible and continue to cheer them on as they push it forward.

Steve: That’s good. I think we’ll probably come back to the ways that you’re working with teams and departments, but I want to go back to one of the other things you said early on. You were kind of describing the two main things that you’re involved in. We’ve talked about sort of convening the community of practice aspect here, but you also talked about leading and coaching researchers. You were really emphasizing building skills was almost a core value. Like this is a thing that you’re really thinking about being an outcome for people that work in research. Where does that come from? That is not a universally held belief I think in groups of researchers.

Colin: Yeah, well I think part of it is born out of the broader mission that we’ve been discussing, right? So, I think we are not just here to do research that helps create better services. We’re here to help the gospel of research spread around the government. And so in order to do that I think I need to care both about continuously building the skills of our staff and then taking those same tools and making them available across the government. So, for me it’s hard to figure out how I could say, “Ah, broader government, you should be building your skills in this way,” if I wasn’t also practicing that as a leader in our own community. And I’ll say we’re not just interested in helping researchers within our group build their own skills. We’re interested in them learning how to teach other people about research. So, I think we’re trying to build the empowering and enabling and teaching ethos into the folks that come into our group, which makes it easier for them to interact with our partner departments. It makes it easier for them to have useful conversations across the design research community of practice. So, I think that, from a mission angle, is why that’s been a central part of how I view my work. I also just fundamentally believe we can all continuously become better researchers and that one of the ways to do that is focusing on skills building as a continuous improvement approach, not as a sort of set it and forget it professional development activity.

Steve: There’s often service design happening, of various kinds, without there being research happening. Even though we should all clutch our pearls at that idea, it still happens. So, at some point research is given a title, given a mandate. Someone like you is involved. Can you talk a little about sort of your role and your own trajectory? Where it came from and how it got to where it is today?

Colin: We’re relatively new and relatively young. I think we are blessed to have a dedicated research group and research leadership, given our size and our age. So, how did we get there? You know, when CDS was a young and scrappy startup, a handful of people, I think the roles were not as clearly defined and people did whatever they could do to help the partnerships move forward. And luckily one of the skills in that mix was research. There were people there who thought that design research was important and who liked going out and talking to people about services. And so from the very beginning that was a part of how we did our work. And so as we grew, and then we had to kind of start dividing into more discrete communities of practice, and being a little clearer about what people’s roles were, I don’t think there was ever a question that research was an important dedicated skill. I think it’s very related to design, and so I think that we’ve gone through some iterations of figuring out exactly how we relate to design. But right now we’re a parallel community at the same level as design or product management or engineering. And I think the more we exist that way, the more we like it, because researchers need different kinds of support and coaching than other folks do, right? Research is a really different skillset than designing, at a service or an interaction or even a visual level, right? And so I think our researchers are really happy to have a dedicated group where they can get feedback on their craft and have managers that are selected and rewarded not just for their expertise in general design, but for doing research in this world. So, I think we’ve been happy with how that’s shaken out. And I think we’re just lucky that research was part of CDS’s way from the very beginning, down to the people who were founding the team.

Steve: Did you come into CDS in the role that you have now?

Colin: I did. So, I came to CDS with experience at a similar organization in the U.S. federal government. And when I arrived it was to lead the design research team with a recognition that that was a distinct team with sort of distinct support needs and a need for a manager that knew research and knew about doing research in government. So, I know that that’s not always true and so I feel very blessed to have ended up in a situation like that.

Steve: So, there was a team, or a nascent team, and that team needed leadership?

Colin: Yes. It was a small team at that point. I think there were three of us when I arrived, and we’ve grown substantially. I think we’re now 10 people or so, and continue to be a part of the growth plans for the org.

Steve: So those 10 people – you sort of described early on about how they would be working closely with a specific department or team that they were kind of part of.

Colin: Yeah.

Steve: Maybe you can talk about what that cycle looks like or sort of how projects or jobs or roles and researchers are matched up over time. What does that look like?

Colin: Sure. Yeah, so researchers are embedded on interdisciplinary teams. We call them product teams. And those product teams work with partners through some phases. So, the first phase is a discovery phase, and that is a phase where researchers really lead in open-ended research about the nature of the problem that the department is trying to solve. We usually get set up with departments who know they have an issue and are interested in kind of our different approach to things, or they have a goal and they’re interested in us helping them achieve it. But researchers lead their teams in unpacking what that means, particularly for the humans involved, right. There’s important technical discovery that happens in parallel, but we really try to get the whole team involved in talking with not just the members of the public implicated by the service, but also the staff involved in delivering a service. You know, government services are this wonderful, like, sociotechnical network of people, and we try to understand how those all fit together in discovery. It’s very rare that we would just talk to the public. We do spend a lot of time doing that – we really emphasize it – but we also really try hard to see the other side of the service and how all those people work together to create the experience the public sees. So, that’s discovery, right. Getting the lay of the land and understanding perhaps some of the possible roots of the problem.

And then a team transitions into alpha and beta phases of building something. What that is varies tremendously based on what the problem is. I think one of our product teams with Immigration, Refugees and Citizenship Canada did a bunch of work on a letter that was involved in a rescheduling process. That was sort of classic paper content design, tested by a researcher. It happened in conjunction with some digital work they did, but was just as important. So, that could be part of an alpha or a beta, or it could be about building a new digital service, like we did with Veterans Affairs Canada building a new directory of benefits for veterans. Those products take different shapes, but regardless the researcher is trying to bring the voice of all of the people we talk to in discovery back into the product development cycle. And so that can mean usability testing, or content testing. As our products get more mature, we also use quantitative methods. We run randomized controlled trials. We try to use analytics on things that are out in the wild. Researchers use all of the methods at their disposal to try to keep bringing the voice of the people we’re serving into the product process. And I think what that looks like, as I say, just varies tremendously, and we kind of like it that way. I think that’s one of the cool parts of being a researcher at CDS: we say, well, you have this team, they have their needs, you need to serve them and make sure they have the right information about people to make good product decisions, but there are lots of different ways we can get at those answers, and we agilely – probably to overuse that word – assemble the methods and the timelines accordingly.

Steve: So, a researcher working on, for example, the Veterans Affairs, for the duration of that program that’s the thing that they’re working on?

Colin: Exactly. Exactly. So, we’ve been really fortunate to basically have one researcher per product team and to be able to keep them on the same team. Sometimes that’s not possible, but you know, research, as hard as we try not to do this, sometimes becomes a practice of implicit expertise, right? I think researchers who have a long history in an area, or with a product, kind of know things about how people will respond to the service that are hard to articulate, or hard to systematically articulate, in reports or the other ways that we try to codify that knowledge. And so we see huge benefit to people having some longevity on their product teams, and thus in the domain that they’re working in. It’s not always possible, and I understand it’s not possible everywhere, but for us I think researchers really enjoy building the deep expertise that comes with doing 10, 15 or 20 studies for a particular product in a particular area.

Steve: Depending on the type of organization, the carryover from one product team to another – I mean in my mind government is sort of an example of a category where there’s a lot of different things that you’re doing.

Colin: Yeah.

Steve: And obviously there’s cultural things and organizational things. Whereas if you’re working in some commercial enterprise that maybe makes a lot of different products and maybe serves different customers, the breadth might be less.

Colin: Yeah.

Steve: So the value, I guess, if I’m doing the math in my muttering – that sort of hard-won knowledge maybe needs to be preserved or cherished in a different way, just given the breadth of what you’re doing.

Colin: I think you’re right. And one of the fun things about getting to support these folks is that I can have conversations on a given day that range from how the Royal Canadian Mounted Police want to handle cybercrime, to how the Canada Revenue Agency processes tax returns for low-income people, to how Employment and Social Development Canada issues benefits, right. And those are incredibly different, both in business processes and missions, but they also involve very different people and very different kinds of concerns on the part of those people. So, it’s pretty neat to get to hear about all of that, but it also creates a challenge in that when we start on a new team, or a new researcher comes to a new team, I think there’s some sort of basic government knowledge they’ll bring with them, but often they’re trying to come up to speed on a pretty complicated area, pretty quickly.

Steve: When you described your role earlier – that kind of coaching relationship you have with the different researchers – it sounds like you’re the one who has the overview, the window into these different product teams and what they’re doing. What kind of interaction do the researchers have with each other, not necessarily through you, about what each is working on or what kind of challenges they’re facing?

Colin: Yeah, absolutely. One of the things I learned very early on was that I was not the best conduit for all of that information between them and each other. And so I increasingly view my role as creating opportunities for them to share directly with each other and across the org. So, what that boils down to – there are, first of all, weekly researcher standups, where we talk about what research we have done. And we try to focus not just on what we did, but on what we’re learning. And that’s usually not enough to create understanding, but it’s enough to create a hook, right. It’s enough for one person to say, “oh, that person is really talking about the experience of submitting a form that’s an application for benefits, and that’s actually kind of similar to what we’re doing, so maybe we should go have a deeper conversation, or I should ask for some documentation.”

We also have rotating, dedicated research critique groups. So, researchers form groups of 3 or 4. They meet weekly to give each other feedback on their work. It’s a little trickier to give critique on research than it is on design artifacts, but just as important. So, they’ll go through research plans with each other. They’ll go through recordings of interviews and sort of talk about the approach that the researcher took and different ones other people might have taken. They’ll go through reports and deliverables. And although the kind of stated purpose of those sessions is to help people grow and share skills with one another, I’ve observed that a really common output is better cross-product knowledge between them, right. If you’ve spent the time really thinking about the pros and cons of someone’s research approach, you tend to understand their product a bit better. So we have critique groups.

And then one of the other kinds of structures we have at CDS is these research community meetings that are actually open to everyone in the organization. So, every other week we host these 45-minute, brown-bag, lunch-style meetings where researchers give talks. They give talks either on their product work, when they’ve recently finished it up, or on new or interesting methodological or government-logistics things that they’re working on. So, we try to create lots of channels for that information to flow. I think that we’re still at a size where some of those meeting- and interaction-driven approaches work. As we grow, I suspect we’ll have to get more creative.

Steve: Right, critique circles for ten is different than doing that for 80, or brown-bags. Which is crazy that I would throw that number around and everyone would nod, like yeah, that’s the size of research groups in some organizations now.

Colin: Sure, sure.

Steve: It’s not that long ago that that was an absolutely ridiculous idea.

Colin: Absolutely.

Steve: So, at ten, you can have some kind of communal knowledge just based on – well, you’re putting formal things in place for sort of semi-formal knowledge exchange.

Colin: Yeah. Yeah, that’s right. I think that we’re certainly beyond the totally informal, everyone-talks-over-lunch-about-what’s-happening size. I think we are at the edge of what works well for critique groups and sort of community meeting exchanges. But again, it’s a little trickier for us because our domains are so different, right. And so it’s harder to say we’re building a common understanding around the user of a particular product that we all study. Often, we’re studying very different things, and there are things to learn across them, but there are also real differences between them.

Steve: I’m going to switch gears a little bit. I think it builds on some of what we’re talking about, but as you talk about the team having grown, and maybe growing into the future, what do you look for? What makes for a good researcher for CDS?

Colin: So, we focus broadly on a couple of things. I think the first is craft, for lack of a better word. So, when we hire people, and when we’re going through interviews, we really dig into the details of their previous work. You know – how did you construct the research questions? Why did you pick those questions? How did you pick the methods that you used to answer those questions? Why did you pick those methods? We spend cumulative hours working through that with folks because we really believe that the basic skills are so important – if you don’t have them down, they’re hard to maintain in our challenging context, right? So, you really have to be a great baseline researcher and know the basics really well to succeed here, because – and this gets to the second factor – it’s hard to do research within government and on product teams. And I think it’s hard to do research everywhere, but there are a number of challenges that folks need to be ready to meet. One of those that we look for, and also spend a lot of time talking about, is the ability to recruit research participants and build relationships and structures around doing that. Because we’re changing domains and going into new areas, it’s not uncommon for us to start a product and put a researcher on it where there’s a very specific group of people that we’re trying to talk to, who would be very hard to pay a commercial recruiter to access. And so then they have to get creative. They have to go make friends in the advocacy worlds related to that department. Or they have to work with that department to use administrative data to find people. All of these things are much harder than hiring a recruiter to do the work for you. That’s not to say we don’t use recruiters. Sometimes we can, but a lot of what we do is so narrow that we need to find people who don’t just love the craft of doing research, but love the craft of recruiting. I think over time we continue to look at ways to make recruiting less of the job, but it frankly just still is a reality, given how many contexts we operate in and what people have to do. So, recruiting is one piece of it.

And I think the other piece of it is broadly what we call bureaucracy hacking, and that is being able to navigate a bureaucratic process to get your work done within the time that we need it done, with enough cognitive flexibility to think through what the really important pieces are, where there might be alternative routes to the really well-trodden one, and ultimately get a result despite a complicated, multi-person, multi-process situation. So, we look for people who are excited to do that and who have some demonstrated skill navigating situations like that.

Steve: Is the public opinion research story you talked about earlier, is that an example of bureaucracy hacking?

Colin: I think it’s an example of institutional-level bureaucracy hacking. I think that product teams themselves, and people who are working on those teams, often have to do the same thing, but at a lower level. So, they’re working with a partner who says, “oh, like we have a departmental process around talking to this group of people. We need to go through that process in order to do this research. That process usually takes nine months. Your research is supposed to take three weeks. How do we make that happen?” And it’s kind of an interesting skillset, right, that’s required to navigate those situations. It’s not just knowing what good research is. It’s also your ability to think analytically about a process, understand the reasons for the pieces of that process, and then look more broadly. I will say, as we grow, we try to do less of that at a departmental level and more of it at a broader institutional level, but that’s still a work in progress.

Steve: It’s interesting, and maybe just coincidental, but you talked about looking for recruiting skills, very specific kinds of skills, in talking to researchers, and then this bureaucracy hacking example that you gave is around the logistics of recruiting participants.

Colin: Yeah, yeah.

Steve: You know, I’m glad we’re highlighting recruiting. I think it’s sort of neglected – “just get people and talk to them” is sometimes the belief in research. You’re talking in some cases about getting to maybe harder-to-find groups of people, or groups where there’s a specific relationship. And we also talk a lot, in research in general, about operationalizing some of those things. We haven’t talked about that. What’s your view on that – whether it’s the recruiting part or just in general, in the context that you’re in, how does that idea play out?

Colin: Yeah. So – I mean I think that it’s not surprising, and certainly important, that the industry at large is increasingly focused on operationalizing processes like recruiting and thinking about ways to divide that work and to scale it more efficiently. That seems totally reasonable given the size of research teams that are now part of the modern organization. I will say, at CDS I approach it with some caution, and I’m worried I’m going to sound a little old and cantankerous, despite not being much of either…

Steve: You have me though on the call. So, just compare to me. I can be old and cantankerous. I’ll take the heat on that.

Colin: I appreciate that, Steve. No, for a lot of our work, the work of doing the recruiting is part of the research itself, right. So, actually going out and making connections with community organizations, or professional boards, or senior centers, or all of the places where we do our research – that’s part of how we understand who we’re trying to serve and the social structures that surround them. And so when I think about how we’ll scale, I try not to think about ways that we would totally take that out of researchers’ hands, because I think they would lose part of the picture of who they’re trying to study if recruiting were done in its own right – if it were separated into a different role. The other thing that strikes me is that when I look at the substance of our research, at the substance of what we do, some of it would, I think, be possible to do faster in a kind of mechanistic way, right? Like if we had better templates and better knowledge stores and better processes to string those things together, it could speed things up a little bit. But often the idiosyncrasies in people’s research, and the sort of surprises that don’t fit within the template, are the most important things that our researchers find. And, you know, perhaps that’s a sign of our organization’s maturity, but I get worried about whisking all of it into a well-oiled, sleek process – I wonder about the edge-of-the-whiteboard, straggly-sticky-note-on-the-margins insights that we’d miss. And for us, in our work, those are so often key, right. We’re still building an understanding of the space, and so often the things that don’t fit well within your predetermined framework are the most interesting parts for us. And I worry about losing that. That said, making the logistics of scheduling people, and where they go, easier – I am all for it, and we continue to look for ways to do that. But I think we have to be thoughtful about what we automate in our industry, just like all knowledge workers should be, I suppose.

Steve: You made a comment a few minutes ago – when you were talking about research in general – you said, “it’s hard.”

Colin: Yeah.

Steve: I just want to go back to that. I don’t know if I agree or disagree. Can we just reflect on that notion that research is something that’s hard? Like, what about it is hard? Should it be hard? Is that a bug or a feature?

Colin: Yeah. As it came out of my mouth, I thought to myself, oh, that’s kind of a complicated statement, I wonder what I meant by that. I would say this. In some ways research isn’t and shouldn’t be hard. I think one of the things I love about this work is that I can sit down with someone from any one of our teams and explain some basics, and they can go out and start learning things in a more systematic way with some good pointers from a good 30-minute discussion. And that’s great. And so I don’t want anything I’m about to say to be read as discouraging making research accessible to everyone. We say everyone at CDS can be a researcher, and I really believe that. I will say that research is also one of those things that’s very easy to do mediocrely. I think that there are a lot of subtleties to how you make people comfortable in research sessions. There’s subtlety to how you digest lots of qualitative information. There’s subtlety to how you arrange your research so it influences your product in the most responsible, but impactful, way. I think there is subtlety and trickiness to that, and so I think it’s important to have an appreciation for people who are really good at doing those things – I don’t necessarily count myself as one of them for all of them – and to recognize that that is a real skill, and a skill that we should celebrate in our broader industry community. I think we can do that while also saying, hey, there are some basic things that people can do that are easy and quick and help more people gather more data to make better decisions. I think those are sometimes placed in a false binary in the Twittersphere, for example, and I am not as interested in that. At CDS – and I think my comment was somewhat related to the Canadian Digital Service in particular – there are some things that are, I won’t say uniquely hard about research, but are specific to our context. You know, one of those is the sort of incredible domain switching that we expect researchers to do every 3 to 9 to 12 months. And that’s, I think, not unusual in consulting, but it is somewhat more unusual in a product-driven organization like ours. There is also, I think, the hard work of figuring out how to fit your research into an interdisciplinary team. You know, we don’t just write reports and make presentations and give them and sort of hand them off to designers. We don’t let people off the hook at that phase. They’re kind of in the trenches with all those other people as the product is being developed. And we expect the way that teams run their agile development process to reflect research, and for researchers to be a voice in that. And I think there are lots of organizations that have that expectation. I think we’re trying to do that in combination with domain switching, and with the third part of it, which is, again, we’re trying to build skills. Not just in ourselves, but in our departmental partners. So, you’re trying to learn a new domain, fit your research skills into this rapidly evolving, interdisciplinary team, and help a partner learn the basics of research and appreciate research. I think that’s hard. I think it’s fair to say that’s a difficult thing to do.

Steve: I want to switch a little bit. I would love to hear if there was a point in your own personal, professional path when research was a thing that you identified with? Like I want to do that, I am that, that’s for me? I don’t know, is there a moment or a stage at which you connected with the field that you are in now?

Colin: Yeah. There was. I conducted my first usability test when I was in 7th grade. This was pretty early in the days of such things. But I was working on the school’s website – that was sort of my hobby; I helped the computer teacher with that. And I was reading Jakob Nielsen’s Homepage Usability book. It was this beautiful book with multiple sort of printed-out homepages, and he talked about his methodology in the back, and I read that. I was like, oh, this usability test is sort of something we could do. So, I got someone into the computer lab and tried it out, and there was this moment of like, wow – when you ask other people to try something, and when you ask them questions about what they’re thinking and what their goals are, you challenge yourself in ways that I didn’t expect. And it’s also incredibly rewarding. It’s an incredible high. I don’t know how else to describe it. And it continues to drive me and the folks on my team. So, from that moment on I knew that I wanted to do this kind of research in some way professionally. There were lots of steps between there and here, but I knew pretty early in life that I really loved learning about how people used computers and services and understanding their approach. And, you know, I’m really blessed to work alongside lots of other people who are similarly jazzed to learn those things about people.

Steve: And that’s an astonishing story to me, both in how early in your life this happened and in how specific it was. I think often these stories are about, oh, I went to a farm and then I realized I wanted to be a marine biologist – the connections are more diffuse. But you, at a very young age, were doing exactly, or pretty close to exactly, the thing. I mean it’s not even a metaphorical discovery.

Colin: Yeah.

Steve: You literally discovered the work.

Colin: It’s odd to me too. And I do look at friends’ and family’s career trajectories and I’m like hah, I didn’t think that’s how it would work out for me, but I really – I just loved it. And I looked into other things, but kept coming back to this, this being what I really just enjoy doing.

Steve: Are there any points, whether it’s 7th grade or older or younger – what kinds of things can you look to in the earlier parts of your life that were, I don’t know, maybe weak signals or kind of nascent superpowers that connect to what you’re passionate about and what you’re spending time on now? Does anything exist earlier in your life?

Colin: You know, one thing that has always been somewhat odd about me, and I do think relates to my passion for this work: I have always loved seeing the metaphorical or physical backroom of the process. So, like, when I’m flying, I’m always like, hah, what does the computer system that the, you know, gate agent uses look like? And what do they do and when do they do it and why? And I’ve always been like that, to the point of some, I think, good-natured teasing from my family about my desire to understand the details of how the parking ticket machine dispenser system works and what the numbers on that thing mean. So, I think, especially to do this work in government, you have to have a love of uncovering process and humans and how they interact with that process. And the trick is you don’t have to love, like, process itself, but you have to love kind of postmodern process. Right? You have to love the fact that it is different things to different people and is this great mirror game of clarity and unclarity all at the same time. That’s something from an early age that I’ve always really enjoyed. I also would say that early in my career I worked for the U.S. National Park Service, and one of the things that I did there was being a park guide. So, being out on the trail talking to people and listening to them and answering their questions and seeing how they interact with a place. There’s real joy – I feel real joy – in seeing people in an environment and both helping them discover things and seeing how they discover things. And often I feel like my work today isn’t that much different, right. I think I’m more open-ended than park interpreters are, but it’s still: here is this new land and this new world that you are encountering, and I get to be with you as you’re doing that and help you think through what you’re doing. And so I think the kind of odd areas of joy I found in that work certainly echo through to what I’m doing now.

Steve: Is there an example of a notable success? Something that, you know, maybe your researchers have done and kind of completed that you feel proud of? Or just a notable example to share?

Colin: Yeah. I think that – it’s funny. You know, we’ve done a lot of research that I’m proud of, and I’m proud of our researchers for doing it at CDS. But the things that make me most excited are the moments when I see our researchers teaching and coaching people and departments to start doing research themselves, right. So, for example, with the Veterans Affairs Canada project, the department identified someone who would be a researcher and sort of continue the research on that product after we handed it back off to them. And, you know, those moments when those people were working together, and getting to support them through that – I think those probably make me feel prouder than any particular deliverable or product that we’ve gotten out the door.

I will say the other exception there is – I was looking at our sort of internal database of research before this interview, and I’ll say the other thing I’m particularly proud of our team for doing is becoming more