Jon Krohn: 00:06
This is episode number 658 with Brian T. O’Neill, founder of Designing for Analytics.
00:19
Welcome back to the SuperDataScience Podcast. Today I’m joined by Brian T. O’Neill, who founded and runs Designing for Analytics, a consultancy that specializes in designing analytics and machine learning products so that they are yearned for and adopted by users. Brian also hosts the Experiencing Data Podcast, an entertaining show that covers how to use product development methodologies and UX design to drive meaningful user and business outcomes with data. In today’s episode, Brian synthesizes the most critical information from his hundred-plus podcast episodes to fill us in on what data product management is, why so many data projects fail, how to develop machine learning powered products that users love, and the teams and skillsets required to develop successful data products. All right, let’s jump right into our conversation.
01:10
Brian, welcome back to the SuperDataScience Podcast. Kirill, the host of the show before me, had you on in episode number 353. It was a great episode, but it was also years ago, and we needed to have you back. Brian, how’s it going? Where in the world are you calling in from?
Brian T. O’Neill: 01:25
I’m still in Cambridge, Cambridge, Massachusetts, People’s Republic of Cambridge, as it’s called sometimes. A little less Covid going on. That was the deep Covid, like when we did that show. I remember.
Jon Krohn: 01:37
Oh, really?
Brian T. O’Neill: 01:39
Things are a little better in that regard.
Jon Krohn: 01:42
I’m delighted to hear that. And so we were introduced to each other by Tom Davenport, who was a recent guest on the show. So he was in episode number 647, and as with many guests at the end of filming with him, I said, Tom, you’ve been an absolutely A+ amazing guest. Who else do you know is an A+ guest? And he right away said, Brian T. O’Neill. And I immediately looked you up, loved what you had going on, asked you to be a guest on the show, only to later discover that Kirill had already had you on the show. But it doesn’t matter because you’re a fascinating person. And I’m gonna be asking different questions anyway.
Brian T. O’Neill: 02:21
All right. Let’s do it.
Jon Krohn: 02:24
So you’ve been leading Designing for Analytics, it’s your firm, you’ve been leading Designing for Analytics for 25 years. So, Designing for Analytics is focused on making data products successful. So first let’s define for the audience what a data product is.
Brian T. O’Neill: 02:44
Sure, sure. Well, first I just wanna correct that. I think that came from LinkedIn. So Designing for Analytics is my new consultancy, well, newish, it’s about six, seven years old now. I’ve been doing design consulting for 25 years. I changed the name and formally focused on the data space six, seven years ago, something like that, since I had kind of been naturally doing it with the clients and projects, and even before, when I had W-2 employment jobs, I had been doing a lot of this kind of work. So I put an intention behind my focus back then. Just wanted to correct the record on that anyhow.
Jon Krohn: 03:20
So we could say, Designing for Analytics with like capital letters has been happening for six years.
Brian T. O’Neill: 03:25
Correct.
Jon Krohn: 03:25
Designing for analytics with lowercase has been happening for 25 years.
Brian T. O’Neill: 03:30
Yeah, a long while. A lot of stuff with data. Yeah. So, but what was your second question? Sorry. Now that I’ve, I’ve,
Jon Krohn: 03:37
It was, what is a data product?
Brian T. O’Neill: 03:39
Oh, what is a data product? Tom doesn’t totally love my answer to this, and I understand why, but I’m gonna give you what I call the producty definition. There are some other definitions out there, but the way I kind of think of this is: it’s a full end-to-end, human-in-the-loop decision support solution that’s so good, somebody might pay to use it. And this is not a perfect definition. It will not capture all solutions. The most important part of this definition, I think, for data professionals who are maybe hearing about data products and this idea of product orientation, is the last part: so good that somebody might pay to use it.
04:35
So what do I mean by this? If you look up a definition of the word product, we might think of product as the byproduct of labor, right? It’s the thing that comes out of some labor. That is absolutely not what I am talking about. That’s called an output. And I think the data community is really good at generating technical outputs. What I’m talking about is something that could be traded, it could be bought and sold. It has inherent value, where value is a subjective thing in the eyes of the beholder. And so even if you’re working internally, like let’s say you work at a bank and you help the fraud department and you build risk models or something like this. Someone in that risk department, maybe their compensation is based on how much risk they cap or how much they prevent the bank from losing money or something.
05:30
Something’s on the line for them that’s high value. And maybe they actually are funding projects with funny money within the bank and all this kind of stuff, right? But the important part here is, it becomes a product when it’s so good that they would pay to use it, or they would give something up of value. It’s indispensable. It has some quality attribute that is in their eyes and not ours. It’s not about how accurate a model is. It’s not about how cool the dashboard is cuz it uses violin charts or whatever it may be. It’s all subjective from their perspective. And that’s the producty mentality that I’m trying to push with this. It’s not really about the amount of effort that went into it, or which kinds of tools and technologies went into it. It has nothing to do with that.
06:18
It’s human in the loop. Because the other side of this, like Tom was saying, is, the one thing I don’t like is that this excludes automation, you know, for example, solutions that are running fully automated. In my opinion, there are not a lot of solutions that are entirely fully automated that never have any humans. Like what about model drift? What happens when things go off the rails, when Covid happens? All of a sudden, there are humans in the loop again, right? So there’s that whole side of it, but there are also the stakeholders or the sponsors or the third-party affected people that may not even be in the room when the thing is made. But let’s take another bank example: getting approved for credit, a line of credit, right?
07:03
Some customer on the end who was not involved with building the model that decides whether they get a mortgage or not, that’s a third party. All of these comprise humans in the loop. When we talk about this from a design standpoint, these are all humans in the loop. So there are humans in the loop even when the system is fully automated, because the business is humans. The customers are humans. The customers are not machines. The business is not machines, at least not yet. You know, the business is sometimes talked about as this entity, like it’s this separate organic thing, or it’s a machine or something. But ultimately it’s John and Jane and whoever in some department that’s, quote, the marketing business, and then their teams, et cetera. So that’s why I think humans in the loop is an integral part of this. Anyhow, that’s my definition.
Jon Krohn: 07:54
This definition, yeah, it makes perfect sense to me, especially from the UX perspective that you’re taking here. Sure, you know, there’s a human with eyeballs on a screen, probably, who’s interacting with this product, right? And in the background, there are data or data models that are allowing aspects of the product to function, right? So lots of different ways that data or models could be involved in products. So how do you bridge the gap? Like you said, lots of people, like data scientists like me, get super excited: “Oh, this amazing model. Wow, look at this crazy thing we’re doing. Look at how accurate it is.” How do you bridge the gap between me being excited about this data science capability, and somebody out there being so excited about this capability that they’re willing to pay me for it?
Brian T. O’Neill: 09:02
So are you saying that the gap is that when Jon likes it, it’s inherently not something yet that the customer likes?
Jon Krohn: 09:12
I guess so.
Brian T. O’Neill: 09:14
And then, or not you, whoever,
Jon Krohn: 09:17
Yeah, so I mean, the way that I stated that question doesn’t automatically assume that. But your premise, the premise of your company and your expertise, assumes that there are situations, and from my experience, a large range of situations, where a data scientist or machine learning engineer creates an amazing model and people don’t wanna use it. So how do,
Brian T. O’Neill: 09:46
Hey, it’s not my studies that say that. It’s all the other studies out there that say it. And, you know, it’s time and time again. There have been repeated studies about these problems. We’re still standing up giant data infrastructure, as Mark Madsen calls it, you know, intergalactic data systems, in order to stand up potentially usable things in the future. And so a lot of these projects become, well, what’s our strategy? We’re moving to Snowflake, we’re doing data mesh. That’s the strategy. That’s not really a strategy. There are no defined outcomes. There’s no definition of what done is, and done meaning qualities, OKRs, metrics, KPIs that have changed. That’s all about building technical infrastructure, which yes, I understand is part of it, but that’s all just inputs. That has nothing to do with outcomes, right?
10:39
So, yeah, we could go a lot of different directions here, but this premise that low adoption is a problem is something that’s been quantified quite a bit. And the problems with low adoption can range. They can be all over the place, from the problem not being well defined, which I think is quite frequently a problem. And I think one unique thing that data scientists, and maybe some of your listeners, have probably heard is what I call the data tennis game. And this is when: “Well, what’s your business problem that you need help with?” “Well, we want to use machine learning, but you know, what could we do with this data?” “Well, what are you trying to, like, what’s wrong? Like, what are you trying to do?” “Well, we wanna know how machine learning can help us do something.”
11:28
And so we’re hitting the ball back and forth and no one is stating what the problem is. The business thinks, you’re gonna help me figure it out. That’s what you’re here for. And the data science team is like, “No, I’m here to build models and predictive intelligence and help us solve problems that are really complicated. But I need to know what’s wrong before I can do anything.” So then you get to this question: whose job is it to surface the needs and the problems? And you can turn that into a role, or you can decide you’re gonna take ownership of that, cuz you get tired of building stuff that doesn’t get used. I think there’s definitely a class of data scientists that are still happy to work on technically right, effectively wrong solutions. You can write a paper about it.
12:10
There’s high predictive accuracy, maybe some unique technical aspects, but the model sits on a shelf in GitHub and collects dust. It doesn’t get used. And the feeling is, that’s not my problem: that they don’t know what they want to do, or that they have no plans to operationalize this. I hate that word. I don’t even like using it. The designer’s way of thinking about this is: no, the operationalization of the model is part of the design of the system entirely. You don’t do that afterwards. It’s not phase two. The operationalization, the usage of the thing, is designed into the solution from the beginning. And that’s also just a product idea too. It’s outcomes over features, right? It’s not about building the things, it’s about generating better future states for the people that we’re here to serve.
13:00
So you can either decide that it’s someone else’s job to figure out the problem, or you can jump into that role. You can ask for help, and you can hire data product managers, which I think are ultimately the people that are gonna own that responsibility. You have to do that kind of work. If you’re really gonna be doing data product management, that effectively is your work, right? It’s to figure out what is the value that we’re here to provide against the problems that we have surfaced. And I say surfaced because they’re usually not on the surface. Data scientists particularly, you’re going to get a lot of what are called presenting problems, right? We need to use deep learning to do X, and your audience probably has no idea what they’re talking about. And I’m sure some people are laughing right now cuz you’ve had this conversation. Or we need to use K-means clustering. We wanna use K-means clustering to do, fill in something where it’s not appropriate to use K-means clustering.
13:56
So you can laugh at that and say they have no idea what they’re talking about. Or you can look at it as, this person’s actually reaching out to me and trying to talk my language to help me understand what it is that they want. I get the same thing as a designer. When people come to me, it’s always, the dashboard needs to be redesigned. The charts need to look more like, what if Apple had done it? It needs to look like the iPhone. This is the designer’s equivalent to what I know the data science community hears a lot too. And I used to roll my eyes, like, okay, whatever. And now I know that’s just an opening line. It’s just the opening move in our little negotiation, the dance that we’re gonna do.
14:42
And I know deep inside there’s actually a real need behind that. And so my job is to get in there and to ask good questions. And this is really good too for introverts. And I think my general feeling after several years of this, because my own audience is very quiet, is that there are a lot of introverted data professionals. You don’t have to talk a lot. You actually can just do good listening and ask good questions. And by asking good questions, we can start to dig into what’s really behind this person who wants to use K-means clustering to do whatever. And maybe that’s the right tactic and maybe it’s not. And maybe that’s the nine-month version of a project that could be done in one month, but they don’t know the one-month version. And you do. And maybe it’s only 70% accurate, but that’s a billion dollars in cost savings and eight months faster.
15:31
And so your job really isn’t to give them K-means clustering unless that actually aligns with what the real needs are, which are not on the surface. They have to be surfaced much of the time. And this is kind of the normal state, I think, for people with product orientation. This is all normal state. We hear this stuff all the time. People aren’t really good at having this omniscient view of their own problem space. I mean, I was talking to someone, who was I talking to? Oh yeah, I’m gonna have this guest on my show to talk about building data products for reducing bias in models, where the customers are data scientists. So we’re talking about building tools for people who are generally pretty confident in their intelligence, where there’s a sense sometimes of, I don’t need it dumbed down. I don’t want it too simple.
16:22
I’ve heard this all my life, designing for options traders and finance types: “Don’t take anything away from me. Don’t take my data away from me.” But data scientists too have jobs to do. And there’s busy work. If you’ve spent time building pipelines, or just trying to get access to the data set to do all the stuff you were trained to do, all that stuff, that’s all tool time, and that’s not really what you’re there for, right? That’s not your magic zone. Good design can take all that stuff away. And in the same way, you can be trying to take that kind of busy work off your customer, your business sponsor or whatever, to help them get the thing that they want. But they’re not gonna tell you what that is on the surface, because they can’t, most of the time. They just haven’t thought about it that way. They probably don’t know how to measure it yet either.
17:14
And I think data people can ask the right questions and probably could suggest how to measure it. But it’s like, “Well, when do you wanna get a reduction in risk? And by how much?” And, “Well, how are we doing right now?” “I have no idea.” “Then how do we know it’s wrong?” And then you find out, well, someone yelled at me. Okay, now we have something: someone’s angry about X. “Well, why are they angry?” “Well, we lost 2 million because this model went off the rails.” “Oh, okay, now we’re getting somewhere.” You gotta kind of do that dance. It’s called laddering, or the five whys, there’s different language for it, but you keep asking why, we call it laddering up, until you kind of hit this nugget.
17:51
And it could feel weird doing this kind of work, cuz you feel like you might be nosy, or like, I’m challenging leadership, or something like this. And I can guarantee you, if you do it in the right way, in a genuine way that’s about serving, it’s not about pointing out someone’s fault. It’s like, I wanna understand, cuz I don’t wanna waste your time and money. I don’t want to go build the wrong thing. This stuff is hard. It’s not guaranteed to work, and it can take a long time. So the more I understand what’s behind your ask for K-means clustering, the faster I can come back with something that will serve you. It’s a gift.
Jon Krohn: 18:33
Yeah. So on one side of the equation, we have users or senior stakeholders who have some idea in their head of some functionality that could be better if data or data models were involved. And then on the other side of the equation, we have data scientists who can implement solutions to these problems but maybe don’t always know the right questions to ask. They don’t ladder, they don’t go into the five whys. They don’t understand what the core need is. So the data scientists end up building something that is roughly, you know, it is the deep learning model, but it doesn’t actually solve the problem that the user or the senior stakeholder came to them with in the first place. So it seems to me like a key role for bridging those two sides of the equation would be a data product manager. Yeah. So what’s a data product manager, and how does that role differ from a standard product manager? Like, is there some kind of skillset that a data product manager has that a product manager doesn’t have?
Brian T. O’Neill: 19:45
Well, yeah, and I don’t think there’s a great singular definition for that. I think, obviously, with the data product management space, there’s this assumption that there’s this underlying foundation of ingredients, which is going to be information, and that’s gonna be part of the building blocks of whatever it is that we make. I think a lot of the day-to-day work doesn’t necessarily change from product management, except that in your non-digital-native companies, the product may not be sold, right? The product may be something for internal use. The same mentality, though, can be applied to these internal works, even if they’re not going to pay for it. You approach it as if our salaries depend on this, the business depends on this. You set the level of quality such that we want to be not just doing okay, but ideally delighting people with the work that we do. And you set the bar at a quality where, again, that’s measured and it’s subjective in the eyes of the person it’s for, not us reflecting on our own work. That can still be applied there.
20:58
And this is just very different than a project mentality. Cuz a project is kind of like, I’ve assigned the parameters, they fit in a Jira ticket, here’s a definition of done. Done. And it’s like, we have no idea who asked for this. We don’t know what department’s gonna use it. We don’t know what engineer’s gonna implement it. It’s just like done. And at some point, yeah, there are gonna be technical delivery points that happen in any, any kind of technology. These are, we’re still talking about software at the end of the day, we’re still building software of some kind, right? But a product is actually kind of a never ending game. It’s kind of like reduce risk for the bank. Does the bank ever not have risk? No.
21:37
So that actually never ends. It’s just a never-ending game. And so for leadership, part of this idea, and Manav Misra talked about this when I interviewed him, he’s the chief data analytics officer at Regions Bank, is how he had to help get management to think about funding these initiatives as kind of ongoing games, right? Like, there’s always gonna be a need to optimize marketing spend, or again, reduce risk for the bank, or whatever it may be. They don’t really end, right? So it’s really a matter of, this is always going on, but there are gonna be milestones, which is like, you know, this quarter we have this objective around risk. It’s still under this overall thing of risk management, but this quarter it’s about X, Y and Z. And another producty way of thinking about this, and a lot of software companies are still maybe not doing this yet, Marty Cagan talks about this, is assigning ownership of the problem space to the team, and not the solution space.
22:34
So the team: to me, the classic trio is someone in this product management or data product management role; someone in a user experience or design role, whether or not it’s actually a designer, maybe it’s a BI developer or a dashboard developer, but somebody who has some responsibility for what the end people are going to interact with; and a software engineer who’s gonna package up the solution into whatever it’s gonna be. And the fourth role, I think especially for data products, is going to be a data scientist or some relevant data professional. But the idea is that you take this power trio or power quartet and you assign them the ownership, which is like, hey, the bank needs to reduce risk by this much.
23:18
They have an itch about fraud claims, or they have an itch that there’s, you know, ATM fraud. They think something’s wrong there and we could have less ATM fraud. I don’t know what it is, I’m just making something up. We don’t know exactly what’s wrong, but we wanna reduce this by 50% next quarter. Go. That’s very different than: build me a model that will tell me which ATMs seem to have unusually high withdrawal rates. Check, done. And the business sponsor’s like, “Great, I always wanted to know this, but what are you gonna do with that now? You gonna install cameras on the machines? They’re all different. Some of them are a hundred times unusual, some of them are ten times. There’s no plan.” It’s just, someone thought they needed to see the most unusually high withdrawal rates on ATMs cuz it sounds like it’s going to be useful.
24:11
But no one asked the question: what decisions are you gonna make once you find out? And the good thing is you can actually mock all this crap up in low fidelity. You can literally make this stuff up. Here’s a report of the top 100 ATMs with strange amounts. Now, what would you do with this, user? And this is exactly what designers do with paper prototypes or low-fidelity stuff. We use realistic data, as I talk about, and it’s important with data products that we actually design our wireframes or mockups or sketches with realistic data. Realistic just means it’s fudged data. But the point is we’re actually designing a little bit to figure out what needs to be designed. So you sketch it out and work in low fidelity, whatever that is. That could even be in Excel. You just hardcode a bunch of numbers into Excel and give them this report, and it’s just a list of these hundred ATMs with weird numbers. And we’re using that, putting it in front of the stakeholders who said they wanted this, and saying, what are you gonna do with it now?
25:08
“Oh, well, okay, I’m gonna put it into this thing and pivot it, and then I’m gonna upload it into Tableau. Cuz what I really want to do is see a chart of when these happen, to see if they all happen at the same time. And then if they happen at the same time, then I’m gonna call the security department and say, hey, could you watch on Mondays at 2:00 PM, cuz it seems like a lot of money gets taken out on Mondays at 2:00 PM.” “Oh. So what you’re really trying to do is figure out, is there theft happening at certain times? Well, maybe we could just go do that for you, figure it out and predict it, tell you when it happens, and maybe get in front of it before it happens. Would that help?” “Oh, that’d be awesome.” But what they started with was, “I need a report of all the ATMs with weird withdrawal amounts on it.” Right? And you can see the trap here. And this is why listening to the customer is important, but you’re not there to listen to what they want you to make. You’re there to dig into what the problems are. You’re there to understand, what have you done in the past about it, to look at past behavior, which is a much better indicator of future behavior. You’re not there to build exactly what they asked you for. It’s almost always a trap. It’s a trap! Okay, you gotta pull that star cruiser up.
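[Editor's note: the fudged-data mock described above doesn't need design tooling. As one possible sketch in Python, here is the kind of fabricated "top ATMs with unusual withdrawal amounts" report you could put in front of a stakeholder. Every column name, ATM ID, and number is invented purely for illustration.]

```python
import csv
import random

random.seed(42)  # reproducible fake data for the mock

def make_fake_atm_report(n_atms=100, path="atm_mockup.csv"):
    """Fabricate a low-fidelity 'unusual ATM withdrawals' report.

    This is not analysis. The point is to produce a plausible-looking
    artifact to show a stakeholder and ask: 'what would you do with this?'
    """
    rows = []
    for i in range(n_atms):
        baseline = random.randint(200, 600)        # typical daily withdrawals
        multiplier = random.choice([3, 10, 100])   # "3x / 10x / 100x unusual"
        rows.append({
            "atm_id": f"ATM-{i:04d}",
            "baseline_daily_withdrawals": baseline,
            "observed_daily_withdrawals": baseline * multiplier,
            "times_usual": multiplier,
        })
    # Most-unusual first, like the report the stakeholder asked for
    rows.sort(key=lambda r: r["times_usual"], reverse=True)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

[The resulting CSV opens directly in Excel, matching Brian's hardcode-numbers-in-Excel suggestion; the fudged rows exist only to provoke the "now what would you do?" conversation.]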
Jon Krohn: 26:17
You had a couple great tidbits in here. So I wanna start off with the power trio versus power quartet. So, in a standard product, so not a data product, a standard say software product, we have this power trio, of a product manager, a designer, and a software engineer. But when it’s a data product, we want to add in an additional person, a cellist, no, a data scientist. And that product manager in this quartet might be a data product manager. So we now have data product manager, UX designer, software engineer and data scientist on this data product development.
Brian T. O’Neill: 27:05
Yeah.
Jon Krohn: 27:06
And you brought up a really great point there, which is that this data product team needs to be asking the stakeholder, the user who’s going to be using this data product: given these results, given these, as you described, low-fidelity, realistic fake results,
Brian T. O’Neill: 27:26
Yeah.
Jon Krohn: 27:27
How are you going to act? And by going that extra step, you might be able to solve for the real problem that this person is looking for, instead of the intermediate one. Right, and yeah, so, you know, you could be spending months devising a solution only to get that solution in place and then hear, “Oh okay, this is great. Now what I’m gonna need is this.” And you could have just built the,
Brian T. O’Neill: 27:50
Second thing.
Jon Krohn: 27:50
The second thing, right off the bat.
Brian T. O’Neill: 27:52
Yep. I also say that that doesn’t necessarily mean you need four bodies. The trio, the quartet, the important thing is these are hats, and they’re roles, and you may not have all four of the roles, but I think all four of those distinct brain types, those perspectives, are important to have. And depending on the complexity of the situation, there might be a lot of need for the data science hat on one project. Another one might, you know, it’s like software. Maybe this model’s gonna have multiple touchpoints: internal services, API endpoints, mobile app, website, desktop, text messaging, like all over the place, right? Maybe the UX is really complicated cuz customers touch it. Admins, partners, vendors, roles, everyone has roles. Admins have this level, I can’t see all the data that went into the model, but you can, blah blah blah.
28:42
There’s lots of complicated UX stuff. It’s just important that these lenses are thought about here. And the other thing is, by having the team work with the customers and the users here, you’re gonna ask different questions. And this is the idea of, like, diverse opinions, diverse questions are gonna get you more information. Because if you and I, Jon, went in and did a project together, you’re gonna be asking questions that I’m not, and vice versa. I’m gonna be thinking a ton about workflow. I’m gonna be listening for emotionally charged language that gets people up. I’m gonna be thinking about what the other tool sets in use are: “How do you do this kind of stuff now?” You might be looking at all the stuff no one is even thinking to ask about. It’s like, “Okay, well, what if this scenario happens? Or what if this pipeline goes down and we don’t have any customer data?
29:29
Or what if it’s Christmas? Like, we’re expecting sales are gonna go crazy here. So have you thought about how it’s gonna behave when we have these normally unusual high-volume situations, which are not anomalies, they’re kind of expected? Do you know when all those are? Should we be accounting for that in the model or not? Should those just be real blips that we wanna treat like, ‘Oh my god, what’s going on?’ It’s like, ‘Nothing.’” You’re gonna be thinking about all those kinds of perspectives, right? The engineers will have a different thing. The product person could be thinking, frankly, about marketing and operationalization. How do I get everybody on board? What do I need to do to make sure that all the work that we did actually gets out and gets used, and the business is aware of it? What’s the ROI on this? Blah, blah, blah. So again, it could be two people that 50:50 share these roles somehow. Kind of like one data scientist who’s also enough of an engineer that he or she can cover that role, and the designer can also be the product person, or whatever, you know.
Jon Krohn: 30:35
That makes perfect sense to me. And that kind of thing happens, especially in early-stage companies where,
Brian T. O’Neill: 30:39
Absolutely.
Jon Krohn: 30:39
We can’t necessarily have all of the roles that we want, and we have to find people that are flexible, and who are interested in being that kind of flexible.
Brian T. O’Neill: 30:46
Right.
Jon Krohn: 30:46
So, following along with this power trio slash power quartet analogy, Brian, you’re not just the founder of Designing for Analytics, you’re also a professional musician.
Brian T. O’Neill: 31:03
You’re also a customer. That’s correct.
Jon Krohn: 31:08
Yeah. So you have a professional music education.
Brian T. O’Neill: 31:11
Yeah.
Jon Krohn: 31:11
And you do it professionally. Do you wanna just tell us a little bit about it? I think it’s interesting.
Brian T. O’Neill: 31:16
Sure. Yeah. I went to school for percussion performance, so yes, I get to play the drums. I got paid to play the triangle a couple weeks ago at the Albany Symphony. I didn’t watch the Super Bowl, but people were like, did you see the ad about the triangle player or something? It’s like, yeah, that was me. I literally got paid to play the triangle in one of the pieces. But besides the triangle, I do play a bunch of other percussion instruments, and I have a couple groups that I run. So I do a lot of Broadway, theater-type touring work that comes through the Boston area, and I tour with my own groups. Mr. Ho’s Orchestrotica is one of them. I play in a Klezmer Balkan party band for weddings and things like that.
Jon Krohn: 31:57
That’s cool.
Brian T. O’Neill: 31:57
Irish music, all kinds of different stuff. So yeah, that’s my other life.
Jon Krohn: 32:03
Super cool. Brian O’Neill into Irish Music. You’d never believe it. And on top of your Designing for Analytics company and your professional drumming, you also host the Experiencing Data podcast.
Brian T. O’Neill: 32:17
Yeah.
Jon Krohn: 32:18
So, this show, as people would probably suspect given everything that they’ve learned about you so far on this episode, gives a designer’s perspective on why outputs alone, so things like machine learning models, dashboards, or apps, aren’t enough to drive user or business outcomes with data. And you have lots of great guests on the show, people like Tom Davenport himself, to provide a designer’s perspective on solving these problems. So the ways of framing problems that you’re describing to our audience today, you’ve been exploring over the course of hundreds of episodes of your own show. People who are interested in this should check it out, and I think a lot of people should be interested, because designing human-centered data products is key to having your data product be used and generate revenue, whether you’re a data scientist, software engineer, or business person.
Brian T. O’Neill: 33:12
Yeah. You got it. Nobody wants technically right, effectively wrong, right? I mean, some people are okay with that, but no one that’s paying for it or waiting for your thing wants technically right, effectively wrong. They don’t really even want machine learning. They don’t want analytics, they want decision support, they want to feel empowered, they want business value, they want to get a raise, they want a job title change, they want trust, they want empowerment. That’s all this downstream feely stuff we have, and you gotta get to that. That’s actually where it’s at. And it feels weird cuz it’s not about the model, right? But that’s ultimately where the value is.
Jon Krohn: 33:52
Nice. Yeah. I can see how invaluable you would be in so many data product design discussions. Brian, if people want to follow your work after this episode, other than your own Experiencing Data podcast, how should they follow you or get in touch with you?
Brian T. O’Neill: 34:06
Yeah, I’d say LinkedIn is probably the place I’m most active. If you go to DesigningforAnalytics.com, you can find my LinkedIn link over there, and my email’s Brian@designingforanalytics.com. And yeah, I’m a little bit on Twitter, not too heavily involved over there, but feel free to reach out. I’ve also got a free mailing list where I put out insights about this product orientation, increasing user adoption of data products, and strategies for that. So, yeah.
Jon Krohn: 34:31
Nice. Love it. Brian, thank you so much for coming back onto the show. I learned a ton and I’m sure our audience did too. I’m sure it won’t be long, and hopefully it won’t take another pandemic, before we have you on again.
Brian T. O’Neill: 34:43
Yeah, sounds good. Thank you.
Jon Krohn: 34:46
Cool. That’s it for today’s mind-expanding episode with Brian T. O’Neill, in which he detailed how so many data products fail because the user problem being solved was not well understood from the very beginning; how data product managers work alongside UX designers, software engineers, and data scientists to form a power quartet that can develop outstanding data products; and how mocking up results with realistic simulated data enables you to simulate how a user of a data product would react, perhaps enabling you to dig deeper into what the user is really after. Alright, that’s super helpful and super actionable guidance. I hope you thoroughly enjoyed this episode with Brian. Until next time, keep on rocking it out there folks, and I’m looking forward to enjoying another round of the SuperDataScience Podcast with you very soon.