Turning data into dollars - Philip O'Donnell

Dec 15, 2022

This episode features an interview with Philip O’Donnell, Group SVP of Data Platforms at the Adecco Group, the world’s leading talent advisory and solutions company. Philip has 13 years of experience in data analytics leadership and strategy consulting across a variety of industries. Prior to the Adecco Group, Philip served as Director of Data Science at Lee Hecht Harrison. On this episode, Philip discusses managing data at a big enterprise, how to prevent business decisions based on bad data, and turning data into dollars.

Learn more: https://www.talend.com/resources/

Quotes

  • “Just showing data to people leaves it up to their interpretation. And that’s not usually the value you’re providing. You’re there to communicate something to them. And we talk about things like data storytelling or crafting narratives with the data, because data by itself is just too unhelpful. You really have to turn that into something that people can understand, and that’s a different skill set than it is to just analyze and reproduce the same information.”
  • “Most large firms are struggling with knowing that it’s better to have all this data in the same place. But that’s really hard, and it takes investment, and it takes time, and it takes executive commitment and buy-in, and I’m blessed to be able to have that at Adecco. They have really put the focus on, ‘Let’s figure this out.’ Like, ‘We know it’s hard. We know it’s not easy, but let’s do it. Let’s make sure that we’re dedicating the resources to do it.’”
  • “It was always the question of how much do you trust the data that you’re getting? As data professionals will tell you, we don’t create the data. We’re getting it, we’re interpreting it, we’re reading it, we’re organizing it, we’re structuring it. But we don’t create it. Something else creates it, some sort of business process. And I’m not in charge of that. So in some cases, the data quality is, did the report refresh on time? Now that’s the kind of data quality that we, as data professionals, should be able to own. But the other kind of data quality is, did someone enter it in the system correctly? And we can’t really control that. But what we can do is give visibility to whether or not that’s happening correctly.”
  • “It’s one thing for a report to be wrong. It’s a different thing for you to tell the person that report is wrong so that they don’t use it to make a decision. And then there’s some sort of a process that’s correcting it. And I think that’s where we have to try to focus: it’s a very realistic assessment of what our scope can be as data professionals. And we’re mostly focused on informing and visibility. If there are data quality issues, the worst case scenario is that someone makes a decision on bad data.”
  • “If there’s a piece of data that we’re asking people for as a part of the process and it’s not actually required for the process, [but just] because we want to know it, it’s going to be very difficult to have that be high quality [data]. Because there’s not an incentive from the person entering it, other than the threat of being yelled at because you didn’t do it right… So if you can give ways of providing value to those users with the data that they’re putting in, then you create a sort of incentive feedback loop… You have to provide ways of giving people incentive to enter the right information that is actually then helping them do their job better instead of it just being something that management dictates you have to put in.”
  • “You need to understand the data. You need to understand what it can tell you, what it can’t tell you, and then you need to figure out what you can do [with it], because that’s how you demonstrate value.”

Time Stamps

0:00 Intro

7:09 The role of the data professional

10:09 Consolidating mass amounts of data at a large company

11:07 Risk management and controlling exfiltration

15:18 Proving value and ROI in data

20:50 Understanding incidental data and how to monetize it

24:20 To centralize or to decentralize the data?

29:09 Learning to trust the data

31:40 Incentivizing accurate data input