Humanizing Data: Key Takeaways from Scott Smith's Session at Digital Travel

09/02/2025

At Digital Travel 2025, Scott Smith, Director of Airport Operations at United Airlines, presented the case study "Humanizing Data: Leading with Future-Focused Metrics in a Rapidly Evolving World." Drawing on his experience improving the customer experience at Panasonic and now United, Smith challenged industry leaders to move beyond outdated, normalized metrics like time-to-gate. His session offered actionable strategies for creating tailored data stories that prioritize traveler journeys in today's complex aviation landscape.

Key Takeaways

1. Define metrics at the start

Smith stressed that new innovations demand fresh metrics established upfront, not retrofitted at the end. Too often, projects launch without clear success measures, leading to flawed evaluations. By building feedback loops from day one—asking "What question does this metric answer?"—teams can align data with goals like operational efficiency or customer experience, enabling real-time pivots as seen in United's hub electrification and boarding process overhauls.

2. Ditch outdated industry standards

Using time-to-gate as a prime example, Smith critiqued metrics defined 30 years ago that ignore modern complexities like 93 stopping points between check-in and gate, or time spent shopping. This generic A-to-B measure fails to pinpoint issues in check-in or TSA flow. He advocated reevaluating such relics, which were often built for staffing models rather than traveler satisfaction, to create more relevant insights amid evolving travel demands.

3. Humanize data with traveler stories

Smith encouraged viewing metrics through the lens of individual journeys, like "Bob walking to the gate," who wants to shop, not get stuck. He suggested analyzing customer complaints for keywords, since searching thousands of them can surface the 3-4% trends driving disruptions. Tools like a disruption index built on AI video analysis can quantify stoppages, turning raw data into empathetic narratives that boost satisfaction and revenue, aligning with United's 2025 customer experience investments.
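As a rough illustration of the complaint-mining approach Smith describes, here is a minimal sketch in Python. The function name, keywords, and 3% threshold are illustrative assumptions, not United's actual tooling:

```python
# Hypothetical sketch of keyword-trend mining over customer complaints.
from collections import Counter
import re

def keyword_trends(complaints, keywords, threshold=0.03):
    """Return the share of complaints mentioning each keyword,
    keeping only keywords above `threshold` (e.g. Smith's 3-4% rule of thumb).
    Keywords are assumed to be lowercase single words."""
    if not complaints:
        return {}
    total = len(complaints)
    hits = Counter()
    for text in complaints:
        # Tokenize each complaint into a set of lowercase words.
        words = set(re.findall(r"[a-z']+", text.lower()))
        for kw in keywords:
            if kw in words:
                hits[kw] += 1
    shares = {kw: hits[kw] / total for kw in keywords}
    return {kw: share for kw, share in shares.items() if share >= threshold}

# Toy example: "stuck" appears in 2 of 5 complaints, well above the threshold.
complaints = [
    "got stuck at the checkpoint for 20 minutes",
    "boarding was smooth",
    "line was stuck behind one scanner",
    "great crew",
    "wifi never connected",
]
print(keyword_trends(complaints, ["stuck", "wifi"]))
```

In practice a real pipeline would search "hundreds of thousands" of complaints, as Smith says, but the shape is the same: read one or two complaints, pick a keyword, and measure what fraction of the whole corpus mentions it.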

4. Leverage data teams early

Integrate analysts and engineers in parallel with product development, moving from manual to automated metric creation. They'll spot trends, like process flaws, allowing mid-project adjustments. Smith repositioned data teams as allies proving success, not critics, shifting perceptions so "data calls" excite rather than alarm. This fosters collaboration in fast-paced environments like United's agile "hit squad" operations.

5. Craft compelling data presentations

Start meetings with the story, proposed solutions, and new metrics upfront, rather than burying audiences in 40,000 data points. If a metric truly tells the journey, executives will embrace it without explanation. Smith recalled convincing skeptical executives simply by showing impactful visuals, and emphasized mission statements that humanize data for tangible improvements in traveler flow and loyalty.

Why It Matters

In a travel industry facing record passenger volumes and investments, like United's $59.1B 2025 revenue driven by capacity growth and CX upgrades, sticking to outdated metrics risks missing genuine pain points. Smith's approach addresses this by humanizing data, connecting operational tweaks to traveler frustrations amid AI integrations and sustainability pushes.

For leaders, it means turning complex journeys into measurable wins, enhancing efficiency, satisfaction, and competitiveness in a post-pandemic world of premium demand and hub expansions.

Actionable Insights

  • Define metrics upfront: Build feedback loops at project start to measure true success.
  • Question legacy metrics: Ask what original problem they solved and if it fits today.
  • Mine customer complaints: Search for keyword trends to inspire new indicators like disruption indexes.
  • Lead with stories: Present data via traveler journeys and mission statements to win buy-in.
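To make the "disruption index" idea concrete, here is a small sketch assuming per-passenger dwell times at a checkpoint stage are available (for example, derived from AI video timestamps). The function name, data format, and 120-second threshold are hypothetical, not a description of United's actual system:

```python
# Hypothetical disruption index: what fraction of passenger transits stalled?
def disruption_index(dwell_times_sec, stoppage_threshold_sec=120):
    """Fraction of passengers whose dwell time at one checkpoint stage
    exceeded the stoppage threshold. dwell_times_sec is a list of seconds
    per passenger, e.g. extracted from video analysis."""
    if not dwell_times_sec:
        return 0.0
    stoppages = sum(1 for t in dwell_times_sec if t > stoppage_threshold_sec)
    return stoppages / len(dwell_times_sec)

# Toy example: 2 of 8 passengers exceeded the 120-second threshold.
times = [45, 60, 310, 90, 75, 500, 30, 110]
print(f"disruption index: {disruption_index(times):.2f}")  # prints 0.25
```

Unlike a single time-to-gate number, an index like this points at *where* the journey stalled, which is the kind of supplemental metric Smith argues should be defined at the start of a project.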

Want more insights from Digital Travel? Click here to learn more about the program.

Full Session Transcript

Digital Travel 2025. CASE STUDY PRESENTATION: Humanizing Data: Leading with Future-Focused Metrics in a Rapidly Evolving World

Announcer: All right, everyone, let's keep this show moving along. Our next case study is "Humanizing Data: Leading with Future-Focused Metrics in a Rapidly Evolving World" with Scott from United Airlines. Come on up, Scott. Round of applause.

Scott Smith, Director Airport Operations, United Airlines: I like that song. Hey everyone, I'm Scott, I'm a nerd. Okay, so we got that outta the way. I'm also gonna be referencing Lauren's presentation a couple times 'cause she's a victim of her own success. Like all true data people, we like to say, tell a story with your data. So I'm gonna give you my story.

I was born in a small town in Wisconsin. And like every small child, I thought that racing dirt bikes professionally was gonna be my life. That didn't happen. I started an auto shop and then I moved to a company called Panasonic, which oddly enough was like this huge electronics company, but now it's like dwindling, I think maybe 'cause they don't sell TVs anymore.

Anyways, I was there for 15 years. I did some online content creation, and I ascended through a group called Field Engineering. What we did was we looked to quantify the customer experience based on software and hardware. So, there's a TV screen in the seat back in your airplane, and Panasonic has about 95% market share.

So if you've used the seat-back TV, it probably came from Panasonic. They just didn't brand it. So when your screen crashed or rebooted, it was my fault, and I worked to help fix that. I eventually led Latin America, which was a really awesome experience, and then globally for Panasonic. Now I'm at United, where I lead a very small and agile team.

And essentially what we do is we look at very targeted problems, and we try to assign data metrics and solutions to solve those problems. So, things like the electrification we just finished over our hubs for going green, or the boarding process that we reworked last year. We're like a hit squad of nerds that come in.

We disrupt, we look at how to quantify things, and then we move on to the next. So it's super fun. All right. I hate falling into speaking tropes, but I want you to think of a recent product, demo, or procedure improvement, anything you can think of where you laid out an example of what you wanted to improve.

And one of the things I enjoy about speaking about data so much is that everyone is infinitely familiar with it. I'm following two people speaking about data, right? So all we do is talk about data. So it's a challenge for me, personally, to say: I want you to think differently, and I want to demonstrate something that you may not have thought of already.

Which again, is hard because everyone talks about data. So think about that and then I hope at the end of this you can reflect later and think, yeah, he's right. Like we didn't do this, or, yeah, we could have pivoted here, or something like that. So I wanted to talk about something or give an example of something that everyone has a relation to.

Everyone has an opinion about it, and that can be very difficult. But oftentimes the simplest solution is very easy to find. So what I want to do is talk to you about normalized industry metrics and how we're getting stuck in a rut with them. I asked you to think about a process or procedure, and about when it was pitched to you, by your executives, by your colleagues, whatever it might be.

I'll bet you they talked about everything but what metrics they were gonna use to define its success. And if they did, those metrics were probably normalized or industry standard. New innovation requires new metrics, and that is something we have to push ourselves on. Whether it's the customer experience or operational efficiency, no matter what it is, these metrics have to be defined at the start and not at the end.

Far too often we go through this policy improvement, and then at the end, the finish line, we're saying, okay, how can we define this? How can we say, did this work? Why isn't that being defined at the beginning? Why aren't we looking at this and building that feedback loop right at the start and saying, okay, here's what we're gonna do.

Here's what we wanna be. How do we measure that? And if we can't measure it with industry-defined metrics or metrics we're already tracking, what metrics do we need to make that definition, to build that feedback loop? That's one thing I think is sorely missing right now. And that's my last point here.

Start over. Ask yourselves: are these metrics actually solving the intended purpose? Are these the right questions that we're asking? When I look at a metric, one of the first things I do is say: okay, this metric is X. What was the question that was originally asked when X was defined? We often just look at the data point or the metric, but we don't actually look at what question they were asking themselves when they defined that industry metric.

So start over and ask yourself that question: is that the question you want to answer? Now, I said I was gonna give you an example: TSA checkpoints. One of the metrics we use in the travel industry is time to gate. I just laid out a very generic example, with a lot of lovely belt scenarios and someone getting stuck in the metal detector.

But is time to gate actually what we want to be looking at here? Think about it: if I told you it took you 34 minutes to get to the gate, I don't know if you spent 20 minutes shopping or if you got stuck behind this person with 14 belts. The metric was defined 30 years ago. So in 1995, was time to gate the same as it is today?

It's drastically different. The two aren't even comparable. But we still, and I can tell you this as a person who works at United Airlines, we still look at this metric: time to gate. It has almost no purpose besides being a super generic A-to-B metric, right? We don't know if the check-in process was bad; we don't know if the TSA experience was bad.

We're starting to figure those things out. But when we look at the metrics, they're old and outdated. They were built for a different world that we don't have anymore. And we have to start asking ourselves the questions: time to gate, why was it developed, and does it suit our current need?

So I wanna go through a couple statements. Statement one: time to gate doesn't represent the journey. I think we can all agree on that. It's one metric for an extremely complex journey. So how do we continue to use this metric, 'cause there's always value in data, but then add to it, supplement it, and actually make it usable for our mission or our purpose?

Statement two: it's outdated, it's genericized, and today's world is far too complex for this metric. So how do we reevaluate it? We've already asked ourselves the question: why was this developed? It was developed because, when it was made, we didn't have 93 stopping points from check-in to gate.

Back then it was: how long does it take you to walk through the airport? 'Cause there was basically no security. Statement three: we're overlooking travelers' genuine frustrations and stoppages. And this one I hope everyone can relate to. We don't know what happened in that time-to-gate journey, but we need to figure it out.

So again, what metrics can we add to meet the needs, to support our improvement? Statement four is an interesting one that I hope some of you have come to, or will realize when I talk about it. A lot of the metrics, when you look at them and ask yourself what the metric was developed for, you actually find the root is staffing or passenger flow.

Things that don't meet the intended purpose of: I want to make Tom's journey to the gate better. It was designed for, let's say, a TSA checkpoint that says we can move 10,000 passengers per lane per hour. Okay, a staffing model metric. Does it meet the needs of a customer experience metric? That's completely different.

And when you start examining these metrics, you'll see that a lot of them were built for a completely different purpose.

So through the process, we want to define the improvement. We wanna look at the product that we're purchasing, creating, developing, whatever it might be, and we wanna design those feedback loops at the start of the project. One thing you'll notice when you start developing a strategy like this, when you're working with your nerds, is they'll tell you:

Hey, we can get to these data points, and we can develop these data points in parallel with the development of your software, or whatever product or process you're building. But the analysts that are making those data points manually at first, and the engineers that are working to automate them, will start to look at it and say: hey, I've noticed this trend in the data.

So that can help you pivot your product to better suit its needs while it's being analyzed, in process and in parallel with your product. Take your customer feedback: what are customers actually saying? One of my favorite things to do is open up our backend data and look at customer complaints. I'll just read one or two of them, and there'll always be a phrase or a keyword that kind of stands out to me.

Maybe I have to read 10, maybe I have to read one. And then I will search hundreds of thousands of complaints for a keyword, and I'll pivot and I'll say, okay, this is 0.01%. Okay, it's a small percentage. But every once in a while you'll get a hit, it's 3 or 4%, and you think, okay, that's big enough for me to put my attention to, right?

Generic searches off of single complaints are a wonderful way to look at how you can define that process, how you can make that process better, and then define those possible new metrics. Something like a disruption index: are there checkpoint stoppages? Are we gonna use AI and video to analyze when a customer's coming through the checkpoint, to see when the stoppage occurred?

That way we can focus on that stoppage and create a better process or flow to lessen that disruption, to make that experience better for them. What's the end-to-end impact? What does the journey look like, and how do we tell that story through data, not just looking at random data points?

How many people have been in a meeting where you have a title slide, and then you get 40,000 items of data? You're busy reading that slide, trying to determine: okay, I think Scott's looking at this, I'm not sure. No. The first slide should be: here's what I want to tell you. Here's how I'm going to show it to you in data.

And then here's the three solutions I'm proposing. And if it's a new metric, you can display the new metric. Now, in one of my talks, I got a question that said, I came up with a new metric, but my CEO doesn't wanna see it. How do I convince my superiors, my executives, that this metric matters?

Just show 'em the metric. If you've done your work and the metric tells that story, they're not gonna dispute it. They're gonna ask you ways that they can improve upon the other policies and procedures based on that metric. The metric tells the story, the journey. You don't have to explain it. If you're spending time explaining the metric and not what the metric is improving, the metric is probably a little bit flawed, sorry to tell you. And I've had that plenty of times.

Lastly, develop a mission statement. Think about: what do I want to accomplish when I'm looking at this? What helps me get to that mission statement? And have that drive that product, that innovation. Humanize that metric.

Far too often we look at 17 different metrics, but we forget that Bob is walking to the gate. He doesn't want to be stuck somewhere. He wants to go buy chocolates on his United credit card, and we want him to do that as well. Define that mission statement. Define what you wanna see and define that early.

Define what metrics can supplement that mission statement. And if they're not there, go make them. Push yourself for new metrics. Don't rely on industry standards. Don't rely on normalized metrics.

This one doesn't matter. I showed you an example of a TSA checkpoint, and I did say I'm from a small town outside of Milwaukee, but I wanted to bring this up 'cause it's one of my favorite things whenever I go back home. Milwaukee is the only airport in the world with a Recombobulation Area, and there are lots of news stories on this fact.

So if you ever fly through Milwaukee, you will go through the TSA checkpoint and they literally have about five different signs that tell you this is where you can recombobulate. And I looked it up. It is a real word, and it is just as awesome as I promised you. Okay, so that's the general thesis. So I want you to go back to my trope: think about a product or a procedure that you have been pitched recently, up, down, peer, whatever it might be.

And think about that pitch and ask: did we discuss the metrics at the onset of that pitch? The answer is probably no. Data can be a lot like HR. HR does a million wonderful things for you. They do your benefits, they make sure you're paid on time. But when HR calls, you still get that pit in your stomach: what did I do?

Data teams can be a lot like that, and that's one thing I'm working really hard to change. When the data team calls, it's not because, hey, we found a mistake you made, your process isn't improving. The data team wants to help you succeed. The data team wants to prove your good work. At least my data team does.

And I want to make sure that is developed at the onset of these products or procedures, because it's immensely important. And again, the manual effort of your analysts looking at these things will help you redefine and shape your process as you develop it. That's it for me. I have a couple minutes.

I'm not sure if anyone has any questions. Otherwise I will be at the networking thing. So if you want to talk nerdy to me, I'm completely okay with it.

Announcer: Any questions? Going once...

Scott Smith, Director Airport Operations, United Airlines: Awesome. And I got you back on time. There you go.

Announcer: Thank you so much, Scott. Thanks everyone. Round of applause.